
Feature Request / Question: Support for reasoning_effort with Ollama provider #1100

@jonathanGGa

Description


Context
I am using @copilot-kit/sdk and want to configure Ollama as my LLM provider.

The Issue
I would like to know whether it is possible to configure the reasoning_effort parameter (low, medium, high) through the SDK when using the Ollama provider. Ollama's own "think" parameter accepts only two values (true/false).
Currently, I cannot find a clear way to pass this specific parameter in the provider configuration or in the request options.

Proposed behavior
Is there an existing way to pass provider-specific parameters such as reasoning_effort to Ollama? If not, are there plans to support this, to allow better control over "thinking" tokens?
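As context for what is being asked, here is a minimal sketch of the kind of pass-through I am looking for. It assumes nothing about CopilotKit's actual API: `build_chat_payload` and the placement of a `reasoning_effort` key in the request body are hypothetical, illustrating how a provider-specific parameter could be forwarded to the provider untouched, to be honored or ignored server-side.

```python
import json

def build_chat_payload(model, messages, reasoning_effort=None):
    """Build a chat request body, forwarding an optional
    provider-specific reasoning_effort field (hypothetical)."""
    payload = {"model": model, "messages": messages}
    if reasoning_effort is not None:
        # Pass-through of a provider-specific parameter; a provider
        # that does not understand the key could simply ignore it.
        payload["reasoning_effort"] = reasoning_effort
    return payload

payload = build_chat_payload(
    "some-local-model",  # placeholder model name
    [{"role": "user", "content": "hello"}],
    reasoning_effort="medium",
)
print(json.dumps(payload, indent=2))
```

A generic "extra provider options" dict in the provider configuration would cover this case without the SDK having to know about every provider-specific knob.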
