
I'm trying to use Continue.dev with a local model running on Ollama, but keep getting the "file exceeds context length" error.

My config looks like this:

  - name: qwen3-coder
    provider: ollama
    model: qwen3-coder
    apiBase: http://localhost:11434/
    contextLength: 256000
    roles:
      - chat

Even when setting contextLength to an absurdly large value (2560000), I get the same error on a 29 KB file that should easily fit in context. The contextLength setting seems to be ignored.

1 Answer


The problem was on the Ollama side. Ollama uses a default context window of 4096 tokens, and the contextLength in your Continue.dev config cannot raise that server-side limit. To see the context window size for the current model, run:

ollama show qwen3-coder

If there is no entry that says:

Parameters
   num_ctx           256000

then the default is 4096.
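The check above can be scripted. A minimal sketch (the `parse_num_ctx` helper is hypothetical, and the parsing assumes `ollama show` prints an overridden `num_ctx` on its own line as in the excerpt above):

```shell
# Extract num_ctx from `ollama show <model>` output on stdin;
# fall back to Ollama's default of 4096 when no override is listed.
parse_num_ctx() {
  local ctx
  ctx=$(awk '/num_ctx/ {print $2}')
  echo "${ctx:-4096}"
}

# Usage: ollama show qwen3-coder | parse_num_ctx
printf 'Parameters\n   num_ctx           256000\n' | parse_num_ctx   # prints 256000
```

Piping real `ollama show` output through this gives a quick way to confirm whether the override from the steps below actually took effect.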

You need to override this on the Ollama side. For a one-time change, start an interactive session and set the parameter at the prompt (note that /set is a command inside the ollama run session, not a shell argument):

ollama run qwen3-coder
>>> /set parameter num_ctx 256000

Or to make it permanent:

ollama show qwen3-coder --modelfile > temp.modelfile
echo "PARAMETER num_ctx 256000" >> temp.modelfile
ollama create qwen3-coder -f temp.modelfile

Make sure you have the right context length in your Continue.dev config too:

  - name: qwen3-coder
    contextLength: 256000
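Putting it together, the full model entry would look something like this (a sketch assembled from the fields in the question; the top-level models: key is how Continue.dev's config.yaml usually nests model entries, so adjust if your file is structured differently):

  models:
    - name: qwen3-coder
      provider: ollama
      model: qwen3-coder
      apiBase: http://localhost:11434/
      contextLength: 256000
      roles:
        - chat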

Then edit and save the Continue.dev config.yaml (any change will do) to trigger a model reload. This should resolve the "file size exceeds context length" error, unless the file genuinely exceeds the model's true context window.
