Deep Research - Content too Large

I just began getting an execution error running deep research (modified). The latest change I made was to add one more doc to the knowledge files. Now I’m seeing an error in the Compile Report block. What can I do to manage this?

Hi @mikeboysen,

It seems like Claude 4 Sonnet returned an error saying the context is too long. That message is coming straight from the model.

You can either reduce the amount of data you’re sending to fit within its context window or switch to a model with a larger context window.
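If you want to stay on the same model, one practical way to reduce the input is to trim the knowledge files before they reach the Compile Report step. Here is a minimal sketch of that idea; it assumes a rough 4-characters-per-token estimate and hypothetical budget numbers, not the exact accounting your platform or the Anthropic tokenizer would use:

```python
# Rough sketch: keep only as many knowledge documents as fit an input budget.
# The 4-chars-per-token ratio is an approximation, and the budget numbers
# below are illustrative assumptions, not platform-confirmed values.

CONTEXT_WINDOW = 200_000   # Claude Sonnet context window (input + output)
MAX_RESPONSE = 128_000     # tokens you want to reserve for the reply
INPUT_BUDGET = CONTEXT_WINDOW - MAX_RESPONSE

def estimate_tokens(text: str) -> int:
    """Very rough token estimate: ~4 characters per token for English text."""
    return len(text) // 4

def fit_documents(docs: list[str], budget: int = INPUT_BUDGET) -> list[str]:
    """Keep whole documents in order until the budget would be exceeded."""
    kept, used = [], 0
    for doc in docs:
        cost = estimate_tokens(doc)
        if used + cost > budget:
            break
        kept.append(doc)
        used += cost
    return kept
```

Summarizing the larger documents before adding them to the knowledge files works too, and often preserves more of the useful content than simply dropping them.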

I ran it on the default Sonnet 3.7 first, then Sonnet 4. Same thing. I also tried Gemini, but Sonnet is the only one with a Max Response Size of 128,000. It’s strange that Gemini is only half that. Is there another model with a larger response size?

Hi @mikeboysen,

I believe the Max Response Size isn’t the main issue here; it’s the Context Window in your current setup.

  • Gemini 2.5 Flash and Pro have a smaller Max Response Size (56k tokens), but they support a much larger context (1 million tokens)
  • Claude Sonnet models allow for a higher Max Response Size (128k tokens) but have a smaller Context Window (200k tokens)

So while Anthropic models can output more data in a single response, they handle much less context overall, which seems to be the problem in this case.
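A quick back-of-the-envelope check makes the difference concrete. This assumes the response allowance is carved out of the same context window as the input, which may not match the platform’s exact accounting; the numbers are the limits quoted above and the input size is just an example:

```python
# Hypothetical budget check: does a large input still fit once room is
# reserved for the response? Assumption: input + response share one window.

models = {
    "claude-sonnet": {"context": 200_000,   "max_response": 128_000},
    "gemini-2.5":    {"context": 1_000_000, "max_response": 56_000},
}

input_tokens = 180_000  # example: a large stack of knowledge files

for name, limits in models.items():
    room_for_input = limits["context"] - limits["max_response"]
    print(f"{name}: input budget ~{room_for_input:,} tokens "
          f"-> fits: {input_tokens <= room_for_input}")
```

With Claude Sonnet, reserving the full 128k response leaves well under 100k tokens for input, so a growing set of knowledge files hits the limit quickly; Gemini’s 1M-token window leaves far more headroom even with its smaller response size.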
