
Yesterday, I had it think for 194 seconds. At some point near the end, it said "This is getting frustrating!"


I must not be searching with the right keywords, but I was trying to figure this out earlier. How do you set how much time it “thinks”? If you let it run too long, does the context window fill up so it’s unable to do any more?


It looks like their API is OpenAI-compatible, but their docs say they don’t support the `reasoning_effort` parameter yet.

> max_tokens: The maximum length of the final response after the CoT output is completed, defaulting to 4K, with a maximum of 8K. Note that the CoT output can reach up to 32K tokens, and the parameter to control the CoT length (reasoning_effort) will be available soon. [1]

[1] https://api-docs.deepseek.com/guides/reasoning_model
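
For anyone who wants to try it, here's a minimal sketch of calling the reasoner through the OpenAI Python SDK, with max_tokens capping only the final answer as described above. The base URL and model name come from the linked docs; treat the reasoning_content field as an assumption based on that same guide:

    # Minimal sketch: DeepSeek's reasoner via the OpenAI Python SDK.
    from openai import OpenAI

    client = OpenAI(
        api_key="YOUR_DEEPSEEK_API_KEY",
        base_url="https://api.deepseek.com",
    )

    response = client.chat.completions.create(
        model="deepseek-reasoner",
        messages=[{"role": "user", "content": "How many primes are below 100?"}],
        max_tokens=8192,  # caps only the final answer, not the CoT
    )

    # reasoning_content (chain of thought) is separate from the final answer
    print(response.choices[0].message.reasoning_content)
    print(response.choices[0].message.content)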




