tmountain 2 days ago

I often ask the LLM for a concise summary of the discussion so far—formatted as a prompt. I then edit it appropriately and use it to start a new conversation without the baggage. I have found this to be a very effective technique, but I imagine it will be automated sometime soon.
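For anyone who wants to script that loop, here's a rough sketch of what it could look like, assuming the OpenAI Python client; the model name, the summarization prompt, and the `compact` helper are all placeholders of mine, not any tool's actual implementation.

```python
# Hypothetical sketch of the "summarize, then restart" technique,
# using the OpenAI Python client. Model name and prompt wording are
# assumptions, not anything described in the thread.
from openai import OpenAI

client = OpenAI()

SUMMARIZE_INSTRUCTION = (
    "Summarize our discussion so far as a concise, self-contained prompt "
    "that I could paste into a brand-new conversation. Keep decisions, "
    "constraints, and open questions; drop dead ends and filler."
)

def compact(history: list[dict]) -> list[dict]:
    """Ask the model to distill the conversation, then return a fresh
    history seeded only with that distilled prompt."""
    resp = client.chat.completions.create(
        model="gpt-4o",  # placeholder model
        messages=history + [{"role": "user", "content": SUMMARIZE_INSTRUCTION}],
    )
    summary = resp.choices[0].message.content
    # The manual step described above still applies: print the summary,
    # edit it by hand, and paste the edited version into the new session.
    return [{"role": "user", "content": summary}]
```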

drewbitt 1 day ago

Cursor tried doing this automatically (and may still, if you're not on a large-context model like Gemini 2.5 Pro), but I found the summary was missing too many details to use out of the box.

maleldil 2 days ago

Claude Code has a /compact command that summarises the conversation so far to save on context tokens.
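I believe you can also pass it free-form instructions about what to preserve, something like:

```
/compact Keep the API design decisions and open bugs; drop the exploratory tangents.
```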