Why can't Github Copilot readily access context from previous chats? #150189
Replies: 5 comments 2 replies
🕒 Discussion Activity Reminder 🕒

This Discussion has been labeled as dormant by an automated system for having no activity in the last 60 days. Please consider one of the following actions:

1️⃣ Close as Out of Date: If the topic is no longer relevant, close the Discussion.
2️⃣ Provide More Information: Share additional details or context, or let the community know if you've found a solution on your own.
3️⃣ Mark a Reply as Answer: If your question has been answered by a reply, mark the most helpful reply as the solution.

Thank you for helping bring this Discussion to a resolution! 💬
Well, I'm going to bump it then. Could we get some broad roadmap at least? According to the docs and the settings there IS already a feature for context memory, but right now it's not rolled out to the Chat module. When can we count on it?

Another issue: GitHub Copilot on the web and in VS Code CANNOT access the same chats. That is, mildly put, idiotic: it ruins a whole workflow and leads to repetition. It's the same tool, and I am working in the very same repo. It should ALWAYS remember everything, everywhere. Gotta be honest, I know the focus is on code & CLI, but I will think twice about renewing my subscription, because no memory across chats and tool integrations is just painful, and I am not happy about paying to get frustrated.

Also, no text-to-speech for Copilot Chat? And no speech-to-text? (And yes, I know about the VS Code plugin; that's still a joke. No chat integration, no continuous TTS, etc.) EVERY other AI chat has this. EVERY single one. It's also a HUGE L for folks like me dealing with ADHD, for example. It is literally easier to open a new tab with Claude (free tier) and just tell it whatever tf I want, because it remembers everything, than to work with paid GitHub Copilot.
GitHub Copilot (including Copilot Chat in editors like Visual Studio Code) is designed around session-based context rather than persistent conversational memory. This means each chat interaction is treated as an independent request unless the environment explicitly provides context. There are several technical reasons for this behavior.
Large language models used by Copilot are stateless by default. The model does not store previous conversations permanently; each request is processed only with the information included in the current prompt or session window. If a previous conversation is not included by the context builder, the model simply cannot see it.

GitHub uses a stateless architecture because stateless systems provide:

- better security
- better privacy
- lower infrastructure complexity
- predictable token usage

Because of this design, Copilot does not maintain a long-term memory of conversations.
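The statelessness described above can be sketched in a few lines. This is an illustrative mock, not Copilot's actual API: the `build_request` helper and the message shape are assumptions, but they show why anything left out of the replayed history is invisible to the model.

```python
# Illustrative sketch (hypothetical client): a stateless chat model only
# "remembers" what the caller resends with every single request.

def build_request(history, new_message, system_prompt="You are a coding assistant."):
    """Assemble the full payload for one stateless model call.

    The model sees ONLY this payload; anything left out of `history`
    is invisible to it, no matter how recently it was said.
    """
    messages = [{"role": "system", "content": system_prompt}]
    messages.extend(history)                       # prior turns must be replayed
    messages.append({"role": "user", "content": new_message})
    return {"messages": messages}

history = [
    {"role": "user", "content": "My project uses FastAPI."},
    {"role": "assistant", "content": "Noted: FastAPI project."},
]

# With history included, the model can use the earlier fact...
with_memory = build_request(history, "Which framework am I using?")
# ...while a fresh chat sends the same question with no context at all.
fresh_chat = build_request([], "Which framework am I using?")

assert len(with_memory["messages"]) == 4
assert len(fresh_chat["messages"]) == 2
```

Starting a new chat is equivalent to calling `build_request` with an empty `history`: nothing is "forgotten" by the model, it was simply never sent.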
AI models can only process a limited number of tokens per request: the model's context window. That budget must cover the system prompt, workspace context, and the conversation itself, so older material is dropped once the limit is reached.
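The token-budget constraint can be illustrated with a toy truncation policy. The token counts and the drop-oldest-first strategy below are assumptions for the sketch, not a description of Copilot's actual behavior:

```python
# Sketch of a simple context-window budget (token counts are illustrative):
# when system prompt + history exceed the limit, the oldest turns drop first.

def fit_to_window(system_tokens, turns, max_tokens):
    """Return the most recent turns that fit alongside the system prompt.

    `turns` is a list of (text, token_count) pairs, oldest first.
    """
    budget = max_tokens - system_tokens
    kept = []
    used = 0
    for text, tokens in reversed(turns):           # newest turns get priority
        if used + tokens > budget:
            break
        kept.append((text, tokens))
        used += tokens
    kept.reverse()                                 # restore chronological order
    return kept

turns = [("turn A", 400), ("turn B", 300), ("turn C", 200)]
kept = fit_to_window(system_tokens=100, turns=turns, max_tokens=700)
# Only the newest turns fit; "turn A" falls out of the window.
assert [t for t, _ in kept] == ["turn B", "turn C"]
```

This is why even within a single long session, early messages can stop influencing responses: they no longer fit in the window.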
This is expected with GitHub Copilot. Copilot Chat only has access to the current chat session and workspace context (like open files or selected code). It does not persist or recall conversations from previous chats, even if they happened recently. In editors such as Visual Studio Code, starting a new chat means the previous messages are not included in the model's context window. If you want Copilot to use earlier information, you'll need to continue the same chat or paste the relevant context again.
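The "paste the relevant context again" workaround can be made less tedious by keeping a running summary and prepending it to the first message of each new session. This is a manual pattern, not a Copilot feature; the helper name and format below are illustrative:

```python
# One workaround sketch: carry context across sessions by hand, pasting a
# compact summary of the old chat into the first message of the new one.
# (Nothing here is a Copilot API; it just builds a text prompt.)

def make_handoff_message(summary_points, question):
    """Bundle prior-session facts with the new question in one prompt."""
    context = "\n".join(f"- {point}" for point in summary_points)
    return (
        "Context from my previous session:\n"
        f"{context}\n\n"
        f"New question: {question}"
    )

msg = make_handoff_message(
    ["Repo uses FastAPI + SQLAlchemy", "Auth lives in app/auth.py"],
    "How should I add rate limiting?",
)
assert "previous session" in msg
assert msg.count("- ") == 2
```

Keeping the summary short matters: it competes for the same context-window budget as everything else in the session.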
Thank you for the suggestions. I asked Copilot Chat to summarize the chats into instruction files, and now this stupid thing RELIGIOUSLY sticks to them instead of simply referencing them and expanding on them where needed, completely ignoring my new instructions because the instruction file said otherwise. Now I am indeed convinced that I am not renewing. This thing just got even dumber. Consider my bump closed.
Topic Area: Question

Body:
One of the best parts of other AI models is their ability to access information I've already given them and respond in context. My experience with GitHub Copilot so far is that if I ask it to access context from a chat I sent within the hour, it can't. For a bot designed to lean toward back-end servicing, how does this make sense? It's possible I just don't know what I'm doing, so let me know if that's the case too.