Microsoft Copilot: Voice Chat and Think Deeper now free and unlimited
Microsoft is making Copilot Voice and Think Deeper available free of charge and without limits. However, registration is required.
Microsoft's Copilot Voice is also easy to use on your cell phone.
(Image: Screenshot / dmk)
Microsoft is making the Copilot features Voice and Think Deeper available free of charge and without usage limits. When trying them out without signing in, however, voice chat — in which Copilot can be spoken to in natural language, much like a human conversation partner — is limited to two minutes. After signing in with a free consumer account, no time limit is displayed, as previously announced.
Helper for everyday tasks
Visiting the URL copilot.ai redirects to copilot.microsoft.com/chats/, where the Copilot chat is directly available. Clicking the “Think Deeper” button switches answers to the OpenAI o1 language model, which is somewhat slower but responds with more depth; clicking the microphone icon switches to voice chat instead.
(Image: Screenshot / dmk)
In our test with Think Deeper, Copilot immediately gave a correct answer to an anecdotal scientific question. In voice chat, however, the first answer was factually incorrect; after we pointed out the error with a keyword, the AI corrected itself.
In the announcement on the Copilot blog, Microsoft mentions use cases such as practising sentences in a new language before visiting a country or meeting new people, or using Copilot Voice for job interview training. For Think Deeper, Microsoft recommends support with purchasing decisions, such as choosing a future-proof electric car, or tips on which renovation work adds the most value to a property on a limited budget.
Jan-Keno Janssen has already demonstrated the use of AI voice chats in c't 3003. Users should not place blind trust in any of this, however, even if the natural-sounding language may tempt them to. Microsoft itself notes directly on the Copilot chat page: “Copilot can make mistakes”. We were able to provoke exactly that with a simple scientific question from a not-too-complex high-school curriculum.
(dmk)