New functions for Google's Gemini AI herald the end of Samsung's Bixby

Gemini can now use multiple apps to handle a single request, prompting Samsung to sideline its own assistant, Bixby.

Google Gemini Live on smartphone with multiple apps

(Image: Google)


To mark the launch of Samsung's new Galaxy S25, S25+ and S25 Ultra trio, Google has given its AI assistant Gemini enhanced functions. The AI can now use several apps as part of a single request. This has apparently convinced Samsung as well: Gemini is now the default assistant on the Galaxy S25 series. Although Samsung's previous Galaxy assistant Bixby is still pre-installed, it is rarely encountered.

In the photo gallery, for example, a tap on the eye icon wakes it up. Bixby then analyzes the photo much as Gemini would and provides information about it, spoken aloud if desired. It remains unclear why the new Google AI is not used here as well. Otherwise, Bixby is tucked away in an app folder. In the past, the Galaxy assistant was activated with a long press on the power button, but on the Galaxy S25 series this now launches Google's Gemini.

Google's expanded AI assistant can now process more complex queries that cross app boundaries. For example, in a single request you can ask it for the best restaurant in the area, have it message a contact from the address book to arrange a meal together, and have the appointment entered directly into the calendar. According to Samsung, users can say goodbye to juggling individual apps, as the AI now decides which app to use for the desired actions.

Google Gemini uses several apps

(Image: Google via The Verge)

Gemini has been able to draw on different apps since spring 2024, when Google released AI extensions for its services. These let users link the generative AI with Google services such as Gmail, Maps, Google Drive and Google Docs. A number of third-party apps such as WhatsApp and Spotify can also be used by the assistant via Gemini extensions. Samsung's own apps such as Calendar and Notes are now included as well, at least for S25 owners.

Users of new smartphones such as Google's Pixel 9 phones or Samsung's S25 and S24 series can also use Gemini Live to communicate with the AI assistant by voice. According to Google, Gemini Live will now also be able to process images, files, and YouTube videos on these devices. More Android smartphones are set to receive this Gemini Live function in the coming weeks. Google is also planning to integrate the extended capabilities of Project Astra in the coming months, so that screen sharing and live video streaming should then be possible with the Gemini app.


Circle to Search, the search function introduced almost exactly a year ago, has also been updated. It lets you start a Google search by circling a section of the smartphone display with your finger. Phone numbers, email addresses and URLs are now recognized automatically as well, making that information easier to act on. According to Google, the AI overview shown in the results is also being expanded: places, objects and works of art are now more likely to be recognized correctly, so the AI surfaces the relevant details.

While the extended Circle to Search should be available on all compatible Android smartphones, Gemini Live is initially limited to the high-end devices mentioned above. According to Google, the new Gemini AI functions described here will be rolled out to all Gemini users, in the browser, on Android and on iOS.

(fds)


This article was originally published in German. It was translated with technical assistance and editorially reviewed before publication.