Medicine: Google presents AI models "MedGemma" and innovations in "AMIE"
Google has unveiled new AI models for medical content based on Gemma 3 to accelerate the development of healthcare applications.
At this year's Google I/O, Google presented MedGemma, a pair of AI models designed to analyze medical text and images. MedGemma is based on the current Gemma 3 architecture and is intended to significantly accelerate the development of new healthcare applications. The multimodal MedGemma 4B model can be used to build AI applications that analyze radiological images, summarize clinical data, or handle other medical tasks.
Thanks to its compact size, MedGemma can be fine-tuned efficiently for specific use cases. According to Google, MedGemma 27B performed at a level similar to much larger models such as GPT-4o on the MedQA benchmark for clinical knowledge and medical reasoning, although DeepSeek R1, Gemini 2.5 Pro, and OpenAI's o3 remain ahead of MedGemma 27B.
The MedGemma models are freely available, can be self-hosted, and can be used both locally and on the Google Cloud platform. The MedGemma 4B and MedGemma 27B (text only) variants are now available on Hugging Face and in Model Garden, Google's library of AI and machine learning models.
The models can be combined with other tools to solve complex tasks, for example web search for retrieving up-to-date medical information, or a FHIR interpreter for processing and creating standardized health data in the FHIR format. Further information can be found in the official MedGemma documentation.
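To make the FHIR point concrete, the snippet below builds a minimal FHIR R4 Observation resource (a blood glucose reading) in Python. This is an illustrative sketch of the kind of standardized record a FHIR interpreter would parse or emit; the patient reference and values are hypothetical, and the code is not taken from Google's MedGemma tooling.

```python
import json

# Minimal FHIR R4 Observation resource for a blood glucose measurement.
# Values and the patient reference are illustrative, not real data.
observation = {
    "resourceType": "Observation",
    "status": "final",
    "code": {
        "coding": [{
            "system": "http://loinc.org",
            "code": "2339-0",  # LOINC code: Glucose [Mass/volume] in Blood
            "display": "Glucose [Mass/volume] in Blood",
        }]
    },
    "subject": {"reference": "Patient/example"},  # hypothetical patient id
    "valueQuantity": {
        "value": 6.3,
        "unit": "mmol/L",
        "system": "http://unitsofmeasure.org",
        "code": "mmol/L",
    },
}

print(json.dumps(observation, indent=2))
```

Because FHIR resources are plain JSON with a fixed schema, a model-driven tool only has to produce or consume dictionaries of this shape to interoperate with clinical systems.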
In addition, Google presented the latest developments in AMIE (Articulate Medical Intelligence Explorer), an AI agent for diagnostic medical conversations developed together with Google DeepMind. The new multimodal version of AMIE can also interpret visual medical information such as photos, laboratory findings, or ECGs and use it for more precise diagnoses. The system is designed to actively ask for such data, analyze it, and integrate it into the conversation.
Disease management
The AMIE agent is designed to support the management of chronic diseases across multiple doctor visits. Two agents work together: a dialog agent and a management agent that creates structured treatment and monitoring plans based on clinical guidelines.
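The two-agent split described above can be sketched as follows. All class and field names here are illustrative assumptions, not Google's implementation: a management agent produces a structured plan, and a dialog agent turns that plan into patient-facing conversation.

```python
from dataclasses import dataclass, field

@dataclass
class CarePlan:
    # Structured treatment/monitoring plan; fields are hypothetical.
    condition: str
    monitoring: list = field(default_factory=list)  # measurements to track
    follow_up_weeks: int = 4

class ManagementAgent:
    """Creates structured plans; the real system would consult clinical guidelines."""
    def create_plan(self, condition: str) -> CarePlan:
        if condition == "type 2 diabetes":  # toy guideline lookup
            return CarePlan(condition, monitoring=["HbA1c", "fasting glucose"])
        return CarePlan(condition)

class DialogAgent:
    """Talks to the patient, grounded in the management agent's plan."""
    def __init__(self, manager: ManagementAgent):
        self.manager = manager

    def consult(self, condition: str) -> str:
        plan = self.manager.create_plan(condition)
        checks = ", ".join(plan.monitoring) or "symptoms"
        return (f"For {plan.condition}, we will monitor {checks} "
                f"and follow up in {plan.follow_up_weeks} weeks.")

agent = DialogAgent(ManagementAgent())
print(agent.consult("type 2 diabetes"))
```

Separating the conversational role from the planning role lets each component be evaluated and updated independently, which is the design idea the article attributes to AMIE.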
The entire system is based on the current Gemini models, which have been optimized for multimodal processing and complex medical reasoning. Medical findings, summaries, and real clinical conversations were used for training. According to a Google study, AMIE performed better than real general practitioners in simulated chat consultations with patients, at least when interpreting multimodal data such as images, texts, and findings. The system is also said to have shown more empathy, for example.
(mack)