Google’s Gemma AI models have surpassed 150 million downloads since their launch in February 2024, marking a significant milestone in the company’s ongoing expansion into the open-model AI ecosystem.
The update was shared by Omar Sanseviero, a developer relations engineer at Google DeepMind, who also noted that more than 70,000 community-created variants of Gemma have been developed on Hugging Face, a widely used platform for collaborative AI research.
Introduced as Google’s response to Meta’s Llama models, Gemma is part of a broader push to provide open-access alternatives for developers and researchers. Despite the milestone, Gemma still lags behind Llama, which recorded approximately 1.2 billion downloads as of April 2025.
Model Capabilities of Gemma AI
Gemma has steadily evolved since its debut, with the latest version, Gemma 3, bringing a series of technical upgrades aimed at increasing its versatility and appeal across a range of applications:
- Multimodal Functionality: Gemma 3 accepts both text and image inputs, enabling use cases that rely on multimodal reasoning and content generation (a usage sketch follows this list).
- Expanded Context Window: The model can process up to 128,000 tokens, allowing it to handle complex tasks involving long-form content or multiple inputs.
- Multilingual Support: Gemma 3 is equipped to handle more than 140 languages, broadening its accessibility for developers working on international applications.
- Function Calling: The model can produce structured calls to external tools and APIs, enabling automated task execution and paving the way for AI agents and more dynamic applications.
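To make the multimodal and long-context features above concrete, here is a minimal sketch of querying Gemma 3 with combined image and text input through the Hugging Face transformers library. The pipeline task name, checkpoint ID, and image URL are assumptions for illustration rather than an official recipe; substitute a real image path and a checkpoint available in your environment.

```python
# A minimal sketch of multimodal (image + text) inference with Gemma 3,
# assuming the Hugging Face transformers "image-text-to-text" pipeline and
# an instruction-tuned Gemma 3 checkpoint are available in your environment.
from transformers import pipeline

pipe = pipeline(
    "image-text-to-text",
    model="google/gemma-3-4b-it",  # assumed checkpoint name
)

messages = [
    {
        "role": "user",
        "content": [
            {"type": "image", "url": "https://example.com/chart.png"},  # placeholder: replace with a real image URL or local path
            {"type": "text", "text": "Summarize what this chart shows."},
        ],
    }
]

# The expanded context window (up to 128,000 tokens) means much larger
# prompts, such as whole documents, could be passed in the text turn.
output = pipe(text=messages, max_new_tokens=128)
print(output[0]["generated_text"][-1]["content"])
```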
TxGemma in Drug Discovery
In addition to general-purpose updates, Google has introduced specialized variants such as TxGemma, aimed at supporting AI applications in the pharmaceutical and biomedical fields.
TxGemma is designed to improve predictions related to therapeutic properties, helping streamline the drug discovery process. By targeting specific sectors, Google is positioning Gemma not just as a general AI model, but as a tool adaptable to industry-specific challenges.
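As a rough illustration of how such a specialized variant might be queried, the sketch below loads a TxGemma "predict" checkpoint through Hugging Face transformers and asks a therapeutic-property question about a molecule given as a SMILES string. The checkpoint name and prompt wording are assumptions for illustration; the official TxGemma model card defines the exact task prompt templates.

```python
# A minimal, hypothetical sketch of querying a TxGemma prediction checkpoint
# via Hugging Face transformers. Checkpoint name and prompt are assumptions.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "google/txgemma-2b-predict"  # assumed checkpoint name

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

# Illustrative therapeutic-property question: whether a small molecule
# (given as a SMILES string) is likely to cross the blood-brain barrier.
prompt = (
    "Instructions: Answer the following question about drug properties.\n"
    "Question: Does the following molecule cross the blood-brain barrier?\n"
    "Drug SMILES: CC(=O)OC1=CC=CC=C1C(=O)O\n"  # aspirin
    "Answer:"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=8)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```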
Community Engagement
The strong presence of Gemma on Hugging Face—where thousands of developers have created and shared model variants—underscores its appeal within the research and developer community.
Google has emphasized open participation, encouraging experimentation and adaptation, which has contributed to its rapid adoption.
Licensing Challenges
Despite the rapid growth in downloads and community involvement, concerns persist regarding Gemma’s licensing framework.
Similar to Meta’s Llama models, Gemma is distributed under custom, non-standard licensing terms.
While these licenses are designed to provide a balance between openness and platform control, some developers and businesses view them as a source of legal ambiguity—particularly for commercial use.
This licensing complexity may hinder wider enterprise adoption, as organizations seek clarity around redistribution rights, derivative works, and liability. The extent to which Google can address these concerns may significantly influence the model’s competitiveness in commercial markets.
The popularity of Google’s Gemma models highlights strong demand for open AI tools and community-driven development.
Whether Gemma can compete at scale with Meta’s Llama may depend less on technical improvements than on how effectively Google addresses ongoing licensing concerns and supports commercial deployment pathways.