Why It’s Important
IBRS believes that this Google launch will spur more vendors to develop smaller AI models that demand fewer resources and can be trained and fine-tuned on readily available computing hardware. The driver is the desire to avoid the massive, specialised infrastructure typically required for larger models such as LaMDA or Megatron-Turing NLG. Smaller models let organisations better leverage existing infrastructure for AI development and deployment, improving cost-effectiveness and efficiency: their significantly lower computational demands make them deployable on standard hardware such as laptops and workstations.
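To illustrate what running a small model on standard hardware can look like, the sketch below loads and queries a compact open-weight checkpoint on a single workstation using the Hugging Face Transformers library. The model identifier (google/gemma-2b), data type, and generation parameters are illustrative assumptions only, not an endorsed or definitive configuration.

```python
# Minimal sketch: querying a small open-weight model (assumed: google/gemma-2b)
# on a single workstation with the Hugging Face Transformers library.
# Assumes `pip install transformers torch accelerate` and access to the model
# weights (a Hugging Face access token may be required for gated models).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "google/gemma-2b"  # assumed small-model checkpoint, for illustration

# Load the tokenizer and model; bfloat16 roughly halves memory use versus
# float32, keeping a ~2B-parameter model within reach of a well-equipped laptop.
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.bfloat16,
    device_map="auto",  # place weights on a GPU if present, otherwise CPU
)

# Simple prompt-and-generate check to confirm the local deployment works.
prompt = "Summarise the benefits of small language models for enterprises:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The same local setup is typically the starting point for fine-tuning with parameter-efficient methods, which is where the cost advantage of smaller models over large hosted ones becomes most apparent.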
The release of Gemini Enterprise signals Google’s intensified effort to compete in the burgeoning generative AI market. Microsoft’s early entry, built on its partnership with OpenAI and ChatGPT, and its successful enterprise offering established it as a leader; Google aims to catch up by leveraging Gemini’s advanced capabilities. Although details of Gemini’s specific capabilities are limited, it is likely designed to address the limitations identified in Bard and potentially surpass ChatGPT in certain areas. Google is paving the way for a future where AI benefits a broader spectrum of developers and users.
Who’s Impacted
- CEOs
- AI developers
- IT teams
What’s Next?
- Analyse internal workflows and business processes to identify tasks suitable for Gemma’s capabilities.
- Collaborate with internal development teams to explore the potential integration of Gemma into existing applications or workflows, considering factors like data security, user experience, and ongoing maintenance.
Related IBRS Advisory
1. VENDORiQ: Google’s Gemini Pro and Imagen 2 to be Deployed on Samsung S24 Series