With Google’s Gemini 1.5 Flash large language model (LLM), Indian organizations across sectors, including the public sector, will now have the option to store their data at rest and run machine learning processing entirely within India. The move addresses growing demand from Indian enterprises for localized data processing.
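In practice, region selection is the mechanism that pins Gemini processing to India on Google Cloud. The sketch below is a minimal, illustrative example assuming the Vertex AI Python SDK (google-cloud-aiplatform) and Google Cloud's Mumbai region (asia-south1); the project ID is a placeholder.

```python
# Minimal sketch: pinning Gemini 1.5 Flash inference to an Indian region
# via the Vertex AI Python SDK. Assumes google-cloud-aiplatform is installed
# and Vertex AI is enabled on the project; "my-project" is a placeholder.
import vertexai
from vertexai.generative_models import GenerativeModel

# asia-south1 is Google Cloud's Mumbai region; initializing the SDK with it
# routes requests to, and keeps processing within, that region.
vertexai.init(project="my-project", location="asia-south1")

model = GenerativeModel("gemini-1.5-flash")
response = model.generate_content("Summarize this contract clause in one sentence.")
print(response.text)
```

Choosing the region at SDK initialization, rather than per request, keeps all calls in the session within the same residency boundary.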
Bikram Bedi, Vice President and Country Managing Director of Google Cloud India, noted that this option will be especially beneficial for public sector and financial services enterprises, given the sensitive nature of the data they manage.
During a media roundtable on Thursday, Bikram Bedi highlighted that local data storage and processing with Gemini 1.5 Flash would also benefit Indian startups by lowering the cost of using the models and reducing latency in certain sectors.
“Google is the only vendor out there providing the entire AI stack end-to-end. Every other vendor is either buying GPUs, marking them up and selling them, or outsourcing other AI technology, marking it up and selling it. The reason I point this out is that price performance becomes a very important factor when you are starting to go into production and scale. Hence, Google’s ability to provide the right performance for the price,” Bedi explained.
He further noted that regulatory requirements and growing industry demand were the major factors behind Google’s decision to offer local processing.