Google is making a big move to challenge Nvidia’s dominance of artificial intelligence hardware. Working with Meta, Google has focused on making its chips friendlier to developers, so that anyone can run popular AI applications on Google’s hardware without major rework. This could mark a real shift in an industry where one firm has long set the terms.
The Problem With “Switching” Chips
Today, most firms building AI systems use Nvidia’s chips. The chips are fast and powerful, but their real advantage is the software around them. Nvidia’s software platform, CUDA, is what lets AI code run efficiently on its hardware. Most developers use a framework called PyTorch, which works best when paired with CUDA.
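To make that coupling concrete, here is a minimal sketch of how PyTorch code targets Nvidia hardware today: the device string `"cuda"` routes tensor operations through Nvidia’s CUDA stack, and the snippet falls back to the CPU when no GPU is present.

```python
import torch

# PyTorch hides the hardware behind a "device" object. On a machine
# with an Nvidia GPU, "cuda" routes every operation through Nvidia's
# CUDA software stack; without one, the same code runs on the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

x = torch.randn(3, 3, device=device)
y = x @ x  # the matrix multiply executes on whichever backend was chosen
print(y.shape)  # torch.Size([3, 3])
```

Because the device choice is woven through every tensor and model, moving the same code to non-Nvidia hardware has historically required more than changing this one string.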
Because the chips and the software are linked this way, it is hard for customers to change chip brands. Businesses have had to rework parts of their code or retrain their teams, which is time-consuming and costly; neither resource is abundant for companies racing to put the latest AI technology into their products, even when a rival chip performs well.
Google also has its own AI chips, called Tensor Processing Units, or TPUs. They have powered Google services such as Search and Gmail for years, but they are rarely used by other companies, because they were designed to work well with Google’s own software rather than the tools developers already know.
A New Push: TorchTPU
However, Google is finally doing something about this. It has started an internal project named “TorchTPU”, a combination of PyTorch and TPU. The goal is to let developers use PyTorch on Google’s hardware the same way they currently do on Nvidia chips, which means companies would not have to rebuild their software to switch brands.
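As a sketch of what that could look like: the publicly available PyTorch/XLA project already lets PyTorch address TPUs through an “xla” device, and TorchTPU is reported to aim at making this path seamless. The snippet below is illustrative, not Google’s actual API; the `pick_device` helper is hypothetical, and it falls back to the CPU when `torch_xla` is not installed so the idea stays runnable anywhere.

```python
import torch

def pick_device() -> torch.device:
    """Hypothetical helper: prefer a TPU (via the existing torch_xla
    backend) when available, otherwise fall back to the CPU."""
    try:
        import torch_xla.core.xla_model as xm  # public TPU support package
        return xm.xla_device()
    except ImportError:
        return torch.device("cpu")

device = pick_device()
model = torch.nn.Linear(4, 2).to(device)   # the model code is unchanged
batch = torch.randn(8, 4, device=device)   # so is the data pipeline
print(model(batch).shape)                  # torch.Size([8, 2])
```

The point of the sketch is that only the device selection differs between backends; everything downstream, the model and the training loop, stays identical, which is exactly the switching cost TorchTPU is meant to eliminate.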
To move faster, Google is collaborating with Meta, the owner of Facebook, Instagram, and WhatsApp. Meta also stewards the PyTorch framework, which gives it a keen interest in how well the software runs on different chips.
Meta wants lower prices and more flexibility in how it deploys its machine learning infrastructure. Using Google’s chips would cut costs by removing its reliance on a single supplier. The two companies are reportedly discussing Meta adopting TPUs more widely, either through Google Cloud or by setting them up in its own facilities.
Why This Matters for the Future
This is about much more than money; it is about who controls the technology. For a long time, Nvidia has had a stranglehold on AI computing. Other companies sell chips, but none comes close to offering as complete a product, which puts Nvidia in a uniquely powerful position to shape how the technology develops.
Google and Meta think one company shouldn’t have so much power. By working together, they believe they can create real competition. If developers find it simple to use Google’s chips with tools they already know, they may well adopt them, drawn by lower prices, faster innovation, and greater freedom.
Google says it is motivated by the need to offer choice. Demand for AI computing is growing rapidly, and Google wants developers to be free to choose either GPUs or TPUs. Making it easy to switch is critical to winning them over.
Stay updated with the latest news and top business news around the world. Follow Inspirepreneur Magazine for more.