Alphabet Inc., the parent company of Google, has rapidly approached a market capitalization of $4 trillion amid active negotiations with Meta over the supply of its specialized artificial intelligence chips, the TPU (Tensor Processing Unit). According to industry sources, Meta is in talks on a multi-billion-dollar deal to acquire the chips for its own data centers, as well as a potential rental of Google Cloud capacity starting in 2026.
This was reported by Finway.
Google’s TPU as an Alternative to Nvidia: Market Impact
The deal between Meta and Google is viewed as a strategic move to reduce dependence on Nvidia's market-dominant graphics processors. Nvidia's GPUs remain the standard for training and deploying large language models, but the tech sector is actively seeking alternatives. Google already has experience with large-scale TPU deliveries, for example to Anthropic, which demonstrates its ability to compete with Nvidia and attract new institutional orders.
Amid expectations for the deal, Alphabet's shares rose 1.6% over the past day, bringing the company closer than ever to a $4 trillion market capitalization. Meanwhile, Nvidia's shares fell roughly 4% in pre-market trading on investor concerns about softening GPU demand and supplier diversification among AI customers.

Nvidia commented on the situation, acknowledging its competitor's progress while emphasizing the technological superiority of its own solutions:
“We are excited about Google’s success – they have made significant strides in AI, and we continue to supply Google. NVIDIA is a generation ahead of the industry – it is the only platform that runs every AI model and executes it wherever computations are performed.”
Meta’s Investments and AI Infrastructure Development
Analysts predict that a potential deal with Meta would strengthen Google's position as a key alternative supplier of accelerators for deploying artificial intelligence models. According to Bloomberg Intelligence, Meta's capital expenditures in 2026 could exceed $100 billion, with $40–50 billion allocated to AI infrastructure.
Experts note that choosing TPUs will only be viable if their effectiveness is proven against competing solutions and if they perform at the scale of Meta's hyperscale data centers. TPUs are specialized integrated circuits that Google has been developing for more than a decade exclusively for AI workloads. Demand for them is rising amid concerns about excessive reliance on Nvidia and limited access to GPUs.
Currently, TPUs are used not only for Google's internal models, including Gemini, but are also available to third-party developers through Google Cloud. Google emphasizes that it will continue to support both its own chips and Nvidia's graphics processors.
