Exclusive: Google deepens Thinking Machines Lab ties with new multi-billion-dollar deal

Published: April 22, 2026 at 08:00 AM EDT
2 min read
Source: TechCrunch
Deal Overview

Former OpenAI executive Mira Murati’s startup, Thinking Machines Lab, has signed a new multi‑billion‑dollar agreement to expand its use of Google Cloud’s AI infrastructure, including systems powered by Nvidia’s latest GPUs. The deal, valued in the single‑digit billions, provides access to Google’s newest AI systems built on Nvidia’s GB300 chips, along with infrastructure services for model training and deployment.

Background

  • Google’s cloud strategy: Google has been striking a series of cloud deals with AI developers, bundling its cloud offerings with services such as storage, Kubernetes Engine, and Spanner. Earlier this month, Anthropic signed an agreement with Google and Broadcom for multiple gigawatts of TPU capacity.
  • Competitive landscape: Anthropic also signed a separate agreement with Amazon to secure up to 5 GW of capacity for training and deploying Claude.
  • Thinking Machines Lab’s history: Earlier this year, Thinking Machines partnered with Nvidia in a deal that included an investment from the chipmaker. This is the first time the lab has struck a deal with a cloud services provider. The agreement is non‑exclusive, allowing the lab to use multiple cloud providers over time.

Company Profile

Murati left her role as OpenAI’s chief technology officer and founded Thinking Machines in February 2025. The company quickly raised a $2 billion seed round at a $12 billion valuation. In October, it launched its first product, Tinker, a tool that automates the creation of custom frontier AI models, as reported by Wired.

Technical Details

  • Reinforcement learning support: Google’s press release highlighted that the cloud platform can support Thinking Machines’ reinforcement‑learning workloads, which are central to Tinker’s architecture. Reinforcement learning has driven recent breakthroughs at labs such as DeepMind and OpenAI, and the scale of this deal reflects the computational intensity of such work.
  • Hardware performance: Thinking Machines is among the first Google Cloud customers to access GB300‑powered systems, which Google claims deliver a 2× improvement in training and serving speed compared to prior‑generation GPUs.

“Google Cloud got us running at record speed with the reliability we demand,” said Myle Ott, a founding researcher at Thinking Machines.
