Meta Launches MTIA 300 Custom AI Chips to Power Recommendations and Generative AI

Meta announced four new custom AI chips on March 12, 2026, marking a significant expansion of its in-house silicon program. The chips are designed to handle different stages of Meta's AI workloads across its family of apps.

MTIA 300 and the Chip Lineup

The MTIA 300 is the lead chip in the new lineup and is purpose-built to train the ranking and recommendation systems that determine what users see across Facebook, Instagram, and Threads. These recommendation systems represent some of the most computationally intensive workloads Meta runs at scale.

Later chips in the series are intended to broaden coverage to generative AI inference — supplying the compute required to serve AI-generated content, conversational features, and other generative capabilities — through at least 2027.

Why Meta Is Building Its Own Chips

Like Google (TPUs) and Amazon (Trainium/Inferentia), Meta is investing in custom silicon to reduce its dependence on NVIDIA hardware, lower the cost per unit of compute, and optimize performance for its specific workload patterns. Ranking and recommendation models have different characteristics than language model training, and purpose-built chips can deliver meaningful efficiency gains over general-purpose GPUs at Meta's scale.

Context

The chip announcement came alongside Meta's acquisition of Moltbook, an AI agent social network, and follows the company's broader push to establish leadership across both AI infrastructure and AI-native consumer products.

References

This article was originally published at Tech Startups. For the full piece, read the original article.
