Meta Introduces New In-House MTIA v2 AI Accelerator Chip

US tech giant Meta Platforms Inc. has revealed its new in-house custom artificial intelligence (AI) chip, designed to make model training more efficient and inference easier.

The next-gen chip is the successor to the Meta Training and Inference Accelerator (MTIA) v1. It is designed chiefly to accelerate the ad ranking and recommendation models running across Facebook and Instagram.

The Silicon Valley firm said MTIA v2 focuses on providing ‘the right balance of compute, memory bandwidth, and memory capacity.’

Internally referred to as ‘Artemis,’ the latest chip has been in development for some time and was previously reported to handle inference tasks only.

Early trials conducted by the social media group showed that the chip, which features 256MB of memory and a 1.3GHz clock speed, is three times more powerful than its predecessor across the four models the firm assessed.

The latest MTIA chip has also been deployed to 16 data center regions, according to Meta.

Tech Firms’ Growing In-House Chip Movement

Demand for computing power has strengthened further following Meta’s push into AI services.

In 2023, the Facebook parent introduced Llama 2, its own large language model (LLM), aimed at rivaling the dominance of OpenAI Inc.’s ChatGPT. The company also added new generative AI features to its social media apps, including custom stickers and celebrity AI chatbots.

Furthermore, Meta has invested significantly in developing the software needed to use its infrastructure efficiently.

The firm said in October that it intends to invest about $35 billion in infrastructure, including data centers and hardware, to support its AI efforts. Meta chief executive Mark Zuckerberg expects AI to be the largest area of the company’s spending this year.

A considerable portion of that investment is still expected to go to US chip giant and AI darling Nvidia Corp., developer of the H100 graphics processor.

Zuckerberg said earlier this year that the firm plans to buy 350,000 H100 chips. Combined with supplies from other vendors, Meta expects to have the equivalent of roughly 600,000 H100s by the end of 2024.

Other AI companies are also considering building their chips in-house as demand for compute power grows alongside AI adoption.

In 2017, Alphabet Inc.’s Google LLC revealed its TPU chips, while Microsoft Corp. announced its Azure Maia 100 chip last year. US e-commerce giant Amazon.com Inc. also unveiled its second-gen Trainium chip in the same year, which is expected to train models four times faster than its predecessor.

The race to acquire powerful chips underscores the strong demand for custom AI silicon. The need for chips has grown so substantially that Nvidia, which currently leads the AI competition, became the world’s third most valuable tech company with a valuation of over $2 trillion.
