Meta has unveiled the latest addition to its artificial intelligence (AI) hardware efforts: the next-generation Meta Training and Inference Accelerator (MTIA). Establishing itself in the generative AI space has become imperative for Meta as it looks to keep pace with competitors and reshape its operational economics.

The next-gen MTIA builds on MTIA v1, the chip Meta introduced last year. The new design packs more processing cores and more on-chip memory, and Meta says it delivers three times the performance of its predecessor. In practical terms, the chip powers Meta's advertising models, particularly the ranking and recommendation of display ads across Meta-owned platforms, including Facebook.

Despite its advanced features, the next-gen MTIA is not currently used for generative AI training workloads. Meta also emphasized that the chip will not render GPUs obsolete; rather, it is intended as a complementary solution to keep operations running smoothly.

Building in-house hardware such as the new chip is a strategic move for Meta: the company is looking at roughly $18 billion in GPU costs by the end of 2024 to operate its AI models. By developing its own AI chips, Meta not only strengthens its technological position but also stands to trim operational costs significantly.

Nevertheless, the AI hardware race is a crowded one. Tech titans such as Google, Amazon, and Microsoft have already made significant strides toward self-reliant AI chip technology. Google's fifth-generation AI chip, the TPU v5p, is now available to Google Cloud customers, and the company has also announced Axion, its first custom Arm-based CPU, aimed in part at AI workloads.

Although Meta moved from first silicon to production of the next-gen MTIA in under nine months, the road to complete self-reliance is long and arduous; in the grand scheme of things, this is a small step. Meta's real test lies in weaning itself off third-party GPUs entirely while forging its path in a field crowded with formidable competitors.

The future of AI technology may well hinge on this chip revolution. As Meta ventures further into proprietary AI hardware, it opens the door to performance optimizations, cost reductions, and a stronger position in the rapidly growing AI space.