MediaTek Enhances On-Device Generative AI Processing

August 28, 2023, 01:10
MediaTek already powers more than two billion connected edge devices every year, and the company has now confirmed that it is working closely with Meta to bring Llama 2, Meta's next-generation open-source Large Language Model (LLM), to the edge. By combining this recently announced LLM with its latest APUs and the NeuroPilot AI Platform, MediaTek aims to build a complete edge computing ecosystem designed to accelerate AI application development on smartphones, consumer connected devices, vehicles, smart home products, and more.

Llama 2 was recently introduced by Meta, in partnership with Microsoft, as the next-generation open-source large language model. Llama 2 is free for research and commercial use, and it is already being offered to a broad set of companies and projects as a way to foster momentum for AI research and applications beyond hosted prompting interfaces.

Presently, most Generative AI processing is performed in the cloud; however, MediaTek's use of Llama 2 models will enable generative AI applications to run directly on-device as well. Doing so provides several advantages to developers and users, including seamless performance, greater privacy, better security and reliability, lower latency, the ability to work in areas with little to no connectivity, and lower operating costs.
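As a rough illustration of the on-device flow described above (this is a generic sketch using the open-source llama-cpp-python bindings, not MediaTek's NeuroPilot stack, and the model path and generation parameters are placeholders), inference can run entirely on local hardware, with no network round trip:

```python
# Illustrative only: a generic on-device inference flow using the open-source
# llama-cpp-python bindings, not MediaTek's actual software stack.
from llama_cpp import Llama

# Hypothetical path to a quantized Llama 2 model file stored on the device.
llm = Llama(model_path="models/llama-2-7b-chat.q4.bin", n_ctx=2048)

# Generation happens entirely on-device: prompts and outputs never leave the
# phone, and latency depends only on local compute and memory bandwidth.
output = llm(
    "Summarize the benefits of on-device generative AI in one sentence.",
    max_tokens=64,
    temperature=0.7,
)
print(output["choices"][0]["text"])
```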

To truly take advantage of on-device Generative AI technology, edge device makers will need to adopt high-performance, low-power AI processors and faster, more reliable connectivity to enhance computing capabilities. Every MediaTek-powered 5G smartphone SoC shipped today is equipped with APUs designed to perform a wide variety of Generative AI features, such as AI Noise Reduction, AI Super Resolution, AI MEMC and more.

Additionally, MediaTek's next-generation flagship chipset, to be introduced later in 2023, will feature a software stack optimized to run Llama 2, as well as an upgraded APU with Transformer backbone acceleration, a reduced memory footprint, and more efficient use of DRAM bandwidth, further enhancing LLM and AIGC performance. These advancements promise to accelerate new use cases for on-device Generative AI.
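To see why memory footprint and DRAM bandwidth matter so much for on-device LLM performance, consider a back-of-envelope sketch: per-token decoding is largely bound by how many bytes of weights must be streamed from DRAM. The figures below are assumptions for illustration, not MediaTek specifications.

```python
# Back-of-envelope sketch (assumed figures, not MediaTek specifications):
# each generated token requires streaming the model weights from DRAM, so
# lower-precision weights directly raise the achievable tokens per second
# for a given sustained memory bandwidth.

PARAMS = 7e9            # assume a Llama 2 7B-class model
BYTES_FP16 = 2.0        # 16-bit weights
BYTES_INT4 = 0.5        # 4-bit quantized weights
DRAM_BW_BPS = 60e9      # hypothetical sustained DRAM bandwidth (bytes/s)

for label, bytes_per_param in [("fp16", BYTES_FP16), ("int4", BYTES_INT4)]:
    weight_bytes = PARAMS * bytes_per_param   # bytes read per generated token
    tokens_per_s = DRAM_BW_BPS / weight_bytes
    print(f"{label}: ~{weight_bytes / 1e9:.1f} GB per token, "
          f"~{tokens_per_s:.1f} tokens/s at the assumed bandwidth")
```

Under these assumptions, 4-bit weights move roughly a quarter of the data per token compared with 16-bit weights, which is why quantization and bandwidth-efficient APU design are central to practical on-device LLMs.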

"The increasing popularity of Generative AI is a significant trend in digital transformation, and our vision is to provide the exciting community of Llama 2 developers and users with the tools needed to fully innovate in the AI space," says JC Hsu, Corporate Senior Vice President and General Manager of Wireless Communications Business Unit at MediaTek. "Through our partnership with Meta, we can deliver hardware and software with far more capability in the edge than ever before."

MediaTek expects Llama 2-based AI applications to become available for smartphones powered by the next-generation flagship SoC, scheduled to hit the market by the end of the year. The next logical step will be to integrate lower-power models for wearables and hearables, the next large categories of connected devices that stand to clearly benefit from running AI on-device.
https://www.mediatek.com