
Meta and Qualcomm team up to run big AI models on phones

Cristiano Amon, president and CEO of Qualcomm, speaks during the Milken Institute Global Conference on May 2, 2022 in Beverly Hills, California.

Patrick T. Fallon | AFP | Getty Images

Qualcomm and Meta will enable the social network’s new large language model, Llama 2, to run on Qualcomm chips in phones and PCs starting in 2024, the companies announced Tuesday.

Until now, LLMs have mostly run in large server farms on Nvidia GPUs because of the technology’s huge computing and data needs, boosting Nvidia’s stock, which is up more than 220% this year. But the AI boom has largely bypassed companies that make advanced phone and PC processors, like Qualcomm. Its shares are up about 10% in 2023, lagging the Nasdaq’s 36% gain.

Tuesday’s announcement suggests Qualcomm wants to position its processors as AI-ready, but “at the edge,” meaning on the device itself, rather than “in the cloud.” If large language models can run on phones instead of in large data centers, that could significantly reduce the cost of running AI models and lead to better, faster voice assistants and other applications.

Qualcomm will make Meta’s open-source Llama 2 models available on Qualcomm-powered devices, which it believes will enable applications such as intelligent virtual assistants. Meta’s Llama 2 can do many of the same things as ChatGPT, but it can be packaged into a smaller program, which allows it to run on a phone.
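To give a rough sense of what “packaged into a smaller program” looks like in practice, here is a minimal sketch of running a quantized Llama 2 chat model locally with the open-source llama-cpp-python bindings. This is only an illustration of on-device-style inference under assumed settings; the GGUF filename is an assumption, and it is not Qualcomm’s actual on-device stack.

```python
# Minimal sketch: running a quantized (4-bit) Llama 2 chat model locally
# with llama-cpp-python, the Python bindings for the open-source llama.cpp
# runtime. Illustration only -- not Qualcomm's on-device software stack.
from llama_cpp import Llama

llm = Llama(
    model_path="llama-2-7b-chat.Q4_K_M.gguf",  # assumed filename for a ~4 GB quantized build
    n_ctx=2048,    # context window
    n_threads=4,   # modest CPU budget, roughly phone/laptop class
)

out = llm(
    "Q: What does running an AI model 'at the edge' mean? A:",
    max_tokens=64,
    stop=["Q:"],
)
print(out["choices"][0]["text"].strip())
```

Quantizing the weights to 4 bits is what shrinks a multi-gigabyte model enough to fit in the memory of a phone-class device, at some cost in accuracy and speed.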

Qualcomm’s chips include a “tensor processor unit,” or TPU, which is well-suited for the kinds of computations that AI models need. However, the amount of computing power available on a mobile device pales in comparison to a data center stocked with advanced GPUs.

Meta’s Llama is unusual in that Meta has published its “weights,” the set of numbers that determine how a particular AI model behaves. This lets researchers, and eventually commercial enterprises, run the models on their own computers without asking permission or paying. Other well-known LLMs, such as OpenAI’s GPT-4 or Google’s Bard, are closed source, and their weights are kept secret.
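As an illustration of what “publishing the weights” means, the sketch below loads a local Llama 2 checkpoint with the Hugging Face transformers library and inspects a few of its parameter tensors. The local directory path is an assumption; the point is simply that the weights are ordinary arrays of numbers anyone can load and run.

```python
# Minimal sketch: the published "weights" are just named tensors of numbers
# that parameterize the model. Assumes a Llama 2 checkpoint has already been
# downloaded to a local directory in Hugging Face format (path is assumed).
import torch
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "./llama-2-7b-hf",         # assumed local directory holding the weights
    torch_dtype=torch.float16,  # half precision keeps memory use down
)

# Count the parameters and peek at the first few weight tensors.
total = sum(p.numel() for p in model.parameters())
print(f"{total / 1e9:.1f}B parameters")
for name, p in list(model.named_parameters())[:3]:
    print(name, tuple(p.shape))
```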

Qualcomm has worked closely with Meta in the past, most notably on chips for Meta’s Quest virtual reality devices. It has also demonstrated AI models running slowly on its chips, such as the open-source image generator Stable Diffusion.
