Making Siri Smarter: Apple Explores AI on iPhone

Apple's latest research suggests it plans to catch up with rivals in AI technology by running large language models directly on iPhones.


Apple's Research on Smartphone AI

Apple's recent research paper, titled "LLM in a Flash," indicates the company's intention to improve its AI capabilities by running large language models (LLMs) on smartphones. The paper proposes a solution to the memory bottleneck that arises when running LLMs on devices with limited DRAM.

Traditionally, LLMs, including those used in apps like ChatGPT, require the computing power of vast data centers. However, Apple's research aims to pave the way for effective LLM inference directly on an iPhone.
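To make the core idea concrete, here is a minimal, hypothetical sketch (not Apple's actual implementation) of the general technique behind flash-offloaded inference: keep the weight matrix in flash storage (here, a file on disk), memory-map it, and read into RAM only the rows needed for the neurons expected to be active, instead of loading the whole model. All names and sizes below are illustrative.

```python
import mmap, os, struct, tempfile

# Hypothetical sketch: a file on disk stands in for flash storage.
ROWS, COLS = 1024, 64
path = os.path.join(tempfile.mkdtemp(), "weights.bin")

# Write a ROWS x COLS float32 weight matrix to "flash".
# Row r is filled with the constant 0.01 * r so results are checkable.
with open(path, "wb") as f:
    for r in range(ROWS):
        f.write(struct.pack(f"{COLS}f", *([0.01 * r] * COLS)))

def load_row(mm, r):
    """Unpack one row from the memory-mapped file; only the pages
    backing this row get faulted into DRAM."""
    off = r * COLS * 4
    return struct.unpack(f"{COLS}f", mm[off:off + COLS * 4])

with open(path, "rb") as f, \
        mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as mm:
    x = [1.0] * COLS
    active = [3, 500]  # neurons predicted to produce non-zero output
    # Compute dot products only for the active rows.
    out = {r: sum(w * xi for w, xi in zip(load_row(mm, r), x))
           for r in active}
```

The point of the sketch is that the cost of a forward pass scales with the number of active rows actually read from flash, not with the full model size; the paper layers further optimizations (such as reusing recently loaded weights) on top of this basic idea.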

The paper was published on December 12 and has gained attention within the AI research community, following Apple's earlier efforts to enable image-generating models on its custom chips.

Apple's Catch-Up Strategy

While Apple was one of the first companies to introduce a virtual assistant, Siri, it has been viewed as lagging behind its Big Tech rivals in the field of generative AI. However, Apple's research suggests a different approach. Instead of relying on cloud computing platforms, Apple aims to develop AI that can run directly on iPhones.

This strategy aligns with the plans of other smartphone manufacturers, such as Samsung, which is preparing to launch AI-focused smartphones. Counterpoint Research estimates that over 100 million AI-focused smartphones will be shipped in 2024, and that 40 percent of new devices will offer AI capabilities by 2027.

By bringing AI functionality directly to smartphones, Apple and other manufacturers hope to revive the declining smartphone market and offer users more advanced features and capabilities.

Technical Challenges and Privacy Benefits

Running large AI models on personal devices presents significant technical challenges, since smartphones have far less computing power and battery capacity than data centers. Solving this problem, however, could yield faster AI responses and the ability to work offline.

Beyond the technical advantages, running AI on personal devices offers privacy benefits, since queries can be answered without sending data to the cloud. This fits Apple's emphasis on privacy in recent years, a stance that has distinguished it from its competitors.

While the research is not a direct indication of product plans, it offers insight into Apple's ongoing efforts to optimize AI performance on personal devices and to unlock the full potential of LLMs on its hardware.