Apple aims to run AI models directly on iPhones, other devices


“As far as the chips in their devices, they are definitely being more and more geared towards AI going forward from a design and architecture standpoint,” said Dylan Patel, an analyst at semiconductor consulting firm SemiAnalysis.

Apple researchers published a paper in December announcing that they had made a breakthrough in running LLMs on-device by using flash memory, meaning queries can be processed faster, even offline.

In October, it released an open-source LLM in partnership with Columbia University. “Ferret” is at present limited to research purposes and in effect acts as a second pair of eyes, telling the user what they are looking at, including specific objects within an image.

“One of the problems of an LLM is that the only way of experiencing the world is through text,” said Amanda Stent, director of the Davis Institute for AI at Colby College. “That’s what makes Ferret so exciting: you can start to literally connect the language to the real world.” At this stage, however, the cost of running a single “inference” query of this kind would be huge, Stent said.

Such technology could be used, for example, as a virtual assistant that can tell the user what brand of shirt someone is wearing on a video call, and then order it through an app.

Microsoft recently overtook Apple as the world’s most valuable listed company, with investors excited by the software group’s moves in AI.

Still, Bank of America analysts last week upgraded their rating on Apple stock. Among other things, they cited expectations that the iPhone upgrade cycle will be boosted by demand for new generative AI features due to arrive this year and in 2025.

Laura Martin, a senior analyst at Needham, the investment bank, said the company’s AI strategy would be “for the benefit of their Apple ecosystem and to protect their installed base.”

She added: “Apple doesn’t want to be in the business of what Google and Amazon want to do, which is to be the backbone of all American businesses that build apps on large language models.”

© 2024 The Financial Times Ltd. All rights reserved. Not to be redistributed, copied, or modified in any way.