Arm wants to upgrade the brains inside our devices.
Arm said the models run seamlessly on its compute platforms.
The smaller, text-based Llama 3.2 1B and 3B models are optimised for Arm-based mobile chips.

Consequently, the models can deliver faster user experiences on smartphones.
Processing more AI at the edge can also save energy and costs.
These enhancements offer new opportunities to scale.

By increasing the efficiencies of LLMs, Arm can run more AI directly on smartphones.
For developers, that could lead to faster innovations.
Arm expects endless new mobile apps to emerge as a result.
LLMs will perform tasks on your behalf by understanding your location, schedule, and preferences.
Routine tasks will be automated and recommendations personalised on-machine.
Your phone will evolve from a command-and-control tool into a proactive assistant.
Arm aims to accelerate this evolution.
The UK-based business wants its CPUs to provide the foundation for AI everywhere.
Arm has an ambitious timetable for this strategy.
By 2025, the chip giant wants more than 100 billion Arm-based devices to be AI ready.
Story by Thomas Macaulay
Thomas is the managing editor of TNW.
He leads our coverage of European tech and oversees our talented team of writers.
Away from work, he enjoys playing chess (badly) and the guitar (even worse).