
"During the Snapdragon Summit on Maui, Cristiano Amon, CEO of Qualcomm, gave a glimpse into where the (mobile) ecosystem they provide with chips is heading. Qualcomm envisions a future in which AI moves from the cloud to your devices, taking care of everything for you in every possible way. Qualcomm invited us to attend the Snapdragon Summit, where two new chips were presented: a new smartphone and a new compute chip. The latter is primarily intended for laptops and mini PCs."
"These new chips feature the standard improvements in terms of CPU and GPU, but the focus seems to be mainly on the so-called NPU. This is used by AI models. The better the NPU, the higher the number of TOPS, and the better the chip can perform AI tasks (inferencing). AI currently comes mainly from the cloud Currently, most AI models are still run in the cloud."
"A few years ago, we believed that AI models only improved as they grew in size. That theory is now outdated, as we are seeing more and more smaller models emerging that are many times better. Several models can be run on laptops and high-end smartphones. OpenAI has the gpt-oss-20B model, Google has Gemini Nano, and Meta has smaller Llama models."
Qualcomm unveiled two new Snapdragon chips, including a compute chip for laptops and mini PCs, with standard CPU/GPU gains and a pronounced emphasis on NPU performance. Higher NPU TOPS improve on-device AI inferencing. Most AI currently runs in the cloud via services such as ChatGPT, Claude, Copilot, and Gemini, but cloud-only delivery cannot scale globally. Expect simple tasks (photo editing, email drafting, task planning, calendar scheduling) to migrate to local execution. Smaller, optimized models (gpt-oss-20B, Gemini Nano, Llama variants) are improving rapidly and can already operate on high-end smartphones and laptops, enabling increasingly capable on-device AI.
Read at Techzine Global