AI hardware capable of running local large language models, generative AI, and more won't matter unless vendors make local AI apps easy to develop and easy to access.
At MWC 2024, Qualcomm unveiled the Qualcomm AI Hub with over 75 AI models optimized for Qualcomm and Snapdragon platforms, and demonstrated a seven-billion-parameter LLM running locally on a PC and on Snapdragon phones.
However, PC and chip vendors must present real-world AI examples to encourage local AI usage, particularly amid fierce competition from smartphones and the cloud. Microsoft’s focus on cloud-based AI with minimal commitment to local AI further underscores the need for software-driven solutions.
The answer is apps: lots and lots of apps
Microsoft's AI Hub in the Microsoft Store offers little more than chatbot "apps" that require subscriptions, which does nothing to show what local AI apps could accomplish with a PC's own resources.
While some vendors, such as MSI with its "AI Artist" application, have shipped AI software of their own, local AI options remain scarce even as AMD and Intel tout the AI performance of their chips.
For local AI to gain traction, chip vendors must move beyond developer initiatives to consumer-oriented strategies, such as offering trial subscriptions to AI-powered apps or bundling software with AI-centric PCs, to improve the user experience and drive adoption.
Ultimately, the success of local AI hinges on simplicity, affordability, and wide availability, and vendors should prioritize software-centric marketing to propel the adoption of local AI apps.