AI PCs need apps with broad use cases to gain traction
There are many ways local AI on PC hardware can help users, but the broader use cases aren't there yet. Learn about the emerging AI PC market and where it still needs to grow.
In an effort to avoid feeding the AI hype cycle, I've tried to focus on generative AI and its intersection with the day-to-day lives of end users and IT administrators.
While there have been some advancements on the admin side of the equation, the end-user side has been mostly limited to an endless stream of web services and copilots -- lowercase -- with Microsoft Copilot leading the charge.
As we enter the summer of the first year of the AI PC, I've been looking for use cases that both showcase the power of local compute dedicated to AI workloads and are broadly applicable. Effectively, I'm looking for where the AI rubber meets the road of end-user computing.
After all, these AI PCs will be more costly, so deploying them to all users requires a value proposition that goes beyond some of the eye candy demos we've seen so far.
Last year's eye candy is this year's table stakes
Windows Studio Effects was among the very first local AI demonstrations, and it's effectively the minimum expectation for AI at the endpoint. In the desktop virtualization world, it would be akin to publishing Notepad. The thing is, while it's cool to see and hear, it's not likely to drive increased budgets. Sure, it can blur the background without burdening the CPU, but I can do that without an AI PC.
There are more advanced features, such as audio and video enhancements, but what are they really worth to a company? Do they increase productivity? Do they lower costs enough to justify buying a new PC to take advantage of them? For me, they're more of a nice side effect than a foundational reason to shift budget to AI PCs.
The same can be said for OmniBridge, which uses AI to perform real-time translation for users who depend on American Sign Language to communicate. This is an amazing, life-changing technology that's incredibly useful, but only to a limited number of people. That's not to say it isn't deserving of investment, but it's not the kind of thing that will drive widespread AI PC adoption.
There are other things related to communications that are both interesting and useful to some degree, but a killer app that flat-out requires AI PCs to move the business along has yet to appear in that area.
Security emerges as the next-lowest branch
After this year's RSA Conference, it's pretty clear that security has emerged as the next-lowest branch on the tree of end-user-facing AI use cases. Though the demos are less eye- and ear-catching than their audio and visual counterparts, the impact on end users and IT security initiatives is tangible.
Everyone had something about AI in their booth, which struck me as odd -- if any group should know that AI has been around for a very long time, it's the RSA crowd. For me, it was proof that we're still climbing the AI hype cycle.
While there was an endless stream of "chatbot this" and "natural language query that," a few things did emerge that could help forecast advanced AI PC adoption. For example, CrowdStrike spoke about how its CPU consumption dropped from 35% to 1% when used on machines equipped with an Intel neural processing unit (NPU). This, coupled with the ability to reduce the amount of data sent to the cloud, results in improved security, faster detection and less dependency on the cloud service itself.
Another use case that I've come across is Bufferzone NoCloud, which uses local NPU resources to analyze websites for phishing scams using computer vision and natural language processing. This local approach results in fast performance without the need to send sensitive data to the cloud and back.
We still need to see the value
These are just a few examples of what is happening today with technology that uses local NPU hardware. As more emerge, I'll be particularly interested in universally applicable examples that could drive demand for AI PCs. We're already seeing new devices entering the market, and if AI tera operations per second (TOPS) hasn't entered your lexicon yet, it will soon.
Just as CPU speed isn't the only factor in overall system performance, TOPS is only one of many factors that determine AI performance -- but, in general, that's where most conversations seem to start.
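For readers new to the metric, the headline TOPS figure is usually peak multiply-accumulate throughput, typically quoted at low precision such as INT8. The short Python sketch below is a rough, back-of-the-envelope illustration using made-up numbers -- not the spec of any shipping NPU -- showing why a peak TOPS rating and delivered performance can diverge.

    # Hypothetical NPU figures, chosen only for illustration
    mac_units = 4096      # parallel multiply-accumulate units (assumed)
    clock_ghz = 1.5       # NPU clock speed in GHz (assumed)
    ops_per_mac = 2       # each MAC counts as two operations: one multiply, one add

    # Peak TOPS: the headline number datasheets tend to quote, usually at INT8
    peak_tops = mac_units * ops_per_mac * clock_ghz * 1e9 / 1e12
    print(f"Peak: {peak_tops:.1f} TOPS")            # ~12.3 TOPS

    # Real workloads rarely hit peak; memory bandwidth, model precision and
    # software support all cut into it. Assume 40% utilization for illustration.
    effective_tops = peak_tops * 0.40
    print(f"Effective: {effective_tops:.1f} TOPS")  # ~4.9 TOPS

In other words, two devices with the same TOPS rating can behave very differently once real models, memory and drivers enter the picture.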
The hardware is ahead of the software, which feels odd. For widespread adoption of these workloads, we need universally applicable capabilities that go beyond what we've seen so far. I think we're on the right track, but hardware and software vendors need to do more to show the value to the business, in addition to the less tangible "possibilities." Hopefully, that gets easier as momentum builds.
Gabe Knuth is the senior end-user computing, endpoint security, and email security analyst for TechTarget's Enterprise Strategy Group. He writes publicly for TechTarget in addition to his analyst work. If you'd like to reach out, see his profile on LinkedIn or send an email to [email protected].
Enterprise Strategy Group is a division of TechTarget. Its analysts have business relationships with technology vendors.