Apple’s push into artificial intelligence, branded “Apple Intelligence,” has arrived with significant fanfare but also notable controversy, particularly surrounding the highly anticipated personalized Siri features. While Apple touts a new era of helpful, on-device AI, the delayed rollout of core functionalities and questions linger about promises made a year ago.
The discussion among tech commentators centers on whether Apple is truly delivering on its AI vision or if key aspects remain, as some critics contend, “vaporware”—products promised but not yet shipped.
The “Vaporware” Accusation and the 2024 Demo
At the heart of the debate is the personalized Siri demo showcased during the WWDC 2024 keynote. Features like Siri understanding personal context, such as knowing who “mom” is and checking her flight details by accessing calendar or email data, were highlighted as key capabilities. However, these specific features have not shipped as initially suggested (some were advertised as “coming fall 2024,” others “over the next year”).
Critics, including veteran Apple watchers, argue that the 2024 demo itself fell short of Apple’s historically stringent internal standards for keynote presentations. Typically, Apple requires demos to show features working in a single, unedited take, even if the final keynote video is edited for pacing. The personalized Siri segment from 2024, however, was heavily cut, switching between presenter and purported iPhone screen results without showing the interaction flow in real-time. This editing style, unusual for Apple, raised red flags.
Adding fuel to the fire, sources within Apple’s software engineering teams have reportedly indicated they did not see internal builds of iOS featuring this personalized Siri functionality in a working state before the WWDC 2024 keynote. While some “working code” might have existed, insider accounts suggest it was likely unreliable and far from shippable quality, breaking the fundamental rule that keynote demos must be possible to replicate in a live setting.
The term “vaporware,” defined simply as something promised but not yet released, is therefore applied because features announced in 2024 remain unavailable in 2025 or beyond. For users, whether the feature was partially functional internally a year ago is less relevant than the fact it still hasn’t arrived on their devices.
Execution Challenges and Delays
Apple has acknowledged delays for the personalized Siri features, stating they are taking longer than anticipated to meet the company’s quality and performance standards. These are the same features requiring Siri to have enhanced awareness of a user’s personal context, understand what’s on screen, and perform actions within and across different applications.
The technical challenges behind these delays might be substantial. One significant risk factor, particularly for features accessing personal data and interacting with apps based on user input (like emails), is security vulnerability, such as prompt injection attacks. Ensuring the AI can safely process potentially untrusted text inputs without allowing malicious instructions to compromise user data or exfiltrate information is critical, and this deep system integration likely complicates development and rigorous testing.
Furthermore, while the highly anticipated contextual Siri features are delayed until iOS 19 (pegged possibly for 2026 despite Apple’s vague “in the coming year” phrasing, which some feel is misleading), the Apple Intelligence features that have shipped currently are receiving mixed, often underwhelming, reviews. Tools like Image Playground, Genmoji, Writing Tools, and Visual Intelligence are functional but often criticized as slow, limited, and inferior to existing third-party alternatives from companies like OpenAI (ChatGPT), Google (Gemini, Google Lens), and Meta. Even the core Siri experience with current Apple Intelligence integrations has been described by some users as feeling worse or still struggling with basic tasks, frequently relying on offloading complex requests to ChatGPT rather than demonstrating deep on-device intelligence.
Apple’s Strategic Position in AI
Commentary suggests Apple is navigating a complex AI landscape. Unlike its long-term, iterative, and integration-focused approach with Apple Silicon chips (which ironically position Macs with M3 Ultra and high RAM as powerful local AI machines capable of running large models), its push into AI features appears more reactive to the current market moment.
While Apple’s initial strategy focused on leveraging unique on-device private data for specific, safer use cases, the delay in delivering these core features indicates the difficulty in executing this vision reliably and securely. Some analysts propose Apple might be better served by acting as an AI Platform—opening its powerful on-device models and ecosystem capabilities to developers—rather than primarily an AI Aggregator that uses developer data for its own limited features.
The current situation paints a picture where Apple seems to be playing catch-up on the features front, facing skepticism over past demonstrations, and releasing initial AI tools that don’t yet live up to the promise or compete effectively with established alternatives. As Apple continues development, the tech world watches to see if the eventual release of personalized Siri features will finally justify the hype and overcome the “vaporware” label they currently bear.
References
- <a href="https://daringfireball.net/2025/06/thetalkshowlivefromwwdc2025″>daringfireball.net
- daringfireball.net
- stratechery.com
- tech.yahoo.com
- appleinsider.com