Thursday, March 12, 2026

Apple barely talked about AI during its big iPhone 17 event


The much-anticipated Apple event was full of news about AirPods, the Apple Watch, iPhones and more. But the presentation, which at just an hour and 15 minutes was much shorter than usual, was notably light on one buzzword amid all the fanfare: AI.

Although CEO Tim Cook said during the livestream that the company had made "the biggest jump" yet for the iPhone, Apple Intelligence received only cursory mentions during the new iPhone segments.

Yes, the phones represent "advances in Apple silicon, hardware and software," according to Apple, meaning they are better for gaming, photos, speed, battery life and more. But most of the consumer-facing Apple Intelligence tools that were promoted, such as Visual Intelligence features and live translation in Messages and FaceTime, were already talking points back in June at WWDC 2025. And they aren't necessarily groundbreaking, given that Apple competitors such as Google and Samsung introduced comparable features a year ago.

The presentation was a far cry from Apple's heavy AI messaging at the iPhone 16 unveiling, which led to public disappointment when some flagship features failed to ship as promised.

This year, Apple talked more about how AI powers features in the background, and less about putting AI front and center for consumers, unlike Google's Pixel 10 unveiling last month and Samsung's event in January. Apple is too far behind in its agentic AI efforts to make its assistant the star in the same way, so at this event it focused on hardware and on how AI helps power things behind the scenes.

Executives talked about how the updated Neural Engine powers Apple Intelligence and how on-device large language models drive better gaming with higher frame rates. They mentioned that Apple is now building neural accelerators into each GPU core to deliver "MacBook Pro levels of compute" in the iPhone, enabling intensive AI workloads.

But when discussing how AI relates to the new AirPods, Apple leaned heavily on live translation and heart-rate monitoring, rather than taking Google's approach with the new budget Pixel Buds 2a, which were pitched on how they can be used to talk to Gemini AI. Executives mentioned how an advanced on-device computational model connects to the Apple Intelligence model running on the iPhone for live translation, and for the heart-rate sensors they emphasized the role of machine-learning algorithms powering the on-device AI model for activity and calorie tracking, trained on more than 50 million hours of data from over 250,000 participants in an Apple study.

The stakes in the AI arms race have never been higher for companies fighting for first place, partly because they have invested so much in their AI efforts, and those efforts are not cheap. OpenAI has been valued at $300 billion this year and is reportedly expected to burn through $115 billion by 2029. Anthropic recently raised $13 billion at a $183 billion valuation. In the last few months alone, Meta has spent billions of dollars to hire top industry researchers, after investing over $14 billion in AI. And that's just to name a few.

Apple has long been criticized for lagging in the AI race. Part of that is reflected in at least 10 reported departures recently from its AI research division, including four last week. Jian Zhang, the lead of robotics research, reportedly left for Meta, and three other AI researchers reportedly left Apple's foundation models team, with two going to OpenAI and one to Anthropic.
