Apple has introduced some exciting AI-backed features on its iPhone 15 models. But can they be called generative AI, or is Apple working towards a new kind of AI?
At the recently held Wonderlust event, Apple stunned the world with its most innovative iPhone Pro models ever. However, the Cupertino-based tech giant was conspicuously silent on anything remotely related to artificial intelligence. Tech pundits see this as Apple's way of sidestepping the generative AI wave that has swept the tech industry.
Apple has clearly opted for something known as Intuitive AI in place of generative AI. The main function of Intuitive AI is to offer nuanced, subtle AI-backed improvements to everyday use cases such as photography and answering calls.
With its latest event, Apple has put a lid on the rumours and speculation surrounding its plans for infusing generative AI into its latest iPhones. It is worth noting, however, that Apple is reportedly continuing work on its generative AI framework Ajax, its answer to OpenAI's ChatGPT. The company had similarly chosen to ignore generative AI during its developer conference earlier this year.
What is Intuitive AI?
One can gain a better understanding of Apple's AI with reference to the new chip it designed for the iPhone 15 Pro models. Apple is shipping the Pro devices with what it calls its most powerful chip yet – the A17 Pro. The tech behemoth has designed the latest chip to add more power to its machine-learning algorithms.
The company has evidently focussed on AI that is intuitive rather than generative, and this was clear from the features it demonstrated during the event. These subtle changes range from smoothing out glitches to making seemingly harmless predictions.
How is Apple using Intuitive AI?
The most noteworthy feature is the use of machine learning to recognise the user's voice. This enables the device to quieten background noise on calls. The camera and computational photography also use AI features, including automatic detection of people and pets in a frame, which captures the depth information needed to turn these images into portraits at a later stage.
That is not all: Apple is also planning to introduce some more exciting features with its latest iOS 17 operating system, including more extensive predictive text suggestions from the keyboard and automated transcription of voicemails. While these additions may not be as sensational as an AI chatbot, they offer greater convenience to users.
AI and accessibility
Intuitive AI is not just confined to the above-mentioned use cases; Apple has also introduced it in its new accessibility features. The Point and Speak feature in the Magnifier app allows those with low vision to read labels on objects: they simply point the phone at the object and the device reads the label aloud. For users with speech issues, the latest OS from Apple can produce a synthetic voice similar to their own. To set this up, they simply need to read aloud 15 minutes of text prompts.
AirPods have also been updated with some stellar Intuitive AI features, such as Adaptive Audio, which blends music or calls with background voices. Similarly, the much-talked-about double tap gesture on the Apple Watch Series 9 is backed by machine learning.
At a time when generative AI is growing by leaps and bounds through tools like ChatGPT, DALL-E and many others, Apple's A17 Pro chip – which has a neural engine backed by machine-learning algorithms – is paving the way for a new kind of AI. Features like these, which lay more emphasis on ease of use, are likely to find resonance with more companies in the future. Although Apple has not officially used or described the term Intuitive AI, the above-listed features and varied commentary have led to its coinage.
Source: indianexpress.com