AI quietly enhances the iPhone and Apple Watch.
Reuters || Shining BD
Unlike other companies pursuing sweeping AI transformations, Apple (AAPL.O) is using the new technology to improve the fundamental features of its latest devices.
While showcasing a new iPhone lineup and a watch built on upgraded semiconductor designs that power the new AI features, Apple avoided the term "artificial intelligence" when describing the technology. Most of the features improve everyday tasks such as making calls and taking better pictures.
Artificial intelligence also went unmentioned at Apple's June developer conference, even though it has for months been quietly reshaping the company's core software products behind the scenes.
By contrast, Microsoft (MSFT.O) and Alphabet's (GOOGL.O) Google have set ambitious transformation goals for their AI efforts, even as industry leaders warn about the potential harms of unchecked development of new tools such as generative AI.
Apple built the Series 9 Watch around a new chip with improved data-crunching capabilities, including a four-core "Neural Engine" that can process machine learning tasks up to twice as fast. Neural Engine is Apple's name for the building blocks in its chips that accelerate AI functions.
The AI components of the watch chip make Siri, Apple's voice assistant, 25% more accurate.
The machine learning components also enabled Apple to launch a new way to interact with the device: users can "double tap" by pinching the fingers of their watch hand to answer or end phone calls, pause music, or pull up information such as the weather.
The idea is to give people a way to control the Apple Watch when their other hand is occupied, whether holding a cup of coffee or walking a dog. The feature uses the new chip and machine learning to detect the subtle movements and changes in blood flow that occur when users tap their fingers together.
The iPhone maker also showed off improved image capture across its phone lineup. The company has long offered a "portrait mode" that blurs backgrounds computationally to simulate a large camera lens, but users had to remember to turn the feature on. Now the camera automatically recognizes when a person is in the frame and gathers the data needed to blur the background later.
Apple is far from the only smartphone maker to add AI to its hardware. Google's Pixel phones, for example, allow users to erase unwanted people or objects from images.