by Max A. Cherney
(Reuters) – Unlike other companies pursuing sweeping change with artificial intelligence, Apple is using the emerging technology to improve basic functions in its new gadgets.
Without using the term “artificial intelligence” to describe the emerging technology, Apple showcased a new series of iPhones and a new watch with improved semiconductor designs that power new AI features. Those features improve basic functions such as taking calls and capturing better pictures.
Artificial intelligence did not even come up at Apple’s June developer conference, but the technology has been quietly reshaping the company’s core software products for months.
In contrast, Microsoft and Alphabet’s Google have set ambitious goals for how transformative their AI efforts will be, even as industry leaders warn about the potential harms of unchecked development of new tools such as generative AI.
Apple built the Series 9 Watch with a new chip that has better data-crunching capabilities, including a four-core “neural engine” that can process machine-learning tasks twice as fast. Neural Engine is Apple’s name for the blocks of its chips that speed up AI tasks.
The watch chip’s AI components make Siri, Apple’s voice assistant, 25% more accurate.
But the machine-learning chip components also helped Apple launch a new way to interact with the device: users can “double tap” by pressing a finger and thumb together to answer or end a phone call, pause or start music, or pull up information such as the weather.
The idea is to give people a way to control the Apple Watch when their other hand is occupied, say, holding a cup of coffee or walking the dog. The feature works by using the new chip and machine learning to detect subtle movements and changes in blood flow when users tap their fingers together.
The iPhone maker also showed off improved image capture on its new phones. The company has long offered a “Portrait Mode” that blurs the background, using computing power to simulate a large camera lens. But users had to remember to turn the feature on. Now, the camera automatically detects when a person is in the frame and collects the data needed to blur the background.
Apple isn’t the only smartphone maker to add AI to its hardware. For example, Google’s Pixel phones allow users to erase unwanted people or objects from images.
(Reporting by Max A. Cherney in San Francisco; Editing by Lisa Shumaker)