
Apple’s Visual Intelligence came as a surprise. Here’s what it will do on the iPhone 16 series

Apple’s recent launch of the iPhone 16 series has been notable for its design and for introducing a groundbreaking feature: Visual Intelligence.

Announced on September 9, this new AI-powered tool promises to change how users interact with their environment, offering capabilities similar to Google Lens. The difference is that while most of Google’s processing takes place in the cloud, much of Visual Intelligence’s processing happens on the iPhone itself.

What is Visual Intelligence?
Visual Intelligence is part of iOS 18’s Apple Intelligence suite and aims to instantly enhance users’ ability to learn about their surroundings. The feature works similarly to other multimodal AI systems from Google and OpenAI, letting users get information about objects and locations simply by photographing them.

For instance, if you snap a picture of a restaurant, Visual Intelligence can provide details like its operating hours, reviews, and menu options. Similarly, if you photograph an event flyer, the feature can automatically capture and record information such as the event’s title, date, and location.
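Apple has not published a developer API for Visual Intelligence, but the kind of on-device extraction described above resembles what Apple’s existing Vision and Foundation frameworks already offer. The sketch below is purely illustrative: the extractFlyerDetails helper is a made-up name, and it uses Vision’s text recognition plus a data detector to pull the text and a date out of a flyer photo.

```swift
import UIKit
import Vision

/// Hypothetical helper: recognize text in a flyer photo on-device, then
/// look for a date in it. This is not Apple's Visual Intelligence API,
/// only an illustration of similar on-device extraction.
func extractFlyerDetails(from image: UIImage,
                         completion: @escaping ([String], Date?) -> Void) {
    guard let cgImage = image.cgImage else {
        completion([], nil)
        return
    }

    let request = VNRecognizeTextRequest { request, _ in
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        // Keep the best candidate string for each detected text region.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }

        // Scan the recognized text for a date using Foundation's data detector.
        let text = lines.joined(separator: "\n")
        let detector = try? NSDataDetector(types: NSTextCheckingResult.CheckingType.date.rawValue)
        let match = detector?.firstMatch(in: text,
                                         options: [],
                                         range: NSRange(text.startIndex..., in: text))
        completion(lines, match?.date)
    }
    request.recognitionLevel = .accurate  // favor accuracy over speed

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```

In practice, a recognized title and date could then be handed to something like EventKit to save the event, which is the sort of result the flyer example describes.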

Craig Federighi from Apple highlighted that Visual Intelligence leverages a combination of on-device processing and Apple services, ensuring user images are not stored, thereby protecting privacy.

The feature also integrates with third-party services and models, meaning users could, for example, run a Google search on something they have photographed or get help with study materials by snapping a picture of their notes.

How to Use Visual Intelligence
The key to activating Visual Intelligence is Camera Control, a new hardware control added to the iPhone 16 and iPhone 16 Pro models. This capacitive button, located on the right side of the device, was initially speculated to be a standard camera button. However, Apple has revealed it serves a dual purpose.

To use Visual Intelligence, users must click and hold the Camera Control button and point the iPhone’s camera at the object or text of interest.
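Apple has not said how Visual Intelligence hooks into Camera Control under the hood. For context, though, third-party camera apps can already respond to presses of hardware capture buttons through AVKit’s AVCaptureEventInteraction (iOS 17.2 and later); the sketch below, with a hypothetical CameraViewController and captureAndAnalyzeFrame helper, shows that pattern.

```swift
import UIKit
import AVKit  // AVCaptureEventInteraction (iOS 17.2+)

final class CameraViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        // React to hardware capture-button presses the way a camera app would.
        // The interaction only receives events while the app is actively
        // using the camera (i.e. has a running capture session).
        let interaction = AVCaptureEventInteraction { [weak self] event in
            // Treat the end of a press as a "click" of the button.
            if event.phase == .ended {
                self?.captureAndAnalyzeFrame()
            }
        }
        view.addInteraction(interaction)
    }

    private func captureAndAnalyzeFrame() {
        // Hypothetical: grab the current camera frame and run on-device
        // analysis, e.g. the Vision text-recognition sketch shown earlier.
    }
}
```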

Although Apple did not specify an exact release date for Visual Intelligence, it has indicated that the feature will become available later this year. This addition underscores Apple’s commitment to integrating advanced AI technologies into its devices, promising to provide users with a more intuitive and interactive experience.
