Apple’s Visual Intelligence is coming to the iPhone 16 and 16 Pro later this year.
Effectively Apple’s take on Google Lens, it lets users point their iPhone 16 camera at almost anything and get information about it. The feature is activated with Camera Control, the new touch-sensitive button on the right side of the device.
Visual Intelligence kinda sold me. I'd use it all the time. pic.twitter.com/MsuJ9eqbLZ
— Jared Davidson (@Archetapp) September 9, 2024
For example, you can point it at a restaurant to pull up its opening hours and menu, at a car to identify the make and model, or at a dog to find out its breed. You can snap a flyer to add the event details to your calendar, or point it at a bike to search for it and purchase it online.
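Apple hasn’t detailed how the flyer-to-calendar step works, but existing public iOS frameworks can already approximate it. Here’s a minimal, hypothetical sketch that OCRs an image with Vision’s VNRecognizeTextRequest and pulls dates out with NSDataDetector; the function name and flow are illustrative assumptions, not Apple’s implementation.

```swift
import Vision
import Foundation

// Hypothetical sketch: OCR a flyer with the Vision framework, then let
// NSDataDetector find anything that parses as a date. This is an
// approximation with public APIs, not how Visual Intelligence works internally.
func extractEventDates(from cgImage: CGImage, completion: @escaping ([Date]) -> Void) {
    let request = VNRecognizeTextRequest { request, _ in
        // Join all recognized lines of text from the flyer.
        let text = (request.results as? [VNRecognizedTextObservation])?
            .compactMap { $0.topCandidates(1).first?.string }
            .joined(separator: "\n") ?? ""
        // Pull out every substring NSDataDetector recognizes as a date.
        let detector = try? NSDataDetector(types: NSTextCheckingResult.CheckingType.date.rawValue)
        let matches = detector?.matches(in: text, range: NSRange(text.startIndex..., in: text)) ?? []
        completion(matches.compactMap { $0.date })
    }
    request.recognitionLevel = .accurate
    // Runs synchronously on the calling thread; fine for a sketch,
    // but real code would dispatch this off the main thread.
    try? VNImageRequestHandler(cgImage: cgImage, options: [:]).perform([request])
}
```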
You can even use it to get help with homework via ChatGPT. Visual Intelligence will also work with third-party apps.
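There’s no public developer API for Visual Intelligence yet, but Apple’s existing Vision framework hints at what on-device identification looks like. The sketch below is an assumption rather than Apple’s actual pipeline: it classifies an image with VNClassifyImageRequest and returns the highest-confidence label, for example a dog breed.

```swift
import Vision
import UIKit

// Hypothetical sketch: classify an image on-device with Vision's
// built-in classifier and hand back the best label. Visual Intelligence
// presumably layers search and ChatGPT on top of something like this.
func identifySubject(in image: UIImage, completion: @escaping (String?) -> Void) {
    guard let cgImage = image.cgImage else {
        completion(nil)
        return
    }
    let request = VNClassifyImageRequest { request, _ in
        // Pick the highest-confidence classification, e.g. "Golden Retriever".
        let best = (request.results as? [VNClassificationObservation])?
            .max(by: { $0.confidence < $1.confidence })
        completion(best?.identifier)
    }
    // Perform the request off the main thread to keep the UI responsive.
    DispatchQueue.global(qos: .userInitiated).async {
        do {
            try VNImageRequestHandler(cgImage: cgImage, options: [:]).perform([request])
        } catch {
            completion(nil)
        }
    }
}
```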
Apple’s new visual intelligence feature on the iPhone 16 pic.twitter.com/9tMyZWJCrC
— highsnobiety (@highsnobiety) September 9, 2024
Apple says that Visual Intelligence is designed to be private, so the company doesn’t know what you’ve pointed your camera at.
The feature won’t be available until sometime later this year. When it arrives, though, it promises to significantly improve the built-in AI functionality of the iPhone 16 and later models: you won’t need the Google app or other third-party software to identify what’s in front of you.