Apple’s ‘Visual Intelligence’ for iPhone 16 Explains What You See

Key Takeaways

  • Apple has revealed “Visual Intelligence,” coming to the iPhone 16 later this year.
  • It’s Apple’s answer to Google Lens and can provide details on what you see.
  • It works with third-party apps, including ChatGPT for homework help.

Apple’s Visual Intelligence is coming to the iPhone 16 and 16 Pro later this year.

Effectively Apple’s take on Google Lens, the feature lets users point their iPhone 16 camera at almost anything and get information about it. It’s activated with Camera Control, the new touch-sensitive button on the right side of the device.

For example, you can point it at a restaurant to see its opening hours and menu, at a car to identify the make and model, or at a dog to find out its breed. You can snap a flyer to add the event details to your calendar, or point and click on a bike to search for it and buy it online.

You can even use it to get help with homework via ChatGPT. Visual Intelligence will also work with third-party apps.

Apple says Visual Intelligence is private, so the company doesn’t know what you’ve clicked on.

The feature won’t arrive until later this year. When it does, it promises to significantly improve the built-in AI functionality of the iPhone 16 and later models: you won’t need the Google app or other third-party software to identify what’s in front of you.