Apple has unveiled a new feature called Visual Intelligence with the iPhone 16, positioned as a competitor to Google Lens. Announced at the September 2024 event, Visual Intelligence aims to enhance how users interact with their environment through advanced image recognition technology.
The feature is accessed via a new touch-sensitive Camera Control button on the right side of the iPhone 16. Tapping this button activates Visual Intelligence, which identifies objects, surfaces relevant information, and suggests actions based on whatever the camera is pointed at.
For example, if you point the camera at a restaurant, it will display the menu, hours, and ratings. Scanning an event flyer will allow you to add it to your calendar directly. The feature can also identify dog breeds or help you search for products online.
Later this year, the Camera Control button will expand its functionality to include third-party tools with specialized expertise, according to Apple’s press release. Users will be able to access services like Google for product searches or ChatGPT for problem-solving, while still retaining control over their interactions and the information shared.
Apple has emphasized that Visual Intelligence is designed with privacy as a priority. All data processing happens directly on the device, so Apple has no access to what users identify, click on, or search for through the feature.