Apple is rolling out Visual Intelligence, a feature that brings real-world object and information recognition to the iPhone camera. Much like Google Lens, it lets you point your phone at something, say, a restaurant to pull up its hours or a plant to identify it, and instantly get relevant information. For now, though, the feature is limited to specific iPhone models and software versions.
To use Visual Intelligence, you’ll need iOS 18.2 on the iPhone 16 series (iPhone 16, iPhone 16 Plus, iPhone 16 Pro, and iPhone 16 Pro Max), iOS 18.3 on the iPhone 16E, or iOS 18.4 on the iPhone 15 Pro and iPhone 15 Pro Max. Apple Intelligence must also be turned on in the Settings app under “Apple Intelligence & Siri.”
How you activate Visual Intelligence depends on the device. On iPhone 16 models, which have a dedicated Camera Control button, a long press on that button opens the camera interface with visual recognition active. On the iPhone 16E, iPhone 15 Pro, and iPhone 15 Pro Max, you can assign the feature to the Action Button instead: open Settings, select “Action Button,” and swipe to the Visual Intelligence option to assign it. Additional details can be found on The Verge.