Photo credit: www.theverge.com
One of the standout features of Apple Intelligence that remains on track for release is Visual Intelligence. This tool uses your iPhone’s camera to recognize objects and scenes around you and surface information about them.
For instance, if you capture an image of a pizza restaurant, Visual Intelligence can reveal its operating hours. Alternatively, pointing your camera at a plant will identify its species and offer care tips. Users familiar with Google Lens will find this feature similar.
However, Visual Intelligence is not universally available. It requires specific iOS versions: iOS 18.2 for the iPhone 16 series, iOS 18.3 for the iPhone 16E, and iOS 18.4 for the iPhone 15 Pro and iPhone 15 Pro Max. It also requires Apple Intelligence to be turned on under Settings > Apple Intelligence & Siri.
How to Launch Visual Intelligence
If you own an iPhone 16 equipped with a Camera Control button on the right side, you can activate the camera and Visual Intelligence by pressing and holding this button.
For users of the iPhone 16E, iPhone 15 Pro, or iPhone 15 Pro Max, there are other ways to access the feature: you can assign Visual Intelligence to the Action button, add its control to Control Center, or place it on the Lock Screen.
How to Use Visual Intelligence
Visual Intelligence offers a range of functionality. In general, it can identify and answer questions about nearly anything you point it at, so experimenting with different objects can lead to interesting discoveries.
In addition to this general recognition, two dedicated features appear as on-screen buttons whenever Visual Intelligence is active and viewing an object: Ask, which sends a question about the object to ChatGPT, and Search, which runs a Google image search for it.
To exit Visual Intelligence at any point, simply swipe up from the bottom of your screen.
Source: www.theverge.com