The new iPhone 16e has drawn attention in recent coverage for omitting the Camera Control button that many users expected. Despite that omission, the device still gets access to visual intelligence, a capability that has sparked a bit of envy among iPhone 15 Pro Max owners, myself included.
The visual intelligence feature lets users gather information about their surroundings effortlessly. By pointing the camera at a place of interest, such as a restaurant or shop, they can pull up details like operating hours and menus, or translate text they encounter. The feature can also summarize text for quick comprehension or read it aloud. It even lets users act on printed materials: calling a phone number from a poster, adding an event to the calendar, starting an email, or opening a website directly from a sign.
It also extends to identifying plants and animals and running image searches online, bridging the gap between the physical and digital worlds in everyday use.
Interestingly, my iPhone 15 Pro Max is also set to gain visual intelligence through its Action Button. This button, located on the left side of the device, triggers a chosen function with a long press; the current options include Silent Mode, Focus, Camera, Flashlight, Voice Memo, Magnifier, Translate, Accessibility, and more. The upcoming addition of visual intelligence is highly anticipated by those reluctant to upgrade to the latest models.
There is some skepticism about the motivations behind such features, with critics arguing that they mainly give Google another way to expand its search ad revenue and noting that visual intelligence parallels existing tools like Google Lens and Circle to Search. My enthusiasm, however, comes from a genuine interest in getting more out of the device I already own without having to pay for an upgrade.
As it stands, visual intelligence is expected to debut on the iPhone 15 Pro and Pro Max with iOS 18.4, which could roll out as early as April, with a beta version arriving sooner. That update offers hope for users like me who are eager to use the advanced functionality initially reserved for the latest iPhone models.
Source: www.phonearena.com