Apple’s iOS 15 adds Google Lens-like features, but the two differ in areas such as privacy, the need for an internet connection, and device support.
Apple’s annual developer conference, WWDC 2021, kicked off on Monday. Apple announced software updates for iOS, iPadOS, watchOS, and the all-new macOS Monterey. The latest updates bring a slew of new features, including a redesigned Siri, a feature-rich FaceTime, and more. Among them is Live Text, which uses on-device intelligence to recognize text in photos and offer users actions based on it. Sound familiar? Google has offered a comparable feature, Google Lens, since 2017.
Apple’s new Live Text feature, revealed at the company’s virtual WWDC developer conference and arriving in iOS 15, appears to be Apple’s take on the computer vision smarts Google built into Lens.
To begin with, the feature recognizes text in existing images or live through the Camera app, and it supports seven languages. Like Google Lens, the machine vision-based technology can also search for text within photos.
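Apple hasn’t detailed what powers Live Text under the hood, but its existing Vision framework already exposes similar on-device OCR through VNRecognizeTextRequest. Here is a minimal sketch; the function name, result handling, and threading choices are illustrative, not part of Live Text itself:

```swift
import Vision
import UIKit

// A minimal sketch of on-device text recognition with Apple's Vision
// framework (VNRecognizeTextRequest), which offers OCR similar to what
// Live Text surfaces in the Camera and Photos apps. This is not the
// Live Text API itself; names and choices here are illustrative.
func recognizeText(in image: UIImage, completion: @escaping ([String]) -> Void) {
    guard let cgImage = image.cgImage else {
        completion([])
        return
    }

    let request = VNRecognizeTextRequest { request, error in
        guard error == nil,
              let observations = request.results as? [VNRecognizedTextObservation] else {
            completion([])
            return
        }
        // Take the top candidate string from each detected text region.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        completion(lines)
    }
    request.recognitionLevel = .accurate  // favor accuracy over speed

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```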
Google Lens debuted in 2017 with the company’s Pixel 2 smartphone. Text recognition software had been around for a while, but Google’s version combined it with Google Assistant and the company’s search capabilities. Taking actions based on text found in images and live camera views is more powerful than earlier OCR (optical character recognition) tools, which simply extracted text from a photo. Object recognition adds another layer, detecting people and animals for Google Photos, the company’s online photo storage service.
Apple added similar object recognition to its Photos app to categorize photographs, but until Live Text was unveiled there was no genuine competitor to Google Lens.
Spotlight can also use these image recognition capabilities to search the contents of photographs. If you search for a piece of text, the iPhone will look for it in photos you’ve taken, making it easier to find specific information.
At the moment, Live Text supports seven languages: English, Chinese, French, Italian, German, Spanish, and Portuguese. The ability to translate text in a photo is another resemblance to Google Lens.
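On the developer side, Vision’s text recognizer can likewise be constrained to particular languages. The sketch below is an assumption-based illustration: the exact language identifiers Live Text uses aren’t published, and supportedRecognitionLanguages() requires iOS 15 or later:

```swift
import Vision

// Illustrative: constraining Vision's text recognizer to a set of
// languages mirroring Live Text's initial seven-language support.
// The BCP 47 identifiers below are assumptions; actual availability
// depends on the OS version and recognition level.
let request = VNRecognizeTextRequest()
request.recognitionLanguages = ["en-US", "zh-Hans", "fr-FR", "it-IT",
                                "de-DE", "es-ES", "pt-BR"]

// On iOS 15 and later, the request can report which languages it
// actually supports for the chosen recognition level.
if let supported = try? request.supportedRecognitionLanguages() {
    print("Supported recognition languages: \(supported)")
}
```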
Live Text, however, goes beyond text recognition: Apple devices will also recognize certain objects, animals, and works of art.
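Apple hasn’t exposed this lookup capability as a public API, but Vision’s VNClassifyImageRequest gives a rough feel for the kind of on-device classification involved. A hedged sketch, assuming only the standard Vision classifier (the confidence threshold is arbitrary, and this is not the system feature itself):

```swift
import Vision
import UIKit

// A sketch of on-device image classification with Vision's
// VNClassifyImageRequest. Apple's object/animal/artwork lookup in
// iOS 15 is a system feature; this request merely illustrates the
// kind of on-device classification such a feature relies on.
func classify(_ image: UIImage) {
    guard let cgImage = image.cgImage else { return }

    let request = VNClassifyImageRequest()
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])

    // Keep only reasonably confident labels (the 0.3 cutoff is arbitrary).
    let observations = request.results as? [VNClassificationObservation] ?? []
    let labels = observations
        .filter { $0.confidence > 0.3 }
        .map { "\($0.identifier) (\(String(format: "%.2f", $0.confidence)))" }
    print(labels.joined(separator: ", "))
}
```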
What about some unique Google Lens applications? This type of technology can take on a life of its own, and Google has highlighted a few out-of-the-box scenarios. One user reportedly helped a bartender research their family history: after discovering that an old army badge behind the bar belonged to the bartender’s grandfather, they used Lens to pinpoint the exact infantry unit.
But probably the most intriguing upcoming feature, especially for fans of augmented reality games like Pokémon Go, is Google’s “unique gamified scavenger hunt experience,” which will debut in early July.