Among the many new features in iOS 15, some might sound like small changes but turn out to be really helpful in everyday use. Whether it's screen sharing in FaceTime, iCloud+, or offline Siri, there are plenty of new features to explore. One such feature in iOS 15 is Live Text, which Apple showed off at WWDC 2021. So, if you are wondering whether Live Text is as good as Google Lens, here is a detailed comparison of Apple's Live Text vs Google Lens.
We will compare Live Text and Google Lens on multiple fronts, including basic features, integration, accuracy, language support, and device support. As always, use the table of contents below to navigate between the different sections of this article.
- What is Live Text in iOS 15?
- Live Text vs Google Lens: Basic Features
- Live Text vs Google Lens: Integration
- Live Text vs Google Lens: Accuracy
- Live Text vs Google Lens: Language Support
- Live Text vs Google Lens: Device Support
What is Live Text in iOS 15?
For anyone who hasn't watched Apple's developer conference, here is a quick overview of the new Live Text feature in iOS 15. Live Text uses on-device intelligence to recognize text in your photos and in the camera viewfinder, so you can copy it, call phone numbers, send emails, look up addresses, and more, without typing anything out yourself.
Live Text vs Google Lens: Basic Features
To kick things off, let's take a look at the basic features offered by both Live Text and Google Lens. That way, we can see which of the two brings more to the table right off the bat.
Apple Live Text
Apple Live Text, as I mentioned above, can identify text, phone numbers, emails, and more from pictures in your gallery, as well as straight from the camera app. There's also Visual Lookup, which can identify animals and well-known landmarks, letting you pull up more information about them with a tap right in the viewfinder.
Perhaps most useful for me is that Live Text can tell when there's a tracking number in a picture and lets you open the tracking link directly, which is quite impressive.
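If you are curious about the underlying mechanics, Apple exposes the same kind of on-device text recognition to developers through the Vision framework. Below is a minimal sketch of that approach; the `recognizeText(in:)` function and the `image` parameter are placeholders of my own, and Apple hasn't documented Live Text's internals, so treat this as an illustration of the technique rather than how Live Text itself is implemented.

```swift
import UIKit
import Vision

// Minimal sketch: on-device OCR with Apple's Vision framework, plus
// NSDataDetector to pull actionable items (phone numbers, links) out
// of the recognized text, loosely mirroring what Live Text surfaces.
// `image` is a placeholder for any photo or camera frame.
func recognizeText(in image: UIImage) {
    guard let cgImage = image.cgImage else { return }

    let request = VNRecognizeTextRequest { request, _ in
        guard let observations = request.results as? [VNRecognizedTextObservation] else { return }

        // Take the top candidate string from each detected text region.
        let text = observations
            .compactMap { $0.topCandidates(1).first?.string }
            .joined(separator: "\n")

        // Find phone numbers and links in the recognized text.
        let types: NSTextCheckingResult.CheckingType = [.phoneNumber, .link]
        guard let detector = try? NSDataDetector(types: types.rawValue) else { return }
        let range = NSRange(text.startIndex..., in: text)
        for match in detector.matches(in: text, range: range) {
            print(match.phoneNumber ?? match.url?.absoluteString ?? "")
        }
    }
    request.recognitionLevel = .accurate // favor accuracy over speed

    let handler = VNImageRequestHandler(cgImage: cgImage)
    try? handler.perform([request])
}
```

Pairing the OCR output with NSDataDetector is what turns raw text into the kind of tappable phone numbers and links you see in Live Text.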
Google Lens
On the other hand, Google Lens can do a lot of neat things as well. Obviously, it can identify text within images or straight from your camera app. You can then copy-paste the highlighted text, or, similar to Live Text, make a phone call, send an email, and more. You can also create a new calendar event straight from Google Lens, which can come in handy.
Thanks to Google's expertise with search, and image search in particular, Google Lens can tap into all that knowledge and identify pretty much any object you see around you, whether it's a plant, a pen, or a handbag you spotted your favorite celebrity wearing. Just scan the picture with Google Lens, and you will get search results for it. That's a cool feature of Google Lens that Apple's Live Text/Visual Lookup doesn't have.
Live Text vs Google Lens: Integration
Moving on, let's talk about how these image recognition features are integrated into the operating system as a whole. Apple has always had a unique focus on integrated experiences, and that extends to Live Text as well.
On an iPhone running iOS 15, Live Text is baked into the default Photos app as well as the Camera app. While that's also the case for Google Lens on Android phones, the difference is that you have to tap to enable Google Lens before you can start scanning text, objects, or whatever else you are after. Live Text, on the other hand, is pretty much always on. You can simply long-press the text you want to copy or the phone number you want to call, and get on with it.
You can do something similar with Google Lens, but in that case, you will have to first switch over to your camera app, head into Google Lens, scan the text and copy it, and then go back to the original app and paste it there. That's a lot of extra steps you don't need to bother with if you are using Live Text.
Clearly, Live Text is better in this regard, but I hope Google brings similar integration to Lens soon. When good features get copied across platforms, the products end up better for all of us, and I'm all for it.
Live Text vs Google Lens: Accuracy
As far as text recognition is concerned, both Google Lens and Apple Live Text are equally good. I have used Google Lens extensively, and I have been using Live Text on my iPhone for the past couple of days, and I am yet to notice any accuracy issues with either of these tools.
Overall, though, Google Lens comes out ahead of Apple's Live Text feature in iOS 15, thanks to its much broader object recognition. It is, however, a close competition as far as text, emails, phone numbers, and the like are concerned.
Live Text vs Google Lens: Language Support
Since both Google Lens and Live Text support translation, it's important to consider which languages each of them works in. That also extends to which languages they can identify text in, and on these metrics, Live Text is miles behind Google Lens. At launch, Live Text supports just seven languages: English, Chinese, French, Italian, German, Portuguese, and Spanish.
Google Lens, on the other hand, supports translation in every language that Google Translate works with, which is more than 100 languages for text translation. Next to that number, Apple's seven-language support pales in comparison.
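If you want to verify the Apple-side limit on your own device, Vision's text recognizer can report which languages it supports on the current OS release. Here is a quick sketch; the exact list it prints varies by OS version and recognition level, so take the output on your device as the authority rather than this example.

```swift
import Vision

// Ask Vision which recognition languages are available on this OS.
// On iOS 15 with the accurate recognition level, the list is short,
// which lines up with Live Text's limited language support.
let request = VNRecognizeTextRequest()
request.recognitionLevel = .accurate

if let languages = try? request.supportedRecognitionLanguages() {
    print("Supported (\(languages.count)): \(languages.joined(separator: ", "))")
}
```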
Live Text vs Google Lens: Device Support
Apple's new Live Text feature is available on iPhones running iOS 15, iPads running iPadOS 15, and Macs running macOS 12 Monterey. Keep in mind that on iPhone and iPad, Live Text requires a device with the A12 Bionic chip or newer.
Google Lens is available to use on all devices with Android 6.0 Marshmallow and above. Plus, it is integrated into Google Photos and the Google app, meaning you can use it on iOS devices as well.
Now that we have gone through all the different comparison points for both Live Text and Google Lens, the question remains: which one should you use? The answer will vary from person to person. As someone who is deep into the Apple ecosystem, I have used Live Text more in the last week than I have ever used Google Lens. However, Google Lens offers unique capabilities that Live Text and Visual Lookup don't.