Apple Live Text vs Google Lens: Best Image Recognition Tool? (2021)
We will evaluate Live Text vs Google Lens on a number of fronts in this article, including language support, offline availability, and the other features of both Live Text and Google Lens. As always, use the table of contents below to navigate between the different sections of this article.
What is Live Text in iOS 15?
For anyone who hasn't watched Apple's developer conference, here's a quick overview of the new Live Text feature in iOS 15.
Basically, once you install iOS 15 on your iPhone, your device will be able to identify text within images, and even do so directly through the camera. That means you can copy and paste text from the real world, look up a business you took a picture of, and more. It's essentially Google Lens, but from Apple.
Live Text vs Google Lens: What Can They Do (Basic Features)
To kick things off, let's take a look at the basic features offered by both Live Text and Google Lens. That way, we can see which of the two brings more to the table right off the bat.
Apple Live Text, as I mentioned above, can identify text, phone numbers, email addresses, and more in photos in your gallery, as well as directly from the Camera app. There's also Visual Lookup, which can identify animals and famous landmarks, so you can get more information about them by tapping on them in your viewfinder.
Once you've identified text in a picture, you can use it in a variety of ways. Obviously, you can copy and paste the text wherever you like, but there's more you can do. If Live Text recognizes an email address, it will immediately offer to compose a new email to that address. Similarly, you'll get the option to call a recognized phone number, open a recognized address in Apple Maps, and more.
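Under the hood, this "offer an action" step is a classification pass over the OCR output. Apple doesn't publish Live Text's internals, so the sketch below is purely illustrative: it uses simple regular expressions (far cruder than a real data detector) to tag recognized strings and map them to the kind of actions a Live Text-style menu might offer.

```python
import re

# Illustrative patterns -- real data detectors are far more robust than this.
PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "phone": re.compile(r"\+?\d[\d\s().-]{7,}\d"),
    "url":   re.compile(r"https?://\S+"),
}

# Map each detected kind to a hypothetical menu action.
ACTIONS = {"email": "compose email", "phone": "call", "url": "open link"}

def detect_actions(ocr_text: str) -> list[tuple[str, str]]:
    """Return (matched text, suggested action) pairs found in OCR output."""
    results = []
    for kind, pattern in PATTERNS.items():
        for match in pattern.findall(ocr_text):
            results.append((match, ACTIONS[kind]))
    return results

print(detect_actions("Call +1 (555) 123-4567 or email hi@example.com"))
```

The same idea extends naturally to addresses and tracking numbers; the hard part in practice is the OCR itself and the robustness of the patterns, not the action mapping.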
There's also translation built in, thanks to the Apple Translate app. You can tap the "Translate" button in the pop-up context menu to translate the recognized text into English (or any of the other supported languages).
Perhaps most useful for me is the fact that Live Text can tell when there's a tracking number in a picture and lets you open the tracking link directly, which is quite impressive.
On the other hand, Google Lens can do a lot of neat things as well. Obviously, it can identify text within images or directly from your camera app. You can then copy and paste the highlighted text or, similar to Live Text, make a phone call, send an email, and more. You can also create a new calendar event straight from Google Lens, which can come in handy.
Another neat feature of Google Lens is business card scanning. If someone hands you their business card, you can scan it with Google Lens, and you'll get the option to add a new contact with all their details filled in automatically. That's really cool, and it's something we often used while attending launch events and networking with people.
Now, Live Text's Visual Lookup feature may be able to identify famous landmarks, books, animals, and so on, but Google Lens is on a whole different level when it comes to object recognition.
Thanks to Google's expertise with search, and image search in particular, Google Lens can tap into all that data and identify just about any object you see around you. Whether it's a plant, a pen, or a purse you saw your favorite celebrity wearing, just scan the picture with Google Lens, and you'll get search results for it. That's a cool feature of Google Lens that Apple's Live Text/Visual Lookup doesn't have.
Moreover, Google Lens can also help you with your homework, which is another thing Live Text simply can't do. You can scan a math problem with Google Lens, and it will give you step-by-step solutions. It also works for other subjects, including physics, biology, history, chemistry, and more.
Clearly, Google Lens has way more features on offer than Apple's Live Text does at this point. Hopefully, Apple will make Live Text more useful and add more features into the mix as the years go by. But as of right now, Google Lens is more feature-rich and offers more capabilities than Live Text.
Live Text vs Google Lens: Integration
Moving on, let's talk about how these image recognition features are integrated into the operating system as a whole. Apple has always had a singular focus on integrated experiences, and that extends to Live Text as well.
On an iPhone running iOS 15, Live Text is baked into the default Photos app as well as the Camera. While that's also the case for Google Lens on Android phones, the difference is that you have to tap to enable Google Lens before you can start scanning text, objects, or whatever else you're after. Live Text, on the other hand, is pretty much always on. You can simply long-press on the text you're trying to copy or the phone number you want to call, and get on with it.
What's more, Live Text also comes baked into iOS 15 itself. So if you're using a messaging app like WhatsApp, or any other app like Notes or Mail on your iPhone, and want to scan and paste some text from the real world, you can use the "Text from Camera" feature to enter the text directly.
The same feature exists with Google Lens as well, but in that case, you first have to switch over to your camera app, head into Google Lens, scan the text and copy it, and then go back to the original app and paste it there. That's a lot of extra steps you don't have to bother with when using Live Text.
Apple is usually really good at integrating its features into the devices it sells, and Live Text is no exception. It works wherever you need it to and makes itself useful in ways that make you want to use it. I can't say the same for Google Lens on Android, especially not in everyday use.
Clearly, Live Text is better in this regard, but I hope Google Lens brings a similar kind of integration soon. When good features are copied across products, the products end up better for us users, and I'm all for it.
Live Text vs Google Lens: Accuracy
As far as accuracy is concerned, both Google Lens and Apple Live Text are equally good. I've used Google Lens extensively, and I've been using Live Text on my iPhone for the past couple of days, and I'm yet to notice any accuracy issues with either of these image recognition tools.
That said, I've noticed that when scanning text with Live Text, it sometimes mistakes the letter "O" for the digit "0" and vice versa. That can be a little annoying and not something I'd like to give a pass to, but considering this is still a beta release, I'm not going to treat it as a huge issue.
On the other hand, when it comes to object recognition, Google Lens has the complete upper hand. It can not only identify more objects and animals than Visual Lookup on an iPhone, but it's also more accurate with its results. Visual Lookup has completely failed at recognizing objects for me; I even fed it a picture of the Golden Gate Bridge, the exact example Apple used in its WWDC 2021 keynote, and that didn't work either. Clearly, the feature needs a lot of work.
Overall, Google Lens is more accurate than Apple's Live Text feature in iOS 15. It is, however, a close competition as far as text, emails, phone numbers, and so on are concerned.
Live Text vs Google Lens: Language Support
Since both Google Lens and Live Text support translation, it's important to consider which languages they each work with. That also extends to which languages they can even identify text in, and when it comes to these metrics, Live Text is miles behind Google Lens.
Live Text supports translation in just seven languages: English, Chinese (Simplified and Traditional), French, German, Italian, Portuguese, and Spanish.
On the other hand, Google Lens supports translation in every language Google Translate can work with, which is over 100 languages for text translation. Next to that number, Apple's seven-language support fades to nothingness.
Live Text vs Google Lens: Device Support
Apple's new Live Text feature is available on iPhones running iOS 15, iPads running iPadOS 15, and Macs running macOS 12 Monterey (note that it requires a device with an A12 Bionic chip or newer on iOS and iPadOS).
Google Lens is available on all devices running Android 6.0 Marshmallow and above. Plus, it's built into Google Photos and the Google app, which means you can use it on iOS devices as well.
It's no surprise that Google Lens is available on more devices than Apple Live Text. So if you use multiple devices and want a single image recognition solution, Google Lens has a much higher chance of being available across different ecosystems.
Google Lens vs Live Text: Which One Should You Use?
Now that we've gone through all the different points of comparison for Live Text and Google Lens, the question remains: which one should you use? The answer will differ from person to person. As someone who's deep into the Apple ecosystem, I've used Live Text more in the last week than I've ever used Google Lens. However, Google Lens offers unique capabilities that Live Text and Visual Lookup don't.
It all depends on your use cases and your workflow. I don't usually find myself wondering where I can buy a pair of jeans that Chris Hemsworth was wearing. But I do find myself wanting to add text to a WhatsApp conversation without copy-pasting it from my Mac, and Live Text lets me do that far more easily than Google Lens. So, what do you think about Google Lens vs Live Text? Which one do you think is better, and why? Let us know in the comments.