It’s no surprise that when you think about online search, the first thing to pop into your head is Google. The company has always believed search can be done a little better and a little faster, and in most cases it has lived up to that ideal. Now, though, it wants to take search in a direction few of us were even thinking about.

Google wants to add camera and augmented reality functionality to its search engine. This was announced at Google I/O 2019, where the company demonstrated the new technology with several examples during the keynote.

It’s time for a new kind of SafARi

A few users have recently had an interesting experience with Google Search: the tech giant has rolled out a new Search card that can place virtual animals in the real world for anyone with an ARCore-ready Android phone or an ARKit-ready iPhone.

Currently, you can view a tiger, a lion, a giant panda, a rottweiler, a wolf, and a handful of others, with more said to be on the way. The augmented reality feature is triggered by a "View in your space" button.

On first launch, users are asked to give access to the camera and device storage to "Get a better look." Anyone who has experienced mobile AR before will be familiar with the prompts to point at the ground and move around. A button in the bottom-right corner returns you to the animal against a white background.

A glimmer of hope for future sales

These new features raise a few questions: will developers be able to create their own 3D models and add them to Search, will third-party vendors be able to take advantage of this, and when will developers get to play around with it? What we know so far, via CNET, is that more 3D AR objects will appear in search results later in 2019, and that developers will be able to add support for their own objects with just a few lines of code.
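Google hasn't spelled out here exactly what those "few lines of code" will look like for Search, but its open-source model-viewer web component (the @google/model-viewer npm package, also shown at I/O 2019) gives a rough idea of the kind of markup involved. The sketch below assumes a bundled web page and uses a placeholder model URL:

```typescript
// A minimal sketch, assuming Google's <model-viewer> web component.
// Importing the package registers the <model-viewer> custom element.
import '@google/model-viewer';

const viewer = document.createElement('model-viewer');
viewer.setAttribute('src', 'https://example.com/models/tiger.glb'); // glTF/GLB 3D asset (placeholder URL)
viewer.setAttribute('alt', 'A 3D model of a tiger');
viewer.setAttribute('ar', '');              // offers a "View in your space"-style AR button on supported phones
viewer.setAttribute('camera-controls', ''); // lets users rotate and zoom the model on screen
document.body.appendChild(viewer);
```

Whether Search itself will ingest models this way or through structured data is still unclear; the snippet is only meant to show how little code a web-based 3D/AR embed can take.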

Sadly, we are still in the dark about exactly what these new features in Google Search will mean for end users and developers, and about how quickly developers can start pushing out their own 3D models. At I/O, however, Google noted that the feature would be used for more practical things, like shopping, where you could see what a product looks like without having it in hand and place it within a given space, much as you can in some existing shopping apps, or view search results for human muscles overlaid in AR.

The prediction: what’s next?

That’s not the full extent of Google’s AR ambitions, however: it has been testing an AR navigation feature for Google Maps and releasing AR Playmoji stickers for users to play with. Google is also partnering with NASA, New Balance, Samsung, Target, Visible Body, Volvo, Wayfair, and others to surface third-party content in the future.

This could include shopping use cases or human anatomy as a study tool. This past February, Google also announced a feature called the Visual Positioning System, which uses AR to give users real-time directions. According to published reports, the company made that feature available to only a small number of users.
