Google used its annual I/O developer event on Wednesday (17 May) to reveal Google Lens, an augmented reality application which uses a smartphone's camera to recognise and understand whatever it is pointed at.
The underlying technology is not new, and even appears on the Galaxy S8 smartphone in the form of Samsung's new Bixby Vision. But if the on-stage demonstrations are anything to go by, Samsung will be working quickly to get Bixby back in the race once Lens arrives later this year.
How does Google Lens work and what can it do?
Google Lens taps into the same artificial intelligence engine used to power Google Assistant, the Siri-like personal butler available on the Google Pixel and (as of today) the iPhone.
Google showed how Lens can identify flowers when you point the camera at them, which could be useful if you have an allergy. It is easy to imagine this feature working on a wide range of objects, animals and anything else you throw at it.
Lens can also, as Bixby Vision does, offer up information based on your surroundings. Point the camera at your local high street and you will be given star ratings for each restaurant or bar in front of you.
But Google Lens is a fair bit smarter than that. Show Lens the Wi-Fi network name and password on the back of your router, and the phone will ask if you'd like to connect to that network, then save the password so you connect automatically next time.
Point Google Lens at a billboard outside a theatre and it will recognise it, then offer up places to buy tickets, or more information about that particular show. Save the upcoming concert to your calendar with one tap, or find music by that artist to stream online with another.
Google has yet to say exactly when Lens will be available, but we expect it will arrive either as a software update to Google Assistant or as part of Android O, both due later in 2017.