Google Lens is coming to all Pixel phones in the coming weeks. The artificial intelligence-powered tool will arrive as part of an update to Google Assistant, the company announced today in a blog post. Lens, first unveiled at the company’s I/O developer conference back in May, is a computer vision system that lets you point your Pixel or Pixel 2 camera at an object and get information about it in real time, using AI-powered algorithms to recognize real-world items.
Lens first became available within Google Photos last month as part of the Pixel 2 launch, and now Google says it will arrive as a built-in feature of Google Assistant in the US, UK, Australia, Canada, India, and Singapore “in the coming weeks,” according to the blog post.
Right now, Lens won’t be able to identify everything around you; Google says it’s best used on simple items to start. It can recognize text, letting you save contact information from business cards, grab a URL from a poster or flier, call a phone number written down on paper, or open Google Maps with directions to a written address. Lens can also identify notable landmarks and pull up relevant information about them.