Google’s latest mobile search feature is aimed at window shoppers and friends who browse each other’s closets. The new multisearch ability in Lens lets you perform a visual search with a bit of textual aid to help steer the search engine. It’s not perfect every time, but it can help determine where your friend got her patterned skirt or whether that Pikachu pillow you saw in the window is for sale anywhere online.
Google says you can also use the multisearch feature to search for matching furniture in your home, which could prove helpful if you’re the kind of person who decorates piece-by-piece.
The new Lens ability is a beta feature available to anyone with the Google search app on iOS and Android. Here’s how it works: open the app, then tap the camera search icon. If you’re on Android and use a camera app with an integrated Google Lens shortcut, you can access the feature from there, too.
Snap a photo of what you’re looking at with Google Lens, make sure the option is set to Search, and swipe up on the results page. You’ll see all of Google’s visual matches, along with an Add to your search option at the very top. Tap that option, then type in a color, brand name, or anything else distinctive enough to narrow down the results.
If you’re shopping for something you see and know its manufacturer or brand, try entering the name to surface an official storefront. Or, if you’re hoping to find the item on Amazon, add that as your text, and Google will try to return links to sellers there.
I tried the new feature with some of the collector’s items and toys I have strewn around my home office. I took a photo of my Aggretsuko Squishme, then typed in “squishies” as the text aid, and Google returned related listings on Mercari and eBay. Of course, there were plenty of useless links thrown in there, too.
Google said multisearch works with both screenshots and live photos, and that it works best when the refining text is a color, a brand, or a visual attribute, something like “modern” or “bohemian,” or, in the case of a pair of shoes, “boots” or “flats.” Google also said it’s exploring ways the new multisearch feature might be enhanced by MUM, the AI model it introduced at its developer conference last year.