Visual search still hampered by image issues
NEW YORK — Imagine using your phone to snap a photo of the cool pair of sunglasses your friend is wearing and instantly receiving a slew of information about the shades, along with a link to order them.
It's a great idea — but it doesn't quite work.
Though many companies are trying to make “visual searches” a reality, this seemingly simple notion remains elusive.
Take Amazon, which made visual searches a key feature in its new Fire smartphone. The e-commerce company said the feature, known as Firefly, can recognize 100 million items. It's similar to a Flow feature Amazon has on its apps for other phones.
So far, Firefly can reliably make out labels of products such as Altoids or Celestial Seasonings tea. That makes it easy to buy items such as groceries online.
But try it on a checkered shirt or anything without sharp corners, and no such luck.
“It works really well when we can match an image to the product catalog,” said Mike Torres, an Amazon executive who works on the Fire's software. “Where things are rounded or don't have (visual markers) to latch on to, like a black shoe, it's a little harder to do image recognition.”
Visual search is important to retailers because it makes mobile shopping a snap — literally.
It's much easier to take a picture than to type in a description of something you want. Shopping on cellphones and tablets is still a small part of retail sales, but it's growing quickly. That makes it important to simplify the process as much as possible — especially as people look to visual sites such as Instagram and Pinterest as inspiration for purchases.
“Retailers are trying to get the user experience simple enough so people are willing to buy on their phones, not just use them as a research tool,” eMarketer analyst Yory Wurmser said.
Mobile software that scans codes, such as QR codes and UPC symbols, is fairly common.
Making apps that consistently recognize images and objects has been more challenging.
Forrester analyst Sucharita Mulpuru believes reliable visual search could take at least three more years.
Since 2009, Google's Goggles app for Android has succeeded in picking up logos and landmarks. But Google says on its website that the app is “not so good” at identifying cars, furniture and clothes in photos.
What's holding visual search back?
The technology works by analyzing visual characteristics, or points, such as color, shape and texture. Amazon's Firefly, for example, analyzes a few hundred points to identify a book and as many as 1,000 for paintings. U.K. startup Cortexica uses 800 to 1,500 points to make a virtual fingerprint for an image. It then scans its database of about 4 million images for a match.
Objects without labels or other easily identifiable markers are difficult to recognize. Lighting conditions, photo quality, distance, angles and other factors can throw the technology off. Visual search works best when there is a clearly defined image on a white background.
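The fingerprint-and-match process described above can be sketched in a toy form. This is a hypothetical simplification, not any company's actual algorithm: real systems extract hundreds of points per image, while here each "fingerprint" is just a short list of made-up numbers standing in for color, shape and texture measurements, matched by nearest-neighbor distance.

```python
import math

def fingerprint_distance(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def best_match(query, catalog):
    """Return the catalog item whose fingerprint is closest to the query."""
    return min(catalog, key=lambda item: fingerprint_distance(query, item["fingerprint"]))

# Toy catalog: names and invented four-point fingerprints.
catalog = [
    {"name": "Altoids tin",   "fingerprint": [0.9, 0.1, 0.8, 0.3]},
    {"name": "black shoe",    "fingerprint": [0.1, 0.1, 0.2, 0.1]},
    {"name": "checked shirt", "fingerprint": [0.5, 0.9, 0.4, 0.7]},
]

# A photo of a labeled tin, encoded in the same toy scheme: its points sit
# close to the catalog entry, so the match succeeds. A featureless object
# like the black shoe yields points far from everything, which is why the
# article says such items are "a little harder" to recognize.
query = [0.85, 0.15, 0.75, 0.35]
print(best_match(query, catalog)["name"])  # prints "Altoids tin"
```

The sketch also illustrates why a white background helps: the cleaner the photo, the closer the query's measured points land to the catalog fingerprint.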
Some retailers are finding success with visual search by keeping the selection of searchable products limited.
Target's “In a Snap” app works only with items from its Room Essentials furniture, bedding and decor line. And it works only when snapping a product image in a magazine ad, not when you see the actual product on a shelf. When a shopper scans the ad, items pop up for the shopper to add to a shopping cart.
Heels.com, an online shoe retailer, keeps visual search limited to shoes. Shoppers upload pictures of shoes or send links, and are offered similar pairs for sale on the company's website.
“People shop through images nowadays,” said Heels.com CEO Eric McCoy. “We want to give them the exact shoe, or something similar.”
So, the race is on to perfect the technology that will lead to smartphone apps that easily recognize objects in a real-world environment.
Cortexica's founders spent seven years on academic research before forming the company in 2009. Since then, the business has been trying to mold its technology to work more like the human brain when it comes to identifying objects.
“Someday you'll be taking a picture of a whole person, and it will identify the different things they're wearing and offer recommendations,” said Iain McCready, CEO of Cortexica. “That's really challenging technically, but that's what people tell me they really want to do.”
Superfish CEO Adi Pinhas said he thinks it will be normal in two or three years to use your smartphone to search for things visually.
“Your camera will be as smart as the rest of your smartphone,” he said.
Once that happens, Forrester's Mulpuru said, it will “unleash a whole new type of e-commerce.”