Search In The Metaverse: Why Brands Should Optimize For Visual Search
For over a decade, Google search has been the first place we go whenever we need to find information on virtually anything. It’s used countless times a day on our computers, our smartphones, in our cars and even in our homes. Whether it’s to make a dinner reservation or to look up flight information, saying “Hey Google” or “Alexa” has become second nature. Until now, our interactions with these services have typically revolved around screens and voice, but ask experts where the next advancement in interaction is heading and they’ll say it’s visual. Because visual search lets consumers make buying decisions instantly and simply, it’s not surprising that Google, Amazon and other tech companies are making a massive effort to make their services accessible via your camera.
The human brain processes visuals 60,000 times faster than text, and 90 percent of the information transmitted to the brain is visual, so it only makes sense that visual search will resonate with consumers.
Leveraging AR navigation on our mobile phones has become increasingly popular, so much so that Google has trained Lens to scan your environment and instantly use its visual search capabilities to identify virtually anything. For instance, you can take a photo of a fern and Google Lens will instantly identify exactly what variety of fern it is and provide options for where to buy it. This is a prime example of how Google and Amazon have enabled companies to quickly and easily manage machine learning models, allowing them to keep pace with consumer needs.
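At its core, a Lens-style lookup can be thought of as nearest-neighbor search over image embeddings: the query photo is mapped to a vector, which is compared against vectors for known products. The sketch below is a minimal, hypothetical illustration; the hand-made 4-dimensional vectors and catalog labels stand in for embeddings a trained vision model would produce, and do not reflect any real Google or Amazon API.

```python
import math

# Hypothetical toy embeddings: in a real system these would be produced by a
# trained vision model; here they are hand-made vectors for illustration only.
CATALOG = {
    "boston fern":     [0.9, 0.1, 0.0, 0.1],
    "maidenhair fern": [0.8, 0.2, 0.1, 0.0],
    "cactus":          [0.1, 0.9, 0.2, 0.0],
}

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def identify(query_embedding):
    """Return the catalog label whose embedding is most similar to the query."""
    return max(CATALOG, key=lambda label: cosine(query_embedding, CATALOG[label]))

# A query photo whose embedding lands in the fern cluster.
print(identify([0.85, 0.15, 0.05, 0.05]))  # boston fern
```

A production system would search millions of embeddings with an approximate nearest-neighbor index rather than a linear scan, but the ranking principle is the same.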
“Brands putting SEO on the back burner will have a lot of catching up to do as visual search takes precedence with consumers,” said Lisa Buyer, author of Social PR Secrets and CEO of The Buyer Group.
Social Networks And The Future Of E-Commerce
The recent trend of social networks leveraging visual search has also let us see more broadly how they fit within the larger e-commerce framework. With Instagram and Facebook offering virtual storefronts, platforms like Snapchat and TikTok have to keep pace by integrating AR directly. For example, a consumer could visit a virtual storefront on Instagram, see an influencer use AR to promote the store’s offerings, and then instantly buy and convert. Considering the shift from text, audio and manual search to visual search, we’re just beginning to scratch the surface of AR’s future capabilities.
As for e-commerce and the larger retail industry, Google Lens gives consumers an easy way to shop, which has rapidly accelerated customer adoption.
“E-commerce brands, especially D2C, are notorious for lacking in optimization efforts. The good news is those embracing a modern visual search strategy will gain the organic benefits of reaching consumers in their journey,” said Buyer. “This is a lot less expensive in the long run than trying to buy your audience as paid media becomes more complicated at a higher cost.”
According to recent data from Shopify, businesses are seeing conversion rate lifts of 50% to 250% from letting shoppers trial products in AR experiences embedded directly on the retailer’s website. Google Lens has also directly incorporated branding to further expand its reach, with advertisements a consumer can scan with Lens to launch an experience or link directly to a product. Leveraging this type of AR is also particularly relevant for airport navigation.
Taking A Page From Snapchat: Traditional Search Going Visual
Gone are the days of traditional retail. Social media platforms have become heavily involved in visual search, especially because the buyer’s discovery component directly plugs into a virtual storefront, which instantly converts the consumer to a customer. Furthermore, a user might share an experience with their friends on social media, which converts the digital medium directly into a sales funnel. It’s clear that AR can impact every step of the customer journey—from brand awareness to point of sale, even to post-purchase.
If you visit Snapchat and scroll to the bottom of the carousel, you’ll see they have changed the navigation to incorporate a vast array of AR tools. When you launch the Snapchat camera, it scans your environment and suggests specific filters based on your location. We’re seeing Snapchat advance in terms of using features and technology like LIDAR; in fact, they are one of the only companies with this capability. Although the mass market can’t purchase them just yet, Snapchat recently introduced a pair of 3D spectacles that let you see the world in true augmented reality. This is ultimately an experiment for Snapchat and AR creators, showing how advanced AR experiences will unlock a new, hands-free way to engage with the technology.
But the move toward visual search also brings with it new challenges and potential issues that experts point out.
“Everyone is training their models from the same datasets and very similar architectures (neural nets), so we’re about to hit the limit of Bayes optimal error. There are blog posts and GitHub repos floating around showing how to generate ‘fake’ images using GANs (generative adversarial networks). Fraud detection will be the key to user adoption. I’d like to see more research and construction of less biased, larger open databases to more accurately reflect real-world conditions,” said Anne Ahola Ward, O’Reilly author, futurist and CEO of CircleClick.
One potential answer could rest with volumetric video and photogrammetry: training models on datasets of volumetrically captured or photogrammetry-scanned assets could get us closer to the real items than ever before.
The Next Wave Of Retail
Whether it’s Google, Amazon, Instagram, TikTok or Snapchat, to provide better purchase predictions, businesses need to efficiently discover consumer patterns and anomalies so they can stay agile amid shifting market dynamics.
As with the visual search function in AR, data scientists and engineers must replicate the real world in the cloud so that the algorithm knows that the building you’re looking at is, for example, a Starbucks. This is also essential when using Google Street View: a visual search function will recognize the view and know exactly where you are, because the database holds a replica of that exact street view and its surroundings. This way, businesses can present relevant information and products to you based on your exact location.
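The street-view matching described above can be sketched the same way: compare a signature of the camera’s current view against a database of pre-computed scene signatures and return the closest match. Everything in the sketch below (the place names, vectors and distance metric) is a simplified assumption for illustration, not an actual Street View interface.

```python
import math

# Hypothetical database mapping places to pre-computed scene embeddings,
# standing in for a cloud-scale index of street-view imagery.
SCENES = {
    "Starbucks, 5th Ave": [0.9, 0.2, 0.1],
    "Main St. bookstore": [0.1, 0.8, 0.3],
}

def euclidean(a, b):
    """Euclidean distance between two equal-length vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def locate(view_embedding):
    """Return the stored scene closest to the camera's current view."""
    return min(SCENES, key=lambda place: euclidean(view_embedding, SCENES[place]))

print(locate([0.85, 0.25, 0.1]))  # Starbucks, 5th Ave
```

Once the scene is identified, the matched place acts as the key for serving location-specific offers or AR overlays.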
“Today’s smart CMOs have their third eye on the opportunity of getting found in the metaverse, including optimizing for visual search,” said Buyer.
Machine learning and AR will play an important part in the future of retail; they are already rapidly changing our everyday decisions and, ultimately, how people engage with the world around them.
This article was written in collaboration with David Ripert, Poplar Studio CEO & Co-Founder.