24-26 March 2019, Berlin

with the main Summit 25-26 March

07.03.2018

Snap Tech’s Jenny Griffiths on how visual search can be transformative for publishers

By Ashley Norris

Monetising content sympathetically, rather than alienating users by being too disruptive, is perhaps the biggest challenge facing media companies today.

One unique and innovative London-based company believes it has created a solution which could be transformative for many consumer-focused publishers. Snap Tech’s visual search tool enables publishers to offer their readers shoppable versions of the items displayed on editorial pages.

At DIS, company founder and CEO Jenny Griffiths will be speaking about ‘How the rise of Visual Search will impact on your media business.’ Here she explains how the technology has developed, where visual search is likely to go in the future and why Augmented Reality and Artificial Intelligence might further enhance its functionality.

 ***Registration for DIS 2018 (19-20 March in Berlin) is now available. Save hundreds of euros on the registration price when you sign up before 13 March 2018. Secure your place here***

Can you briefly explain what Snap Tech is and how the company has developed?

Snap Tech is changing the way the world shops. We offer world-class visual search tools for publishers, retailers and influencers in the fashion industry and beyond, which are proven to drive revenue. 

Snap Tech leads consumers to exactly what they want to buy online through visual search, from using social media content as inspiration for a search to discovering alternative items within a retail catalogue. We work with retailers, publishers and influencers to achieve two main goals: to make the online shopping experience as satisfying as shopping in the real world, and ultimately to add to their bottom line through increasing conversions or providing brand new revenue streams based on their existing content.

The company really hasn’t changed much since its inception – we were the first in the world to do cross-platform visual search in fashion on mobile and web, and we’re continuing our streak of world firsts – with our in-store technologies, chatbots and AR work last year. It’s great to see the industry embracing a future featuring AI and visual search, so it’s a really exciting time for us as a company.

What would you say are your core solutions - and how are they currently being implemented in the media? 

Snap Tech’s core API focuses on finding similar fashion from an input image, be it a product shot, a model shot, user generated content or celebrity editorial photography.
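Snap Tech’s actual API is not documented in this piece, but the core operation it describes – ranking catalogue items by visual similarity to an input image – typically works by comparing feature vectors extracted from images. The sketch below is purely illustrative, with toy hand-written vectors standing in for real model-produced image features; all names and numbers are assumptions, not Snap Tech’s implementation.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def find_similar(query_vec, catalogue, top_k=3):
    """Rank catalogue items by visual similarity to the query vector.

    catalogue: list of (item_id, feature_vector) pairs, where the
    vectors would normally come from a vision model applied to
    product images rather than being written by hand.
    """
    scored = [(item_id, cosine_similarity(query_vec, vec))
              for item_id, vec in catalogue]
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return scored[:top_k]

# Toy 3-dimensional "embeddings" standing in for real image features.
catalogue = [
    ("red-dress", [0.9, 0.1, 0.0]),
    ("blue-jeans", [0.1, 0.8, 0.3]),
    ("red-skirt", [0.8, 0.2, 0.1]),
]
print(find_similar([0.85, 0.15, 0.05], catalogue, top_k=2))
```

In a production system the nearest-neighbour search would run over millions of vectors with an approximate index rather than a linear scan, but the ranking principle is the same.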

For instance, Marie Claire magazine use our technology to write “get the look” style content for their publication. Using our tool, editors can source similar looks to celebrity shots in under five minutes, curating collections which, through our Snap Similar functionality, remain shoppable for as long as the content is online rather than only while an item is in stock.

LOOK Magazine took a different approach. In addition to the product above, they also used our off-the-shelf shop to create a whole new ecommerce platform in under two weeks – with full visual search integrated, and picking which retailers to work with from our selection of over 250.

Visual search has been a buzzword in technology for nearly a decade now. Why do you think it has taken so long to establish itself?

Generally speaking, there are a number of factors that always impact the adoption of new technology; but in the case of computer vision, I would say that it was mainly the performance of the technology and then the appetite for user adoption.

There are two factors in recent years that have really tipped the balance for the popularity of visual search. The first is the increased availability of training data, which has resulted in a leap in the accuracy of results and greater flexibility in the applications of visual search, no longer constraining search to a single type of image.

Secondly, the way that people are consuming content has changed, with mobile interactions surpassing actions on desktops and laptops. That naturally encourages people to shift their way of searching from the tiny keyboards on their mobiles to the incredibly powerful sensor that is the camera.

And how do you think it will change the media in the medium and long term?

The obvious, immediate challenge for media at the moment is sustaining advertising revenue, especially among audiences who are becoming increasingly cynical or intolerant of more traditional advertising formats. Visual search allows custom advertisements to be generated per user and per item – so that instead of being bombarded with advertising content which can feel irrelevant at best, or too in-your-face at worst, consumers can be served relevant advertising content based on an item they’re already engaging with. Simply put, our solutions allow publishers to monetise content sympathetically, rather than at the cost of user experience.

What are the key barriers that prevent publishers from experimenting with visual search platforms like your own and others?

Like in any business, the key barrier is normally resource. As you can imagine, publishing houses small and large don’t have developers just kicking their heels looking for new development tasks – they normally have fully scheduled backlogs and high-priority tasks which are stacking up like all of us! It’s why we invested over half a year standardising our integration process ourselves – so that it now takes a day (at most) to install and try. That’s definitely making it easier for publishers to experiment!

The other one is technology performance. I’m always astounded by the number of really great companies who have used resources to try new technology but haven’t done their due diligence – how long have they been experts in the field, is the company financially secure, does the technology perform as expected, does the company fully understand the environment that they’re designing products for? For instance, fashion editorial products have to be beautiful, glossy, and deliver top results for shape as well as colour and texture, and quickly.  

Always ask your suppliers for live demonstrations of their vision tech, because there are a lot of companies around with very glossy videos full of examples, but no real-world implementations. And it’s virtually impossible to convince someone to give a technology a new trial if they already feel they’ve wasted time and energy in the past on a sub-standard implementation of visual search technology.

How do you see publishers using Augmented Reality in visual search, shopping extensions etc.? Is this the technology that will drive this type of online shopping into the mainstream?

There are so many different applications for visual search. For instance, at Snap Tech we have products for web (find similar items from product images), mobile (find similar items on the move), editorial tools (get the look and monetise in perpetuity), in-store (bringing visual search to fitting rooms), chatbots, AR and many, many more. It’s not because we’re horribly unfocused as a company; it’s because users want to interact with technology in different ways dependent on anything from their mood to where they are at that moment in time. Your readers are exactly the same, so I always find it strange when people expect there to be a “magic solution” on one platform which fits everyone perfectly.

How might Artificial Intelligence impact on Visual Search in the future? Are there any other technologies that you believe will also empower visual search as a medium?

Artificial Intelligence is already very much at the heart of visual search, as it’s effectively the computer making automated recommendations based on its own rulesets. These rulesets can be formed through machine learning, or defined using mathematical heuristics. In our case we use a combination of the two, and it’s amazing that after seven years of educating people about visual search for fashion, it’s finally becoming a part of the standard lexicon – it’s now less about educating people about AI, and more about having in-depth conversations around business impacts.
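The combination Griffiths describes – a machine-learned score blended with hand-defined heuristic rules – can be sketched in a few lines. Everything here is a hypothetical illustration: the attribute names, the weights, and the rules are invented for the example and are not Snap Tech’s actual ruleset.

```python
def rank_results(query, candidates):
    """Blend a learned similarity score with simple heuristic rules.

    Each candidate is a dict with a model-produced 'score' (a stand-in
    for machine-learned visual similarity) plus attributes that the
    heuristic rules inspect. The weights are illustrative only.
    """
    def adjusted(c):
        score = c["score"]                  # learned component
        if c["colour"] == query["colour"]:  # heuristic: boost colour match
            score += 0.10
        if not c["in_stock"]:               # heuristic: demote unavailable items
            score -= 0.50
        return score

    return sorted(candidates, key=adjusted, reverse=True)

query = {"colour": "red"}
candidates = [
    {"id": "a", "score": 0.90, "colour": "blue", "in_stock": True},
    {"id": "b", "score": 0.85, "colour": "red",  "in_stock": True},
    {"id": "c", "score": 0.95, "colour": "red",  "in_stock": False},
]
print([c["id"] for c in rank_results(query, candidates)])
```

The design point is that the learned score and the heuristics stay separate, so business rules can be tuned without retraining the underlying model.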

There are so many other technologies that can empower visual search as a medium. For instance, wearables can help add more context to the input image – from our location to environmental conditions. AR and VR will change the way that we present results to consumers, and how we really make them feel that visual search is an extension of their normal browsing behaviour.  And finally, everything’s down to camera and image quality; the better the image, the more information we have to work with.


Copyright © 2017 Digital Innovators' Summit. All rights reserved.