We created something wonderful and kind of nerdy. It’s called BeerRecognition: an immersive experience with a central role for beer bottles. To build it we mixed pattern recognition, augmented reality and beer brands into one concept. A recipe for success.
Please note: This article and its materials are published here courtesy of Mirabeau, the company I wrote this article for.
As a Strategic Innovator I have the honour and responsibility of creating excuses to tinker with new technology for the coolest humans of all: normal, generic people like you and me. And because we’re a digital design agency we tend to use new technologies as inspiration to create something fun, useful, and valuable.
We like to call this flavour of applying innovative technology and human-centred ideas ‘applied innovation’.
BeerRecognition came to be when we decided: “When drinking awesome beers, people should be enveloped by the full beer experience. A label and a fancy website can only do so much.”
And so we applied our process to figure out how to make the beer tell its own story. We mostly followed the Human Centered Design steps to get to the answers, although to make room for our discovery we also included ‘tactile tinkering’ in our method.
Human Centered Design consists of five activities that are run through iteratively: Empathise, Define, Ideate, Prototype, Test. In our innovation program we use prototyping as leverage to further the other activities. That’s the process of ‘tactile tinkering’.
Tactile tinkering is basically touching and manipulating tech and design and figuring out how they can be applied even better. Holding your experiment while re-ideating, re-empathising and restructuring the story is something that drastically improved our outcomes AND learnings.
We try to start our discovery projects with a ‘what if’ scenario. This way it’s open to interpretation and makes us think about possible outcomes. For BeerRecognition it was:
What if we immerse people in the story and inspiration around a beer, or another drink?
Beer seemed a logical choice because the labels are very distinctive and pretty fun. There are real fans, materials and backstories around beer, so plenty of inspiration to go around.
Did I mention fun must be part of our discovery projects? Well it is! 😀 We found that intrinsic motivation when designing an idea is very helpful if you want to push the envelope of what’s possible, but also to drive energy in innovation teams and stakeholders.
So ‘just recognizing beers’ isn’t cool enough, or rather: it was not enough for the concept to work. We wanted something that blows people away, not something that merely informs them with cute factoids.
Pushing technology to discover usage, Mythbusters style
Most of the time we have a set goal in mind before we start. For example: explore a technology concept such as recognizing things based on their unique physical properties.
We know how things SHOULD work, but most of the time we have not seen a good example based on existing technology.
In true Mythbusters style we work the challenge from two sides:
- Can we create our idea with existing technology?
- What do we need to replicate the ideal scenario?
Our goal: how can we combine common design trickery and bleeding-edge technology to build the perfect human-centred experience?
Under the hood
Augmented Beerreality in Unity
We wanted to immerse people in the world of each unique beer. Adding Augmented Reality (AR) seemed like a logical step to take.
Unity is a (mobile) visual engine used to build games, AR apps and all kinds of visually powerful experiences. Advised by our AR expert, we built an environment where somebody can step in, show their beer and get immersed in inspiration and information.
Recognizing beers with TensorFlow
At the backend we used the Artificial Intelligence tool TensorFlow to ‘learn’ the labels of our beers. Imagine over 730 images per beer bottle to learn it from every angle!
There are AI tools available that recognize bottles in general, but we needed to recognize specific brands of beer. So, much like with the general cloud services of Amazon, Google, Apple and Microsoft, we had to train a specific model for — say — a ‘Lost in Spice’ beer.
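The article doesn’t include the actual TensorFlow training code, so here is a deliberately tiny, pure-Python stand-in for the core idea: turn each training photo into a feature vector, average the vectors per brand, and classify a new photo by the nearest brand centroid. The label names and the 3-bin “colour histogram” features are hypothetical illustrations, not the project’s real data or pipeline.

```python
from collections import defaultdict
import math

def centroid(vectors):
    # Average the feature vectors of one brand's training images.
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def train(samples):
    # samples: (label, feature_vector) pairs, standing in for features
    # extracted from the ~730 photos taken of each bottle.
    by_label = defaultdict(list)
    for label, vec in samples:
        by_label[label].append(vec)
    return {label: centroid(vecs) for label, vecs in by_label.items()}

def classify(model, vec):
    # Pick the brand whose centroid is nearest in Euclidean distance.
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(model, key=lambda label: dist(model[label], vec))

# Toy 3-bin colour histograms for two hypothetical beer labels.
training = [
    ("lost_in_spice", [0.8, 0.1, 0.1]),
    ("lost_in_spice", [0.7, 0.2, 0.1]),
    ("other_beer",    [0.1, 0.2, 0.7]),
    ("other_beer",    [0.2, 0.1, 0.7]),
]
model = train(training)
print(classify(model, [0.75, 0.15, 0.1]))  # → lost_in_spice
```

In a real TensorFlow setup the hand-rolled features and centroids would be replaced by a trained neural network, but the workflow is the same: many labelled photos in, a brand prediction out.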
Innovation is improvisation
We hoped to use a new Intel RealSense camera with depth perception to distinguish people from the background of the environment they step into.
Alas, the camera release was delayed and we had to use a technology that weathermen have used since the ’90s: green screens. When you step into the experience you also step in front of a bright green screen. This specific colour is replaced in Unity with beautiful, fun, animated backgrounds.
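Chroma keying is simple in principle: any pixel whose green channel clearly dominates red and blue is treated as screen and swapped for the background. A minimal pure-Python sketch of that rule (the actual project does this in Unity on the GPU; the threshold and the 1×3 toy frame below are illustrative assumptions):

```python
def is_chroma_green(pixel, threshold=60):
    # A pixel counts as "green screen" when its green channel clearly
    # dominates both the red and the blue channel.
    r, g, b = pixel
    return g - max(r, b) > threshold

def chroma_key(frame, background):
    # Replace green-screen pixels with the matching background pixel,
    # keeping everything else (the person and the beer) intact.
    return [
        [bg if is_chroma_green(px) else px for px, bg in zip(f_row, b_row)]
        for f_row, b_row in zip(frame, background)
    ]

# 1x3 toy frame: person pixel, green-screen pixel, person pixel.
frame      = [[(200, 150, 120), (10, 255, 10), (90, 60, 40)]]
background = [[(0, 0, 255),     (0, 0, 255),   (0, 0, 255)]]
print(chroma_key(frame, background))
# → [[(200, 150, 120), (0, 0, 255), (90, 60, 40)]]
```

A production keyer adds edge softening and spill suppression, but the per-pixel replacement above is the heart of the trick.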
Learnings and next steps, with you?
This was a really cool project to work on, but it’s not finished by a long shot: we’ve got a wishlist as long as our arm. To mention some items: ditch the green screen and use cameras with depth perception; use gestures to interact with the experience; add more interactive animations; and track the body in the experience, so we can even put some elements ON you.
Not to mention the technical learnings we want to elaborate on: take the training of bottles and labels to greater heights, experiment with faster hardware, and build this on a mobile platform. Perhaps this should be a Snapchat plug-in? Who knows! Lots to experiment with!
All in all, we think we’re ready to take the next step and implement both AR and TensorFlow model training in production-ready applications — and we want to invite you to build and re-imagine the beer experience from the ground up!
Imagine the potential: not only creating inspiring environments for your experience-driven marketing, but also empowering employees to see more with the help of AI, AR and the high-definition optics we find in every mobile device in the field.
Do you want to know more about our Innovation Process, AR, AI, or Applied Innovation? Drop us a line! We’re sure you won’t be disappointed!