
Improve service with emojis and reading between the lines

Can we transform visual and auditory emotional cues into emojis? And if so, can we improve your digital experience by reading between the lines? I think so. This is the pitch I sent my colleagues to get them to research the subject.

Please note: This article was written by me and originally posted on the now-defunct Mirabeau.nl, the digital agency I was part of.

Mirabeau constantly researches how digital services can improve the human experience. One of the biggest hurdles in interpreting what a person needs is the ‘you know what I mean’ factor: people may say one thing, but often they need or mean something else entirely.

My pitch basically asks: could recording and ‘reading into’ emotions make digital services more empathetic, efficient and powerful? This is one of the directions I found interesting enough to explore.

The next step in intent detection: combine inputs and experience

One part of the research is to see how we can better detect intent. Although there are some interesting services that try to analyse your intent based on text alone, I think we need to combine several technologies to make sure ‘we know what you mean’.

So perhaps we can combine facial expressions, gestures and sounds to determine your tone of voice.

Basically, we want to see if we can ‘put emojis between the lines’ based on your face, voice and gestures, so digital services can better ‘read between the lines’.

While we’re at it, we should also see if we can use machine learning to hone our digital skills at ‘reading’ your intent. So there’s an AI aspect to this as well.
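To make that a bit more concrete, here’s a minimal sketch of what combining inputs could look like: each modality (face, voice, gestures) produces its own emotion scores, and a simple ‘late fusion’ step averages them into one overall reading. The classifier outputs and the weights are illustrative assumptions, not a real implementation.

```python
# A minimal sketch of 'late fusion': each modality produces its own
# emotion scores, and we average them into one overall reading.
# The scores below are hypothetical stand-ins for the output of real
# face/voice/gesture models; the weights are assumptions too.

EMOTIONS = ["happy", "stressed", "angry", "sad"]

def fuse(face, voice, gesture, weights=(0.5, 0.3, 0.2)):
    """Combine per-modality scores (dicts of emotion -> 0..1) into one label."""
    fused = {}
    for emotion in EMOTIONS:
        fused[emotion] = (
            weights[0] * face.get(emotion, 0.0)
            + weights[1] * voice.get(emotion, 0.0)
            + weights[2] * gesture.get(emotion, 0.0)
        )
    return max(fused, key=fused.get)

# Example: a smile, a tense voice, fairly neutral gestures.
face = {"happy": 0.7, "stressed": 0.2}
voice = {"stressed": 0.8, "angry": 0.1}
gesture = {"happy": 0.3}

print(fuse(face, voice, gesture))  # -> "happy" (the face weighs heaviest here)
```

A real system would learn these weights from data instead of hard-coding them, which is exactly where the machine learning aspect comes in.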

Using emojis to annotate intent

To interpret emotions, we also need a way to record them together with the words we use to express ourselves.

So could emojis be the musical score to our lyrics?

Although emojis are a cultural expression and might be interpreted in many ways, we think we can use some of them as clear cues for stress, happiness, jokes, sarcasm, anger, excitement or even despair.
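As a sketch, that annotation step could be as simple as a lookup from a detected emotion label to an emoji, attached to the transcribed words. The emotion-to-emoji table below is my own assumption; a real mapping would need testing across cultures.

```python
# A sketch of the 'annotation' step: map a detected emotion label to an
# emoji and attach it to the transcribed words. The emotion-to-emoji
# table is an assumption, not a validated mapping.

EMOJI_FOR = {
    "happy": "😄",
    "stressed": "😅",
    "angry": "😡",
    "sarcastic": "😏",
    "despair": "😩",
}

def annotate(utterance: str, emotion: str) -> str:
    """Append the emoji cue for the detected emotion, if we have one."""
    emoji = EMOJI_FOR.get(emotion)
    return f"{utterance} {emoji}" if emoji else utterance

print(annotate("I'm hungry", "angry"))  # -> "I'm hungry 😡"
```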

Imagine you’re a bit peckish. You’d probably say: “I’m hungry”. A robot built to fulfil your every need would start cooking a full brunch right away, but is that the right response? Well, that depends, right?

Read between the words

You might have meant: “I’m kind of hungry, so I might want to get a cookie in a while”. In an alternative scenario you might have skipped breakfast and are borderline ‘hangry’ (a fierce form of hunger expressed with a lot of curse words).

There’s a big difference between “I’m hungry 😅” and “I’m hungry 😡”.

Emojis could be a great way to record your intent ‘between the words’, rather than ‘between the lines’. With this added intent, machines can help you as if ‘they know what you mean’.
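A hypothetical sketch of that last step: the same words trigger a different response depending on the emoji annotation. The urgency mapping and the responses are made-up examples, just to show the idea.

```python
# A sketch of 'reading between the words': the same utterance gets a
# different response depending on its emoji annotation. The urgency
# levels and responses are illustrative assumptions.

URGENCY = {"😅": "low", "😡": "high"}

def respond(annotated: str) -> str:
    """Pick a response based on the emoji cue found in the utterance."""
    urgency = "normal"
    for emoji, level in URGENCY.items():
        if emoji in annotated:
            urgency = level
    if urgency == "high":
        return "Preparing a quick snack right now."
    if urgency == "low":
        return "Noted, I can suggest a cookie in a while."
    return "Shall I plan something for your next meal?"

print(respond("I'm hungry 😅"))  # -> a cookie, in a while
print(respond("I'm hungry 😡"))  # -> a quick snack, right now
```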

That’s the second part of the research: can we use emojis to predict, personalise and help you in a better way?

TL;DR: Let’s see if we can detect emotions, transcribe them into emojis and use them to read between the lines, to improve digital and physical services alike.