The future of human interaction with machines

EMOTIONS MATTER. We know it intuitively, but our machines don't. They do not know how we feel; they have no idea what we mean; and they cannot tell whether a quickly uttered "have a nice day" is actually genuine. This emotional blind spot is not for lack of trying. In fact, researchers and computer scientists have spent decades trying to endow machines with these inalienable qualities we humans take for granted.

In the past ten years there have been great breakthroughs in voice synthesis (now even with human-sounding inflections, such as a rising tone at the end of a sentence where appropriate), speech recognition, and text-to-speech. But here, too, the focus has been merely on words: speech recognition is essentially the reverse of text-to-speech, in that the computer hears what you are saying and converts it into text. What the computer does not register is the tremble in your voice, the enthusiasm in your inflection, the sorrow in your modulation, or the assertiveness with which you guide your vocal intonation.


Those researchers may now have found a way. By studying how people interact with computers, and to what extent computers are or are not designed for successful interaction with human beings, they have begun working in a new field called "emotions analytics": the identification and analysis of the full spectrum of human emotion, including mood, attitude, and emotional personality. Now imagine emotions analytics embedded inside your mobile applications and devices, opening up a new dimension of human-machine interaction. Think of an emotional analytics engine that takes a 20-second snippet of your spoken voice and, almost simultaneously, offers a succinct analysis of the underlying emotional communication on a dashboard.
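To make the idea concrete, here is a minimal sketch of how an application might call such an engine. The endpoint URL, request fields, and response format are hypothetical assumptions for illustration only, not the API of any specific product.

```python
# Hypothetical client for an emotions-analytics engine.
# The URL, request fields, and response schema are illustrative assumptions,
# not a real product's API.
import requests

ENGINE_URL = "https://api.example.com/v1/emotions/analyze"  # hypothetical endpoint

def analyze_snippet(audio_path: str, api_key: str) -> dict:
    """Upload a ~20-second voice snippet and return the engine's emotion summary."""
    with open(audio_path, "rb") as audio_file:
        response = requests.post(
            ENGINE_URL,
            headers={"Authorization": f"Bearer {api_key}"},
            files={"audio": audio_file},          # raw vocal snippet, not a transcript
            data={"duration_hint_seconds": 20},   # assumed parameter
            timeout=30,
        )
    response.raise_for_status()
    # Assumed response shape: {"primary_mood": ..., "secondary_mood": ..., "confidence": ...}
    return response.json()

if __name__ == "__main__":
    result = analyze_snippet("snippet.wav", api_key="YOUR_KEY")
    print(result)
```

The point of the sketch is that the engine consumes the voice itself, not a transcript, so the same call works regardless of what was said.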

One early example is Moodies, which asks individuals to speak their minds and then analyses the speech into a primary and a secondary mood, the latter usually reflecting a subliminal underlying emotional state. Because the app analyses vocal cues and intonation rather than the content of your words, the software can, at present, analyse any language (it has been tested on more than 30).
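Since the analysis relies on acoustic cues rather than vocabulary, the same pipeline applies to any language. The sketch below illustrates the general idea: extract language-independent vocal features (pitch and loudness) with the librosa library and map them to a coarse mood label. The feature set, thresholds, and labels are illustrative assumptions, not Moodies' actual algorithm.

```python
# Sketch of language-independent vocal-feature extraction.
# Thresholds and mood labels are made-up assumptions for illustration;
# this is not the actual Moodies algorithm.
import numpy as np
import librosa

def vocal_features(audio_path: str) -> dict:
    """Extract pitch and loudness statistics from a short voice snippet."""
    y, sr = librosa.load(audio_path, sr=16000)
    # Fundamental frequency (pitch) track; unvoiced frames come back as NaN.
    f0, _, _ = librosa.pyin(y, fmin=librosa.note_to_hz("C2"),
                            fmax=librosa.note_to_hz("C7"), sr=sr)
    # Frame-level loudness (root-mean-square energy).
    rms = librosa.feature.rms(y=y)[0]
    return {
        "pitch_mean": float(np.nanmean(f0)),
        "pitch_var": float(np.nanvar(f0)),   # pitch variability hints at expressiveness
        "loudness_mean": float(rms.mean()),
    }

def coarse_mood(features: dict) -> str:
    """Map features to a rough mood label (illustrative thresholds only)."""
    if features["pitch_var"] > 2000 and features["loudness_mean"] > 0.05:
        return "excited"
    if features["loudness_mean"] < 0.01:
        return "subdued"
    return "neutral"

if __name__ == "__main__":
    feats = vocal_features("snippet.wav")
    print(feats, coarse_mood(feats))
```

Nothing in this pipeline depends on recognising words, which is why such an approach can, in principle, generalise across languages.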

One important human-computer interaction factor is that different users form different conceptions, or mental models, of their interactions and have different ways of learning and retaining knowledge and skills (different "cognitive styles", as in, for example, "left-brained" and "right-brained" people). In addition, cultural and national differences play a part. Another consideration in studying or designing human-computer interaction is that user-interface technology changes rapidly, offering new interaction possibilities to which previous research findings may not apply. Finally, users' preferences change as they gradually master new interfaces.
