Here is the latest version of a headset that is capable of detecting and influencing your emotions.
The headset, developed by imec and Holst Centre, is promoted as “breakthrough technology to advance neuro research, e-learning and virtual gaming”.
Before we get too carried away with this idea, and before we delve into the true implications of this technology, keep in mind that we have been using technology to modify our emotions for decades. We are emotionally influenced by movies and music and may choose to alter or augment our moods by matching them to the appropriate media output. Even the simplest video games can influence our emotional states in positive and negative ways. Those who have experienced games in virtual reality know that reason can be bypassed in VR environments. There are still games my gamer son won’t play because they are simply too scary, and even though I logically know I am not on the edge of a high building, I have great difficulty making my mind believe that I can take that first step into the abyss. So how will this headset change anything?
Here is what the company says about the integration of music and emotion detection.
“With the integration of music playback, the system can not only measure, but also influence the emotions of the person that is wearing the headset. With the help of Artificial Intelligence our headset can learn the personal musical preferences of the wearer and compose and playback, in real-time, music that fits his preferences and influences his emotions to achieve the wearers’ desired emotional state.”
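The closed loop the quote describes (measure the wearer's emotional state, then adjust the music to steer it toward a desired state) can be sketched in a few lines of Python. Everything below is a hypothetical illustration: the valence/arousal state model, the tempo rule, the responsiveness assumption, and all function names are my own inventions, not imec's actual algorithm.

```python
# Hypothetical sketch of a closed-loop "emotion steering" system.
# The valence/arousal model and the tempo rule are illustrative
# assumptions, not imec's published method.

from dataclasses import dataclass


@dataclass
class EmotionalState:
    valence: float  # -1 (negative) .. +1 (positive)
    arousal: float  # 0 (calm) .. 1 (excited)


def choose_tempo(current: EmotionalState, target: EmotionalState,
                 base_bpm: float = 100.0) -> float:
    """Nudge musical tempo toward the arousal level we want.

    A simple proportional rule: if the wearer is more excited than
    the target state, slow the music down, and vice versa.
    """
    gain = 40.0  # bpm per unit of arousal error (assumed)
    return base_bpm + gain * (target.arousal - current.arousal)


def feedback_step(current: EmotionalState, target: EmotionalState,
                  responsiveness: float = 0.3) -> EmotionalState:
    """Simulate one loop iteration: play music, then re-measure.

    We assume (unrealistically simply) that the wearer's state moves
    a fixed fraction of the way toward the target each step.
    """
    return EmotionalState(
        valence=current.valence + responsiveness * (target.valence - current.valence),
        arousal=current.arousal + responsiveness * (target.arousal - current.arousal),
    )


state = EmotionalState(valence=-0.6, arousal=0.9)   # stressed wearer
target = EmotionalState(valence=0.5, arousal=0.3)   # desired: calm, positive

for _ in range(10):
    bpm = choose_tempo(state, target)
    state = feedback_step(state, target)
```

After a handful of iterations the simulated state converges on the target, which is exactly the "achieve the wearer's desired emotional state" claim reduced to its simplest control-loop form.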
Isn’t this what drugs do? If drugs are used to alter our emotional states and change our perceptions of reality, won’t these headsets do more or less the same thing? The somewhat surprising phrase in the quote is that the device will interact with AI to “compose” music to achieve the user’s desired emotional state. This raises the question: could a nefarious actor use the same technique to alter a person’s personality in order to manipulate them?
The new headset is certainly a breakthrough in terms of comfort and ease of use. In the past, in order to access a person’s emotional states, electrodes had to be glued to the skull at precise locations. The new headset uses ‘dry’ electrodes and is designed to fit so that the embedded electrodes sit above the precise brain regions that need to be monitored. It would seem but a matter of time before VR headsets, such as the Sony PlayStation VR headset, come with similar emotion-detecting electrodes.
But why wait? The future is already here. Looxidlabs has already integrated brain sensors with a dual camera VR headset.
The system can detect what a user is looking at on a VR screen and simultaneously chart the emotional response to it. The diagram below shows how the device is integrated.
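Conceptually, this kind of system only has to join two time-stamped logs: one recording what the user's gaze was fixed on, and one recording the emotion sensor's readings. The sketch below is purely illustrative; the log formats, the nearest-timestamp join, and the single "arousal" value are my own assumptions, not Looxidlabs' actual API or data model.

```python
# Hypothetical illustration of pairing what a VR user looks at
# with a simultaneous emotion reading, via a nearest-timestamp join.
# Formats and values are invented for illustration.

from bisect import bisect_left

# (timestamp_seconds, object_in_view) from the gaze tracker
gaze_log = [(0.0, "menu"), (1.2, "enemy"), (2.5, "treasure"), (4.0, "menu")]

# (timestamp_seconds, arousal 0..1) from the emotion sensor
emotion_log = [(0.1, 0.2), (1.3, 0.8), (2.6, 0.6), (4.1, 0.1)]


def nearest_emotion(t: float) -> float:
    """Return the arousal sample closest in time to t."""
    times = [ts for ts, _ in emotion_log]
    i = bisect_left(times, t)
    # Only the neighbours around the insertion point can be closest.
    candidates = emotion_log[max(0, i - 1):i + 1]
    return min(candidates, key=lambda s: abs(s[0] - t))[1]


# Chart the emotional response to each fixated object.
response = [(obj, nearest_emotion(ts)) for ts, obj in gaze_log]
```

With data like this, "the enemy spiked arousal, the menu flattened it" falls straight out of the joined table, which is presumably the kind of insight the research-tool marketing has in mind.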
Currently, the device is being marketed as a research tool. Marketers could get quick insight into which ads have the most effect on potential customers. Game designers could determine which images create the desired emotional impact. Fashion designers could target market niches more precisely. The list goes on and on. The device will be available for pre-orders on February 1, 2018.
At first, I couldn’t really understand how game developers could use this technology to make any substantial difference to gaming. Would users be able to preset emotional options, such as the level of fear they are willing to tolerate? Possibly, but my guess is that marketers have more financial goals in mind, and I don’t say this without reason.
Take a look at this leaked screenshot from a marketing company that specializes in marketing within the gaming environment. From information given in the leak, the marketing may be associated with EA games. Notice how they detect the gamer’s psychological state and use it to drive microtransaction income.
In short, the marketers determine a gamer’s psychological state by using the gaming device microphone to analyze the gamer’s vocal characteristics. Apparently, a depressed gamer will have a high purchase rate for in-game products (microtransactions) but will then tend to experience buyer’s remorse, which may inhibit future purchases. Thus, if the marketers can manipulate the user into a non-depressed state, they would increase long-term revenues.
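The pipeline the leak describes (crude vocal features in, an offer-timing decision out) can be made concrete with a toy sketch. To be clear, everything here is invented for illustration: the two vocal features, the thresholds, and the strategy names are my assumptions; the leak gives no such specifics.

```python
# Purely illustrative sketch of the claimed pipeline: estimate a
# gamer's mood from crude vocal features, then decide when to show
# an in-game offer. Features, thresholds, and labels are invented.


def estimate_mood(pitch_variability: float, speech_rate_wps: float) -> str:
    """Classify mood from two toy vocal features.

    Flat pitch and slow speech are (simplistically) read as
    depressed; anything else as neutral. Thresholds are assumptions.
    """
    if pitch_variability < 0.2 and speech_rate_wps < 1.5:
        return "depressed"
    return "neutral"


def offer_strategy(mood: str) -> str:
    """The leak's claimed logic: depressed players buy more now but
    regret it and churn later, so cheer the player up before
    showing offers."""
    if mood == "depressed":
        return "boost_mood_first"  # e.g. an easier level, upbeat music
    return "show_offer"


mood = estimate_mood(pitch_variability=0.1, speech_rate_wps=1.2)
plan = offer_strategy(mood)
```

The unsettling part is not the ten lines of logic but the input: a always-on microphone feeding a mood model the gamer never consented to.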
The leaked documents also show how the marketers analyzed the sound of a car engine picked up through a user’s smartphone. Using a combination of algorithms, they were able to determine the brand of car the gamer drove and thus estimate the individual’s social status, making them easier to target for marketing. Numerous other data-gathering tools were mentioned in the leak, as well as how the data could be used for specific purposes.
It is interesting to note that EA initially dropped microtransactions from its Star Wars Battlefront II game when it was criticized in the EU for encouraging gambling within the gaming environment. In any event, it doesn’t take much imagination to see how game developers could use an emotion-detecting VR headset to further their marketing success and generate income for the company. The development of data mining via the gaming vector should contribute to a marked increase in free online gaming in the years to come.
Of course, there will be positive uses for these emotion-detector headsets. An array of psychological problems could be addressed and even cured. Phobias could be overcome. Social relationships could be improved, and learning could be enhanced. However, there is a disturbing undercurrent that comes with this emotion-on-demand technology. Would game developers be able to make games more addictive? Could gaming be used to manipulate an individual’s viewpoints in a manner similar to brainwashing? These are questions yet to be answered, but, disturbingly enough, the questions have now become valid.