Over the past years, progress in Artificial Intelligence and Neuroscience has made it possible to interact with computers and other devices through brain activity. In particular, the advancement of signal acquisition and processing methods such as electroencephalography (EEG), combined with AI-powered algorithms, has enabled us to delve into the world of Brain-Computer Interfaces and to talk about a new era of human factor design and research.
Brain-Computer Interfaces (BCIs) are devices that allow users to interact with computers by measuring brain activity, typically through EEG, which records the brain’s electrical signals and their frequency patterns. There are currently two types of Brain-Computer Interfaces, invasive and non-invasive, and although both have their benefits, in this article we will focus on non-invasive BCIs. By drawing on Artificial Intelligence, and specifically Machine Learning, Brain-Computer Interfaces have become a vital tool for improving the accuracy and reliability of usability testing and user experience research.
In a way, this method is a paradox: AI and BCIs replace humans in order to design better products… for humans. Let us elaborate a bit further by looking at some examples.
Brain-Computer Interfaces and Medical Devices
Brain-Computer Interfaces combined with AI-powered algorithms and neuroscientific analysis are a relatively new method that attempts to revolutionize and replace the traditional usability testing of medical devices, which often includes questionnaires, interviews, surveys, heuristic evaluations, and think-aloud protocols, amongst other widely used evaluations (the list can go on!). These ‘traditional’ methods usually, if not always, require a specialist who interprets the emotions and thoughts of users when testing a product or service, an interpretation based on extensive training and expertise. AI and BCIs replace these specialists, as all the information is derived directly from the brain rather than from human interpretation.
We often liken this process to storytelling. Say you have been on a trip abroad and you try to narrate the experience to your best friend back home: however vividly you try to portray it, your words can never exactly match what you felt. BCIs combined with AI show the feelings you feel, without the need to narrate them.
Arguably, this is a method that is most needed in medical device manufacturing. Over the past years, new medical devices introduced to the market have grown increasingly complex, creating a greater need for bias-free, objective and fast usability testing. The strict regulations for FDA approval don’t make this process easier either. On top of this, over the last 10 years there have been 2 million injuries and 80,000 deaths related to faulty medical devices, of which 36% are attributed to poor usability testing. Objective and reliable usability testing is thus a matter of life and death.
By using a BCI application such as EEG signal processing, the cognitive load and certain emotions of patients or healthcare providers, such as frustration, joy and relaxation, can be extracted directly and in real time, leading to better decisions in the design and manufacturing of a medical device. The device can be optimized early in the usability testing process, avoiding extra costs and long waiting times for approval. This new method thus attempts to increase the safety of the medical devices being introduced to the market.
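To make the idea of “extracting” mental states from EEG more concrete, here is a minimal sketch of one common first step: estimating power in the classic frequency bands (theta, alpha, beta) and combining them into a crude engagement index. The band definitions and the beta/(alpha + theta) ratio are illustrative assumptions, not the method of any particular BCI product, and the signal here is synthetic.

```python
import numpy as np
from scipy.signal import welch

FS = 256  # assumed sampling rate in Hz, typical for consumer EEG headsets


def band_power(signal, fs, band):
    """Average power of `signal` within a frequency band, via Welch's method."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    lo, hi = band
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[mask].mean()


def engagement_index(signal, fs=FS):
    """Crude engagement proxy: beta / (alpha + theta) band-power ratio."""
    theta = band_power(signal, fs, (4, 8))
    alpha = band_power(signal, fs, (8, 13))
    beta = band_power(signal, fs, (13, 30))
    return beta / (alpha + theta)


# Synthetic single-channel "EEG": a dominant 10 Hz alpha rhythm plus noise,
# which should yield a low engagement value under this heuristic.
rng = np.random.default_rng(0)
t = np.arange(0, 10, 1 / FS)
eeg = np.sin(2 * np.pi * 10 * t) + 0.3 * rng.standard_normal(t.size)

print(engagement_index(eeg))
```

In a real usability session these band powers would feed a trained classifier rather than a fixed ratio, but the pipeline shape (record, filter, extract spectral features, classify) is the same.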
AI don’t lie
There is a common saying that when human interpretation is involved, biases are inevitable. This is where AI and BCIs come in, benefiting patients whose usability feedback is hard to interpret, such as people with communication difficulties or physical impairments, the elderly, or young children who are not yet able to talk. In addition, healthcare providers can be included in the usability testing of medical devices, so that the devices become easier to use and better adapted to their needs.
Brain-Computer Interfaces and the gaming industry
The gaming industry is, without a doubt, one of the most innovative when it comes to introducing new technologies that enhance user experience, with human-computer interaction concepts such as gesture-based game controllers, VR, AR, and cloud-based gaming.
Games reflecting the mental state of the player
Through the use of BCIs, gaming experiences have become far more intense and personalized, because games can adjust to the cognitive state of the gamer. For example, if the gamer feels surprise, frustration, excitement or perplexity, his or her character in the game will adjust to these feelings automatically. An example of this can be found in the game World of Warcraft®, where the avatar changes appearance according to the user’s feelings.
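The game-side logic of such a feature can be surprisingly simple once a classifier supplies an emotion label. The sketch below assumes a hypothetical mapping from labels to avatar changes; the emotion names, tints and speed modifiers are invented for illustration and do not come from World of Warcraft® or any real game API.

```python
from dataclasses import dataclass

# Hypothetical mapping from a BCI classifier's emotion label to avatar
# adjustments; labels and effects are illustrative only.
EMOTION_EFFECTS = {
    "frustration": {"tint": "red", "speed": 0.9},
    "excitement": {"tint": "gold", "speed": 1.2},
    "surprise": {"tint": "white", "speed": 1.0},
    "calm": {"tint": "blue", "speed": 1.0},
}


@dataclass
class Avatar:
    tint: str = "blue"
    speed: float = 1.0

    def react_to(self, emotion: str) -> None:
        # Unknown labels fall back to a neutral "calm" appearance.
        effect = EMOTION_EFFECTS.get(emotion, EMOTION_EFFECTS["calm"])
        self.tint = effect["tint"]
        self.speed = effect["speed"]


avatar = Avatar()
avatar.react_to("excitement")
print(avatar.tint, avatar.speed)  # → gold 1.2
```

The hard part in practice is not this mapping but producing reliable emotion labels from noisy EEG in real time; the game loop itself only needs to consume them.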
Tailoring games to what gamers really want
Additionally, BCIs can be used in the game development process itself: through technologies such as EEG, the various cognitive and emotional states of users can be detected while they are playing. Game developers can extract these features through Artificial Intelligence and brain-analysis algorithms and understand what users feel and think at different stages of the game. This allows the game to be tailored to the criteria of the target group, ensuring greater success when the game is launched and, most importantly, user satisfaction.
Brain-Computer Interfaces and the Advertising industry
Artificial Intelligence is already widely used in the advertising industry. A/B testing allows enterprises to test which campaign performs better depending on the visual content of the ad, which is then tested amongst different demographics. Algorithmic advertising has also become a prominent practice across many industries, automatically placing ads around the web on platforms that the algorithm deems relevant to the product and target audience. User-generated content, dynamic pricing, and the assessment of customer sentiment are other AI-powered examples commonly used by advertisers.
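The A/B testing mentioned above boils down to a standard statistical comparison of two conversion rates. As a minimal sketch, the function below runs a two-proportion z-test; the visitor and conversion counts are made-up numbers for illustration.

```python
from math import sqrt, erfc


def ab_test_pvalue(conv_a, n_a, conv_b, n_b):
    """Two-sided p-value for the difference between two conversion rates
    (standard two-proportion z-test with a pooled standard error)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return erfc(abs(z) / sqrt(2))  # two-sided tail probability


# Illustrative numbers: ad variant B converts 110/2000 visitors (5.5%)
# versus variant A's 80/2000 (4.0%).
p = ab_test_pvalue(80, 2000, 110, 2000)
print(p < 0.05)
```

A p-value below the conventional 0.05 threshold suggests the difference between the two ads is unlikely to be noise, which is the decision rule most A/B platforms wrap in a friendlier interface.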
BCIs take the personalization of ads a step further. This form of evaluation will be suitable for advertisers that launch expensive campaigns and want to test their likability and appeal before investing tons of money. Let’s say, for example, that your firm wants to launch a new billboard and display campaign across the country, but you are not sure whether it will perform up to expectations, as there is always room for optimization to drive better results. The reality is that no one can be sure whether a campaign will be a 100% success or will be doomed as soon as it is launched; look at the notorious example of the ‘Are you beach body ready?’ campaign in 2015.
With the use of BCIs it is possible to trace the thoughts and emotions of users when interacting with the campaign before it is launched, and consequently optimize those parts that make the user excited without being overly provocative or in many cases…boring. Perhaps the user is alienated by the colours used in the campaign, the text, the graphics or the implied message. There is also the possibility that he or she doesn’t understand what the campaign is about. These processes can be improved through the use of BCI and AI-powered analysis to ensure that your audience gets the most value out of your campaign.
To conclude, AI combined with BCIs is an innovative new method of human factor design and research: it places humans at the centre of usability testing and user experience, limiting human error while maximizing the performance and ease of use of products and services.