Guest post Shu Ishida, Maris Serzans – University of Oxford Imagine Cup 2018 UK Finalist
Where to find our Imagine Cup project
Our Imagine Cup project is built on Microsoft Bot Framework using Node.js and is deployed to Azure websites - http://soothey.azurewebsites.net/. Please try our working chatbot on the website. Alternatively, Soothey is also available on Facebook Messenger at https://www.facebook.com/soothey.
Meet Soothey, a lovable and friendly companion that is always ready to listen to your thoughts and problems. Soothey is a casual and free chat bot which supports the mental wellbeing of students. Soothey allows students to express their thoughts and problems privately and anonymously, from any location, in their critical time of need.
We believe that real-time feedback is critical in the early prevention of mental illness: people, especially students, can become trapped in vicious cycles unless their initial negative thoughts are addressed early.
To make it easy for anyone to fine-tune their mental state, and to make the process casual and enjoyable, we developed Soothey, a chat bot that assists in understanding our thoughts better.
We will introduce the key features that make Soothey valuable.
1. Soothey knows how to respond depending on your emotion and the type of problem you have.
2. During the chat, Soothey gives cues which encourage you to think from a different perspective.
3. Soothey tells you about mental welfare events happening in your location.
We have also developed a framework which allows existing mental health teams to adopt Soothey quickly. Peer support groups at university can inform Soothey's users about welfare events and provide the right contacts when more help is needed.
Soothey is powered by the Microsoft Bot Framework, runs on Node.js, and is published on Azure App Service. The bot is registered on Facebook so that you can talk to Soothey on Facebook Messenger. For natural language understanding, we developed a custom classification model with Keras.
We are a team of two Oxford students studying Engineering and Physics.
Shu Ishida is a third-year undergraduate engineer at Christ Church, and team lead for the Microsoft Student Partner Chapter at the University of Oxford. Shu is into web development and machine learning, and his ambition is to help break down educational, cultural and social barriers through technological innovation. His current interest is in applications of Artificial Intelligence to e-Learning and mental healthcare. Shu is also passionate about outreach and teaching, and has mentoring experience from summer schools as an Oxford engineering student ambassador.
Maris Serzans is a fourth-year undergraduate reading Mathematical and Theoretical Physics at St Catherine's College, and is currently the president of the Oxford University Physics Society. Apart from applied mathematics and physics, Maris is interested in Artificial Intelligence and natural language processing.
The two of us met at a Micro-Internship scheme organised by the University. In winter 2015, we worked together on a Django-based content management system at a company called Torchbox, and in spring 2017 we met again by coincidence to work at the National Physical Laboratory.
Mental illness is a serious health issue in modern society, and it is more common than diabetes, heart disease or even cancer. Options for treatment are limited and it becomes more difficult for the patient to recover as the illness progresses.
We combat this by early prevention using an RNN-powered mental healthcare bot. The bot has a conversation with the user, borrowing ideas from Cognitive Behavioural Therapy (CBT). CBT identifies thought distortions within the patient and uses conversation to address their negative thoughts, helping them come up with alternative ways of viewing their problems.
What makes early prevention difficult is the stigma surrounding mental illness. Although mental illness is such a common phenomenon, it is still ill-recognised and looked down upon, so it is difficult to see a therapist the same way we would visit a hospital. Our solution of providing a chatbot as a casual and free alternative gives people easier access to CBT methods, privately and anonymously from their rooms, in their time of need. The philosophy of CBT is to give the patient the chance to come up with alternative views themselves rather than telling them what to think; there is no risk of the user losing independence in their decisions, or becoming addicted to the bot.
Our chatbot can also help develop positive thinking for people without noticeable symptoms of mental illness. Our aim is to make the bot accessible to students everywhere, regardless of their mental state. Casual users can converse with the bot for pleasure without any stigma.
Mental illness is a serious health issue among students – one in every four university students suffers from it. Depression, anxiety, stress, and panic disorder are common symptoms.
As students, we are exposed to this reality. Several of our friends are receiving therapy and medication because of mental illness they have acquired, or which has intensified during university. Reasons can include exam stress, family, tutors, relationships, lack of self-esteem, and reality not meeting expectations.
Mental illness can easily affect productivity, life routine and relationships, and cause discomfort comparable to physical pain. While overcoming hardship can add to a person's development, those who fail to fight back can fall into a vicious cycle that may result in chronic mental illness. Prevention of mental illness has never been more important, and could have a large positive impact on happiness and quality of life for all of us.
Cognitive behavioural therapy (CBT) is a type of conversational treatment which focuses on how a person's thoughts, beliefs and attitudes affect their feelings and behaviour. The treatment teaches people skills for dealing with different problems. The CBT method focuses on shedding new light on the current situation rather than identifying causes in past events and memories.
An important concept in CBT is cognitive distortion: a set of thinking patterns that cause individuals to perceive reality inaccurately by reinforcing negative thoughts or emotions. Cognitive distortions tend to make a person interpret a situation or event negatively, starting a cascade of negative emotional and physical responses.
The aim of CBT is to interrupt this vicious cycle of negative thoughts and feelings. It challenges the patient’s views or thoughts, encouraging them to come up with alternative interpretations of their problems. It has been shown to be effective for alleviating depression and anxiety.
Table 1: Examples of typical cognitive distortions
| Distortion | Description |
|---|---|
| Polarised thinking | Treating something as a complete failure if it falls short of a standard. |
| Overgeneralisation | Seeing a single negative event as a never-ending pattern of defeat. |
| Mental filter | Magnifying negative details while filtering out positive aspects. |
| Jumping to conclusions | Interpreting events negatively based on gut feelings without evidence. |
| Catastrophising | Dwelling on the worst-case scenario. |
| Should statements | Feeling guilty about yourself, or resentful towards others. |
| Personalisation | Blaming oneself for negative events one is not responsible for. |
While cognitive behavioural therapy (CBT) is a reliable method with few side effects and has been shown to be effective for depression and anxiety, it targets patients who have been diagnosed with mental illness, which makes it less accessible for individuals who are in need but undiagnosed. Stigma surrounding mental illness makes people reluctant to admit or accept that their mental health is suffering. People choose to disregard their symptoms, sometimes trying to justify their negative thoughts, which triggers a cascade of negative emotions and consequences. Seeing a therapist often comes as a last resort, by which time things are already out of hand.
Another factor that makes CBT less accessible is that it requires commitment to the process to get the most out of it. Attending regular CBT sessions and carrying out extra work between them could take some time and effort. It is not a very casual way of receiving help whenever you want it.
While CBT can help people diagnosed with mental illness, and more online materials and tools (e.g. meditation, therapy and health-monitoring apps) are available for those who actively seek out solutions, the process usually requires effort and self-initiative.
We believe that more action must be taken at an earlier stage to stop people falling into the vicious cycle. It is easier for people to recover from thought distortions before they develop into a diagnosable mental illness.
There are many problems that could be dealt with before they evolve into mental illness. Jealousy, resentment, lack of self-esteem or confidence, stress, and loneliness are some of the emotions we encounter daily. Often, these negative emotions can be alleviated by consulting friends, family or teachers. The problem is that friends, family and teachers can themselves sometimes be the cause of the issue. Some problems may also be difficult to confide to others, for fear of exposure, being judged, intensifying the situation, or harming relationships.
Our mission is to provide an easy-to-use, casual and open tool for university students to fine-tune their thoughts. The tool is designed to cultivate positive thought patterns and will also help people without noticeable symptoms of mental illness.
The Microsoft Bot Framework proved to be an ideal platform for developing a chatbot intended to be accessible to as many university students as possible. Without it, we could not have reached the many social platforms students use with such ease. Access via Facebook Messenger, Skype, Slack and Twitter direct messaging is an essential part of our service, together with an online webchat that everyone can access anonymously.
The first step in interacting with the user and giving the correct response is to understand their emotions. Although there are many libraries available for sentiment analysis, those we found focus on identifying positive and negative sentiment and assigning weights, rather than giving a detailed description of what sort of emotion a sentence expresses.
Another shortcoming of using existing sentiment analysis libraries is that most of them use a lexicon-based approach. Each word is attributed a weight to indicate its sentiment, with negations treated separately. This works for sentences such as “I am angry”, but with chat messages users tend to express their emotions indirectly, such as “She didn’t reply”, in which case the lexicon approach wouldn’t work. It becomes even more difficult to analyse sentiments when the same word could have different nuances depending on the context, e.g. “I lost my wallet” and “I lost weight”.
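The limitation described above is easy to demonstrate. Below is a toy sketch of a lexicon-based scorer; the lexicon, weights and negation handling are made up for illustration and are far simpler than what real libraries do:

```python
# Toy lexicon-based sentiment scorer, for illustration only.
LEXICON = {"angry": -1.0, "happy": 1.0, "lost": -0.5}
NEGATIONS = {"not", "didn't", "never"}

def lexicon_score(sentence: str) -> float:
    """Sum per-word weights, flipping the sign of the word after a negation."""
    score, flip = 0.0, 1.0
    for word in sentence.lower().split():
        if word in NEGATIONS:
            flip = -1.0
            continue
        score += flip * LEXICON.get(word, 0.0)
        flip = 1.0
    return score

print(lexicon_score("I am angry"))       # -1.0: direct emotion words work
print(lexicon_score("She didn't reply")) #  0.0: indirect sadness is invisible
print(lexicon_score("I lost weight"))    # -0.5: context-dependent word misread
```

The second and third cases show exactly the failure modes discussed above: indirect expressions score as neutral, and the same word ("lost") is scored identically regardless of context.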
As shown above, rule-based sentiment analysis lacks robustness to variations in context and word order. Therefore, we decided to implement Natural Language Understanding using Recurrent Neural Networks (RNNs). RNNs are suited to sequence input such as language and audio, and have the advantage that they can be trained on sentences of variable length. Amongst the RNN family, we used the LSTM (Long Short-Term Memory), since it can handle grammatical components of a sentence that have to be remembered, such as negations.
To achieve more accurate predictions with less training data, we used a word vector library called GloVe (Global Vectors for Word Representation). These vector representations of words are learned by an unsupervised algorithm trained on Wikipedia and Gigaword corpora. Words with similar meanings tend to cluster together, so even a neural network trained on limited data can generalise to test data that shares the same structure and similar meanings. Another useful property is that relationships between words are also captured in the vector representation. For example, the vector pointing from “man” to “woman” is roughly the same as the vector pointing from “uncle” to “aunt”.
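The vector-offset property can be illustrated with a tiny hand-made example. The 3-dimensional "embeddings" below are invented to mimic the behaviour of real GloVe vectors, which have 50–300 dimensions and are learned from corpora:

```python
import math

# Hand-made toy vectors that mimic GloVe's offset property (illustrative only).
vec = {
    "man":   [1.0, 0.0, 0.2],
    "woman": [1.0, 1.0, 0.2],
    "uncle": [0.5, 0.0, 0.8],
    "aunt":  [0.5, 1.0, 0.8],
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# The offset man -> woman should roughly match the offset uncle -> aunt.
offset1 = [w - m for m, w in zip(vec["man"], vec["woman"])]
offset2 = [a - u for u, a in zip(vec["uncle"], vec["aunt"])]
print(cosine(offset1, offset2))  # 1.0 when the offsets align exactly
```

With real GloVe vectors the two offsets are only approximately parallel, but the cosine similarity is still high, which is what makes analogy arithmetic like "man − woman ≈ uncle − aunt" work.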
Out of the range of human emotions, we identified 10 classes that suit our purpose of correctly responding to negative emotions: angry, jealous, stressed / worried, lonely, having a crush, lack of self-esteem, sad / hurt, neutral and positive. For each category of emotion, we provided a set response so that the bot can select the most appropriate of the 10 options upon receiving a message. Since we were not able to find a good labelled corpus to suit our purpose, we hand-labelled 300 sentences. We achieved good prediction results on unseen data we generated ourselves. The challenge is that texting style varies greatly from person to person. For better prediction accuracy, we need to collect more data from a wider range of students and establish a sustainable automated cycle for obtaining labelled training sets. The idea is that once the bot performs reasonably well, we occasionally ask users to label their emotions (“which best describes your feelings?”) after they send a message. This way, we can grow our dataset with minimal maintenance effort.
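The per-message selection step is essentially an argmax over the emotion classes. The sketch below uses the labels listed above; the probabilities and response texts are illustrative stand-ins for the LSTM's softmax output and our real canned responses:

```python
# Map a softmax output over emotion classes to a canned response (sketch).
CLASSES = ["angry", "jealous", "stressed", "lonely", "crush",
           "low self-esteem", "sad", "neutral", "positive"]

RESPONSES = {
    "lonely": "That sounds isolating. Who do you usually talk to?",
    "neutral": "Tell me more about your day.",
    # ... one canned response per class in the real bot
}

def pick_response(probs):
    """Select the canned response for the highest-probability class."""
    best = max(range(len(CLASSES)), key=lambda i: probs[i])
    return RESPONSES.get(CLASSES[best], "I'm listening.")

# A message classified as mostly "lonely" gets the lonely-specific response.
probs = [0.02, 0.01, 0.05, 0.80, 0.01, 0.03, 0.04, 0.03, 0.01]
print(pick_response(probs))
```

In the deployed bot the `probs` vector comes from the Keras model; everything else here is a simplified stand-in.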
A more difficult task was to train the neural network to ask appropriate questions to the user to help them digest their thoughts. For proof of concept, we selected 10 responses that are general enough to fit a wide range of scenarios and trained the RNN on 200+ messages. The complications are (a) there could be multiple responses that could be correct, and (b) how appropriate a response is could also be affected by previous exchanges. (a) makes it difficult for us to obtain a labelled corpus, or to increase our training dataset. A possible solution to this is to add the set of user-input and bot-response to the training data only when the next user-input is detailed enough without a sign of confusion, indicating that the bot has given an appropriate response.
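The self-labelling idea for (a) can be sketched as a simple heuristic: keep a (user input, bot response) pair only when the user's next message is reasonably long and shows no sign of confusion. The confusion markers and length threshold below are illustrative guesses, not tuned values from our system:

```python
# Heuristic for harvesting training pairs from live conversations (sketch).
CONFUSION_MARKERS = {"what?", "huh", "i don't understand"}
MIN_DETAIL_WORDS = 5  # illustrative threshold

def looks_accepted(next_user_input):
    """True if the follow-up message is detailed and unconfused."""
    text = next_user_input.lower().strip()
    if any(marker in text for marker in CONFUSION_MARKERS):
        return False
    return len(text.split()) >= MIN_DETAIL_WORDS

training_set = []

def maybe_add_example(user_input, bot_response, next_user_input):
    """Add the exchange to the training set only if the response seems apt."""
    if looks_accepted(next_user_input):
        training_set.append((user_input, bot_response))
```

A real implementation would need better confusion detection, but the flow is the same: the follow-up message acts as an implicit label of whether the bot's response was appropriate.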
An example of (b): if a user has already explained why they are unhappy, the bot shouldn't ask “how are you feeling” or “why are you unhappy”, but instead something like “what do you think you can do about it”. Here we implemented a model where the prediction weights build up instead of refreshing after every sentence. To avoid the bot giving the same response repeatedly, we assigned negative weights to responses that have already been given.
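The accumulating-weights idea can be sketched as follows; the class name and the penalty value are our invention for illustration, not the actual implementation:

```python
# Sketch: response scores accumulate across turns, and a response that has
# just been given is penalised so it is not repeated immediately.
REPEAT_PENALTY = 1.0  # illustrative value

class ResponseSelector:
    def __init__(self, responses):
        self.responses = responses
        self.weights = [0.0] * len(responses)  # persists across turns

    def update(self, turn_scores):
        """Add this turn's model scores onto the running weights."""
        self.weights = [w + s for w, s in zip(self.weights, turn_scores)]

    def pick(self):
        """Return the best response and penalise it for future turns."""
        best = max(range(len(self.responses)), key=lambda i: self.weights[i])
        self.weights[best] -= REPEAT_PENALTY
        return self.responses[best]
```

For example, if "how are you feeling?" narrowly wins the first turn, the penalty it receives means the second turn's scores push a different prompt to the top, even when the model's raw preferences barely change.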
Communication between Node.js and Python was necessary for the Bot Framework to interact with Keras and use its model. We used a node package called python-shell to allow continuous communication. The asynchronous nature of Node.js meant that we had to create a promise to wait for Keras to respond with a prediction. Deployment involved pip-installing Keras and other Python packages onto the Python extension of Azure App Service.
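python-shell exchanges newline-delimited messages with the child process over stdin/stdout. A minimal sketch of the Python side looks like this; the `predict` stub stands in for the Keras model call, and the function names are ours:

```python
import json
import sys

def predict(text):
    """Stub standing in for the Keras model call in the real script."""
    return {"emotion": "neutral", "confidence": 0.5}

def handle_line(line):
    """Turn one incoming chat message into a one-line JSON prediction."""
    return json.dumps(predict(line.strip()))

def main():
    # python-shell sends one message per line on stdin and reads one JSON
    # reply per line from stdout; flushing lets Node see replies immediately.
    for line in sys.stdin:
        print(handle_line(line), flush=True)
```

Keeping the process alive in a read loop (rather than spawning Python per message) is what makes the communication "continuous": the model is loaded once, and each prediction is just one line in, one line out.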
Our main target audience is university students, the context we are most familiar with. University students are more likely to use social media or websites as their source of information and aid. Since universities already have an inbuilt system of welfare and peer support, we could maximise their range of outreach by integrating with Soothey. However, the chat bot can also be applicable in other contexts such as sixth form schools, hospitals or elderly communities.
We have developed a framework which allows existing mental health teams to adopt Soothey quickly. Peer support groups at a university can register themselves as peer supporters and inform users about welfare events. They can also provide the right counselling contacts when more help is needed – this is as simple as adding a hashtag keyword to the messages sent to Soothey.
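The hashtag-keyword mechanism can be sketched as a small parsing step; the regex and tag names below are our illustration, not the actual framework code:

```python
import re

# Extract hashtag keywords from a peer supporter's message (sketch).
HASHTAG = re.compile(r"#(\w+)")

def extract_tags(message):
    """Return all hashtag keywords in a message, lowercased."""
    return [tag.lower() for tag in HASHTAG.findall(message)]

print(extract_tags("#event Mindfulness workshop, Tuesday 6pm"))  # ['event']
```

A dispatcher can then route each tag to the right action, such as announcing a welfare event or storing a counselling contact.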
To promote our chat bot, we intend to form partnerships with Student Unions from universities across the UK and organisations such as Student Minds. To raise brand awareness of Soothey, we intend to sponsor targeted ads on the chat platforms, beginning with Facebook Messenger.
The experience we provide and the users we target are different from other mental healthcare services. While most meditation apps and health monitoring apps require self-initiative and perseverance to make it work, Soothey is made to be accessible even for passive users who do not actively seek out a solution. The cute profile image and conversation flow are designed so that students will find it easy and fun to talk to Soothey as opposed to seeing it as a therapy session.
Two major therapy bots on the market are Woebot and Wysa. Both follow a well-structured, pre-defined flow of conversation based on CBT. They behave predictably, since they do not attempt to understand what the user inputs, and the bots do most of the talking. While this approach is robust, it is left to the user whether to take the bot seriously and think carefully about the prompts. Another characteristic of these bots is that one conversation cycle is quite long, so they target people who are actively seeking help rather than casual users hoping for a brief conversation. With the aid of Natural Language Understanding, Soothey focuses on being a good listener rather than a speaker, so that users can express and digest their thoughts and feel appreciated.
One of the main advantages of our chat bot platform is the integrated service we offer. By narrowing down the target audience to students, we were able to focus on student-specific needs, and design the platform so that it could easily integrate into existing welfare and peer support systems within universities and colleges.
One of our goals is to expand Soothey's conversational capacity by providing a larger training data set. We plan to devise a way to classify a larger corpus of entries into response categories. This will enable Soothey to deliver more sophisticated and lively responses.
Another useful feature of Soothey will be automatically producing a diary from the chat record. One element of CBT is keeping a thought diary so that patients can easily keep track of their thought patterns and experiences. Since Soothey focuses on asking users about their daily experiences and emotions, this would be straightforward to integrate. Combined with Soothey's ability to detect and classify emotions, we could track daily emotions by displaying them as a graph. This would require user login or Facebook authentication, so some users may be cautious about using this feature due to privacy concerns.
Through partnerships with student organisations we can also address specific needs that can be programmed into Soothey. For example, our chat bot can be set up to provide specific advice on academic matters or give specific contact details for counselling.
Additionally, we believe it will be worthwhile to develop the Soothey character into an animation. Soothey will be able to respond not only with text messages but also with funny emoticons of itself. This will assist in raising awareness of our chat bot and will make it more fun to use.