Health apps are a common sight in app stores these days, with a recent surge in mental health apps that go beyond basic fitness tracking and calorie counting to support emotional well-being. With global health care services swamped, it is not surprising that demand for these self-help apps has grown in recent years. In the UK, getting NHS support for a mental health problem can mean several months on a waiting list. Many of us are seeking other ways to look after ourselves in the meantime.
Luckily, a small group of geniuses in India have developed an intelligent chatbot for us to talk to. Meet Wysa, the self-help app that provides a toolbox of CBT (cognitive-behavioural therapy) techniques to support your well-being.
Wysa was introduced by Touchkin, a family business co-founded by Jo Aggarwal back in 2015. Stigma and secrecy surrounding mental health are common in Indian cultures, and the makers of Wysa wanted to meet the needs of individuals who felt they could not talk to a doctor or loved one about their mental health but needed some form of support.
Wysa quickly became popular in both the east and the west, whether because of stigma or because of inaccessible health care (or both). Wysa is anonymous and always available. This appeals universally, especially to young people who have grown up in a fast-moving technological world.
Now, I’m all for self-help. However, being a psychology graduate, I wanted to find out more. Where was the evidence that this app worked? I was somewhat reassured by a recent study supporting the benefits of the app. The study concluded that Wysa helped people with depression manage some of their emotional difficulties but should not replace support from a mental health professional.
Still curious, I thought I would download the app and see what the fuss was about.
After hopping onto Google Play to download Wysa (the free, limited-access version), I was welcomed by a friendly penguin avatar who asked me to answer some questions about my well-being.
I was presented with questionnaires that screen for depression and anxiety. Having worked in a CBT clinic, I was familiar with these and appreciated that standardised measures were used, as they are supported by research.
Based on my answers, Wysa was able to identify aspects of my emotional health that might need work. I sped through a lot of the jargon about the importance of sleeping and eating but appreciated that it was there. Starting with behavioural basics seems like a smart way to check in on someone’s mental health in case anything can be done from the outset to improve things.
Wysa then asked me to pick some personal stressors, promising a handy toolkit of tailored self-help exercises. Wysa also made sure to inform me that the app was no replacement for professional help, which I think is an important disclosure to open with.
Despite Wysa covering a lot of key ground, I was struck by how robotic it felt to talk to an artificial intelligence. I wondered if I simply preferred human coaches.
As per my usual reaction to social media, I felt harassed by Wysa’s notifications on day one (even though I had opted for them) and failed to use the app at all. It was the same story with Headspace last year…
Determined to persevere, I opened Wysa and told the chatbot that I had enjoyed my belly-dance class that day. The penguin avatar clearly had an agenda and sent me straight to the toolkit without acknowledging what I had said.
I felt a bit put out, but I decided to try the ‘organise your tasks’ exercise. This launched a chat with the avatar, which asked me to state a couple of tasks that I needed to do that day. The bot then asked me which task was the most important and sent me back the items in a short list.
I felt this was a bit basic, as I already had a prioritised list of tasks in my diary.
At the end of this exercise, Wysa asked how I was feeling. I told the chatbot about a minor physical pain issue and it asked me how long it would take for me to feel better. I cannot imagine a worse question. If the pain was chronic and I was in a worse state of mind, this would only have made me feel hopeless.
Perhaps this was not my cup of tea. I was getting quite angry with this intelligent chatbot and we were only on day two.
Nevertheless, I went back to my toolkit and tried again. There was a mindfulness meditation video which I tried and actually felt quite calmed by. The video included visualisation instructions, which I think are super helpful if you are not used to meditating and clearing your mind.
My hope was somewhat restored. This app had limitations, but it was not totally useless. I wondered if mindfulness exercises might simply be a safer bet for a computer than CBT. I was reminded of a recent stir in the media where a similar health app (Woebot) had failed to pick up on a user experiencing child sexual abuse. This BBC article highlights how the incident prompted the makers to enforce an 18+ age limit on both Woebot and Wysa.
Clearly, caution is important with health apps. As highlighted in my recent social media article, technology can benefit mental health if it is used in the right way and does not replace professional support where that is needed.
Since I had adjusted my expectations at this point, I had a much better experience on day four. I tried out the ‘gratitude’ exercise, which asked me to list things in my life I was grateful for.
When I mentioned a person, Wysa even gave me the option of forwarding a cartoon picture to that person to say thank you for their support. This was a nice feature, and I could see how it might be a fun way to open up avenues for support and connection outside of the app.
Wysa asked me for some feedback and I expressed positive feelings about today’s ‘gratitude’ exercise. I was rewarded with a cartoon of a penguin blowing kisses and could not help but feel uplifted by this.
Straight off the bat, Wysa wanted to help me with any negative thoughts I might be having. I was wary of trying out any more CBT with this app but reluctantly consented to having my brain picked.
Mr penguin avatar started with some general pointers about observing rather than fighting negative thoughts (good advice). It then invited me to list some of my negative thoughts in a ‘thoughtpad’ exercise, which I did. I received some stock responses (‘I understand how you feel’ came up twice) and did not feel overly validated.
Once again, it seemed that Wysa was failing me on the cognitive side of things. Thankfully, the intelligent chatbot seemed to recognise its own inadequacy at this point and pointed me back towards mindfulness meditation exercises.
Despite the ‘thoughtpad’ being a dissatisfying experience so far, I decided to have another go.
I was impressed that Wysa had remembered some of the thoughts I had mentioned before, so we were not starting from square one. Today, the chatbot seemed to be more on the ball with responses, asking crucial CBT questions like ‘If a friend was with you, would they see it in the same way?’
The avatar then invited me to spot biases in my negative thoughts, giving me a long list of possible cognitive distortions. I think this would have been a redundant exercise if I did not have prior knowledge of CBT. Thankfully, I knew what ‘emotional reasoning’ was (assuming feelings are facts, regardless of other evidence) and identified this as a distortion in one of my thoughts. To Wysa’s credit, it went on to give a more thorough description of emotional reasoning. Perhaps we could have done with this level of detail sooner in the conversation, but at that point I did not know which tools allowed me to ask for it.
The chatbot then began to talk about ‘positive intent’, which I was not familiar with, so I requested an explanation (you can read more about it here). I was presented with an example worksheet with sections labelled ‘thought’, ‘intent’ and ‘re-framed thought’. This made things a lot clearer: I was able to challenge the ‘emotional reasoning’ in my original ‘thought’, appreciate the ‘intent’ behind my distortions (self-protection) and come up with a more balanced way of looking at things (the ‘re-framed thought’).
Finally, we had CBT in action and it was not a total disaster.
I was keen to explore more of the CBT features of Wysa while we were on a roll. The ‘conversation planner’ was a bust (Wysa said ‘I am not sure how to help you with that’). However, the ‘empty chair’ exercise was a great way of helping me organise my thoughts and see alternative viewpoints. It basically allowed me to put someone I disagreed with in the chair and have a hypothetical conversation without any real-life negative consequences.
Wrapping up the week
So, did I feel like the epitome of emotional health after a week of Wysa? Well, not necessarily, but I have to say that this particular health app grew on me, more so than other CBT and mindfulness meditation apps that I have tried.
I can only speak for the free version of the app, of course. However, I think to get the most out of Wysa, you have to ask the chatbot to explain everything as you go along, rather than just clicking the suggested ‘yes’ and ‘no’ responses to whatever it says (and it helps to remember the hashtags for different features!). This means that you learn more about CBT and other self-help tools, giving you more space to share your emotional patterns. Over time, there is more and more information about how you think, feel and behave stored in the app. This offers better opportunities for reflection and change.
Essentially, it’s about expectations. I found it useful to think of Wysa as less about having a conversation, and more about looking into a very chatty mirror. Wysa certainly organised my reflections better than if I simply wrote lists in a journal. For everyday emotional challenges, it was definitely better than no help at all.
That said, the effectiveness of the app depended on the type of problem I shared with it. The minute things got deep, I could tell Wysa was getting out of its depth. I imagine that suicidal thoughts and self-harm would not be safe in the hands of this app, nor should they be. Some things are meant to be shared and dealt with face-to-face, with a mental health professional.
The bottom line
So, next time you are having a rough day at work, feeling anxious about your to-do list, or wanting to get some perspective on a short-term issue, I would highly recommend Wysa. However, if you are in a state of despair or constant anxiety, please do seek the help of a human coach. Accessing help is uncomfortable for all of us, but if you have the opportunity, it is worth taking. Do not forget to reach out to your real-life friends as well, regardless of whether you are also using a health app or getting professional support. We all need to turn outwards to gain perspective sometimes.
In the meantime, happy apping!