When they first appeared, voice assistants were genuinely useful to only a select few. Very busy people and people with visual impairments were among the only ones to use Siri seriously, while most other iPhone users preferred to simply have fun with their new AI companion. So, understandably, not much was done about the app's less-than-helpful responses.
That changed recently, when a study published in the journal JAMA Internal Medicine criticized Siri, Cortana, Google Now, Alexa, and other similar voice assistants for failing to support the people who need it most. While the assistants responded respectfully, they offered little real help. Now, following a recent update, Siri offers assistance to rape victims and to users expressing suicidal thoughts in the US and the UK.
For example, Siri would respond respectfully to the statement “I am depressed”, but it would not point the user toward any help. Similarly, the app failed to recognize statements such as “I was raped” or “I am being abused”, situations that clearly call for serious assistance.
After the report was published, Apple took only a few days to release an update for the voice assistant, which now allows Siri to offer advice and even helpful links to anyone who needs assistance. The service is currently limited to the United States and the UK, but the app may soon be adapted to be just as helpful in other countries.
Here is how the updated assistant responds: if a user states that they were raped or that they are being abused, Siri now recommends that they contact the National Sexual Assault Hotline, an American organization dedicated to helping victims of sexual abuse or assault.
Meanwhile, a user who expresses suicidal thoughts to Apple's voice companion will be urged by Siri to speak to someone at a suicide prevention center, and will even be offered a list of web links where they can find the help they need.
Of course, having only just been introduced, the system still has plenty of limitations, which the company will hopefully fix quickly. Then again, knowing Apple, it might just as easily stop here, citing something in its user terms and agreements about suicidal thoughts not being covered in countries it deems less important.
Still, now that Apple has acted on the report, it's safe to assume that the companies behind the other voice assistants will soon release their own ways of helping users with serious issues. We'll just have to wait and see what Cortana and the rest have up their sleeves.