The recent furore over a lack of Covid-19 testing has been a timely reminder of the importance of language. The language used in public communications can make or break their effectiveness: messages can either rally enthusiasm or foment revolt – the latter of which we are currently seeing in the UK. Can the language the Government uses in its messages instead serve to alleviate the public’s frustration with the difficulty of obtaining a Covid-19 test? Having myself been frustrated at being told ‘No test sites found’ when trying to book a test for my son, the linguist in me believes that it can.
First of all, a negative message violates the user’s ‘positive face wants’ – the desire for connection, for having one’s needs met, for approval. Apart from the fact that the reply ‘No test sites found’ does nothing to meet the immediate need for a test, the way the message is worded does nothing to soften its effect on the user, for example with an expression of sympathy for their needs – a ‘sorry’ goes a long way in this kind of situation!
Secondly, the unsympathetic message violates the user’s ‘negative face wants’ – a human need not to be imposed on. ‘No test sites found’ imposes on the user by putting the entire onus for finding a test back onto them without offering any support or a route forward, e.g. ‘check again at X o’clock, as more test appointments will be released then’.
And finally, the message falls severely short on detail and clarity – ‘No test sites found’ could well suggest a technical problem or an incorrect data entry rather than a lack of availability. Yet the unidirectional nature of the platform gives the user no opportunity to clarify the message, leaving them in a so-called K- (less knowledgeable) position.
The only other ‘advice’ given to the user is that ‘if no tests are available online, do not call the helpline to get a test. No extra tests are available through the website’. The use of ‘no’ and ‘not’ builds a barrier, with no supportive or sympathetic language to soften the hard edge of this message.
You might say, ‘Well, after all you are talking to a computer and you can’t expect the computer to behave like a human’. Yet research has shown that technology and artificial intelligence, if properly designed, have the ability to convey sociability and warmth through a range of means that connect with the user and reduce the user’s feelings of helplessness and abandonment.
NHS Digital and other providers are developing apps that allow users to manage various health conditions. These will include, in the very near future, the NHS Test and Trace app, which is set to request a high degree of compliance from individuals asked to self-isolate. If the ‘voice’ in which the technology interacts with the user is not appropriate, compliance is likely to lag behind expectations.
In the case of Covid-19 testing, changing the way the computer ‘talks back’ will not solve the immediate problem of test shortages, but if users are left feeling less alone and helpless, it might alleviate pressure on the online system and the 119 helpline, and raise confidence in the testing system – and perhaps in the Government’s management of the crisis.