We need to chat about chatbots


Marie Johnson
Contributor

I’m curious as to what the colour of a chatbot’s t-shirt has to do with service delivery. In recent years, the ranks of the Australian Government’s chatbot brigade have swelled. Many have names. Do you know Alex (#1), Alex (#2), Sam (#1), Sam (#2), Charles, Oliver, and Pat? Some are nameless, apparently redacted.

Some have cutesy cartoon faces. And some have no “form” apart from a text box. The Home Affairs virtual assistant is effectively stateless: no name, no face.

But what do these chatbots actually do, and how do they ‘help’ people? There is a worrying lack of strategy regarding chatbots or virtual assistants in the Australian Government’s ‘Digital Transformation Strategy’.

This is especially problematic given the global intelligent virtual assistant market is expected to reach USD$45 billion by 2027, expanding at a whopping compound annual growth rate of 34.0 per cent.

Marie Johnson: Government does not have a coherent strategy on chatbots and it’s a problem

This is a massive market. The deep pockets of government will be ruthlessly fought over, and each agency will – as usual – do its own thing, resulting in multiple non-aligned solutions.

These technologies and artificial intelligence are breakthroughs that have the power both to liberate and to discriminate. This is not about automating processes but about serving humans – and, for many, at a time of great vulnerability.

So, let’s look at the impact that the lack of strategy and organising logic in the chatbot rush is having on service delivery. Having ‘virtual assistants’ simply listed on the ‘Digital Transformation Strategy’ roadmap does not make it a strategy.

Nor is it a strategy for Alex the chatbot to wear different coloured t-shirts as the ‘government helper’. Can I point out that many citizens are in fact colour blind?

A whole-of-government strategy for virtual assistants would bring coherency to purpose, co-design, use cases, and common and reusable patterns of service. Such a strategy would be anchored in governance and ethics.

But there is no co-design. The ‘Digital Transformation Strategy’ does not even mention co-design: a fatal flaw in service delivery.

Without co-design, it is difficult to see how the ‘Digital Transformation Strategy’ statement that virtual assistants can improve access for people with disabilities can practically be achieved. It reflects an ableist view in which smart tech is bestowed on people with disability by others who might know better.

Worryingly, the ‘Digital Transformation Strategy’ does not contain any mention of “citizen centric” either. No co-design and no citizen-centricity: that partly explains why this beauty pageant of chatbots is happening on an agency-by-agency basis.

Combined, this signals a lack of understanding of – or even appetite to grasp – the disruptive and transformative nature of these technologies for servicing models.

The Copernican Moment has not yet arrived.

There is no coherency to the chatbot names and no apparent naming convention: no logic as to the reason or purpose of the names, their proliferation, or what this means for the citizen.

There is also no coherency as to the use cases, nor any apparent understanding of the limitations of the use cases in play.

For example, the use case for most of the chatbots is website navigation – another form of search – and not an application of artificial intelligence, no matter what the vendors might claim.

These chatbots cannot handle conversational dialogue, idioms or slang.

Precise words and sentences must be used – and the chatbots say so over and over again. This is a problem for a great many people with limited literacy.

Similarly, the chatbots cannot understand typos and often respond with incorrect answers: problematic for people who have difficulty using keyboards as well as people with limited literacy.
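To make the point concrete, here is a minimal, purely illustrative Python sketch – invented examples, not any agency’s actual chatbot code – of the difference between a lookup that demands exact keywords and one that tolerates a typo:

import difflib

# Tiny invented FAQ, for illustration only.
FAQ = {
    "medicare card": "How to replace your Medicare card ...",
    "tax file number": "How to apply for a tax file number ...",
}

def exact_bot(question):
    # Demands the precise phrase, which is effectively what the current chatbots do.
    return FAQ.get(question.strip().lower(), "Sorry, I don't understand. Please rephrase.")

def fuzzy_bot(question):
    # Tolerates a typo by matching against the closest known phrase.
    match = difflib.get_close_matches(question.strip().lower(), list(FAQ), n=1, cutoff=0.6)
    return FAQ[match[0]] if match else "Sorry, I don't understand. Please rephrase."

print(exact_bot("medicar card"))  # one-character typo: straight to "Sorry, I don't understand"
print(fuzzy_bot("medicar card"))  # same typo: still finds the Medicare answer

A user who mistypes a single character is turned away by the first approach; tolerating real-world language is a design choice, not a technical impossibility.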

In the use case of chatbots as website navigators, consistency is a problem. Often, a search on the website will return a different result from the one provided by the chatbot. This is a particular problem with the ATO website and its Alex chatbot.

While the ATO is highly commended for its strategy on language literacy, this has not translated into a chatbot interaction that accommodates impaired literacy.

Across the board, the chatbots respond with blocks of text and nests of links to website pages, documents and forms. Sam (Services Australia) uses sliders to cram as much text as possible into the chatbot window.

Small chatbot windows crammed with links and bureaucratic language work against accessibility and literacy considerations. Information overload, via whatever channel, causes problems and drives volumes and churn across channels.

An AI corpus is not a construct of links and screen scraping from websites. A corpus is essentially about language – patterns, machine language, natural language – shaped through co-design.

Any virtual assistant strategy is dependent on this new capability.
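As a purely illustrative contrast – invented data, not any agency’s actual corpus – the difference looks roughly like this: a pile of scraped page links on one side, and co-designed community language mapped to the intent behind it on the other.

# Invented, illustrative data only.

# What many of the current chatbots amount to: links scraped from the website.
scraped_links = [
    "https://example.gov.au/forms/claim-form.pdf",
    "https://example.gov.au/policy/eligibility",
]

# What a corpus is about: real language gathered through co-design,
# mapped to the intent behind it, answered in plain language.
corpus_entry = {
    "intent": "replace_concession_card",
    "utterances": [
        "I lost my card",
        "my pension card got nicked",
        "need a new concession card",
    ],
    "response": "A plain-language answer, co-designed and tested with the community.",
}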

And in the area of languages other than English, there is no coherency or consistency in how the chatbots respond. Indigenous languages must also be part of the strategy and capability.

Tested with “I speak Italian”, the nameless Home Affairs chatbot triggers a rather awkward “I’m a proud Australian” response.

The same “Italian” question on MyGov similarly does not trigger any information about services in other languages; instead, the digital assistant begs forgiveness that it is still learning.

Jumping over to IP Australia, Alex the chatbot responds to the “Italian” question by asking for a complete sentence followed by “I just speak English because I’m on an English language website.”

Sam (the health chatbot, not the Services Australia chatbot) simply has no responses for “speak Italian”.

And importantly, there needs to be coherency as to how and when Indigenous languages might become part of the breakthrough “virtual assistant” servicing innovation that is rapidly unfolding.

After all, 2019 was the United Nations International Year of Indigenous Languages, and the landmark National Indigenous Languages Report provides evidence on the importance of Indigenous languages and guidance on the development of service delivery.

No community should be left behind.

And while you are hunting for information, do you know what the chatbot gets to know about you?

From an ethics and transparency perspective, how the chatbots provide privacy information – and what information they provide – varies.

Much of the privacy policy information provided by the chatbots is simply a link to the particular website. Only one chatbot provided a privacy statement covering the information typed into the chatbot itself.

What is tracked is quite intrusive: scrolling, mouse activity and IP addresses. Sam from Services Australia keeps a copy of the whole interaction; it should be assumed that all the other chatbots do too, even though they don’t say so.

Imagine how off-putting the experience would be if the chatbot’s opening statement was “I’m tracking all your scrolling and keystroke activity – is that ok?”

What all this adds up to is a set of fragmented, agency-by-agency, English-only implementations that are not citizen centric, have no regard for the impact of limited literacy, and do not support the needs of people with disability.

Breakthrough thinking, co-design and, most importantly, a government-wide strategy are needed to avoid the life-event nightmare of a daisy chain of variously named chatbots spilling out links to bureaucratic information buried deep within websites.

Exponential technologies and advances in co-design break through these barriers, and there is extensive peer-reviewed research globally on this. But technology, however attractively marketed by tech giants and consultants, is not enough.

What is needed is a relentlessly inclusive strategy anchored in co-design and ethics.

A strategy that uplifts and connects the great efforts of many people.

An explicit forward-leaning strategy that fosters domestic Australian innovation, stimulates academic research and builds public sector capability to fully take advantage of the rapidly unfolding human-accessible web.

By designing for the edges, everyone is included.
