In the mid-2000s, virtual agents and customer service chatbots attracted a lot of attention, even though they were not very conversational and, under the hood, amounted to little more than scripted data exchanges with web servers.
Nowadays, even though a huge number of examples of ‘weak AI’ exist (including Siri, Alexa, web search engines, automated translators and facial recognition) and other topics such as responsive web design are hogging the limelight, chatbots are still causing a stir. With major investment from big companies, there remain plenty of opportunities to hack the conversational interfaces of the future.
Chatbots sometimes get a bad reputation, but they can be useful. They don’t need to feel like a basic replacement for a standard web form, where the user fills in input fields and waits for validation – they can provide a conversational experience.
Essentially we’re enhancing the user experience to feel more natural, like conversing with an expert or a friend, instead of web browser point-and-clicks or mobile gestures. The aim is that by providing empathetic, contextual responses, this technology will become embedded directly in people’s lives.
Read on to discover a practical way to design and build a chatbot, based on a real project-intake application in a service design practice.
01. Set a personality
As this practice serves over 110,000 members globally, the goal was to provide a quick, convenient and natural interface through which internal stakeholders could request effective digital services, instead of having to fill out confusing forms.
The first step was to establish the chatbot’s personality, as this would represent the voice of the service design team to its stakeholders. We built on Aarron Walter’s seminal work on design personas. This greatly helped our team develop the bot’s personality traits, which then determined the messages for greetings, errors and user feedback.
This is a delicate stage, as it affects how the organisation is perceived. To make sure we had as much information as possible, we immediately set up stakeholder workshops to nail down the appropriate personality, colour, typography, imagery and user flow when engaging with the bot.
After we had gained all the necessary approvals – including seeking legal counsel – we set out to convert archaic request forms into a series of back-and-forth questions that mimicked a conversation between the stakeholders and a representative of our design services team.
02. Use RiveScript
We knew we didn’t want to get too deep into AI markup language for the processing part – we just needed enough to jump-start the experience.
RiveScript is a simple chatbot scripting language that is easy enough to learn and sufficed for our needs. Within a few days we had the logic in place to intake a project request from the bot and parse it with enough business logic to validate and categorise it, so it could be sent through JSON REST services to the appropriate internal project-tasking queue.
To get this basic chatbot working, head to the RiveScript repo, clone it and install all the standard Node dependencies. In the repo you can also get a taste of the interactions you can add, via the various example snippets.
Next, run the web-client folder, which turns the bot into a web page by running a basic Grunt server. At this point you can enhance the experience to suit your needs.
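The setup might look something like the following. This is a sketch rather than exact instructions – the repository URL points at the JavaScript edition of RiveScript, and folder names may differ between versions:

```
# Clone the JavaScript edition of RiveScript and install dependencies
git clone https://github.com/aichaos/rivescript-js.git
cd rivescript-js
npm install

# Move into the web client example and serve it locally
# (folder name assumed; check the repo's README for the current layout)
cd web-client
npm install
grunt server
```

Once the Grunt server is up, the bot is available in your browser and you can start customising the experience.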
03. Generate your bot's brain
The next step is to generate the ‘brain’ of our bot. This is specified in files with the .rive extension, and thankfully RiveScript already comes with basic interactions out of the box (for example, questions such as ‘What is your name?’, ‘How old are you?’ and ‘What is your favourite colour?’).
When you start the web-client app with the appropriate Node command, the HTML file loads these .rive files.
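To give a flavour of the format, a minimal brain file uses `+` lines for user triggers (matched in lowercase, without punctuation) and `-` lines for the bot's replies. This is an illustrative sketch, not the exact file shipped with RiveScript:

```
! version = 2.0

+ what is your name
- You can call me the intake bot.

+ how old are you
- I was deployed quite recently, so I am still young.

+ what is your favourite colour
- I am rather partial to blue.
```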
Next we need to generate the part of our chatbot’s brain that will deal with project requests. Our main goal is to convert a selection of project tasking intake questions into a regular conversation.
So, for example:
- Hello, how can we help?
- Great, how soon do we need to start?
- Can you give me a rough idea of your budget?
- Tell me more about your project…
- How did you hear about us?
A typical accessible web form would look like this:
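The original markup isn't reproduced here, but an accessible form along these lines – labels explicitly tied to their inputs, with hypothetical field names – illustrates the starting point:

```
<form action="/project-request" method="post">
  <label for="request">Request</label>
  <input type="text" id="request" name="request">

  <label for="timeline">Timeline</label>
  <input type="text" id="timeline" name="timeline">

  <label for="budget">Budget information</label>
  <input type="text" id="budget" name="budget">

  <label for="description">Project description</label>
  <textarea id="description" name="description"></textarea>

  <label for="referral">Reference</label>
  <input type="text" id="referral" name="referral">

  <button type="submit">Submit</button>
</form>
```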
With web forms, we’re very familiar with certain patterns: you click the Submit button, all form data is sent to another page where the request is processed, and then most likely a cheeky Thank you page pops up.
With chatbots, we’re able to take the interaction of submitting a request, and make it more meaningful.
04. Design a voice
To convert this form to a conversational user interface served in RiveScript’s chatbot web client, we need to convert the information architecture from rigid to fluid: in practice, turning field labels into conversational UI strings.
Let’s consider some accessible field labels, and their related question tone:
- Request: How can we help? Not sure? Do you mind if I ask a few questions?
- Timeline: How soon do we need to get started?
- Budget information: Can you give me a rough idea of your budget?
- Project description: OK, can you tell me a summary of the problem to be solved?
- Reference: Also, who referred you to us?
Next we need to convert the web form’s code into AI script, following RiveScript’s very learnable processing logic for two-way conversations:
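Our production script isn't shown here, but a sketch of this conversion gives the idea. RiveScript's `%` lines match the bot's previous reply, which lets each answer be captured into a user variable with `<set>` and chained to the next question (the variable names are our own, illustrative choices):

```
! version = 2.0

+ hello
- Hello! How can we help? Not sure? Do you mind if I ask a few questions?

// Capture the project request after our opening question
+ *
% * ask a few questions
- <set request=<star>>Great, how soon do we need to get started?

// Capture the timeline
+ *
% * do we need to get started
- <set timeline=<star>>Can you give me a rough idea of your budget?

// Capture the budget
+ *
% * rough idea of your budget
- <set budget=<star>>OK, can you tell me a summary of the problem to be solved?

// Capture the project description
+ *
% * problem to be solved
- <set description=<star>>Also, who referred you to us?

// Capture the referral and close out
+ *
% * who referred you to us
- <set referral=<star>>Thanks! I have everything I need to submit your request.
```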
05. Request submission
Unlike standard form variables, which are sent to another page or service for processing, a chatbot can validate and submit the information a user enters in the chat window (or speaks) immediately, which also means users can easily revisit previously entered values.
We needed to send the user’s request entered in the chatbot UI via the JSON REST API to an external project tasking server.
In RiveScript-js we are free to make use of an XMLHttpRequest object to submit the request as soon as the user has entered the data:
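Our production code isn't reproduced here, but a sketch along these lines captures the idea: build a JSON payload from the answers the bot collected, then POST it with XMLHttpRequest. The endpoint URL, field names and function names are placeholders of our own, not part of the RiveScript API:

```javascript
// Build the JSON payload from the answers the bot collected.
// The field names mirror our intake form and are illustrative.
function buildPayload(answers) {
  return JSON.stringify({
    request: answers.request,
    timeline: answers.timeline,
    budget: answers.budget,
    description: answers.description,
    referral: answers.referral
  });
}

// POST the payload to the (placeholder) project-tasking endpoint.
// Runs in the browser, where XMLHttpRequest is available.
function submitRequest(answers, callback) {
  var xhr = new XMLHttpRequest();
  xhr.open('POST', 'https://example.com/api/project-requests');
  xhr.setRequestHeader('Content-Type', 'application/json');
  xhr.onload = function () {
    callback(null, xhr.status);
  };
  xhr.onerror = function () {
    callback(new Error('Request failed'));
  };
  xhr.send(buildPayload(answers));
}
```

Because the payload is built as soon as the final answer arrives, there is no separate 'submit and wait' page: the bot can confirm receipt in the same conversation.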
06. Do not fear the chatbot
Soon, current ways of interacting with computers to obtain information will give way to AI-based technology like chatbots, where people simply issue voice commands, as we've seen with tech such as Amazon Echo and Google Home.
The web design community need not fear – we should all be embracing the added value of this new technology.
Chatbots could be a game-changer for the companies that deploy them, offering fully scalable customer service and improved customer intelligence.
This article was originally featured in net magazine, the world's best-selling magazine for web designers and developers.