I'm a developer at the Government Digital Service, working with a team that built a custom webchat prototype earlier this year as part of the webchat alpha.
We looked at which technologies help make webchat software more usable with screen readers. This blog post talks about what we did.
Why we focused on screen readers
Let’s be clear: there’s more to accessibility than just screen readers. Some accessibility needs are better met by improvements in visual design and content design. Users with visual impairments, for example, may benefit from more accessible visual design, while users with dyslexia may benefit from accessible content design.
Screen readers are slightly different because they can be made to work better with very specific web platform technologies that are well known and documented, such as WAI-ARIA.
Below are some of our most important findings.
Buttons and labels
Labels provide context for screen readers so that they can tell users which options are available to them.
If your message submission field doesn’t have a visible label, you should use a hidden label. These hidden labels, combined with HTML button elements, make navigating and operating webchat easier for users with screen readers.
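As a minimal sketch (the `visually-hidden` class name and the ids here are illustrative, not taken from our prototype), a hidden label and a native button might look like this:

```html
<style>
  /* A common visually-hidden pattern: removes the label from view
     but keeps it in the accessibility tree for screen readers */
  .visually-hidden {
    position: absolute;
    width: 1px;
    height: 1px;
    overflow: hidden;
    clip: rect(0 0 0 0);
    white-space: nowrap;
  }
</style>

<form>
  <label for="webchat-message" class="visually-hidden">Type your message</label>
  <input type="text" id="webchat-message" name="message">
  <!-- a native <button>, unlike a styled <div>, is keyboard-focusable
       and announced as a button by screen readers -->
  <button type="submit">Send message</button>
</form>
```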
Aria-live regions

We wrapped sections of our webchat prototype in aria-live regions. These regions let us announce changes in the user journey directly to the user during a webchat. For example, if a user is waiting in a queue, they’ll be told when their position in the queue changes. Aria-live regions can also be used:
- to tell users when an adviser is typing
- to tell users when an adviser goes offline or becomes unavailable
- to tell users that the webchat feature has appeared since they first landed on the page
- to announce a new chat message
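As an illustrative sketch (the ids and wording are assumptions, not from our prototype), an aria-live region is simply a container that screen readers monitor for changes:

```html
<!-- "polite" waits for a pause in speech; "assertive" interrupts -->
<div aria-live="polite" id="queue-status">
  <!-- updating this text, for example to "You are 2nd in the queue",
       is announced automatically by screen readers -->
</div>

<!-- role="log" has an implicit aria-live value of "polite",
     which suits a growing chat message history -->
<div role="log" id="chat-messages"></div>
```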
Be careful not to overwhelm your user with too many notifications.
We found that different screen readers have different behaviours when a new item is placed inside an aria-live container. It was important for us to test this with several assistive technologies. We found:
- VoiceOver with Safari (OS X 10.11) will interrupt the current announcement with an aria-live notification
- NVDA with Firefox and IE11 on Windows 7 will queue the new aria-live notification until after the current announcement has ended
Sounds

We use sound to improve accessibility too. Sounds help us communicate changes during the webchat. If someone is moving up in a queue or receiving a new message, we can signal this with a particular sound.
However, we also make sure that no information is communicated based on sound alone, so we don’t exclude users who are hard of hearing.
Aria-labels

We use aria-labels on elements that don’t have enough descriptive text and therefore rely on positioning or context to convey their purpose. For example:
- ‘Minimise the webchat’ is better than ‘Hide’
- ‘Close the webchat’ is better than ‘Close’
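As a sketch, an aria-label overrides the visible text for screen reader users (the visible text here mirrors the examples above):

```html
<!-- announced as "Minimise the webchat, button" -->
<button aria-label="Minimise the webchat">Hide</button>

<!-- announced as "Close the webchat, button" -->
<button aria-label="Close the webchat">Close</button>
```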
You can also add extra words to an aria-label to make it easier for the user to understand. ‘Tom said: How can I help?’ is more conversational than ‘Tom: How can I help.’ But our testing showed this isn’t always the best option.

Try out alternatives and see what works best for your particular user journey. If webchats are likely to consist of lots of messages between the user and adviser, the longer phrasing might get in the way.
Skiplinks and focus
Skiplinks are internal page links (or shortcuts) which help users navigate around the current page or interface. You can implement skiplinks that allow the user to:
- go straight to the webchat window
- go straight to the message submission field when they’ve finished inputting their message
- go straight to the ‘Yes’ option when asking the user if they want to end the chat
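A minimal sketch of the first of these (the ids and class name are illustrative, not from our prototype):

```html
<a class="skiplink" href="#webchat-window">Skip to webchat</a>

<!-- tabindex="-1" lets the target receive focus when the skiplink
     is followed, without adding it to the normal tab order -->
<div id="webchat-window" tabindex="-1">
  <!-- webchat interface -->
</div>
```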
These patterns will make your webchat more accessible, but these alone are not enough. Make sure that your visual design and content design are of a high standard. Crucially, test your implementation of these accessibility patterns with users who have access needs.