https://accessibility.blog.gov.uk/2016/06/09/research-with-blind-users-on-mobile-devices/

Research with blind users on mobile devices

Posted by: Ed Horsford, Posted on: 9 June 2016 - Categories: Design, Testing, User research

I’m Ed, a designer at GDS. Last year we did some usability research with users with a range of visual impairments. We found out lots of interesting things, and one in particular was about the way blind users of touchscreen devices experience the web.

People often assume that because users are blind or partially sighted they won't be able to use touchscreen devices. That's not the case - for the most part our participants could use them easily, often preferring their phone or tablet to their desktop or laptop. However, using a screen reader by touch is different from using it with a keyboard - our research found that there are different usage patterns and new design challenges to think about.

Different ways of using the web

These observations come from watching blind users in the lab using VoiceOver on iPads and iPhones, but they may apply to other touchscreen devices too.

VoiceOver is a screen reader that comes included on Apple devices. On Macs, users usually navigate it with the keyboard. On iPhones and iPads it works differently - users navigate by touch. They can have each element on the page read out in order, or have VoiceOver read out whatever is beneath their finger. It's this last behaviour that was particularly interesting.

Navigating by touch

Rather than having the page read out in order, some of the users navigated by moving their finger around to 'discover' content. We saw a few different ways of doing this.

One user swept their finger left to right (like using a metal detector). Another ran their finger down the middle of the screen from top to bottom, listening to the items being read out as they passed over them. When they felt they'd missed something, they started tracing their finger further to the left, often from the top, skipping quickly past bits they'd heard already, before eventually searching on the right.

https://www.youtube.com/watch?v=o_hxi0qQZAM

In the video, a user slides their finger down the middle of an iPad screen. VoiceOver is on, so they hear the page's content as they browse. Because their finger stays in the middle of the screen, they miss the "Upload your photo" button, which is on the left.

This was really interesting - traditionally, good structure for screen readers is about order and hierarchy. But for these users the physical placement of elements on the screen was also really important (just as it is for sighted users). We don't know how common this usage pattern is, but it was a notable finding.
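As a rough illustration of how the two can diverge, here's a hypothetical page fragment (not our actual markup): the button comes early in the source order, so a keyboard screen reader reaches it quickly, but the CSS floats it far to one side, where a finger tracing down the middle of the screen never passes over it.

    <!-- Hypothetical markup: good source order, but the visual
         placement pushes the button to the far left. -->
    <style>
      .start-button { float: left; width: 25%; }
      .intro        { float: right; width: 70%; }
    </style>

    <button class="start-button">Start now</button>
    <div class="intro">
      <p>Guidance text that sits in the middle of the screen,
      where a finger tracing downwards will find it.</p>
    </div>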

Finding smaller elements

Navigating by touch also means that small elements on the page may be harder to find. In the video above you can see a user miss our 'Start now' button because it was far over to the left. Fortunately there were clues in the content to direct them there, but it’s worth exploring other ways to make sure we don’t confuse users with elements that seem to be missing.

Note: on smaller screens (like mobiles) we use full-width controls, so this shouldn't be an issue.
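As a sketch of that approach (the class name and breakpoint are illustrative, not taken from GDS's actual stylesheet):

    <style>
      /* On narrow screens, let the primary action span the full
         width, so a finger moving down the page passes over it
         whatever its horizontal position. */
      @media (max-width: 640px) {
        .button-start {
          display: block;
          width: 100%;
        }
      }
    </style>

    <button class="button-start">Start now</button>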

It’s definitely a nudge to be consistent and not place elements in unexpected places on the page. Having a linear order that users can scan down with a finger worked well. That's not to say you can't have multiple columns, but you should think carefully about how they're used.

Elements close together

The other problem with small page elements came when they sat very close together. ‘Date of birth’ caused problems for one of our users: there are three fields and three labels, placed very close to each other. This meant the labels were read out in quick succession, with small changes in the position of the user's finger suddenly triggering another label.

The author struggling to enter their date of birth using VoiceOver on a mobile phone.
Fat fingers (mine) and inputs close together.

It took the user a long time to find the right box and enter their date, even though they knew what they needed to do. My takeaway was that we could probably move our boxes slightly further apart so they're easier to tell apart.
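A small sketch of what that spacing might look like (the class names and values are illustrative):

    <style>
      /* Illustrative spacing: pushing the day, month and year
         inputs apart gives each one a larger, more distinct
         area under the finger. */
      .dob input { width: 4em; margin-right: 1.5em; }
    </style>

    <div class="dob">
      <label for="dob-day">Day</label>
      <input id="dob-day" name="dob-day" type="text">

      <label for="dob-month">Month</label>
      <input id="dob-month" name="dob-month" type="text">

      <label for="dob-year">Year</label>
      <input id="dob-year" name="dob-year" type="text">
    </div>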

Tablets are really focused

The great thing about using a tablet or phone was that users only ever had one context to think about at a time. Often that means there’s less to go wrong. Error messages didn’t pop up and disrupt the workflow in quite the same way, and other programs didn’t grab focus - a huge problem when those apps aren’t accessible.

More research needed

Usage of mobile devices has increased rapidly over the last few years - including among people with disabilities. Much of what I've learned from this research is that we need to do far more in this area: think more about how these new technologies affect usage, and about how we design for these new behaviours. If you've got any findings or research in this area, we'd love to hear from you.

Follow Ed on Twitter and don't forget to sign up to email alerts.

8 comments

  1. Comment by Phil Buckley

    Ed, this is really interesting, thanks for sharing.

    We also came across screen reader users who struggled when we asked for more than one date on a page: in our case, outbound journey and return journey. The labels "Day, Month, Year" are the same for both journeys, which made it difficult for some users to differentiate. There's no easy fix here, however, as the labels are very clear for sighted users.

    I'd love to hear more on this if you get a chance to blog again.

    • Replies to Phil Buckley

      Comment by Ed Horsford

      Hi Phil,

      It might be worth checking your markup - it sounds like perhaps your date fields aren't wrapped in a fieldset with a legend. When multiple fields are wrapped like that, the user will be read the connecting legend - in this case 'outbound journey' or 'inbound journey'. Otherwise it might be worth checking you've not got any duplicated IDs or things causing the legends to go wrong.
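      A sketch of the kind of markup that helps here (the field names are illustrative) - screen readers announce the legend together with each field inside the fieldset:

        <fieldset>
          <!-- Each field is announced with the legend, so 'Day'
               becomes 'Outbound journey, Day'. -->
          <legend>Outbound journey</legend>
          <label for="out-day">Day</label>
          <input id="out-day" name="out-day" type="text">
          <label for="out-month">Month</label>
          <input id="out-month" name="out-month" type="text">
          <label for="out-year">Year</label>
          <input id="out-year" name="out-year" type="text">
        </fieldset>

        <fieldset>
          <legend>Return journey</legend>
          <label for="ret-day">Day</label>
          <input id="ret-day" name="ret-day" type="text">
          <label for="ret-month">Month</label>
          <input id="ret-month" name="ret-month" type="text">
          <label for="ret-year">Year</label>
          <input id="ret-year" name="ret-year" type="text">
        </fieldset>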

  2. Comment by John

    Great post. As devices become more accessible, people will find so many new and imaginative ways to use them. One VoiceOver user told me how they used their iPhone on their commute with a Bluetooth keyboard and earphones. As they didn't need to see the phone's screen, it stayed in their bag. And they often overheard people wondering what they were doing with the tiny keyboard and no computer.

  3. Comment by Jeremy Anderson

    Intriguing post. I'd be interested to find out how many people are "centre-draggers" and how many are "left-hand-draggers", or whether people are both, depending on whether they are on tablets or mobiles. (I think it is GDS standard for all buttons to be 100% width on mobile).

  4. Comment by lindsay macvean

    Fantastic research. There are not enough resources out there to show non-impaired developers how users of assistive technology actually interact with their software.

  5. Comment by Leonie Tame

    Enjoyed this, thanks for posting, and I agree with Lindsay's comment - there's very little out there about how touchscreens and mobile devices are used by people with disabilities. I'd love to know if anyone's found any good material on this subject.

    Something I've observed with vision impaired people using desktop computers is that combinations of magnification software and screen readers are not unusual. Magnifying what's on the screen means there's a trade-off between putting inputs, boxes and other elements further apart (which helps people who struggle to control a mouse, for example) and putting them so far apart that magnification users miss them.

  6. Comment by Ben

    Awesome article, and the research behind it is mind-blowing. Devices have made our lives much easier - they're a necessary part of life now.

  7. Comment by Bryn Anderson

    Excellent, very interesting article! It's absolutely fascinating how blind users interact with computers, mobiles and tablets. The first time I saw a blind user using a mobile phone (an iPhone 5) I was amazed at the speed at which he was able to navigate the interface. In fact, it seemed he was a lot more efficient at consuming the information than me, who is visually impaired rather than blind. It seemed kind of ironic that no vision can be more effective than low vision when using a device that is mostly screen.

    As a visually impaired user I personally find using a mobile exhausting, as I have to hold it very close to my face, and zoom and scroll constantly while squinting at the screen. I'd take a phone call over a text message conversation any day 🙂