After the Accessibility Audit


Cheers to you and your company. You’ve done the accessibility audit.

Accessibility Audit

Either you just heard about the need to make all websites accessible to people with disabilities, or you got an unwelcome letter from the Department of Justice (DOJ) or the Office for Civil Rights (OCR). Either way, you went ahead and hired a company to do an accessibility audit.

If the audit company was good, it went through your website or other software and measured it against Section 508 (the US federal government’s standards) or WCAG 2.0 (international guidelines that are now partially incorporated into Section 508).

In either case, you got a report and are working with your development team to fix the flaws. That was a worthy use of $5k–$20k of your company’s money.

Done, right?  Not so fast.

More work to be done

The chances are really, really good that your company is still using inaccessible software and hardware that is also covered under Section 508 standards.

One way to minimize your risk is to stop purchasing Information and Communication Technology (ICT) that fails accessibility tests. That’s correct. Only you and your procurement department stand between inaccessible software and hardware and your employees, students, customers and whomever else you represent.

Procurement training

Get training for those procurement professionals NOW. Make sure the training includes how to ask for and analyze the latest VPAT (Voluntary Product Accessibility Template), a vendor-created document that maps a specific product you might buy against the Section 508 standards and/or the WCAG 2.0 guidelines.

VPAT training

You should be asking for a VPAT in every RFP and every time you look to purchase any sort of ICT (phone systems, printers, software packages, computers, laptops, tablets and so on).

An “average” company spends between 4 and 6% of its annual budget on ICT purchases. For a company with a $15 million budget, that means annual expenditures of $600k to $900k. With that much money on the table, the cost of high-quality training and guidance, at $6k to $10k, is just a drop in the budget.

Seek out an accessibility specialist who can train your staff, either onsite or remotely, and help them learn how to analyze VPATs. This specialist should also be able and available to coach your staff on what questions to ask vendors AND what answers to accept.

To recruiters: 5 myths about QA accessibility testing


A new flavor of QA job order is rolling in to recruiters nationwide: a request for an immediate, experienced QA professional who does accessibility testing. Here are timely suggestions for what the recruiter needs to ask of:

  1. The employer at time of receiving the job order
  2. The candidate at time of initial screening/ interview


First: a short background and a list of terms the recruiter needs in order to get up to speed on the topic. What, homework!? Yes, homework, so you’ll do your job right.

Even though the ADA (Americans with Disabilities Act) is 25 years old, its application to software is still being defined. In short, you are not going to find an exact definition anywhere of what success looks like in this area.

With that said, gain a passing understanding of what these two terms mean: “Section 508” and “WCAG 2.0”.

Section 508 is a set of standards the US federal government has established to measure the electronic and information technology hardware and software it uses. Please note that these standards were last updated in 2000 (a lifetime ago in internet terms). A “refresh” of the standards is due soon.

This is where WCAG comes in. While Section 508 has worked its way through the government process at a snail’s pace, others around the world have been working on similar guidelines. WCAG is not a law; nevertheless, most software companies look to align with these guidelines because they are more comprehensive and up to date. NOTE: Do not try to read all of WCAG. There are thousands of pages, and as many ways to implement the guidelines as there are developers and planners.

5 myths:

Myth #1

Screen Reader (JAWS, NVDA) equals a “test tool”.

Reality Check:

Many employers will ask that the QA skill set includes “testing” with JAWS or NVDA.  Do not fall into the trap of thinking that these are test tools.

These software programs are Assistive Technology (AT). They convert the information encoded on the screen into audio or refreshable Braille output for people who would otherwise have trouble accessing the internet or programs such as Microsoft Word.

When testing for accessibility, qualified and aware QA professionals typically use screen readers to validate the end user experience of websites or software programs. This is hugely helpful, but is not the main method of testing.

Recruiter Question to Employer: Are there specific screen readers your users rely on? In the case of employee-facing software, the employer probably has a preference. In cases where the software to be tested is for customers, you will have to follow best practices, which generally means JAWS for US-based companies. NVDA is also widely used.

Recruiter Question to prospective QA Employee:

What test tools do you use to validate the code against accessibility standards?

[Give points for answers such as WAT (an IE toolbar from The Paciello Group, http://www.paciellogroup.com) or the WAVE toolbar (Firefox and Chrome plugins from WebAIM, http://webaim.org).

Give points for answers which discuss evaluating the code for items like form labels and WAI-ARIA roles and properties.]

If the QA person says they use JAWS, for instance, to test with, ask them how they do that.
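
For context, a code-level check of the kind a strong candidate describes can be as simple as walking the DOM for form fields that have no programmatic label. Here is a minimal, hypothetical TypeScript sketch that could be run in a browser console or test harness; it illustrates the idea and is not any particular toolbar’s rule set.

```typescript
// Flag form fields with no programmatic label a screen reader could announce.
// A field counts as labeled if it has an associated <label>, an aria-label,
// or an aria-labelledby reference.
function findUnlabeledFields(): HTMLElement[] {
  const fields = document.querySelectorAll<HTMLInputElement | HTMLSelectElement | HTMLTextAreaElement>(
    'input:not([type="hidden"]), select, textarea'
  );
  const unlabeled: HTMLElement[] = [];
  fields.forEach((field) => {
    const hasLabelElement = (field.labels?.length ?? 0) > 0 || field.closest('label') !== null;
    const hasAriaLabel = field.hasAttribute('aria-label') || field.hasAttribute('aria-labelledby');
    if (!hasLabelElement && !hasAriaLabel) {
      unlabeled.push(field);
    }
  });
  return unlabeled;
}

// Usage: log the offending fields so a tester can review them by hand.
console.log('Unlabeled form fields:', findUnlabeledFields());
```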

Myth #2

User Experience is the same for all users.

Reality Check:

Even if a site is usable for the general public, there is a need to do additional usability tests for people with disabilities (PWD).

Real example: the code was properly written to be read by a screen reader, but because the fields were laid out on the screen in a certain order, the page failed the usability test.

A resume upload page on a very common HR recruitment site had an input box below the Submit button. The sighted user could easily see the Comments box and choose to type something like “Portfolio” or “Cover letter”. The blind, screen reader user would not encounter that box until AFTER they had submitted their resume. At this point there was no chance to add the comment or go back.

The page passed the Section 508 and WCAG criteria but failed the simple usability test.
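
To make the example concrete, here is a rough TypeScript sketch of how a tester might flag this pattern automatically: it reports any form field that sits after the submit button in DOM order, which is the order a screen reader and the Tab key follow by default. The selectors are generic illustrations, not the recruitment site’s actual markup.

```typescript
// Report form fields that sit after the submit button in DOM (and default Tab) order.
// A sighted user can glance back up the page; a screen reader user moving linearly
// may never reach these fields before submitting.
function fieldsAfterSubmit(form: HTMLFormElement): HTMLElement[] {
  const submit = form.querySelector<HTMLElement>('button[type="submit"], input[type="submit"]');
  if (!submit) return [];
  const fields = Array.from(
    form.querySelectorAll<HTMLElement>('input:not([type="hidden"]), select, textarea')
  );
  return fields.filter(
    (field) => (submit.compareDocumentPosition(field) & Node.DOCUMENT_POSITION_FOLLOWING) !== 0
  );
}

// Usage: run against each form on the page and review anything reported.
document.querySelectorAll('form').forEach((form) => {
  const late = fieldsAfterSubmit(form);
  if (late.length > 0) console.warn('Fields after the submit button:', late);
});
```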

Recruiter Question to Employer: Will there be an opportunity in this role for the QA person to discuss the site with designers and/ or product managers?

[Give points whenever an employer understands that this discipline is a cross-functional, cross-team effort.]

Recruiter Question to prospective QA Employee: What experience do you have with creating/ testing against User personas? How much experience do you have working with people who need assistive technology (AT)?

[Give points for any recognition of what UX is, what personas are and, most of all, if the QA resource has worked with PWD.

Give points if the QA resource has had any experience explaining that websites/ other software may present challenges to those who are using AT. If they have done so, ask them to describe the outcomes.]

Myth #3

You have to test all pages on a site to do an accurate accessibility test.

Reality Check:

Given that so many sites have hundreds or even thousands of pages, it is impractical to test every page comprehensively.

The usual method is to concentrate on the primary pages which handle the most user traffic (Home, Search results, Contact us, the main shopping and checkout pages, etc.). If those pages reveal serious problems, they must be fixed first. Only later should the testing go deeper. Additionally, newly coded and added pages should include accessibility testing as they are prepared for production.
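
One way to make that prioritization explicit is a small, shared test plan that everyone (QA, product, development) can see. The pages, priorities and notes below are hypothetical placeholders, sketched in TypeScript only to show the shape of such a plan:

```typescript
// A hypothetical, minimal test-plan structure: core workflows first, deeper pages later.
interface AccessibilityTestTarget {
  page: string;        // route or page name
  priority: 1 | 2 | 3; // 1 = highest traffic / core workflow
  notes?: string;
}

const testPlan: AccessibilityTestTarget[] = [
  { page: '/',               priority: 1, notes: 'Home: navigation, landmarks, headings' },
  { page: '/search',         priority: 1, notes: 'Search results: focus management, announcements' },
  { page: '/checkout',       priority: 1, notes: 'Checkout: form labels, error handling' },
  { page: '/contact',        priority: 2 },
  { page: '/account/orders', priority: 3, notes: 'Deeper page: test after core flows pass' },
];

// Work the plan in priority order.
testPlan
  .sort((a, b) => a.priority - b.priority)
  .forEach((t) => console.log(`Test ${t.page} (priority ${t.priority})`));
```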

Recruiter Question to Employer: Will the QA resource be presented with the core workflows so they can include them in the testing? Who has developed the initial test plans?

[Give points if Product Management is involved]

Recruiter Question to prospective QA Employee: How many scenarios do you usually test? What are the main paths? Do you find that pages deeper in a site present different challenges than the Home page?

[Give points for any answers that indicate that the QA resource is familiar with accessibility being disproportionately built into the Home page and ignored further into the site. Give points to a QA resource who mentions that some features, while not on every page, would warrant special testing – forms, videos (for captioning and general keyboard accessibility of video player controls).]

Myth #4

Anyone who has a degree in Computer Science would know how to test for Accessibility.

Reality Check:

The fact is that most computer science programs, boot camps, and assorted free and paid courses do not even touch on accessibility-related coding. Most programmers who know how to prepare their code for consumption by assistive technology, and by people who use alternative methods of understanding the UI, have learned it “on the job” or by doing research on their own.

Recruiter Question to Employer: What specific skills are you looking for in a QA resource to test your site/ software for accessibility?

[Give points for any answer that acknowledges that most resources will not have gained this knowledge in a common computer science program. Give points if the company already has some knowledgeable people on the ground who can help the new QA resource.]

Recruiter Question to prospective QA Employee: Where did you learn about accessibility and accessibility testing?

[Be prepared to hear a long and drawn-out version of their discovery process. Be patient and take notes. Be skeptical if they say they easily picked it up in a class or at another job.]

Myth #5

There is no need to do manual testing, because all the tests can be automated.

Reality Check:

While there is much merit in automating as many checks as possible against a clean and correct code base, even the most advanced automated testing tools can only catch code-level violations (mismatched HTML tags, for instance).

None of them can accurately replicate the experience of a person using a keyboard-only approach or various assistive technologies. See Myths #1 and #2.
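
To illustrate the gap, the checks below are the kind an automated pass handles well; they are a hypothetical sketch rather than any specific tool’s rule set, and the closing comment marks where manual testing has to take over.

```typescript
// Code-level violations an automated pass can flag reliably.
function automatedChecks(): string[] {
  const findings: string[] = [];

  // Images with no alt attribute at all (an empty alt="" is a deliberate choice for decoration).
  document.querySelectorAll<HTMLImageElement>('img:not([alt])').forEach((img) => {
    findings.push(`Image missing alt attribute: ${img.src}`);
  });

  // Missing page language declaration.
  if (!document.documentElement.hasAttribute('lang')) {
    findings.push('<html> element has no lang attribute');
  }

  // Missing or empty page title.
  if (!document.title.trim()) {
    findings.push('Document has no title');
  }

  return findings;
}

// What no automated pass can tell you: whether the Tab order is logical, whether
// dynamic content is announced, or whether the page makes sense with a screen
// reader. That part is manual (see Myths #1 and #2).
console.log(automatedChecks());
```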

Recruiter Question to Employer: Are you expecting to have the QA resource convert some of their tests to an automated system? If so, which one are you using and are you aware of any necessary accessibility plug-ins that will be required?

(Give points if they already understand that there is a limit to the usefulness of applying automated testing across the board to this effort. Part of your role here is to help set expectations of what is possible in reality.)

Recruiter Question to prospective QA Employee: Have you ever committed your accessibility tests to an automated process? If so, which one? How effective has this been? What do you know about accessibility plug-ins for the major automated test systems?

(Give points for honesty when the tester has not been able to do much that is useful with automated tools. Give big points if they mention anything covered in this article by CogApp or this one where researchers in Norway examined the results of 12 automated accessibility checkers. Yes, done in Norway, but the checkers are global.)

Wrapping it all up:

QA teams can and should test for accessibility diligently, deliberately and determinedly whenever a software project is undertaken in 2016. However, this kind of QA is by default dramatically different from other areas of QA.

Get familiar with the ins and outs and prepare to make significant money for your recruiting firm and your QA resources. This is a niche that is not going away. Think of it the way you think about placing QA security testers, and prepare to reap the same kind of benefits.

QA for SaaS products


QA effectively applied to a SaaS product


I wish I could say that, when given the assignment of auditing an HR recruiting system delivered as a SaaS (Software as a Service) product, I predicted exactly what path we’d have to go down to get fixes made. I could not have been more wrong.

For many years I worked on the vendor side. We owned the code and we fixed it. This process is well known in software circles.

In SaaS you subscribe to software and you are the customer. The vendor still owns the code, but they are on the opposite side of the development fence.

When working for the Commonwealth of Massachusetts to assess the Oracle product Taleo, which we were going to use to update and modernize the hiring process, I worked out the following process for getting accessibility bugs fixed, using trial and error as my guide.

First, I used a keyboard-only approach to audit the two initial sections of the HR recruiting system. Then I used the WAVE toolbar and the WAT toolbar to look into code violations. Then I used both the JAWS screen reader and ZoomText to replicate the user experience with assistive technology.
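
As a supporting step for a keyboard-only audit, it can help to dump the page’s focusable elements in DOM order and walk that list against the visual layout. A rough, hypothetical sketch of that step (not part of the formal Taleo process) might look like this:

```typescript
// List keyboard-focusable elements in DOM order, which is the default Tab order.
// Elements with a positive tabindex jump ahead of that order and deserve extra scrutiny.
function listFocusable(): void {
  const focusable = document.querySelectorAll<HTMLElement>(
    'a[href], button, input, select, textarea, [tabindex]:not([tabindex="-1"])'
  );
  focusable.forEach((el, i) => {
    const name =
      el.getAttribute('aria-label') || el.textContent?.trim() || el.getAttribute('name') || '';
    console.log(`${i + 1}. <${el.tagName.toLowerCase()}> tabindex=${el.tabIndex} "${name.slice(0, 40)}"`);
  });
}

listFocusable();
```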

The candidate portal, where John and Jane Doe apply for a job, was written in HTML. There were bugs and issues, but they could be fixed. The hiring manager portal was another story. Written in Flex, a Flash-based framework, it was essentially unusable with the keyboard or the JAWS screen reader.

Another day I’ll write about how we dealt with the hiring manager portal. This is to explain how we managed to get 20 accessibility fixes into the last two releases of the Taleo product.

As customers we had access to an externally facing web application where we could log bugs. In a four-month period I logged 30 tickets, which represented 42 separate bugs (some were consolidated because they were similar), and worked with 17 different customer service agents (CSAs) in the US, Romania and China.

The process of getting bugs accepted by the CSAs and moved to the next step, placement in the internal development bug-tracking system, was painful and slow. Each CSA had to understand enough about both AT and the accessibility guidelines to believe that the issue was really an issue.

If they didn’t accept it as a bug, their default was to politely reject it and call it an “enhancement”. During the time I worked on this project, Oracle opened a second website for enhancements, where customers were encouraged to log their wish lists and try to socially promote them. I really hated this “popularity contest” approach, but we had to use it.

We engaged in a series of cross-functional meetings between the Commonwealth and Oracle representatives from sales, product management and customer service. We explained exactly what barriers a person using JAWS would face when trying to navigate the original site. In this way several “enhancements” were accepted onto the faster track as “development fixes.”

Of the original bugs filed, 21 were included in the past two releases with 8 more pending for the future. Although it seemed at times like I was doing the QA for Oracle on this product I really didn’t mind because I wanted to get the Commonwealth and other Taleo users the most accessible product possible.

Accessibility requirements

Examples of requirements written for Taleo to fix accessibility issues.

#1 –

As a JAWS screen reader user I want to know about all mandatory fields when filling out a form. On the personal information page of the Taleo job application there is a two-field mandatory entry to indicate where the applicant learned about the job position.

When tabbing through the form, the user arrives at the first drop-down box, labeled “Source Type”. This permits the user to select, using the up/down arrows, a general category such as ‘Career Fair’ or ‘Magazine’. On pressing Enter, the system opens a second, mandatory drop-down box with choices that further refine the source. This drop-down box appears visually on the screen but is not announced. When the user Tabs again, they are taken to the Submit button, skipping over the second drop-down box.

This causes an error on the page, which the user has to discover and fix before moving on through the application.

Expected: all mandatory fields on a page will be reachable using a keyboard-only approach. All page elements will be announced via JAWS when they appear on screen.

Actual: the second drop-down box is skipped when the user first Tabs through the screen.

Fix: the second drop-down box can be reached with Tab, JAWS reads its label, and the user is able to complete the action without incurring a page error.
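
A hedged sketch of how a fix like this is commonly implemented: when the first selection changes, the dependent drop-down is revealed immediately after it in the DOM, so the next Tab lands on it, and it carries its own label and required state for the screen reader. The element IDs below are hypothetical, not Taleo’s actual markup.

```typescript
// Hypothetical IDs; the point is DOM placement and labeling, not Taleo's real markup.
const sourceType = document.getElementById('source-type') as HTMLSelectElement;
const sourceDetail = document.getElementById('source-detail') as HTMLSelectElement;

sourceType.addEventListener('change', () => {
  // Reveal and require the dependent drop-down...
  sourceDetail.hidden = false;
  sourceDetail.required = true; // announced as "required" by screen readers
  sourceDetail.setAttribute('aria-label', 'Source detail');

  // ...and keep it directly after the first drop-down in the DOM, so the next
  // Tab lands on it rather than skipping ahead to the Submit button.
  sourceType.insertAdjacentElement('afterend', sourceDetail);
});
```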

#2 –

As a JAWS screen reader user I want to hear all error messages announced when they occur so I can take corrective action.

When the user enters a search term while looking for a job and there are no results, a message appears on the page saying “There are no results for the search term used. Please try again.”

Expected: When an error is reached, focus will be placed on the error message and JAWS will read the error text.

Actual: The error is rendered silently, leaving the JAWS user unaware that it occurred.

Fix: When an error occurs on the search page, JAWS now reads the text so the user is informed and can enter a different query.
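
The standard technique for this kind of fix is an ARIA live region: a container that assistive technology monitors, so any text placed in it is announced without moving focus. A minimal sketch (the message text is from the requirement above; the rest is illustrative):

```typescript
// A live region screen readers monitor; role="status" announces politely,
// role="alert" would interrupt the user immediately.
const liveRegion = document.createElement('div');
liveRegion.setAttribute('role', 'status');
liveRegion.setAttribute('aria-live', 'polite');
document.body.appendChild(liveRegion);

// When the search returns nothing, place the message in the live region so JAWS
// (or any screen reader) announces it, in addition to rendering it visually.
function announceNoResults(): void {
  liveRegion.textContent = 'There are no results for the search term used. Please try again.';
}
```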

#3 –

As a JAWS screen reader user I expect to move through all screen elements on a page in a logical order.

On the Taleo login page, the cursor is placed automatically in the username field when the page loads. When the user fills out that field and Tabs, focus moves to the password field. One more Tab brings them back up to the top of the page, not to the next page element, which is the ‘Forgot password?’ link.

Expected: On a login page, the user is most likely to want to enter username, then password and then Submit to move into the site.

Actual: When Tabbing, focus moves from the password field to the top of the page. Due to the page design, eight more Tabs are needed to reach the Submit button.

Fix: When the user Tabs from the password field, they move to the two “help” links, “Forgot password?” and “Forgot username?”, then to the Submit button. They are no longer taken to the top of the page and forced to make their way back through the whole page to finally Submit their credentials.
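
Focus jumping back to the top of a page is often caused by explicit positive tabindex values that override the natural DOM order (whether or not that was the root cause here). A quick, hypothetical check a tester might run:

```typescript
// Elements with a positive tabindex are pulled ahead of the natural DOM order and
// are a common cause of focus leaping to unexpected places.
const suspicious = Array.from(document.querySelectorAll<HTMLElement>('[tabindex]')).filter(
  (el) => el.tabIndex > 0
);

if (suspicious.length > 0) {
  console.warn('Positive tabindex values found; review the Tab order:', suspicious);
} else {
  console.log('No positive tabindex values; Tab order follows the DOM.');
}
```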

VPAT, please!


No, it’s not a menu item, digital or restaurant! VPAT stands for Voluntary Product Accessibility Template. It is a form that documents a software or hardware product’s Section 508 accessibility compliance (or lack thereof). Originally the form and the legislation were designed for federal contractors. Over the past 20 years or more, Section 508 has been informally recognized as an accessibility standard all should follow.

The VPAT is a registered trademark of the Information Technology Industry Council (ITI). Most importantly, the form is available free for your use. Download it.

San Diego State University publicly shares its approach to EIT procurement. There is a series of links to vendor VPATs at the bottom of this page.

The short, non-technical version is this: the VPAT is a way for procurement officials to compare your software offering with other, similar offerings as far as accessibility is concerned. More and more savvy CIOs and their teams are requesting VPATs every time they purchase Electronic Information Technology (EIT). Even more significant is that those IT professionals are doing their own tests against the products and VPATs to independently confirm the (to put it delicately) accuracy of the VPAT.
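
For illustration, a VPAT is essentially a table of criteria, each with a conformance level and remarks, which is what makes side-by-side vendor comparison possible. A simplified, hypothetical sketch of that structure in TypeScript (the exact conformance wording varies between VPAT versions):

```typescript
// A simplified model of a VPAT row; real templates group rows by Section 508
// provision or WCAG success criterion, and the wording varies by VPAT version.
type Conformance = 'Supports' | 'Partially Supports' | 'Does Not Support' | 'Not Applicable';

interface VpatRow {
  criterion: string;
  conformance: Conformance;
  remarks: string;
}

// Compare vendors by counting fully supported criteria (a crude first pass;
// the remarks still need to be read, and claims independently verified).
function supportedCount(vpat: VpatRow[]): number {
  return vpat.filter((row) => row.conformance === 'Supports').length;
}

const vendorA: VpatRow[] = [
  { criterion: '1.1.1 Non-text Content', conformance: 'Supports', remarks: 'All images have alt text' },
  { criterion: '2.1.1 Keyboard', conformance: 'Partially Supports', remarks: 'Date picker is mouse-only' },
];

console.log(`Vendor A fully supports ${supportedCount(vendorA)} of ${vendorA.length} criteria`);
```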

So. Two guiding principles.

1 – If you create software, hire someone who knows how to test it and certify it via a VPAT. This doesn’t mean you’ll pass every section. It does mean that the VPAT will be credible.

2 – If you purchase EIT, always, always ask for a VPAT. Always take some time to test against it. Trust but verify!

We provide these services – both creating VPATs and testing against them. Contact us for more details.


To Caption or not to Caption


As soon as your content includes videos, you should be thinking about captioning. In its simplest form, captioning is the words on the screen that you can read as the audio portion of the program is spoken. The common assumption is that captioning is only useful for people who are deaf or hard of hearing. Not true. An oft-quoted UK survey from 2006 revealed that 80% of TV viewers who used closed captioning did not have hearing difficulties.

Times have changed and technology has advanced, yet many people without any hearing loss use captioning as a way to enhance and enjoy videos. They may be listening in a public area and not want to disturb others, riding a noisy subway where they situationally can’t hear the sound, or simply getting more out of the content by watching and reading the words at the same time (dual sensory input).

The law: The US Congress passed the Twenty-First Century Communications and Video Accessibility Act in 2010, with a string of updates and clarifications through 2015. Two broad areas are covered. One is products and services that use broadband. The other is video programming on television and the internet.

In February 2015, the National Association of the Deaf (NAD) and several individuals sued Harvard and MIT over the lack of captioning, or very poor captioning, of the video content they provide, covering online lectures, podcasts, courses and more. In June 2015, the Department of Justice (DOJ) joined the lawsuit, advocating a speedy resolution and that captioning be provided.

Netflix has led the way in providing full captioning, even though that was, regrettably, the result of a successful lawsuit. YouTube provides an automated captioning option. The results, while technically over 90% accurate, often lead to hilarious (if you are able to hear the discrepancies) and wildly incongruous word juxtapositions. If you doubt me, just switch on captioning the next time you are on YouTube and find your own examples. Any time a person mumbles or has the slightest speech anomaly or accent, there is no telling how the automated captioning will render it.

So, to caption or not to caption? When in doubt, build decent captioning into your process: it includes everyone and builds your brand as inclusive.
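
On the implementation side, web captions are usually delivered as a separate timed-text file (commonly WebVTT) attached to the video player. A minimal sketch using the standard HTML5 track mechanism; the file path is a hypothetical placeholder:

```typescript
// Attach an English captions track to an HTML5 video element.
// '/media/captions.vtt' is a hypothetical WebVTT file containing the timed caption text.
const video = document.querySelector<HTMLVideoElement>('video');

if (video) {
  const track = document.createElement('track');
  track.kind = 'captions';
  track.label = 'English';
  track.srclang = 'en';
  track.src = '/media/captions.vtt';
  track.default = true; // show captions by default
  video.appendChild(track);
}
```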

Screen reader FAQ

Your customers and employees with visual impairments have a way to “view” your website. The assistive technology is called a screen reader. Once loaded onto a user’s computer the screen reader converts the website code into speech (or a refreshable Braille display).

The most common screen reader in use in the US is called JAWS. It was created by Freedom Scientific, which releases updated versions approximately once a year, plus bug fixes throughout the year. According to the 2014 WebAIM screen reader survey, 30% of respondents use JAWS. That may seem low until you consider that it has long been the screen reader of choice in AT programs teaching young blind and visually impaired people how to use a screen reader, and that it is in widespread use in industry, library systems and government at various levels.

JAWS is also expensive – about $1000 a license. People without the financial resources often fall behind in updating their licenses.

NVDA (NonVisual Desktop Access) is a free screen reader created and updated by an Australian team. It is used worldwide and is available in 43 languages. NVDA has been downloaded over 70,000 times. The fascinating story of how two young men, both blind, came to create this product can be read here.

Window-Eyes is free and available for those who have an Office 2010, 2013 or 2016 license.