QA for SaaS products

QA effectively applied to a SaaS product


I wish I could say that, when I was given the assignment of auditing an HR recruiting system delivered as a SaaS (Software as a Service) product, I could have predicted exactly what path we’d have to go down to get fixes made. I could not have been more wrong.

For many years I worked on the vendor side. We owned the code and we fixed it. This process is well known in software circles.

In SaaS you subscribe to software and you are the customer. The vendor still owns the code, but they are on the opposite side of the development fence.

While working for the Commonwealth of Massachusetts to assess Taleo, the Oracle product we were going to use to update and modernize the hiring process, I developed the following process for getting accessibility bugs fixed, using trial and error as my guide.

First, I used a keyboard-only approach to audit the two initial sections of the HR recruiting system. Then I used the WAVE and WAT toolbars to look for code violations. Finally, I used both the JAWS screen reader and ZoomText to replicate the user experience with assistive technology.

The candidate portal, where John and Jane Doe apply for a job, was written in HTML. There were bugs and issues, but they could be fixed. The hiring manager portal was another story. Written in Flex, an Adobe framework that runs on Flash, it had no redeemable value for use with the keyboard or the JAWS screen reader.
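To give a concrete sense of what a fixable bug in the HTML portal looks like, here is a minimal, hypothetical sketch (not actual Taleo markup) of the kind of violation the WAVE and WAT toolbars flag during this sort of audit, along with the straightforward fix:

```html
<!-- Hypothetical violation: a text input with no programmatic label,
     so JAWS announces only "edit" and the user has no idea what to type. -->
<input type="text" name="firstName">

<!-- The fix is plain markup: associate a label with the field so a
     screen reader user hears "First name, edit". -->
<label for="firstName">First name</label>
<input type="text" id="firstName" name="firstName">
```

A markup-level issue like this is exactly the kind of thing that “could be fixed” in the HTML candidate portal; the Flex portal offered nothing comparable to repair.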

Another day I’ll write about how we dealt with the hiring manager portal. This post explains how we managed to get 20 accessibility fixes into the last two releases of the Taleo product.

As customers, we had access to an externally facing web application where we could log bugs. In a four-month period I logged 30 tickets, which represented 42 separate bugs (some were consolidated because they were similar), and worked with 17 different customer service agents (CSAs) in the US, Romania, and China.

The process of getting bugs accepted by the CSAs and moved to the next step, placing them in the internal development bug-tracking system, was painful and slow. Each CSA had to understand enough about both assistive technology (AT) and the accessibility guidelines to believe that the issue was really an issue.

If they didn’t accept it as a bug, their default was to politely reject it and call it an “enhancement”. During the time I worked on this project, Oracle opened a second website for enhancements, where customers were encouraged to log their wish lists and try to socially promote them. I really hated this “popularity contest” approach, but we had to use it.

We engaged in a series of cross-functional meetings between the Commonwealth and Oracle representatives from sales, product management, and customer service. We explained exactly what barriers a person using JAWS would face when trying to navigate the original site. In this way, several “enhancements” were accepted onto the faster “development fixes” track.

Of the original bugs filed, 21 were included in the past two releases, with 8 more pending for the future. Although it seemed at times like I was doing the QA for Oracle on this product, I really didn’t mind, because I wanted to get the Commonwealth and other Taleo users the most accessible product possible.
