Why Prioritizing UX in Your Software Evaluation Is Important

Whether you are in a manual environment or using legacy GRC software, you’ve likely heard about the poor staff member who spent days pasting data across systems, reconciling spreadsheets, or performing one of many other painstaking manual tasks. Maybe you were that staff member. It’s no secret that performing audit, risk, and compliance activities in a manual environment is painful, but not as painful as using a system that is unintuitive, laden with performance issues, or lacking proper customer support. In short, a system with a poor user experience (UX).

There are several reasons why you should prioritize UX in your software evaluation, but only one that matters — if the software isn’t usable, it won’t get used. It’s that simple. And if it doesn’t get used, the result is wasted time and money implementing a solution that was “designed” to save you time and money. 

Many people outside of the product and design world use UX and UI (user interface) interchangeably. However, the User Experience Professionals Association (UXPA) describes UX as “every aspect of the user’s interaction with a product, service, or company that makes up the user’s perceptions of the whole.” UI, on the other hand, refers to the aesthetics of the software. Although UI is an important component of UX, it’s not the only one. There are three key elements of UX:

  • How it works: how well features enable users to accomplish their tasks effectively and accommodate their needs. 
  • How it feels: how intuitive the workflows are for accomplishing tasks, including the experience with the vendor and how well they handle delivery and support.
  • How it looks: the desirability of the software and how well it is received by other users in your niche.

A product can have a fantastic UI but a horrific UX, which is why we are focusing on UX — from login to logout, and everything in between. An end-to-end journey for you, the user.

It can be harder than you think to identify a great solution that meets your team’s needs. PwC’s Tech At Work research found that as companies introduce new technology, 90% of C-suite executives believe they are paying attention to employees’ needs, while just 53% of employees agree. Given the complex nature of audit, risk, and compliance tasks, it’s especially critical for those evaluating technology to gather staff feedback on how well the technology meets their needs. 

The True Cost of Bad UX

Poorly designed UX can be costly. By some estimates, between 5 and 15 percent of IT hardware, software, and services projects have been abandoned as a result of poor usability, costing companies $150 billion. To make that sizable number more relatable, auditors using legacy GRC software might have to navigate 10+ clicks and drill up and down through menus to document the test of design (TOD) and test of operating effectiveness (TOE) of a control — a seemingly simple task. 

When the technology is cumbersome to use, staff often end up doing their work in spreadsheets, which are quickly seen as a better solution to their problem. At that point, your technology investment becomes nothing more than a repository, rather than a real-time system of record, collaboration, and reporting. Drilling through layers of menus or pasting information from place to place may not seem like a big deal, but think about your team doing it hundreds, maybe thousands of times over the course of a year. That wasted time and frustration adds up, making a challenging job even harder. 

There’s a well-known quote among designers, often attributed to IBM: “ease of use may be invisible, but its absence sure isn’t.” Design — and the experience it generates — plays a large role in how you feel about using any software and in your willingness to keep using it in the future. Software that is cumbersome and frustrating to use has been shown to drain productivity and lower employee morale. “When it comes to UX design and development, 67% of users claim unpleasant experiences as a reason for churn,” according to Esteban Kolsky, founder of ThinkJar.

So how do you recognize less-than-stellar UX? Common symptoms of bad UX include:

  • Unorthodox interface.
  • Unintuitive workflows.
  • System latency.
  • Unnecessary workarounds.
  • Misplaced or lost information.
  • Inconsistent design.
  • Unreliable or unresponsive support.
  • Complaints from users.

Poor UX can result in frustration and stress due to wasted time and resources. You might be tempted to purchase a piece of software because it has a huge range of features or is extremely affordable, but if poor UX makes it unfriendly, counter-intuitive, and time-consuming to use, it may not be utilized as intended, and at minimum will impact team performance.

Measurable ROI from Well-Designed UX

Alternatively, when you buy software with a well-conceived and well-executed UX, you can expect greater adoption, increased productivity, reduced reliance on help desks, lower training costs, a flatter learning curve, and happier employees. 

Putting the value of well-designed UX into perspective is straightforward. ROI calculators can offer insight into areas like employee productivity. For example, if 50 employees making $40/hour each saved 30 seconds on a given step in the app (accessed 20 times per day), that works out to 10 minutes (1/6 of an hour) saved per employee per day, and an ROI of over $70,000/year on that one step alone. 

(50 employees) x ($40/hour) x (1/6 of an hour per day) x (230 work days) ≈ $76,700 

ROI can also be measured in terms of fewer user errors. For example, if 50 employees making $40/hour each made one fewer error per day and saved 20 minutes (1/3 of an hour) as a result, the annual cost savings would be over $150,000. 

(50 employees) x ($40/hour) x (1/3 of an hour per day) x (230 work days) ≈ $153,300  
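
For those who want to run their own numbers, the arithmetic behind both estimates can be sketched in a few lines of Python. This is a minimal illustration only: the headcount, hourly rate, and time savings are the example figures above, not benchmarks.

```python
# Illustrative ROI arithmetic using the example figures above (not benchmarks).
EMPLOYEES = 50
HOURLY_RATE = 40   # dollars per hour
WORK_DAYS = 230    # working days per year

def annual_savings(hours_saved_per_employee_per_day: float) -> float:
    """Annual dollar value of time saved across the whole team."""
    return EMPLOYEES * HOURLY_RATE * hours_saved_per_employee_per_day * WORK_DAYS

# Scenario 1: 30 seconds saved on a step used 20 times a day = 10 minutes = 1/6 hour.
print(round(annual_savings(10 / 60)))  # ~76,667 per year, i.e. roughly $76,700

# Scenario 2: one fewer error per day, saving 20 minutes = 1/3 hour.
print(round(annual_savings(20 / 60)))  # ~153,333 per year, i.e. roughly $153,300
```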

A positive user experience is critical for software to have value and provide a solid return on investment for your company. Aesthetically pleasing software is great, but a well-designed user experience that helps you efficiently complete the most common and critical processes should be the top priority — and incorporating UX into your evaluation criteria is vital for success.  

Applying UX to Your Software Evaluation

Once you see the value of UX, how do you incorporate it into your software evaluation in order to discern good UX from bad and ensure widespread adoption by users? There are two approaches you can take when evaluating UX. The first is a more technical, hands-on approach called usability testing, and the second is a qualitative approach focused on user satisfaction via peer reviews and feedback. Used together, they provide key metrics and elements to look for during your software evaluation.

Before you get started, make sure you’re not alone in the evaluation. Including and amplifying the voice of everyday users during the selection process will surface UX benefits and pitfalls, establish stakeholder goodwill, and ultimately increase the likelihood of user adoption. You’ll also want to educate team members about the basics of how to assess UX. 

Approach 1: Usability Testing

Traditionally used by design teams in focus groups, usability testing measures how long it takes users to complete tasks, how many errors users make in completing tasks, and the percentage of tasks that users complete correctly. 
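
In a formal study, those three measures reduce to simple averages and ratios. Here is a minimal sketch of the bookkeeping, assuming hypothetical task records (the field names and sample data are illustrative, not from any particular tool):

```python
# Hypothetical usability-test records: each entry is one attempted task.
sessions = [
    {"completed": True,  "seconds": 95,  "errors": 1},
    {"completed": True,  "seconds": 70,  "errors": 0},
    {"completed": False, "seconds": 180, "errors": 3},
]

completion_rate = sum(s["completed"] for s in sessions) / len(sessions)
avg_time_on_task = sum(s["seconds"] for s in sessions) / len(sessions)
avg_errors = sum(s["errors"] for s in sessions) / len(sessions)

print(f"Completion rate:  {completion_rate:.0%}")    # 67%
print(f"Avg time on task: {avg_time_on_task:.0f}s")  # 115s
print(f"Avg errors/task:  {avg_errors:.1f}")         # 1.3
```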

Chances are you don’t have the time or access to collect hard data and put formulas to work as you compare solutions. However, there are principles you can apply as you watch product demos and click around in sandboxes. A few questions the evaluation team can consider: 

  1. How many clicks, pages, or tabs do you need to traverse to complete a task? Example: If you’re testing a SOX control, how many clicks or tabs away are prior-year work, related evidence, and related process documentation? Having access to critical information and documentation in one place, without having to navigate multiple screens, is essential for day-to-day tasks.
  2. Do workflows facilitate processes and collaboration effectively, or do they cause confusion? Example: There is one thing audit, risk, and compliance professionals have in common — collaboration with stakeholders and owners across the business. An easy-to-follow workflow can streamline evidence collection, issue remediation, certification management, and more.
  3. Do integrations flow naturally with the way you work? Example: Does an integration make it faster or slower to complete tasks? A good integration will enable you to work seamlessly with other applications in your ecosystem. However, not all integrations are created equal. A bad integration will create more effort and workarounds. 

Usability testing is helpful in assessing the day-to-day experience of a given solution for your needs. However, it is also important to consider how a given solution meets your long-term needs, and that’s where more qualitative factors like peer reviews come into play. 

Approach 2: User Satisfaction & Peer Review Analysis

It’s important to think about user satisfaction along every step of the buyer/user journey — from signing the contract and getting implemented to user adoption and ROI. The best way to understand how satisfied users are is simple — ask them. Luckily, there are third-party software review sites that do just that for you. 

One example is G2.com, Inc., a third-party software review website where you can find verified reviews from real customers. Their methodology involves asking customers questions about usability, results, implementation, and relationship. G2 leverages an algorithm that plots vendors on a leader grid based on aggregate responses to the following UX-relevant criteria:

  • Usability
    • Ease of use.
    • Ease of admin.
    • Estimated user adoption percentage.
  • Results
    • Estimated ROI.
    • Product’s ability to meet their requirements.
    • Likelihood to recommend product.
  • Implementation
    • Satisfaction with the set-up process.
    • Amount of time required to go live.
  • Relationship
    • Ease of doing business with.
    • Quality of support.
As shown in the figure below, AuditBoard is the only vendor featured in the Leader quadrant on both the Audit Management and GRC grids. Additionally, AuditBoard outranks all competitors on the user satisfaction axis.

Source: G2.com grids as of 3/2021

You can find more customer reviews on other third-party software review websites like Gartner Peer Insights and Capterra (Audit, Risk, Compliance). 

Third-party reviews are certainly insightful, but you don’t have to take their word for it. If you want to incorporate user satisfaction into your evaluation, here are a few questions you can consider:

  1. How easy is it to use? Example: How easy and intuitive is it to create reports and dashboards? Can you easily add new attributes to a report? Can you perform basic functions like filter and search? Is there built-in data visualization? 
  2. How easy is the admin? Example: How configurable is the solution? Does it take a high level of expertise to implement? Can I or someone on my team easily tweak field names (for instance, changing “PBC Requests” to “Document Requests”), or do we need to reach out to customer support for seemingly simple changes?
  3. Is there evidence of successful implementation and support? Example: Are there success stories available of teams similar to yours that have had success implementing the solution? Can you find evidence of customers that are satisfied with their implementation process and level of support? 

Trust and credibility are another crucial facet of user satisfaction, and they can be measured using Net Promoter Score (NPS). NPS is calculated by surveying a vendor’s customers to gauge satisfaction and loyalty with one direct question: How likely are you to recommend this company/product/service/experience to a friend or colleague? Responses are analyzed, yielding scores that range from -100 to 100. Companies historically don’t share their NPS scores publicly, but this data is now available to you thanks to third-party software review websites like G2.
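
For reference, the standard NPS arithmetic is simple: respondents answer on a 0-10 scale, those scoring 9-10 count as promoters, those scoring 0-6 count as detractors, and the score is the percentage of promoters minus the percentage of detractors. A minimal sketch follows; it reflects the general NPS methodology, not G2’s specific survey pipeline, and the sample ratings are made up.

```python
# Minimal NPS sketch: standard methodology, with made-up sample ratings.
def net_promoter_score(ratings: list[int]) -> float:
    """Ratings are 0-10; promoters score 9-10, detractors 0-6, passives 7-8."""
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100 * (promoters - detractors) / len(ratings)

print(net_promoter_score([10, 9, 9, 8, 7, 6, 10, 9, 3, 10]))  # 40.0
```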

In the figure below, G2 reports that AuditBoard has an NPS of 86 in the Audit Management category, more than double the score of most competitors and the industry benchmark. 

By incorporating user satisfaction metrics and elements into your evaluation, you’ll ensure that the platform you select is user-friendly — encouraging successful adoption by all stakeholders. 

6 Things to Keep in Mind When Applying UX to Software Selection

Now that you’ve learned how to define UX, why it matters, and how you can measure it, you’re ready to include UX as part of your next software selection process. In summary, here are a few concluding tips to keep in mind:

  1. Assemble a diverse team: Don’t embark on the evaluation alone. It’s arguably more important to invite lower-level staff who will be daily users than management who will only use the software periodically. Overall, it’s a good rule of thumb to have representation across the board. 
  2. Don’t forget business owners: Collaboration with business owners is a critical component to success. Yet the importance of their user experience is often overlooked since they aren’t daily users of a software solution. Infrequent use is precisely the reason their user experience needs to be prioritized, because when it comes time to obtain evidence, remediate issues, and perform certifications, you want to ensure it’s an easy and intuitive experience for them. 
  3. Buy purpose-built: Ultimately, software needs to align perfectly with what you spend the majority of your time doing. There is a reason surgeons use a scalpel and not a machete to perform surgery. Pay special attention to the critical day-to-day tasks you need to accomplish most frequently, and whether the software focuses specifically on reducing effort for those tasks. A popular adage goes “buy cheap, buy twice.” Whatever you save in trying to “hack” a solution, you’ll likely pay for in efficiency and performance. 
  4. Leverage customer reviews: Validate your findings with verified customer reviews from trusted third-party sites like G2 (Audit, GRC), Gartner Peer Insights, and Capterra (Audit, Risk, Compliance). After you’ve narrowed a list of potential solutions, look into how other users have rated the technology across functionality, usability ratings, and ease of use. Look for red flags in qualitative feedback as you compare and contrast how users rate your top choices. 
  5. Usability over features: A long list of features and use cases is tempting, but it doesn’t guarantee good UX. There will be greater gains from well-designed core features used for repeatable and everyday tasks than from a laundry list of features that are rarely or never used.
  6. Consider your entire customer journey: The product is important, but don’t discount the importance of good service, support, and thought leadership. Also, configurability is key. After you buy a solution, you’ll want it to be easy to maintain rather than needing to contact customer service for every small tweak. 

When you set out to evaluate software using a UX lens, you’ll be able to identify user-friendly platforms that drive widespread user adoption, elevate the way your team works, boost your return on investment, and make your software rollout a success. 

Michele Muriyan led Product Marketing at AuditBoard. A former IT auditor and KPMG alumna, Michele has advised some of the world’s largest organizations on their audit, risk, and compliance programs.