Someone recently asked me about the credibility of user interface standards, which provide requirements or guidance for web sites, software, or hardware, with the objective of making an interface “more intuitive, learnable, and consistent.” How valuable are standards? What are they based on, and what makes one standard better than another? Is it worth the effort to learn the standards, and which standards are best for me or my organization?
There is a wealth of user interface standards available. Books and articles provide much guidance in this area. Corporations like Google and Apple publish their own user interface guidelines. Some organizations, such as the International Organization for Standardization (ISO), consider setting standards to be a core mission. Governments can fund research that supports standards, or can set standards themselves.
This article, the first in a two-part series, will introduce user interface and design standards and why you might want to consider following a standard, or at least referring to one or two favorites as you work. The second article will look at some specific examples of standards.
Sometimes You Have To
Sometimes you simply have to follow a standard. For example, your company or organization may have a requirement to do so. You may also be subject to outside requirements to follow a standard. A classic example is Apple’s requirement that iOS apps follow Apple’s Human Interface Guidelines. Apple lists one of the common reasons they reject apps as “substandard user interface”: “Apple places a high value on clean, refined, and user-friendly interfaces. Make sure your UI meets these requirements by planning your design carefully and following our design guides and UI Design Dos and Don’ts.”
Sometimes it is strongly recommended that you follow relevant user interface guidelines; for example, Google Cast has suggested user experience guidelines for submitted games.
Often It Is a Good Idea
But, often, there is no requirement to follow a formal user interface standard; the Android app store, for example, has no such requirement, and neither does Amazon’s Developer Portal. Absent a requirement to do so, is it worthwhile to adopt and follow an established user interface standard? I believe it often is. Standards document good practices. If a standard is established, there must be a reason it has been accepted: professionals must find it useful, and that’s a validation of the standard and a good reason to consider it.
Paul Brooks, in the article What on Earth is ISO 9241?, tells how ISO 9241, which covers the ergonomics of human-computer interaction, helped him evaluate software: “By following the principles of part 171 (ISO 9241-171 Guidance on Software Accessibility), I was able to not only methodically evaluate the accessibility of the desktop application but also provide recommendations which were directly tied to the human-centred design process that was proposed to resolve the issues found.”
Brooks’ experience touches on an especially important aspect of standards: their credibility. Many standards are based on research. Many of the guidelines at usability.gov, for example, are based on research, with academic sources cited. With their experimental pedigree, these guidelines are especially credible.
Even if a standard is not research-based, it likely represents a collective wisdom on the best way to design an interface, and that in itself offers value. Standards are often based on experience with what doesn’t work as well as what does work. Dennis Schumaker, writing on the Microsoft Developer Network, points out that “many user interface features are the result of many years of trial-and-error experimenting. They aren’t just the result of finding better ways to interact with the user, but also the result of learning about techniques that don’t work. Without standards, your developers might implement one of those unfortunate user interface designs.”
When you follow a standard, you are leveraging the work of others. You don’t need to duplicate the work that went into developing the standard. And if someone questions the approach you’re taking, you can better explain and defend your approach if it is based on a standard.
Your work will also be consistent with the work of others. Consistency with established conventions makes your web site, application, or device easier to use. That’s obviously important. (See Simplifying Complexity – Part 2 for more on the benefits of consistency.) The larger your team, the greater the potential for errors or re-work resulting from inconsistent interface decisions across your team, making it all the more important to have standards in place and to follow them.
Further, if a standard exists, then by definition someone has already documented it, so standards are easy to share and reference. Using established standards eliminates or reduces the need for you or someone in your organization to compile a standard, a potentially time-consuming task. I’ve seen organizations spend a lot of time and effort developing their own standards that consist primarily of generic guidance they could simply have adopted from someone’s existing material.
You don’t have to formally adopt a standard to benefit from it. Standards are valuable simply as references; I have several books on my shelf, and several web sites bookmarked, that I refer to often.
The next article will look at some of these references, as well as other standards that may be helpful to you.
Dr. Craig Rosenberg is an entrepreneur, human factors engineer, computer scientist, and expert witness. You can learn more about Dr. Rosenberg and his expert witness consulting business at www.ui.expert.