Chapter 3. Customer-Effective Testing
Testing is the single most important thing you will do. Unfortunately, it may also be the most ignored.
Why Bother Testing?
When we create customer-effective Web sites, we are concerned with developing Web sites that make it easy for customers to do what they need to do. What customers want to do is the context for development, and it must also be the context for testing.
To find out what customers want to do, and how they actually need to go about doing it, we must find out about customer behavior. We need to understand the big picture as well as the specifics of what customers will want to do on a Web site. And that requires some smart planning and execution across the business and the Web site development company.
Customer needs drive the development process and are our touchstone for success throughout that process. We need to talk to customers up front, before development, and pull them back in at critical points prior to launching a Web site to the public. The testing gets more and more specific as we go. A lot of companies run into problems when they test generalities where they need specifics, and vice versa. General customer feedback is appropriate early in the development process, but later, when development has progressed to specific designs and functionality, specific customer feedback is required. The level of the testing, its timing, and the techniques used all affect our ability to get the right information when we need it.
If we don't know what we need to know about customers, we will not develop customer-effective Web sites. We may develop good Web sites, but if customers can't, or don't, adopt them into their service relationships, then we've failed.
So we test our ideas with customers in an attempt to offer them the best e-services we can. This implies an iterative process in which we may have to redevelop concepts or parts of a Web site and then retest them.
Iterative processes often send businesses and Web site developers into a spin. It's not surprising when we are all working with tight deadlines. We simply don't have the time to redo, redo, redo. Do we?
As with all things, it's a matter of striking a balance. It is possible to work against tight deadlines and still benefit from customer testing. That said, if you have no intention of revising what you're doing as a result of customer feedback, it's probably not worth asking customers what they think in the first place.
Of course, we don't have to deliver a perfect e-solution from day one. Customers will learn with us and help us along the way, as long as we give them something useful in the first iteration of our Web site. And therein lies the problem. Too many companies continually defer revisions to "the next release" and launch a Web site that offers little or no value to their customers. Customers don't think in releases; they think about what they need to know and do now, not later. If you provide customers with useful content and functionality in the first iteration of your Web site, and then continue to deliver additional useful content and functionality when e-customers need it, you can keep them on board.
A business needs to take e-customers' priorities into account when deciding what content and functionality will be released, and when. If e-customers are all asking for a specific service function, such as viewing and paying their bills online, because it is the most important thing they want to be able to do on a business's Web site, then this should be the first thing the business develops.
Sometimes e-customers want things businesses can't deliver straight away, and this needs to be understood and managed by the business. Sometimes e-customers want things that can be delivered straight away, but that would have been overlooked had e-customers not been consulted. Testing, therefore, is necessary to really understand e-customer priorities and how businesses can deliver value for e-customers from day one.
If we set the right direction at the start, we avoid having to reinvent during the development process. Reinvention, late in the development process, is a waste of everybody's time and resources. More often than not, companies will roll out an inadequate Web site rather than redo what has already consumed a lot of time and money. And that's a risky move if you know you're not on the right track with your customers.
We may lose our customers by providing our competitors with the opportunity to offer them something better.
Customer testing should lead to invention, not reinvention. It's just a matter of good management and timing.
And, of course, customer testing can also be a way of catching obvious problems we would otherwise miss.
Many of the mistakes we make relate to the way information and functionality are presented on Web sites, and they can be easily fixed. Being aware of the 17 customer directives in Chapter 2 will help businesses and Web site developers avoid many potential mistakes.
Practical Examples
In his Web site on usability, useit.com (see Netography), Jakob Nielsen gives us an example of how we can improve things simply by asking customers for feedback. He relates an "icon intuitiveness" test that was conducted as part of developing an intranet for Sun Microsystems' employees. Figure 3-1 lists the icons that were tested, and the ways in which they were interpreted.
Figure 3-1. Icons tested for intuitiveness.
The icons for the toolbox and the World Wide Web were the most problematic. The toolbox was seen as a briefcase, and the WWW icon was too easily confused with the icon representing the geography of the company. All of the other icons did well enough, taken on balance and in context.
In total, twenty versions of the toolbox icon were designed: seven tool-metaphor icons, nine shopping-metaphor icons (including a shopping cart and a grocery shelf), and four chest icons. The original and four revisions were tested with users, as seen in Figure 3-2.
Figure 3-2. Revisions made to Sun's toolbox icon.
The first revision was too strong: users thought that everything that could be deemed a tool on the site should be found in this section. Unfortunately, some of the site "tools" belonged in other sections. To use a weaker metaphor for special-purpose applications, a storefront icon was adopted in the second revision.
However, the second revision was too easily interpreted by this audience as a circuit board. From there, several alternative storefront and shopping icons, such as revision 3, were tried; these interfered with the "product catalog" icon. The idea of an application store was then dropped.
The last icon, the application chest, was finally settled on.
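As a rough illustration of how an interpretation test like Sun's can be scored, the sketch below tallies what each participant said an icon meant against the meaning its designers intended. The icon names and responses here are invented for illustration, not taken from the Sun study; a real test would record each participant's verbatim answers.

    # Hypothetical tally of an icon "intuitiveness" test, loosely modeled
    # on the Sun intranet example above. All data below is invented.
    from collections import Counter

    # For each icon, what each test participant said it meant.
    responses = {
        "toolbox":  ["briefcase", "briefcase", "tools", "luggage", "briefcase"],
        "www":      ["company map", "internet", "company map", "internet", "globe"],
        "calendar": ["calendar", "calendar", "schedule", "calendar", "calendar"],
    }

    # The meaning the designers intended each icon to convey.
    intended = {"toolbox": "tools", "www": "internet", "calendar": "calendar"}

    for icon, answers in responses.items():
        hits = sum(1 for a in answers if a == intended[icon])
        top, top_n = Counter(answers).most_common(1)[0]
        print(f"{icon:9s} {hits}/{len(answers)} matched intent; "
              f"most common reading: {top!r} ({top_n}x)")

A tally like this makes it immediately obvious which icons need redesign (here, the toolbox read mostly as a briefcase) and which can be left alone.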
Another practical example of the difference testing can make is given by Jeffrey Veen of Wired Digital in his Webmonkey article, "Test your designs on people." He says:
When our designers redesigned the Wired News Web site, they included a navigation panel down the left side of the screen. It pointed to the various sections behind the frontdoor: an area to check stock prices, the week's top 10 stories, and a collection of articles from trade magazines. However, when our testing subjects saw the navigation bar and its respective links—Stocks, Week's Top 10, From the Trades—they instinctively grouped the sections together. When asked what they thought the links pointed to, users all agreed the "Week's Top 10" was about the best-performing, publicly traded companies, and "From the Trades" must be about large buys and sells from the exchange floor. They mentally grouped everything around stocks.
A simple test—a half-dozen subjects using the Web site for a half-hour each—uncovered a potentially confusing design. Armed with this data, our designers modified the link placement and made the site less obtrusive and easier to use.
The modified design can be seen in Figure 3-3.
Figure 3-3. Wired screen resulting from customer feedback.
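Veen's half-dozen subjects square with the widely cited Nielsen-Landauer model of usability testing, which estimates the share of problems found by n test users as 1 - (1 - L)^n, where L is the average probability that a single user encounters any given problem; Nielsen reports an average L of about 0.31 across projects. The sketch below works the arithmetic; treat the constant as an assumption that will vary by site and task.

    # Back-of-the-envelope check on why a half-dozen test subjects is often
    # enough. Nielsen-Landauer model: with n users and per-user discovery
    # probability LAMBDA, the share of usability problems found is
    # 1 - (1 - LAMBDA)**n. LAMBDA = 0.31 is Nielsen's reported average;
    # your site and tasks may differ, so the numbers are indicative only.
    LAMBDA = 0.31

    for n in (1, 3, 6, 15):
        found = 1 - (1 - LAMBDA) ** n
        print(f"{n:2d} users -> ~{found:.0%} of problems found")

With six users, the model predicts roughly 89 percent of problems uncovered, which is why a half-dozen subjects for half an hour each was enough to catch the confusing navigation grouping.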