From...
Computerworld

Is software too hard to use?

August 25, 1999
Web posted at: 2:27 p.m. EDT (1827 GMT)

by David Orenstein

(IDG) -- Right now, corporate information technology managers and purchasers have no standard way to assess the usability of the software they buy, but within a few years it may be a matter of simply looking at the box.

Next month, a group of corporate users, vendors and experts will convene in Redwood Shores, Calif., to test what they hope will become a common method for evaluating the usability of software. The resulting report, which a vendor would present to users, is analogous to the nutrition information and ingredients listed on a food package. If the program is successful, it could save corporate software buyers millions of dollars by reducing the lost productivity and unnecessary training that result when companies are unable to judge how easy software is to use.

State Farm Insurance Cos. in Bloomington, Ill., was all set a few years ago to spend more than $5 million on an intranet-based career planning software package. But when the company brought the package to its usability lab, it realized a huge savings instead.

Lab tests found that none of five test users was able to effectively complete the software's 40-hour regimen, says usability lab coordinator Jack Means. The results prompted State Farm, with its 75,000 desktops' worth of buying power, to walk away from the deal.

If vendors faced that danger all the time instead of rarely, experts say, software would be a lot easier to use than it is today. The common report format would describe exactly how a vendor tested its software's usability by listing the demographics of the testers, the test tasks they performed and the results of the tests. The data would be sufficient, says Keith Butler, advanced computing technologist at The Boeing Co., to let companies replicate the vendors' tests to ensure their accuracy. The purpose of next month's meeting is to match users with vendors in pilot tests of the reports.
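To make the idea concrete: a report in this format is essentially structured data about who tested the software, what they were asked to do and how they fared. The Python sketch below illustrates one way such a record might be organized. The field names and the completion-rate metric are illustrative assumptions, not the actual format under discussion.

    # Hypothetical sketch of a usability test report as structured data:
    # tester demographics, the tasks performed, and the measured results.
    # Field names are illustrative, not the real common industry format.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Tester:
        role: str                # e.g., "administrative assistant"
        years_experience: float  # demographic detail a buyer would check

    @dataclass
    class TaskResult:
        task: str                # what the tester was asked to accomplish
        completed: bool          # whether the tester finished the task
        time_minutes: float      # how long the attempt took
        errors: int              # mistakes made along the way

    @dataclass
    class UsabilityReport:
        product: str
        version: str
        testers: List[Tester] = field(default_factory=list)
        results: List[TaskResult] = field(default_factory=list)

        def completion_rate(self) -> float:
            # Fraction of task attempts completed successfully.
            if not self.results:
                return 0.0
            return sum(r.completed for r in self.results) / len(self.results)

With that much detail on record, a buyer could rerun the same tasks with similar testers and compare completion rates against the vendor's claims, which is exactly the kind of replication Butler describes.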

Bottom-line value

The big question the pilot tests must answer is whether companies will find the reports valuable. "To make a convincing argument, we will need to have some hard data," says Sharon Laskowski, manager of the visualization group at the National Institute of Standards and Technology (NIST), which is facilitating and financing the effort. "If it's not affecting the bottom line, [companies] will not do it." There are now almost no hard numbers on the savings a company can achieve when it pays stringent attention to usability.

But heavyweight user companies such as State Farm, Boeing, Fidelity Investments, Eastman Kodak Co. and Northwestern Mutual Life Insurance Co. came to NIST two years ago to launch the effort, called the Common Industry Format for Usability Test Reports. Top computer industry vendors such as IBM, Compaq Computer Corp., Microsoft Corp., Oracle Corp., Intel Corp., Sun Microsystems Inc. and Hewlett-Packard Co. are also at the table.

After the players have conducted two and a half years of pilot tests, Laskowski says, the companies will take the data to a standards organization in hopes of changing the way IT purchases are made. It will be the difference, she says, between a customer vainly asking whether software is user-friendly and that customer having enough information to re-create the tests and validate the vendor's claims.

Milwaukee-based Northwestern Mutual has added to its standard request for proposals a paragraph that specifically asks vendors for usability data. "If you've got it, we want it," says Eric Strandt, the insurer's manager of software product design. But the company is constrained by the reality that many vendors don't have that kind of information -- and there's no standard that encourages them to come up with it.

Northwestern does send some software to the lab. "We've done quite a bit of that, with varying degrees of success," Strandt says. A few years ago, contact management packages were rejected after they all showed substantial usability concerns. Software from a small vendor, Corporate Software & Technologies in Montreal, beat out IBM to become the company's calendaring package because of its better usability. Northwestern Mutual would have saved end users considerable grief when it upgraded to Microsoft Office 97 had it done usability tests and found file format incompatibilities, Strandt says.

Other savings

State Farm has success stories of its own, but their benefits are less tangible than in the case of the career software savings, Means says. When the company was mulling a time and attendance management product, it assumed that administrative assistants would need three days of training. Usability tests showed that a mere half-day -- or, at most, a full day -- was needed. The savings: roughly two days' pay for each of approximately 10,000 assistants.
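The arithmetic behind that figure is straightforward, as the sketch below shows. The daily pay rate is an invented placeholder, since the article gives only the headcount and the days saved.

    # Back-of-the-envelope version of the State Farm training savings:
    # training cut from three days to one, for about 10,000 assistants.
    # The daily pay figure is a hypothetical placeholder.
    assistants = 10_000
    days_saved = 3 - 1   # conservative: assume a full day of training remains
    daily_pay = 150      # assumed cost per assistant per day, in dollars

    savings = assistants * days_saved * daily_pay
    print(f"Estimated savings: ${savings:,}")  # Estimated savings: $3,000,000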

In general, usability labs, which first popped up in the early 1990s, are used to test software that's developed in-house. That's where potential savings are the greatest, because tests that reveal problems early in the development cycle can save months of developers' time -- and, of course, prevent productivity losses when the application is deployed.

Usability experts such as Janice Rohn at Sun point to The Standish Group International Inc.'s Chaos studies, which have attributed many of IT's frequent project failures to a lack of end-user input. Usability professionals such as Means cite IBM research that the eventual return on a dollar spent on usability early in the development cycle can be up to $100.

At Fidelity, usability testing is a growing practice -- but not for software purchases, says Thomas Tullis, vice president of human interface design. The company is building a second lab to accommodate an increase in testing driven by the desire to make the company's Web sites more usable. The prospect of acquiring customers via a user-friendly Web site or the fear of losing them with a frustrating site has made companies much more sensitive to usability, say experts like Rohn and Harley Manning, an analyst at Forrester Research Inc. in Cambridge, Mass. "When you add the Web, everybody is a software company," Manning says.

But the benefits Tullis and others see in improving their own software are harder to find when reviewing commercial tools. Although many usability professionals say their test results have been able to effect major changes in vendors' products, that's not the norm. The feedback loop isn't there, Tullis says. Butler agrees: "If we were to do the testing for our suppliers, it would have to be during the development of the product."

For the common industry format effort to take off, IT must make the usability of commercial software a top priority and hold vendors' feet to the fire, say experts, including Rohn. Although Rohn says she believes that a common reporting format would help vendors by clarifying what usability criteria users most want, she acknowledges that the added burden on vendors won't be welcomed unless customers make it a competitive issue. "It's a business decision," she says. "The more the customers are telling this to vendor companies -- that this is important -- the more the vendor companies will take note."

As much as they want to cheer it on, many usability experts are skeptical about the reporting format's prospects. Either they doubt that user companies will show the resolve necessary to nudge vendors beyond lip service, or they're not sure the reports will specify anything useful enough. Usability's benefits, after all, have proved difficult to quantify so far.

Manning says usability testing ultimately won't gain much ground. Surprising as it may be, he says, quality hasn't ranked as the primary priority in software purchases. "Do you pick up software boxes that tell you how bug-free the software is?" he asks. "We buy buggy software all the time." Meanwhile, he adds, "the usability community constantly struggles for attention."

Moreover, usability testing can run into difficulty because it's often conducted with a small sample of testers. Users vary widely in their skills and intelligence, and one particularly savvy or dim user can abnormally skew the results.
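A toy example, with invented task times, shows how fragile a five-person sample can be. The median is one common guard against a single outlier, though it discards information too.

    # Invented task times (minutes) for four typical testers, plus one
    # outlier who struggled badly -- a realistic hazard with five testers.
    from statistics import mean, median

    typical = [12.0, 14.0, 13.5, 12.5]
    with_outlier = typical + [55.0]

    print(round(mean(typical), 1))       # 13.0
    print(round(mean(with_outlier), 1))  # 21.4 -- one tester shifts the mean ~65%
    print(median(with_outlier))          # 13.5 -- the median barely moves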

Laskowski says the common reporting format would at least elicit enough detail from vendors so that customers would know how much experience a vendor's testers have. The customers would also see which features were tested, so they could determine whether the vendor tested the features they want.

Even users directly involved in the project concede that it may not achieve the ideal goal of making usability data as visible in software purchasing as nutrition data is in food purchasing. "I think the resolve varies," Strandt acknowledges.

"I certainly hope it is going to take off," Tullis says. But if the effort ultimately results only in heightened awareness about the importance of usability testing, he says, that will be a major benefit in its own right.

It's a worthy effort, Manning says. "It might help if users revolted a little."


RELATED STORIES:
OPINION: Sure a lot of software stinks, and you may have something to do with it
August 17, 1999
Controversial software licensing law approved
August 2, 1999
Writer's block ad-blocking software
July 12, 1999
New software will help school, police identify threats and hate crimes on the Net
July 6, 1999


RELATED SITES:
National Institute of Standards and Technology