
An Introduction to Document Usability
   Rodolfo Baggio, Piero Schiavo Campo - 2000


Usability assessment is now considered an essential step in any development activity. Usability measurement can help the process manager to:

      • understand how the final user feels about the product
      • obtain feedback which can be used to improve design
      • assess whether the targets have been met.

The four main approaches to usability assessment are:

      • heuristic
      • user-based
      • design criteria
      • model-based.

Heuristic (sometimes known as ‘rule-based’) evaluation is usually carried out by task experts. It can be fast and economical, and it is good for identifying major problems, but it is not sufficient to guarantee a successful product: experts cannot be expected to predict all the problems that end users will experience.
User-based evaluation can be used to provide feedback at any stage of design. In the early stages, users may be involved in the evaluation of scenarios, drafts or partial/rapid prototypes. As design solutions become more developed, evaluations involving users will be based on progressively more complete and concrete versions of the system.
Evaluation against design criteria is an established technique which can contribute to usability. These criteria are contained in design guides, collections of ergonomic guidelines and standards.
Model-based assessment takes place against a theoretical model of human abilities. The nature of the models used is usually quite generic and does not guarantee that the end users will react in the same way to the product. Model-based evaluation is only gradually being introduced to industry.

The User Centered Design

The standard usability concept is often related to the so-called "User Centered Design". This involves several technical aspects.
User Centered design is an approach to development which focuses specifically on making products usable and safe for their users. User Centered systems empower users and motivate them to learn and to explore new solutions. The benefits include increased productivity, enhanced quality of work, reduced support and training costs, and improved user health and safety. Although there is a substantial body of human factors and ergonomics knowledge about how such design processes can be organized and used effectively, much of this information is not yet widely applied.
Adopting a User Centered design process leads to more usable systems and products. It reduces the risk that the resulting system will under-deliver or fail.
Generally, User Centered design is characterized by:

      • an appropriate allocation of functions between user and system
      • iteration of design solutions
      • the active involvement of users
      • multi-disciplinary design teams.

The Technical Documentation Usability

Documentation usability, however, seems to have been neglected so far. This may be because usability testing has important applications in the design of software user interfaces, while documentation is often perceived as a secondary problem (and a secondary business).
Nonetheless, the following points should be considered:

      • Documentation is a part of the product it describes.
      • Although most users approach the product without reading the related documentation, this is more a fault of the documentation than of the users.
      • Good documentation can improve the overall quality of the product, making its use easier and more satisfying.

We propose to extend the definition of usability in order to adapt it to a documentation project.
In this paper we discuss the user documentation associated with an industrial product; in other words, wherever we write "document", the reader should read "user document".

Definition of Document Usability

The Standard Usability Components

Following ISO 9126, usability (for software products) is defined by the following components:

      • Understandability:
        The capability of the software product to be easily understood by the user, who should be able to recognize how to operate it as quickly as possible.
      • Learnability:
        The capability of the software product to enable the user to learn its application.
      • Operability:
        The capability of the software product to enable the user to operate and control it.
      • Attractiveness:
        The capability of the software product to be liked by the user.

The Document Usability Components

Obviously this subdivision cannot be applied to documentation as it stands, so the first step in defining Document Usability is to define the corresponding relevant aspects.
Our proposal is the following.

A document is complete if everything the user needs to know to approach the product is reported in the document.

A document is coherent if:

      • the reported information corresponds exactly to the product behavior
      • the adopted terminology is uniform
      • the adopted documentation standard is met.

A document is effective if it reaches its goal, i.e.

      • it is useful to understand the relevant aspects of the product
      • it allows the user to quickly find any information required to use or to understand the product
      • its usage improves the interaction between the user and the product.

A document is appealing if:

      • the reader is attracted by it
      • it is easy to read
      • it uses all the graphical techniques required for a full comprehension of its subject
      • its page layout has a balanced and rational mix of text and graphics.

A Short Analysis of Document Usability Components

In this section we will analyze in more detail the components of Document Usability.
The final goal of this proposal is to define a practical method to measure Document Usability. This implies measuring completeness, coherence, effectiveness and appeal separately.
It is important to notice that these aspects of Document Usability require different reference subjects to be measured. In particular, coherence can be measured by the internal team (technicians and writers) that produces the document, while the other aspects require a "final user" to be evaluated correctly.
More precisely, compliance with the documentation standard and the terminology can be evaluated directly by the writing staff, while technical coherence requires interaction with the technical staff.
A document is a form of interface between a product and its user. From this point of view, Completeness and Coherence are concepts more related to the product, while Effectiveness and Appeal are related to the user. 


Completeness

Completeness concerns both technical aspects and general information aspects.
The first means that no relevant (technical) information can be missing. This happens, for instance, if in the description of a dialog box one or more buttons are not described. This kind of completeness can be verified by the technical staff.
A lack of general information may arise when some prerequisite notion is missing. In this case the missing information does not technically belong to the product, but to the cultural environment the product is part of. This is a subtler aspect, and it must be analyzed in more detail.
Let us examine the following sentence, which may be found in a document describing the user interface of an Operation Center used to monitor a telecommunication network.

"The root level includes information related to the whole network (number of NEs, names and addresses of the NEs and so on), and pointers to the NEs MIBs. Each NE MIB includes specific information related to that NE (IP address, Connection State, NE name, used Proxy and so on) and pointers to the MIBs of the equipment to it."

This sentence includes several terms (NE, MIB, IP address, Proxy) that are supposed to be familiar to the reader. It may happen that the meaning of these terms is not explained in the document. This choice can be right or wrong, depending on:

      • the expected skill of the reader
      • how innovative or non-standard the product is with respect to its basic concepts.

In other words: it may be perfectly right to skip the definition of Proxy, since the reader is supposed to be acquainted with the basic concepts of modern telecommunications. It may be better to define what a MIB is, or at least to add the word MIB to a glossary or a list of acronyms.
One of the goals of Document Usability is to decide what to do with terms like Proxy and MIB by asking the final users of the manual, instead of trying to guess their expectations.
Completeness is a never-ending concept: there is no limit to the information that could be required to understand a product. This depends on the user's experience and on "environmental" characteristics.
For example, let us look at a popular product such as MS Word.
A "complete" manual of MS Word should report everything the author needs to write a text, to revise it, to make it graphically appealing and so on. The manual could report much more, since the product can be used for many activities that may not all be relevant at a given time.
Should the manual say, somewhere, that to write a "t" character the writer must press the "t" key? Obviously not, since this information is supposed to be well known to any (possible) user of the product.
What about a two-year-old child, or a native of central Amazonia? Maybe they would find it difficult to write a "t". The point is that MS Word is not just a tool: it is the product of a culture. Of course, its description cannot include all the cultural items that may be relevant to understanding its usage by any possible human being.
This consideration leads to another relevant point. A document can be incomplete as well as "over-complete", i.e. it can say "too much" for its goal. Consider, for example, a "maintenance manual". This kind of document is normally intended as a practical tool for maintenance people. It would be ineffective and misleading to describe the product's basic concepts and philosophy in such a manual: the maintenance operator could be confused.
Any attempt to define and measure the Document Usability should take into account both under-completeness and over-completeness.


Coherence

Coherence has a technical side and an "internal" side.
The former is the correspondence between the document contents and the real behavior of the product. This is a responsibility of the technical staff.
The latter concerns questions like:
"Are all the commands written in boldface, as the specifications require?"
"Does the Specification section avoid architectural aspects, as required by the standard document structure?"
Notice that both these questions can be fully answered after a careful examination of the document by the writing staff.
In conclusion, we must note that

      • The final user can perceive some incoherence
      • The technical staff must ensure the technical coherence
      • The writing staff must ensure the coherence from the point of view of contents and layout.

This means that the concrete definition (and measurement) of coherence requires three different opinions to be merged.


Effectiveness

This is the most critical aspect of Document Usability. Both the technical and the writing staff may have some feeling about effectiveness, but the final user is the only one who can definitely say that a document is effective.
On the other hand, the final user may perceive effectiveness in a rather confused way. Most readers will be able to recognize that "the document is not effective", but they will not be able to say exactly "why", nor to assign a score to the effectiveness.
Note that this problem may also arise for Completeness. Most readers will not clearly distinguish between Completeness and Effectiveness: in most cases, the reader perceives a lack of Completeness as a lack of Effectiveness.
Effectiveness may concern the structure of a document, its linguistic quality, the number and quality of the examples, the artworks, the tables, the internal links and so on. The writing staff may directly check some of these aspects: for instance, readability indexes are described in the literature.


Appeal

This is definitely an aspect concerning the final user.
The reader is the only one able to assess topics like readability, the balance between text and pictures in the page layout, or the elegance of the graphical design.
The only possible help for an evaluator in this area is one of the several "readability indexes" that have been defined and can be computed automatically, or an estimate of the ratio between text and pictures.
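Such readability indexes can indeed be computed automatically. As an illustration, here is a minimal sketch of the Gunning Fog index (Gunning, 1952, listed in the references); the vowel-group syllable counter is a rough heuristic, and the function names are ours:

```python
import re

def count_syllables(word: str) -> int:
    """Rough heuristic: count groups of consecutive vowels."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def gunning_fog(text: str) -> float:
    """Gunning Fog index: 0.4 * (words/sentence + 100 * complex_words/words).

    'Complex' words are those with three or more syllables.
    """
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    if not sentences or not words:
        return 0.0
    complex_words = [w for w in words if count_syllables(w) >= 3]
    return 0.4 * (len(words) / len(sentences)
                  + 100.0 * len(complex_words) / len(words))

sample = "The manual should be clear. Short sentences help the reader."
print(round(gunning_fog(sample), 1))  # → 6.0
```

The index approximates the years of schooling needed to understand the text on a first reading; for user manuals, lower values are generally better.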

Usability and Documentation Lifecycle

The document production lifecycle is somehow embedded in the product development: only in very few cases can document production start after the product has been released.
For this reason, it seems very important to plug the Document Usability into a correctly defined document production lifecycle.

The Evolutionary Lifecycles

The "old-fashioned" software development lifecycle used to consider the whole product as its main object. The standard phases of this lifecycle were:
Specifications -> Development -> Testing -> Release.
The "new" development philosophy accepts this approach as suitable only if:

      • the product belongs to a well defined and accepted typology, so that the user has an idea about what he can expect from the product;
      • the product requirements can be fully defined before starting the development;
      • the release deadline is not critical.

The first point, in particular, is an important reason to adopt a different lifecycle style, since it concerns the user and his degree of satisfaction with the product. Indeed, in the "old style" lifecycle the user's opinion was requested only at the very beginning (product requirements) and at the end, after the product release.
This approach is clearly inadequate:

      • in the case of very innovative products
      • when the development time is long: since user expectations may change during this period, the result, although consistent with the original specification, may be unsatisfactory.

Evolutionary lifecycles have been investigated to increase the "weight" of user satisfaction. In an evolutionary lifecycle the production is split into short sub-cycles, each involving Analysis, Design and Build activities (ADB cycles). The user is asked for his opinion at the end of each sub-cycle, by means of specific usability tests. This is why this development philosophy is often called "User Centered".
In order to be effective, evolutionary lifecycles must be able to define sub-tasks in the development process, so that it is possible to define a specific sub-cycle for each of them.
The following picture summarizes the basic idea of an evolutionary lifecycle; we will use it as a basis for our discussion of the document lifecycle. Notice that the "Pass" step includes the user's opinion, since it involves a usability test.

Figure 1 – Evolutionary Lifecycle Schema

The Document Lifecycle vs. Product Lifecycle

Documentation is nowadays produced in several ways and following different approaches: some companies have well defined document standards, while others (in particular small ones) produce the relevant documents by assigning the task to the same technical staff that builds the product.

Although it is difficult to define a general framework, it can be said that document development evolves in parallel with product development, because waiting for the end of development before starting the documentation process would substantially delay the final release.
In spite of this parallelism, the relationship between development and documentation is weak, at least in most cases:

      • Documentation items are not correlated with the development sub-tasks.
      • The documentation cycle is separated from the development cycle, and the technical staff perceives its activity on documents as an "extra job" to be done.

A typical lifecycle for a document can be depicted as follows:

Figure 2 – The Currently Adopted Document Lifecycle

Although this is the most commonly adopted approach to document production, it is wrong from the Document Usability point of view, since it does not take the opinion of the user into account.
A well-known statement of classical usability is that the people developing a product cannot measure its usability, since usability concerns the user, not the developer. In the case of documentation, instead, it is taken for granted that the "technical staff" is the only entity allowed to give feedback about document quality.
This produces the most common problem of technical documents: although the documents are coherent (since they have been revised by technicians), they are not complete. This is because the technician, in most cases, is unable to perceive the difficulties that the final reader will encounter (everything is known to him, but not to the final reader!).
To step toward Document Usability we need to change the above diagram to include the final reader. The most obvious approach is to start from the development lifecycle and introduce the documentation into it. The result looks like the following.

Figure 3 – An Evolutionary Document Lifecycle

The above picture is only indicative: in practice, it is difficult to force development and documentation into a single lifecycle. The main aspects of this idea are the following:

      • Documents must be considered as composed of sub-items, as happens for the product development.
      • Sub-items must be defined, at least roughly, before starting document production.
      • Usability tests must be implemented for each documentation sub-item.
      • Specific tests must be performed in any case at the end of the document production.

The last point is needed because the document as a whole must be "usable". Usability tests on single sub-items measure the quality of each sub-item description, but it is also important to check whether the overall structure of the document is correct.

Document Usability Evaluation

The Measurement Technique

Users’ attitudes towards a document are an important component of its usability. The most conveniently structured way to assess attitudes is a questionnaire. This approach can provide valid and reliable measures of documentation usability, provided that a sound methodology is applied in developing the questionnaire.
Questionnaires can be used in the following situations:

      • During documentation planning, to analyze previous versions, or competing products.
      • During the later stages of documentation design, when a workable model or draft is available.
      • After documentation release, as a survey to find out user reaction to the product.

In order to give reliable results, questionnaires must be standardized. This is particularly important to go beyond the subjective opinions of the people who drew up the questionnaire. A standardized questionnaire is:

      • Reliable: that is, it gives equivalent results when administered to different user samples or at different times, provided that the same type of user, carrying out the same kinds of tasks with the same documents, is being assessed.
      • Valid: that is, it measures known properties of the document being assessed, and these properties are relevant.
      • Normed: that is, there is a list of expected values against which the values obtained for the document under investigation can be compared.

The development of a good questionnaire tool takes a considerable amount of time and effort; however, the application of a questionnaire is fast and cost-effective.
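As an illustration of the reliability criterion, a common internal-consistency statistic is Cronbach's alpha (our choice of check here, not one prescribed by this paper). A minimal sketch, assuming answers are collected on a numeric Likert scale:

```python
def cronbach_alpha(item_scores):
    """Cronbach's alpha for questionnaire internal consistency.

    item_scores: one row per respondent, each row a list of item scores
    (e.g. 1-5 Likert answers). Uses population variance.
    alpha = k/(k-1) * (1 - sum(item variances) / variance of totals)
    """
    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    k = len(item_scores[0])                     # number of items
    items = list(zip(*item_scores))             # one column per item
    totals = [sum(row) for row in item_scores]  # per-respondent total
    return k / (k - 1) * (1 - sum(variance(i) for i in items)
                          / variance(totals))

answers = [  # 4 respondents x 3 questionnaire items (illustrative data)
    [4, 5, 4],
    [2, 3, 2],
    [5, 5, 4],
    [3, 3, 3],
]
print(round(cronbach_alpha(answers), 2))  # → 0.96
```

Values close to 1 indicate that the items measure the same underlying attitude; low values suggest the questionnaire needs rework before its scores can be trusted.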
For any document, each sub-item description must be tested for usability. This requires measuring the degree of completeness, coherence, effectiveness and appeal of the description itself.
The global Document Usability can thus be defined as the weighted average of the sub-item usability scores, combined with the overall score obtained by the document in the final test.
Notice that a good testing methodology must provide not only the final scores, but also indications about what is good and what is poor in the analyzed documentation.
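The combination rule described above can be sketched as follows; the 50/50 split between the sub-item average and the final test score, like the function name, is an illustrative assumption:

```python
def document_usability(subitems, final_score, final_weight=0.5):
    """Combine per-sub-item usability scores into a global score.

    subitems: list of (score, weight) pairs, one per documentation
    sub-item (each score could itself average the completeness,
    coherence, effectiveness and appeal ratings for that sub-item);
    final_score: overall score from the end-of-production test.
    The final_weight split is an assumption, not prescribed by the method.
    """
    total_weight = sum(w for _, w in subitems)
    subitem_avg = sum(s * w for s, w in subitems) / total_weight
    return (1 - final_weight) * subitem_avg + final_weight * final_score

# Example: three sub-item descriptions, weighted by size/importance.
scores = [(7.0, 2.0), (9.0, 1.0), (6.0, 1.0)]
print(document_usability(scores, final_score=8.0))  # → 7.625
```

Keeping the per-sub-item scores alongside the global figure preserves the diagnostic detail: the aggregate says how usable the document is, while the individual scores say where it is weak.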

The Design Guidance

An established technique that can contribute to usability is to require a product to conform to standard requirements and guidelines (style guides). This approach is likely to gain popularity as more parts of the ISO standards are finalized and published as agreed standards.
Although conformance with these standards will normally contribute to usability, conformance cannot assure the usability of a document as defined above.
The product may have "ergonomic" deficiencies not covered by the standards, or it may only be usable by a very narrow range of users for specific tasks. There may also be several alternative designs, all of which conform with the guidelines, but with large differences in usability.
Guidelines specify attributes of a document that have been shown to improve usability. Some guidelines are at a surface level (e.g. page layouts), while others state higher-level objectives (e.g. consistency). In many cases, usability will be improved by modifying the document to be consistent with guidelines.
Guidelines have the advantage that they can be applied early in document production and conformance to a style guide can be assessed merely by inspection of the document.
The weakness of guidelines is that they often generalize across a wide range of characteristics of users, tasks and environments, and it is very difficult to rigorously specify the limits of the context in which a guideline is applicable.
One way to apply guidelines effectively is to create a checklist or style guide specifically for the document being developed and to ensure that the project personnel adhere to it.
However, it must be emphasized that conformance to guidelines and standards is highly desirable, but does not guarantee high product usability.


References

Alred G., Oliu W., Brusaw C.: The Professional Writer: A Guide for Advanced Technical Writing (St. Martin's Press, 1992).
Austin M.: The ISTC Handbook of Technical Writing and Publication Techniques (Heinemann, 1990)
Barrett J. F., Craig-Smith S. J.: Multimedia Documents: Towards a New Paradigm for Instructional Technology, in: Proceedings of the Sixth IFIP World Conference on Computers in Education. ed. by J. D. Tinsley and T. J. Van Weert (London: Chapman & Hall, 1995: pp.225-34).
Bias R.G., Mayhew D.J. (eds.): Cost-Justifying Usability (London: Academic Press, 1994).
Bland, K. E., Liebowitz J.: Evaluating Hypermedia: A Methodology and Case Study, in: Proceedings of the International Conference on Multimedia Computing and Systems (Los Alamitos, CA: IEEE Computing Society Press, 1995: pp.123-30).
Blunden B., Blunden M. (ed.): The electronic publishing business and its market (Reading, Berkshire: The Eastern Press Limited, 1994)
Bolter J. D.: Text and Technology: Reading and Writing in the Electronic Age (Library Resources and Technical Services 31 no. 1, 1987: pp.12-23).
Brent D.: E-Publishing and Hypertext Publishing (EJournal vol. 6, no. 3, August 1996).
British Standard BS 4884: Technical manuals. Specification for Presentation of Essential Information, Part 1: 1992
British Standard BS 4884: Technical manuals. Guide to content, Part 2: 1993
British Standard BS 4884: Technical manuals. Guide to presentation, Part 3: 1993
Brusaw C. T. , Alred G. J., Oliu W. E.: Handbook of Technical Writing (St Martins Pr., 1997).
Bush, V.: As We May Think (Atlantic Monthly, July 1945).
Butcher J.: Copy-editing - The Cambridge Handbook (Cambridge University Press, 1975).
Cole F., Brown H.: Standards: What Can Hypertext Learn from Paper Documents?, in Proceedings of the Hypertext Standardization Workshop (Gaithersburg, MD: NIST, 1990: pp.59-70).
Gunning R.: The Technique of Clear Writing (McGraw-Hill, 1952).
Hackos J. T.: Managing Your Documentation Projects (John Wiley & Sons: Wiley Technical Communication Library, 1994).
Hackos J. T., Stevens D. M.: Standards for Online Communication : Publishing Information for the Internet/World Wide Web/Help Systems/Corporate Intranets (John Wiley & Sons, 1997).
Haramundanis K.: The Art of Technical Documentation (Digital Press, 1997).
Hargis G. (ed.), Hernandez A., Hughes P., Ramaker J.: Developing Quality Technical Information : A Handbook for Writers and Editors (Prentice Hall, 1997).
Haydon L. M.: The Complete Guide to Writing & Producing Technical Manuals : A Reference of Style and Procedure (John Wiley & Sons, 1995).
Horton W.: Secrets of User-Seductive Documents (Arlington: Society for Technical Communication/Publications, 1997).
Kemnitz C. F. (ed.): Technical Editing: Basic Theory and Practice (Arlington: Society for Technical Communication/Publications, 1994).
Laufer R.: Texte, Hypertexte, Hypermedia (Paris: Presses universitaires de France, 1995).
McLaughlin G. H.: What Makes Prose Understandable (Ph.D. Thesis, University College, London, 1966).
Rubin J.: Handbook of Usability Testing (New York: John Wiley & Sons, Ltd., 1994)
Weiss E. H.: How to Write Usable User Documentation (Oryx Press, 1991).
Zipf G. K.: The Psycho-Biology of Language (Boston: Houghton Mifflin Co. 1935)

Rodolfo Baggio, P. Schiavo Campo, An Introduction to Document Usability, Sonar Technical Report, Milano, 2000


R. Baggio - Last update: July 2000