Chapter 1.1 The nature of academic writing

Anyone who wishes to become a good writer should endeavour, before he allows himself to be tempted by the more showy qualities, to be direct, simple, brief, vigorous, and lucid.
(Fowler & Fowler, 1906, p. 11)

THE LANGUAGE OF SCIENCE AND ACADEMIA

If we examine the text of scientific articles it is clear that there is a generally accepted way of writing them. Scientific text is precise, impersonal and objective. It typically uses the third person, the passive voice, complex terminology, and various footnoting and referencing systems. Such matters are important when it comes to learning how to write scientific articles. Consider, for example, the following advice:

Good scientific writing is characterised by objectivity. This means that a paper must present a balanced discussion of a range of views . . . Moreover, value judgements, which involve moral beliefs of what is 'right' or 'wrong', must be avoided . . . The use of personal pronouns is unnecessary, and can lead to biases or unsupported assumptions. In scientific papers, therefore, personal pronouns should not be used. When you write a paper, unless you attribute an opinion to someone else, it is understood to be your own. Phrases such as 'in my opinion' or 'I think', therefore, are superfluous and a waste of words . . . For the same reasons, the plural pronouns we and our are not used.
(Cited, with permission, from Smyth, 1996, pp. 2–3)

CLARITY IN SCIENTIFIC WRITING

In my view, following this sort of advice obscures rather than clarifies the text. Indeed, Smyth has rather softened his views with the passage of time (see Smyth, 2004). For me, the views expressed by Fowler and Fowler in 1906, which head this chapter, seem more appropriate. Consider, for example, the following piece by Watson and Crick, announcing their discovery of the structure of DNA, written in 1953.
Note how this text contravenes almost all of Smyth's strictures cited above:

We wish to suggest a structure for the salt of deoxyribose nucleic acids (D.N.A.). This structure has novel features which are of considerable biological interest.

A structure for nucleic acid has already been proposed by Pauling and Corey. They kindly made their manuscript available to us in advance of publication. Their model consists of three inter-twined chains, with the phosphates near the fibre axis, and the bases on the outside. In our opinion this structure is unsatisfactory for two reasons: (1) We believe that the material which gives the X-ray diagrams is the salt, not the free acid. Without the acidic hydrogen atoms it is not clear what forces would hold the structure together, especially as the negatively charged phosphates near the axis will repel each other. (2) Some of the van der Waals distances appear too small.

Another three-chain structure has also been suggested by Fraser (in the press). In his model the phosphates are on the outside and the bases on the inside, linked together by hydrogen bonds. This structure as described is rather ill-defined, and for this reason we shall not comment on it.
(Opening paragraphs from Watson and Crick, 1953, pp. 737–8, reproduced with permission from James D. Watson and Macmillan Publishers Ltd)

Table 1.1.1 lists some of the comments that different people have made about academic text. Some consider that academic writing is spare, dull and undistinguished. Some consider that articles in prestigious journals will be more difficult to read than articles in less-respected ones because of their greater use of technical vocabulary. Others warn against disguising poor-quality articles in an eloquent style.
Indeed, there is some evidence that journals do become less readable as they become more prestigious, and that academics and students do judge complex writing to be more erudite than simpler text (Hartley et al., 1988; Oppenheimer, 2005; Shelley and Schuh, 2001). Furthermore, Sokal (1996) once famously wrote a spoof article in scientific and sociological jargon that went undetected by the editors (and presumably the referees) of the journal it was submitted to.

MEASURING THE DIFFICULTY OF ACADEMIC TEXT

There are many different ways of measuring the difficulty of academic text. Three different kinds of measure (which can be used in combination) are: 'expert-based', 'reader-based' and 'text-based', respectively (Schriver, 1989).

• Expert-based methods are ones that use experts to make assessments of the effectiveness of a piece of text. Referees, for example, are typically asked to judge the quality of an article submitted for publication in a scientific journal, and they frequently make comments about the clarity of the writing. Similarly, subject-matter experts are asked by publishers to judge the suitability of a manuscript submitted for publication in terms of content and difficulty.
• Reader-based methods are ones that involve the actual readers in making assessments of the text. Readers might be asked to complete evaluation scales, to state their preferences for different versions of the same texts, to comment on sections of text that they find difficult to follow, or be tested on how much they can recall after reading a text.
• Text-based measures are ones that can be used without recourse to experts or to readers, and these focus on the text itself. Such measures include computer-based readability formulae and computer-based measures of style and word use.
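One well-known example of a computer-based readability formula is the Flesch Reading Ease score, discussed further below. A minimal sketch in Python is given here; the formula itself is Flesch's standard one, but the syllable counter is a crude vowel-group heuristic of my own (published tools use pronunciation dictionaries), so its scores are only approximate.

```python
import re

def count_syllables(word):
    # Crude heuristic: count groups of consecutive vowels, ignoring a
    # trailing silent 'e'. An approximation only - real implementations
    # use pronunciation dictionaries.
    word = word.lower()
    if word.endswith("e") and len(word) > 2:
        word = word[:-1]
    vowel_groups = re.findall(r"[aeiouy]+", word)
    return max(1, len(vowel_groups))

def flesch_reading_ease(text):
    # Flesch Reading Ease:
    #   206.835 - 1.015 * (words/sentences) - 84.6 * (syllables/words)
    # Higher scores indicate easier text; dense academic prose
    # typically scores low.
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / len(sentences))
            - 84.6 * (syllables / len(words)))
```

Running the function on a short simple sentence and on a string of polysyllabic jargon shows the expected gap: the simple text scores far higher than the jargon.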
Table 1.1.1 Some characteristics of academic writing

Academic writing is:
• unnecessarily complicated
• pompous, long-winded, technical
• impersonal, authoritative, humourless
• elitist, and excludes outsiders.

But it can be:
• appropriate in specific circumstances
• easier for non-native speakers to follow.

Two particular measures deserve attention here because they have both been used to assess the readability of academic text. One is a reader-based measure, called the 'cloze' test. The other is a computer-based measure, called the Flesch 'Reading Ease' score.

Cloze tests

The cloze test was originally developed in 1953 to measure people's understanding of text. Here, samples from a passage are presented to readers with, say, every sixth word missing. The readers are then required to fill in the missing words.

Technically speaking, if every sixth word is deleted, then six versions should be prepared, with the gaps each starting from a different point. However, it is more common _____ prepare one version and perhaps _____ to focus the gaps on _____ words. Whatever the procedure, the _____ are scored either: (a) by _____ accepting as correct those responses _____ directly match what the original _____ actually said, or (b) by _____ these together with acceptable synonyms.

As the two scoring methods (a) and (b) correlate highly, it is more objective to use the tougher measure of matching exact words (in this case: 'to', 'even', 'important', 'passages', 'only', 'which', 'author' and 'accepting'). Test scores can be improved by having the gaps more widely dispersed (say every tenth word), by varying the lengths of the gaps to match the lengths of the missing words, by providing the first of the missing letters, by having a selection of words to choose from for each gap, or by having readers work in pairs or small groups.
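The mechanics just described, deleting every nth word and then scoring responses by exact match (method (a) above), are simple enough to sketch in code. The Python below is an illustrative sketch, not a standard implementation; the function names and the five-underscore blank are my own choices.

```python
import re

def make_cloze(text, n=6):
    # Replace every nth word with a blank, returning the gapped text
    # and the deleted words (the answer key) in order.
    words = text.split()
    answers, gapped = [], []
    for i, w in enumerate(words, start=1):
        if i % n == 0:
            answers.append(w)
            gapped.append("_____")
        else:
            gapped.append(w)
    return " ".join(gapped), answers

def score_exact(responses, answers):
    # Method (a): count only responses that exactly match the original
    # words, ignoring case and punctuation (no credit for synonyms).
    def norm(w):
        return re.sub(r"[^a-z']", "", w.lower())
    return sum(norm(r) == norm(a) for r, a in zip(responses, answers))
```

Scoring method (b), which also accepts synonyms, would need a human judge or a thesaurus lookup in place of the exact-match comparison, which is one practical reason the stricter method (a) is preferred.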
These minor variations, however, do not affect the main purpose of the cloze procedure, which is to assess readers' comprehension of the text and, by inference, its difficulty.

The cloze test can be used by readers both concurrently and retrospectively. It can be presented concurrently (as in the paragraph above) as a test of comprehension, and readers are required to complete it, or it can be presented retrospectively, and readers are asked to complete it after they have first read the original text. In this case the test can serve as a measure of recall as well as comprehension. The cloze test can also be used to assess the effects on readers' comprehension of different textual organisations, readers' prior knowledge and other textual features, such as illustrations, tables and graphs (Reid