Reader Comments


Supporting information is disappointing

Posted by hans_adler on 27 Dec 2014 at 13:00 GMT

The study did not test what the authors claim it did. Scientific copy editors (a job that, I believe, is now largely outsourced to places such as India) sometimes have to reproduce printed articles under conditions similar to those of the experiment. They are typically unfamiliar with the specialized typography of the field and subfield in question (for instance, linguistic symbols, or mathematical symbols used only in certain fields, sometimes by just a few dozen researchers), they spend little time with each publication, and they have little use for version control. For them, investing a few hours to learn the special macros of a specific field will not usually pay off. Yet even copy editors need search-and-replace inside formulas when they receive an electronic version to work on; this is another strength of LaTeX and weakness of Microsoft Word that the study did not test. The same goes for handling huge documents with tables of contents and indices.

The vast majority of researchers are not copy editors, and many have needs very different from those tested. Researchers do not reproduce existing typography; most of us just want to produce good typography without thinking about it, and in a way that makes adapting a manuscript to a journal's house style quick and straightforward.

I wonder whether the authors' assessment of document quality reflects biases similar to those evident in the study's setup and conclusions. (To make my question concrete: was different but functionally equivalent typography counted as defective?) I cannot check this because the supporting information does not include the typeset documents. It does not even include excerpts of the typeset documents alongside the authors' assessments of their quality.

No competing interests declared.