A former long-term client wanted me to run grammar-checking software on their documents, in addition to checking them myself – so I’ve got quite a lot of experience of how it works, and of the many ways in which it doesn’t.
Grammar-checking software works on rules and heuristics. It doesn’t understand what it’s reading at all, so it has no way of knowing that something that looks like an instance of one of its rules actually isn’t, even though this would be obvious to a human reader. I’m not talking about the way accomplished writers can sometimes “bend the rules” – I’m talking about sentences that merely resemble an instance of a rule, in ways any human reader could spot.
For example, the software evidently has a rule about cutting out clichés and overused word pairs. It thinks “dark” in “dark night” and “up” in “rise up” are redundant. That’s fine, except in sentences like “If space were infinite, the night sky would be completely full of stars, so the dark night is evidence that space is finite”, or “The fractions rise up the fractionating column”.
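To see why a rule like this misfires, here is a minimal sketch of a context-blind redundancy rule, written in Python. The pair list, function name, and suggestion format are all my invention for illustration – real checkers are far more elaborate – but the essential flaw is the same: the rule matches surface patterns and has no way to notice when the “redundant” word is doing real work.

```python
import re

# Hypothetical list of "redundant" pairs: (words to match, word to flag as removable).
REDUNDANT_PAIRS = [
    (("dark", "night"), "dark"),  # "dark night" -> flag "dark"
    (("rise", "up"), "up"),       # "rise up"    -> flag "up"
]

def flag_redundancies(text: str) -> list[str]:
    """Return a suggestion for every matched pair, with no regard for context."""
    suggestions = []
    for (first, second), removable in REDUNDANT_PAIRS:
        pattern = rf"\b{first}\s+{second}\b"
        for match in re.finditer(pattern, text, flags=re.IGNORECASE):
            suggestions.append(
                f'"{match.group(0)}": consider removing "{removable}"'
            )
    return suggestions

print(flag_redundancies(
    "If space were infinite, the night sky would be completely full of stars, "
    "so the dark night is evidence that space is finite."
))
```

The rule fires on this sentence even though “dark” is essential to the argument – the pattern matches, and pattern matching is all the rule can do.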
It also has no concept of coherence, so it cannot tell whether its proposed corrections would turn the text into nonsensical word salad; and it has very little idea of consistency. Its dictionary appears to have British spellings for some words but not all, so it will correct “crystallize” to “crystallise” while simultaneously correcting “recrystallise” to “recrystallize”. And I once edited a section about amu (atomic mass units) in which the software wanted to change some instances of “amu” to “am” and some to “amp”, apparently randomly.
Finally, the software has no real-world knowledge or “common sense”, so it produces suggestions that are grammatically sound but betray less semantic knowledge than a human five-year-old has. One of the more entertaining examples was when it wanted to change “Watch the video to learn more” to “Watch the video learn more”. You can totally see how it got there, with syntactic knowledge but no semantic understanding: it’s obviously been programmed to know that “watch [entity] [do thing]” takes a bare infinitive without “to”, but it has no idea that videos don’t learn things.
The leading automatic grammar checker markets itself to non-native English speakers and people who are not confident in their grammar. But it’s only because I am a native speaker and am very confident in my grammar that I feel able to use it (and reject most of its suggestions). Sadly, I can imagine there are lots of people who look at its suggestions and decide to trust them over their own judgement.
Some of the mistakes made by grammar-checking software are funny. Some of them are very interesting linguistically. And some of them are just baffling and make me wonder how it got there. This is the first in a series of blog posts in which I will share them for your amusement and discuss them from a linguistic perspective.