Here are two more examples like the “rise up” and “dark night” examples in The G Files Part 1, where the grammar-checking software thinks it has spotted a phrase its designers considered poor style, when in fact the words only happen to coincide with that phrase and mean something different in context.
A science textbook author wrote “…as discussed earlier in Addition Polymers”, and the software wanted to apply a rule changing “in addition” to “also”, which would result in “as discussed earlier also Polymers”.
Another author wrote “in very close proximity”, and the software wanted to apply a rule changing the redundant-sounding “close proximity” to “proximity”, which in this case would result in “in very proximity”.
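The failure mode in both cases looks like a plain context-blind pattern swap: match the flagged phrase, substitute the preferred wording, and never parse the surrounding sentence. Here's a minimal sketch of that behaviour (the rule table and code are hypothetical, not the actual checker's implementation):

```python
import re

# Hypothetical style rules of the kind the checker seems to apply:
# a bare pattern-to-replacement swap with no awareness of context.
RULES = [
    (re.compile(r"\bin addition\b", re.IGNORECASE), "also"),
    (re.compile(r"\bclose proximity\b", re.IGNORECASE), "proximity"),
]

def apply_rules(text: str) -> str:
    """Apply every rule blindly, exactly as a naive checker would."""
    for pattern, replacement in RULES:
        text = pattern.sub(replacement, text)
    return text

# Reproduces both false positives from the examples above:
print(apply_rules("as discussed earlier in Addition Polymers"))
# → as discussed earlier also Polymers
print(apply_rules("in very close proximity"))
# → in very proximity
```

A checker working at this level can't tell that “in Addition Polymers” is a preposition followed by a chapter title, or that “very” has already blocked the “close proximity” rewrite from making sense.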
A similar type of error occurs when, rather than objecting to an overused phrase, the software spots what it takes to be an incorrect deviation from a well-known phrase or idiom and tries to “correct” it back to the usual wording… when really the writer wasn’t trying to use that phrase at all.
For example, the software tried to claim that a heretical scientist was “burned at stake”.
And a coursework guide advised students on what to discuss “in the conclusion”, but the software wanted to change it to “in conclusion”.