I have two scripts that I find helpful for finding over-used words and phrases. I wrote the first to help with indexing and found it useful for editing as well. It counts the number of times each single word appears, then each two-word sequence, three-word sequence, and so on. Then it spits out a report listing the words and sequences in descending order of frequency, with the counts.
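My script isn't published, but a minimal sketch of that kind of n-gram counter might look like this (the function names, the 4-word cap, and the `min_count` cutoff are my choices here, not fixed features):

```python
import re
from collections import Counter

def ngram_counts(text, max_n=4):
    """Count every 1- to max_n-word sequence in the text."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter()
    for n in range(1, max_n + 1):
        for i in range(len(words) - n + 1):
            counts[" ".join(words[i:i + n])] += 1
    return counts

def report(text, max_n=4, min_count=2):
    """Print words and sequences in descending order of frequency."""
    for phrase, count in ngram_counts(text, max_n).most_common():
        if count >= min_count:
            print(f"{count:5d}  {phrase}")
```

Running `report()` on a chapter quickly surfaces pet phrases: seeing "as it were" show up thirty times is a strong hint to cut some.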
It's useful, but it doesn't capture the entire picture. I wrote another script to spot what I call "clumps": the same word used frequently in a small area of text. Using "address" 20 times in a 60,000-word text is nothing, but when "address" appears 5 times in 3 paragraphs, it grates. I use both scripts when I am editing.
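A clump detector can be sketched as a sliding window over the word stream; again this is my own rough version, and the window size, repeat threshold, and minimum word length are illustrative knobs rather than anything from the original script:

```python
import re
from collections import Counter, deque

def find_clumps(text, window=150, threshold=5, min_len=4):
    """Flag words that occur `threshold`+ times within any
    `window`-word stretch, ignoring very short words."""
    words = re.findall(r"[a-z']+", text.lower())
    recent = deque()    # words currently inside the window
    counts = Counter()  # per-word counts inside the window
    clumps = set()
    for w in words:
        recent.append(w)
        counts[w] += 1
        if len(recent) > window:
            counts[recent.popleft()] -= 1
        if len(w) >= min_len and counts[w] >= threshold:
            clumps.add(w)
    return sorted(clumps)
```

With a window of roughly a few paragraphs, this catches exactly the "address 5 times in 3 paragraphs" case while ignoring the same word spread thinly across the whole manuscript.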
They are not a replacement for a human editor, but they help. I imagine there are similar (and easier to use!) tools available commercially, but I get a kick out of writing them for myself.