Out of 700 million transistors in a CPU, yeah, a few probably go bad -- but even 7 million bad is just 1%, far below any threshold at which a human would notice the performance hit. I'd guess what happens in the Real World when a CPU degrades is that the system crashes, because such a flaw turns some instruction into nonsense.
However, the chip-to-chip variance in working transistors is probably closer to 40% -- and not as gradual degradation over time. That's how they come from the factory.
CPUs of a given class are all designed and manufactured to perform at the same speed. But the fact is with micro-transistors, you get lots of variance during manufacture, and it mostly varies by batch. So they're batch tested, THEN labeled for speed.
This is actually why some CPUs lend themselves to overclocking. They batch-tested slower, so the whole batch got the same slower label. But some of that batch -- even most of it -- won't actually be that slow, and those CPUs will be overclockable. (You're not really overclocking; you're just bringing the chip up to the speed it would have been labeled at had the CPUs been tested individually rather than by batch.) The more variation in production quality, the more likely the label is inaccurate, and the more individual CPUs will be overclockable. (Hence AMDs are seen as 'more overclockable' than Intels, when the truth is they're just less consistent.)
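The batch-labeling idea above can be sketched in a few lines of toy Python. This is a deliberately simplified model (real binning involves voltage, thermals, and multiple speed grades); the function name, MHz numbers, and the rule "label the whole batch at its slowest chip" are all my illustrative assumptions, not how any fab actually does it:

```python
def bin_batches(batches):
    """Toy model of batch binning: every CPU in a batch gets labeled at
    the speed of the slowest chip in that batch. 'Headroom' is how far a
    chip could be clocked back up to its individually-tested speed."""
    labeled = []
    for batch in batches:
        label = min(batch)  # one slow chip drags the whole batch's label down
        for actual_mhz in batch:
            labeled.append({
                "label_mhz": label,
                "actual_mhz": actual_mhz,
                "headroom_mhz": actual_mhz - label,  # the "overclocking" margin
            })
    return labeled

# Two hypothetical batches: a consistent one and a high-variance one.
consistent = [98, 99, 100, 100]   # tested speeds in MHz (made-up numbers)
variable   = [75, 95, 100, 105]

chips = bin_batches([consistent, variable])
```

In the high-variance batch the 75 MHz chip drags the label down, so the chip that actually tested at 105 MHz ships labeled 75 and carries 30 MHz of headroom -- exactly the "more variation means more overclockable chips" effect described above.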
[Strange facts and useless information: there was actually no such thing as a P75 CPU. They were all downlabeled P90s.]