My issue is, who gets to determine what's offensive?
The thing is, a good content warning isn't about what's offensive. It's about alerting readers to things that some of them might find disturbing. There are some coarse-grained categories that are good bets for content warnings - explicit sex, violence, language, self-harm, animal abuse, and so on.
I don't use content warnings to tell people what they should or shouldn't read; I use them to give people some guidance about whether they want to read this particular sort of story at this particular time. And of course you'll miss things that bother people, and you'll label things as potentially problematic when they don't bother many people at all, but that doesn't mean it's not worth the effort.
Ratings like PG-13 are pretty useless, IMHO (for movies as well). They're too coarse: if you want to get anything useful out of them, you have to dig around and find more details anyway.
I fear that if publishers start using movie-style ratings, we'll see movie-style censorship. I might be paranoid, but the idea of leaving content warnings - even more nuanced ones - to corporations makes me nervous.