Net Neutrality is a real issue, though I'm not sure whether the examples they give in the video are real.
Here is a real example of the Net Neutrality controversy:
http://www.marketwatch.com/news/sto...95-F630-46B6-8282-C187FF10CD0E}&dist=hplatest
But I think the technology could change in the near future, making Net Neutrality a concern for only a minority of Internet users (those "out in the country") who will still have to connect through traditional ISPs:
With current and near-future WiFi and WiMAX technology, traditional ISPs may well be in danger, especially in urban environments. If a wireless router is left "unlocked," it's easy for neighbors to use it to get on the Internet. With appropriate software, they could just as easily wind up looking at webpages you serve "locally" from your computer, even with your Internet cable disconnected.

With things such as WiMAX "repeaters," it could be just as easy for computers citywide to connect to each other without going through the Internet "backbone" or a traditional ISP. The network would then be controlled by whoever puts up the WiMAX repeaters, but the more people who buy them, the cheaper they will get over time. Imagine a choice of dozens of local wireless ISPs. Of course, if they all set up their little repeaters and then sell out to Verizon or Comcast, this idea won't work; but if that happens, someone else will set up more repeaters, offering all the content the big ISPs would charge more for.
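To make the "local webpages, no Internet needed" point concrete, here's a minimal sketch using only Python's standard library. The port number (8000) and the choice of `SimpleHTTPRequestHandler` (which just serves files from the current directory) are my own; the key idea is binding to all interfaces, so anyone on the same wireless network can load the page even if the Internet cable is unplugged.

```python
# Minimal sketch: serve a "local" webpage to anyone on the same WiFi
# network, no Internet connection required. Stdlib only.
from http.server import HTTPServer, SimpleHTTPRequestHandler

def make_server(port=8000):
    # "0.0.0.0" binds to all network interfaces, so neighbors on the
    # same wireless network can reach http://<your-LAN-IP>:8000/
    return HTTPServer(("0.0.0.0", port), SimpleHTTPRequestHandler)

if __name__ == "__main__":
    make_server().serve_forever()
```

Run it in a directory containing an `index.html` and any machine on the LAN can browse to it directly, which is exactly the kind of "peer to peer" content sharing described above.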
The design of the Internet is based on ARPANET from the 1970s. Half a dozen college and military computers across the country communicated with one another over modems and telephone lines. Each one wasn't connected to all the others (long-distance charges were expensive, and with this design each one doesn't have to have a phone line to every other), but each connected to perhaps two or three others. If data meant for a computer that wasn't directly connected had to be sent, it went to one that WAS connected, with instructions to pass it on "in the direction of" the destination computer, and it would be handed along until it reached the destination. This is a robust design: data can be sent from any computer to any other, even if one of those in-between nodes "goes offline" (military euphemism for getting destroyed by a nuclear bomb).
This is how the Internet works. Content filtering is considered damage, and the Internet routes data around damage.
And of course WiFi and WiMAX boxes are manufactured in China, where Internet filtering is in widespread use (website filtering in China, for the Olympics and in general, was mentioned in another thread), and where it will be impossible to control or filter such direct "peer to peer" connections. The best the government could do would be to transmit a powerful radio signal to jam them, but that would be a blatant and rash act (one I wouldn't put past the Chinese government, but then everyone would know what's happening). The next-best option would be to monitor everything they can, which is a huge undertaking they're surely already doing.
I don't know if it's a real problem or a hoax, but if the ISPs came up with a package that would filter out all the junk websites, I'd seriously consider buying into it. Whenever I try searching for something on the internet, I end up with a gazillion results and the first page is garbage in terms of reliability.
Are you talking about Google (or other search engine) search results? Firstly, this is a separate issue. If you want to pay extra to STOP getting something (and it's technically feasible to do), ISPs and other services will stand in line to take your money. But this isn't even about the ISPs; it's about search engine results.
Choosing keywords for searches is a bit of an art, and I doubt simply "paying money" could ever get you better results. I might write something up on it, but I'm not sure how to describe what I do; it's sort of intuitive. Maybe we could discuss this in "tech help" or something.