What's to be done about social media?

lizmonster

Possibly A Mermaid Queen
Absolute Sage
Super Member
Registered
Joined
Jul 5, 2012
Messages
14,775
Reaction score
24,914
Location
Massachusetts
Website
elizabethbonesteel.com
Content warning: This story contains descriptions of violent acts against people and animals, accounts of sexual harassment and post-traumatic stress disorder, and other potentially disturbing content.

Be advised: the CW on this article is accurate. The descriptions of the content these people are asked to moderate - again and again - are horrific.

Interviews with Facebook content moderators

So my visceral response to this is pretty strong, but I'm struggling with what the right answer is. Shutting down Facebook (if it were possible) would, IMHO, do nothing but rearrange the deck chairs (although it might send a useful message). The social media genie is out of the bottle - and Twitter, where I found this article, is not exactly in a position to point fingers at anybody else.

This bit at the end struck me:

If you believe moderation is a high-skilled, high-stakes job that presents unique psychological risks to your workforce, you might hire all of those workers as full-time employees. But if you believe that it is a low-skill job that will someday be done primarily by algorithms, you probably would not.

Instead, you would do what Facebook, Google, YouTube, and Twitter have done, and hire companies like Accenture, Genpact, and Cognizant to do the work for you. Leave to them the messy work of finding and training human beings, and of laying them all off when the contract ends. Ask the vendors to hit some just-out-of-reach metric, and let them figure out how to get there.

Which is part of the continued upending of what capitalism is supposed to be about. These people are doing extremely hazardous work that's important to their employer (or in this case, their employer's employer), but they're paid crap because the tech companies are still convinced they're going to be able to solve the problem with machines. They're expendable seat-warmers, employed only until the "real" solution shows up.

This kind of thing isn't my area of expertise, but I'm pretty sure we're quite a few years - maybe decades - away from a reliable, fully-automated content analysis system. (Purely automated I'd say is impossible.) Treating these people like they're disposable because they may, someday, in the amorphous future be replaced by software is immoral and unconscionable.

The other thing that hits me is all the hoops Facebook and Google (and all the others) jump through to "protect" speech. They are not the government. They can absolutely have all the content guidelines they want. They can say no videos or no swearing or no using the word "azure." Will this impact their business model? Sure will. They'll have a few less billions to redistribute to the C suite, but I'm not weeping for any of them.

For myself...I'm still on Facebook and Twitter. I interact with friends and family on Facebook. I interact with the writing community on Twitter. They're useful tools, and I'm uncomfortable with my continued participation. (Facebook is completely dragged in this article, but Twitter's run by someone whose political beliefs are just about 180 degrees from mine.) Then again, I sold three books to News Corp, so it's late for me to be getting a conscience.

But it's never too late to recognize that there's something really, really wrong with how things are working. The question is, how do we go about repairing this?
 

Introversion

Pie aren't squared, pie are round!
Kind Benefactor
Super Member
Registered
Joined
Apr 17, 2013
Messages
10,795
Reaction score
15,323
Location
Massachusetts
If Facebook really cared about what gets posted to their site, they would perma-ban anyone posting the kind of violent content enumerated in that article. Post a video showing teens bashing a lizard to death? Banned. Doesn’t matter if it’s your original content or just a repost.

But, of course they won’t, because their entire business model requires maximum eyeballs to show ads to.
 

Albedo

Alex
Super Member
Registered
Joined
Dec 17, 2007
Messages
7,376
Reaction score
2,958
Location
A dimension of pure BEES
David Langford came up with the idea of visual basilisks, computer-generated fractal images that can permanently destroy the mind if merely glanced upon, and subsequently become a beloved tool of terrorists and trolls. He invented a solution as well: an internet where all images and videos are completely banned, with life in prison for anyone who even attempts to post one.

Reading this article, and the shit that humanity wants to post online - yeah, we're just about at the point where I could get behind a total ban on posting anything visual at all.
 

Alpha Echo

I should be writing.
Super Member
Registered
Joined
Jul 11, 2008
Messages
9,615
Reaction score
1,852
Location
East Coast
I haven't read the OP's article yet, but I will. I just wanted to comment that I have read articles where moderators are interviewed, and I've listened to a podcast or two about it...and it's terrible. Absolutely terrible. These poor people are getting paid low wages to sit in front of a computer and develop PTSD. Some even admitted to...becoming brainwashed. They watch one too many YouTube videos about conspiracy theories and literally begin to believe them. Most companies seem to offer therapy, but the way the employees describe it, a "therapy session" is a joke. The companies also don't offer breaks or even the option to step away for a minute to get your head together. Some employees describe having sex with each other wherever they can just to try to let off some of the steam.

The fact that there is so much nastiness out there that the poor people trying to prevent US from seeing those things are ending up with PTSD, withdrawn from family and friends and questioning their own, previously solid beliefs is terrifying.

ETA: I'm sure this article probably goes into some of this stuff too...it's just hard to wrap my head around...
 

lizmonster

Possibly A Mermaid Queen
ETA: I'm sure this article probably goes into some of this stuff too...it's just hard to wrap my head around...

It does. It also goes into working conditions that should not be legally permitted to exist anywhere. This industry is desperately in need of unionization. The tech companies need humans for this work, and will for the foreseeable future. If those humans banded together, they'd have some real power.
 

Alpha Echo

I should be writing.
It does. It also goes into working conditions that should not be legally permitted to exist anywhere. This industry is desperately in need of unionization. The tech companies need humans for this work, and will for the foreseeable future. If those humans banded together, they'd have some real power.

Reading it now. I believe it was the author's first article on the subject that I read.
 


Roxxsmom

Beastly Fido
Kind Benefactor
Super Member
Registered
Joined
Oct 24, 2011
Messages
23,132
Reaction score
10,904
Location
Where faults collide
Website
doggedlywriting.blogspot.com
Wow. How they can get away with those working conditions is beyond me.

We desperately need federal standards for workplace conditions, because the states clearly aren't stepping up to the plate. Any workplace that has workers routinely throwing up in trash cans at their desks (because they have to come to work sick and don't get enough restroom breaks) and smearing feces on the bathroom walls should be shut down by the health department.

And if the police aren't working with them to arrest the sick pieces of shit who mutilate animals and torture children on social media, then why the hell not?

They should definitely unionize, but I don't know that there's anything that can protect people from the stress and horror of having to pore through the kinds of content described in the article.
 

Jack McManus

smoothopr8r
Super Member
Registered
Joined
Jan 24, 2014
Messages
832
Reaction score
133
Location
West of where the red fern grows
But it's never too late to recognize that there's something really, really wrong with how things are working. The question is, how do we go about repairing this?

Yeah, that's a tricky one, Liz, and the pathway out of this mess is full of pitfalls. The larger question I have is about social media's influence on social behavior. I'm no scientist, just a working-stiff old guy who squeaked through high school. What are we to do? Back in my formative years, someone figured out how to leverage our lizard brains by inserting a single frame into a movie reel to plant an image of popcorn and soda pop, so that we'd run to the concession stand at intermission. My parents grew up in an age before television and its insidious influence. But there was no going back to life without TV. Unless people quit watching.

How does that help? Information is ammunition. Those who stand to profit from leveraging technology will never quit doing so. Laws are enacted to moderate, not eliminate, this sort of thing, and that won't change as long as those making the laws won't bite too hard on the hands that feed them. So don't look for a workable political solution anytime soon. They can't legislate behavior anyway. I don't seek out these kinds of sick, twisted videos, and I sure as hell wouldn't want to spend all day looking at them for a paycheck. My bigger issue right now is my phone company selling vacant local numbers to telemarketers.

By the way, I have no problem with your conscience, Liz, for using whatever's available to get your words out into the world.
 

lizmonster

Possibly A Mermaid Queen
Laws are enacted to moderate, not eliminate, this sort of thing, and that won't change as long as those making the laws won't bite too hard on the hands that feed them. So don't look for a workable political solution anytime soon.

I agree that this is a genie-out-of-the-bottle thing. And I do think the right thing to do is try to catch and remove/moderate posts like this.

But if you're going to do that, you MUST properly warn people of what they're getting into, and you MUST give them hazard pay for what is genuinely hazardous work.

I'm back to unions, I think.

By the way, I have no problem with your conscience, Liz, for using whatever's available to get your words out into the world.

It nags at me, because it's another thing I didn't do my homework on and didn't discover until it was too late to do anything about it. And it's a little unfair to blame the publisher, which had been assimilated earlier by a far more massive corporation, or the individuals working there, all of whom were not-at-all-Rupert-Murdoch-y.
 

Jack McManus

smoothopr8r
I agree that this is a genie-out-of-the-bottle thing. And I do think the right thing to do is try to catch and remove/moderate posts like this.

But if you're going to do that, you MUST properly warn people of what they're getting into, and you MUST give them hazard pay for what is genuinely hazardous work.

I'm back to unions, I think.



It nags at me, because it's another thing I didn't do my homework on and didn't discover until it was too late to do anything about it. And it's a little unfair to blame the publisher, which had been assimilated earlier by a far more massive corporation, or the individuals working there, all of whom were not-at-all-Rupert-Murdoch-y.

That was how I took your meaning, as regret for not doing the homework. My comment was intended as support; sorry that wasn't clear. Maybe I should self-ban from morning posting until after the coffee kicks in!

I support unionization also. Worker protections always come at a cost to the shareholders if not passed along to consumers, however. And smaller dividends don't seem a likely choice.
 

frimble3

Heckuva good sport
Super Member
Registered
Joined
Oct 7, 2006
Messages
11,693
Reaction score
6,605
Location
west coast, canada
But if you're going to do that, you MUST properly warn people of what they're getting into, and you MUST give them hazard pay for what is genuinely hazardous work.

I'm back to unions, I think.
I support unionization also. Worker protections always come at a cost to the shareholders if not passed along to consumers, however. And smaller dividends don't seem a likely choice.

Big deal if consumers pay more. Social media is a privilege, not a necessity.
It's like saying that without slavery the cost of cotton will go up.
People are more important.
I am a strong believer in unions, and these people need one! Preferably a big one, with teeth. It's a very spread out industry, with no particular workplace to picket, or to sign up people at, so you'd want experienced organizers and a union with national reach. (And, please, no management supported cowboy union, set up purely to be able to say 'they're unionized'.)

Because if the work is always going to be horrible, there are ways to support people:
extra breaks after bad calls, people to decompress with, hazard pay for taking a string of bad calls. Aside from way better pay, paid sick days and medical benefits, including psychotherapy so that you can go to the provider of your choice, not the company shrink.

I would assume that 911 call-takers have reasonable plans to base this on. Or suicide or abuse hot-lines?

In any case, sometimes an enthusiastic threat of unionization itself is enough to spur management to make changes.
 

Roxxsmom

Beastly Fido
One option is to ban anonymity on these platforms. The government can't restrict speech, but it can require that platforms collect and constantly confirm personal data on all posters with the stipulation it be kept private unless the person posts content that suggests they are doing something illegal. And we need to find a way to get law enforcement to take cyber threats, cyber stalking, and videos such as the ones described in the article much more seriously.

Threatening someone online should be illegal, as should posting videos depicting illegal acts (such as animal cruelty etc).

Platforms could certainly do a better job of having clear policies about things like speech that advocates harm against groups of people, even when it isn't illegal. They won't, though, unless people start cancelling their accounts en masse, and that's not likely to happen when doing so means losing access to something that lets you stay in the loop with friends and family.

I suppose the answer is to boycott social media, or, given that this is pretty hard to do in this era, to create a new network that allows people to connect with friends and family but has much narrower parameters about what is allowed. The problem is, the strength of the "dominant" social media platforms is precisely what creates the problem: people can use them to share pretty much every kind of content and to create social networks revolving around almost every possible interest, from groups of folks who want to talk about their dogs or grandkids, to highly political groups, to Russian troll farms, to people who (evidently) get their kicks from torturing animals and kids or murdering people and sharing it.

The trouble is, if I (or any of us) opted out of Facebook in favor of a much smaller network of closer friends and family (this is what my husband--one of five people in the US who still doesn't partake of any social media platform at all--wants), I'd lose the reach that allows me to connect with old childhood friends, or to have one place I can go to keep up with family, dog-training friends, old friends from high school or college, writer friends, and other people I know in different aspects of my life.

Oh my spouse gets smug whenever I talk about issues with FB or Twitter or whatever. "That's why I'm not on them," he says primly.
 

cornflake

practical experience, FTW
Super Member
Registered
Joined
Jul 11, 2012
Messages
16,171
Reaction score
3,734
One option is to ban anonymity on these platforms. The government can't restrict speech, but it can require that platforms collect and constantly confirm personal data on all posters with the stipulation it be kept private unless the person posts content that suggests they are doing something illegal.

By what mechanism could the government require that though?
 

lizmonster

Possibly A Mermaid Queen
One option is to ban anonymity on these platforms.


...Yeah, as someone who was anonymous in a few places in Usenet days, I can't say I trust the government (or any of these social media companies) to honor my privacy there. And there are too many situations where anonymity is the only thing that allows someone to be safe.

And we need to find a way to get law enforcement to take cyber threats, cyber stalking, and videos such as the ones described in the article much more seriously.

This, absolutely. Gamergate taught people nothing. Law enforcement still seems to be reacting to online harassment with a big ¯\_(ツ)_/¯. Some of that is certainly a learning curve, but a lot of it is the persistent feeling that a) online harassment isn't dangerous, and b) the people who get harassed somehow deserve it.

Platforms could certainly do a better job of having clear policies about things like speech that advocates harm against groups of people, even when it isn't illegal. They won't, though, unless people start cancelling their accounts en masse, and that's not likely to happen when doing so means losing access to something that lets you stay in the loop with friends and family.

Actually, I'd dispute that this is on the users. Twitter and Facebook don't make money off of you and me. It's the advertisers who need to get serious about this.

I suppose the answer is to boycott social media, or given that this is pretty hard to do in this era, to create a new network that allows people to connect with friends and family but has much narrower parameters about what is allowed.

I really think we're past this point.

Apart from properly paying and unionizing the people who have to review this stuff, it seems to me what we need is clear rules that are enforced quickly and uniformly. No labeling politicians' threats of nuclear war with "hey, this might violate our rules but it's coming from a Really Important Person so we're gonna show y'all anyway." If a post is good enough for *rump it should be good enough for me - and if I'd get suspended/banned over it, he should be, too.

Oh my spouse gets smug whenever I talk about issues with FB or Twitter or whatever. "That's why I'm not on them," he says primly.

He's probably happier for it. But that's a disingenuous reaction to the problem. The internet's not disappearing, and neither is social media. No matter what we do as individuals, as a society we're going to be grappling with this.
 

Xelebes

Delerium ex Ennui
Super Member
Registered
Joined
Aug 8, 2009
Messages
14,205
Reaction score
884
Location
Edmonton, Canada
...Yeah, as someone who was anonymous in a few places in Usenet days, I can't say I trust the government (or any of these social media companies) to honor my privacy there. And there are too many situations where anonymity is the only thing that allows someone to be safe.

Let anonymous situations be anonymous, but don't try to mix anonymous with non-anonymous.
 

Jack McManus

smoothopr8r
I support unionization also. Worker protections always come at a cost to the shareholders if not passed along to consumers, however. And smaller dividends don't seem a likely choice.

Big deal if consumers pay more. Social media is a privilege, not a necessity.
It's like saying that without slavery the cost of cotton will go up.

I would liken it more to 19th-century tobacco industry practices: worker abuse by companies making addictive products that are widely consumed with little to no gov't oversight.


I am a strong believer in unions, and these people need one! Preferably a big one, with teeth. It's a very spread out industry, with no particular workplace to picket, or to sign up people at, so you'd want experienced organizers and a union with national reach. (And, please, no management supported cowboy union, set up purely to be able to say 'they're unionized'.)

I am a union member in good standing, just saying. But even in a non-union shop, there is government oversight of worker safety and health (OSHA).

Because if the work is always going to be horrible, there are ways to support people:
extra breaks after bad calls, people to decompress with, hazard pay for taking a string of bad calls. Aside from way better pay, paid sick days and medical benefits, including psychotherapy so that you can go to the provider of your choice, not the company shrink.

I would assume that 911 call-takers have reasonable plans to base this on. Or suicide or abuse hot-lines?

In any case, sometimes an enthusiastic threat of unionization itself is enough to spur management to make changes.

Aren't the content moderators whose working conditions we're talking about contractors, and not full-time employees of FB? I think outside contractors are used to try to sanitize FB's image. They can say, "Look, we care about what users post on our network," without a serious hit to the bottom line. Organizing contract workers in an industry that exists for the sole purpose of making online socializing more convenient may be a lost cause, I'm afraid.
 

Kaiser-Kun

!
Super Member
Registered
Joined
Jan 12, 2009
Messages
6,944
Reaction score
1,915
Age
39
Location
Mexico
Facebook can afford better conditions for its moderators. Someone needs to do such a dirty job, and they need better working conditions for it.
 

Roxxsmom

Beastly Fido
By what mechanism could the government require that though?

That's the problem, really, that and trusting the government (or a private agency, for that matter) to honor privacy, as lizmonster said. Of course, we have never been anonymous in our daily offline interactions, and we are now at risk of being recorded or monitored everywhere we go, whether we are online or not. A student, co-worker, employer, business owner, municipality, or random stranger can record anything we do or say, in or out of context, and post it online. The courts have upheld the right of perverts to place cameras in public spaces "without reasonable expectation of privacy" so as to take pictures up women's skirts.

The competing interests of ordinary users, who generally have excellent reasons to want to be anonymous (so as not to be stalked and harassed on or offline by the trolls, criminals, and sickos), vs. said trolls, criminals, and sickos, are what's creating the problem. Or maybe it's even just the feeling that one is anonymous online that leads many people to say and do things they'd never say or do in front of people offline.

Am I in more danger from the things I say online than from things I say in person in a world where every move can be recorded? I'm also not sure we're all as anonymous on the web as we think. I really do think the cops could track down more cyber stalkers and harassers if they gave a shit and had the support to do so.

I just don't see a way we can protect free speech and online privacy for the decent majority without creating a refuge where truly evil people (and I know no other word for what some people post online) can run amok.

Asking advertisers to step up to the plate and pull their ads works sometimes, but not always. The public has only so much attention to focus on product boycotts or whatever, and there's a ton of money to be made by the highly targeted advertising on FB. I can write every company that shows up in my feed to say I will never do business with them as long as FB doesn't treat its contractors better or do a better job of enforcing its policies about videos depicting animal cruelty or whatever (the fact that the same videos keep tormenting moderators over and over after they've been flagged suggests that FB is not enforcing its own policies), but can I really follow through, when these ads are so targeted towards me and the things I tend to buy? Will that make a bigger difference than a mass exodus from FB (which would definitely result in a loss of ad revenue)?
 