Musk is doing the equivalent of an AMA on Twitter right now, and there are some statements from him that might get the interest of the FTC or the EU:
If the docs and e-mails contained, not just Twitter employee info, but info about customers or other 3rd parties, then handing it over to random outsiders may be against the FTC consent decree or EU regs. (We already know that this info dump exposed the personal e-mail address of US Congressman Ro Khanna.)
Also, just to confirm who Twitter is for these days:
Meanwhile (gifted article):
Surging Twitter antisemitism unites fringe, encourages violence, officials say
It’s getting harder and harder to look at the big follower-count accounts on Twitter as anything other than complicit.
Not saying you shouldn’t act, but I think it’s very important to keep in mind that these people don’t go away because they get de-platformed. It makes the platform nicer, not society itself.
And Twitter is just the one we’re focusing on this week.
Before I ditched Facebook, I decided to do a little test to see how well their moderation actually worked. I found this one neo nazi whose page was nothing but wall-to-wall racial hatred. His banner, his profile picture, 100% of his content was nothing but text, videos and images expressing really explicit racial hatred.
The only thing he had done to trick moderators, as far as I could tell, was swap the n-word for the word “baboon”. It was still perfectly clear what that word meant in context.
I reported somewhere around 20 of his posts, all of which were clearly in violation of Facebook’s moderation policy, and they all came back negative from the automated system. So I escalated it to humans, and of those 20 posts, all were left up. One video was blurred because of graphic violence.
I was already on my way out, but that just made it a little easier. We talk about Twitter a lot, but they’re not the only major commercial website helping to spread disinformation and hatred worldwide. It’s a feature, not a bug.
I would agree. The reason, though, in my view is not that Facebook for example wants a world full of Nazis and racists. It is because Facebook is a corporation, and corporations are by definition amoral. Despite the American fetish for considering corporate entities individuals for so many purposes, a corporation is a mechanism, not a being. It exists for one purpose and one purpose only: concentrating profit while distributing (or avoiding) risk.
When we see corporations (or private firms acting pretty much as corporations) enabling terrible things, just follow the money. Not in the “evil Nazis are paying them off” way, but simply in the mundane course of business. It makes them more money to have these things on their platforms than it does to boot them off. Pretty simple. It’s totally, utterly amoral from their POV, no matter how immoral it is in any rational calculus.
Zylon
3944
What a nonsensical sentiment. If giving horrible racists a huge megaphone makes society worse (which it does), then of course taking it away would make society better.
There have always been horrible racists in the world, and there have always been droves of fearful, easily-influenced morons. The more we can do to keep those groups apart, the better for society.
You’re not keeping them apart, and you can’t. You’re just depriving them of a tool.
That’s exactly the sort of wishful thinking I was cautioning against.
Hearing no evil and seeing no evil doesn’t mean it isn’t there. I know it’s mindblowing, but these guys were around long before anyone ever thought about social media.
De-platforming is a massively successful tool to reduce the reach of extremists. This has been clearly shown quite a few times in the past few years.
And I didn’t say you shouldn’t do it :)
Timex
3948
You are depriving them of a tool… That they use to influence society.
It’s good to deprive them of that tool.
antlers
3949
The problem with deplatforming in principle is that it is a favorite tool of fascists. Orban and Putin use it to marginalize opposition without the necessity of more overt repression.
KevinC
3950
Imprisonment is used by Putin as well, but I don’t know if that says anything about whether we should jail criminals or not.
Tools are tools; they are intrinsically amoral. The use of tools is a question of morality and ethics. At least, most of the time. I’m open to the idea that some tools are, for practical purposes, too tainted to use even for good, but really we don’t have too many One Rings running around.
antlers
3952
Your analogy points out exactly the problem: functioning democracies have incredibly involved and cumbersome legal procedures based on centuries of tradition to decide if someone can be imprisoned, while nothing like that exists for deplatforming. That’s why aspiring fascists can use deplatforming to suppress opposition while conforming to legal norms.
Depriving people of speech and depriving them the use of a service are not the same thing. Extremists still have freedom of speech and freedom of assembly, and I don’t think we should mess with that.
But the root causes of radicalization and extremism are significantly more complex than simply taking away a megaphone - feelings of marginalization, loneliness and low self-worth play a huge part - and those feelings will obviously increase and aid in their radicalization when those people are rejected.
There’s not much we can do about that. The good of de-platforming clearly outweighs the bad. But that’s not going to do much to combat extremism in general.
For one thing, Twitter and Facebook aren’t going to do a very good job as long as they’re protected by 230, but also, people are still going to be radicalized by watching Tucker Carlson, or listening to politicians, or reading the Daily Mail, or watching crap on YouTube, or playing on Steam.
I think the article linked by @vinraith covered the bases pretty well, although it didn’t say very much about what we can actually do to counteract radicalization. Not calling radicalized people morons would definitely be a start.
Would you accept “people of the land, the common clay of the New West”?
Alstein
3955
I think such moderation problems are inevitable with any corporate social media with a profit motive that gets to sufficient scale.
Zylon
3956
You’re essentially making the same terrible argument as right-wingers who claim that gun control laws are pointless because sufficiently determined criminals will always be able to obtain guns. That any solution that isn’t a perfect solution isn’t worth bothering with.
Yes, there are root causes that make easily-influenced people of the land more likely to turn to extremism. But we’re pretty much stuck with those root causes. There are always going to be people dissatisfied with their life, or hating change. But as long as these people are scattered and disorganized, and not getting riled up even more than they already are, they’re manageable.
It’s when these people get gathered up and pointed in the same direction that we end up with things like Jan 6 happening. And that’s why deplatforming the people who try to exploit and encourage extremism will always be a net win, no matter what the platform.
No, that’s just the way you decided to read it.
We’re stuck with the concept of extremism, but extremism does not evolve into a movement in a vacuum. Believing it does prevents us from understanding the problem and doing something about it.
I wasn’t trying to say you shouldn’t de-platform extremists, I was trying to say that if you think that solves the problem, you will be disappointed.
I’m from Europe, but previously I’ve lived in Mississippi and Alabama among the poorest people I’ve ever seen in my life. Most weren’t wealthy to begin with, but first they were hit by a hurricane that destroyed everything near the coast, then they were hit by an economic collapse that devastated the local economy.
Most of those people are wearing red hats right now, and it has nothing to do with social media. They were worn out and pissed off.
If they could afford their groceries, and if they had been armed with a decent education, a lot fewer of them would be at those rallies pumping their fists.
Pretending that we can’t do anything about that is just bullshit. Lacking the will is a different thing entirely. It’s easier to insult them anyway, and I bet it goes down great in the ol’ echo chamber.
It’s a nice thought, but they’re not scattered and disorganized. You’ll need to do better.
So do people who can’t afford their groceries inevitably become crazed right-wingers? I don’t see the connection. I think it’s more likely the culture they grew up in because there are plenty of people just as poor who are not crazy right-wingers.
It may not be entirely due to social media, but social media seems to reinforce how they feel because social media creates echo chambers for every persuasion.
DoubleG
3959
People with higher education levels are less likely to vote Republican, but it’s the opposite for higher income levels.