Social media controls the world


#1287

Twitch


#1288

I really like YouTube. I actually like the algorithm when it’s working, surfacing things like exercise videos, Hearthstone decks, and music videos; I find lots of good stuff because of it. But they need to police their users.

I don’t know the answer other than a forced manual review of each uploaded video. At a bare minimum, anything with a kid in it or targeted toward kids needs to be reviewed.


#1289

Manual review is impossible. Roughly 400 hours of new video are uploaded to YouTube every minute; that’s about 16.67 days’ worth of footage arriving per minute.


#1290

No problem, they “just” need to hire 3000 people to each watch 16 uploads at the same time for 12 hours per day and screen them perfectly every day for the rest of time.
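For what it’s worth, the sarcastic math actually checks out. A quick sanity check in Python, using only the 400 hours/minute figure quoted above:

```python
# Sanity-checking the upload numbers quoted above.
UPLOAD_HOURS_PER_MINUTE = 400  # rate cited earlier in the thread

# 400 hours of footage per minute, expressed in days of footage
days_per_minute = UPLOAD_HOURS_PER_MINUTE / 24
print(f"{days_per_minute:.2f} days of video per minute")   # 16.67

# Total footage arriving per day
hours_per_day = UPLOAD_HOURS_PER_MINUTE * 60 * 24
print(f"{hours_per_day:,} hours of video per day")         # 576,000

# The sarcastic staffing plan: 3000 reviewers, each watching
# 16 streams at once, 12 hours a day
review_hours_per_day = 3000 * 16 * 12
print(f"{review_hours_per_day:,} review-hours per day")    # 576,000

assert review_hours_per_day == hours_per_day  # the joke is exact
```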


#1291

This reminds me of the little town on the Washington coast near where my mom lives. It’s a beach/tourist spot during the summer. Well, a couple of years back, the local go-kart track was shut down: the owner was arrested for it being the center of a huge local drug ring. That had been vaguely public knowledge for years. But lots of locals were pissed, since the closure would depress the tourist trade for the summer. Teenagers had died ODing on heroin. But the money was more important. Fuckers.


#1292

There’s a huge great white shark terrorizing the beaches. But we can’t close them. We need the tourist money.


#1293

Sometimes it’s great, but as someone mentioned here, iirc, eventually those algorithms will lead you to insane racist shit. Invariably. Like how Pandora always led to Dio for whatever reason if you left it alone long enough. You could be listening to classical music, start doing something else, and come back to Dio. No matter what, all music led to Dio.

At the end of the day, YouTube refuses to moderate anything. That takes money, and they’re not willing to spend it. They and Facebook are in the same boat in that regard: multi-billion-dollar companies that refuse to spend money monitoring anything. They have reporting features that they ignore or hand off to bots/algorithms instead of keeping a human being in the loop, so those features are easily exploited or fail horribly.


#1294

It happens with all manner of extremism. Watch a medical video, and it starts recommending videos that teach you how to reverse your heart disease by eating only ginseng and grout; watch a child development video and you get anti-vaxx stuff; watch space science and you get flat-earth; watch investing and precious metals and you get the Rothschilds and the global banking elites, etc.


#1295

Exactly. That’s a failing of the algorithm that they refuse to address. It drives everything to the extremes and/or to the big channels.
It makes me think of the uselessness of Steam’s “discovery queue,” where literally everything on the list is there “Because it is popular.” Yeah, thanks for the individualized discovery queue, Steam. I’ve never shown an interest in any of these things, but they’re popular, so just shit them out for me over and over again; maybe I’ll give in.


#1296

It’s not a failure of the algorithm, because they don’t see it as a problem; indeed, making you watch as much as possible is their business model. The social cost doesn’t factor in.
{Obligatory Zeynep}


#1297

That sounds like it’s working perfectly, imo.


#1298

Well to be fair, he is the last in line.

The YouTube thing is a tough issue to solve. There is no way you can manually moderate the amount of content coming in, much less the comments on that content. Algorithms to police that sort of thing can only do so much: if you tighten them way up, you catch flak for auto-suspending people whose content or comments didn’t necessarily deserve it, but if you keep them slack, you let idiots get away with lots of idiocy.
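A toy sketch of that tension (all scores and labels below are invented for illustration; this is not how any real platform scores content): a single cutoff on a classifier score just trades one kind of error for the other.

```python
# Toy illustration of the moderation-threshold tradeoff.
# All scores and labels are made up; no real system is modeled.

# (classifier_score, actually_abusive) for a batch of uploads
uploads = [
    (0.95, True), (0.80, True), (0.65, False), (0.60, True),
    (0.45, False), (0.40, True), (0.20, False), (0.10, False),
]

def outcomes(threshold):
    """Count wrongful suspensions and missed abuse at a given cutoff."""
    false_suspensions = sum(
        1 for score, abusive in uploads if score >= threshold and not abusive
    )
    missed_abuse = sum(
        1 for score, abusive in uploads if score < threshold and abusive
    )
    return false_suspensions, missed_abuse

for t in (0.3, 0.5, 0.7, 0.9):
    fp, fn = outcomes(t)
    print(f"threshold {t:.1f}: {fp} wrongful suspensions, {fn} missed abusive uploads")
```

Lower the cutoff and wrongful suspensions climb; raise it and abuse slips through. No threshold zeroes out both columns, which is exactly the flak-versus-idiocy bind described above.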

Social media platforms were meant to be self-policing: the users of any platform would be the ones reporting violations of terms and objectionable content. The problem with that is twofold: first, there is way more content than anyone ever dreamed there would be, and second, people have become shittier in general as social media has matured. It’s hard to self-police for douchebags when the very nature of your service allows people to act like douchebags without consequence.


#1299

I mean, it isn’t really. The issue here isn’t just the content (though that is a huge issue); it’s that this content is being monetized. They are approving all of these accounts for monetization. They can’t really control who uploads videos, but it would be orders of magnitude easier to control how content gets monetized. Say, a stricter vetting process? Some transparency in the decision-making, etc.?

They can never control what is being uploaded and who is commenting, but they certainly control how ads get served on their platform. Even if these videos had been found, it would have been a really bad look for YouTube, since it’s clear there are child predators using the site for nefarious purposes, but it wouldn’t have been nearly as bad if the videos weren’t being monetized. This is why the advertisers left. They had been assured their ads wouldn’t be served against questionable content, and that is exactly what happened anyway.

YouTube’s standards for ad serving and monetization are completely and utterly broken. For instance, I can’t monetize content on YouTube because in 2009 my friend clicked on a bunch of Google ads on my website, and that was seen as cheating their system. I didn’t direct my friend to do this, but since he was helping me work on the site, I have to guess his IP address looked suspect for clicking on the ads. That got me a lifetime ban, yet some pervert re-uploading a video of a 10-year-old girl doing gymnastics can get money from Disney and Nestlé for serving ads. Fucking disgusting.

I am not saying this out of any animosity toward Google for what happened to me, but it is obvious that they basically have no clear reasoning or policies surrounding what content can be monetized and who can get paid. Their house is not in order.

And I want to say there is a special place in hell for the people criticizing those bringing this issue to light because it might cause “ad-pocalypse 2” and hurt the channels they like.

Shit is broken, and YouTube deserves to eat mountains of shit for this.


#1300

That’s pretty messed up. Many of these companies already have a means to address this… through their reporting functions, which they outright ignore or don’t support. They don’t have to monitor everything that is uploaded.

I thought there was an article floating around somewhere, from one of their employees, that had great suggestions on how to reduce this stuff, and it was shot down. I’ll probably never find that again.


#1301

And their response was basically to fire up some bots that declared everything not ad-friendly. Channels about HEMA or ancient weapons got flagged for violence or whatever. Every video game got flagged because it might have guns or someone might “kill” something. It was, and still is, a clusterfuck.

I mean, it is possible, but at the same time the first ad-pocalypse never really ended anyway, so who cares at this point? The damage has been done, and it didn’t solve the problem, so fearing that this will cause damage that is already happening is kind of a questionable position.


#1302

Speak of the Devil:


#1303

#1304


The moderators told me it’s a place where the conspiracy videos and memes that they see each day gradually lead them to embrace fringe views. One auditor walks the floor promoting the idea that the Earth is flat. A former employee told me he has begun to question certain aspects of the Holocaust. Another former employee, who told me he has mapped every escape route out of his house and sleeps with a gun at his side, said: “I no longer believe 9/11 was a terrorist attack.”


#1305

“He who fights with monsters should be careful lest he thereby become a monster. And if thou gaze long into an abyss, the abyss will also gaze into thee.”


#1306

They need to figure out how to rotate these responsibilities and never leave them with one person for too long.