I do think this needs to be broken out from the general AI thread, as one of the main problems with AI is the shitty things shitty people will do with it.
I’m leading with this article. Hydra would be envious.
Yeah, 37,000 people targeted for assassination via AI.
They added: “There were times when a Hamas operative was defined more broadly, and then the machine started bringing us all kinds of civil defence personnel, police officers, on whom it would be a shame to waste bombs. They help the Hamas government, but they don’t really endanger soldiers.”
The people they interviewed don’t sound any less coldhearted than the AI does.
There’s a funny case of “we don’t know what we’re doing, so maybe porn”. If they don’t know what they’re doing, I would have expected that opting out would mean you generate nothing at all.
I guess people don’t realize that these activities cost them money via electricity, unless they are leeching off their parents, school, etc.
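For a sense of scale, here's a rough back-of-envelope sketch of that electricity cost. Every number in it (GPU wattage, hours per day, electricity rate) is an assumption for illustration, not a measurement:

```python
# Rough estimate of what donating GPU time costs in electricity.
# All figures below are assumed values, not real measurements.
gpu_watts = 250        # assumed draw of a mid-range gaming GPU under load
hours_per_day = 8      # assumed time spent running workloads each day
rate_per_kwh = 0.15    # assumed electricity rate in USD per kWh

# kW * hours/day * days/month = kWh per month
kwh_per_month = gpu_watts / 1000 * hours_per_day * 30
cost_per_month = kwh_per_month * rate_per_kwh
print(f"{kwh_per_month:.0f} kWh/month, about ${cost_per_month:.2f}")
# → 60 kWh/month, about $9.00
```

Even with these modest assumptions it's a real monthly cost, which makes the "free" compute donation less free than it looks.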
On the Salad website, the company explains that “some workloads may generate images, text or video of a mature nature”, and that any adult content generated is wiped from a user’s system as soon as the workload is completed. This fuzzy generalisation around sexually explicit content creation is likely because Salad is unable to view and moderate images created through its platform, and it is covering its bases by marking all image generation as potentially “adult” in general.