
/r/Technology

2535
Buffalo mass shooting victims sue social media platforms that they say contributed to the massacre (13newsnow.com)
submitted 1d ago by esporx
thaiatom 154 points 1d ago
Is plotting and bragging about mass murder free speech until bullets hit flesh?
probono105 61 points 1d ago
A call to violence is not, no.
InsuranceToTheRescue 29 points 1d ago
The courts have decided that it needs to cause clear and immediate danger to someone for it to no longer be protected. Which is a hard bar to clear, legally, even when you know someone is the cause.
notcaffeinefree 9 points 1d ago
FWIW, "clear and present danger" hasn't been the legal test since 1969. It's now "imminent lawless action", which is a higher bar than even "clear and present danger".
InsuranceToTheRescue 3 points 1d ago
I didn't know that. What a ridiculous test. How the fuck do you prove that someone's speech caused imminent lawless action?
Psychological-Sale64 1 points 19h ago
Then they need to hear it themselves.
Nothing like the visceral for a reality check, no matter how clever or convoluted you are.
steeljunkiepingping 7 points 1d ago
Yes and no. You can’t be direct about it but you can say “punch a Nazi” or “punch a commie” and that’s not illegal to say.
GaneshaWarrior 5 points 1d ago
Exactly. Defending violence against a certain group as a general statement is not illegal under US law. It may be against the TOS of social media companies, but it is still protected by the 1st Amendment because it is treated as an opinion.

However, a much different thing is a *threat* of violence. Directing an intent to kill or harm, in any way, a specific individual or group of people at a designated time and place is actually illegal and will get you in trouble.

So in this example, a person saying "I think Nazis should be punched" is not illegal. But saying "Punch this Nazi guy when you see him at the next rally" - that is a threat of violence - it's illegal because it's plausible it can happen.
(additional comments not archived)
CGordini 0 points 20h ago
Amusingly, Reddit will hard ban you for threatening to punch Nazis.

Ask me how I know.

Just another fun example of Reddit Administration hard at work - a reminder that Spez himself was a The_Donald member.
(additional comments not archived)
thaiatom -13 points 1d ago
If you know someone is threatening violence, what responsibility do you have to intervene? I think you have some moral and legal responsibility to do something or alert the proper authorities if you know someone who is talking about planning a violent attack. It’s like school officials finding a bomb threat written on a bathroom wall and deciding they don’t have to tell the students because a bomb hasn’t gone off yet. Do the school officials share some responsibility if a bomb does go off and they didn’t evacuate the school or call the police?
HappyLofi 9 points 1d ago
Yup. If you're aware someone is a psycho and potentially plans to do something like this you are 100% obligated to report it. If you don't you're straight up a piece of shit. I don't give a fuck what anyone says about that.
thaiatom 1 points 1d ago
Exactly. Yes.
Fuzakenaideyo -7 points 1d ago
Why have people downvoted you?
Djamalfna 9 points 1d ago
Because it's a stupid question that was answered by society centuries ago.
HardlineMike 299 points 1d ago
I feel for them but this smells like an opportunistic lawyer preying on them. Suing the entire internet isn't really gonna work out.
Ashmedai 27 points 1d ago
I don't see how they don't get blocked at the door by a challenge to their standing to bring the case at all. "Section 230, motion for summary dismissal," I would think?
RealClarity9606 35 points 1d ago
Agreed. It seems like suing the phone company because someone called an accomplice before committing a crime.
DukeOfGeek 38 points 1d ago
Too bad they are not allowed to sue the law enforcement agencies who had significant forewarning of this crime.
McMacHack 21 points 1d ago
Imagine if Law Enforcement was actually required to serve and protect the public instead of special interest groups
PM_ME_YOUR_BEAMSHOTS 1 points 1d ago
> special interest groups

The rich ride the short limousine
24-Hour-Hate 4 points 1d ago
It’s not quite the same. Some of those social media companies deliberately use algorithms to push and promote conspiracy theories (encouraging the consumption of increasingly extreme content) because it increases engagement and therefore profit. There is an argument that this goes beyond being a mere platform. I’ve seen this on YouTube myself. I am not even remotely right wing, but that trash gets put on my home page to see if I’ll bite. And if I do click something, because I don’t realize what it is or just by accident, they immediately push more at me, and more extreme stuff. Tell me that isn’t deliberate. And in my view something needs to be done about it.
Aeri73 9 points 1d ago
No, it's more like suing a teacher for indoctrinating a bunch of kids to become suicide bombers... the algorithms push people to extremes and make them more extreme in their beliefs. This is known, and Meta and Google should know that. Now the plaintiffs have to prove it, but if they can...
TomBirkenstock 20 points 1d ago
If you can show that the algorithm is promoting hate speech that contributed to this, then this might go somewhere. For some reason people still think of the internet as a neutral platform.
Aeri73 3 points 1d ago
The internet is neutral... it's the algorithms that make Facebook and Google work differently... there it's the algorithm that decides what you see, what gets pushed to the top... and they've learned that extreme posts get more views, more clicks, more engagement... make them more money...

so they chose to keep using them, knowing full well how they work and what they do... it's been proven over and over.

those choices should have consequences
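(For illustration only: a minimal, hypothetical sketch of the kind of engagement-weighted ranking described above. The weights and field names are made up; this is not any platform's actual code.)

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    clicks: int
    comments: int
    shares: int

def engagement_score(post: Post) -> float:
    # Hypothetical weights: interactions that keep people on the site
    # count for more. Nothing here looks at what the post actually says.
    return post.clicks + 2 * post.comments + 3 * post.shares

def rank_feed(posts: list[Post]) -> list[Post]:
    # The feed is ordered purely by predicted engagement, so if outrage
    # reliably drives clicks and shares, it rises to the top by design.
    return sorted(posts, key=engagement_score, reverse=True)
```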
RealClarity9606 10 points 1d ago
The algorithms may have flaws, but ultimately the individual is responsible for their behavior. The intent of the algorithms is engagement, not to turn someone into a mass murderer. Plus, millions are on social media, but, despite the coverage that makes it seem like more, the number of such shooters is an infinitesimal portion of the user base. The factors that lead someone to commit such acts don’t stop with “they saw a dumpster fire of content on social media.”
Aeri73 1 points 1d ago
that's the same argument they use against gun reform...

and it's denial of the influence of these algorithms on our behaviour and thoughts...
Djamalfna -1 points 1d ago
> but ultimately the individual is responsible for their behavior.

Charlie Manson tried to use that argument. It didn't go well for him.

You're confusing "individual responsibility" with "sole responsibility".

The shooter is responsible for his actions. That's why he's been charged and sentenced.

But when there's people or organizations that are fomenting a psychological state which constantly and continuously tells the person that they must commit horrific acts if they want to act morally responsibly, then the fomenter must bear partial responsibility for creating that situation.

Your beliefs are overly simplistic and ignore a whole host of thoughts and philosophies. Luckily for us, society and the law disagree vehemently with you.
gcapi 2 points 1d ago
Not entirely, since with a teacher you know the exact source of where things are coming from. This feels like Timmy in The Fairly OddParents saying "uhhh the internet" whenever his parents asked where he got something. What you're saying is closer to suing a teacher because one of their students was convinced by other students to become a suicide bomber.
davewashere 1 points 1d ago
That's probably the right approach to take. Have there been discussions between higher ups at social media companies about extremist content getting greater engagement numbers and what problems that could cause with the algorithm pushing content with high engagement numbers? If you can find a few people who will testify that those conversations did happen and that the company ultimately decided to ignore the potential dangers, there might be a case. That's easier said than done. Most people in positions of power at these companies are not going to stir the pot by pointing out a problem that they can't be entirely sure exists and that the company is not going to spend resources studying because the results of those studies would almost certainly have negative outcomes for the bottom line.
Aeri73 1 points 1d ago
Those testimonies are already there... https://www.youtube.com/watch?v=X6lEkO6NRAM
n3w4cc01_1nt 40 points 1d ago
idk, Fox News seems to make people insane, and then they end up generating a lot of revenue off the clickbait from their extremist attacks.
HuXu7 -77 points 1d ago
Right, remember the trans man who shot up an elementary school? They probably watched too much Fox.
B1ackFridai 43 points 1d ago
One trans man out of thousands of cis men mass shooters. You really got ‘em! 🙄
Stolles -14 points 1d ago
Was it the first trans mass shooter or the first woman mass shooter? Which aisle is taking that one?
dark_brandon_20k 12 points 1d ago
You're forgetting the guy with Nazi tattoos who watches Tim Pool
Ragnarok3246 15 points 1d ago
Right! Remember the thousands of mass shootings committed by trans people? No? Then we have a clear distinction between right-wing extremism driving mass shootings, and a few people who commit a mass shooting due to various reasons. Fuckhead.
JRepo 4 points 1d ago
What point are you trying to make here?
Bertrando1 -7 points 1d ago
The point they’re trying to make is probably that anyone can be a shooter, not just right wing nut jobs.
DanielBrian1966 9 points 1d ago
Ben Crump is definitely a black ambulance chaser. I used to cheer for him and then I noticed he was the go-to lawyer for most high-profile crimes involving black victims. Now it seems like he represents every single one.
throwaway11111111888 -1 points 1d ago
Gun manufacturers are sued all the time. Why not internet companies?
Psychological-Sale64 1 points 18h ago
The light of day: archaic laws are not keeping up with technology and human nature, or with the effects of living in a social setting.
Give it time and Bitcoin will collapse; technology has missed some subtle aspects of the collective.
Time will tell. Think of the concerns around AI.
Riots, poverty, why?
Because people will have no purpose or structure.
No one is totally free.
ApatheticWithoutTheA 24 points 1d ago
That will never work and that attorney is a scumbag for giving them hope that it will.

The laws protecting social media companies are very clear.
Eldias 7 points 1d ago
It sounds like they're suing the manufacturers and distributors of the armor, magazines, and weapon he used too. Really going for the "Sue them all and let a judge sort it out" tactic.
CosmicBoat 2 points 1d ago
Going to end up like $1. Sad that it happened to them, but they're being used as a tool by others; they'll be abandoned once they have no more use.
locri 69 points 1d ago
Unless they can prove the social media platforms incentivised or intentionally created an environment for the extremism that caused this, I don't think this should work and it would be a bad thing if it does.

It is a good thing if only individuals are held responsible for that single individual's behaviour; anything else creates a tangled mess of duty of care, stochastic violence, etc., etc.
SoundTracx 35 points 1d ago
I think there’s more than a handful of psychologists and neuroscientists that are going to jump in on this. There are so many studies already done on social media and its effects on society. They are the first stepping stone.
LiveLaughLobster 41 points 1d ago
Yeah for a long time people thought suing the tobacco industries for how they marketed cigarettes was a crazy idea bc “of course” people made their own choice to smoke. Through litigation the parties were able to obtain documents that showed that they very literally were doing everything they could to make people get addicted. I think there’s a good chance some social media sites are just as bad as tobacco companies were.
smiley_coight 16 points 1d ago
sounds a lot like the algorithms used by Facebook, tikcrap, insta et al. to generate "engagement". Social media addiction wouldn't be very different to tobacco addiction.

I personally would love to see all of the social media sites gone from the internet.
RichardSaunders 4 points 1d ago
including this one?
jaam01 2 points 1d ago
The difference is that social media and search engines are protected by section 230 according to Twitter, Inc. v. Taamneh and Gonzalez v. Google LLC.
DefendSection230 2 points 1d ago
>Yeah for a long time people thought suing the tobacco industries for how they marketed cigarettes was a crazy idea bc “of course” people made their own choice to smoke.

Yeah, because from the 1930s to the 1950s, patients were prescribed cigarettes by their doctors, as cigarettes weren't looked at as being as dangerous as they are now. This was because tobacco brands hired throat doctors to explain that dust, germs and lack of menthol were to blame when it came to illnesses, not cigarettes.

Are you trying to say that websites are doing the same kind of thing?
LiveLaughLobster 5 points 21h ago
I’m actually thinking more about the 1980s tobacco lawsuits, where they found out companies were marketing to children to get them addicted early, or even sometimes giving kids free cigarettes to get them hooked.

I don’t know if social media websites are doing anything similar. It will be the plaintiffs’ job to prove that to a jury and I think they deserve the opportunity to try.
NoFunny1739 2 points 1d ago
So...what is the solution here?

Do we outlaw any kind of algorithmic pushing of data to consumers?

How does this work?

All advertising for all time has had an "algorithm" to make sure it reaches the target demographic that might buy the product.

In the 1920s you would put your ads in the papers and magazines that the people you thought would buy your product bought.

In the 1970s you would make sure your TV advertisements ran at the right days and hours of the day to reach your target demographic.

Imagine if, by law, advertisements had to be shown *at random*. So, you get a commercial for Tampax during Saturday morning cartoons, and a commercial for the new Transformers toy during afternoon soap operas.

Who is going to buy advertising when you can't be even remotely sure that your target demographic is going to ever see it?
Archaris 2 points 1d ago
This city is known for making 'throw away' pieces of chicken cost more than a dollar per piece. I think these lawyers are going to get their money's worth.

And it's about damn time. If Texas/Florida can sue social media for 'wokeness' without evidence, Buffalo NY will WIN HARD with their case against social media on 'basic human decency' alone.

It'll take some time, like all good LEGAL measures, so expect a hard-earned fight come 2026 (or 2029 if it goes to a Trump-appointed Federal "Judge" who has literally had ZERO experience in an active courtroom before being nominated and voted to a lifelong position as Federal "Judge").
mdk2004 -7 points 1d ago
There isn't a product in the world that doesn't have some kind of upstream or downstream negative impact. You have to accept personal responsibility if you want to live in a free society.
marvbinks 1 points 1d ago
When will people realise America isn't a free society?
Eldias 1 points 1d ago
Please elaborate, I could use a good laugh today
FrogStork 0 points 1d ago
> There isn't a product in the world that doesn't have some kind of upstream or down stream negative impact

Well yeah, ever heard that there's no ethical consumption under capitalism?
cultured_banana_slug 1 points 13h ago
The internet has allowed the village idiots to unite and think they can run the world.
Luname 0 points 1d ago
Yep.

This won't work for the exact same reasons that suing gun manufacturers always fails.
ResilientBiscuit 1 points 22h ago
The reason gun manufacturers can't be sued is because there is a law that specifically calls out that they cannot be sued.

No such law exists here.
Bardfinn 1 points 1d ago
They’re suing Reddit — which will be fruitless; I read the shooter’s profile before Reddit pulled it, and there was one (1) comment he made in one (1) hate group subreddit that was a sign that he was anti-Semitic, and a lot of comments he made about body armour, but no comments or posts he made on Reddit which indicated violence.

They’re also suing the company that owns 4chan, and they might get more traction there. 4chan hosts and radicalises Racially or Ethnically Motivated Violent Extremists, Ideologically Motivated Violent Extremists, and other violent extremists. The “moderators” should be serving prison time for aiding & abetting. The owner, too, probably.

They might get something from Discord, since he had a private Discord server, and Discord could have had a policy about red flags in private servers, turning those over to law enforcement. Who knows.

The rest of the entities they’re suing will either move to dismiss or offer a no-fault settlement.

If they can somehow prove that his participation on Reddit was significant in his radicalisation, when the NYAG office found nothing actionable in their investigation, then kudos to them. The people he claimed to be radicalised by on Reddit are violent neoNazis and they keep coming back like herpes.
CyberBot129 9 points 1d ago
Still seems like their suit will fail on Section 230 grounds and/or not meet the standards set by the Supreme Court in Gonzalez v. Google and Twitter v. Taamneh
Bardfinn -1 points 1d ago
Section 230 protects from liability those who *take action* to take down objectionable material.

It does not shield from liability those who, through studied *inaction*, aid & abet terrorism and crimes.

When an entity has red flag knowledge of criminal activity on the part of another where the other is seeking to involve that entity, that entity has a duty to separate itself from the efforts of the other at the earliest reasonable junction.

We literally jail people for driving other people to stores, knowing the other person has a firearm and a tendency to rob stores.

We can jail people for running forums that aid & abet mass murders.
chowderbags 5 points 1d ago
Taamneh didn't really get into 230 protections. The issue there ultimately ended up being whether or not there was a sufficient nexus between the website and the terrorist actions, and without evidence that the attacker used the website to plan the attack, it's not going to clear the bar for "aid and abet".
DefendSection230 4 points 1d ago
>It does not shield from liability those who, through studied inaction, aid & abet terrorism and crimes.

Yeah, that's not a thing.

230 leaves in place something that law has long recognized: direct liability. If someone has done something wrong, then the law can hold them responsible for it.
TheDeadlySinner 0 points 18h ago
> Section 230 protects from liability those who take action to take down objectionable material.

That is a lie. Their only requirement is to take down *illegal* material if they know about the specific material in question.

> It does not shield from liability those who, through studied inaction, aid & abet terrorism and crimes.

Where is the proof that any of these websites knew about the specific posts of this individual, and then took no action against them specifically to help him?

P.s., posting about criminal activity is not illegal, nor are websites required to take those posts down.

> When an entity has red flag knowledge of criminal activity on the part of another where the other is seeking to involve that entity, that entity has a duty to separate itself from the efforts of the other at the earliest reasonable junction.

As I said, posting about criminal activity is not illegal. Only illegal posts must be taken down.

> We literally jail people for driving other people to stores, knowing the other person has a firearm and a tendency to rob stores.

The fuck does that have to do with anything?
colonel_beeeees 0 points 1d ago
I think duty of care is messy, and Zuck et al. are glad we're not holding them to account for the power they've chosen to wield and abuse
probono105 0 points 1d ago
I agree, but we know it's causing issues, so what then? Just let the bad stuff happen because dealing with it makes things too complicated? Just wait and maybe it will sort itself out on its own? Prevention is 95 percent of everything we do.
jaam01 6 points 1d ago
SCOTUS already issued rulings about this in 2023: Gonzalez v. Google LLC and Twitter, Inc. v. Taamneh. This is destined to fail.
greentoiletpaper 3 points 1d ago
I feel for them but they have no chance. IANAL, but this sounds nearly identical to

$1

And

$1
sokos 2 points 1d ago
If you really want to blame someone... blame the people for making shit like this exceptional news and thereby buying fame and notoriety for those who commit these acts.

It's not Tide's fault, nor TikTok's, that the Tide Pod challenge became popular. It's people's fault for being stupid and spreading it.
Stillwater215 2 points 1d ago
Like they say “the only thing that stops a bad guy with a gun, is cutting his access to social media websites.”
lavarotti 2 points 18h ago
Imagine you own a paper factory and you get sued by stupid parents for contributing to the bullying of their kids because the bullying arrived by post ;-)
airheadjace 1 points 8h ago
Imagine making a post on Reddit thinking you’re being cute but missing the point entirely 🤣
Tohuwaboho 4 points 1d ago
Social media contributes to more hate and murder all around the world.
They should finally be held accountable for their hate-promoting algorithms.
Error_404_403 8 points 1d ago
Do they also sue the company that built the building where the massacre took place? How about cell phone companies that made it oh so easy for criminals to communicate? And cell phone manufacturers?

Where do we stop???
Eldias 1 points 1d ago
While I think it's a dumb suit, there is some reason to the "Sue them all, let a jury/judge sort it out" litigation strategy
Error_404_403 3 points 1d ago
No, there is no merit in that strategy. More than that, frivolous lawsuits are considered a crime as they consume valuable resources and waste taxpayers' money.
Imasniffachair 1 points 22h ago
I mean it's not a good strategy, but I can see the rationale behind it and it doesn't have to be "we want money"
NawImGoood -1 points 1d ago
Agreed. It’s terribly sad these people died, however these suits are nothing more than a money grab & it just looks bad. Perhaps the families are completely devastated & not thinking clearly due to the loss of loved ones. Still though, this seems like an act of desperation to get rich quick. If they’re suing all these social media companies, why not sue the entities that created & manufactured the clothing the killer wore? Why not sue the school district the killer went to for not teaching the killer not to grow up to be a killer?
Eldias -1 points 1d ago
I'm a black-hearted cynic, but I can believe that these are people who are in anguish and want to do anything to stop a tragedy like the one they experienced from happening to anyone else.
NawImGoood 4 points 1d ago
By suing every random social media company they could think of off the top of their head? Why not sue every company that has a phone app? Anyway, they're obviously going to lose these suits as they have no merit whatsoever. It's a sad situation, but yes, this is a money grab
Error_404_403 4 points 1d ago
But to sue everything left and right is not the way to go about it.
Fit_Earth_339 3 points 1d ago
I’m glad that irresponsible media throwing shit out for ratings are finally seeing consequences.
jgilbs 2 points 1d ago
They should also sue the state dept of transportation that built the roads that allowed him to get there.
B0nkMyKn0b 1 points 1d ago
A bit flimsy, I'd say
breezyfye 2 points 1d ago
Didn’t the shooter post in /r/PoliticalCompassMemes ?
Bardfinn 13 points 1d ago
He posted an anti-Semitic post in r|4chan, a comment in r|AskReddit about how it’s a privilege to be white, a bunch of comments in tactical gear subreddits, a bunch of posts in a precious metals trading subreddit trading silver coins (which is how he claimed to finance his gear purchases), and a racist comment on a video in r/trashy about racist comments made by other high school kids. And a comment endorsing anti-Semitic hate speech on PCM.
synae 9 points 1d ago
That all adds up, I have no further questions
PopeKevin45 2 points 1d ago
This is what is needed... the threshold for holding online hate mongers accountable needs to be lowered to account for the ease, scope and trifling cost that these scumbags enjoy... a juggernaut that honest stakeholders have no defense against. The requirement that there be a direct link between the hatemonger and the victim should be reduced to merely proving that the hatemonger's message and the beliefs of the victim's attacker are in the same vein. The hatemongers' greatest allies, the 'freedumb' crowd, have been effective in preventing online hate and disinformation from being removed, so this seems the next best thing.
Reble77 -5 points 1d ago
None of this would have happened if he hadn't had lawful access to military-grade weapons
NoFunny1739 7 points 1d ago
The United States Constitution expressly protects access to firearms suitable for military use. The Supreme Court in US v. Miller held in 1939 that the Second Amendment protects *only* weapons suitable for military use.
throwaway11111111888 4 points 1d ago
The weapon he used wasn’t military grade. Do some research.
dhskiskdferh -5 points 1d ago
Anyone can make them at home now, gun control has been rendered ineffective by 3d printers
Eldias 2 points 1d ago
It's not going to be until after an Assault Weapon Ban is ruled unconstitutional that gun control proponents start looking at 3d printed arms and ammunition control
dhskiskdferh 2 points 1d ago
I am looking forward to that day
Eldias 5 points 1d ago
Keep your eyes out for the California case of *Miller v. Bonta*. The judge asked the State for their "all stars" of analogous laws to justify the Assault Weapons Ban in California. The State submitted something like 900 laws from across the US that they claim are close fits. That was submitted *five months ago*.

I suspect the delay is due to an extremely thorough analysis of each law. Hopefully when the ruling finally comes it will dismantle not just California's ban, but lay the groundwork for every AWB to fall across the country.
arond3 1 points 1d ago
There was a journalist who went to a yearly event for 3D-printed firearms, and a lot of them stop working after a few rounds and still use normal weapon parts because it's too hard to print them ^^
cancerlad 0 points 1d ago
Please educate yourself and check out r/fosscad
dhskiskdferh 1 points 1d ago
They’ve gotten much better over the years; they last hundreds of rounds now. Arguably infinite if using more metal parts
ideal-ramen -6 points 1d ago
That explains why there's so many mass shootings in Australia /s
dhskiskdferh 10 points 1d ago
“3D-printed guns are on the rise in Australia. How can we prevent them being made?”

“Last week, police seized about 80 illegal firearms across Victoria. These included eight homemade firearms, four of which were “military-style weapons”, as well as two 3D printers.”

https://theconversation.com/3d-printed-guns-are-on-the-rise-in-australia-how-can-we-prevent-them-being-made-193936

“3D-Printed Firearms Are Illegal in Australia, But They’re Still Being Made”

https://gizmodo.com.au/2022/11/3d-printed-firearms-are-illegal-in-australia-but-theyre-still-being-made/

Point is, criminals can always now get/create guns
ideal-ramen -6 points 1d ago
So why aren't there mass shootings in Australia like in the US? Probably because these guns were seized before anything bad could happen. Australia's lack of mass shootings is proof gun control works.
aftenbladet 1 points 1d ago
What about the shoes the perp used to get there and do the deed? Adidas was really enabling him to do it
jb6997 0 points 1d ago
While I’m sorry for the victims and their families, suing the platforms is absurd. It's like suing a phone company because its service is what a person used to plan a burglary or murder with a cohort. The victims are suing the makers of the body armor as well.
Excellenllj 1 points 1d ago
Social media is clean porn with twice the impact of porn. It can destroy more quickly.
ultradianfreq 1 points 1d ago
Even if they called for violence, how is it the platforms’ fault? If someone yells fire in a crowded movie theater is it the theater’s fault?
Round_Researcher6659 1 points 1d ago
They have nothing but dollar signs in their eyes.
Excellenllj -8 points 1d ago
Yup wouldn’t you
CyberBot129 1 points 1d ago
This won’t be getting past Section 230
readthatlastyear 1 points 1d ago
It should be illegal to publish anything about the names or details of the people involved. Make it illegal to publish any details of who they are in the news or media.

Like suicide isn't published about...
MaoWasaLoser 5 points 1d ago
> Make it illegal to publish any details of who they are in the news or media.

Honestly what part of this:


Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances.

Is difficult to understand?
Imasniffachair 1 points 22h ago
Yeah, illegal? No. Frowned upon and not rewarded with views? Yes and I wish the latter were possible to achieve.
Ashmedai 4 points 1d ago
> Like suicide isn't published about...

That's actually voluntary (not publishing about suicide, a common practice in San Francisco). I kinda agree (vague hand waving here) that something should be done about the news, which always sexes up the mass shooters and whatnot, but it's not really possible to do under our 1st Amendment as it is, and I'm unclear on what a revision would look like that wouldn't have a lot of unintended consequences.
readthatlastyear 1 points 1d ago
Interesting, I didn't realise that, thanks for letting me know. But if they did the same, I think it would go halfway to fixing the issue.
hawkwings -7 points 1d ago
If everybody who got shot sued social media platforms, then all social media platforms would shut down.
404Dawg 13 points 1d ago
Would that be a bad thing?
Asleep-Doctor-6586 -2 points 1d ago
Blaming everyone else but themselves.
Asleep-Doctor-6586 -1 points 1d ago
Butt hurt redditors! 👆🏼👆🏼👬
unturnedulema 0 points 1d ago
I've been discussing it with my colleagues, and some of them learned about the shooting from me. I guess I'm also guilty.
fellipec -1 points 1d ago
Minors using the internet without parental supervision is now okay?
Greedy_Event4662 -2 points 1d ago
Most replies here, "Section 230 blah blah", are simplistic.

If that's true, why is Silk Road's e-commerce site down?
TheDeadlySinner 1 points 18h ago
Because he sold drugs and hired hitmen.
Joeaywa -22 points 1d ago
Really? I'm sorry to their families, but this is like suing McDonald's for hot coffee.
xorcsm 12 points 1d ago
So you're saying it's a slam dunk then?
Joeaywa -14 points 1d ago
I'm saying it's like saying hot coffee can burn you. Social Media can spread misinformation.
B1ackFridai 10 points 1d ago
And then you and others like you spread it around.
Joeaywa -2 points 1d ago
You're the one to judge what's misinformation? I'm not, but go ahead.
blyan 5 points 1d ago
Of all the examples you could’ve used, you picked one where the plaintiff was clearly in the right?

You might wanna actually read up on the McDonald’s coffee case before claiming anyone else here is spreading “misinformation”

> Liebeck went into shock and was taken to an emergency room at a hospital. She suffered third-degree burns on six percent of her skin and lesser burns over sixteen percent.[14][13] She remained in the hospital for eight days while she underwent skin grafting. During this period, Liebeck lost 20 pounds (9.1 kg), nearly 20 percent of her body weight, reducing her to 83 pounds (38 kg). After the hospital stay, Liebeck needed care for three weeks, which was provided by her daughter.[15] Liebeck suffered permanent disfigurement after the incident and was partially disabled for two years.[16][17]

Hot coffee CAN seriously burn you and she was disfigured for life because of it.

There’s a massive difference between “hot coffee” and coffee that was served at 180-190 degrees Fahrenheit, which that was. It’s extremely dangerous and they fully deserved to get sued and lose the money that they did for ruining her life.
Eldias 3 points 1d ago
>Liebeck suffered permanent disfigurement after the incident...

Any time people bring up the coffee suit as frivolous I try to ask them what part of "fused labia" implies the suit was without merit.
bluecorkscrew 14 points 1d ago
Except that the plaintiff won that lawsuit.
B1ackFridai 13 points 1d ago
You’re telling everyone here you’re ignorant of this and the McDonald’s case and have done nothing to educate yourself despite bountiful information out there. Odd move, but ok.
Joeaywa -5 points 1d ago
Sure if that's how you see it, not worried about you seeing my point.
Ragnarok3246 5 points 1d ago
I mean you don't see jack shit with your eyes closed all the time lol.
Zeelots 2 points 17h ago
The thing is, you listed a case that proves your point wrong..
Samurai_Meisters 14 points 1d ago
So you're saying the plaintiff is justified and will win this case too?
Joeaywa -13 points 1d ago
Maybe, this is America they set stupid precedents all the time based on the emotion of the situation.
spiralbatross 9 points 1d ago
Do you not remember the coffee situation? She got 3rd degree burns. It was bad.
dciDavid 3 points 1d ago
Yeah, everyone makes fun of her, but the case was awful. And she only asked for her medical bills to be paid.
synae 7 points 1d ago
You should review the McDonald's coffee case. You're not making the point you think you are.
Joeaywa -1 points 1d ago
It doesn't have to win or lose in court to be a stupid precedent. When I order hot coffee I know it's hot and will hurt me, just like I know electricity will electrocute me.
Ragnarok3246 3 points 1d ago
Which was actually a justified case. The woman was handed coffee that was WAY too hot, as in nearly still boiling in the container. McDonald's spent THOUSANDS to depict that woman as a dumb idiot, ruining her life.
wmaung58 -11 points 1d ago
Sue the gun manufacturer first. Then sue the gun store. If the person bought the gun with credit, sue the credit card company. Sue the bank. Sue the parents. Sue the school. Sue the friends of the gunner. The list keeps going.
hate_without_borders -7 points 1d ago
They're greedy and want more money. They don't need it
jimmyray3000 -4 points 1d ago
Now we just have to wait for those new federal and state laws that say you can't sue social media companies for their content killing people.
dhskiskdferh 5 points 1d ago
Section 230 already exists
Otherwise_Ad_3976 1 points 1d ago
I’m losing track…
macdennis1234 1 points 1d ago
Hmm I don't see Gab or Twitter mentioned. Or Rumble
Asleep-Doctor-6586 1 points 23h ago
People are clowns!
GagOnMacaque 1 points 23h ago
This won't even get a trial. The law on social media liability has pretty much hardened at this point.
Hyperion1144 1 points 22h ago
This was already tried when people sued social media platforms for not stopping terror attacks.

This isn't going to work.
farkwadian 1 points 16h ago
^(Nice carpet bomb legal strategy Cotton, let's see if it pays off.)