Meta removes ads for social media addiction litigation
278 points
3 hours ago
| 16 comments
| axios.com
bcjdjsndon
15 minutes ago
[-]
Hang on a minute: Meta apparently didn't have time to check the content of adverts they get paid to serve when it was child porn. What's changed all of a sudden?
reply
mekdoonggi
10 minutes ago
[-]
This one actually cost them money.
reply
henry2023
11 minutes ago
[-]
The crypto-“investing” deep fakes impersonating recognizable names are up and running too.
reply
mrwh
2 hours ago
[-]
Meta wants to be an impartial platform only and exactly when it suits them to be.
reply
rurp
27 minutes ago
[-]
Yeah, glad to see Zuck is sticking with those strong free speech principles he couldn't wait to get back to last year.
reply
PaulHoule
23 minutes ago
[-]
Wow.

Does Zuckerberg have some kind of clinical condition where he just can't imagine how other people might see him?

Sure, this will slow down the personal injury lawyers finding clients, but it won't stop them; meanwhile it's more ammunition for Facebook's enemies to use against it.

It is one thing to do shady business; it is another to incriminate yourself. If you were involved with weed and somebody sent you an email asking if they could come around and pick up a Q.P. next Saturday, I'd expect you to correct that person face to face so they wouldn't do it again.

Not to say you should be like Epstein, but he and the people he corresponded with had some sense, so there is very little evidence of criminal activity in millions of emails.

At Facebook, on the other hand, people constantly sent emails about things that could just as easily have been left as "dark matter": unexplained and minimally documented decisions. But no, it's like that M.F. Doom song "Rapp Snitch Knishes", like a bunch of children with no common sense at all.

reply
jmye
1 minute ago
[-]
> Does Zuckerberg have some kind of clinical condition where he just can't imagine how other people might see him?

Nah, he just doesn't care. Nothing he does will ever get people (en masse; onesies and twosies don't matter) to stop using Meta products.

People can/will complain about him forever, but shitty people will continue to help him build things, and shitty people will continue to use them.

reply
jjtheblunt
12 minutes ago
[-]
language failure on my end: what's a "Q.P."?
reply
tadfisher
10 minutes ago
[-]
Quarter-pound
reply
ambicapter
11 minutes ago
[-]
quarter pound
reply
Lord_Zero
20 minutes ago
[-]
Damn, who's buying a Q.P.??
reply
tiberius_p
2 hours ago
[-]
That's exactly what they're saying.
reply
zeroonetwothree
1 hour ago
[-]
I think there’s a clear difference in restricting advertising vs organic posts.
reply
thimabi
1 hour ago
[-]
Meta does both. It has long been said that businesses have little organic reach on Meta’s platforms, as an incentive for them to use ads.
reply
lazarus01
41 minutes ago
[-]
I’m not a big poster at all, but ran into this precise issue.

They analyze the video posts on Instagram. If they detect that a video has even a small amount of commercial value, they classify it as branded content and you need to pay for it to get promoted.

reply
alex1138
32 minutes ago
[-]
For all the creepy People You May Know stuff, they don't even bother connecting people properly on people's personal pages https://news.ycombinator.com/item?id=14147719
reply
HWR_14
1 hour ago
[-]
What difference is that?
reply
stronglikedan
56 minutes ago
[-]
Name one platform that doesn't, and I'm not just talking about lip service.
reply
bloqs
16 minutes ago
[-]
It's not universal, but if it's a commercial interest, then I'll agree that it usually is.
reply
mbesto
31 minutes ago
[-]
Signal?
reply
_moof
30 minutes ago
[-]
There are degrees.
reply
RobotToaster
25 minutes ago
[-]
4chan?
reply
kotaKat
2 hours ago
[-]
I mean, they spun up a bullshit "Oversight Board" that they can fully 100% choose to ignore and decline to implement their demands when they're made.
reply
2OEH8eoCRo0
1 hour ago
[-]
Repeal section 230
reply
ndiddy
15 minutes ago
[-]
The main problem with 230 is that the courts have decided to treat it as if it removes all legal liability from online platforms, rather than just publisher liability. The way the text was written seems intended to protect platform operators from publisher liability while still leaving them under distributor liability. For example, if you own a bookstore and carry a book that says something defamatory, you can be held liable if you don't remove the book after being informed about its contents.

However, a court case soon after 230 passed set the precedent that it absolves online platforms of all forms of liability. This means that if a platform knows it hosts illegal or defamatory content and doesn't take it down, it isn't liable, and any legal cases against it get thrown out due to 230. One of the authors of section 230 later said that "the judge-made law has drifted away from the original purpose of the statute."
reply
krapp
5 minutes ago
[-]
>For example, if you own a bookstore and carry a book that says something defamatory, you can be held liable if you don't remove the book after being informed about its contents.

I don't think you can in the US. Maybe elsewhere, but in the US AFAIK the author is responsible for the content they publish, not the bookstores carrying the books.

>This means that if a platform knows it hosts illegal or defamatory content and doesn't take it down, they aren't liable and any legal cases against them will get thrown out due to 230.

No, it doesn't. Section 230 doesn't allow sites to host illegal content; of course, only "legality" within the framework of US law matters here.

All it says is that the liability for user posted content lies with the user posting the content, not the platform hosting it.

reply
LocalH
31 minutes ago
[-]
Throwing the baby out with the bathwater?

I believe we need to strengthen 230, but with the added caveat that platform owners must stop gaming the algorithms and that it must require user-driven curation. Let me curate my own feed; stop shoving shit in front of my eyes. When you shove content at people, you're making heavy editorial decisions and should be open to liability.

reply
deeponey
25 minutes ago
[-]
This is really the essence of it. Section 230 is critical to a healthy internet, but there is large grey area between editorial and platform. Places like youtube, meta, X, etc. are pretending to be platforms when really they are algorithmic editors, gatekeepers, and curators. They are much more like traditional media newspapers than say your ISP, and they need to be treated as such.
reply
epistasis
43 minutes ago
[-]
A few years ago this seemed a bit too extreme for me. Now, with the web mostly burned down anyway, I see little to lose and lots to gain in a section 230 repeal. My, how the Overton Window changes on some ideas. And when it's changing on some things it tends to accelerate on others too, like a social momentum on reconsidering past norms.
reply
Pxtl
33 minutes ago
[-]
My compromise pitch, since the "You need ID from your users" ship has sailed:

Companies are not liable if they have proper ID of the person who submitted the content and can provide it to a plaintiff. If they have not made a good-faith effort to know who submitted the content (like taking ID, not just an email address), then they're taking responsibility for it.

Which means sites that have responsible moderation can still allow anonymous contributions.

The real problem is the inherent asymmetry of legal battles, where the wealthiest can fight forever with endless motions and have near-total impunity while a legal action would basically nuke a normal person's life. Not to mention the fact that an international border can often make this whole conversation moot.

reply
epistasis
23 minutes ago
[-]
> Which means sites that have responsible moderation can still allow anonymous contributions.

Anonymous contributions, up to the point of somebody compromising the service? With the quantity of password hash thefts, I suspect we'll see even more ID thefts this way.

I can't imagine using any service that asks for ID, except perhaps from the well-established giants, so an exception for identifiability would effectively be a gigantic moat granted to the largest internet companies to keep out competition. Anything like that would need to be paired with massive anti-trust changes, as well as perhaps government take-over of the giants as utilities, none of which sounds very appealing...

That said, don't take any of my rambling as discouragement; your type of thinking is exactly what we need. We need massive amounts of policy discussion, and your suggestion is very innovative.

reply
2OEH8eoCRo0
31 minutes ago
[-]
I like this compromise.

One of my issues is the lack of liability in practice. The poster is technically liable but they're anon, behind proxies, foreign, etc. and unaccountable. It results in people being harmed online without recourse.

These companies should have a duty to know who their users are.

reply
kelnos
43 minutes ago
[-]
That phrase does not mean what you think it means.
reply
bilekas
2 hours ago
[-]
> "We will not allow trial lawyers to profit from our platforms while simultaneously claiming they are harmful."

Wow, that is quite a statement. Am I right in saying that in order to claim under the class action lawsuit, in which Facebook has been 'found negligent', the victims need to act collectively? I.e., they need to be reached somehow to be informed of the possibility?

Seems the most obvious place to advertise would be Meta.

I understand Meta can basically do whatever they like with their ToS, but the statement from the Meta spokesperson seems like an extremely bad idea.

reply
pixl97
1 hour ago
[-]
Tobacco lawyers: "Putting 'cigarettes are harmful' on the box would be devastating to our profits!"
reply
akersten
48 minutes ago
[-]
It would be a better analogy if tobacco companies sold ad space on their packs and chose not to do business with a private for-profit anti-smoking solicitation group.
reply
adi_kurian
19 minutes ago
[-]
No it would not. Meta is an advertising company that sells ad space. More specifically, Meta is the dominant firm in the social advertising market which is an oligopoly.

It is "the business", not an imagined side revenue stream.

reply
roysting
48 minutes ago
[-]
I understand the impulse, but there are significant differences: the requirement to add labeling to cigarettes was mostly a judicial or legislative action. There is also the rather perverse fact that the kind of litigation people are championing is often funded by profit and greed, just like the harm being sued over.

The article mentions that at least one of the suits is private-equity funded. That generally means the partners and/or investors of the private equity firm and the attorneys suing (often one and the same, in what is just a financial and legal shell game) net tens of millions of dollars, while the supposed victims end up with nothing but pennies on the dollar of their harm and injury.

I get the impulse to also “cheer” for the lawsuits, but if you thought Meta, etc. were bad, you really don’t want to look into the vile pestilence that is these law firms, which are basically organized crime too by the core definition of crime as offense and harm upon society.

I don’t really know a solution for this problem because it is so rooted in the core foundation of this rotten system we still call America for some reason. For the time being, I guess the only moderately effective remedy for harm and injury is to combat it with more harm and injury.

reply
reactordev
1 hour ago
[-]
Literally every ceo
reply
deaux
52 minutes ago
[-]
You missed an adjective: literally every megacorp CEO. Plenty of small companies have transparent and honest CEOs.

Also why we need far fewer megacorps than there are now.

reply
bko
50 minutes ago
[-]
Imagine the NYT banning an ad in its newspaper telling people how to cancel and sue the NYT?

Wild stuff

reply
giancarlostoro
2 hours ago
[-]
Would be really entertaining if all the lawyers affected banded together and made a class action lawsuit full of lawyers as the plaintiffs.
reply
stronglikedan
57 minutes ago
[-]
> the statement from the Meta spokesperson seems like an extremely bad idea.

All corporate CYA ideas sound that way, but they ultimately end up benefiting the company. Meta is right to do this. That's not to say it's the right thing to do, but it's right for the company.

reply
bwestergard
2 hours ago
[-]
They wouldn't profit if the cases didn't have merit.
reply
HumblyTossed
2 hours ago
[-]
The judge should have ordered Meta to place a banner on FB so that everyone can see it and join if they're a victim.
reply
shimman
1 hour ago
[-]
Wow, this is a really good idea. I wonder if the various state trials underway should use this for remediation too.

It's not a hard thing to implement on their end and should be mandated by a judge as you said.

Filing this away for later use.

reply
miki123211
39 minutes ago
[-]
Europe (Poland) loves this kind of stuff.

It often comes up in (anti) free-speech trials, where the government compels the perpetrator to issue a public apology to the victim. Forcing them to buy an ad in a newspaper for example is not unheard of.

As far as I understand, Americans consider this to be "compelled speech" and hence prohibited, but I might be wrong on this.

reply
dcrazy
26 minutes ago
[-]
The same thing happens here. Courts are allowed to compel speech as a method of remedy, but my recollection is that this is sometimes successfully challenged.

An interesting variant I’ve seen on anti-smoking banners at convenience stores is “A federal court has ordered Philip Morris USA to say: …”

reply
smsm42
50 minutes ago
[-]
Not likely to survive 1st Amendment challenge - it is possible to compel somebody to certain speech as a result of losing a case, but doing this as a prerequisite when the case has just started is not likely to fly. Otherwise I could force Facebook (or any other platform) to publish anything just by suing them - and anybody could sue anybody else on virtually any grounds.
reply
3form
2 hours ago
[-]
"A lawyer benefiting from prostitution cases equals a pimp" kind of argument.
reply
boringg
2 hours ago
[-]
I mean, those class action lawsuits enrich trial lawyers and maybe force companies to behave better (though I bet empirical evidence would show it's more a cost of doing business).

The $20 people get is nothing but a guise that the trial lawyers are helping people.

reply
bilekas
2 hours ago
[-]
I'm not sure the low payouts mean that class actions shouldn't still be brought.

They exist so that companies don't have to deal with individual claims from each person. I see that the ranges can be substantial though, several thousand dollars, but it seems to depend on the criteria.

> Nearly nine months later, Mark received a notification that his claim had been approved. Two weeks after that, $186 was deposited into his bank account. While the amount wasn’t substantial, it covered a grocery run and a phone bill—and more importantly, it reminded him that companies can be held accountable, even in small ways. [0]

[0] https://peopleforlaw.com/blog/how-much-do-people-typically-g...

If fines don't dissuade companies from bad practices, class actions with theoretically no upper limit might be a better option to enforce proper behaviour.

reply
boringg
47 minutes ago
[-]
I can agree with that -- however, the amount of money the trial lawyers make is wildly disproportionate by comparison. I think that $186 figure is an example on the high side of payouts to individuals.
reply
Xeoncross
1 hour ago
[-]
As an aside, class-action lawsuits seem less than ideal for the public. The awards benefit the lawyers and perhaps a small handful of plaintiffs, while the rest only get $0.05. In addition, successful class-action suits prevent further litigation over the same issue.

Individuals bringing their own lawsuits seem like they would effect better change: 1) the award money would be better distributed instead of concentrated, and 2) the amounts levied against the companies would be higher and more of a concern than the class-action slap on the wrist they currently get.

reply
bityard
23 minutes ago
[-]
> successful class-action suits prevent further litigation from being allowed for the same issue.

Only if you don't opt out. Individuals who opt out of being part of the class can still file their own suits. (Although it's not clear how successful you will be if your situation/harm is not substantially different from the other members of the class.)

reply
rurp
23 minutes ago
[-]
How does this address the most common case where many people were harmed a modest amount? Causing $100 of harm to a million people is a huge amount of damage that should be punished, but nobody is going to launch a full independent lawsuit for $100.
reply
rokkamokka
1 hour ago
[-]
A hundred million identical court cases might not be too good for the legal system
reply
ed312
43 minutes ago
[-]
1. Why should harming a million people identically reduce their right to a fair legal evaluation and possibly compensation for damages? (Maybe it makes sense for large corporations to carry insurance to pay for the potentially massive legal costs they could impose on governments.)

2. Shouldn't we be able to quickly resolve these cases assuming there are no substantially different pieces of evidence?
reply
CrazyStat
25 minutes ago
[-]
> 1. Why should harming a million people identically reduce their right to a fair legal evaluation and possibly compensation for damages?

It doesn’t. You can almost[1] always opt out of a class action lawsuit to pursue your own suit. This would be expensive and unwise for most people, but you have the right.

[1] There are rare exceptions.

reply
wongarsu
19 minutes ago
[-]
Isn't that trivially fixed by raising court costs (that should go to whoever loses the suit) to cover the cost of judges, jury, admin expenses etc? I don't get the impression that this would make the justice system that much more prohibitively expensive than it already is, and would allow the legal system to scale to the case load
reply
SecretDreams
40 minutes ago
[-]
Agreed. Naturally, the solution is to get meta to compensate for the actual and cumulative damage they've done to mankind. Then plaintiffs might actually benefit.

This is humanity vs Mark Zuckerberg.

reply
varispeed
10 minutes ago
[-]
I wonder when they'll tackle literal porn showing up in Instagram shorts. If you want to browse Instagram in public, forget it.
reply
bastard_op
1 hour ago
[-]
I wonder what would happen posting these ads to truth social and twitter.
reply
ginkgotree
38 minutes ago
[-]
Social media, and specifically Facebook / Meta, will go down in history as one of the worst technological developments of the 21st century. As Frances Haugen stated in her testimony, Mark Zuckerberg needs to be removed from the helm at Meta.
reply
vachina
8 minutes ago
[-]
They started out good and then cranked the engagement trap to the max when they realized the value of a captive audience.
reply
wvenable
10 minutes ago
[-]
I think television has done more harm, politically.
reply
fdeage
47 minutes ago
[-]
"Anxiety. Depression. Withdrawal. Self-harm. These aren't just teenage phases — they're symptoms linked to social media addiction in children."

Seems like they couldn't write even three lines without an LLM.

reply
WesolyKubeczek
37 minutes ago
[-]
LLMs love this style, but they love it because it's just about every single piece of advertisement writing for the last aeon or so, and it's a mighty chunk of their training corpora.
reply
boelboel
39 minutes ago
[-]
Maybe being unable to write is another symptom.
reply
pcardoso
2 hours ago
[-]
Reminds me of Carl Sagan’s Contact, where Hadden, the millionaire funding Ellie’s work, made a TV ad blocker and then sued the TV companies when they refused to play ads for his product.

I wonder if that is what will happen next.

reply
guywithahat
1 hour ago
[-]
There is a certain humor in the fact that these law firms won a case against Meta and the first thing they did was give them advertising money won from the court case. That said, the ads sound pretty aggressive, and from what I've read it sounds like it wasn't a very fair decision. I understand the conflict of interest, but I have some sympathy for Meta here.
reply
HumblyTossed
2 hours ago
[-]
Do photogs do that on purpose, or does Zuck really always have that sociopath stare?
reply
SpicyLemonZest
1 hour ago
[-]
Zuckerberg is a rich and high profile guy, so photographers capture many pictures of him, and news editors often find that choosing unflattering pictures of people their readers don't like is helpful for reach. This picture in particular was taken after he'd just finished testifying for 8 hours in a February trial, which I think would wear down the best of us, and even among Getty's extensive gallery of pictures taken then (https://www.gettyimages.com/detail/news-photo/mark-zuckerber...) this one is particularly unflattering IMO.
reply
folkrav
1 hour ago
[-]
Both.
reply
IncreasePosts
1 hour ago
[-]
I'm sure if people were taking 500 pictures of you, they would capture you in a state like that. Are you a sociopath?
reply
alex1138
1 hour ago
[-]
Keep in mind Zuckerberg is someone who supports things like this https://news.ycombinator.com/item?id=10791198

Zuckerberg was told about gay people being added to groups and it outed them by posting to their wall, and he ignored it https://www.youtube.com/watch?v=nRYnocZFuc4

And obviously https://news.ycombinator.com/item?id=1692122 (guess we don't get access to his other messages, though https://news.ycombinator.com/item?id=16770818)

His stare isn't the only thing about him that's sociopathic

Edit: oh yeah and https://news.ycombinator.com/item?id=42651178

reply
alex1138
1 hour ago
[-]
Guys, there's no need to insta-downvote. I provided substantive evidence. Look in the mirror, and evaluate who you work for
reply
josefritzishere
1 hour ago
[-]
So they remove class action lawsuits but not pedos. Got it.
reply
stronglikedan
53 minutes ago
[-]
Since literally everyone is calling everyone they don't like a pedo nowadays, it's pretty much impossible for any platform to get rid of the pedos.
reply
k33n
2 hours ago
[-]
The idea that Meta is obligated to be so impartial that it must allow lawsuits against itself to be promoted on its own platform is a bit naive and utopian.

Its own TOS states that they won’t allow that.

reply
schubidubiduba
2 hours ago
[-]
TOS are not laws. In fact, they often partially violate laws and those parts are then void. In some countries, anything written in TOS that is not "expected to be there" is void.
reply
zeroonetwothree
1 hour ago
[-]
Ok but I don’t really see why this specific term would violate any law? Do we really want a society where platforms are forced to present speech that is harmful to them? If you own a store and I put a sign up on your wall advertising a rival store wouldn’t it be reasonable for you to disallow that?
reply
quantum_magpie
50 minutes ago
[-]
An alternative reply, with analogy, if you like them:

You own a restaurant where you knowingly and intentionally sell poisoned food. A group of people band together in a class action lawsuit for poisoning them, and the lawyers post a sign at your restaurant saying that everyone poisoned there should reach out and get some compensation.

Should you be allowed to take the sign down?

reply
quantum_magpie
55 minutes ago
[-]
It’s not a rival store, or speech against them.

It’s a lawsuit against the platform, with the users of the platform as the damaged party. Removing their ability to reach those users should result in an immediate default judgment with maximum damages.

reply
mywittyname
1 hour ago
[-]
I kind of wish countries would just define, "terms of service" for everyone and not allow companies to modify them further.
reply
raincole
2 hours ago
[-]
No one says ToS are laws and especially not the parent commenter.
reply
Fraterkes
1 hour ago
[-]
The parent comment brings up the ToS as an example of why it's naive to believe Meta is obligated to do something, but what Meta is obligated to do depends on the law.
reply
raincole
1 hour ago
[-]
And which laws state that Meta is obligated to show ads like this?
reply
nkrisc
2 hours ago
[-]
Fair enough. If they're not impartial, then let's hold them accountable for the content published on their platform.
reply
k33n
2 hours ago
[-]
I’m not against these companies losing their Section 230 immunity. Social media platforms are, in my personal opinion, publishers in their current form.

If they went back to operating as “friends and family feed providers” then letting them keep their 230 immunity would be easier to justify.

reply
TheCoelacanth
1 hour ago
[-]
Yes, if they went back to being chronological feeds of people you follow, then they should get to keep Section 230 immunity.

When they are making editorial decisions about what content to promote to you and what content to hide from you, then they should lose it.

reply
wbobeirne
2 hours ago
[-]
You are relying on the wrong people to be able to understand that nuanced distinction.
reply
mc32
2 hours ago
[-]
To me that’s how it should be. They shouldn’t have to run ads against themselves yet they should be liable or accountable for harm they are found guilty of.
reply
pixl97
2 hours ago
[-]
>They shouldn’t have to run ads against themselves

This is not how it works when you're found guilty of committing harm. Tobacco companies are a good example of this.

reply
mc32
2 hours ago
[-]
If the government mandates them then yes. If it’s not mandated they have the right to refuse service.
reply
pixl97
1 hour ago
[-]
The bigger you get the more iffy it gets refusing service to others. Also it can and will be used against you in future civil and criminal cases.
reply
iinnPP
2 hours ago
[-]
I tend to agree with you on this. I wanted to add, however, that Meta itself lets in so many TOS-violating ads that this looks like special treatment against ads that are much less undesirable than the ones normally pushed.

It's not just a Meta issue either.

reply
hansvm
1 hour ago
[-]
Companies have to inform affected individuals of data breaches, especially when HIPAA gets involved. Brokers have to inform clients of transaction errors. Auto manufacturers have to inform owners of recalls. Retirement funds have to inform plan participants of lawsuits involving those funds.

You don't even have to invoke the idea that Meta is big enough to be regulated as a public utility for this to have broad precedent in favor of forcing a malicious actor to inform its victims that they might be entitled to a small fraction of their losses in compensation.

reply
zeroonetwothree
1 hour ago
[-]
Well we aren’t discussing the government requiring meta to inform users. We are discussing whether meta can choose which private actors’ ads to allow. It would seem silly that a platform would be forced to allow all ads.
reply
mirashii
2 hours ago
[-]
That idea was not expressed in the article, only the fact that the ads were removed. This is worth covering, especially when coupled with the context for what ads Meta regularly does allow. One does not have to believe that they're obligated to do so while also believing that it's incredibly scummy behavior that consumers should be aware of and question.

https://www.reuters.com/investigations/meta-is-earning-fortu...

reply
dcrazy
1 hour ago
[-]
This is why courts are empowered to infringe upon the rights of parties to the case.
reply
Zigurd
2 hours ago
[-]
There are so many ads for nostrums, cults, get rich quick scams, and other junk that violate TOS, that Meta has a legitimacy problem with their TOS.
reply
freejazz
2 hours ago
[-]
Okay? They're exactly the assholes everyone says they are. That's the point.
reply
gilrain
2 hours ago
[-]
Let’s force them to be obligated to do that, then. “Just let them hurt people, and then let them hide that hurt” kind of sucks for society.
reply
3form
2 hours ago
[-]
Maybe, but so what? Your remark lacks a conclusion.

Mine is that they could well then be required to do so by law. Companies are not individuals, so I don't think they are owed any freedoms beyond what best serves the utility they can provide.

reply
streetfighter64
2 hours ago
[-]
The idea that a company can override laws via its TOS is a bit strange.
reply
BeetleB
1 hour ago
[-]
Genuinely curious. By not allowing a specific type of ad, what law are they breaking?
reply
hashmap
2 hours ago
[-]
At certain scales, reality has to win out over whatever ideal you have in your head about how things should be. Facebook is massive, a lot of society is on it, and it's a problem to make recourse invisible to the people most affected by the thing stealing their attention.
reply
swiftcoder
2 hours ago
[-]
> The idea that Meta is obligated to be so impartial

Is their defence of Section 230 protections not in part rooted in that claim of impartiality?

reply
nradov
2 hours ago
[-]
No. Section 230 doesn't mention anything about impartiality.
reply
swiftcoder
1 hour ago
[-]
It indeed doesn't, but conservative lawmakers signalled repeatedly that they were unhappy about Meta's protection under section 230 if their moderation policies were not politically neutral
reply
neuroelectron
2 hours ago
[-]
Reminds me of ChatGPT insisting all news about OpenAI is unverified speculation.
reply
glaslong
55 minutes ago
[-]
Thus begins another Streisand Effect meme campaign of

"MZ Is A Punk-Ass B

paid for by Person & Guy LLP"

reply
skeeter2020
38 minutes ago
[-]
Can't we all just agree there are no GOOD people in this situation? Meta, class-action lawyers, PE and big money that funds the lawsuits as a profit venture... The one thing they all appear to share: parasites extracting resources from their host.
reply