Does Zuckerberg have some kind of clinical condition where he just can't imagine how other people might see him?
Sure, this will slow down the personal injury lawyers finding clients, but it won't stop them; meanwhile, it is more ammunition for Facebook's enemies to use against it.
It is one thing to do shady business; it is another thing to incriminate yourself. If you were involved with weed and somebody sent you an email asking if they could come around and pick up a Q.P. next Saturday, I'd expect you to correct them in person so they wouldn't do that again.
Not to say you should be like Epstein, but I mean he and the people he corresponded with had some sense, so there is very little evidence of criminal activity in millions of emails.
At Facebook, on the other hand, people constantly sent emails about things that could just as easily have been left as "dark matter": unexplained and minimally documented decisions. But no, it is like that MF DOOM song "Rapp Snitch Knishes", like a bunch of children or something with no common sense at all.
Nah, he just doesn't care. Nothing he does will ever get people (en masse, onesie, twosies don't matter) to stop using Meta products.
People can/will complain about him forever, but shitty people will continue to help him build things, and shitty people will continue to use them.
They analyze the video posts on Instagram. If they detect that a video has even a small amount of commercial value, they classify it as branded content and you need to pay to get it promoted.
I don't think you can in the US. Maybe elsewhere, but in the US AFAIK the author is responsible for the content they publish, not the bookstores carrying the books.
>This means that if a platform knows it hosts illegal or defamatory content and doesn't take it down, they aren't liable and any legal cases against them will get thrown out due to 230.
No it doesn't. Section 230 doesn't allow sites to host illegal content, of course only "legality" within the framework of US law matters.
All it says is that the liability for user posted content lies with the user posting the content, not the platform hosting it.
I believe we need to strengthen 230, but with the added caveat that the platform owners it protects must stop gaming the algorithms, and that it must require user-driven curation. Let me curate my own feed; stop shoving shit in front of my eyes. When you do so, you're making heavy editorial decisions, and you should be open to liability.
Companies are not liable if they have proper ID of the person who submitted the content and can provide that to a plaintiff. If they have not made a good-faith effort to know who submitted this info (like taking ID, not just an email address) then they're taking responsibility for the submitted content.
Which means sites that have responsible moderation can still allow anonymous contributions.
The real problem is the inherent asymmetry of legal battles, where the wealthiest can fight forever with endless motions and have near-total impunity while a legal action would basically nuke a normal person's life. Not to mention the fact that an international border can often make this whole conversation moot.
Anonymous contributions, up to the point of somebody compromising the service? With the quantity of password hash thefts, I suspect we'll see even more ID thefts this way.
I can't imagine using any service that asks for ID, except perhaps from the well-established giants, so an exception for identifiability would effectively be a gigantic moat granted to the largest internet companies to keep out competition. Anything like that would need to be paired with massive anti-trust changes, as well as perhaps government take-over of the giants as utilities, none of which sounds very appealing...
That said, don't take any of my rambling as discouragement, your type of thinking is exactly what we need, we need massive amounts of policy discussion and your suggestion is very innovative.
One of my issues is the lack of liability in practice. The poster is technically liable but they're anon, behind proxies, foreign, etc. and unaccountable. It results in people being harmed online without recourse.
These companies should have a duty to know who their users are.
Wow, that is quite a statement. Am I right in saying that in order to claim under the class action lawsuit in which Facebook has been found negligent, the victims need to take action collectively? I.e., they need to be reached somehow to be informed of the possibility?
Seems the most obvious place to advertise would be Meta.
I understand Meta can basically do whatever they like with their ToS but the statement from the Meta spokesperson seems like an extremely bad idea.
It is "the business", not an imagined side revenue stream.
The article at least mentions that one of the suits is private-equity funded. That generally means the partners and/or investors of the private equity firm and the attorneys suing, who are often all one and the same in what is just a financial and legal shell game, net tens of millions of dollars, while the supposed victims end up with nothing but pennies on the dollar of their harm and injury.
I get the impulse to also "cheer" for the lawsuits, but if you thought Meta, etc. were bad, you really don't want to look into the vile pestilence that is these law firms, which are also basically organized crime by the core definition of crime: an offense and harm upon society.
I don’t really know a solution for this problem because it is so rooted in the core foundation of this rotten system we still call America for some reason, but for the time being I guess, the only moderately effective remedy for harm and injury is to combat it with more harm and injury.
Also why we need far fewer megacorps than there are now.
Wild stuff
All corporate CYA ideas sound that way, but they end up benefiting the company in the end. Meta is right to do this. That's not to say it's right to do, but it's right for the company.
It's not a hard thing to implement on their end and should be mandated by a judge as you said.
Filing this away for later use.
It often comes up in (anti) free-speech trials, where the government compels the perpetrator to issue a public apology to the victim. Forcing them to buy an ad in a newspaper for example is not unheard of.
As far as I understand, Americans consider this to be "compelled speech" and hence prohibited, but I might be wrong on this.
An interesting variant I’ve seen on anti-smoking banners at convenience stores is “A federal court has ordered Philip Morris USA to say: …”
The $20 people get is nothing but a guise that the trial lawyers are helping people.
It's to allow companies to avoid dealing with individual claims from each person. I see that the payouts can be substantial though, several thousand dollars, but it seems to depend on the criteria.
> Nearly nine months later, Mark received a notification that his claim had been approved. Two weeks after that, $186 was deposited into his bank account. While the amount wasn’t substantial, it covered a grocery run and a phone bill—and more importantly, it reminded him that companies can be held accountable, even in small ways. [0]
[0] https://peopleforlaw.com/blog/how-much-do-people-typically-g...
If the fines don't dissuade companies from bad practices, class actions with theoretically no upper limit might be a better option to enforce proper behaviour.
Individuals bringing their own lawsuits seems like it would effect better change, as 1) the award money would be better distributed instead of concentrated, and 2) the amounts levied against the companies would be higher and more of a concern than the class-action slap on the wrist they currently get.
Only if you don't opt out. Individuals who opt out of being part of the class can still file their own suits. (Although it's not clear how successful you will be if your situation/harm is not substantially different from the other members of the class.)
It doesn’t. You can almost[1] always opt out of a class action lawsuit to pursue your own suit. This would be expensive and unwise for most people, but you have the right.
[1] There are rare exceptions.
This is humanity vs Mark Zuckerberg.
Seems like they couldn't write even three lines without an LLM.
I wonder if that is what will happen next.
Zuckerberg was told that adding gay people to groups was outing them by posting to their walls, and he ignored it https://www.youtube.com/watch?v=nRYnocZFuc4
And obviously https://news.ycombinator.com/item?id=1692122 (guess we don't get access to his other messages, though https://news.ycombinator.com/item?id=16770818)
His stare isn't the only thing about him that's sociopathic
Edit: oh yeah and https://news.ycombinator.com/item?id=42651178
Its own TOS states that they won’t allow that.
You own a restaurant, where you sell poisoned (intentionally and knowingly) food. A group of people band up for class action lawsuit for poisoning them, and have the lawyers post a sign at your restaurant, that everyone poisoned there should reach out and get some compensation.
Should you be allowed to take the sign down?
It’s a lawsuit, with the users of the platform as the damaged party, against the platform. Removing the possibility to reach the users should result in a default judgement with maximum damages immediately.
If they went back to operating as “friends and family feed providers” then letting them keep their 230 immunity would be easier to justify.
When they are making editorial decisions about what content to promote to you and what content to hide from you, then they should lose it.
This is not how it works when you're found guilty of committing harm. Tobacco companies are a good example of this.
It's not just a Meta issue either.
You don't even have to invoke the idea that Meta is big enough to be regulated as a public utility for this to have broad precedent in favor of forcing a malicious actor to inform its victims that they might be entitled to a small fraction of their losses in compensation.
https://www.reuters.com/investigations/meta-is-earning-fortu...
Mine is that it could then well be required to do so by law. Companies are not individuals, so I don't think they are owed any freedoms beyond what is best for utility they can provide.
Is their defence of Section 230 protections not in part rooted in that claim of impartiality?
"MZ Is A Punk-Ass B
paid for by Person & Guy LLP"