CivitAI Policy Update: Removal of Real-Person Likeness Content
41 points
13 days ago
| 7 comments
| civitai.com
| HN
minimaxir
13 days ago
[-]
Relatedly, CivitAI has recently had issues with their credit card processors for the reasons you suspect, and today they lost the ability to process credit card payments: https://civitai.com/articles/14945

I suspect the company will be operating more conservatively policywise for the foreseeable future.

reply
beeflet
13 days ago
[-]
I think this is a pretty acceptable use-case for cryptocurrency/micropayments. Credit card companies are notoriously restrictive.
reply
tommica
13 days ago
[-]
It's scary how much power they have - get blacklisted by them and you are out of business. For a system that is so critical to everyday life, almost like electricity, it's insane how much control they exert over it.
reply
benterix
12 days ago
[-]
I agree with you in general - it seems super easy to pressure them into blocking CC payments via (potential) lawsuits, etc.

However, as for crypto and banks, I don't really blame them for not wanting anything to do with it. It's really easy to scam people, lose money, and so on. Who is to blame then? With crypto, you have nobody to blame - except where there is a terminal, in this case a bank. Banks become a scapegoat for everyone, including lawmakers. So they decided not to play the game. Want to cash out crypto? Deal with the ones that are willing to accept the risk.

reply
quotemstr
13 days ago
[-]
If there's ever been an industry that needs to be subject to common carrier regulations, it's the payments industry. MasterCard and Visa have no business unilaterally, secretly, and unaccountably policing their idiosyncratic idea of moral righteousness. They need to move money and shut up.
reply
jallmann
12 days ago
[-]
> MasterCard and Visa have no business unilaterally, secretly, and unaccountably policing their idiosyncratic idea of moral righteousness

That's not why they do it. The reasons are regulatory compliance and risk. Processors would be in big trouble if they facilitated payments when they shouldn't, or went broke due to rampant fraud in certain sectors.

I get that you might not like it, but take it up with the US government. The processors would be happy to move as much money as possible to make as much money as possible.

reply
TheNewsIsHere
12 days ago
[-]
Something that people don’t understand, which your comment sort of indirectly alludes to, is that Visa and MasterCard started as and for most of their existence were essentially a shared services center for financial institutions that wanted to participate in a common payment network.

The legal structure has changed, but the boards of both are still primarily comprised of executives from other major financial services institutions.

Both are risk averse and sensitive to regulatory pressure by nature, so the issue isn't with Visa and MasterCard directly. They're just operating in that space.

reply
jallmann
12 days ago
[-]
Yes! The book "One From Many" by Dee Hock, the Visa founder, discusses this formation in depth. For a long time, Visa's governance structure was quite unusual because it was a consortium of stakeholders, some with very different interests.

Somewhat ironically, Hock's "chaordic" management philosophy has strong parallels with the ethos of decentralization held by some crypto idealists.

reply
objclxt
12 days ago
[-]
> MasterCard and Visa have no business unilaterally, secretly, and unaccountably policing their idiosyncratic idea of moral righteousness. They need to move money and shut up.

Mastercard and Visa don't block companies from processing because of morals, they block them because they lose them money. They will happily process your payments for all kinds of shady schemes that are - to them - low risk.

reply
jazzyjackson
12 days ago
[-]
Can't force a company to do business with customers that they don't want to do business with (barring protected class discrimination laws)
reply
AnthonyMouse
12 days ago
[-]
No, but you can break up concentrated markets so that being refused by some payment processor is irrelevant because there are a thousand others willing to do it and they're all completely fungible.
reply
ygjb
12 days ago
[-]
Nah. I don't mind businesses that take a moral stance that is intended to de-risk finances. We (well, decent people at least, IMO) need mechanisms to prevent human trafficking, illicit or illegal transactions, etc. That said, the rules should be completely transparent. When a transaction is blocked for a reason, or an account is suspended or terminated, there should be a clear audit trail of the reason why, which rules were broken, and what the steps to remediate are, if possible. There should be an appeals process that is equally as transparent, and countries should impose ombudspersons to oversea the enforcement with significant financial penalties for payment companies that fail to uphold these rules. Some part of this may already exist, I have been out of finance and payments for almost 20 years at this point, but there was and remains a lot of shady stuff going on in the industry :/
reply
elaus
12 days ago
[-]
> We (well, decent people at least, IMO) need mechanisms to prevent human trafficking, illicit or illegal transactions, etc.

There are already laws (from actual governments) against human trafficking and illegal transactions. The issue is not about credit card companies obeying the law.

reply
Animats
13 days ago
[-]
They already have their own currency, called "Buzz". Maybe that's why they're in trouble with credit card processors. Using credit cards to buy money equivalents is considered high-risk.
reply
OsrsNeedsf2P
13 days ago
[-]
> acceptable use-case for cryptocurrency

Once in a while I think this, and then I remember what a disaster cryptocurrency became

reply
beeflet
13 days ago
[-]
A lot of cryptocurrency's problems are fundamentally social and unsolvable. But there have also been significant technological improvements since 2014, including payment channels and zero-knowledge proofs, that could patch some of crypto's gaping flaws.
reply
jeffhuys
13 days ago
[-]
Bitcoin at its highest point in history right now but whatever
reply
djur
12 days ago
[-]
People making money off of something doesn't make it not a disaster, obviously. Within the past month crypto's biggest headline has been its use in bribing the US president.
reply
BoorishBears
13 days ago
[-]
Yeah, such a mark of success for a currency to appreciate 24,000% in under a decade.

When buying a pizza today can cost you a house in 10 years, you have a failed currency.

reply
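[Ed.: the deflationary-spending arithmetic behind the comment above can be made concrete. The pizza price and the 10-year horizon are hypotheticals; 24,000% is the appreciation figure from the comment.]

```python
# Working through the "pizza today vs. house later" objection to a
# rapidly appreciating currency (all inputs are illustrative).
gain_pct = 24_000.0
multiple = 1 + gain_pct / 100          # a 24,000% gain means 241x the start
years = 10
cagr = multiple ** (1 / years) - 1     # implied annual growth rate, ~73%

pizza_price = 30.0                     # hypothetical pizza cost in fiat terms
foregone = pizza_price * multiple      # later value of the coins spent today
print(f"{multiple:.0f}x over {years} years, {cagr:.0%}/yr, pizza cost ${foregone:,.0f}")
```

The point being illustrated: when holding the unit beats spending it by this margin, rational holders defer spending, which is why high appreciation undermines use as a day-to-day currency.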
vachina
13 days ago
[-]
Highest point in what? Filthy fiat?
reply
saurik
12 days ago
[-]
In anything? Measure it in hamburgers or fancy chairs, if you'd like.
reply
conception
13 days ago
[-]
That’s not good if you want to use it as, you know, a currency.
reply
mvdtnz
12 days ago
[-]
Are people using it as a currency? At scale?
reply
protocolture
13 days ago
[-]
Crypto is fine; it's the users that are the problem.
reply
okayishdefaults
13 days ago
[-]
Is it at this point? When used earnestly, it's regulated, traceable, and slower than other methods of transacting money. History can be and has been rewritten, but not when someone is scammed.

Seems like bad currency, but maybe you're aware of something meaningful that crypto contributes.

reply
BoorishBears
12 days ago
[-]
I mean we're in the comment section of a provider removing models primarily used to make non-consensual porn of celebrities (https://arxiv.org/html/2407.12876v1), then talking about how crypto is the answer.

Visa and co are a cartel, a lot of the pressure Civitai is facing is unreasonable, but even a broken clock is right twice a day: and they had a lot of problematic content.

Even if they turn to crypto, this is a change they shouldn't walk back, or other providers are probably going to turn on them too.

reply
pseudo0
12 days ago
[-]
This paper is trash. They preemptively define any models they don't like as "abusive models". This includes any model that can generate real people (including for transformative, fair use purposes like parody) and separately any NSFW model, including stuff like cartoons.

Also, they are using a ridiculous definition of "NSFW" to achieve the correlation they want to find. They are putting the prompt (not the image) into ChatGPT and applying an arbitrary NSFW-ness sentiment-analysis metric that returns false positives. Actual NSFW content of real people was always banned on CivitAI.

reply
BoorishBears
12 days ago
[-]
I'm not sure how anyone's really going to act like the vast majority of the deepfakes generated with celebrity LoRAs aren't porn, when the term itself has become synonymous with non-consensual porn.

And it was just in April (less than a month ago) that they stepped up moderation of celebrity NSFW content: the study is from June of 2024.

The study was an attempt to avoid someone immediately trying to argue about a really obvious truth, but some people will still try to argue about the study about the really obvious truth.

reply
pseudo0
12 days ago
[-]
I have been using CivitAI practically since it was created. Real person NSFW content has been banned since day one, because they didn't want to get sued. The change in April was a cosmetic update to how they displayed search results. It completely separated out real person content and NSFW content, to ensure that NSFW could not be displayed in proximity to real person content. This changed how content was displayed, not what content was allowed.

I'm sure some people do generate porn with celebrity LoRAs, but there are also plenty of legitimate uses such as parody, criticism, transformative art, etc. If people do post inappropriate content, there are civil remedies and now also federal criminal remedies via the TAKE IT DOWN Act. CivitAI is fully legally compliant, but they are being held hostage by an unelected, unaccountable payment processor cartel.

reply
protocolture
12 days ago
[-]
>Even if they turn to crypto, this is a change they shouldn't walk back, or other providers are probably going to turn on them too.

Yep

reply
protocolture
12 days ago
[-]
>Regulated

Well, humans are regulated, so I don't know what you are driving at here.

>Traceable

The core concept was built on traceability. Privacy coins are the aberration. Like crypto was developed by people who want provable auditing of banks online.

Actually, if you compare Bitcoin to later standards, Bitcoin's biggest weakness is that it wants to track coins individually instead of just balances. Literally invented by goldbugs.

>Slower

Depends on both parties. I can go Crypto -> Crypto -> Fiat in like 15 minutes. Osko can be faster.

>History can and has been rewritten

That's a fail state, but it happens rather less than with traditional currency.

>Seems like bad currency, but maybe you're aware of something meaningful that crypto contributes.

My fondest memory was watching a whole bunch of libertarian crypto guys using it to donate to Venezuelans who would pop up in crypto spaces to talk about how hard their lives were and how badly government had screwed up their lives. I liked to think the libertarians were getting scammed, but it didn't really matter, because there weren't many other on-ramps into VZ at the time.

Really its best feature is that its largely unpreventable. Sure you can police the on and off ramps to an extent. But if I need to evade financial censorship, I can. Mostly I see people against crypto throw up a big smokescreen but at the end of the day they tend to be in favor of the financial censorship that crypto is avoiding at the moment. Be that donations to wikileaks, purchasing services without a credit card or what have you.

reply
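[Ed.: the coins-vs-balances distinction mentioned above (Bitcoin's UTXO model versus an account model) can be sketched in miniature. This is a toy ledger, not actual Bitcoin or Ethereum code.]

```python
# Toy contrast between the two ledger designs the comment refers to.

# UTXO model: the ledger is a set of discrete coins. Spending consumes
# whole coins and creates new ones, including change back to the sender.
utxos = {("tx0", 0): ("alice", 5.0), ("tx1", 0): ("alice", 2.0)}

def utxo_spend(utxos, spender, recipient, amount, new_txid):
    picked, total = [], 0.0
    for key, (owner, value) in utxos.items():
        if owner == spender and total < amount:
            picked.append(key)
            total += value
    if total < amount:
        raise ValueError("insufficient funds")
    for key in picked:                       # consume the selected coins
        del utxos[key]
    utxos[(new_txid, 0)] = (recipient, amount)
    if total > amount:                       # change output to the spender
        utxos[(new_txid, 1)] = (spender, total - amount)

# Account model: the ledger is just a mapping from owner to balance.
balances = {"alice": 7.0, "bob": 0.0}

def account_spend(balances, spender, recipient, amount):
    if balances[spender] < amount:
        raise ValueError("insufficient funds")
    balances[spender] -= amount
    balances[recipient] = balances.get(recipient, 0.0) + amount

# Same payment in both models: alice sends bob 6.0 of her 7.0 total.
utxo_spend(utxos, "alice", "bob", 6.0, "tx2")
account_spend(balances, "alice", "bob", 6.0)
```

The UTXO design makes every coin's history individually auditable (the "provable auditing" angle), at the cost of change outputs and coin-selection logic that an account model avoids.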
vachina
13 days ago
[-]
In most countries there are usually multiple (viable) alternative payment methods to credit cards.
reply
beeflet
13 days ago
[-]
for micropayments? like what?
reply
makingstuffs
12 days ago
[-]
Can’t speak for other countries but India has PayTM, UPI, BHIM and others
reply
throwanem
13 days ago
[-]
Yeah, they tend to frown on violations of property rights. So do most segments of society. Cryptocurrency advocates?
reply
aussieguy1234
12 days ago
[-]
I'm against misuse of real life likeness AI content, but also against credit card companies deciding what we get to see and hear based on the whims of the backwards ultra hard right conservative leadership at the credit card companies.

I'm aware of a site that was blocked by credit card companies due to some controversial content, but survived and kept growing with a high rate of paying members.

The payments space is now a bit more complicated than just credit cards. There are a lot of country/region-specific methods, e.g. SEPA in Europe, PayID in Australia, QR payments in Thailand, and of course crypto. Many of these options are just as easy or easier to use than credit cards.

Basically, the site started accepting lots of these different payment methods, you see different options depending on your country. Free members kept upgrading to paid members using these new options. This was more than 6 years ago and they still are blocked from taking credit cards.

reply
nullc
12 days ago
[-]
> whims of the backwards ultra hard right conservative leadership at the credit card companies

the credit card companies executives like money, this isn't originating with them.

It originated in the US federal government. (e.g. https://en.wikipedia.org/wiki/Operation_Choke_Point )

It works quite similar to the great firewall in China. ISPs are not so much told what they are to block, they're told Do Not Embarrass the Party or Else.

So even as the pressure has come off in this administration, the memory of the or Else remains, and processors will continue to swat random shit in the hopes of appeasing an uncommunicative but clearly spiteful god.

reply
speedgoose
13 days ago
[-]
Are the reasons a dislike for porn, pornographic deep fakes of real people, or pedophilia?
reply
rnd0
12 days ago
[-]
Litigation in some form or another. This is undoubtedly related to the recently passed TAKE IT DOWN Act.

Porn isn't the only worry; there's also getting sued by the estates of dead celebrities, or being misused for misinformation purposes.

Things are going to become increasingly restrictive until it is not worth using unless you're a corporation or a state actor. But for hobbyists? Resources are going to become thin on the ground, and no, that is not a good thing.

reply
Jubijub
12 days ago
[-]
I have no beef against porn, but what makes you think the other two use cases are acceptable?
reply
speedgoose
12 days ago
[-]
I don’t think deep fakes of real people and pedophilia are acceptable.
reply
sinuhe69
13 days ago
[-]
Will that also exclude a user's own person? That'd be too bad. You not only take away people's agency, you also severely limit the creative space.

I guess open source models for image diffusion will get a huge boost then.

reply
ZaoLahma
12 days ago
[-]
This is an interesting one. Do we own the rights to our own likeness? And if we do, what about doppelgangers - people who look eerily similar to celebrities, or other "unknown" people?
reply
AnthonyMouse
12 days ago
[-]
> Do we own the rights to our own likeness? And if we do, what about doppelgangers - people who look eerily similar to celebrities, or other "unknown" people?

Is this somehow a novel question? We've had the same issue since at least the invention of cameras.

reply
throwanem
12 days ago
[-]
It's fun for these dorks to fantasize. Knowing things is for nerds.
reply
Rohitcss
13 days ago
[-]
Deepfakes are already becoming a mess to handle.
reply
Joel_Mckay
12 days ago
[-]
Indeed, if businesses will rip off Mark Hamill of all people, then most actors stand zero chance of protecting their image.

It is an odd part of US copyright law, well known to figures who already sue, and it is good that normal citizens now have similar rights.

Some folks took things way too far already, and unfortunately copyright enforcement on the web has always been ridiculous. =3

reply
Rohitcss
12 days ago
[-]
Agree!!
reply
userbinator
13 days ago
[-]
As a saying I once heard goes: Trying to stuff the genie back in the lamp will only ensure it replicates itself into other lamps.
reply
sureIy
12 days ago
[-]
What has been the problem exactly? Just generic "fake news" or did the platform generate porn?
reply
altairprime
12 days ago
[-]
Recent legislative changes (further) increased the risk tier of generative AI when human beings could be reasonably considered to have grounds to sue the business, and the bank likely also sought and received regulatory advice directing them to classify such businesses as higher risk (than their tolerance threshold might have permitted previously). To quote the recent Debanking article:

> This particular bank did not, at the time, have a small business practice within its personal banking division. Very many banks do, but this particular bank did not. And thus this bank had not built out the higher degree of policies and procedures [required]

https://news.ycombinator.com/item?id=42371476

Most likely, the payment processor does not wish to invest operational spending into the banking and processing policies that continuing to transact with CivitAI, without restrictions that reduce legal risk, would have demanded of them. And then when CAI encouraged everyone to spend money quickly, it massively spiked their transaction volume, leading the bank to kill services a day early due to the additional risk flag that sort of processing behavior carries.

(I am not your lawyer, this is not legal advice.)

reply
Aeolun
12 days ago
[-]
What a surprise. Not. All of these companies go the same way once they become popular.

Count the number of days until models that are capable of generating real-person likeness content are banned too.

reply
duxup
13 days ago
[-]
> We are removing models and images depicting real-world individuals

So any images that look real, or images that look like specific real people?

reply
godelski
12 days ago
[-]
Likely the latter. They hosted a lot of LoRAs and embeddings that help generate images of specific people. Users would post samples of those people nude.

Honestly, I'm surprised it lasted this long. Like that was clearly egregious. At least pretend to not be a deep fake porn generator...

reply
Aeolun
12 days ago
[-]
> At least pretend to not be a deep fake porn generator

Why? Like, people are going to do it anyway. Whether in their minds or with some tool. Public or behind closed doors. That genie is very firmly out of the bottle, and it’s not going to go back in. At least when it’s on Civit it’s abundantly clear it is fake content.

You can run any of these models on a consumer GPU.

reply
throwanem
12 days ago
[-]
I'll credit this argument when I see you publish a training set of selfies, or a model finetuned thereon.

If you want to argue no one should expect to have anything to hide, you'd better do it from a glass house and in the altogether if you mean to be taken seriously, don't you think? It is a radical departure. The example is therefore necessarily yours to set, else you show yourself a hypocrite.

reply
Aeolun
12 days ago
[-]
Just because I wouldn’t have an issue with random people generating porn of me, doesn’t mean I want to seek it out, see it, or facilitate it.

It’s fundamentally no different from acknowledging that there may be a fanclub on the other side of the country with a stash of aeolun porn. As long as they’re not using it to advertise their services, and don’t tell me what kind of degenerate things they’re doing with it, what does that hurt me?

Now that you’ve said it I’m actually mildly curious what would happen if I fine-tune a model purely on myself. But what would I do with it even if I made it? I can’t post it on Civit any more can I?

reply
throwanem
12 days ago
[-]
I assume you could use it to produce generated images featuring yourself, which seems fine to me.

But we make laws for the citizenry, not for you and me in particular. What news since diffusion models reached consumers suggests the citizenry can be trusted with tools this powerful? You can't buy a howitzer at Home Depot, either. These things are dangerous. I believe I recall hearing that people have died. Making the tools require some effort and skill to find seems warranted.

Or did you think they cease to exist for no longer being so openly hosted? I'm sure such things are already traded in secret, and any traffic analyst with a sufficiently privileged perspective no doubt today knows much more about where and by whom.

reply
jazzyjackson
12 days ago
[-]
Because we live in a society and so get to have some influence on what behaviors we deem acceptable from our peers, and creating sexual images with the faces of your neighbors, acquaintances and even your foes is considered by some of us unacceptable, degenerate behavior

Just because something is possible doesn't mean it has to be allowed to happen

reply
pseudo0
12 days ago
[-]
They always had a pretty strict policy against mixing any NSFW content with depictions of real people. Anything tagged as a real person would get run through an NSFW-detection API and would be automatically blocked if it failed. So no, no one was posting deepfaked nudes.

Meanwhile I used a couple LoRAs to make JD Vance memes when that was the craze a few weeks ago, and now those are getting nuked... No fun allowed, thanks payment processors.

reply
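[Ed.: the gating logic described in this subthread (real-person tag plus an NSFW score from a classifier) reduces to a simple predicate. A minimal sketch; the tag names, scores, and threshold are hypothetical, not CivitAI's actual system.]

```python
def should_block(tags, nsfw_score, threshold=0.2):
    """Block content depicting a real person whose NSFW score
    exceeds a deliberately strict threshold."""
    is_real_person = "real_person" in tags
    return is_real_person and nsfw_score > threshold

# A real-person image with even a mild NSFW signal is blocked...
blocked = should_block({"real_person", "celebrity"}, 0.35)
# ...while a fictional character passes this particular filter
# regardless of score (NSFW fiction was policed separately).
allowed = not should_block({"anime", "character"}, 0.9)
```

As the dispute in the replies illustrates, the hard part is not the predicate but the classifier: a score-based filter tuned to flag cleavage can still miss "suggestive but clothed" content, and misses entirely when the depicted person isn't tagged.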
godelski
12 days ago
[-]
Regardless, I still saw it happen quite a lot. Maybe I was seeing the images under different LoRAs, but I'm extremely confident I saw celebrities in bikinis or other sexualized settings. So let's be serious: any sexualized setting, regardless of nudity, is making a certain implication. And sorry, but your NSFW detector is not going to catch that stuff (and it is going to frequently fail even on nudes).

And did they apply the detector uniformly? Like, you get the detector on Emma Watson, but does it apply to Hermione Granger? Because I definitely saw that situation recurring. It seemed to be extremely common with cartoon characters, especially given the explicit nature of PonyDiffusion, mixing anime generation and realism. Just turn off the NSFW filters and go look at the top models and top LoRAs. Nude content dominates the charts.

Let's also be clear, there is a gender bias. I'd wager quite a lot of money that there are far fewer JD Vance nude deep fakes than many female celebrities.

reply
pseudo0
12 days ago
[-]
They had the NSFW detection dialed up to the point where it would flag a bit of cleavage as inappropriate for real person depictions. They were very on top of this issue and really wanted to avoid any bad press over "deepfakes".

Characters are a bit tricky. For example, Lara Croft could refer to the fictional video game character, or the character in the movie played by Angelina Jolie. They allowed the former, but not the latter. Or a model trained on the concept of "Hermione Granger" from fan art would be allowed, but not a LoRA trained on Emma Watson from the movies. That seemed like a pretty reasonable position to take.

reply
godelski
11 days ago
[-]
I'm not sure what you're going on about. I logged in, and the third model (the first non-anime model) is CyberRealistic Pony. Without the filters, practically every example is NSFW (>9/10 of the images). Are you trying to say that these images are not frequently based on real people?

I don't mean just trained on. I mean there's a ScarJo lookalike front and center.

If you're expecting the LoRA attached, I guess you can go with one of those "AI Characters", and that's at least feigning compliance. But come on, you can hide the LoRAs that were used too, and the models themselves ingest a lot of porn photos and celebrity photos. You really think this is going to stop someone from just describing the celebrity and getting their nude? Don't be so dense. You aren't fooling anyone here.

reply
pseudo0
11 days ago
[-]
I can't find the ScarJo image you are referring to, so it either looks nothing like her, or it got reported and removed. Yes, people use NSFW models to generate NSFW content... What a shocking discovery.

What exactly are you proposing here that CivitAI didn't do already? Yes, people can scrub metadata from images. This is the exact same issue that image hosts have faced for decades - people can upload abusive or illegal images. The solution has been to do automated filtering, and have users report content that slips through. The solution is not to ban Photoshop because some people use it to make fake nudes.

reply
rustcleaner
12 days ago
[-]
Hopefully these LoRAs invigorate bittorrent again.
reply
throwanem
13 days ago
[-]
> This applies to any depiction of a real person, regardless of context or rating - including PG and PG-13 content. Whether it’s a public figure, celebrity, influencer, or private individual, if a model or image is based on a real human being, it will be removed. This includes fan-art depictions of characters portrayed by a celebrity - e.g. Indiana Jones (Harrison Ford).

It is the second or third paragraph in the link.

reply
duxup
12 days ago
[-]
The way they word it each time "real person" comes up, I'm honestly unsure. I'll assume it means someone who exists in real life.

But ... won't that happen accidentally with AI depictions?

reply
throwanem
12 days ago
[-]
I assume they mean any natural person, and whether it happens accidentally (as opposed to "accidentally") does not appear at issue here.

You can probably get Stable Diffusion to violate likeness rights if you try hard enough, but there is a material difference between that and publishing a tool making it trivial for anyone to do so. These are, or I suppose were, such tools: LoRAs specifically trained to replicate the appearance of specific natural persons. For public figures the rules around one's likeness are slightly different, but not absent. I can't imagine anyone sensible hosting that stuff for one second, Section 230 or no. But either way, however long this was going on, only a fool would assume it would last forever.

reply
Waterluvian
13 days ago
[-]
It’s ambiguous but I have a sense that it’s the latter. The former would be quite the change.
reply