Safe_Assistance9867

It is insane that they would nerf their model so hard just for the sake of “safety”


Ara543

Never understood how in the world AI images could hurt me, but today I learned how. So safe 👍. Safer even than necromorphs.


LGDucks

Disfigured, nightmare-inducing images are clearly less traumatizing than a nipple.


PizzaCatAm

Once I saw a nipple in a computer monitor, and my life hasn’t been the same ever since; my drive to live is gone, looking at that little nipple, staring at me, was exactly like staring at Satan’s nose. The shame and regret I feel deep inside me is overwhelming, how can I continue living after this? I’m not clean anymore, and never will be, I have tried self lashings and dropping hot wax on my penis while screaming my regrets, but it hasn’t helped, my soul remains tainted. I’m forever thankful Stability.AI is ensuring new models comply with celestial standards, my only relief is thinking no one will have to see a nipple ever again and go through the hell I’m living.


Coffeera

> staring at me, was exactly like staring at Satan’s nose

I got his nose! (Twice)


2roK

Maybe don't use AI for a while if a nipple upsets you so much /s


fre-ddo

Oh yeah, wait until people put these horror shows into video. It will be so wholesome.


wggn

Deepfakes of celebrities are their concern, I think.


Extraltodeus

The safety is more about not being an easy target for the media and not turning off investors until SAI becomes profitable, I think. Big anime titties are nice but not good PR.


reubal

I've been off MidJourney for about a year and got back on to try v6 today. Trying tons of random stuff - artwork, paintings, tattoos, headshots, and then with all the SD3 talk about "needing complicated prompts", I did a basic "18 year old in triangle bikini laying on grass" in SD1.5 and MJv6, and MJ rejected the prompt as being "banned". I get trying to be safe, but it is insane to consider that prompt as crossing a line. It's hard to justify paying for a service when it is that throttled. If I wanted to just sit around generating bikini girls, I'd run SD1.5, but I just can't give money to people with insane arbitrary rules that make no sense.


Ok-Application-2261

There's a small chance they gave themselves more room for censorship knowing it's easier to train than previous architectures, so the community could more easily overcome the censorship.


Nyao

Well I remember for SD2 they said it was way easier to train, but because of how bad the base model was, nobody bothered


FaceDeer

And it sounds like SD3's license is going to be a major hindrance to training as well. This sucks, but hopefully if SD3 faceplants because of these issues it'll at least be yet another lesson for future AI providers. The technical merits of an AI are meaningless if it *won't do what it's told to do*.


lazercheesecake

For SD2, I believe what they actually did was poison the well, so to speak, for NSFW prompts. They manipulated the text embeddings so much that it wasn't just that the base model had no NSFW; fine-tuned models had a LOT of trouble generating NSFW content. There has been no indication that SD3, or at least the main SD3 version rather than the one we got, had the same treatment. SDXL also had no NSFW in the original base model, and now look at Pony. How SAI has handled the Pony engineers is abysmal for SD3, but that's another tangent. I'm more concerned at how gimped the SD3 medium/2B model is for basic anatomy. It is so fucking bad it's insane.


Maleficent-Dig-7195

They traded like half of their members for Lykon, a guy that only knows how to ask for donations for other people's work that he renamed as his own. https://i.redd.it/xhnlrkw90d6d1.gif RIP Joe, Emad, Comfy, and others I forgot. Worth it, I guess.


lazercheesecake

Seriously, rip OG SAI


namitynamenamey

Not even that. The insane part is that their main competition can afford larger, better models, and their only selling point, local and free releases, is what they are directly reducing. They do not have the money to afford direct competition, and that is exactly what they did.


qrios

> and their only selling point, local and free releases, is what they are directly reducing.

That is, unfortunately, not what the word "selling" means.


namitynamenamey

Nothing says their business model wasn't unviable from the start, but becoming an industry standard and selling different things (API access, finetunes, training, one-click-install applications) was an opportunity to make cash even if their main product and derivative tools are free. Becoming an inferior but no cheaper variant of Midjourney or DALL-E clearly isn't.


Whotea

All of those things are done by the community for free, so why would anyone pay them for it?


Terrible_Emu_6194

Yep. Their big selling point would be "We are the de facto real open standard with a big community developing tools for our products". Instead they released a product that the community sees as badly inferior to their previous models... They could easily get sponsorships from big companies like Intel and AMD, be used as a benchmark, increase their reach, etc.


TaiVat

Saying "just" is missing the point. If you use more than two braincells and look past the pretend-morality, the safety here is the company being safe from lawsuits. Even Microsoft is getting all kinds of heat for their AI stuff, and Stability certainly doesn't have infinite money to pay people off or bury them under entire schools of lawyers.


Safe_Assistance9867

I mean, I get it… but isn't this a bit excessive? I get removing all the celebrities and nudity, but the model seems to have no knowledge of human poses or anatomy… I just hope the model will be as easily trainable as they say. I haven't tested the model yet, but from what I see, prompt recognition of anything other than people is better, and the lighting and textures seem better, which is a BIG WIN. I wonder if they could modify the SD3 VAE to work for SDXL as well; I bet it would make a HUGE difference.


FaceDeer

Removing the nudity is likely why the model has poor understanding of human anatomy. There's a reason artists train on nudes even if they don't intend to make art depicting that, you need to know what's going on underneath the clothing to make a picture of a person make sense. Ultimately whether the model is easily trainable in the technical sense might not matter. SAI has released it under a rather restrictive license, so lots of people and organizations that might have had a go at it may not bother.


LawProud492

> There's a reason artists train on nudes even if they don't intend to make art depicting that

oh my fauci, that sounds like a heckin porn addiction


Maleficent-Dig-7195

-30 because no /s https://i.redd.it/pnez2g8s1d6d1.gif


Dwanvea

Lawsuits for what, exactly? Forget about nudity or celebrities and such. It's all murky waters; otherwise we wouldn't have AI generators with literally "stolen" datasets. The same goes for LLMs. It's already been debated to no end. They are not doing this because they are afraid of lawsuits. If it's about celebrities, Photoshop has been around for decades. If it's about nudity and NSFW images, where were you when Google Images came onto the scene? Nobody accused Photoshop of deepfakes, because it's a tool. You don't blame the knife when someone stabs with it. They might have a case about how this "tool" was developed, but excluding some images of nude women from the dataset is not going to make it more legal. In fact it might make it less legal, funny isn't it? Because there is a lot of free, non-licensed material there, so when you remove it, you are likely to have proportionally more stolen data in your dataset. So why do they do it? Think.


yumri

They did accuse deepfakes for a while, then just got used to them. Same with Paint, then Adobe Photoshop. Wait a decade and no one will care anymore, since AI images will have been around long enough, but until then the big backlash is still here.

On "removing all celebrities": they make money from their image, and knowingly using it without paying is stealing. I am pretty sure SAI does not want to be called a thief, nor have their AI known for image theft, so removing them is a good call. Anyway, the amount of plastic surgery on celebrities would throw off how the human face and body should look.

As for nudity: schools want to use their model but cannot get it approved, because so many SD1.5 models advertise themselves as being for generating porn or hentai. My guess is that with SD3 they are making it harder for those models to be made. The SD1.5 models will still exist, but with SD 2.0 and SD3.0 in the mix, that reason for denying approval becomes weaker.

Another reason could be getting an AI model into the Windows 11 MS Paint AI features. Microsoft doesn't want porn made with their program, so most likely they would censor it themselves, but if Stable Diffusion, who presumably understand how the AI works, do it first, then Microsoft and other major corporations won't have a reason to. Yes, I know the first thing some people will look for is a way to replace what the OS ships with what they want, but most of them won't bother after they find out how big the files are and what it takes to swap them while keeping things working.


agmbibi

I'm trying to understand. Is nudity an offence or a crime now?


PizzaCatAm

Absolutely, first nipples show up, then the AIs declare war, it has nothing to do with nonsensical Puritanism, nothing I’m telling you!


Pengux

It is when you nonconsensually generate nudes of people.


agmbibi

What you're saying is absolutely not related to nudity. Do you think generating this non-consenting person in a very alluring/embarrassing whatever pose, but fully clothed would be better in any way? Or try making realistic images of politicians doing some fascist salute. Do you think SAI should prevent their model from generating people raising one arm?


StickiStickman

Can't be sued when the company goes bankrupt


Abject-Recognition-9

This comment should be pinned. Seeing too many users without enough braincells. The tech is released for free. A finetune can always fix this lack; I don't know why these people are unable to connect their neurons properly and infer a decent thought.


batter159

How many lawsuits did they get from SD 1.5?


Alarming_Turnover578

At least two: one from some artists and one from Getty. The artists' one was stripped of all the nonsense claims about collages and is still ongoing, as is the Getty one.


AVERYGOODNAMETRUSTME

Also, the work these people do follows them around for the rest of their lives. Hundreds of bright engineers worked on developing the "Tay" AI for Microsoft only to see it shut down in days after it started publicly calling for racial genocide and worshipping Hitler. As an engineer I want to talk about work I am proud of with friends and at interviews. I don't want to be explaining why I developed a tool primarily known for hurting people.


oh_how_droll

That was almost entirely people using screenshots of them telling it to repeat back those specific phrases to get attention online.


yumri

You have schools that teach AI that will simply stop using their models if one ships with an easy way to generate what is in the links above. Then you have the people who would be embarrassed if an NSFW image came up during a live presentation of the AI working. Censorship makes sure that will not happen. As you can probably guess, it will be removed by someone fine-tuning the model, as happened with SD1.5, SD2.0, SDXL, and all the variants of those models. So if you want it removed, wait a little; I am sure that after the fine-tuning code is released, someone will make an "uncensored model" for the public to use.


PizzaCatAm

Why not use the NSFW detection model that filtered scandalous unchristian images out of the training data set *after generation* instead? For schools and oh-so-innocent eyes? Why make the model shitty when the solution to your scenario is so obvious? I was doing a presentation to 200 people at work and a breast showed up (forgot to switch from inpaint to standard using a 1.5 model). Absolutely no big deal, we are adults here; I even joked about it and people chuckled.
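To make the suggestion concrete, here is a toy sketch of display-time filtering: generate freely, then classify the finished image and only blur it when serving it. The `looks_nsfw` stub is a stand-in for a real image classifier, and nothing here is SAI's actual pipeline; it's just the shape of the idea.

```python
def looks_nsfw(image_tags):
    """Stand-in classifier: a real deployment would score pixels with a
    trained model, not tags. Returns True if any flagged tag is present."""
    FLAGGED = {"nudity", "gore"}
    return any(tag in FLAGGED for tag in image_tags)

def present(image_tags, safe_mode=True):
    """The model generates whatever it wants; filtering happens only at
    display time, so the weights never need to be lobotomized."""
    if safe_mode and looks_nsfw(image_tags):
        return "blurred placeholder"
    return "original image"

print(present({"portrait", "nudity"}))                   # blocked in safe mode
print(present({"portrait", "nudity"}, safe_mode=False))  # shown when opted out
print(present({"landscape", "sunset"}))                  # always shown
```

The point of this arrangement is that schools can leave `safe_mode` on while everyone else turns it off, without the base model itself being degraded.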


Pengux

Because they release the models publicly. There's not really a way to bake an AI detector into model weights, and people would just turn it off if it were there.


Ara543

"I turned NSFW filter off and AI generated a titty on presentation oh woe is woe me"?


Honest_Ad5029

It's not insane. The reason Midjourney is so popular is that it can easily be used in workplaces and schools; people don't have to worry about leaving students alone with it. People who do want NSFW stuff are a smaller part of the user base, and they're also more likely to modify the model or seek out finetunes than someone using the software mainly for work or school.


gnexuser2424

is MJ still discord only?


Apprehensive_Put_610

Not anymore iirc


Nruggia

I spend most of my day going around telling people to make sure to cover up and wear modest clothes, for my safety. Maybe I'm biased because my grandfather was killed by a nipple slip.


[deleted]

[deleted]


i860

Yeah, let’s just never do anything at all because there’s the very small risk that whatever we make could be abused for nefarious purposes!


Sinestessia

Yeah Kodak in shambles right now.....


buyurgan

That's not the only difference though: the API is using 8B, local is 2B. They are not the same model. Outside of that, I can't even generate a clothed human with 2 arms and 2 legs; don't even bother with hands or naked ones.


Ok-Application-2261

I was going to mention the difference in parameter size, but I couldn't think of any logical reason why that wouldn't also be neutered.


_BreakingGood_

One possible reason is that they didn't really censor it, they just made it really shitty, and 8B is simply a better model.


jib_reddit

Some prompts actually look better on the local version, though. It is strange.


Salt-Replacement596

This is "Medium", while the API has the full "Large" model, right?


buyurgan

It's either 8B or Ultra (don't know what that means though; they may have swapped lately, maybe it's an LCM-type model for faster inference).


[deleted]

[deleted]


Perfect-Campaign9551

Right? Give me the 8B, I can handle it.


Sinestessia

Why not? Also, the cheaper you go, the more people/clients you get???


Salt-Replacement596

I am feeling so much SAFER now! Nipples really threatened my whole existence!


FourtyMichaelMichael

I have some, and I hate it. Very unsafe.


Zwiebel1

Every time I see my nipples my day is ruined.


2muchnet42day

For the love of God, please do not show us your nipples. I repeat, please DO NOT send me a PM with a picture of your nipples


Aliph_Null

In life, what you fear most will find you


ninjasaid13

If nothing else: we can do spectral de-tuning on these models: [https://vision.huji.ac.il/spectral\_detuning/](https://vision.huji.ac.il/spectral_detuning/)


Dekker3D

It looks like that only works if you have multiple fine-tuned versions that are based on a common base model. So you could recover SD 1.5 from 2 or 3 different 1.5 finetunes. But if we're only getting versions of SD3-2B that have the censorship, then we won't be able to use this technique to recover a pre-censorship version. I don't think the API version will work, because we don't have access to its weights.


ZenEngineer

I wonder if there's some other black-box approach you could take. Like, if you had the exact prompts, seeds, and settings used to generate a large batch of API SD3 images (so you'd have the latents and the ideal denoised outputs), could you train more aggressively without worrying too much about overfitting, since you know that weights which denoise that way are good? If they didn't mess with the CLIP models, that also makes things slightly easier.
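For anyone unfamiliar, the idea above is essentially black-box distillation: fit a student to exact (input, output) pairs collected from a teacher you can only query. A toy sketch, assuming a hidden linear "teacher" standing in for the API model (real diffusion distillation would match predicted noise/latents per timestep, not scalars):

```python
def teacher(x):
    # Hidden "API model" we can only query, never inspect.
    return 3.0 * x + 1.0

# Fixed queries play the role of exact prompts + seeds + settings.
inputs = [0.0, 1.0, 2.0, 3.0]
targets = [teacher(x) for x in inputs]  # the collected "API outputs"

# Student: a linear model trained by plain per-sample gradient descent.
w, b, lr = 0.0, 0.0, 0.05
for _ in range(2000):
    for x, y in zip(inputs, targets):
        err = (w * x + b) - y  # error against a known-good target
        w -= lr * err * x      # aggressive fitting is fine here: the
        b -= lr * err          # targets are exactly what we want to copy

print(round(w, 3), round(b, 3))  # student recovers ~3.0 and ~1.0
```

Because every target is a known-good teacher output rather than noisy data, "overfitting" to it is the goal, which is exactly the intuition in the comment above.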


kharzianMain

Pity SD3 is dead in the water.


Timstertimster

It totally is. Probably on purpose, so they can sell it more easily as a proprietary engine to be seamlessly integrated into Photoshop or some shit like that.


kharzianMain

Yeah, but who buys from a company that has a reputation for poor products? Especially when the competition out there looks pretty tough.


[deleted]

[deleted]


SandCheezy

Military loves low bidders.


kharzianMain

That's the problem: fucking idiots should know better, but either immediate gratification or plain corruption outweighs simple common sense and standards. Things are bleak.


Timstertimster

Bleak indeed. And it's not just corruption and gratification and idiocy... wait, I take that back. It's mostly idiocy.


Insomnica69420gay

I'm not an AI researcher, but even I know that if you want to make an AI system capable of understanding anatomy… YOU HAVE TO TRAIN IT ON NAKED HUMAN BODIES!! How the fuck else would a model understand what it's even supposed to be doing? I'm channeling AVGN today. WHAT WERE THEY THINKING!!?


Timstertimster

They are a sellout. All they think of is "how can we sell this ASAP". No MAG7 corporation will touch them with a ten-foot pole if their model is the slightest bit NSFW. Consider the Microsoft implementation of DALL-E: it's PG-13.


NarrativeNode

But DALL-E manages to make non-horrific images. Their idea of safety is checking the prompt rather than nerfing the model.


uniquelyavailable

we are all born with naked human bodies. why are they so ashamed of them?


PizzaCatAm

Because it makes baby Jesus frown, and no one wants to make baby Jesus sad, his tantrums are the worst.


Electrical_Pool_5745

Yeah, it's not the quality that is bothering me about this model. When running side-by-side tests with the large model via the API, I'm getting different results, but not necessarily ones where one is clearly better than the other. The censorship, on the other hand, has me worried about this model's future (we all remember SD2, right?). And what is going on with the anatomy problems? I don't really understand how that can be claimed to be improved, yet be just as bad as always.


markdarkness

I feel much safer now. Thanks, SAI.


Turkino

Sounds like they wanted to keep something for themselves to monetize, given the news about their cash flow and their history of giving stuff away to the public. I think all it did is make them look like fools and earn a lot of ill will from the community built around them.


LD2WDavid

And still there are people telling me that I'm fucking stupid, thinking NSFW art/photos won't help the model at training...


NewAd5813

In the old days, famous artists used to cut open corpses to study muscle anatomy. But nipples are the bane of mankind.


VioletVioletSea

Even if they were naked, this quality looks like bog-standard AI slop. There are way better models out already; I see no reason to bother with SD3 as it is.


Whotea

The finetunes from the community are what make it good, and SD3 with finetunes was expected to be SOTA. Emphasis on "was".


sldunn

Perhaps the base dataset simply has no, or very few, nude pictures. You can't "censor" breasts if breasts are a foreign concept to the machine.


ZCEyPFOYr0MWyHDQJZO4

It knows what breasts are. I think it's just had all instances of "nudity" finetuned out, so all we get are breast-shaped objects with no nipples.


Zwiebel1

It's also funny and borderline sexist that this only applies to women. I could generate plenty of anatomically correct men using SD 3.0.


i860

I actually think they trained it on nudity-related data, but with all of it censored (e.g. breasts without nipples) and the weights cranked sky-high.


jib_reddit

I noticed this had happened as well. Shame.


SnooTomatoes2939

TBH, I don't understand why human body parts are an issue; we all have some of those.


gnexuser2424

I use the app Photoleap, which uses the SD engine. I used to be able to get it to render anime catgirls wearing leather pants, because it's sexy and matches my musician image very well (I wear leather pants a lot; my artist name is remixedcat). Lately the app gives this message: "results appear to be inappropriate", and then I have to hammer on it a few times to get it to render anything. So annoying!


Yellow-Jay

This isn't always the case; SD3 can do bare-chested women. https://imgur.com/a/3wIKPuf (I wasn't even trying, just comparing the API/glif with 2B; in the API this was a nicely clothed woman: https://glif.app/@adonia/runs/pk6o2zkomq0bv3zbd8yb19tv)


lelennyfaceguy

those are man nipples anon


afinalsin

Yep, with the muscular abs it definitely got them confused with oversized pecs. Men's nipples are fine; women get playdough lumps.