sporklift
Aug 3, 2008

Feelin' it so hard.

frumpykvetchbot posted:

Requested a rat innkeeper holding up a key

Got a rat innkeeper with a messed up enormous right hand.

Erased the right hand and asked for the same. Fixed.

Erased the key and asked for a candle instead.

Numerous variations later I ended up with a goat holding a candle.



haha that goat rules. I have done a lot of goat generations... Man. Going from 50 a day to 15 a month is crazy. I get that they need to make money but dang.

TIP
Mar 21, 2006

Your move, creep.



sporklift posted:

haha that goat rules. I have done a lot of goat generations... Man. Going from 50 a day to 15 a month is crazy. I get that they need to make money but dang.

I somehow missed my email from a few days before they changed the limit, I could have been generating 50 a day :negative:

I wish buying tokens was a better deal, $15 buys you about 2 days worth of the old limit :(

I had been looking forward to really screwing around with it but after playing a bit I feel like you really have to work through a bunch of iterations on your prompt to get what you're looking for

lots of beautiful pictures that are just absolutely not what I asked for

with this limit I can make maybe a couple worthwhile things a month

frumpykvetchbot
Feb 20, 2004

PROGRESSIVE SCAN
Upset Trowel

TIP posted:

I somehow missed my email from a few days before they changed the limit, I could have been generating 50 a day :negative:
I wish buying tokens was a better deal, $15 buys you about 2 days worth of the old limit :(

Yeah this sucks. I've no idea how much the compute actually cost to run. And it seems that many people have been getting their invites these past few weeks. Maybe this new pricing structure makes public access self-funded so they can scale it faster in the future.

I hope they come up with a charity/grant/pool system so for example in this thread we could pass around a few spare tokens for communal use or something.

random prompt:
"rose machine spaghetti lighter art"

a rose machine is a specific thing and the spaghetti was probably a mistake

but I like it

sort of a record cover like composition.

[image attachment]

frumpykvetchbot
Feb 20, 2004

PROGRESSIVE SCAN
Upset Trowel
modular synthesizer for cats, by Ralph Steadman, ink on vellum, 1971

[image attachment]

Mercury_Storm
Jun 12, 2003

*chomp chomp chomp*

:drat::drat::drat:

Kylaer
Aug 4, 2007
I'm SURE walking around in a respirator at all times in an (even more) OPEN BIDENing society is definitely not a recipe for disaster and anyone that's not cool with getting harassed by CHUDs are cave dwellers. I've got good brain!
I know absolutely nothing about how these programs work, so I'm curious, is there something that makes it not function at all on consumer-grade hardware as opposed to just running slowly? Like, if you've got a home computer with 10% of the processing power of the official machines, can it generate an image if given 10x as long to work at it?

GABA ghoul
Oct 29, 2011

Kylaer posted:

I know absolutely nothing about how these programs work, so I'm curious, is there something that makes it not function at all on consumer-grade hardware as opposed to just running slowly?

No, but also yes. You can run it on any old computer, but without a modern graphics card it will be extremely slow and practically useless. But even if you have a good graphics card there is another major issue. The code is written in such a way that the graphics card needs a lot of memory, which consumer level cards don't have. The better the AI, the more memory your graphics card needs to even run it. A simple version of Dall-e mini/CrAIyon runs on consumer level cards, but something like Dall-e 2 very likely wouldn't. You could theoretically rewrite the code to only use a small amount of memory at a time so it can be run by everyone, but that would make it extremely slow again, so nobody ever bothered.

Renting graphics card time on server farms has become very easy and cheap nowadays. Almost any professional would be able to set something up if they ever made Dall-E 2 public. Making the technology available to people is not a hardware issue, only one of intellectual property rights and getting your hands on the AI "code".
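
If anyone's curious what "only use a small amount of memory at a time" would actually look like, here's a toy PyTorch sketch of the idea (not Dall-E's real code, the names are made up): keep the weights in ordinary RAM and shuttle one layer at a time onto the graphics card. The constant copying back and forth is exactly why it gets so slow.

code:

import torch
import torch.nn as nn

@torch.no_grad()
def forward_with_offload(layers: nn.ModuleList, x: torch.Tensor) -> torch.Tensor:
    # Weights start out in ordinary CPU RAM instead of GPU memory.
    for layer in layers:
        layer.to("cuda")         # copy just this layer's weights to the GPU
        x = layer(x.to("cuda"))  # run only this layer
        layer.to("cpu")          # move it back, freeing the GPU memory again
    return x                     # slow: endless copying over the PCIe bus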

GABA ghoul fucked around with this message at 13:40 on Jul 23, 2022

mobby_6kl
Aug 9, 2009

by Fluffdaddy

frumpykvetchbot posted:

Yeah this sucks. I've no idea how much the compute actually cost to run. And it seems that many people have been getting their invites these past few weeks. Maybe this new pricing structure makes public access self-funded so they can scale it faster in the future.
I posted an announcement earlier that they're opening it up more now. But I've been waiting since early June :(


No idea about Dall-E 2, but I'd guess the model is probably pretty massive and wouldn't fit on the cards most people have. The most popular cards seem to be the 1060, 1650, 1050 Ti and 2060, and the 2060 has the most memory of those at 6GB, which is somehow less than my 1070.

Sedgr
Sep 16, 2007

Neat!

Same, been waiting since first week of June for an invite.

Running it on local hardware doesn't seem like the way things will go, and I worry about what a future looks like where AI becomes common but accessibility and development are limited to and controlled by a few giant corporations.

Kylaer
Aug 4, 2007
I'm SURE walking around in a respirator at all times in an (even more) OPEN BIDENing society is definitely not a recipe for disaster and anyone that's not cool with getting harassed by CHUDs are cave dwellers. I've got good brain!

GABA ghoul posted:

No, but also yes. You can run it on any old computer, but without a modern graphics card it will be extremely slow and practically useless. But even if you have a good graphics card there is another major issue. The code is written in such a way that the graphics card needs a lot of memory, which consumer level cards don't have. The better the AI, the more memory your graphics card needs to even run it. A simple version of Dall-e mini/CrAIyon runs on consumer level cards, but something like Dall-e 2 very likely wouldn't. You could theoretically rewrite the code to only use a small amount of memory at a time so it can be run by everyone, but that would make it extremely slow again, so nobody ever bothered.

Renting graphics card time on server farms has become very easy and cheap nowadays. Almost any professional would be able to set something up if they ever made Dall-E 2 public. Making the technology available to people is not a hardware issue, only one of intellectual property rights and getting your hands on the AI "code".

Makes sense, thank you for explaining.

RabbitWizard
Oct 21, 2008

Muldoon
Does anyone know how much space Dall-E-2 uses for the brain/knowledge? Not the training data, but the things it learned and uses to paint pictures. I'm bad at google.

ymgve
Jan 2, 2004


:dukedog:
Offensive Clock
Dall-E 2 is 3.5 billion parameters, which, if you use the common 32-bit floating point representation, is 14 gigabytes. So you need at least that plus working space. And that might just be for the 64x64 pixel base image generation; the upscaler requires some space too.

(It could be encoded as 16 bit floats and use 7GB instead, but the generation quality would be reduced a bit)
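
The arithmetic is just parameter count times bytes per parameter. Quick Python sketch if you want to play with the numbers (3.5 billion is the figure for the base model; real usage will be higher once you count activations, the text encoder and the upscalers):

code:

def weights_gb(n_params: float, bytes_per_param: int) -> float:
    """GB needed just to hold the weights at a given precision."""
    return n_params * bytes_per_param / 1e9

print(weights_gb(3.5e9, 4))  # 32-bit floats: 14.0 GB
print(weights_gb(3.5e9, 2))  # 16-bit floats: 7.0 GB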

LifeSunDeath
Jan 4, 2007

still gay rights and smoke weed every day

ymgve posted:

Dall-E 2 is 3.5 billion parameters, which, if you use the common 32-bit floating point representation, is 14 gigabytes. So you need at least that plus working space. And that might just be for the 64x64 pixel base image generation; the upscaler requires some space too.

(It could be encoded as 16 bit floats and use 7GB instead, but the generation quality would be reduced a bit)

So you're saying, there's a chance:

Henry Scorpio
Mar 20, 2006

Maybe it just collapsed on its own?

RabbitWizard posted:

Does anyone know how much space Dall-E-2 uses for the brain/knowledge? Not the training data, but the things it learned and uses to paint pictures. I'm bad at google.

OpenAI haven’t announced any direct figures, but based on the model size and the data from the knock off versions, the estimate is 64 GB of memory. Which puts it into Amazon P3 / V100 cluster territory to run.

A search for “dall-e 2 hardware requirements” will turn up some discussions.

ymgve
Jan 2, 2004


:dukedog:
Offensive Clock

Henry Scorpio posted:

OpenAI haven’t announced any direct figures, but based on the model size and the data from the knock off versions, the estimate is 64 GB of memory. Which puts it into Amazon P3 / V100 cluster territory to run.

A search for “dall-e 2 hardware requirements” will turn up some discussions.

That estimate seems to be for Dall-E 1, which has 12 billion parameters. With some tricks like using 16 bit floating point it could be possible to get Dall-E 2 running on consumer hardware, though it might be pretty slow.

Craiyon/Dall-E Mini uses the 16 bit trick to get its model size down to 5GB, so it runs pretty well on 12GB cards

frumpykvetchbot
Feb 20, 2004

PROGRESSIVE SCAN
Upset Trowel

mobby_6kl posted:

No idea about Dall-E 2, but I'd guess the model is probably pretty massive and wouldn't fit on the cards most people have. The most popular cards seem to be the 1060, 1650, 1050 Ti and 2060, and the 2060 has the most memory of those at 6GB, which is somehow less than my 1070.

I have twin 3090s in my maxed out render station, I hope I can host it locally on that.

The whole room heats up when I run sims on it, though ... seriously thinking this is my last heap of local compute.

quote:

a small man finally surrendering to cloud stuff, color illustration by jean-giraud moebius

[image attachment]

Party Ape
Mar 5, 2007
Don't pay $10 bucks to change my avatar! Send me a $10 donation to Doctors with Borders and I'll stop posting for 24 hours!

GABA ghoul posted:

Renting graphics card time on server farms has become very easy and cheap nowadays. Almost any professional would be able to set something up if they ever made Dall-E 2 public. Making the technology available to people is not a hardware issue, only one of intellectual property rights and getting your hands on the AI "code".

NDm A100 v4 Azure VMs, powered by eight 80 GB NVIDIA Ampere A100 GPUs, are Azure’s flagship Deep Learning and Tightly Coupled HPC GPU offering for highly demanding workloads.

At a mere $23,000 a month or $10,000 a month if you sign up for a 3 year contract.

frumpykvetchbot
Feb 20, 2004

PROGRESSIVE SCAN
Upset Trowel

mobby_6kl posted:

I posted an announcement earlier that they're opening it up more now. But I've been waiting since early June :(

I applied in April when it was first announced, and I got my invite only two days ago.

quote:

the long wait was over, editorial newspaper cartoon by George Herriman, 1915

[image attachment]

Mola Yam
Jun 18, 2004

Kali Ma Shakti de!
Yeah I got in a couple of days ago and now I'm living the dril tweet about candles, but for Dall-E 2 credits.

Dracula poppin' a wheelie (19th century copperplate etching (it doesn't know what a copperplate etching is))


Soviet propaganda poster about how using a computer to post on the internet is a revolutionary act


Rejected Tesla bots




My avatar


Portrait photo of an alien


Irish Republican mural featuring Kermit


Vintage Chinese ad for McDonald's


My buddy's D&D character


Miraculously resurrected turkey


Arthur Rackham illustration of capybaras taking a pigeon for a walk


Horus as a wrestler dude



This was an interesting one showing how you can get different moods of realistic portraits by swapping out portrait photographers with distinctive styles:
"Candid portrait photograph of Sumerian merchant Ea-nasir, taken by:"
Dorothea Lange

David LaChapelle

Steve McCurry

Annie Leibovitz

Yousuf Karsh

Eric Lafforgue


"a detailed 16th century oil painting in the style of Raphael of Jesus Christ appearing in a Native American settlement with tipis in the background, but Jesus accidentally dressed up as an Asian Indian by darkening his skin and putting a bindi on his forehead, to the disgust of the Native Americans"




Please help me budget this. My family is dying.

ymgve
Jan 2, 2004


:dukedog:
Offensive Clock
I wonder how it made the connection to ethnicity with Ea-Nasir. I would think any picture descriptions that mention him or anything Sumerian would be of clay tablets, not actual drawings or paintings of people.

edit: craiyon becomes too focused on the photographer and just makes variants on their famous photos


a little better with no photographer specified

ymgve fucked around with this message at 03:47 on Jul 24, 2022

LifeSunDeath
Jan 4, 2007

still gay rights and smoke weed every day

Mola Yam posted:

Sumerian merchant

these are really cool

GABA ghoul
Oct 29, 2011

Party Ape posted:

NDm A100 v4 Azure VMs, powered by eight 80 GB NVIDIA Ampere A100 GPUs, are Azure’s flagship Deep Learning and Tightly Coupled HPC GPU offering for highly demanding workloads.

At a mere $23,000 a month or $10,000 a month if you sign up for a 3 year contract.

Google Colab is $10 a month and probably sufficient for the majority of professionals.

Even your example of a totally overkill $10000 high-end config comes out to only $0.029 per minute, per GPU. Add a 300% markup for an operator middle man to account for profits and underutilization and you are still orders of magnitude cheaper than human labor.
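
For anyone checking my math, the per-minute figure comes straight from the quoted Azure price, assuming a 30-day month and (unrealistically) 100% utilization:

code:

monthly_cost = 10_000          # USD, the 3-year reserved price quoted above
gpus = 8                       # A100s per NDm A100 v4 node
minutes = 30 * 24 * 60         # minutes in a 30-day month

print(monthly_cost / minutes / gpus)  # ~0.029 USD per GPU-minute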

GABA ghoul fucked around with this message at 09:09 on Jul 24, 2022

Party Ape
Mar 5, 2007
Don't pay $10 bucks to change my avatar! Send me a $10 donation to Doctors with Borders and I'll stop posting for 24 hours!

GABA ghoul posted:

Google Colab is $10 a month and probably sufficient for the majority of professionals.

Even your example of a totally overkill $10000 high-end config comes out to only $0.029 per minute, per GPU. Add a 300% markup for an operator middle man to account for profits and underutilization and you are still orders of magnitude cheaper than human labor.

Sure, any company will absolutely be able to buy the compute to run it internally (or just licence it from the Dall-E folks for money), but making Dall-E free for the entire world forever isn't going to be feasible until costs come down much more.

GABA ghoul
Oct 29, 2011

Party Ape posted:

Sure, any company will absolutely be able to buy the compute to run it internally (or just licence it from the Dall-E folks for money), but making Dall-E free for the entire world forever isn't going to be feasible until costs come down much more.

Why would you aspire to make it free though? What would the angle be? It's just a toy to play with for end consumers and a productivity tool for professionals. Not exactly a human rights issue.

Wheany
Mar 17, 2006

Spinyahahahahahahahahahahahaha!

Doctor Rope
Also it costs millions to train the largest models

Party Ape
Mar 5, 2007
Don't pay $10 bucks to change my avatar! Send me a $10 donation to Doctors with Borders and I'll stop posting for 24 hours!

GABA ghoul posted:

Why would you aspire to make it free though? What would the angle be? It's just a toy to play with for end consumers and a productivity tool for professionals. Not exactly a human rights issue.

I agree. People are just complaining that Dall-E has started charging for access, and it's vaguely annoying because it's like they've never heard of capitalism before.

pixaal
Jan 8, 2004

All ice cream is now for all beings, no matter how many legs.


It's more that it seems great for meme generation, then you realize how much it costs right now and that it truly isn't viable. That said, gaming RTX cards have a ton of memory; would it be that weird for gaming cards to have enough RAM for a Dall-E 2 clone running on them in 10 or 20 years?

You'll probably be able to download trained states as well. We're at the point where they are using our generations to fine-tune things and then milk it for all it's worth before local compute catches up. They've got lots of time, most likely.

Is the concept of a hobby AI card weird? That could get made too if costs come down.

Party Ape
Mar 5, 2007
Don't pay $10 bucks to change my avatar! Send me a $10 donation to Doctors with Borders and I'll stop posting for 24 hours!

pixaal posted:

It's more that it seems great for meme generation, then you realize how much it costs right now and that it truly isn't viable. That said, gaming RTX cards have a ton of memory; would it be that weird for gaming cards to have enough RAM for a Dall-E 2 clone running on them in 10 or 20 years?

You'll probably be able to download trained states as well. We're at the point where they are using our generations to fine-tune things and then milk it for all it's worth before local compute catches up. They've got lots of time, most likely.

Is the concept of a hobby AI card weird? That could get made too if costs come down.

I suspect that for things like Dall-E, the size of the data sets will grow alongside the abilities of the technology, but hobbyists can gently caress around with AI for quite a lot of things using COTS cards now - I'm in the process of trying to teach an AI to fly an aircraft towards a giant target (for example) on an RTX 3060.

GABA ghoul
Oct 29, 2011

Party Ape posted:

I agree. People are just complaining that Dall-E has started charging for access, and it's vaguely annoying because it's like they've never heard of capitalism before.

Yeah, at first glance it does seem kinda overpriced, but they are also recouping R&D costs right now. Once there is competition, the tech is obsolete, and models like it become public, it will probably be just a couple of cents for a batch of images.

It's not like running this stuff at home would be free either. A 300W setup costs something like $0.03 in electricity alone to run for 20 minutes around here.
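
The math behind that figure, if you want to plug in your own numbers (the $0.30/kWh is what those numbers imply; swap in your own local rate):

code:

power_kw = 0.300        # a 300 W setup
hours = 20 / 60         # 20 minutes of runtime
price_per_kwh = 0.30    # implied local rate; swap in your own

print(power_kw * hours * price_per_kwh)  # ~0.03 USD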

pixaal posted:

It's more that it seems great for meme generation, then you realize how much it costs right now and that it truly isn't viable. That said, gaming RTX cards have a ton of memory; would it be that weird for gaming cards to have enough RAM for a Dall-E 2 clone running on them in 10 or 20 years?

You'll probably be able to download trained states as well. We're at the point where they are using our generations to fine-tune things and then milk it for all it's worth before local compute catches up. They've got lots of time, most likely.

Is the concept of a hobby AI card weird? That could get made too if costs come down.

I mean, you can spend $1000 extra on an NVIDIA card with a shitton of memory that is completely useless for anything but stuff like CAD, animation or machine learning and that is just gonna collect dust while you play games. Or you can pay $10 a month for a Google Colab subscription for the same effect.

Zopotantor
Feb 24, 2013

...und ist er drin dann lassen wir ihn niemals wieder raus...

pixaal posted:

Is the concept of a hobby AI card weird? That could get made too if costs come down.

It's probably just a matter of time until somebody comes up with a killer AI application that really appeals to a wide audience. (It's going to be porn.)

EVIL Gibson
Mar 23, 2001

Internet of Things is just someone else's computer that people can't help attaching cameras and door locks to!
:vapes:
Switchblade Switcharoo

Party Ape posted:

I agree. People are just complaining that Dall-E has started charging for access, and it's vaguely annoying because it's like they've never heard of capitalism before.

this did sort of abuse the name "Open", which usually makes you expect something open source.

I was expecting something like the training method being available to run on your own system, but with none of the training (which is where the real value is)

or just the code that adds "black" or doesn't bother to encode spaces (that thing with the %20)

Mola Yam
Jun 18, 2004

Kali Ma Shakti de!
Reach out



touch faith



your own



personal



Jesus

Wheany
Mar 17, 2006

Spinyahahahahahahahahahahahaha!

Doctor Rope

GABA ghoul posted:

I mean, you can spend $1000 extra on an NVIDIA card with a shitton of memory that is completely useless for anything but stuff like CAD, animation or machine learning and that is just gonna collect dust while you play games. Or you can pay $10 a month for a Google Colab subscription for the same effect.

Well, one benefit of running something locally is that it's disconnected from the panopticon. You can be as horny as you want.

A Strange Aeon
Mar 26, 2010

You are now a slimy little toad
The Great Twist

Wheany posted:

Well, one benefit of running something locally is that it's disconnected from the panopticon. You can be as horny as you want.

Huh, so you think every image generated is stored along with the prompt and user info? I guess it's naive to think that wouldn't be the case, it just seems like the output generated in most cases is probably meaningless.

Cartoon Man
Jan 31, 2004


ymgve
Jan 2, 2004


:dukedog:
Offensive Clock

A Strange Aeon posted:

Huh, so you think every image generated is stored along with the prompt and user info? I guess it's naive to think that wouldn't be the case, it just seems like the output generated in most cases is probably meaningless.

Of course it’s stored, that’s golden data for use in future training of the AI

Wheany
Mar 17, 2006

Spinyahahahahahahahahahahahaha!

Doctor Rope

A Strange Aeon posted:

Huh, so you think every image generated is stored along with the prompt and user info? I guess it's naive to think that wouldn't be the case, it just seems like the output generated in most cases is probably meaningless.

I saw a post that said

quote:

Elon musk escaping a burning tesla
It looks like this request may not follow our content policy.

Further policy violations may lead to an automatic suspension of your account.
:shrug:

wa27
Jan 15, 2007

Mola Yam posted:



Please help me budget this. My family is dying.

drat you're way more creative than me. Wish I could donate my credits to you.

KinkyJohn
Sep 19, 2002

Dall-E 2 is very sensitive to prompts that might sound like they're describing something violent/NSFW. I once had "Charlie Kelly with a baseball bat" violate the ToS because the phrase "with a baseball bat" is violence-adjacent enough to warrant a ban.

Scholtz
Aug 24, 2007

Zorchin' some Flemoids

I thought the celebrity aspect would trigger ToS.

Have you tried "with a cricket bat"? Or maybe something like "up to bat" or "ready to swing a baseball bat"

What about "with a mace" or "with a morning star"?

I know there are plenty of pics with swords and whatnot, so I'm curious what the limit is
