|
frumpykvetchbot posted:Requested a rat innkeeper holding up a key

haha that goat rules. I have done a lot of goat generations... Man, going from 50 a day to 15 a month is crazy. I get that they need to make money, but dang.
|
# ? Jul 23, 2022 03:48 |
|
|
|
sporklift posted:haha that goat rules. I have done a lot of goat generations... Man. Going from 50 a day to 15 a month is crazy. I get that they need to make money but dang.

I somehow missed my email from a few days before they changed the limit; I could have been generating 50 a day. I wish buying tokens was a better deal: $15 buys you about two days' worth of the old limit. I had been looking forward to really screwing around with it, but after playing a bit I feel like you really have to work through a bunch of iterations on your prompt to get what you're looking for. Lots of beautiful pictures that are just absolutely not what I asked for. With this limit I can make maybe a couple of worthwhile things a month.
|
# ? Jul 23, 2022 04:03 |
|
TIP posted:I somehow missed my email from a few days before they changed the limit, I could have been generating 50 a day

Yeah, this sucks. I've no idea how much the compute actually costs to run. And it seems that many people have been getting their invites these past few weeks, so maybe this new pricing structure makes public access self-funded and they can scale it faster in the future. I hope they come up with a charity/grant/pool system so, for example, in this thread we could pass around a few spare tokens for communal use or something.

random prompt: "rose machine spaghetti lighter art". A rose machine is a specific thing, and the spaghetti was probably a mistake, but I like it; sort of a record-cover-like composition.
|
# ? Jul 23, 2022 04:35 |
|
modular synthesizer for cats, by Ralph Steadman, ink on vellum, 1971
|
# ? Jul 23, 2022 05:11 |
|
|
# ? Jul 23, 2022 05:43 |
|
I know absolutely nothing about how these programs work, so I'm curious, is there something that makes it not function at all on consumer-grade hardware as opposed to just running slowly? Like, if you've got a home computer with 10% of the processing power of the official machines, can it generate an image if given 10x as long to work at it?
|
# ? Jul 23, 2022 13:16 |
|
Kylaer posted:I know absolutely nothing about how these programs work, so I'm curious, is there something that makes it not function at all on consumer-grade hardware as opposed to just running slowly?

No, but also yes. You can run it on any old computer, but without a modern graphics card it will be extremely slow and practically useless. And even if you have a good graphics card, there is another major issue: the code is written in such a way that the graphics card needs a lot of memory, which consumer-level cards don't have. The better the AI, the more memory your graphics card needs to even run it. A simple version like Dall-E mini/Craiyon runs on consumer-level cards, but something like Dall-E 2 very likely wouldn't. You could theoretically rewrite the code to only use a small amount of memory at a time so anyone could run it, but that would make it extremely slow again, so nobody ever bothered.

Renting graphics card time on server farms has become very easy and cheap nowadays; almost any professional would be able to set something up if they ever made Dall-E 2 public. Making the technology available to people is not a hardware issue, only one of intellectual property rights and getting your hands on the AI "code".

GABA ghoul fucked around with this message at 13:40 on Jul 23, 2022 |
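The memory-for-speed trade described above can be sketched in miniature: instead of holding a whole weight matrix resident at once, stream it through in small blocks, so peak memory is bounded by the block size at the cost of extra passes. A toy illustration only, not how any real implementation does it:

```python
def chunked_matvec(weight_rows, x, chunk=2):
    """Compute W @ x while keeping only `chunk` rows of W "loaded" at a time.

    The full result is identical to the all-at-once product; you just pay
    for it in repeated loads instead of peak memory.
    """
    out = []
    for i in range(0, len(weight_rows), chunk):
        block = weight_rows[i:i + chunk]  # only this block is resident
        out.extend(sum(w * v for w, v in zip(row, x)) for row in block)
    return out

W = [[1.0, 0.0], [0.0, 1.0], [2.0, 3.0]]
x = [4.0, 5.0]
print(chunked_matvec(W, x))  # -> [4.0, 5.0, 23.0]
```

Real frameworks do the same thing at layer granularity, shuttling weights between system RAM and the GPU, which is exactly the slowdown the post mentions.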
# ? Jul 23, 2022 13:38 |
|
frumpykvetchbot posted:Yeah this sucks. I've no idea how much the compute actually cost to run. And it seems that many people have been getting their invites these past few weeks. Maybe this new pricing structure makes public access self-funded so they can scale it faster in the future.

No idea about Dall-E 2, but I'd guess the model is probably pretty massive and wouldn't fit on cards most people have. The most popular cards seem to be the 1060, 1650, 1050 Ti and 2060, and the last of those has the most memory at 6GB, which is somehow less than my 1070.
|
# ? Jul 23, 2022 13:47 |
Same, been waiting since the first week of June for an invite. Running it on local hardware doesn't seem like the way things will go, and I worry about a future where AI becomes common but access and development are limited and controlled by a few giant corporations.
|
|
# ? Jul 23, 2022 14:02 |
|
GABA ghoul posted:No, but also yes. You can run it on any old computer, but without a modern graphics card it will be extremely slow and practically useless. But even if you have a good graphics card there is another major issue. The code is written in such a way that the graphics card needs a lot of memory, which consumer level cards don't have. The better the AI, the more memory your graphics card needs to even run it. A simple version of Dall-e mini/CrAIyon runs on consumer level cards, but something like Dall-e 2 very likely wouldn't. You could theoretically rewrite the code to only use a small amount of memory at a time so it can be run by everyone, but that would make it extremely slow again, so nobody ever bothered.

Makes sense, thank you for explaining.
|
# ? Jul 23, 2022 14:09 |
|
Does anyone know how much space Dall-E 2 uses for its brain/knowledge? Not the training data, but the things it learned and uses to paint pictures. I'm bad at googling.
|
# ? Jul 23, 2022 14:59 |
|
Dall-E 2 is 3.5 billion parameters, which in the common 32-bit floating point representation is 14 gigabytes. So you need at least that plus working space. And that might be just for the 64x64-pixel base image generation; the upscaler requires some space too. (It could be encoded as 16-bit floats and use 7GB instead, but the generation quality would be reduced a bit.)
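The arithmetic behind those figures is straightforward: parameter count times bytes per parameter. A quick check, taking 1 GB as 1e9 bytes and the 3.5-billion figure from the post above:

```python
def model_size_gb(params, bytes_per_param):
    """Rough footprint of the weights alone, in GB (1 GB = 1e9 bytes)."""
    return params * bytes_per_param / 1e9

# DALL-E 2's reported base generator size: ~3.5 billion parameters
print(model_size_gb(3.5e9, 4))  # fp32 -> 14.0
print(model_size_gb(3.5e9, 2))  # fp16 -> 7.0
```

This counts only stored weights; activations, the text encoder, and the upscalers all need memory on top of that.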
|
# ? Jul 23, 2022 15:28 |
|
ymgve posted:Dall-E 2 is 3.5 billion parameters, which if you use the common 32 bit floating point representation is 14 gigabytes. So you need at least that plus working space. And that might just for the 64x64 pixel base image generation, the upscaler requires some space too.

So you're saying there's a chance:
|
# ? Jul 23, 2022 15:31 |
|
RabbitWizard posted:Does anyone know how much space Dall-E-2 uses for the brain/knowledge? Not the training data, but the things it learned and uses to paint pictures. I'm bad at google.

OpenAI haven't announced any direct figures, but based on the model size and the data from the knock-off versions, the estimate is 64 GB of memory, which puts it into Amazon P3 / V100 cluster territory to run. A search for "dall-e 2 hardware requirements" will turn up some discussions.
|
# ? Jul 23, 2022 15:33 |
|
Henry Scorpio posted:OpenAI haven't announced any direct figures, but based on the model size and the data from the knock off versions, the estimate is 64 GB of memory. Which puts it into Amazon P3 / V100 cluster territory to run.

That estimate seems to be for Dall-E 1, which has 12 billion parameters. With some tricks like using 16-bit floating point it could be possible to get Dall-E 2 running on consumer hardware, though it might be pretty slow. Craiyon/Dall-E Mini uses the 16-bit trick to get its model size down to 5GB, so it runs pretty well on 12GB cards.
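The "16-bit trick" is just storing each weight in half precision, so every parameter costs 2 bytes instead of 4. Python's standard struct module can show the width difference directly (a toy demonstration, not Craiyon's actual code):

```python
import struct

# Pack the same weight value in single precision vs half precision.
fp32 = struct.pack("<f", 0.5)  # IEEE 754 single: 4 bytes
fp16 = struct.pack("<e", 0.5)  # IEEE 754 half:   2 bytes

print(len(fp32), len(fp16))  # -> 4 2
```

Half precision keeps only ~3 decimal digits of mantissa, which is why the post above notes a small quality hit when models are stored this way.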
|
# ? Jul 23, 2022 15:54 |
|
mobby_6kl posted:No idea about Dall-E 2 but, I'd guess the model is probably pretty massive and wouldn't fit on cards most people have. The most popular cards seem to be the 1060, 1650, 1050 Ti and 2060 and the latter has the most memory at 6GB, which is somehow less than my 1070.

I have twin 3090s in my maxed-out render station, I hope I can host it locally on that. The whole room heats up when I run sims on it, though... seriously thinking this is my last heap of local compute.

quote:a small man finally surrendering to cloud stuff, color illustration by jean-giraud moebius
|
# ? Jul 23, 2022 16:47 |
|
GABA ghoul posted:Renting graphic cards time on server farms has become very easy and cheap nowadays. Almost anyone professional would be able to set something up, if they ever made dall-e 2 public. Making the technology available to people is not a hardware issue, only one of intellectual property rights and getting your hands on the AI "code".

NDm A100 v4 Azure VMs, powered by eight 80 GB NVIDIA Ampere A100 GPUs, are Azure's flagship Deep Learning and Tightly Coupled HPC GPU offering for highly demanding workloads. At a mere $23,000 a month, or $10,000 a month if you sign up for a 3-year contract.
|
# ? Jul 23, 2022 19:03 |
|
mobby_6kl posted:I posted earlier an announcement that they're opening it up more now. But I'm still waiting since early June

I applied in April when it was first announced, and I got my invite only two days ago.

quote:the long wait was over, editorial newspaper cartoon by George Herriman, 1915
|
# ? Jul 24, 2022 02:15 |
|
Yeah, I got in a couple of days ago and now I'm living the dril tweet about candles, but for Dall-E 2 credits.

Dracula poppin' a wheelie (19th century copperplate etching (it doesn't know what a copperplate etching is))
Soviet propaganda poster about how using a computer to post on the internet is a revolutionary act
Rejected Tesla bots
My avatar
Portrait photo of an alien
Irish Republican mural featuring Kermit
Vintage Chinese ad for McDonald's
My buddy's D&D character
Miraculously resurrected turkey
Arthur Rackham illustration of capybaras taking a pigeon for a walk
Horus as a wrestler dude

This was an interesting one showing how you can get different moods of realistic portraits by swapping out portrait photographers with distinctive styles: "Candid portrait photograph of Sumerian merchant Ea-nasir, taken by:"
Dorothea Lange
David LaChapelle
Steve McCurry
Annie Leibovitz
Yousuf Karsh
Eric Lafforgue

"a detailed 16th century oil painting in the style of Raphael of Jesus Christ appearing in a Native American settlement with tipis in the background, but Jesus accidentally dressed up as an Asian Indian by darkening his skin and putting a bindi on his forehead, to the disgust of the Native Americans"

Please help me budget this. My family is dying.
|
# ? Jul 24, 2022 03:34 |
|
I wonder how it made the connection to ethnicity with Ea-nasir. I would think any picture descriptions that mention him or "Sumerian" would be of clay tablets, with no actual drawings or paintings of people.

edit: Craiyon becomes too focused on the photographer and just makes variants of their famous photos; it's a little better with no photographer specified.

ymgve fucked around with this message at 03:47 on Jul 24, 2022 |
# ? Jul 24, 2022 03:39 |
|
Mola Yam posted:Sumerian merchant

these are really cool
|
# ? Jul 24, 2022 03:40 |
|
Party Ape posted:NDm A100 v4 Azure VMs, powered by eight 80 GB NVIDIA Ampere A100 GPUs, are Azure's flagship Deep Learning and Tightly Coupled HPC GPU offering for highly demanding workloads.

Google Colab is $10 a month and probably sufficient for the majority of professionals. Even your example of a totally overkill $10,000 high-end config comes out to only $0.029 per minute, per GPU. Add a 300% markup for an operator middleman to account for profit and underutilization and you are still orders of magnitude cheaper than human labor.

GABA ghoul fucked around with this message at 09:09 on Jul 24, 2022 |
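The per-minute figure follows directly from the quoted Azure price (a sketch assuming a 30-day month and the $10,000/month 3-year rate for the 8-GPU machine):

```python
monthly_cost = 10_000    # USD, 3-year reserved rate quoted above
gpus = 8                 # A100s per NDm A100 v4 machine, per the quote
minutes = 30 * 24 * 60   # minutes in a 30-day month

per_gpu_minute = monthly_cost / minutes / gpus
print(round(per_gpu_minute, 3))  # -> 0.029
```

Note this assumes 100% utilization around the clock; idle time is exactly why the post adds a markup for a middleman operator.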
# ? Jul 24, 2022 09:07 |
|
GABA ghoul posted:Google Colab is $10 a month and probably sufficient for the majority of professionals.

Sure, any company will absolutely be able to buy the compute to run it internally (or just license it from the Dall-E folks for money), but making Dall-E free for the entire world forever isn't going to be feasible until costs come down much more.
|
# ? Jul 24, 2022 09:17 |
|
Party Ape posted:Sure, any company will absolutely be able to buy the compute to run it internally (or just licence it from the Dall-E folks for money), but making Dall-E free for the entire world forever isn't going to be feasible until costs come down much more.

Why would you aspire to make it free, though? What would the angle be? It's just a toy to play with for end consumers and a productivity tool for professionals. Not exactly a human rights issue.
|
# ? Jul 24, 2022 09:42 |
|
Also it costs millions to train the largest models
|
# ? Jul 24, 2022 09:45 |
|
GABA ghoul posted:Why would you aspire to make it free though? What would the angle be? It's just a toy to play with for end consumers and a productivity tool for professionals. Not exactly a human rights issue. I agree. People are just complaining that Dall E has started charging for access and its vaguely annoying because its like they've never heard of capitalism before.
|
# ? Jul 24, 2022 09:47 |
|
It's more that it seems great for meme generation, then you realize how much it costs right now and that it truly isn't viable.

That said, gaming RTX cards have a ton of memory; would it be that weird for gaming cards to have enough RAM to run a Dall-E 2 clone in 10 or 20 years? You'll probably be able to download trained states as well. We're at the point where they are using our generations to fine-tune things, and they'll milk it for all it's worth before local compute catches up. They've got lots of time, most likely.

Is the concept of a hobby AI card weird? That could get made too if costs reduce.
|
# ? Jul 24, 2022 10:43 |
|
pixaal posted:It's more it seems great for meme generation then you realize how much it costs right now and that it truly isn't viable That said gaming RTX cards have a ton of memory, would it be that weird for gaming cars to have enough RAM for a Dalle2 clone running on them in 10, 20 years?

I suspect that for things like Dall-E, the size of the data sets will grow alongside the abilities of the technology, but hobbyists can gently caress around with AI for quite a lot of things using COTS cards now. I'm in the process of trying to teach an AI to fly an aircraft towards a giant target (for example) on an RTX 3060.
|
# ? Jul 24, 2022 11:38 |
|
Party Ape posted:I agree. People are just complaining that Dall E has started charging for access and its vaguely annoying because its like they've never heard of capitalism before.

Yeah, at first glance it does seem kinda overpriced, but they are also recouping R&D costs right now. Once there is competition, the tech is obsolete, and models like it become public, it will probably be just a couple of cents for a batch of images. It's not like running this stuff at home would be free either: a 300W setup costs something like $0.03 in electricity alone for 20 minutes of runtime around here.

pixaal posted:It's more it seems great for meme generation then you realize how much it costs right now and that it truly isn't viable That said gaming RTX cards have a ton of memory, would it be that weird for gaming cars to have enough RAM for a Dalle2 clone running on them in 10, 20 years?

I mean, you can spend $1000 extra on an NVIDIA card with a shitton of memory that is completely useless for anything but stuff like CAD, animation or machine learning, and that is just gonna collect dust while you play games. Or you can pay $10 a month for a Google Colab subscription for the same effect.
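That electricity figure checks out under the stated assumptions (300 W draw, 20 minutes of generation) plus an assumed local rate of roughly $0.30/kWh:

```python
watts = 300
minutes = 20
price_per_kwh = 0.30  # assumed local electricity rate, USD

kwh = watts / 1000 * (minutes / 60)   # energy used: 0.1 kWh
print(round(kwh * price_per_kwh, 2))  # -> 0.03
```

At a cheaper rate like $0.12/kWh the same session would be closer to a cent, so the comparison with per-image cloud pricing depends heavily on local power prices.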
|
# ? Jul 24, 2022 11:48 |
|
pixaal posted:Is the concept of a hobby AI card weird? that could get made too if costs reduce.

It's probably just a matter of time until somebody comes up with a killer AI application that really appeals to a wide audience. (It's going to be porn.)
|
# ? Jul 24, 2022 12:16 |
|
Party Ape posted:I agree. People are just complaining that Dall E has started charging for access and its vaguely annoying because its like they've never heard of capitalism before.

They did sort of abuse the name "Open", from which you usually expect something open source. I was expecting something like the training method being available to run on your own system, but with none of the trained weights (which is where the real value is), or just the code that adds "black" or doesn't bother to encode spaces (that thing with the %20).
|
# ? Jul 24, 2022 12:51 |
|
Reach out touch faith your own personal Jesus
|
# ? Jul 24, 2022 13:45 |
|
GABA ghoul posted:I mean, you can spend $1000 extra on an NVIDIA card with shitton of memory that is completely useless for anything but stuff like CAD, animation or machine learning and that is just gonna collect dust while you play games. Or you can pay $10 a month for a Google Collab subscription for the same effect.

Well, one benefit of running something locally is that it's disconnected from the panopticon. You can be as horny as you want.
|
# ? Jul 24, 2022 14:06 |
|
Wheany posted:Well, one benefit of running something locally is that it's disconnected from the panopticon. You can be as horny as you want.

Huh, so you think every image generated is stored along with the prompt and user info? I guess it's naive to think that wouldn't be the case; it just seems like the output generated in most cases is probably meaningless.
|
# ? Jul 24, 2022 14:33 |
|
|
# ? Jul 24, 2022 14:46 |
|
A Strange Aeon posted:Huh, so you think every image generated is stored along with the prompt and user info? I guess it's naive to think that wouldn't be the case, it just seems like the output generated in most cases is probably meaningless.

Of course it's stored; that's golden data for use in future training of the AI.
|
# ? Jul 24, 2022 15:07 |
|
A Strange Aeon posted:Huh, so you think every image generated is stored along with the prompt and user info? I guess it's naive to think that wouldn't be the case, it just seems like the output generated in most cases is probably meaningless. I saw a post that said quote:Elon musk escaping a burning tesla
|
# ? Jul 24, 2022 15:12 |
|
Mola Yam posted:
drat you're way more creative than me. Wish I could donate my credits to you.
|
# ? Jul 24, 2022 17:11 |
|
De2 is very sensitive to prompts which might sound like they're describing something violent/NSFW. I once had "Charlie Kelly with a baseball bat" violate the ToS because the phrase "with a baseball bat" is violence-adjacent enough to warrant a ban.
|
# ? Jul 24, 2022 19:16 |
|
|
|
I thought the celebrity aspect would trigger the ToS. Have you tried "with a cricket bat"? Or maybe something like "up to bat" or "ready to swing a baseball bat". What about "with a mace" or "with a morning star"? I know there are plenty of pics with swords and whatnot, so I'm curious where the limit is.
|
# ? Jul 24, 2022 19:47 |