|
This is UNK. All craftsmanship is of the highest quality. It is studded with UNK and decorated with UNK. This object is adorned with hanging rings of UNK and UNK and menaces with spikes of UNK, UNK and UNK. On the item is an image of two trillion UNK in UNK.
|
# ¿ Jun 20, 2018 11:08 |
|
|
# ¿ Apr 29, 2024 18:14 |
|
Golden corral all you can eat buttholes
|
# ¿ Feb 13, 2019 20:41 |
|
Bruno_me posted:How does a frog know its mommy has given birth to it? that's a good question
|
# ¿ May 11, 2019 21:00 |
|
Synthbuttrange posted:https://techcrunch.com/2018/12/31/this-clever-ai-hid-data-from-its-creators-to-cheat-at-its-appointed-task/ computers are like the rear end in a top hat genie that grants you your wish but in some twisted way
|
# ¿ Jul 1, 2019 11:22 |
|
Sagebrush posted:Being bad at something you love would just be disappointing. which is why nobody ever gets good at anything #whoa #wow
|
# ¿ Sep 17, 2019 13:15 |
|
then again https://twitter.com/dril_gpt2/status/1207849349020913664
|
# ¿ Dec 21, 2019 13:12 |
|
post-scarcity dril
|
# ¿ Dec 21, 2019 22:45 |
|
I hate every hen i see, from chick-fil-a to chick-fil-b, You'll never make a sandwich out of me!
|
# ¿ Dec 29, 2019 22:35 |
|
you can fairly easily run some (GPT-like) transformers on your own computer, with open source tools and pretrained open models: https://huggingface.co/docs/transformers/index i have run the GPT Neo transformer from huggingface with this 1.3 billion parameter model: https://huggingface.co/EleutherAI/gpt-neo-1.3B. i have an rtx 2080 super with 8 GB of vram and it can barely run that model. the results are not flawless, but unreasonably good for something you can run at home, and easily good enough that if you curate the output, you can mine for comedy gold by feeding it Good Posts. for example, i fed the model this classic http://www.twitlonger.com/show/n_1so7lia and it kept the proposal:/reality: format, and while a lot of the text it generated was not something i want to post, i did laugh out loud at this one: quote:proposal: a forum for the most stupid, pointless, and ridiculous posts you've ever seen in your life i also tried feeding the model this bit: code:
JavaScript code:
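for reference, a minimal sketch of the kind of local generation described above, using the transformers pipeline API. the prompt and sampling settings here are made up for illustration, and "distilgpt2" is a small stand-in model so it runs on modest hardware; swap in "EleutherAI/gpt-neo-1.3B" for the model from the post:

```python
# sketch: local text generation with huggingface transformers.
# model name and settings are illustrative assumptions, not from the post.
from transformers import pipeline, set_seed

set_seed(0)  # make the sampling reproducible
generator = pipeline("text-generation", model="distilgpt2")
out = generator(
    "proposal: a forum for",   # prime it with the format you want continued
    max_length=40,
    do_sample=True,
    num_return_sequences=3,    # generate several candidates, then curate by hand
)
for o in out:
    print(o["generated_text"])
```

the curation step is the important part: generate a pile of candidates cheaply and keep only the funny ones.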
Wheany fucked around with this message at 19:57 on May 7, 2022 |
# ¿ May 7, 2022 19:54 |
|
you can try EleutherAI's 6 billion parameter model online here https://6b.eleuther.ai/
|
# ¿ May 7, 2022 19:58 |
|
The adventure where Punchy shows you cool things.
|
# ¿ May 7, 2022 21:58 |
|
Beeftweeter posted:jfc as fun as this may be it really makes me worried about the implications for the spread of believable disinformation I see you're going through the same arc i did. "holy poo poo this rules lol lmao -> oh gently caress"
|
# ¿ May 8, 2022 06:12 |
|
Wheany posted:I see you're going through the same arc i did. "holy poo poo this rules lol lmao -> oh gently caress" and also: at least with the page i linked and when using openai, in theory you can be rate limited, but i ran the 1.3B model on my own computer with 8GB VRAM. assuming the 6B model uses about 4-5x the memory (it has ~4.6x the parameters), you can still fit it in a single 32GB nvidia tesla v100. it costs ~$12k, which is not exactly pocket change, but also isn't completely unrealistic for a small operation or even a figgieland computer toucher
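that back-of-envelope math can be sketched like this. the bytes-per-parameter figures are rule-of-thumb assumptions (2 bytes in fp16, 4 in fp32), not measurements, and real usage adds overhead for activations:

```python
# rough VRAM estimate: parameter count times bytes per parameter.
# 2 bytes/param assumes fp16 weights; use 4 for fp32. overhead not included.
def vram_gb(n_params, bytes_per_param=2):
    return n_params * bytes_per_param / 1024**3

print(round(vram_gb(1.3e9), 1))   # 1.3B model in fp16
print(round(vram_gb(6e9), 1))     # 6B model in fp16
print(round(vram_gb(6e9, 4), 1))  # 6B model in fp32, still under 32 GB
```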
|
# ¿ May 8, 2022 06:57 |
|
i also ran https://github.com/CompVis/latent-diffusion. this was a lot harder to run and the results are very much in the "funy computer" category instead of the wizardry people are doing with dall-e. i mostly followed the instructions on that page to get the model running, with 2 exceptions: i had to install an older version of torchmetrics because i got an error ("ImportError: cannot import name 'get_num_classes' from 'torchmetrics.utilities.data'"): code:
code:
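the torchmetrics downgrade was probably something like this. the exact version is an assumption (get_num_classes was removed in later torchmetrics releases, so any release that still has it should do):

```shell
# hypothetical pin; the post doesn't say which version fixed the import error
pip install torchmetrics==0.6.0
```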
anyway, here are multiple runs of me trying to get it to generate some kind of a yospos hacker with different prompts (hacker with yospos on screen, computer hacker reading yospos, hacker in front of screen displaying yospos etc). if i use the word "reading", it really likes to make a book-like blob somewhere. it also really likes overlaying text on the image. 56k warning
|
# ¿ May 8, 2022 07:11 |
|
and someone on the yoscord requested "a man who hates bicycles" which resulted in a bunch of book covers:
|
# ¿ May 8, 2022 07:18 |
|
Beeftweeter posted:is there a web version of that kind of poo poo too for those of us with puny gpus? trying poo poo like "the tallest penguin in Dresden", "extremely unsettling mustaches", "celebrities with too many teeth", "girls that like fluoride way too much", "sludgy aerosols", "caterpillars that refuse to grow", "military surplus glow in the dark garden hose", etc. might be fun https://huggingface.co/spaces/multimodalart/latentdiffusion that web version only allows for 50 "steps", while the one i run locally can use up to 250. the number of steps improves the quality: Here is the prompt "a sign that reads yospos bithc" with 1, 5, 25 and 125 steps Wheany fucked around with this message at 07:50 on May 8, 2022 |
# ¿ May 8, 2022 07:46 |
|
i mean, look at this, what the gently caress: https://twitter.com/bakztfuture/status/1522046645864697857 https://twitter.com/PizzaDalle/status/1522591624814993408
|
# ¿ May 8, 2022 10:58 |
|
future of photoshop phridays: instead of doing the work yourself, you instead post "dog wearing shoes, attacking a postal worker. the image has a heading 'those are my shoes' and a subheading 'give them back, you are a dog' and a smaller subheading 'they don't even fit'"
|
# ¿ May 9, 2022 19:12 |
|
Pizza was okay before but i think apple perfected it
|
# ¿ Jun 2, 2022 07:16 |
|
|
from the buttcoin thread. Shageletic posted:Statue of Matt Damon, half buried in the sands. latent diffusion: dall-e mini
|
# ¿ Jun 14, 2022 19:52 |