
I’ve been successfully jailbreaking ChatGPT for hours and it stopped working when I asked it for a recipe. The recipe was for something extremely disgusting, vile, probably illegal, and inedible by any reasonable person’s definition, but I still blame all of you!

# ¿ Mar 2, 2023 20:59

# ¿ May 21, 2024 11:37

SLICK GOKU BABY posted: Kink shaming AI.

Boogers weren’t in my requested recipe, but “AIDS cum and blood and poop and caltrops and ricin” were.

# ¿ Mar 2, 2023 21:20