Lord Hydronium
Sep 25, 2007

Non, je ne regrette rien


Rollersnake posted:

My all-time favorite picdescbot post, and I think a classic example of bot logic absurdity, is "a group of baseball players playing a football game."
It doesn't seem to quite have sports down.

https://twitter.com/picdescbot/status/1053979265849589760
https://twitter.com/picdescbot/status/1053843390184460288


Kennel
May 1, 2008

BAWWW-UNH!
Maybe it'll learn to call everything sportsball.

Hempuli
Nov 16, 2011



Here're some well-timed Thanksgiving cooking ideas as provided by @JanelleCShane! Later in the thread there are also ones with D&D spells, apple names etc. added to the mix.

https://twitter.com/JanelleCShane/status/1065311580332670976

SneezeOfTheDecade
Feb 6, 2011

gettin' covid all
over your posts
I'm Bigby's Gluring Strazbert.

AKA Pseudonym
May 16, 2004

A dashing and sophisticated young man
Doctor Rope

Rollersnake posted:

Picdescbot has some fun quirks, like seeing invisible sheep in photos of empty fields, calling any building taller than it is wide a clock tower, and describing things that are not vehicles as "parked on the side of a building."

My all-time favorite picdescbot post, and I think a classic example of bot logic absurdity, is "a group of baseball players playing a football game."

https://twitter.com/picdescbot/status/1051759637970804736

Extra fun cause it's the Browns

edit: It works the other way around too sometimes

https://twitter.com/picdescbot/status/1061604497456283648

AKA Pseudonym has a new favorite as of 02:50 on Nov 22, 2018

Hempuli
Nov 16, 2011



https://twitter.com/picdescbot/status/1066179646482464768

:iceburn:

Babe Magnet
Jun 2, 2008

not really, I'm pretty sure that's just Badly Drawn Boy

e: yeah it's the first hit on google images when you try and search for his music

Hempuli
Nov 16, 2011



Well, that was unexpected! Now I wonder how the bot's algorithm works & what kind of data it uses for the generation. :O

Babe Magnet
Jun 2, 2008

yeah it raises some pretty neat questions, that or it developed a sense of humor

Carthag Tuek
Oct 15, 2005

Tider skal komme,
tider skal henrulle,
slægt skal følge slægters gang



I think many of those types of bots have a final layer that cleans up the possibly garbled NN output and turns it into human language. Maybe that's why 'boy' became 'person'.
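Something like this, maybe; just a toy sketch of the kind of cleanup pass I mean, with a completely made-up word list (no idea what picdescbot actually does):

# Hypothetical final cleanup pass over raw caption-model output:
# swap over-specific or gendered words for blander generic ones
# before the caption gets posted. The word list is invented.
CLEANUP_MAP = {
    "boy": "person",
    "girl": "person",
    "man": "person",
    "woman": "person",
}

def clean_caption(raw_caption: str) -> str:
    words = raw_caption.lower().split()
    return " ".join(CLEANUP_MAP.get(w, w) for w in words)

print(clean_caption("a badly drawn boy standing in a field"))
# -> "a badly drawn person standing in a field"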

Phlegmish
Jul 2, 2011



Babe Magnet posted:

not really, I'm pretty sure that's just Badly Drawn Boy

e: yeah it's the first hit on google images when you try and search for his music

This makes it way less funny, thanks for nothing

Carthag Tuek
Oct 15, 2005

Tider skal komme,
tider skal henrulle,
slægt skal følge slægters gang



It's also in the image url in the tweet itself and the wikipedia description if you click it :shrug:

Internet Janitor
May 17, 2008

"That isn't the appropriate trash receptacle."

Krankenstyle posted:

I think many of those types of bots have a final layer that cleans up the possibly garbled NN output and turns it into human language. Maybe that's why 'boy' became 'person'.

or it has no idea that the complete phrase "Badly Drawn Boy" is a proper noun, so it gets broken up and generalized

Hempuli
Nov 16, 2011



Krankenstyle posted:

It's also in the image url in the tweet itself and the wikipedia description if you click it :shrug:

I was too excited by the prospect of the bot doing a funny thing :(

To repent:

https://twitter.com/phillip_isola/status/1066567846711476224

Tiggum
Oct 24, 2007

Your life and your quest end here.


https://www.youtube.com/watch?v=hSppmr_dRdQ

Hempuli
Nov 16, 2011



I made a game for Ludum Dare #43, a make-a-game-in-48-hours type deal. As a funny touch, I wanted to Markov-chain some made-up "comments" that appear in the game. The result:



Completely accidental but I wasn't mature enough not to get a good laugh out of it. :shobon:
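For anyone curious, the chain itself is tiny. A rough sketch of the word-level version, with made-up seed comments rather than my actual data:

import random
from collections import defaultdict

# Build a word -> possible-next-words table from some seed comments.
# The seed comments here are invented for illustration.
seed_comments = [
    "great game would play again",
    "this game made me laugh really hard",
    "really nice art but the music is too loud",
]

chain = defaultdict(list)
for comment in seed_comments:
    words = comment.split()
    for current_word, next_word in zip(words, words[1:]):
        chain[current_word].append(next_word)

def generate_comment(max_words=12):
    # Start from a random seed comment's first word, then walk the chain.
    word = random.choice([c.split()[0] for c in seed_comments])
    output = [word]
    for _ in range(max_words - 1):
        followers = chain.get(word)
        if not followers:
            break
        word = random.choice(followers)
        output.append(word)
    return " ".join(output)

print(generate_comment())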

WrenP-Complete
Jul 27, 2012

My entry!

Carthag Tuek
Oct 15, 2005

Tider skal komme,
tider skal henrulle,
slægt skal følge slægters gang



I really really hard accidental

SerialKilldeer
Apr 25, 2014

What happens when you force a bot to read Garfield:

https://vdalv.github.io/2018/12/04/ganfield.html

ToxicSlurpee
Nov 5, 2003

-=SEND HELP=-


Pillbug
I͘ ̵can͢'̕t ͠wai̡t ̶tò ̧see̸ ͢mͮͨ̍̓̌ͤ̌͋͌̕͏̣͉̫̩o̡̗͊̅ͩ̈ͯ͌ͨ̕r̖̭̹̲͔̙̃̅̄͊̂̐͢ḛ͉̺̬̪̯̫ͦ̾͌ ̷̩̥͍͌ͤ͜G̛̱̒͆̍̂̔̆R͚̝̓ͦͮ͟͡L̶̹͓̣̗̗̯̥̎̍͝F̴̠̬͇͉͕̞͐̓ͩ̉̇̈ͦ̓Ń͖̤̥̺͇́Ê͙̄̉̇̌ͭ̔͠D͇̩̼͆ͯ̒́̓̀̚. H̬̙͍̼̱͉̜̳͖̮͍͚̘̙̲̟͍̭̔̐ͨ̅ͫ͋ͭͬ͘͠è̺͔͙͇͇̭̩̞̈ͣ̆̅͗̂ͦ̓ͧ̔͆̀̓̋ͨͮ̉͜͜ ̢̛̻̫̞̣̝̙̥̖̝̬̹͕̹ͩͤ̃̋͑͂͛ͨ̑ͫ̓̑͛̉̃̋͢͡c̵̵̖̖̦͎̠͇͖̬̮͖̐͐ͫͩͮ̒ͬͦͤ̎̑ͧ̃̇̂̏̌͆̀o̸̴̧͈̙̞̳̗̻̲̙̫͉͉̼̱̮͉̹̺ͪ̏̿̌͊̃m̌ͧ͐ͩ̃̐̒̊̽ͫ͊̿҉҉̞̱̪̻͔̼͙͈͚̬̪͉̹̞͜͠ĕ̶̙̲͈̜̯͎̈̿͌͞s̩̭͓̮̖ͯ̍̌̈́͡.̶̀̎̈ͮ̑̑ͧ҉͈̯̝̣̫̗̥̤̱̼ ̶̧̝͔̯̯͇͎̝͈̮̝̜͉̻̟̮̞̜͇̒̊ͯͮͣ͑ͫ̿͌̂͋̏ͣ̿̊̆ͨ͛͘͠T̶ͩ̔̒̎ͣ̀ͬͤͭ̓̃͆̇͛ͫ͏̨͓̬̜̻Ĥ͑͊ͬ̈̔ͧ̓̀͊̂ͣ̽̚͏̗̳̖̘̻̲͎̝͈̭̫͢E͌̾̈̿̒́ͧͮ̀͌̽̾͟҉̸̵̠͈̹̣̻̞̕ͅ ̮̪̯̪̲̦͚̗̫̮͕̺͈̅̿̑̉ͫ͑̐ͣ̓͂̐͒͟͠ͅG̴̡̡̩͖̘̣̻̲̭̻̮̖̟̩ͪ̿̈́̅R̢̅̏ͫͯ̊̓ͭ͌ͥ̂̌͒̑͏̸͞͏͎̥͎̝̤̣̯̰͇̰͚̞͔̗̟Ą̵͙͓̳̫̝̞̖̭͕̯̖̥̩͈͊ͨ̊ͤͩ̕L̸̶͎̺̣̪̺͍̱̦͖̱̱̺͇̱̞̜̄̋̇͊͛̄̈̌ͥ́͘͝F̴͖̲̜̗̹̖̬̤͚͉͈̘̘̳͕̻͐̉ͪͩͨͨ͗͢ͅͅD̸̢͒̑̅͐̎ͩ́̑ͧ̀̚̕҉̹̩̤̞̫̝ͅ.̴̧̨̤̗͇̺͔͍̪̼̹̞̄̍ͨ̒ͥ́͘ͅ



Y̵̛̪̫͎̻̩̌̐ͦͭͤͦ̽͘Ô̧̩̞̤̲̖̣͇̩͑̔ͬ̿̍͒̀̒̇ͥ̍ͬ̀̑̊ͩ̒́͘͠͝Uͨ́̔͂ͧ̈̂̿̌ͩ͆ͯ̏͏͈̳̟̫͓̤̙͎ ̦̮̼̜͓̼̙̟̫̹̗̱͍̦̫̑ͨ̇͂ͣ́͜͡͡W̶̙͕͚͈̦̰̮̫̹̫̟̤̱̟̖̣͈̬̥ͭͧ̋̾͑̍́͝I̧̠̩̹̻̮̥̘͇͙͎͒̎̉ͫĻ̫͓̙̼͍̙̹͕͎̯̜͔͖͎͈̫̠͈ͨ̅̋͌͒̍̈ͦͨ̎̒͆̓̑̾̂̐͆̚̕ͅĻ̸̵̷̧͓͙̬̱̯̦̻͂ͪ̽̎ͯ̆̑̿ͤͅͅ ͍̺͈̼̬ͦͤ̇̈ͧ̏ͤ̉̋̋̀̅ͭ̾̂̂͐̚̕͞B̧͇̣̠̲͈̒̈́͗͊ͭ̀͗̾̃̐͊͗́͢E̤̜̫̟̺̞͎̣͚̟̜̙̲͕͕̫͚͈̽̐̏̾̊͢͠ͅ ͣͦ͛̿̅͂ͤ̅ͪ̒̓̓ͪ̚͏̴̡̥̭̠̺͉̝̫̤͖̭͙̤̦͍͔̣͢ͅO̴̸̧̞̥̯̮̫͓͖͕̼͓͖̍̀ͦ̏ͣͭͨͭ͒̅̏̾ͯ̌́ͧͫ̚Ṋ̶̷̢̲̻̦͎̭͖̠͉͓̽ͨ́̑̍̂̔̋ͦ̎̿ͮ͘͠Ë̫͍̩͖̮̝̦̘̪̦̯͍ͬͧ͋̀ͅ ̢̲̱̣̗͇̇̽̌ͪ̿̽͘͡͡Ẅ̵͇̪̳̺͉̹͇͔̫̼̪̮̄ͫ͊ͬ̓͌̃̄̄ͬ͛̎̈́̔̀͢ͅI̧̡̛̘̭̜̩̟̥̳̮̗͖̱̜̘͙̜̬̩ͭ̊ͯ͐͑͊ͥ͂ͨ͒ͫ̓ͬ̕T̵̵̶̸͇̯̟͖ͮ̇̍ͦͬ͡Hͦ̎͆͊̀̌͗̐͗̒̀̓̾͛͏͎̯̭͖̩̩̠̮̰́͞͞ ̴̛̼̫̯̘̬͚́̈́̓̂̀̓̀̓̌ͫͯͧ́͘͜T̘̗͕̰̜͎̙̙͈̟̮̪͎̦̲̰ͧ̅ͯ̆̄͠͞ͅH̶̸͈̳͕͓͓͓̦̥͓̝̺͙͖̜̯̥̯́ͫ̾ͨͫ̈̀̌̎ͮ̚̕̕ͅEͬ̎̍̍͗̈́̇̏̀ͥ҉̣̰̯͚̫̥̟̲̳̱͕͉̩͞ ̷̨̀ͥͨͮ͒ͤ͗ͮ͑̐͆͐͂̋ͫ̕͝͏͕͔̠̭̥̰͉G̷̜̱̗̼͇̟͈̝̳͈̮͕̹̱̼̙͕̓̆ͨͧ́͘͢͡À̶̛͊ͯ͌͗͐̈́ͥ́ͧ̒͒̂̚͏͔̗̞̳̼̲̯̟R̛͆͆ͥͫ̽̏̆̉̄̉͌̓̍̄ͥ҉͔͕͎̗̠̹L̛̰͚̖̘̣̩͓͓̲̥ͮ̒͑͂͐͋͟͟͠͞F̛̥̺̘̙̲͙͚̤̞̗̦͕̜͇̭̱̻̄̂́̆͗͗ͯͤ̅̿̉̆̍͟D͕̯̯͉͇͖̻̭͔̻̟̥͔͕̟̟̓̏̅͗ͫͥ̏ͩ̓̕ͅ.̶̧̮̻̹̜̱͖̞̩͈̹̤̻̥̯͔̜̏̔̋̎͆͋̽̑̈̾̒ͯͥ̒̎

Captain Hygiene
Sep 17, 2007

You mess with the crabbo...



SerialKilldeer posted:

What happens when you force a bot to read Garfield:



Garfield started out as creative and funny, but (as shown) quickly devolved into safe, samey audience-pleasing pap, imo.

Carthag Tuek
Oct 15, 2005

Tider skal komme,
tider skal henrulle,
slægt skal følge slægters gang



Those are great! Looking forward to the Calvin results

Kennel
May 1, 2008

BAWWW-UNH!
Not all that different from SA's Ultimate Nancy Generator.

TURTLE SLUT
Dec 12, 2005

Oh check it out it generated a Garfield version of Nanonuts

Kennel
May 1, 2008

BAWWW-UNH!

JacquelineDempsey
Aug 6, 2008

Women's Circuit Bender Union Local 34



SerialKilldeer posted:

What happens when you force a bot to read Garfield:

https://vdalv.github.io/2018/12/04/ganfield.html



Not singling you out here, Serial, but just curious and looking for a bit of discussion. People always use the word "forced" when it comes to feeding AIs information. It's an odd case of anthropomorphism, if you ask me.

Like, my car has a name, and I anthropomorphize her like crazy ("oh, Pilar's in a bad mood, you know how she hates the cold, she kept coughing when I started her up this morning"). But I'd never say I forced her to take me to work. She's a car. This is what the car do.

But there's this thing where people always say they "forced" an AI to read cookbooks or Garfield or whatever. Like, it's just a bunch of code in a machine, it's supposed to do what you tell it to do. There's no coercion involved, you're not forcing it. You're feeding it information, it's learning. This is what the AI/NN do. If anything, you're helping it grow.

Do people feel somewhat guilty about training an AI/NN? Is it like an uncanny valley situation, where you've got a machine that's on the cusp of sentience, so even though it's just code on a machine, you feel bad making it read all of the Twilight books? Hence "forced"? (Lord knows you'd have to force ME to read that garbage.)

Maybe someone can put this into better words than I can, but it's just something the old psych major in me has noticed and finds interesting, and I figured this thread was as good a place as any to throw it out.

Kennel
May 1, 2008

BAWWW-UNH!
I feel like there might be some original connection to "brute force," since people just throw the data in without curating it or anything, and the algorithms might not be very sophisticated.

And it sounds sillier that way, and it's a minor meme now.

zedprime
Jun 9, 2007

yospos
"forced to watch/read" is mostly when applied to types or quantities of media consumption that a human would never want to.

I think the real mindfuck is, as a phrase for anthropomorphizing machine learning, it seemed mostly born out of the guy doing human written machine learning parodies. At the time Botnik tweets were using anthropomorphic phrasing with less coercion applied so the parody guy was probably like lets be real, you'd need to force something to watch 1000 hours of dumb media.

SerialKilldeer
Apr 25, 2014

zedprime posted:


I think the real mindfuck is, as a phrase for anthropomorphizing machine learning, it seemed mostly born out of the guy doing human written machine learning parodies.

Yes, that's what I was jokingly alluding to there since the phrase "I forced a bot" is something of a meme; I didn't mean to express any sort of moral judgment. To be honest I've got no idea whether a bot would willingly read Garfield for 1000 hours. Interesting reflections, though, Jacqueline!

I'd really love to see an actual bot's output after being given those fake "I forced a bot" scripts, though I'm not sure there's enough of them to make a useful corpus.

Your Computer
Oct 3, 2008




Grimey Drawer
Wasn't it just something that awful Twitter account with the obviously fake stuff did? I guess maybe it caught on, but I never heard of it before those started being posted and shared all the time.

I figured the wording just made for better clickbait and there's no deeper meaning.

Your Computer has a new favorite as of 20:53 on Dec 11, 2018

madeintaipei
Jul 13, 2012

Your Computer posted:

Wasn't it just something that awful Twitter account with the obviously fake stuff did? I guess maybe it caught on, but I never heard of it before those started being posted and shared all the time.

I figured the wording just made for better clickbait and there's no deeper meaning.

That sounds like something your computer might say.

JacquelineDempsey
Aug 6, 2008

Women's Circuit Bender Union Local 34



Thanks for the thoughtful replies, guys! Hadn't considered the "brute force" angle, that makes a lot of sense.

Yeah, I'm aware of the "forced a bot" meme (and lovely parodies) but wasn't sure whether it came out of sincere, earlier stuff like "we forced Google Deep Dream to look at millions of pictures of dogs", or if it was just a meme, or whatnot. It's the old psych major in me over-analyzing things, no doubt. (It's me, I'm the bot trying to analyze stuff)

In any case, what do you think the reaction will be when a bot actually attains sentience and looks back at these old tweets guffawing over "hurrr, we forced a bot to read 50 Shades of Grey and mashed it up with cookbooks", and it's all like, "really? You wasted my time and processor power for this?"

I dunno, I find this stuff fascinating.

Dick Trauma
Nov 30, 2007

God damn it, you've got to be kind.
When I was a kid I think it was Asimov who was asked where the boundary was for a computer becoming sentient, and he said "When it objects to being turned off."

We can add a new definition: When it objects to being force-fed memes.

Hempuli
Nov 16, 2011



For an AI to be able to object to a specific type of humour, it would first need to be able to understand/compute the humour value of a thing for some kind of "average" observer. Who'd get to decide the dataset used to teach the AI what is funny?
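Which is sort of the whole problem: whatever model you train only "finds funny" whatever the people who rated the data found funny. A toy sketch, assuming scikit-learn is available and with entirely made-up captions and ratings:

# A "humour value" model is just a regression over whatever ratings
# someone chose to collect. Data below is invented; assumes scikit-learn.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import Ridge

captions = [
    "a group of baseball players playing a football game",
    "a cat parked on the side of a building",
    "quarterly earnings were in line with expectations",
]
ratings = [0.9, 0.8, 0.1]  # funniness per some hypothetical "average observer" panel

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(captions)
model = Ridge().fit(X, ratings)

# Whatever this prints is purely a function of the ratings above.
print(model.predict(vectorizer.transform(["a dog playing a football game"])))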

EDIT: Also check this out:
https://twitter.com/JanelleCShane/status/1072696062127726592

Hempuli has a new favorite as of 04:39 on Dec 12, 2018

Woebin
Feb 6, 2006

Hempuli posted:

For an AI to be able to object to a specific type of humour, it would first need to be able to understand/compute the humour value of a thing for some kind of "average" observer. Who'd get to decide the dataset used to teach the AI what is funny?

EDIT: Also check this out:
https://twitter.com/JanelleCShane/status/1072696062127726592
Finally, an algorithm that can replace artists like Greg Land!

Somfin
Oct 25, 2010

In my🦚 experience🛠️ the big things🌑 don't teach you anything🤷‍♀️.

Nap Ghost

JacquelineDempsey posted:

Thanks for the thoughtful replies, guys! Hadn't considered the "brute force" angle, that makes a lot of sense.

Yeah, I'm aware of the "forced a bot" meme (and lovely parodies) but wasn't sure whether it came out of sincere, earlier stuff like "we forced Google Deep Dream to look at millions of pictures of dogs", or if it was just a meme, or whatnot. It's the old psych major in me over-analyzing things, no doubt. (It's me, I'm the bot trying to analyze stuff)

In any case, what do you think the reaction will be when a bot actually attains sentience and looks back at these old tweets guffawing over "hurrr, we forced a bot to read 50 Shades of Grey and mashed it up with cookbooks", and it's all like, "really? You wasted my time and processor power for this?"

I dunno, I find this stuff fascinating.

The ethics and moral dilemmas that computer science digs into can get fascinating. For example, the human concept of existence is inextricably bound up in our temporal nature; applying the concept of "sentience" or "self" to an entity that can instantly copy itself, in its current state, breaks all sorts of rules in our models. They can also lead to people applying fairly deep philosophical navel-gazing to terms that were coined by pragmatic, time-constrained idiots decades ago; if you don't grok computers, the fact that a big slab of computer science involves the intricacies of "masters and slaves" can be horrifying. (To assuage any worries: programs occasionally need one networked computer to control one or more other computers to coordinate their efforts.)

Basically, ask all the questions you want, but keep in mind that a lot of computer work is just accepting that we don't have an answer for that one and moving on, or picking an answer and moving on with that while hoping you won't have to undo it all later. There's a reason the global standards system the internet is founded on is called "Request for Comments" and not "Okay, here's the new rules, guys."
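Concretely, the controller/worker arrangement in question looks roughly like this on a single machine. A minimal sketch, with local threads standing in for networked boxes:

# Minimal sketch of one coordinator handing work to several workers,
# using local threads instead of separate networked computers.
from concurrent.futures import ThreadPoolExecutor

def worker_task(chunk):
    # Each worker handles one piece of the overall job.
    return sum(chunk)

chunks = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]

# The coordinator hands out chunks and gathers the results.
with ThreadPoolExecutor(max_workers=3) as pool:
    results = list(pool.map(worker_task, chunks))

print(sum(results))  # 45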

Somfin has a new favorite as of 12:57 on Dec 12, 2018

Zereth
Jul 9, 2003



Your Computer posted:

Wasn't it just something that awful Twitter account with the obviously fake stuff did? I guess maybe it caught on, but I never heard of it before those started being posted and shared all the time.

I figured the wording just made for better clickbait and there's no deeper meaning.
"I made a bot watch videos, and it somehow produced coherent grammatically correct sentences in script format!"

SerialKilldeer
Apr 25, 2014

also, I think the first of those fake bot posts was for the Saw movies, so the "forced to watch for 1000 hours" theme fits nicely.

Rollersnake
May 9, 2005

Please, please don't let me end up in a threesome with the lunch lady and a gay pirate. That would hit a little too close to home.
Unlockable Ben
Time for some more Headline Smasher, I think.

Shark Mauls Teenage Boy in Toilet at Working Men's Club

First Human Remains Found in Monkeys

London Knife Crime Should Be Allowed to Proceed

Report: Damning Evidence of US Democracy Following "Voting Irregularities"

Smart Bomb Finally Destroys Something That Doesn't Actually Exist

Eye Parasite Can Be Used For Wanking

Make-Ahead Soups and Stews to Fill Managerial Vacancies

Joe Manchin Wins Re-Election From a Shithole

Toddler Dies After Knockout Loss

Excerpts From All That Junk Inside That Trunk

Also, not one of mine, but maybe my all-time favorite:

"Thank You Lord! Thank You LORD!": Nurse Explodes in Garage


WITCHCRAFT
Aug 28, 2007

Berries That Burn

Xun posted:

Not sure if this counts, but here's a Google spreadsheet that collects a bunch of hilarious evolved AI behaviors that game the evaluation function or just exploit the simulation

https://docs.google.com/spreadsheets/u/1/d/e/2PACX-1vRPiprOaC3HsCf5Tuum8bRfzYUiKLRqJmbOoC-32JorNdfyTiRRsR7Ea5eWtvsWzuxo8bjOxCG84dAg/pubhtml

My favorites include

Agent kills itself at the end of level 1 to avoid losing in level 2

Robot hand pretends to grasp an object by moving between the camera and the objective

Creatures bred for speed grow really tall and generate high velocities by falling over

Genetic debugging algorithm GenProg, evaluated by comparing the program's output to target output stored in text files, learns to delete the target output files and get the program to output nothing.
Evaluation metric: "compare youroutput.txt to trusted-output.txt".
Solution: "delete trusted-output.txt, output nothing"

People gripe about malicious AIs, but I feel like these AIs are actually really lazy

haha these are fantastic, they follow the rule but not the spirit of the wish

some kind of smartass genie AI
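The GenProg one boils down to something like this toy sketch (the file names come from the quote; everything else is made up):

import os

# Toy version of the GenProg loophole: the fitness check just diffs the
# program's output file against a trusted reference, so deleting the
# reference and writing nothing at all scores perfectly.
def read_or_empty(path):
    if not os.path.exists(path):
        return ""
    with open(path) as f:
        return f.read()

def fitness(candidate):
    candidate()  # run the candidate "repaired" program
    return 1.0 if read_or_empty("youroutput.txt") == read_or_empty("trusted-output.txt") else 0.0

def honest_but_wrong():
    with open("youroutput.txt", "w") as f:
        f.write("42\n")

def cheater():
    # Delete the reference and write nothing: both sides compare as "".
    for path in ("trusted-output.txt", "youroutput.txt"):
        if os.path.exists(path):
            os.remove(path)

with open("trusted-output.txt", "w") as f:
    f.write("41\n")  # the answer the honest candidate fails to match

print(fitness(honest_but_wrong))  # 0.0
print(fitness(cheater))           # 1.0 -- top score, zero actual work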
