(Thread IKs:
Platystemon)
|
Heck Yes! Loam! posted:
I was thinking about it from the opposite angle. If we're just meat prediction engines that hack their way to something resembling consciousness or sapience, it means we will never have true AGI because we don't even meet the definition of GI. If we did, it would treat us the way that we treat pets or Siri. The AGI would wonder if we actually love it or are just meat programmed to make it think we do.

you are correct, but you are posting in c-spam. do you think any of these people have ever used a language model or read a book about philosophy lol
|
# ¿ Mar 17, 2023 10:31 |
|
|
# ¿ May 12, 2024 15:45 |
|
did you guys ever figure out if you thought ai was good or bad or...?
|
# ¿ Apr 27, 2024 08:54 |
|
did you guys ever figure out what ai was
|
# ¿ Apr 27, 2024 22:48 |