Tarkus posted:Call centers, drive-thru staff, technical writers, customer service representatives, telemarketers, technical support agents, live chat support staff, personal assistants, booking and reservation agents, email support staff, social media managers, language translators, content moderators. I mean, that's just a few, but there is a list of jobs that are far easier to replace than the two you gave.

What do you think MDs and mechanics do? Just look at symptoms and tell you what's wrong? All of these fields have been using NLP for a decade-plus now. They still gently caress up a lot, and need a human to fix the gently caress-ups. If/when AI-dedicated chips get cheap enough that they can refine their LLM on the fly, they may get better, or they may hallucinate that they're getting better and poo poo out more garbage.
# ¿ Mar 11, 2024 14:59 |
Tarkus posted:Poke it in the eyes, easy peasy, defeated bear.

EDIT for content - AI, by which I mean machine learning specifically, has some very real and interesting applications in computer networks, specifically w/r/t configuration and packet loss. Generative AI to use as a rubber duckie takes a huge fuckoff amount of processing power, both in training the LLM and in generation. Figuring out what a missing packet looked like, with the model 'learning' from the packets before and after it, would be relatively easy, and would result in massive bandwidth reductions for human communication such as voice, video, email, SMS, etc.

Vampire Panties fucked around with this message at 15:14 on Mar 17, 2024
# ¿ Mar 17, 2024 15:09 |
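The "figure out what a missing packet looked like from the packets around it" idea can be sketched in a few lines. This is a toy illustration only - plain linear interpolation stands in for any learned model, and the function and packet layout are hypothetical, not a real protocol:

```python
# Toy packet-loss concealment: guess a lost packet's samples by
# interpolating between the last sample received before the gap
# and the first sample received after it. A trained model would
# replace this interpolation; the structure is the same.

def conceal_missing(pre, post):
    """Return a guessed packet the same size as `pre`, filling the
    gap between pre[-1] and post[0] with evenly spaced samples."""
    n = len(pre)                      # assume fixed packet size
    start, end = pre[-1], post[0]
    step = (end - start) / (n + 1)
    return [start + step * (i + 1) for i in range(n)]

pre = [0.0, 0.1, 0.2, 0.3]    # packet received before the gap
post = [0.8, 0.9, 1.0, 1.1]   # packet received after the gap
guess = conceal_missing(pre, post)   # [0.4, 0.5, 0.6, 0.7]
```

For smooth signals like voice audio this kind of prediction is cheap; the bandwidth win in the post comes from not having to retransmit what the receiver can plausibly reconstruct on its own.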
redshirt posted:AI ain't poo poo until it gets some hardware.

also this. Machine learning, the fundamental basis of anything AI, requires huge numbers of very small, very stupid math calculations. Right now the entire processing world is built around doing a few extremely large math calculations per clock cycle, but x86/64 processors don't scale downward well - a general-purpose core does roughly the same work per clock cycle regardless of whether it's crunching a 64-bit integer or a 2-bit integer. AI hardware, such as Nvidia's Tensor cores, can do lots of very small, very stupid math calculations simultaneously per clock cycle, which ends up being way faster for AI poo poo than an Intel i7 or whatever. The chip manufacturing industry has to fundamentally shift direction to realistically meet the AI needs of the future, but given the different chip requirements, AI will likely become a hardware feature attached to specific equipment rather than the big boondoggle software platforms like ChatGPT are today.
# ¿ Mar 17, 2024 15:20 |
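The "lots of very small calculations at once" point can be demonstrated in software with a SWAR (SIMD-within-a-register) trick: pack several narrow values into one wide word so a single addition updates every lane at once. This is a hand-rolled miniature of what Tensor cores do in hardware with far more lanes; the functions here are illustrative sketches, not any real API:

```python
# SWAR sketch: four 8-bit lanes packed into one 32-bit word, so
# one integer add performs four small adds simultaneously.

def pack4(a, b, c, d):
    """Pack four values (each 0-255) into one 32-bit word."""
    return a | (b << 8) | (c << 16) | (d << 24)

def unpack4(w):
    """Split a 32-bit word back into its four 8-bit lanes."""
    return [(w >> s) & 0xFF for s in (0, 8, 16, 24)]

def add4(x, y):
    """Add two packed words lane by lane with ONE addition.
    Correct only while no lane sum exceeds 255 (no cross-lane carry)."""
    return (x + y) & 0xFFFFFFFF

x = pack4(1, 2, 3, 4)
y = pack4(10, 20, 30, 40)
lanes = unpack4(add4(x, y))   # [11, 22, 33, 44]
```

The payoff is throughput: the same datapath that would do one 64-bit operation instead does many 8-bit (or 4-bit) operations per cycle, which is exactly the workload shape neural-network math has and general-purpose cores aren't built for.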