|
012924_6 posted:This A.I. Respects No Math This Machine Kills Deeply To Fact
|
# Jan 30, 2024 05:26 |
|
|
# May 12, 2024 21:16 |
|
Boris Galerkin posted:Ansys, the engineering analysis software suite that pretty much every single engineering company in the world uses, wants to put ChatGPT into their products. Or rather, they have already done so and want to expand the AI features.

Is there any kind of security against researchers traipsing through the accessible ontology to find backdoors into cached and firewalled content? Like, could I just couch the prompt correctly to encourage the model to go right through PyTorch on the back end and spit out a bunch of classified results?
|
# Feb 7, 2024 18:15 |
|
Kagrenak posted:Each inference instance is going to have independent resources. I doubt they'd reinforcement-train their hosted models against the new data or anything insane like that. The risk profile would probably be similar to any other cloud service, which is to say pretty high for classified data. I would also imagine that if people are using this software in a classified mode, that module would be disabled or would only work through some sort of on-prem edge node for model inference. Some laboratory software is like this: certain functionality turns off if it's not supported in the 21 CFR 11 compliant mode.

Okay, so there is actually a bulwark, that's fantastic. Although it'd be very cyberpunk and GET ME MY GOGGLES, the ability to back-alley your way into everyone's topology would probably uproot some scary things.
|
# Feb 8, 2024 00:07 |
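The compliance gating Kagrenak describes can be sketched in a few lines. This is purely illustrative, assuming a feature flag plus an allowlist of approved inference endpoints; the function and host names are invented, not anything from Ansys or any real product.

```python
# Hypothetical sketch: an AI-assist module that only runs when either
# compliant mode is off, or inference is pinned to an approved
# on-prem edge node. All names here are invented for illustration.

APPROVED_ONPREM_HOSTS = {"ai-edge.internal.example.com"}

def ai_assist_enabled(inference_host: str, compliant_mode: bool) -> bool:
    """Gate the AI module the way lab software gates non-compliant features."""
    if not compliant_mode:
        # Outside compliant mode, any configured endpoint is allowed.
        return True
    # In compliant mode, only approved on-prem inference nodes pass.
    return inference_host in APPROVED_ONPREM_HOSTS
```

So a cloud endpoint like `api.openai.com` would be refused in compliant mode, while the on-prem node would still work, which matches the "certain functionality turns off" behaviour described above.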
|
RPATDO_LAMD posted:This is definitely untrue; the very first thing that went viral with Copilot was using it to generate the infamous "fast inverse square root" function from Quake III, which is open source nowadays but is licensed under the very restrictive GPLv2.

lol ontology. Copilot just like, look man, someone gave me proper justification and everything, I see nothing wrong here.
|
# Mar 4, 2024 16:52 |
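For anyone who hasn't seen the function RPATDO_LAMD is talking about: the Quake III original is C, using a pointer cast to reinterpret a float's bits as an integer. Below is a Python transliteration (using `struct` to do the bit reinterpretation), not the GPL'd source itself.

```python
import struct

def q_rsqrt(number: float) -> float:
    """Approximate 1/sqrt(number), Quake III style."""
    # Reinterpret the 32-bit float's bits as an unsigned integer.
    i = struct.unpack('<I', struct.pack('<f', number))[0]
    # The famous magic-number bit hack giving a rough first guess.
    i = 0x5f3759df - (i >> 1)
    # Reinterpret the result back as a float.
    y = struct.unpack('<f', struct.pack('<I', i))[0]
    # One iteration of Newton's method refines the estimate.
    y = y * (1.5 - 0.5 * number * y * y)
    return y
```

With one Newton iteration the result is within a fraction of a percent of the true value, e.g. `q_rsqrt(4.0)` comes out close to 0.5.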