|
nvidia is for chads, amd is for poors
|
# ? May 11, 2022 04:17 |
|
|
# ? Jun 10, 2024 13:12 |
|
we tricked rocks into thinking with lightning i loving love science
|
# ? May 11, 2022 04:21 |
|
Deez nuts are semiconducting your chin.
|
# ? May 11, 2022 04:24 |
|
Twat McTwatterson posted:I get that they make computer chips. But I don't understand why some only make CPUs or GPUs or AMD happens to make both. But Intel just recently or is about to come out with their first GPU? But why hadn't they before? And why doesn't NVIDIA make CPUs? To address this part specifically, several years ago AMD bought ATI, which was a company that exclusively made graphics cards. So now they can do both pretty easily! Intel has been making GPUs since the 90s, they just kinda suck at it. Every time they announce that they're going to make another one a bunch of nerds get super excited and then it comes out and it's trash. Someone should probably tell them to stop. Nvidia doesn't make CPUs because I guess they don't want to? Sometimes companies don't make certain things that other companies make.
|
# ? May 11, 2022 04:27 |
|
While most computing operations only require a logic gate, and science has been yapping on about this “quantum superstate” for years, amd processors have been capable of running 4 electronic states for years. On, not, null, both, and the new quantum state, by. So we should really be able to invoke not only reason but also feeling (understanding) on existing systems.
|
# ? May 11, 2022 04:40 |
|
everyone like "intel making their 1st discrete gpu, comin soon"? and im like they already did that !?
|
# ? May 11, 2022 04:47 |
|
sometimes u look and theres like the 12900k but then theres a 10700kf or somethin and then aMD is like " we got 5600xh3d with 14 twizy cores" and ur like oh thats better maybe ??
|
# ? May 11, 2022 04:50 |
|
threadripper pronounced threa dripper
|
# ? May 11, 2022 04:51 |
|
remember the good days when we had pentium 2s in cartridges u just slap in there, and now we gotta be all gentle and say "don't bend the pins!"
|
# ? May 11, 2022 04:53 |
|
computers
|
# ? May 11, 2022 04:58 |
|
they are like democrats and republicans, they take turns being the one in charge or the scrappy underdog, and pretend they are in close competition. but actually both sides are the same and are conspiring to keep you oppressed using wedge issues like proprietary vectorized instruction standards to radicalize their respective user bases.
|
# ? May 11, 2022 05:00 |
|
Lisa Su can do what Nintendon't
|
# ? May 11, 2022 05:00 |
|
Big Beef City posted:wait until this motherfucker finds out about ARM architecture lmao fuckin scrub JESUS
|
# ? May 11, 2022 05:12 |
|
Twat McTwatterson posted:computer Ok boomer
|
# ? May 11, 2022 06:30 |
|
Samuel L. ACKSYN posted:sometimes u look and theres like the 12900k but then theres a 10700kf or somethin and then aMD is like " we got 5600xh3d with 14 twizy cores" and ur like oh thats better maybe ??
|
# ? May 11, 2022 06:36 |
|
Computers are freaking confusing i just want to play games and look at naked ladies
|
# ? May 11, 2022 06:39 |
|
Nooner posted:Computers are freaking confusing i just wabt to play games and look at naked ladies You need more gigaschmitz and cooling tubes if you wanna see those flip-flaps in upper definition.
|
# ? May 11, 2022 06:42 |
|
It's nvidia business.
|
# ? May 11, 2022 06:56 |
|
im detecting a memory leak and it turns out its from yospos
|
# ? May 11, 2022 07:05 |
|
computers are for fuckin nerds
|
# ? May 11, 2022 07:06 |
|
What i want to know is how apparently the entire world depends on TSMC for good chips and intel and china and samsung have had literally no success in catching up. Like what the gently caress, all their billions combined haven’t been able to beat a company whose talent and resource pool is the island of taiwan. it’s nuts
|
# ? May 11, 2022 08:54 |
|
Vegetable posted:What i want to know is how apparently the entire world depends on TSMC for good chips and intel and china and samsung have had literally no success in catching up. Like what the gently caress, all their billions combined haven’t been able to beat a company whose talent and resource pool is the island of taiwan. it’s nuts Chip fabrication facilities take a long time to break even on compared to just buying time at TSMC, and until 2020 none of those companies had considered the possibility that the math might ever change
|
# ? May 11, 2022 09:30 |
|
Vakal posted:Has anyone ever met someone who works on developing new CPUs and GPUs? Yeah i met one. He was a complete weirdo in every way with strange hair and a big stupid hat and he just like stared at me with this judgemental look, like just straight in the eyes for minutes on end and didn't even blink. Not even once. Then he goes like "why you jacking off in the middle of the street?" WTF?! Don't talk to me while i'm jacking it you freak!!
|
# ? May 11, 2022 09:51 |
|
The GPU was superseded by the NKVD, which was itself superseded by the KGB. They all dealt in counterintel, though.
|
# ? May 11, 2022 10:09 |
|
I can make an attempt, I guess? A GPU is a specialized CPU - but to explain how, let's begin by describing a modern "normal" CPU.

In a CPU, the actual "processing" hardware handles tasks like "do math on these numbers" and "compare these things" and "set the next instruction to be executed to be the one at this place in RAM". It works with registers: tiny, fast pieces of memory that are as fast as the processing hardware but only hold one value each. I'll use "r1" and "r2" and so on as example registers. To keep this fed with instructions and data, another part of the hardware talks to the RAM chips. Those are way slower than a CPU: you can do hundreds of instructions in the time it takes to get something from RAM. To help cover that up, there's some cache on the CPU; memory that's almost as fast as the computing parts, but expensive and power hungry.

In an old-school CPU, you grabbed one instruction from memory, did whatever it said (something like "add r1 to [the value that's in RAM at the address in r2] and store the result in r3"), and when that was complete, you started on the next instruction. That can lead to a lot of idle time, though - sitting around waiting for RAM is a big one. The CPU also has parts that do different things: the largest and most power-consuming ones are typically the math ones - there's separate hardware for integer math (whole numbers) and floating-point math (numbers with decimals, which are way more complicated). If you do one instruction at a time, a lot of the CPU will be idle at any moment.

To speed things up, a more modern CPU (and by modern I mean the 1995 Pentium Pro) can start executing multiple instructions at once. Part of the trick is to split instructions into smaller parts: say you have "compare [the memory at the address stored in r1] to r2". You can split that into "read the value at [r1] into a temporary register; compare the temporary register with r2". If you do this decoding ahead of time, you can skim through the queue of decoded instructions looking for anything that reads from memory - and start pulling things from memory early. You can also do independent instructions simultaneously: if two instructions don't use each other's results, they can run at the same time on separate execution units.

It gets way worse, though. The typical code you run on a CPU has loads of branches - places where you run different code depending on the results of a comparison. To keep busy, modern CPUs can get quite deep into "speculative execution", where they run entire blocks of code ahead of time because there's a decent chance they'll need the results ... but they may just end up throwing them away. To make this efficient, they have large caches, and their RAM interface will do all sorts of informed guesswork (like "if you've asked for these four locations in a row, surely you'll want the next 64 as well") to reduce waiting. When you run through the same code more than once, they'll also keep statistics on "how often is this test true", and use that to decide which parts to speculatively run first.

The end result of this is that a modern CPU is an absolute brute force monster at chewing through messy, branching code, and can have a lot of things happening at once. The downside is that all this takes up a lot of physical space, and eats a lot of power. You also have multiple of these cores: a four-core CPU has four copies of most of the above. The cores typically have some "private" cache, and a larger one shared between them - and they share the memory controller. Sometimes, two adjacent cores may share some execution hardware - I think some older AMD designs shared the floating point math units between pairs of cores, for instance.

On the other end of the scale, what makes GPUs different? GPU code is typically very simple: "multiply every value in this list with the corresponding value in this list of the same length"; "do this trigonometry operation to each number in this list", and so on. Lots of math, but next to no branching, and the memory access is usually more predictable. GPU code also tends to have special instructions for memory handling ("grab numbers from all these addresses and put them into a contiguous list"), I think?

The end result is that you can get good performance from a core that's way simpler than the brute force beasts described above - it's basically math units with a little bit of glue. That makes each GPU core comparatively tiny - so you can have hundreds or thousands of them in a chip that's still manageable in size and power consumption. They also prioritize floating point math, since that's much more common in GPU code than in typical CPU code.

Modern CPUs all look kind of similar, since they are made to be fast at running the same code - but there's a lot of implementation details that differ. Exactly the same is true of GPUs. AMD and Intel make CPUs, and entirely separately, Intel, AMD and nVidia make GPUs. Computer viking fucked around with this message at 11:42 on May 11, 2022 |
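To make the "multiply every value in this list with the corresponding value in this list" point concrete, here's a minimal Python sketch (using numpy purely as a stand-in for GPU-style vectorized execution - an analogy, not actual GPU code): the scalar loop works through one element at a time the way a single CPU thread would, while the numpy call expresses the whole-list operation in one go.

```python
import numpy as np

def scalar_multiply(a, b):
    # CPU-style: step through the lists one element at a time
    out = []
    for x, y in zip(a, b):
        out.append(x * y)
    return out

def vector_multiply(a, b):
    # GPU-style: one operation applied across every element at once
    return np.asarray(a) * np.asarray(b)

a = [1.0, 2.0, 3.0, 4.0]
b = [10.0, 20.0, 30.0, 40.0]
print(scalar_multiply(a, b))           # [10.0, 40.0, 90.0, 160.0]
print(vector_multiply(a, b).tolist())  # [10.0, 40.0, 90.0, 160.0]
```

Both produce the same numbers; the difference is that the vectorized form hands the hardware (or library) one big, branch-free job it can spread across many simple execution units.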
# ? May 11, 2022 11:38 |
|
Lol, nerd
|
# ? May 11, 2022 12:12 |
|
Devils Affricate posted:To address this part specifically, several years ago AMD bought ATI, which was a company that exclusively made graphics cards. So now they can do both pretty easily! Also Intel makes a ton of poo poo outside of CPUs. So to answer the question in the first post, different companies have different business models. Shocking, I know.
|
# ? May 11, 2022 12:27 |
|
Remember that if you want to play 3d games you also need to buy voodoo graphic accelerator
|
# ? May 11, 2022 12:28 |
|
oh boy can't wait to spend $600 on a graphix card so I can play all the incredible games that've come out in the last 10 years, like uhhhhhhhhhh uhhhhhhhhhhhhhhhhhhhhhhhhh
|
# ? May 11, 2022 12:44 |
|
I don't know that much about computers either op, luckily dating sims usually have pretty low system requirements so I should be fine
|
# ? May 11, 2022 12:47 |
|
STABASS posted:oh boy can't wait to spend $600 on a graphix card so I can play all the incredible games that've come out in the last 10 years, like uhhhhhhhhhh one of those giant corps is the lovable underdog while the others are evil.
|
# ? May 11, 2022 13:40 |
|
Computer viking posted:I can make an attempt, I guess? This was a really fascinating and interesting post! I feel like I learned something. Thanks!
|
# ? May 11, 2022 14:01 |
|
ScRoTo TuRbOtUrD posted:computers are for fuckin nerds posting on a wooden abacus
|
# ? May 11, 2022 14:42 |
|
DrSunshine posted:This was a really fascinating and interesting post! I feel like I learned something. Thanks! Actually yeah. I didn't know why cache was so important until now.
|
# ? May 11, 2022 14:51 |
|
kntfkr posted:posting on a wooden abacus there are lots of ways to post on the internet without using computers https://www.ietf.org/rfc/rfc4824.txt https://datatracker.ietf.org/doc/html/rfc1149 etc. i use smoke signals.
|
# ? May 11, 2022 14:57 |
|
poverty goat posted:there are lots of ways to post on the internet without using computers Do you have any idea how long it would take to send unsolicited pictures of my own dick and balls using this method? Minutes probably
|
# ? May 11, 2022 16:33 |
|
Nuts and Gum posted:My friend overclocked his celeron. We would call them celery as a joke, which was the style at the time. I have a vague memory of dudes polishing the metal on the Celeron die with fine sandpaper so it would better conduct heat. All to get to 1 GHz! Good times.
|
# ? May 11, 2022 23:34 |
|
who's the dumb poo poo who bought thermal paste even though the heat sink comes with it already applied? THIS GUY RIGHT HERE.
|
# ? May 11, 2022 23:42 |
|
old beast lunatic posted:who's the dumb poo poo who bought thermal paste even though the heat sink comes with it already applied? THIS GUY RIGHT HERE. Yeah but you need the ULTRAMAXXX CoolPaste Alpha GRAPHITE extreme for Maxxximum heat transfer
|
# ? May 12, 2022 03:33 |
|
|
Haven't read a single post but I know there are no less than six (6) pairs of jorts in this thread. Get wedgied, dorks. JOCKS RULE
|
# ? May 12, 2022 03:38 |