Pontificating Ass
Aug 2, 2002

What Doth Life?
:synpa:

nvidia is for chads, amd is for poors

Inept
Jul 8, 2003

we tircked rocks into thinking iwth lightinging

i loving love science

Amarcarts
Feb 21, 2007

This looks a lot like suffering.
Deez nuts are semiconducting your chin.

Devils Affricate
Jan 22, 2010

Twat McTwatterson posted:

I get that they make computer chips. But I don't understand why some only make CPUs or GPUs or AMD happens to make both. But Intel just recently or is about to come out with their first GPU? But why hadn't they before? And why doesn't NVIDIA make CPUs?

Any pc nerds here?

To address this part specifically, several years ago AMD bought ATI, which was a company that exclusively made graphics cards. So now they can do both pretty easily!

Intel has been making GPUs since the 90s, they just kinda suck at it. Every time they announce that they're going to make another one a bunch of nerds get super excited and then it comes out and it's trash. Someone should probably tell them to stop.

Nvidia doesn't make CPUs because I guess they don't want to? Sometimes companies don't make certain things that other companies make. :shrug:

ClamdestineBoyster
Aug 15, 2015
Probation
Can't post for 10 years!
While most computing operations only require a logic gate, and science has been yapping on about this “quantum superstate” for years, amd processors have been capable of running 4 electronic states for years. On, not, null, both, and the new quantum state, by. So we should really be able to invoke not only reason but also feeling (understanding) on existing systems.

Samuel L. ACKSYN
Feb 29, 2008


everyone like "intel making their 1st discrete gpu, comin soon"?

and im like they already did that !?


Samuel L. ACKSYN
Feb 29, 2008


sometimes u look and theres like the 12900k but then theres a 10700kf or somethin and then aMD is like " we got 5600xh3d with 14 twizy cores" and ur like oh thats better maybe ??

kntfkr
Feb 11, 2019

GOOSE FUCKER
threadripper

pronounced threa dripper

Samuel L. ACKSYN
Feb 29, 2008


remember the good days when we had pentium 2s in cartridges u just slap in there, and now we gotta be all gentle and say "don't bend the pins!"

Buce
Dec 23, 2005

computers

Bad Purchase
Jun 17, 2019




they are like democrats and republicans, they take turns being the one in charge or the scrappy underdog, and pretend they are in close competition. but actually both sides are the same and are conspiring to keep you oppressed using wedge issues like proprietary vectorized instruction standards to radicalize their respective user bases.

kntfkr
Feb 11, 2019

GOOSE FUCKER
Lisa Su can do what Nintendon't

mom and dad fight a lot
Sep 21, 2006

If you count them all, this sentence has exactly seventy-two characters.

Big Beef City posted:

wait until this motherfucker finds out about ARM architecture lmao fuckin sscrub JESUS

Skeleton Ape
Dec 21, 2008




Ok boomer

Nooner
Mar 26, 2011

AN A+ OPSTER (:

Samuel L. ACKSYN posted:

sometimes u look and theres like the 12900k but then theres a 10700kf or somethin and then aMD is like " we got 5600xh3d with 14 twizy cores" and ur like oh thats better maybe ??

Nooner
Mar 26, 2011

AN A+ OPSTER (:
Computers are freaking confusing i just wabt to play games and look at naked ladies

ClamdestineBoyster
Aug 15, 2015
Probation
Can't post for 10 years!

Nooner posted:

Computers are freaking confusing i just wabt to play games and look at naked ladies

You need more gigaschmitz and cooling tubes if you wanna see those flip-flaps in upper definition. :hai:

Fartington Butts
Jan 21, 2007


It's nvidia business.

ScRoTo TuRbOtUrD
Jan 21, 2007

im detecting a memory leak and it turns out its from yospos

ScRoTo TuRbOtUrD
Jan 21, 2007

computers are for fuckin nerds

Vegetable
Oct 22, 2010

What i want to know is how apparently the entire world depends on TSMC for good chips and intel and china and samsung have had literally no success in catching up. Like what the gently caress, all their billions combined haven’t been able to beat a company whose talent and resource pool is the island of taiwan. it’s nuts

SeXReX
Jan 9, 2009

I drink, mostly.
And get mad at people on the internet


:emptyquote:

Vegetable posted:

What i want to know is how apparently the entire world depends on TSMC for good chips and intel and china and samsung have had literally no success in catching up. Like what the gently caress, all their billions combined haven’t been able to beat a company whose talent and resource pool is the island of taiwan. it’s nuts

Chip fabrication facilities take a long time to break even on vs the cost of just buying time at TSMC, and until 2020 none of those companies had considered the possibility that the math might ever change

Mooey Cow
Jan 27, 2018

by Jeffrey of YOSPOS
Pillbug

Vakal posted:

Has anyone ever met someone who works on developing new CPUs and GPUs?

Are they like normal people that own normal looking houses and drive into work every day, or are they people with brains so advanced they can barely function in society and are chained up in the bowels of the company's secret R&D facilities so no other company can poach them and steal their secrets?

Yeah i met one. He was a complete weirdo in every way with strange hair and a big stupid hat and he just like stared at me with this judgemental look, like just straight in the eyes for minutes on end and didn't even blink. Not even once. Then he goes like "why you jacking off in the middle of the street?" WTF?! Don't talk to me while i'm jacking it you freak!!

gleebster
Dec 16, 2006

Only a howler
Pillbug
The GPU was superseded by the NKVD, which was itself superseded by the KGB. They all dealt in counterintel, though.

Computer viking
May 30, 2011
Now with less breakage.

I can make an attempt, I guess?

A GPU is a specialized CPU - but to explain how, let's begin by describing a modern "normal" CPU.

In a CPU, the actual "processing" hardware handles tasks like "do math on these numbers" and "compare these things" and "set the next instruction to be executed to be the one at this place in RAM". It works with registers, tiny fast pieces of memory that are as fast as the processing hardware but only hold one value each. I'll use "r1" and "r2" and so on as example registers.
To keep this fed with instructions and data, another part of the hardware talks to the RAM chips. Those are way slower than a CPU: You can do hundreds of instructions in the time it takes to get something from RAM. To help cover that up, there's some cache on the CPU; memory that's almost as fast as the computing parts, but expensive and power hungry.
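You can actually see the cache from ordinary code. Here's a tiny C sketch (nothing from a real product, just an illustration): it sums the same big 2D array twice, once walking along rows (consecutive addresses, so each cache line gets fully used) and once along columns (every access lands in a different cache line). Same math, but the second loop usually comes out several times slower.
code:
#include <stdio.h>
#include <time.h>

#define N 4096

static int grid[N][N];   /* 64 MB, comfortably bigger than the caches */

int main(void) {
    long sum = 0;
    clock_t t0;

    /* Row-major walk: consecutive addresses, so one 64-byte cache line
       fetched from RAM serves sixteen of these ints. */
    t0 = clock();
    for (int i = 0; i < N; i++)
        for (int j = 0; j < N; j++)
            sum += grid[i][j];
    printf("row-major:    %.3fs\n", (double)(clock() - t0) / CLOCKS_PER_SEC);

    /* Column-major walk: every access touches a different cache line,
       so the CPU spends most of its time waiting on RAM. */
    t0 = clock();
    for (int j = 0; j < N; j++)
        for (int i = 0; i < N; i++)
            sum += grid[i][j];
    printf("column-major: %.3fs\n", (double)(clock() - t0) / CLOCKS_PER_SEC);

    return (int)(sum & 1);   /* use sum so the compiler keeps the loops */
}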

In an old-school CPU, you grabbed one instruction from memory, did whatever it said (something like add r1 to [ the value that's in RAM at the address in r2 ] and store the result in r3), and when that was complete, you started on the next instruction. That can lead to a lot of idle time, though - sitting around waiting for RAM is a big one. The CPU also has parts that do different things: The largest and most power-consuming ones are typically the math ones - there's separate hardware for integer math (whole numbers) and floating-point math (numbers with decimals, which are way more complicated). If you do one instruction at a time, a lot of the CPU will be idle at any given moment.

To speed things up, a more modern CPU (and by modern I mean the 1995 Pentium Pro) can start executing multiple instructions at once. Part of the trick is to split instructions into smaller parts: Say you have compare [the memory at the address stored in r1] to r2. You can split that into read the value at [r1] into a temporary register; compare temporary register with r2. If you do this decoding ahead of time, you can skim through the queue of decoded instructions looking for anything that reads from memory - and start pulling things from memory early.

You can also do independent instructions simultaneously: Say you have
code:
compare r1, r2
if they were equal go to :equal
  multiply r3 by 0.1, store result in r3
  go to :end
:equal
  multiply r2 by 16, store result in r2
:end
You can compute "r3 * 0.1" and "r2 * 16" at the same time, since one is a whole number and the other a decimal number, and that's done by different pieces of hardware. So what actually ends up happening is more like
code:
compare r1,r2 ; tmp1= r3*0.1 ; tmp2 = r2*16
if comparison was equal:  r2=tmp2 ; otherwise r3 = tmp1
With the entire first line happening in parallel.

It gets way worse, though. The typical code you run on a CPU has loads of branches - places where you do different code depending on the results of a comparison. To keep busy, modern CPUs can get quite deep into "speculative execution", where they run entire blocks of code ahead of time because there's a decent chance they'll need the results ... but they may just end up throwing them away. To make this efficient, they have large caches, and their RAM interface will do all sorts of informed guesswork (like "if you've asked for these four locations in a row, surely you'll want the next 64 as well") to reduce waiting. When you run through the same code more than once, they'll also keep statistics on "how often is this test true", and use that to optimize what parts to speculatively run first.
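Those branch statistics are worth a lot. A classic way to see it from plain C (again just a sketch, nothing vendor-specific): run the same filter loop over random bytes and then over sorted bytes. The math is identical, but on the sorted data the predictor is nearly always right, and the loop typically runs much faster.
code:
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N (1 << 24)

static int cmp_int(const void *a, const void *b) {
    int x = *(const int *)a, y = *(const int *)b;
    return (x > y) - (x < y);
}

static long count_big(const int *v, long n) {
    long sum = 0;
    for (long i = 0; i < n; i++)
        if (v[i] >= 128)          /* the branch the predictor has to guess */
            sum += v[i];
    return sum;
}

int main(void) {
    int *v = malloc(sizeof(int) * N);
    if (!v) return 1;
    for (long i = 0; i < N; i++)
        v[i] = rand() % 256;

    /* Random data: the branch goes either way about 50/50, so the
       predictor keeps guessing wrong and speculative work gets tossed. */
    clock_t t0 = clock();
    long a = count_big(v, N);
    double unsorted = (double)(clock() - t0) / CLOCKS_PER_SEC;

    /* Sorted data: one long run of "no" then one long run of "yes" -
       the predictor is nearly always right. */
    qsort(v, N, sizeof(int), cmp_int);
    t0 = clock();
    long b = count_big(v, N);
    double sorted = (double)(clock() - t0) / CLOCKS_PER_SEC;

    printf("unsorted: %.3fs  sorted: %.3fs  (sums: %ld %ld)\n",
           unsorted, sorted, a, b);
    free(v);
    return 0;
}
(One catch: with optimizations on, the compiler may replace that if with a branch-free conditional move, which flattens the difference - itself a nice illustration of how hard everyone works to avoid mispredicted branches.)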

The end result of this is that a modern CPU is an absolute brute force monster at chewing through messy, branching code, and can have a lot of things happening at once. The downside is that all this takes up a lot of physical space, and eats a lot of power. You also have several of these cores: A four-core CPU has four copies of most of the above. The cores typically have some "private" cache, and a larger one shared between them - and they share the memory controller. Sometimes, two adjacent cores may share some execution hardware - I think some older AMD designs shared the floating point math units between pairs of cores, for instance.

On the other end of the scale, what makes GPUs different?
GPU code is typically very simple: "multiply every value in this list by the corresponding value in this list of the same length"; "do this trigonometry operation to each number in this list", and so on. Lots of math, but next to no branching, and the memory access is usually more predictable. GPU code also tends to have special instructions for memory handling ("grab numbers from all these addresses and put them into a contiguous list"), I think?
The end result is that you can get good performance from a core that's way simpler than the brute-force beasts described above - it's basically math units with a little bit of glue. That makes each GPU core comparatively tiny - so you can have hundreds or thousands of them in a chip that's still manageable in size and power consumption. They also prioritize floating-point math, since that's much more common in GPU code than in typical CPU code.
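For a feel of the shape of that code, here's the list-multiply example as plain C (a toy sketch, not any real GPU API): on a CPU it's a loop, while on a GPU the loop disappears and every index becomes its own lightweight thread, thousands of them running the one-line body at once.
code:
#include <stdio.h>

#define N 8

/* The body of this loop is essentially an entire GPU "program":
   no branches, no pointer chasing, just the same float math applied
   at every index. A GPU would launch one thread per i instead of looping. */
static void multiply_lists(const float *a, const float *b, float *out, int n) {
    for (int i = 0; i < n; i++)
        out[i] = a[i] * b[i];
}

int main(void) {
    float a[N] = {1, 2, 3, 4, 5, 6, 7, 8};
    float b[N] = {8, 7, 6, 5, 4, 3, 2, 1};
    float out[N];

    multiply_lists(a, b, out, N);
    for (int i = 0; i < N; i++)
        printf("%g ", out[i]);
    printf("\n");
    return 0;
}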

Modern CPUs all look kind of similar, since they are made to be fast at running the same code - but there's a lot of implementation details that differ. Exactly the same is true of GPUs.
AMD and Intel make CPUs, and entirely separately, Intel, AMD and nVidia make GPUs.

Computer viking fucked around with this message at 11:42 on May 11, 2022

Radical 90s Wizard
Aug 5, 2008

~SS-18 burning bright,
Bathe me in your cleansing light~
Lol, nerd

Cyks
Mar 17, 2008

The trenches of IT can scar a muppet for life

Devils Affricate posted:

To address this part specifically, several years ago AMD bought ATI, which was a company that exclusively made graphics cards. So now they can do both pretty easily!

Intel has been making GPUs since the 90s, they just kinda suck at it. Every time they announce that they're going to make another one a bunch of nerds get super excited and then it comes out and it's trash. Someone should probably tell them to stop.

Nvidia doesn't make CPUs because I guess they don't want to? Sometimes companies don't make certain things that other companies make. :shrug:

Also Intel makes a ton of poo poo outside of CPUs. So to answer the question in the first post, different companies have different business models. Shocking, I know.

sad question
May 30, 2020

Remember that if you want to play 3d games you also need to buy voodoo graphic accelerator

STABASS
Apr 18, 2009

Fun Shoe
oh boy can't wait to spend $600 on a graphix card so I can play all the incredible games that've come out in the last 10 years, like uhhhhhhhhhh

uhhhhhhhhhhhhhhhhhhhhhhhhh

STABASS
Apr 18, 2009

Fun Shoe
I don't know that much about computers either op, luckily dating sims usually have pretty low system requirements so I should be fine

lllllllllllllllllll
Feb 28, 2010

Now the scene's lighting is perfect!
one of those giant corps is the lovable underdog while the others are evil.

STABASS posted:

oh boy can't wait to spend $600 on a graphix card so I can play all the incredible games that've come out in the last 10 years, like uhhhhhhhhhh
I just did that and I feel buyer's remorse as well as a strange kind of pride flowing through my veins.

DrSunshine
Mar 23, 2009

Did I just say that out loud~~?!!!

Computer viking posted:

I can make an attempt, I guess?

A GPU is a specialized CPU - but to explain how, let's begin by describing a modern "normal" CPU.

This was a really fascinating post! I feel like I learned something. Thanks!

kntfkr
Feb 11, 2019

GOOSE FUCKER

ScRoTo TuRbOtUrD posted:

computers are for fuckin nerds

posting on a wooden abacus

mom and dad fight a lot
Sep 21, 2006

If you count them all, this sentence has exactly seventy-two characters.

DrSunshine posted:

This was a really fascinating and interesting post! I feel like I learned something. Thanks!

Actually yeah. I didn't know why cache was so important until now.

poverty goat
Feb 15, 2004



kntfkr posted:

posting on a wooden abacus

there are lots of ways to post on the internet without using computers

https://www.ietf.org/rfc/rfc4824.txt
https://datatracker.ietf.org/doc/html/rfc1149

etc. i use smoke signals.

Medium Chungus
Feb 19, 2012

poverty goat posted:

there are lots of ways to post on the internet without using computers

https://www.ietf.org/rfc/rfc4824.txt
https://datatracker.ietf.org/doc/html/rfc1149

etc. i use smoke signals.

Do you have any idea how long it would take to send unsolicited pictures of my own dick and balls using this method?




















Minutes probably

dev286
Nov 30, 2006

Let it be all the best.

Nuts and Gum posted:

My friend overclocked his celeron. We would call them celery as a joke, which was the style at the time.

I have a vague memory of dudes polishing the metal on the Celeron die with fine sandpaper so it would better conduct heat. All to get to 1 GHz!

Good times.

old beast lunatic
Nov 3, 2004

by Hand Knit
who's the dumb poo poo who bought thermal paste even though the heat sink comes with it already applied? THIS GUY RIGHT HERE.

dev286
Nov 30, 2006

Let it be all the best.

old beast lunatic posted:

who's the dumb poo poo who bought thermal paste even though the heat sink comes with it already applied? THIS GUY RIGHT HERE.

Yeah but you need the ULTRAMAXXX CoolPaste Alpha GRAPHITE extreme for Maxxximum heat transfer

Literally A Person
Jan 1, 1970

Smugworth Wuz Here
Haven't read a single post but I know there are no less than six (6) pairs of jorts in this thread.

Get wedgied, dorks.

JOCKS RULE
