Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
Microsoft was always fairly ahead of the curve on moving towards commodity architectures. The OG Xbox was more or less a Pentium III and an NVIDIA GPU. The Xbox 360 was more or less a tri-core PowerPC CPU with an ATI GPU. They never did the super-esoteric Cell/Saturn-style crap.

Probably due to their experiences in the PC space and the OG Xbox's adaptation of Windows and DirectX as the basis for the Xbox environment/runtime.

Ironically, while the hardware wasn't off-the-shelf x86, the PS1 and PS2 really were much more straightforward CPU-GPU arrangements; Sony just blew it with the Cell.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
Didn't the base Xbox One have some particular gamble on a hardware configuration that ultimately didn't work out, making it marginally weaker than the PS4?

CoolCab
Apr 17, 2005

glem

gradenko_2000 posted:

Didn't the base Xbox One have some particular gamble on a hardware configuration that ultimately didn't work out, making it marginally weaker than the PS4?

there were issues with the memory being incorrect, yeah

repiv
Aug 13, 2009

Yeah the OG Xbox One used plain old DDR3 for both system and graphics memory, and tried to offset it with 32MB of fast on-chip memory, while the PS4 just used GDDR5 for everything

It was awkward to program for and Microsoft switched to just using GDDR5 with the Xbox One X
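
For a rough sense of the gap being described: peak bandwidth is just bus width times transfer rate. A minimal C sketch using the widely reported launch-era figures (ballpark numbers, not official specs):

```c
#include <stdio.h>

/* Peak bandwidth (GB/s) = bus width in bytes * transfer rate in GT/s.
   Figures are the widely reported launch-era specs, so treat the
   output as ballpark, not gospel. */
static double peak_gbs(int bus_bits, double gt_per_s) {
    return (bus_bits / 8.0) * gt_per_s;
}

int main(void) {
    /* Xbox One: 8 GB of DDR3-2133 on a 256-bit bus */
    printf("XB1 DDR3:  %.0f GB/s\n", peak_gbs(256, 2.133)); /* ~68  */
    /* PS4: 8 GB of GDDR5 at 5.5 GT/s on a 256-bit bus */
    printf("PS4 GDDR5: %.0f GB/s\n", peak_gbs(256, 5.5));   /* ~176 */
    /* XB1 ESRAM: ~102 GB/s each way (~204 GB/s mixed read/write),
       but only 32 MB of it */
    printf("XB1 ESRAM: ~102-204 GB/s, 32 MB\n");
    return 0;
}
```

The ESRAM could roughly match the PS4's bandwidth, but only for whatever fit in 32 MB; everything else fell back to the ~68 GB/s DDR3 pool.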

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

gradenko_2000 posted:

Didn't the base Xbox One have some particular gamble on a hardware configuration that ultimately didn't work out, making it marginally weaker than the PS4?

My recollection is that Sony designed the PS4 around 4GB of GDDR5, and then memory prices fell much faster than anticipated and they were able to get 8GB in there. MS really wanted that 8GB of RAM and used cheaper plain DDR3 assuming anything faster would be cost prohibitive.

CoolCab
Apr 17, 2005

glem

repiv posted:

Yeah the OG Xbox One used plain old DDR3 for both system and graphics memory, and tried to offset it with 32MB of fast on-chip memory, while the PS4 just used GDDR5 for everything

It was awkward to program for and Microsoft switched to just using GDDR5 with the Xbox One X

that 32 megs wasn't enough for a 1080p image iirc, so they constantly had to rejigger the size of the frame because they were literally running out of room before having to fall back to the hugely slow system memory, lol. it couldn't really do 1080p in a ton of titles partially for this reason, i want to say?

in a well actually
Jan 26, 2011

dude, you gotta end it on the rhyme

Paul MaudDib posted:

Microsoft was always fairly ahead of the curve on moving towards commodity architectures. The OG Xbox was more or less a Pentium III and an NVIDIA GPU. The Xbox 360 was more or less a tri-core PowerPC CPU with an ATI GPU. They never did the super-esoteric Cell/Saturn-style crap.

What's really great is that the 360 used the same PowerPC core design (the non-SPE part) that Sony paid IBM to co-develop.

repiv
Aug 13, 2009

CoolCab posted:

that 32 megs wasn't enough for a 1080p image iirc, so they constantly had to rejigger the size of the frame because they were literally running out of room before having to fall back to the hugely slow system memory, lol. it couldn't really do 1080p in a ton of titles partially for this reason, i want to say?

It would depend on the design of the renderer but yeah it would be pretty easy to blow past 32MB when doing deferred shading at 1080p

IIRC Crytek published some wacky workaround where they rendered with chroma subsampling to save space
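
To put rough numbers on that: a back-of-envelope C sketch using a hypothetical but era-typical deferred layout (the target list is an assumption, not any particular engine's):

```c
#include <stdio.h>

int main(void) {
    const double MIB = 1024.0 * 1024.0;
    const int w = 1920, h = 1080;
    /* Assumed per-pixel G-buffer cost: albedo+AO (4 bytes),
       normals (4), material params (4), HDR light accumulation
       as RGBA16F (8), depth/stencil (4). */
    const int bytes_per_pixel = 4 + 4 + 4 + 8 + 4;
    double total_mib = (double)w * h * bytes_per_pixel / MIB;
    /* prints ~47.5 MiB -- well past the 32 MiB of ESRAM */
    printf("1080p G-buffer: %.1f MiB vs 32 MiB ESRAM\n", total_mib);
    return 0;
}
```

Storing chroma at half resolution, per the Crytek trick, roughly halves the cost of a colour target compared to full 4:4:4.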

WhyteRyce
Dec 30, 2001

Wasn’t it not possible to directly access the 32MB and you had to hack your way through some bullshit paths to get there?

VorpalFish
Mar 22, 2007
reasonably awesome™

repiv posted:

Yeah the OG Xbox One used plain old DDR3 for both system and graphics memory, and tried to offset it with 32MB of fast on-chip memory, while the PS4 just used GDDR5 for everything

It was awkward to program for and Microsoft switched to just using GDDR5 with the Xbox One X

A concept that sort of lives on in current-gen Radeons in the form of Infinity Cache, which lets them get away with "only" a 256-bit bus on their high-end GPUs.
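
The intuition, sketched: a large on-die cache multiplies the effective bandwidth of a narrow bus by its hit rate. A toy model in C with ballpark RDNA2-class figures (both bandwidth numbers are assumptions, not official specs):

```c
#include <stdio.h>

int main(void) {
    /* 256-bit GDDR6 at 16 GT/s: 32 bytes/transfer * 16 GT/s = 512 GB/s */
    const double dram_gbs  = 512.0;
    /* assumed on-die Infinity Cache bandwidth, ballpark only */
    const double cache_gbs = 1600.0;
    for (int i = 0; i <= 4; i++) {
        double hit = i * 0.2; /* fraction of traffic served on-die */
        printf("hit rate %3.0f%% -> ~%4.0f GB/s effective\n",
               hit * 100.0, hit * cache_gbs + (1.0 - hit) * dram_gbs);
    }
    return 0;
}
```

At a decent hit rate, the narrow bus behaves like a much wider one.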

BlankSystemDaemon
Mar 13, 2009

Twerk from Home posted:

Hardware TSX also helps PS3 emulation performance a ton, to the point where people have been finding hacks to re-enable it on the generations of CPU that had it, which were at least Haswell and Broadwell, but maybe Skylake: https://rpcs3.net/blog/2020/08/21/hardware-performance-scaling/.

Ok, looking at the article, the God of War and Uncharted games only run well with TSX, which is only available on 4th-9th gen Intel CPUs, and requires disabling some security mitigations.

PS3 emulation is the wild west of performance engineering. I'm really curious if having >8 cores with full AVX-512 support might finally crack that nut, because it'll be more possible to pin threads and directly act like the SPUs.
Do note that enabling hardware TSX on Intel CPUs opens you up to all sorts of pretty credible information leaks, including ones that've been used to make a microcode decryptor.
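
For context on what the emulator gets out of it: TSX's RTM extension gives you hardware transactions, which map naturally onto the Cell's reservation-style atomics that RPCS3 has to emulate. A minimal sketch, assuming an RTM-capable CPU and gcc with -mrtm (a real fallback path would take a lock, which is elided here):

```c
#include <immintrin.h>
#include <stdio.h>

static long counter;

static void increment(void) {
    unsigned status = _xbegin();
    if (status == _XBEGIN_STARTED) {
        counter++;  /* speculative: commits atomically at _xend()... */
        _xend();    /* ...or the whole block rolls back on conflict  */
    } else {
        /* aborted: conflict, capacity, or TSX fused off/disabled */
        counter++;  /* fallback; should be lock-protected in real code */
    }
}

int main(void) {
    increment();
    printf("counter = %ld\n", counter);
    return 0;
}
```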

repiv
Aug 13, 2009

VorpalFish posted:

A concept that sort of lives on in current-gen Radeons in the form of Infinity Cache, which lets them get away with "only" a 256-bit bus on their high-end GPUs.

The difference is the Xbox ESRAM wasn't a cache, it was an independent block of memory that the engine had to janitor manually

A closer analogue is the Series X/S which have fast RAM and slow RAM, but the fast RAM is the bigger chunk so it's easier to handle
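
To illustrate the "janitoring": with a cache, the hardware decides what is resident; with ESRAM, the engine had to decide per render target what lived in the fast 32 MB. A hypothetical greedy sketch in C (all names and sizes invented):

```c
#include <stdio.h>

typedef struct {
    const char *name;
    double size_mib;
} Target;

int main(void) {
    /* invented render targets, pre-sorted by bandwidth demand */
    Target targets[] = {
        { "light_accum_rgba16f", 15.8 },
        { "depth_stencil",        7.9 },
        { "gbuffer_albedo",       7.9 },
        { "shadow_atlas",        16.0 },
    };
    double budget_mib = 32.0; /* the ESRAM pool */
    for (int i = 0; i < 4; i++) {
        int fits = targets[i].size_mib <= budget_mib;
        printf("%-20s %5.1f MiB -> %s\n", targets[i].name,
               targets[i].size_mib, fits ? "ESRAM" : "DDR3 (slow)");
        if (fits) budget_mib -= targets[i].size_mib;
    }
    return 0;
}
```

Every change to resolution or render passes reshuffles this by hand, which is the constant rejiggering mentioned upthread.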

BlankSystemDaemon posted:

Do note that enabling hardware TSX on Intel CPUs opens you up to all sorts of pretty credible information leaks, including ones that've been used to make a microcode decryptor.

Maybe so, but there's probably not much effort being put into exploiting a flaw that only a subset of PS3 emulation enthusiasts are still vulnerable to

repiv fucked around with this message at 20:33 on Jul 26, 2022

Shipon
Nov 7, 2005

BlankSystemDaemon posted:

Do note that enabling hardware TSX on Intel CPUs opens you up to all sorts of pretty credible information leaks, including ones that've been used to make a microcode decryptor.

Does this matter for anyone who isn't a sysadmin? I'm pretty tired of losing performance because some infosec researchers saw an attack that might only be employed against critical business/government servers and caused us gamers to lose out on performance through the mandatory mitigations.

BlankSystemDaemon
Mar 13, 2009

repiv posted:

Maybe so, but there's probably not much effort being put into exploiting a flaw that only a subset of PS3 emulation enthusiasts are still vulnerable to

Probably, but it's still something to be mindful of - because security doesn't come about by Doing One Thing That Hackers Hate.

Shipon posted:

Does this matter for anyone who isn't a sysadmin? I'm pretty tired of losing performance because some infosec researchers saw an attack that might only be employed against critical business/government servers and caused us gamers to lose out on performance through the mandatory mitigations.

The thing is, an information leak by itself is fairly mild.

But what about when you're running often-obfuscated code in a language that's designed to let you do anything without any limits/constraints? Because that's the reality of browsing the modern web; JavaScript, or increasingly WebAssembly, gets executed at an astonishing rate, and not only is there too much of it for any single person or group of people to audit, that process is often intentionally made harder.

Add to this that Spectre and Meltdown are neither the first nor the last of their lineage of information leaks that can be used to expose privileged information (such as data that's normally assumed to be cryptographically secure, like the ciphers and hashes used for modern encryption, or the keys you've got stored in your password keeper), and you can probably see why it's not just a big deal for big corporations when exploitation of these things eventually starts happening at a broad scale.

What's worse is that there's been an increasing trend whereby security researchers spend weeks and months decompiling, deobfuscating, and working through an exploit chain used to deliver some sort of code, only for it to turn out to be a cryptocoin miner - this has caused a not-insignificant number of people to leave the infosec field, because it's utterly draining to spend all that effort only to find the same thing.
This leads to even fewer people working on these security problems.

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?

Twerk from Home posted:

Hardware TSX also helps PS3 emulation performance a ton, to the point where people have been finding hacks to re-enable it on the generations of CPU that had it, which were at least Haswell and Broadwell, but maybe Skylake: https://rpcs3.net/blog/2020/08/21/hardware-performance-scaling/.

Ok, looking at the article, the God of War and Uncharted games only run well with TSX, which is only available on 4th-9th gen Intel CPUs, and requires disabling some security mitigations.

PS3 emulation is the wild west of performance engineering. I'm really curious if having >8 cores with full AVX-512 support might finally crack that nut, because it'll be more possible to pin threads and directly act like the SPUs.

I got a 6400 in my other computer, but that's probably too hamstrung by only having 4 threads anyway.

repiv
Aug 13, 2009

Speaking of AVX512 and emulation

https://twitter.com/Dachsjaeger/status/1552621823975727106?t=ERHvPyKvSVF8jM0CvFoA7g&s=19

Zen4 is probably going to crush RPCS3

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?

repiv posted:

Speaking of AVX512 and emulation

https://twitter.com/Dachsjaeger/status/1552621823975727106?t=ERHvPyKvSVF8jM0CvFoA7g&s=19

Zen4 is probably going to crush RPCS3

DDR5 typo on the 10900K side

I was hoping one of the big tech channels would do a video on RPCS3

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

repiv posted:

Zen4 is probably going to crush RPCS3

Wait, Zen 4 has full-width AVX-512 support for the common instructions? And Intel client stuff doesn't? What kind of bizarro world are we in?

in a well actually
Jan 26, 2011

dude, you gotta end it on the rhyme

Twerk from Home posted:

Wait, Zen 4 has full-width AVX-512 support for the common instructions? And Intel client stuff doesn't? What kind of bizarro world are we in?

It executes the 512b instructions in two clocks instead of one.

repiv
Aug 13, 2009

in a well actually posted:

It executes the 512b instructions in two clocks instead of one.

Where'd that information come from? I thought it was still up in the air whether 512-bit ops would run at full or half rate.

For emulation that's irrelevant anyway; neither RPCS3 nor Yuzu uses anything wider than 128-bit. They only care about the new instructions and extra registers.
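
To make that concrete: AVX-512VL exposes the new instructions, the mask registers, and the 32 architectural vector registers at 128-bit width, which matches the SPU's vector unit. A small sketch, assuming an AVX-512-capable CPU and gcc with -mavx512f -mavx512vl (the specific ops are just illustrative):

```c
#include <immintrin.h>
#include <stdio.h>

int main(void) {
    __m128i a = _mm_set_epi32(1, 2, 3, 4);
    __m128i b = _mm_set_epi32(10, 20, 30, 40);
    __m128i c = _mm_set_epi32(100, 200, 300, 400);

    /* vpternlogd: any 3-input boolean function in one instruction.
       Immediate 0xE8 is bitwise majority, which takes several ops
       in plain SSE/AVX. */
    __m128i maj = _mm_ternarylogic_epi32(a, b, c, 0xE8);

    /* masked add: unselected lanes pass through from 'a' unchanged,
       replacing the usual blend sequences. Mask 0x5 = lanes 0 and 2. */
    __m128i sum = _mm_mask_add_epi32(a, 0x5, b, c);

    int out[4];
    _mm_storeu_si128((__m128i *)out, maj);
    printf("majority: %d %d %d %d\n", out[0], out[1], out[2], out[3]);
    _mm_storeu_si128((__m128i *)out, sum);
    printf("masked:   %d %d %d %d\n", out[0], out[1], out[2], out[3]);
    return 0;
}
```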

hobbesmaster
Jan 28, 2008

in a well actually posted:

It executes the 512b instructions in two clocks instead of one.

Ah, just like early 256-bit AVX.

It’s nice that the code works at least.

in a well actually
Jan 26, 2011

dude, you gotta end it on the rhyme

repiv posted:

Where'd that information come from? I thought it was still up in the air whether 512-bit ops would run at full or half rate.

For emulation that's irrelevant anyway; neither RPCS3 nor Yuzu uses anything wider than 128-bit. They only care about the new instructions and extra registers.

Twitter; folks mostly working from the AMD ISA manual, published floating-point throughput figures for future supercomputers, and publicly disclosed power and design stats.

kliras
Mar 27, 2021
bit of a miss

https://twitter.com/cnbcnow/status/1552747581427564544?s=21&t=8DOqsZ1-zEOdU96hV-hkPw

BlankSystemDaemon
Mar 13, 2009

I'm genuinely curious what you think this means?
Every single stock trading decision that has any effect is made by some ~algorithm~ that's optimized to make money for someone who values short-term profits over anything up to and including the survival of the human species.

Not that I don't think Intel needs to prove themselves capable of competing with AMD, they absolutely do - but I'm just not sure what relevance it has.

Inept
Jul 8, 2003

BlankSystemDaemon posted:

I'm genuinely curious what you think this means?

They made $2.5 billion less in Q2 than they expected. That kind of drop usually leads to more drastic measures to appease investors, such as layoffs, mothballing unprofitable parts of the company, and so on.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
You can think what you want of the stock price, but the earnings are real, and that's a big miss. AMD and Nvidia are probably going to be hurting for the next year or so as well. The tides have turned, and right now is not the best time to be a tech hardware company.

kliras
Mar 27, 2021
here's a fun one to peruse

https://twitter.com/IanCutress/status/1552764407696031749

intel's also completely shutting down optane. maybe you'll be able to find some fun stuff on ebay

Cygni
Nov 12, 2005

raring to post

Yeah, demand for consumer PC stuff softened insanely last quarter, basically across the board. Intel reported a 17% QoQ drop for consumer, and the analyst info was that the entire PC market contracted by 13% this quarter. And Intel being (still) unable to get Sapphire Rapids, Arc, and Ponte Vecchio out the door in meaningful numbers is not helping, as Intel said on the call. The CFO did say they believe it's the bottom now, though. We will see.

There was this piece in the press release though:

quote:

Intel made significant progress during the quarter on the ramp of Intel 7, now shipping in aggregate over 35 million units. The company expects Intel 4 to be ready for volume production in the second half of this year and is at or ahead of schedule for Intel 3, 20A and 18A

If true, they might climb back into the fab competition, given TSMC's delays.

wargames
Mar 16, 2008

official yospos cat censor
I do think we have another quarter of contraction before a rebound.

Perplx
Jun 26, 2004


Best viewed on Orgasma Plasma
Lipstick Apathy
Seems like they lost money on GPUs and foundry, which makes sense given startup costs and that neither is really for sale yet.

BattleMaster
Aug 14, 2000

Perplx posted:

Seems like they lost money on gpus and foundry, which makes sense because of startup costs and they aren’t for sale yet.

hopefully MBAs who are learning for the first time that new products cost money to make don't do anything too stupid in response to this

mobby_6kl
Aug 9, 2009

by Fluffdaddy

BattleMaster posted:

hopefully MBAs who are learning for the first time that new products cost money to make don't do anything too stupid in response to this

https://mobile.twitter.com/anshelsag/status/1552768758023745536

KYOON GRIFFEY JR
Apr 12, 2010

Runner-up, TRP Sack Race 2021/22

BattleMaster posted:

hopefully MBAs who are learning for the first time that new products cost money to make don't do anything too stupid in response to this

probably the more concerning piece is huge revenue decreases in key segments, particularly client computing

ConanTheLibrarian
Aug 13, 2004


dis buch is late
Fallen Rib
Yeah, the datacentre figures are pretty yikes. I'd imagine Intel is getting squeezed between better-value-for-money Epycs and higher-density ARM-based designs like Amazon's Graviton.

Dr. Fishopolis
Aug 31, 2004

ROBOT

BlankSystemDaemon posted:

I'm genuinely curious what you think this means?
Every single stock trading decision that has any effect is made by some ~algorithm~ that's optimized to make money for someone who values short-term profits over anything up to and including the survival of the human species.

Do you think that stock prices just go up and down randomly based on the whims of computers without any input from the realities of business or the economy? It means that intel missed its earnings targets by a mile and the market reacted in an expected fashion.

I agree that the stock market is an irrational casino but it's not that irrational.

WhyteRyce
Dec 30, 2001

To be fair Intel quite often beat market expectations and still dropped the next day

Dr. Fishopolis
Aug 31, 2004

ROBOT

WhyteRyce posted:

To be fair Intel quite often beat market expectations and still dropped the next day

well sure, but if you beat market expectations yet you're still losing datacenter share your stock is probably gonna drop anyway.

SwissArmyDruid
Feb 14, 2014

by sebmojo

Dr. Fishopolis posted:

Do you think that stock prices just go up and down randomly based on the whims of computers without any input from the realities of business or the economy? It means that intel missed its earnings targets by a mile and the market reacted in an expected fashion.

I agree that the stock market is an irrational casino but it's not that irrational.

AMD stock goes down whenever they announce new products. Not even attached to earnings reports. Like, good new products. I'm semi-convinced it's because there's an HFT system with a headline-scraping algorithm that just automatically dumps AMD every time it shows up in the news somewhere.

CoolCab
Apr 17, 2005

glem
bunch of rumours going around about capabilities or whatever, making people think the new chips are going to poo poo sunbeams, and then expectations not living up to reality, maybe?

SwissArmyDruid
Feb 14, 2014

by sebmojo
Over on team red, we were just coming off years and years of AMD being on a downward trajectory towards joining the ranks of Cyrix and Transmeta, barely scraping by with Construction cores.

But Zen 1 came out, and while it was a tiny light of hope, it was enough to give AMD the separation it needed from The Bad Years - and, fine, there was reason to be skeptical of AMD after the entire Construction cores debacle and their continued woes with GCN.

Then Zen 1+: still no guarantee that AMD could follow up with a good second step.

And then we got into Zen 2 and 3, and STILL the stock was dropping after Computex keynotes. By now AMD has put out enough products that compete on par with Intel's, gained enough server market share, and set its sights firmly enough on mobile that I really do suspect some legacy algorithm in someone's HFT stack just inherently associates a large number of AMD headlines with bad news and dumps the stock, despite AMD being on a consistent growth track.

But this isn't an x86 thread, so let's get back to Intel.

I really hope Intel doesn't take their current stock woes as an indication that they need to axe the GPU department before it's had a chance to do anything. The improved Xe cores in their iGPU silicon have been a really nice benefit when upgrading friends and family in the past few years. I'm hoping that if Intel can get their feet under them, they'll be able to make GPUs that live up to the promise the Nvidia Titan cards once had, with less driver screwiness.
