Worf
Sep 12, 2017

If only Seth would love me like I love him!

fishmech posted:

and then 10% "people might use this to rip blu ray movies".

dang that would have been cool to do


tbh having linux on a ps3 just sounds overall like a ton of fun and im sad i cant do that


repiv
Aug 13, 2009

it wasn't all that good in practice because linux couldn't use the gpu

software rendering is no fun

The_Franz
Aug 8, 2003

Statutory Ape posted:

dang that would have been cool to do


tbh having linux on a ps3 just sounds overall like a ton of fun and im sad i cant do that

blu-ray crypto had been compromised for years before this happened

using linux on the ps3 sucked because you only had access to 192 megs of system ram, the main cpu and io were really slow, and there was no gpu access, probably to avoid people making unlicensed games for the system. someone did find a way to use the gpu under linux at some point, but it was patched out in the next firmware revision.

Cygni
Nov 12, 2005

raring to post

Wasn't the whole point of the linux thing to use it as a bargaining chip to keep homebrew people from widely releasing copyright cracks?

Like "we know you can crack this system but we gave you linux, so don't release the crack to Johnny P. Fucko and we will let you keep it"

The_Franz
Aug 8, 2003

back in the mid-naughts ibm and sony seemed to really believe that the cell would be this amazing new processor that would take over the world and wanted to get it out there for people to use. ibm's cell-equipped blade servers cost somewhere in the neighborhood of $20k each, so letting people use linux on the ps3 was seen as an ideal way for students and others of lower means to get experience with the processor of the future. except it only took a few years for ibm to go "whelp, this thing has no future" and give up on it

i think the linux thing was also a tax dodge in certain countries since it got the ps3 classified as a computer instead of a video game machine

CommieGIR
Aug 22, 2006

The blue glow is a feature, not a bug


Pillbug

The_Franz posted:

back in the mid-naughts ibm and sony seemed to really believe that the cell would be this amazing new processor that would take over the world and wanted to get it out there for people to use. ibm's cell-equipped blade servers cost somewhere in the neighborhood of $20k each, so letting people use linux on the ps3 was seen as an ideal way for students and others of lower means to get experience with the processor of the future. except it only took a few years for ibm to go "whelp, this thing has no future" and give up on it

i think the linux thing was also a tax dodge in certain countries since it got the ps3 classified as a computer instead of a video game machine

Yeah, I have a Cell CPU-powered blade for my IBM Bladecenter. It's nothing great

TheFluff
Dec 13, 2006

FRIENDS, LISTEN TO ME
I AM A SEAGULL
OF WEALTH AND TASTE
The wikipedia page on Cell still has a lot of that mid-naughts hype :allears:

lolling about how all the talk about possible future applications just ends around 2007-2008 though

TorakFade
Oct 3, 2006

I strongly disapprove


repiv posted:

it wasn't all that good in practice because linux couldn't use the gpu

software rendering is no fun

I can confirm that, having had the PS3 as my main system for a period (it was the reason I bought it at launch, actually... I had no computer and it seemed a decent compromise). I could only tolerate it for a few months because linux was barely good enough for browsing.

I am glad they removed it, frankly; it was piss poor. If I was in the USA I might've gone for the class action to get some money back :v:

The_Franz
Aug 8, 2003

TorakFade posted:

I can confirm that, having had the PS3 as my main system for a period (it was the reason I bought it at launch, actually... I had no computer and it seemed a decent compromise). I could only tolerate it for a few months because linux was barely good enough for browsing.

I am glad they removed it, frankly; it was piss poor. If I was in the USA I might've gone for the class action to get some money back :v:

I wasn't being hyperbolic when i used the :10bux: smilie for the settlement amount. It was exactly $10.07 USD.

Klyith
Aug 3, 2007

GBS Pledge Week

TheFluff posted:

The wikipedia page on Cell still has a lot of that mid-naughts hype :allears:

lolling about how all the talk about possible future applications just ends around 2007-2008 though

It wasn't like they were totally on an island by themselves with the Cell, remember that bulldozer had the weird off-balance 1 ALU / 2 FPU design as well. Even though bulldozer FPUs were not the restricted and difficult Cell SPEs, there was still the idea that the multimedia future was going to call for lots of FLOPS.

Turned out not to be the case, magic compilers that extract parallelism still don't exist and getting programmers to write code for your gimmicks is a tough sell.

edit: oh welp I totally got bulldozer backwards :sweatdrop: this was a dumb post

Klyith fucked around with this message at 16:13 on Apr 19, 2019

Arzachel
May 12, 2012

Klyith posted:

It wasn't like they were totally on an island by themselves with the Cell, remember that bulldozer had the weird off-balance 1 ALU / 2 FPU design as well. Even though bulldozer FPUs were not the restricted and difficult Cell SPEs, there was still the idea that the multimedia future was going to call for lots of FLOPS.

Turned out not to be the case, magic compilers that extract parallelism still don't exist and getting programmers to write code for your gimmicks is a tough sell.

Other way around, 1 FPU per 2 integer clusters. Itanium and Larrabee are much better comparisons

TheFluff
Dec 13, 2006

FRIENDS, LISTEN TO ME
I AM A SEAGULL
OF WEALTH AND TASTE

Klyith posted:

It wasn't like they were totally on an island by themselves with the Cell, remember that bulldozer had the weird off-balance 1 ALU / 2 FPU design as well. Even though bulldozer FPUs were not the restricted and difficult Cell SPEs, there was still the idea that the multimedia future was going to call for lots of FLOPS.

Turned out not to be the case, magic compilers that extract parallelism still don't exist and getting programmers to write code for your gimmicks is a tough sell.

Sure, it's just funny to me how the article still has this very optimistic tone about the future of the Cell despite the fact that it was pretty much discontinued ten years ago.

TorakFade
Oct 3, 2006

I strongly disapprove


The_Franz posted:

I wasn't being hyperbolic when i used the :10bux: smilie for the settlement amount. It was exactly $10.07 USD.

Still better than the big fat $0.00 I got, and that thing also YLOD'd two months after the warranty expired (2 years). Luckily by then I could get a slim for relatively cheap, but still... worst console purchase I ever made.

Speaking of which, Ryzen 3600X 8-core @ 4.5GHz when? I kind of want to upgrade, now that we know next-gen consoles will have at least 8 cores, I'd love to get 8 myself in order to "future-proof", and I "only" have a 6-core 2600X :v:

SwissArmyDruid
Feb 14, 2014

by sebmojo
Computex at the earliest.

spasticColon
Sep 22, 2004

In loving memory of Donald Pleasance
If we don't get any new info on Zen 2 (specs/price/release date) at Computex I will start to worry.

ConanTheLibrarian
Aug 13, 2004


dis buch is late
Fallen Rib
Would it be safe to say that DDR5 won't be supported until Zen 2's successor?

Khorne
May 1, 2002

ConanTheLibrarian posted:

Would it be safe to say that DDR5 won't be supported until Zen 2's successor?
It might even be zen4. Zen3 likely rolls out next year with zen2-tier architecture improvements and 7nm+ EUV. No one knows if it will be DDR5 or not yet. If it isn't, there's a decent chance it will be AM4.

Indiana_Krom
Jun 18, 2007
Net Slacker
Wouldn't DDR5 require a different socket? Or at a bare minimum a different motherboard, because DDR5 is probably going to be keyed differently even if it has the same number of contacts?

Stickman
Feb 1, 2004

Indiana_Krom posted:

Wouldn't DDR5 require a different socket? Or at a bare minimum a different motherboard, because DDR5 is probably going to be keyed differently even if it has the same number of contacts?

It'll require a different motherboard because the RAM itself will likely have a different socket and pinout. It'll also require a different memory controller on the CPU. It won't necessarily require a different CPU socket, though, and if AMD includes both a DDR4 and a DDR5 memory controller on Zen 3 chips, they could still be backward-compatible with previous AM4 motherboards. There's some precedent - I believe Skylake included DDR3 and DDR4 controllers?

ItBurns
Jul 24, 2007

spasticColon posted:

If we don't get any new info on Zen 2 (specs/price/release date) at Computex I will start to worry.

Same, my goddamn computer is almost 7 years old and I'm eyeballing a 9900K real hard. It will be disappointing if Zen 2 is only 8C/16T though, and I fear it won't be competitive at the high end either.

spasticColon
Sep 22, 2004

In loving memory of Donald Pleasance

ItBurns posted:

Same, my goddamn computer is almost 7 years old and I'm eyeballing a 9900K real hard. It will be disappointing if Zen 2 is only 8C/16T though, and I fear it won't be competitive at the high end either.

My system is almost 8 years old so if Zen 2 is a delayed wet fart I'll just build a system around a 2700X because I want 8C/16T for parity with the next-gen consoles.

SwissArmyDruid
Feb 14, 2014

by sebmojo

Stickman posted:

It'll require a different motherboard because the RAM itself will likely have a different socket and pinout. It'll also require a different memory controller on the CPU. It won't necessarily require a different CPU socket, though, and if AMD includes both a DDR4 and a DDR5 memory controller on Zen 3 chips, they could still be backward-compatible with previous AM4 motherboards. There's some precedent - I believe Skylake included DDR3 and DDR4 controllers?

Indeed. But it wouldn't even be the first time that a motherboard supported multiple flavors of DDR. From Intel, no less:

https://www.gigabyte.com/Motherboard/GA-P35C-DS3R-rev-21#ov
https://www.asus.com/us/Motherboards/P5G41CM_LX/

And then of course, you know that there are all manner of monstrous creations in ASRock's history, so of COURSE they have motherboards that supported multiple types of RAM.

But the key here is that they will only support one or the other at any given time, and you will not be able to mix types.

PC LOAD LETTER
May 23, 2005
WTF?!

Khorne posted:

No one knows if it will be DDR5 or not yet. If it isn't, there's a decent chance it will be AM4.
It'll be DDR4.

AM4 flat out can't support DDR5; AMD has already said as much.

AMD also tends to lag the industry a bit when supporting new memory standards. I doubt they're gonna rush to ditch AM4 with its cheap and "good enough" DDR4 for extremely expensive DDR5.

The DRAM OEMs are saying DDR5 isn't really expected to be a thing for consumers until 2021 at the earliest, and more realistically won't get mainstream acceptance/volume until well into (Q2 or Q3) 2022. If you look around you can find plenty of 1-2yr-old articles saying DDR5 is coming in early to late 2019, and that the chips are done and demo'd and all that, but a completed design and a demo is very, very different from shipping a high volume of finished parts.

Part of the problem is the expected high launch price of DDR5 (launch will happen well before 2021, but about the only people who will be able to pay those prices will be server guys), another is the expected technical challenges, and on top of that, continued slumping PC sales in general will slow adoption rates.

Looks like the DRAM OEMs are gonna end up pushing DDR4 3200 into the mainstream to try and deal with the expected slow rollout of DDR5 for now.

CommieGIR
Aug 22, 2006

The blue glow is a feature, not a bug


Pillbug
GPD is making a Ryzen-based mini gaming laptop

https://liliputing.com/2019/04/gpd-win-max-will-be-an-amd-ryzen-powered-handheld-gaming-pc.html

Worf
Sep 12, 2017

If only Seth would love me like I love him!

RAM is finally affordable... Long live ddr4

Mr.Radar
Nov 5, 2005

You guys aren't going to believe this, but that guy is our games teacher.

I would say :rip: Smach Z but :lol: like that was ever going to actually ship, so instead I'll say :rip: Smach Z backers.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Khorne posted:

This hasn't been an issue since the IHS became a thing. Modern motherboards are also increasingly thick.

there were people cracking Skylake's PCB after it released. It's a thinner PCB than previous generations, and those people were probably also applying incorrect amounts of pressure and letting the cooler torque on the ILM, but it has actually been a thing recently.

Paul MaudDib fucked around with this message at 09:10 on Apr 23, 2019

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

BangersInMyKnickers posted:

Yeah, if you're okay with your servers being essentially disposable commodities or doing a lot of work to build your own automation tools then Supermicro can be the right choice but Dell and HP do a fair bit of work to make sure things are validated and you have the right management/recovery tools. I wouldn't be touching them unless I was running some kind of large-scale standardized infrastructure.

Even on my NAS server, I had to go into the BIOS to enable a custom "BMC DMA fixup" setting to get FreeBSD to boot. Took me a while to figure out that it might be a DMA issue, then to hunt down the option that fixed it. And that's on Intel.

That said, since no one has commented on the AM4 IPMI board that SuperMicro just released... it's basic, but it's a start. No 10GbE, no quad NICs, but it's a place to start.

Lambert posted:

Optane as it relates to caching is nothing special (and pretty much obsolete), AMD has had a similar solution forever.

But I don't believe it's going to be a cache drive because that increases manufacturing complexity (SSD & HDD) and doesn't solve the fundamental problem of extremely long load times.

Yup, Optane is just NVMe unless you do DIMM-level support.

It could potentially be a thing to provide a large, fast cache for stuff like open world games. You would have to specifically code your engine around the idea of a multi-level cache. Sony does exclusives and they could pull that off.

Not highly likely but there is a concept there.
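
To make the "code your engine around a multi-level cache" idea concrete, here's a toy C sketch of the lookup logic: a tiny RAM tier in front of a larger fast tier (the Optane/NVMe role), with misses falling through to bulk storage and then promoted upward. Everything here is simulated in memory and all the names are made up for illustration; it shows the hit/promote/fall-through shape, not any real engine or Sony API.

code:

#include <stdio.h>

#define RAM_SLOTS  8      /* tiny tier 0: system RAM cache */
#define FAST_SLOTS 64     /* bigger tier 1: the Optane/fast-NVMe role */

static int ram_tier[RAM_SLOTS];   /* asset id held per slot, -1 = empty */
static int fast_tier[FAST_SLOTS];

static void cache_init(void) {
    for (int i = 0; i < RAM_SLOTS; i++)  ram_tier[i] = -1;
    for (int i = 0; i < FAST_SLOTS; i++) fast_tier[i] = -1;
}

/* Returns which tier served the request: 0 (RAM), 1 (fast), 2 (HDD).
 * Direct-mapped placement keeps the sketch short; a real engine would
 * use something smarter (LRU, streaming hints from the open world). */
static int load_asset(int id) {
    int r = id % RAM_SLOTS, f = id % FAST_SLOTS;

    if (ram_tier[r] == id) return 0;   /* tier 0 hit */

    if (fast_tier[f] == id) {          /* tier 1 hit: promote to RAM */
        ram_tier[r] = id;
        return 1;
    }
    /* Miss: "read" from the slow HDD, then fill both cache tiers so
     * the next request for this asset is fast. */
    fast_tier[f] = id;
    ram_tier[r]  = id;
    return 2;
}

int main(void) {
    cache_init();
    int served[3] = {0, 0, 0};
    /* Fake pattern: stream world chunks while one hot asset (the player
     * model, say) is touched every frame. */
    for (int frame = 0; frame < 200; frame++) {
        served[load_asset(frame % 100)]++;  /* streaming world data   */
        served[load_asset(7)]++;            /* hot asset, every frame */
    }
    printf("RAM hits: %d, fast-tier hits: %d, HDD reads: %d\n",
           served[0], served[1], served[2]);
    return 0;
}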

Mister Facetious posted:

What were the 6/7 stream processors of the Cell even supposed to be good for?

They have very high bandwidth and computational intensity if the data processing you need fits a pipeline-like arrangement.

There is a reason the PS3 takes a lot of horsepower to emulate. It actually is fast compared to a general-purpose uarch of its time.

eames
May 9, 2009

https://www.youtube.com/watch?v=qU0V5OcHE_4

DF offers some speculation on next-gen console APUs.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
Does anyone know what the fuck Gonzalo is? Another semicustom part? Or is that the -G parts that Adored was talking about? Seems too early for the PS5/XB2 but also too powerful for a regular APU.

(20 CUs on 2ch DDR4 doesn't make sense to me but neither does Gonzalo, either in timing or configuration.)

edit: just saw the above

Alpha Mayo
Jan 15, 2007
hi how are you?
there was this racist piece of shit in your av so I fixed it
you're welcome
pay it forward~
I've noticed a whole bunch of Zen+ laptop chips just came out, I wonder how they are vs Intel. Like Ryzen 3750H vs 9300H (also new from Intel, but it's just an 8300H +100MHz)

Haven't seen any reviews/benchmarks yet.

pixaal
Jan 8, 2004

All ice cream is now for all beings, no matter how many legs.


Are they still single-stick, slow-speed RAM?

SlayVus
Jul 10, 2009
Grimey Drawer
If anyone has a Threadripper with an Enermax Liqtech TR4 AIO, your AIO is probably dying as we speak. I was having thermal issues with my 280mm version, and Steve just did a video on this as well.

https://www.youtube.com/watch?v=HC1kzO_gIp4

ehnus
Apr 16, 2003

Now you're thinking with portals!

Mister Facetious posted:

What were the 6/7 stream processors of the Cell even supposed to be good for?

Games had access to six, but they could really only count on the capacity of five and a half. They were good for quite a few things.

The local memory wasn't that large at a glance but it was blazing fast. It was like each SPU had 256KB of L1 cache.

There were a number of workloads that suited them natively, things where a lot of math was needed (animation, audio, graphics setup). They were used to make up for deficiencies in other parts of the system (patching values in shader programs because the RSX was from between GPU generations and didn't have things the later ones did, like constant registers). They could also be used to run general purpose code to offload the main processor -- decompressing assets at load time, etc. -- though they weren't as efficient at it.
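
The "lots of math" workloads looked like this kind of thing. Here's a simplified lerp-based pose blend in plain C for clarity (a toy example, not real PS3 code: proper blending would slerp and normalize, and an SPU version would use the 128-bit SIMD intrinsics and stream joints through the local store):

code:

/* Toy example of the branch-free, streaming math the SPUs liked:
 * blending two animation poses with a plain lerp. */
typedef struct { float x, y, z, w; } quat;

void blend_poses(const quat *a, const quat *b, quat *out,
                 int joints, float t)
{
    /* No branches, pure multiply-add math, data streamed straight
     * through: exactly the shape that kept an SPU's pipeline full. */
    for (int i = 0; i < joints; i++) {
        out[i].x = a[i].x + (b[i].x - a[i].x) * t;
        out[i].y = a[i].y + (b[i].y - a[i].y) * t;
        out[i].z = a[i].z + (b[i].z - a[i].z) * t;
        out[i].w = a[i].w + (b[i].w - a[i].w) * t;
    }
}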

PC LOAD LETTER
May 23, 2005
WTF?!

ehnus posted:

The local memory wasn't that large at a glance but it was blazing fast. It was like each SPU had 256KB of L1 cache.
It was really an LSU (Local Store Unit, aka scratchpad memory) and not like the L1 cache in a modern x86 chip (or one of that time period, either).

That's not just a semantics issue.

From what I recall of the comments from the guys over at B3D who actually had to make games on the thing, that meant pretty much all the memory management on the LSUs had to be done by the programmer by "hand". Compilers were supposed to help (since the SPUs supported branch hints) and did, but never well enough to make up for the deficiencies of the LSU or the lack of a proper branch predictor in the SPUs themselves. That was apparently quite difficult to do and was a major limiting factor in getting performance out of the SPUs, particularly for the first few years the PS3 was out. Also, the LSU's latency wasn't all that great (neither was the EIB's if you couldn't sustain a high level of concurrency with your workload, which made things worse), so any sort of cache miss or branch mispredict was massively penalizing to performance (made even worse by the deeply pipelined, in-order nature of the SPUs).

Performance on general code for the SPUs wasn't just inefficient, it was abysmal, and as a result pretty much anything that required general performance (read: has branches and/or isn't highly parallel in nature) was run on the PPE and not the SPUs by developers for the entire life of the PS3.
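
For anyone who never touched it, the "by hand" part looked roughly like this: explicit double-buffered DMA between main memory and the local store, so the SPU could crunch one chunk while the MFC streamed the next one in. This is a from-memory sketch against the Cell SDK's spu_mfcio.h interface, so treat the exact signatures as approximate rather than gospel:

code:

#include <spu_mfcio.h>

#define CHUNK 16384  /* bytes per DMA transfer */

/* Two LSU buffers: crunch one while the other is being filled. */
static volatile char buf[2][CHUNK] __attribute__((aligned(128)));

static void process(volatile char *data, int n)
{
    (void)data; (void)n; /* the actual math kernel would go here */
}

void stream_job(unsigned long long ea, unsigned long long bytes)
{
    int cur = 0;

    /* Kick off the first transfer before doing any work (tag = buffer index). */
    mfc_get(buf[cur], ea, CHUNK, cur, 0, 0);

    for (unsigned long long off = 0; off < bytes; off += CHUNK) {
        int nxt = cur ^ 1;

        /* Start fetching the NEXT chunk while this one is still in flight. */
        if (off + CHUNK < bytes)
            mfc_get(buf[nxt], ea + off + CHUNK, CHUNK, nxt, 0, 0);

        /* Block until the CURRENT chunk's DMA tag completes, then crunch. */
        mfc_write_tag_mask(1 << cur);
        mfc_read_tag_status_all();
        process(buf[cur], CHUNK);

        cur = nxt;
    }
}

Get any of that slightly wrong (misaligned sizes, forgetting the tag wait, fetching too little to hide the latency) and the SPU just sits there stalled, which is a big part of why so much shipped code left them underused.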

ehnus posted:

There were a number of workloads that suited them natively, things where a lot of math was needed (animation, audio, graphics setup).
So long as the workload was highly parallel in nature, most of the relevant data stayed in the LSU or could be easily streamed in and out of it, and the code had little or no branching, the SPUs could indeed be made to perform very well. The problem was that lots of workloads didn't fit well (or at all) within those restrictions, so while the SPUs hypothetically offered heaps of performance, on all but a few real-world workloads that potential was never realized.

While it was cool that it could do stuff like graphics or animation pretty quickly, ultimately those tasks could've been more efficiently handled by the GPU or perhaps a task-dedicated processor like a DSP. It was Cell's inability to perform as well as initially advertised on generalized workloads, short of heroic feats of programming effort, that caused it to be seen as largely a failure as a CPU.

Old but pretty cool, short and sweet commentary from a developer who worked on all kinds of consoles that seems relevant here:

quote:

PS3: A 95 pound box shows up on your desk with a printout of the 24-step instructions for how to turn it on for the first time. Everyone tries, most people fail to turn it on. Eventually, one guy goes around and sets up everyone else’s machine. There’s only one CPU. It seems like it might be able to do everything, but it can’t. The SPUs seem like they should be really awesome, but not for anything you or anyone else is doing. The CPU debugger works pretty OK. There is no SPU debugger. There was nothing like PIX at first. Eventually some Sony 1st-party devs got fed up and made their own PIX-like GPU debugger. The GPU is very, very disappointing… Most people try to stick to working with the CPU, but it can’t handle the workload. A few people dig deep into the SPUs and, Dear God, they are fast! Unfortunately, they eventually figure out that the SPUs need to be devoted almost full time making up for the weaknesses of the GPU.

edit: another cool quote, from a guy who seems to have worked at Naughty Dog and commented on the above developer's thoughts:

quote:

agavin:

If you could figure out how to take some expensive part of your code and move it to an SPU (and we spent a LOT of time doing that at Naughty Dog) it basically became free. Once moved, you could do pretty much as much of it as you liked. The SPUs were so much faster at what they did than anything else it was crazy. Too bad they were SO hard to program. Pretty much only by hand assembly designs worked, and that was almost the easy part compared to the architecting of how you would structure your data and squeeze it into memory.

Various additional tidbits at: http://all-things-andy-gavin.com/video-games/

PC LOAD LETTER fucked around with this message at 06:42 on Apr 26, 2019

repiv
Aug 13, 2009

i had to dig up this old post because lmao

Suspicious Dish posted:

The Sony answer: "everything". You were supposed to do all your work on these SPUs, and they tried to ship a lot of middleware. First problem: no common task scheduler, so middleware had no good way of being run on an SPU. Sony tried to ship a task scheduler, but then realized that no game would use it because engines already have their own task schedulers and asking game developers to rip out their own stuff (one of the most core parts of the engine) in favor of required middleware is probably the easiest way to get nobody to use your thing.

So they eventually just came up with a semi-standard API and games used that. Keep in mind there wasn't a compiler for the SPUs early on, Sony shipped an "assembler" in the form of a very complex Excel spreadsheet that used macros and VBA to tell you about pipeline stalls. They eventually caved and added a gcc port, but a sucky one and only chums used it.

By the end, most people used the SPU for basically one task per frame. DICE used it for their shading engine: http://www.dice.se/wp-content/uploads/2014/12/Christina_Coffin_Programming_SPU_Based_Deferred.pdf . Naughty Dog used it for animations.

it's a miracle that anything got shipped on the ps3

Cygni
Nov 12, 2005

raring to post

Biostar confirmed in a roundabout way that the X570 boards are going to launch at Computex. Weirdly, AMD is also launching a "50th Anniversary Edition" 2700X next month (no clock speed bumps or anything, it's just a 2700X).

Seems like mixed signals about Zen2. If they had Zen2 availability planned the same month, I imagine one of those would be branded the "Anniversary Edition". Unless there is more than one? Computer part naming is stupid.

Craptacular!
Jul 9, 2001

Fuck the DH

repiv posted:

i had to dig up this old post because lmao


it's a miracle that anything got shipped on the ps3

Well now we know why there's like three engines in the world.

Pity the Square and Konami people who didn't just adopt Unreal or whatever and had to deal with that mess.

SlayVus
Jul 10, 2009
Grimey Drawer

SlayVus posted:

If anyone has a Threadripper with an Enermax Liqtech TR4 AIO, your AIO is probably dying as we speak. I was having thermal issues with my 280mm version, and Steve just did a video on this as well.

https://www.youtube.com/watch?v=HC1kzO_gIp4

I had an Enermax on my Threadripper 1950X and just took apart my cooler. Seriously, if you have one of these, your system is in danger. I switched to a Thermalright Silver Arrow TR4 with an additional two Noctua NF-A12s two weeks ago. I just opened my Enermax cooler up to find this.

This is all crusted into the microfins, it's not liquid.
https://i.imgur.com/ZGvhiAU.jpg

The white glob is hard to the touch.
https://i.imgur.com/tV237Wx.jpg

Basically, the bio-growth inhibited, I would say, more than 60 or 70% of the cooler's cooling capacity. The white glob is on the outlet side of the microfins, so the majority of the water was never getting through.

SlayVus fucked around with this message at 02:10 on Apr 27, 2019


The Illusive Man
Mar 27, 2008

~savior of yoomanity~

SlayVus posted:

I had an Enermax on my Threadripper 1950X and just took apart my cooler. Seriously, if you have one of these, your system is in danger. I switched to a Thermalright Silver Arrow TR4 with an additional two Noctua NF-A12s two weeks ago. I just opened my Enermax cooler up to find this.

This is all crusted into the microfins, it's not liquid.
https://i.imgur.com/ZGvhiAU.jpg

The white glob is hard to the touch.
https://i.imgur.com/tV237Wx.jpg

Basically, the bio-growth inhibited, I would say, more than 60 or 70% of the cooler's cooling capacity. The white glob is on the outlet side of the microfins, so the majority of the water was never getting through.

Stuff like this makes me never want to sway from air cooling. An NH-D15 may be huge, but I can go years without having to even think about it.
