Generic Monk
Oct 31, 2011

GRINDCORE MEGGIDO posted:

Saints Row always ran pretty well, another reason the series is clearly superior.

saints row 2 runs like poo poo on any computer; you can brute force gtaiv and it's pretty easy to get a solid 60fps with all settings at the highest on modern systems but saints row 2 will always run like absolute dogshit, or with weird speedup glitches

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

GRINDCORE MEGGIDO posted:

Saints Row always ran pretty well, another reason the series is clearly superior.

Doesn't Saints Row 2 run like poo poo, no matter what hardware you have?

e:f,b

JawnV6
Jul 4, 2004

So hot ...
Was it GTA3 that didn't bother using the OS hooks for timing and went with raw rdtsc, leading to jerky Yakety Sax stuttering on C-state/P-state transitions?
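
For reference, here's a minimal sketch of why that bites (not the game's actual code; the fixed 3 GHz "calibration" constant is my own assumption): raw TSC ticks only convert to real time if the counter ticks at a constant rate, which older CPUs didn't guarantee across C-state/P-state changes, while the OS clock keeps tracking wall time regardless.

code:
/* Minimal sketch, not from any of these games: timing a "frame" with the raw
 * TSC versus the OS monotonic clock. Assumes GCC/Clang on x86 Linux; on
 * Windows you'd compare against QueryPerformanceCounter instead. */
#include <stdint.h>
#include <stdio.h>
#include <time.h>
#include <unistd.h>
#include <x86intrin.h>           /* __rdtsc() */

#define ASSUMED_TSC_HZ 3.0e9     /* hypothetical "measured once at startup" rate */

static double os_seconds(struct timespec a, struct timespec b)
{
    return (b.tv_sec - a.tv_sec) + (b.tv_nsec - a.tv_nsec) / 1e9;
}

int main(void)
{
    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);
    uint64_t c0 = __rdtsc();

    usleep(16000);               /* stand-in for rendering one frame */

    uint64_t c1 = __rdtsc();
    clock_gettime(CLOCK_MONOTONIC, &t1);

    /* If the TSC slows or halts in a C/P-state (as on older CPUs), ticks
     * divided by the assumed rate no longer matches real time, so the game
     * speeds up or stutters; the OS clock doesn't have that problem. */
    printf("tsc says %.4f s, OS says %.4f s\n",
           (double)(c1 - c0) / ASSUMED_TSC_HZ, os_seconds(t0, t1));
    return 0;
}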

MagusDraco
Nov 11, 2011

even speedwagon was trolled

HalloKitty posted:

Doesn't Saints Row 2 run like poo poo, no matter what hardware you have?

e:f,b

Its internal timer was tied to like the clock speed of the Xbox 360 or something. Other CPUs generally don't run at that clock, so everything gets busted.

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
SR2 always ran fine here.

craig588
Nov 19, 2005

by Nyc_Tattoo

JawnV6 posted:

Was it GTA3 that didn't bother using the OS hooks for timing and went with raw rdtsc, leading to jerky Yakety Sax stuttering on C-state/P-state transitions?

I'd believe this. There are 3rd party timing patches for 3, VC and SA that are almost required because they work so much better than the standard timing.

SR2 never ran fine for me. 3 and 4 no problem, but 2 is unplayable.

dont be mean to me
May 2, 2007

I'm interplanetary, bitch
Let's go to Mars


havenwaters posted:

Its internal timer was tied to like the clock speed of the Xbox 360 or something. Other CPUs generally don't run at that clock, so everything gets busted.

Yeah the code itself for Saints Row 2 PC was drunk, but the game engine internal clock thing is very specifically a Windows 7 error. XP and Vista didn't do that; 8 and 10 don't do that.

7 needs a hack of some sort to multiply the reported clock rate to fix it.

For everything else Gentlemen of the Row helps.

GRINDCORE MEGGIDO
Feb 28, 1985


dont be mean to me posted:

Yeah the code itself for Saints Row 2 PC was drunk, but the game engine internal clock thing is very specifically a Windows 7 error. XP and Vista didn't do that; 8 and 10 don't do that.

7 needs a hack of some sort to multiply the reported clock rate to fix it.

For everything else Gentlemen of the Row helps.

Ahh, that's probably why it ran fine for me on XP. Fair enough.

JnnyThndrs
May 29, 2001

HERE ARE THE FUCKING TOWELS

craig588 posted:

SR2 never ran fine for me. 3 and 4 no problem, but 2 is unplayable.

Same here. I gave up on even trying to play the drat thing.

ConanTheLibrarian
Aug 13, 2004


dis buch is late
Fallen Rib
Has there been any news about when PCIe 4 will be supported by Intel/MB manufacturers yet? It was standardised quite a while ago at this stage.

Generic Monk
Oct 31, 2011

ConanTheLibrarian posted:

Has there been any news about when PCIe 4 will be supported by Intel/MB manufacturers yet? It was standardised quite a while ago at this stage.

take the date it was finalised, and add like a year and a half?

Cygni
Nov 12, 2005

raring to post

ConanTheLibrarian posted:

Has there been any news about when PCIe 4 will be supported by Intel/MB manufacturers yet? It was standardised quite a while ago at this stage.

For consumer market, I think I read 2020. Same time as DDR5.

ConanTheLibrarian
Aug 13, 2004


dis buch is late
Fallen Rib

Cygni posted:

For consumer market, I think I read 2020. Same time as DDR5.

That would be at least 2 1/2 years after standardisation. Seems excessive compared to the time to market of PCIe 3.

in a well actually
Jan 26, 2011

dude, you gotta end it on the rhyme

ConanTheLibrarian posted:

That would be at least 2 1/2 years after standardisation. Seems excessive compared to the time to market of PCIe 3.

Intel had a lot of trouble getting Gen3 stable, and Gen4 is going to be harder; that and all of those consumer devices that need > PCIe 3.0 x16 bandwidth, like


Methylethylaldehyde
Oct 23, 2004

BAKA BAKA

ConanTheLibrarian posted:

That would be at least 2 1/2 years after standardisation. Seems excessive compared to the time to market of PCIe 3.

We also aren't running into any super compelling needs to have PCIe-4 over PCIe-3 compared to USB3 needing the PCIe-3 lanes over PCIe-2. About all it would give us is twice the bandwidth to the southbridge for more NVMe stuff, but a lot of boards just hang those slots directly off the regular PCIe lanes these days anyways.

priznat
Jul 7, 2009

Let's get drunk and kiss each other all night.
Interestingly, PCIe Gen5 will be coming out much quicker after Gen4 than Gen4 did after Gen3. I believe the target for the spec v1.0 is 2019, which is really quick after Gen4 hit 1.0 in 2017.

Some systems have Gen4 now, like POWER9 and some ARM parts (the Mellanox SoC), but Intel is lagging behind. It will be really interesting to see if AMD can get Gen4 EPYCs out at the same time or even ahead of Intel's Gen4 (Ice Lake).

EoRaptor
Sep 13, 2003

by Fluffdaddy

Methylethylaldehyde posted:

We also aren't running into any super compelling needs to have PCIe-4 over PCIe-3 compared to USB3 needing the PCIe-3 lanes over PCIe-2. About all it would give us is twice the bandwidth to the southbridge for more NVMe stuff, but a lot of boards just hang those slots directly off the regular PCIe lanes these days anyways.

This would be the one semi-urgent use for PCIe 4 in the consumer space: using it to hang a PCIe switch off the CPU, to keep pin counts down and trace lengths short, and using the switch to split it up as PCIe 3. Still won't happen anytime soon.

Potato Salad
Oct 23, 2014

nobody cares


EoRaptor posted:

This would be the one semi-urgent use for PCIe 4 in the consumer space: using it to hang a PCIe switch off the CPU, to keep pin counts down and trace lengths short, and using the switch to split it up as PCIe 3. Still won't happen anytime soon.

:bitcoin: ?

BangersInMyKnickers
Nov 3, 2004

I have a thing for courageous dongles

lol bitcoin doesn't need poo poo for pcie bandwidth.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

EoRaptor posted:

This would be the one semi-urgent use for PCIe 4 in the consumer space: using it to hang a PCIe switch off the CPU, to keep pin counts down and trace lengths short, and using the switch to split it up as PCIe 3. Still won't happen anytime soon.

Which would be super cool for mega-SLI implementations--like 4x or 8x cards. Too bad NVidia killed off anything over 2x SLI (which doesn't even work all that well these days to begin with), and AMD is a hot mess. So basically there's no real point for it on the GPU side whatsoever.

Could still be cool for splitting out a whole mess of NVM drives, though, since trying to dig out 4x PCIe 3.0 lanes for more than one or two of those buggers is not as easy as I'd like on consumer-level boards. But past that I struggle to see the urgent need for it.

Methylethylaldehyde
Oct 23, 2004

BAKA BAKA

DrDork posted:

Which would be super cool for mega-SLI implementations--like 4x or 8x cards. Too bad NVidia killed off anything over 2x SLI (which doesn't even work all that well these days to begin with), and AMD is a hot mess. So basically there's no real point for it on the GPU side whatsoever.

Could still be cool for splitting out a whole mess of NVM drives, though, since trying to dig out 4x PCIe 3.0 lanes for more than one or two of those buggers is not as easy as I'd like on consumer-level boards. But past that I struggle to see the urgent need for it.

They make PCIe riser cards with a PLX chip in them specifically for that use case, but they're kinda retarded expensive.

redeyes
Sep 14, 2002

by Fluffdaddy

PCjr sidecar posted:

Intel had a lot of trouble getting Gen3 stable, and Gen4 is going to be harder; that and all of those consumer devices that need > PCIe 3.0 x16 bandwidth, like



I'd be happy with like four FULL bandwidth x16 slots. PCIe lane limits have been a real downer because of this.

BangersInMyKnickers
Nov 3, 2004

I have a thing for courageous dongles

You are describing a server. Go buy one.

Malcolm XML
Aug 8, 2009

I always knew it would end like this.
PCIe 4 and 5 are for networking and storage appliances, where e.g. 400GbE can't be run at full bandwidth over 3.0 x16.

Graphics cards barely push 2.0 x16 let alone 3.0 x16

However, for compute-oriented stuff, it would be a vendor-agnostic version of NVLink.
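
Rough numbers bear that out; a quick back-of-the-envelope check (my own arithmetic, counting only the 128b/130b line encoding, not TLP/protocol overhead):

code:
/* Back-of-the-envelope: usable x16 link bandwidth per PCIe generation
 * versus the ~400 Gb/s per direction that a 400GbE port wants. */
#include <stdio.h>

int main(void)
{
    const double raw_gts[] = { 8.0, 16.0, 32.0 };  /* PCIe 3.0 / 4.0 / 5.0, per lane */
    const char  *gen[]     = { "3.0", "4.0", "5.0" };
    const double needed    = 400.0;                /* 400GbE line rate in Gb/s */

    for (int i = 0; i < 3; i++) {
        double x16 = raw_gts[i] * (128.0 / 130.0) * 16.0;  /* Gb/s after encoding */
        printf("PCIe %s x16: ~%.0f Gb/s -> %s for 400GbE\n",
               gen[i], x16, x16 >= needed ? "enough" : "not enough");
    }
    return 0;
}
A 3.0 x16 slot tops out around 126 Gb/s and 4.0 x16 around 252 Gb/s; only a Gen5 x16 link clears 400 Gb/s each way.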

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Malcolm XML posted:

Graphics cards barely push 2.0 x16 let alone 3.0 x16

Not in averages, but minimum frametimes are affected pretty noticeably at least by 3.0x4/2.0x8 speeds.

https://www.pcper.com/reviews/Graphics-Cards/External-Graphics-over-Thunderbolt-3-using-AKiTiO-Node/Performance-Testing

Pryor on Fire
May 14, 2013

they don't know all alien abduction experiences can be explained by people thinking saving private ryan was a documentary

Apple just announced they are moving away from Intel to their own chips for new macs in 2020.

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
^^^ Is that some leftover from April Fools?

Paul MaudDib posted:

Not in averages, but minimum frametimes are affected pretty noticeably at least by 3.0x4/2.0x8 speeds.
Yeah, latency and poo poo.

repiv
Aug 13, 2009

Pryor on Fire posted:

Apple just announced they are moving away from Intel to their own chips for new macs in 2020.

...do you have a link for that?

Pryor on Fire
May 14, 2013

they don't know all alien abduction experiences can be explained by people thinking saving private ryan was a documentary

lol it's 2018 who needs links just search twitter

Malcolm XML
Aug 8, 2009

I always knew it would end like this.

Paul MaudDib posted:

Not in averages, but minimum frametimes are affected pretty noticeably at least by 3.0x4/2.0x8 speeds.

https://www.pcper.com/reviews/Graphics-Cards/External-Graphics-over-Thunderbolt-3-using-AKiTiO-Node/Performance-Testing

Thunderbolt is its own can of worms, since the lanes are often attached to the PCH instead of to the CPU, and/or the Thunderbolt chipset sucks and causes extra latency.

repiv
Aug 13, 2009

Here it is: https://www.bloomberg.com/news/articles/2018-04-02/apple-plans-to-move-from-intel-to-own-mac-chips-from-2020

"according to people familiar with the plans" is not the same as "apple just announced"

Pryor on Fire
May 14, 2013

they don't know all alien abduction experiences can be explained by people thinking saving private ryan was a documentary



ow

Rastor
Jun 2, 2001

Isn't it possible the Bloomberg sources are misunderstanding and this is about co-processors, not processors?

Pryor on Fire
May 14, 2013

they don't know all alien abduction experiences can be explained by people thinking saving private ryan was a documentary

repiv posted:

"according to people familiar with the plans" is not the same as "apple just announced"

Sure it is, mine is just phrasing from the future.

Cygni
Nov 12, 2005

raring to post

Apple lets that rumor out every time their deal with Intel is about to expire. Remember when they even leaked an AMD-powered desktop and then said they were just 'evaluating'?

They are certainly capable, but as long as Intel keeps selling their chips at a loss, there hasn't been much motivation to actually go through with it, especially considering how small a slice the Mac is of their bottom line. Maybe Intel's failures at 10nm will be the catalyst, though.

LRADIKAL
Jun 10, 2001

Fun Shoe
Ryzen makes enough sense on the Mac, since the main issue we have around here is single-thread performance. An Apple-designed ARM chip would require a lot of new code or a lot of fancy, relatively slow virtualization.

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.
I think there's a space for an A11X (or A12, or whatever they name it) MacBook to slot into their lineup under the current MacBook and MacBook Pro that would basically be an iPad Pro with a great keyboard and trackpad.

Why are the transistor counts on current iPhones way higher than on Intel desktop CPUs anyway? The A11 has 4.3 billion transistors.

Twerk from Home fucked around with this message at 19:11 on Apr 2, 2018

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
Any CPU they release will upset the pro crowd again. There's a huge difference between some mobile ARM CPU and a powerhouse like an Intel Xeon or -X. Maybe it'll drive the pros to Wintel this time (lol no).

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!

Twerk from Home posted:

I think there's a space for an A11X (or A12, or whatever they name it) MacBook to slot into their lineup under the current MacBook and MacBook Pro that would basically be an iPad Pro with a great keyboard and trackpad.

That would actually be an appealing product to me but they'd probably put iOS on it which would ruin it for me.

Pryor on Fire
May 14, 2013

they don't know all alien abduction experiences can be explained by people thinking saving private ryan was a documentary

I mean Apple has been hiring chip designers like crazy for what four years now? It's not like this was unexpected. I'm sure there will be some x86 solution or virtualization engine or whatever that is good enough.

Plus, if anyone has the money to just buy some fabs or whatever else needs to be done, Apple can do it. They almost bought McLaren on a whim.
