|
GRINDCORE MEGGIDO posted:Saints row always ran pretty well, another reason the series is clearly superior. saints row 2 runs like poo poo on any computer; you can brute force gtaiv and it's pretty easy to get a solid 60fps with all settings at the highest on modern systems but saints row 2 will always run like absolute dogshit, or with weird speedup glitches
|
# ? Mar 29, 2018 18:58 |
|
GRINDCORE MEGGIDO posted:Saints row always ran pretty well, another reason the series is clearly superior. Doesn't Saints Row 2 run like poo poo, no matter what hardware you have? e:f,b
|
# ? Mar 29, 2018 19:15 |
|
Was it GTA3 that didn't bother using the OS hooks for timing and went with raw rdtsc, leading to jerky yakkity-sax stuttering on Cstate/Pstate transitions?
|
# ? Mar 29, 2018 19:44 |
|
HalloKitty posted:Doesn't Saints Row 2 run like poo poo, no matter what hardware you have? Its internal timer was set to something like the clock speed of the Xbox 360. Other CPUs don't run at that clock, so everything gets busted.
|
# ? Mar 29, 2018 19:55 |
|
SR2 always ran fine here.
|
# ? Mar 29, 2018 21:41 |
|
JawnV6 posted:Was it GTA3 that didn't bother using the OS hooks for timing and went with raw rdtsc, leading to jerky yakkity-sax stuttering on Cstate/Pstate transitions? I'd believe this. There are 3rd party timing patches for 3, VC and SA that are almost required because they work so much better than the standard timing. SR2 never ran fine for me. 3 and 4 no problem, but 2 is unplayable.
|
# ? Mar 29, 2018 21:55 |
|
havenwaters posted:Its internal timer was set to something like the clock speed of the Xbox 360. Other CPUs don't run at that clock, so everything gets busted. Yeah the code itself for Saints Row 2 PC was drunk, but the game engine internal clock thing is very specifically a Windows 7 error. XP and Vista didn't do that; 8 and 10 don't do that. 7 needs a hack of some sort to multiply the reported clock rate to fix it. For everything else, Gentlemen of the Row helps.
|
# ? Mar 30, 2018 00:57 |
|
dont be mean to me posted:Yeah the code itself for Saints Row 2 PC was drunk, but the game engine internal clock thing is very specifically a Windows 7 error. XP and Vista didn't do that; 8 and 10 don't do that. Ahh, that's probably why it ran fine for me on XP. Fair enough.
|
# ? Mar 30, 2018 02:53 |
|
craig588 posted:SR2 never ran fine for me. 3 and 4 no problem, but 2 is unplayable. Same here. I gave up on even trying to play the drat thing.
|
# ? Mar 30, 2018 03:24 |
|
Has there been any news about when PCIe 4 will be supported by Intel/MB manufacturers yet? It was standardised quite a while ago at this stage.
|
# ? Mar 31, 2018 12:24 |
|
ConanTheLibrarian posted:Has there been any news about when PCIe 4 will be supported by Intel/MB manufacturers yet? It was standardised quite a while ago at this stage. take the date it was finalised, and add like a year and a half?
|
# ? Mar 31, 2018 12:34 |
|
ConanTheLibrarian posted:Has there been any news about when PCIe 4 will be supported by Intel/MB manufacturers yet? It was standardised quite a while ago at this stage. For consumer market, I think I read 2020. Same time as DDR5.
|
# ? Mar 31, 2018 17:16 |
|
Cygni posted:For consumer market, I think I read 2020. Same time as DDR5. That would be at least 2 1/2 years after standardisation. Seems excessive compared to the time to market of PCIe 3.
|
# ? Mar 31, 2018 21:49 |
|
ConanTheLibrarian posted:That would be at least 2 1/2 years after standardisation. Seems excessive compared to the time to market of PCIe 3. Intel had a lot of trouble getting Gen3 stable, and Gen4 is going to be harder; that and all of those consumer devices that need > PCIe 3.0 x16 bandwidth, like
|
# ? Mar 31, 2018 21:54 |
|
ConanTheLibrarian posted:That would be at least 2 1/2 years after standardisation. Seems excessive compared to the time to market of PCIe 3. We also aren't running into any super compelling needs to have PCIe-4 over PCIe-3 compared to USB3 needing the PCIe-3 lanes over PCIe-2. About all it would give us is twice the bandwidth to the southbridge for more NVMe stuff, but a lot of boards just hang those slots directly off the regular PCIe lanes these days anyways.
|
# ? Mar 31, 2018 21:58 |
|
Interestingly, PCIe Gen5 will be coming out much quicker after Gen4 than Gen4 did after Gen3. I believe the target for the spec v1.0 is 2019, which is really quick after Gen4 hit 1.0 in 2017. Some systems have Gen4 now, like POWER9 and some ARMs (Mellanox SoC), but Intel is lagging behind. It will be really interesting to see if AMD can get Gen4 EPYCs out at the same time as or even ahead of Intel's Gen4 (Ice Lake).
|
# ? Mar 31, 2018 22:03 |
|
Methylethylaldehyde posted:We also aren't running into any super compelling needs to have PCIe-4 over PCIe-3 compared to USB3 needing the PCIe-3 lanes over PCIe-2. About all it would give us is twice the bandwidth to the southbridge for more NVMe stuff, but a lot of boards just hang those slots directly off the regular PCIe lanes these days anyways. This would be the one semi-urgent use for PCIe 4 in the consumer space, using it to hang a PCIe switch off the cpu, to keep pincounts down and tracelength short, and using the switch to split it up as PCIe 3. Still won't happen anytime soon.
|
# ? Apr 1, 2018 22:01 |
|
EoRaptor posted:This would be the one semi-urgent use for PCIe 4 in the consumer space, using it to hang a PCIe switch off the cpu, to keep pincounts down and tracelength short, and using the switch to split it up as PCIe 3. Still won't happen anytime soon. ?
|
# ? Apr 2, 2018 15:21 |
|
lol bitcoin doesn't need poo poo for pcie bandwidth.
|
# ? Apr 2, 2018 16:40 |
|
EoRaptor posted:This would be the one semi-urgent use for PCIe 4 in the consumer space, using it to hang a PCIe switch off the cpu, to keep pincounts down and tracelength short, and using the switch to split it up as PCIe 3. Still won't happen anytime soon. Which would be super cool for mega-SLI implementations--like 4x or 8x cards. Too bad NVidia killed off anything over 2x SLI (which doesn't even work all that well these days to begin with), and AMD is a hot mess. So basically there's no real point for it on the GPU side whatsoever. Could still be cool for splitting out a whole mess of NVM drives, though, since trying to dig out 4x PCIe 3.0 lanes for more than one or two of those buggers is not as easy as I'd like on consumer-level boards. But past that I struggle to see the urgent need for it.
|
# ? Apr 2, 2018 16:44 |
|
DrDork posted:Which would be super cool for mega-SLI implementations--like 4x or 8x cards. Too bad NVidia killed off anything over 2x SLI (which doesn't even work all that well these days to begin with), and AMD is a hot mess. So basically there's no real point for it on the GPU side whatsoever. They make PCIe riser cards with a PLX chip in them specifically for that use case, but they're absurdly expensive.
|
# ? Apr 2, 2018 17:02 |
|
PCjr sidecar posted:Intel had a lot of trouble getting Gen3 stable, and Gen4 is going to be harder; that and all of those consumer devices that need > PCIe 3.0 x16 bandwidth, like I'd be happy with like 4 16x FULL bandwidth slots. PCIe has been a real downer because of this.
|
# ? Apr 2, 2018 17:40 |
|
You are describing a server. Go buy one.
|
# ? Apr 2, 2018 17:41 |
|
PCIe 4 and 5 are for networking and storage appliances, where e.g. 400GbE can't be done at full bandwidth over 3.0 x16. Graphics cards barely push 2.0 x16, let alone 3.0 x16. However, for compute-oriented stuff, it would be a vendor-agnostic version of NVLink.
|
# ? Apr 2, 2018 17:49 |
|
Malcolm XML posted:Graphics cards barely push 2.0 x16 let alone 3.0 x16 Not in averages, but minimum frametimes are affected pretty noticeably at least by 3.0x4/2.0x8 speeds. https://www.pcper.com/reviews/Graphics-Cards/External-Graphics-over-Thunderbolt-3-using-AKiTiO-Node/Performance-Testing
|
# ? Apr 2, 2018 18:06 |
Apple just announced they are moving away from Intel to their own chips for new macs in 2020.
|
|
# ? Apr 2, 2018 18:46 |
|
^^^ Is that some leftover from April Fools? Paul MaudDib posted:Not in averages, but minimum frametimes are affected pretty noticeably at least by 3.0x4/2.0x8 speeds.
|
# ? Apr 2, 2018 18:47 |
|
Pryor on Fire posted:Apple just announced they are moving away from Intel to their own chips for new macs in 2020. ...do you have a link for that?
|
# ? Apr 2, 2018 18:48 |
lol it's 2018 who needs links just search twitter
|
|
# ? Apr 2, 2018 18:49 |
|
Paul MaudDib posted:Not in averages, but minimum frametimes are affected pretty noticeably at least by 3.0x4/2.0x8 speeds. Thunderbolt is its own can of worms, since the lanes are often attached to the PCH instead of to the CPU, and/or the Thunderbolt chipset sucks and causes extra latency.
|
# ? Apr 2, 2018 18:51 |
|
Here it is: https://www.bloomberg.com/news/articles/2018-04-02/apple-plans-to-move-from-intel-to-own-mac-chips-from-2020 "according to people familiar with the plans" is not the same as "apple just announced"
|
# ? Apr 2, 2018 18:51 |
ow
|
|
# ? Apr 2, 2018 18:53 |
|
Isn't it possible the Bloomberg sources are misunderstanding and this is about co-processors, not processors?
|
# ? Apr 2, 2018 18:54 |
repiv posted:"according to people familiar with the plans" is not the same as "apple just announced" Sure it is, mine is just phrasing from the future.
|
|
# ? Apr 2, 2018 18:54 |
|
Apple lets that rumor out every time their deal with Intel is about to expire. Remember when they even leaked an AMD-powered desktop and then said they were just 'evaluating'? They are certainly capable, but as long as Intel keeps selling their chips at a loss, there hasn't been much motivation to actually go through with it, especially considering how small a slice of their bottom line the Mac is. Maybe Intel's failures at 10nm will be the catalyst, though.
|
# ? Apr 2, 2018 18:59 |
|
Ryzen makes enough sense on the Mac, since the main issue we have around here is single-thread performance. An Apple-designed ARM chip would require a lot of new code or a lot of fancy, relatively slow virtualization.
|
# ? Apr 2, 2018 19:06 |
|
I think there's a space for an A11X (or A12, or whatever they name it) Macbook to slot in at their lineup under the current Macbook and Macbook Pro that would basically be an iPad Pro with a great keyboard and trackpad. Why are the transistor counts on current iPhones way higher than on Intel desktop CPUs anyway? The A11 has 4.3 billion transistors. Twerk from Home fucked around with this message at 19:11 on Apr 2, 2018 |
# ? Apr 2, 2018 19:06 |
|
Any CPU they release will upset the pro crowd again. There's a huge difference between some mobile ARM CPU and a powerhouse like an Intel Xeon or -X. Maybe it'll drive the pros to Wintel this time (lol no).
|
# ? Apr 2, 2018 19:08 |
|
Twerk from Home posted:I think there's a space for an A11X (or A12, or whatever they name it) Macbook to slot in at their lineup under the current Macbook and Macbook Pro that would basically be an iPad Pro with a great keyboard and trackpad. That would actually be an appealing product to me but they'd probably put iOS on it which would ruin it for me.
|
# ? Apr 2, 2018 19:10 |
|
I mean, Apple has been hiring chip designers like crazy for what, four years now? It's not like this was unexpected. I'm sure there will be some x86 solution or virtualization engine or whatever that is good enough. Plus, if anyone has the money to just buy some fabs or whatever needs to be done, Apple can do it. They almost bought McLaren on a whim.
|
|
# ? Apr 2, 2018 19:12 |