|
Josh Lyman posted:If Pascal has no architectural improvements and is just a die shrink, how come Jen Hsun said it cost $2 billion to develop? The architecture is the same but a lot of effort was put into path optimization and other things to get it to run at 2GHz+, if they just did a die shrink it wouldn't be able to reach those clocks.
|
# ? Sep 8, 2016 00:15 |
|
|
THE DOG HOUSE posted:Depends on the game. Some AAA titles had a special console mode below low settings. Some don't though and are probably "medium". Then in theory the PS4 Pro should be able to play 4K30 with upscaling tricks? Wonder if someone could benchmark a 480 @ 911MHz and see what its behavior is for 4K @ medium settings. eames posted:If that is true (and I think it is) then Sony was probably caught with their pants down when Microsoft announced their next Xbox. The PS4 Pro looks rather lacklustre in my eyes. It seems like they primarily built it for the PSVR and they threw in a questionable non-native solution for 4K as an afterthought to keep the marketing department happy. I suspect we'll see a PS4 Pro slim shortly after the new Xbox is out. I'm still trying to figure out the 310W part, since I know Hawaii/Grenada at 0.95V and 911MHz comparatively sips power, something close to 200W IIRC, and it'd stomp the 4.2TFLOP part into the ground as well.
|
# ? Sep 8, 2016 00:16 |
|
Maybe it's the PSU spec and they're aiming for good efficiency?
|
# ? Sep 8, 2016 00:23 |
Yeah... it says max power. Assuming they're aiming for the standard 50% load for efficiency reasons, that gives a normal TDP of ~155W, which is pretty much right on for Jaguar plus a good-enough GPU at a low clock. Is it a 14nm part, or 28nm? If it's Polaris for the GPU part, I highly doubt they would back-port it from 14nm to 28nm due to the huge process changes involved.
|
|
# ? Sep 8, 2016 03:04 |
|
Watt figures for consumer electronics are worst-case numbers for the FCC label, not TDP like what gets reported for CPU and GPU chips. The PS4 Pro isn't gonna pull 300 watts in operation. That number is the sum of each part's absolute max plus some safety factor. They're not using 28nm, it's on AMD / GloFo 14nm. That was pretty much confirmed months ago by an AMD investor disclosure. The Pro might well be pulling more watts than the original 28nm PS4, but not a ton more. (The new case with its 2 slots for venting might indicate a slightly bigger airflow requirement, but that's a guess on very little evidence. It might just be decor.) The leaked developer docs about the Pro that have been out for months now already established that this was not doing true 4K rendering, it's just got a nice upsampler and the 4K HDMI version. Both consoles already render to arbitrarily sized framebuffers then scale to the TV output. Klyith fucked around with this message at 03:17 on Sep 8, 2016 |
# ? Sep 8, 2016 03:14 |
|
So they have Jaguar on 14nmFF but didn't move to Puma+? Just no advantage? I'd also expect 14nm Jaguar to be, well, faster. Moving from 28nm to 14nm shouldn't just be a 500MHz jump, unless of course Jaguar hits awful diminishing returns past 2.1GHz.
|
# ? Sep 8, 2016 03:24 |
|
FaustianQ posted:So they have jaguar on 14nmFF but didn't move to Puma+? quote:Just no advantage? I'd also expect 14nm Jaguar to be well, faster. Moving from 28nm to 14nm shouldn't just be a 500Mhz jump, unless of course Jaguar hits awful diminishing returns past 2.1Ghz. So when it comes to the CPU, what do you do with gobs more processing power that wouldn't violate those restrictions but would still be worth dev time to implement? There's not a lot of great options. It's not like they can sell the game for more on Pro, it's the same disc for both. So my speculation would be the CPU only has enough of a bump to keep the GPU fed, and otherwise is exactly the same. Less cost for the chip respin, less potential for compatibility issues, and less power use so the GPU part can go nuts.
|
# ? Sep 8, 2016 04:07 |
|
I guess I'll spell out exactly what's going on with the PS4 Pro. As new process nodes become available, console makers typically migrate their original designs onto the new process. They spend minimal design cost in order to improve yields while cost-downing whatever components they can. Making the design exactly the same minimizes any extra work you need to do to make the old games run as if they're on the original chip. The PS4 slim is exactly that, the old 28nm PS4 SoC design put onto a 14nm process. The previous console generation was so long that the PS3 and the 360 actually got many different design revisions as they migrated their original 90nm CPUs and GPUs over to a 65nm process and once again to 45nm. The big "aha" moment Sony had with the PS4 Pro was: what if instead of just reusing the same design, they tweak it slightly? They reused the CPU portion of the slim model's SoC (which they were going to design anyway), then essentially doubled the size of the GPU portion. Give it faster RAM and slightly higher clocks, and you have the PS4 Pro. What I'm wondering is if AMD took the chance to refresh the design of the GPU portion of the SoC. Theoretically, it's simple enough to slot in a new design there without mucking up compatibility (much less true for the CPU), but I suspect AMD took the safe option instead. filthychimp fucked around with this message at 08:44 on Sep 8, 2016 |
# ? Sep 8, 2016 08:38 |
|
FaustianQ posted:Wonder if someone could benchmark a 480 @ 911MHz and see what its behavior is for 4K @ medium settings. Already done.
|
# ? Sep 8, 2016 08:51 |
|
filthychimp posted:What I'm wondering is if AMD took the chance to refresh the design of the GPU portion of the SoC. Theoretically, it's simple enough to slot in a new design there without mucking up compatibility (much less true for the CPU), but I suspect AMD took the safe option instead. It's been pretty well known for a while now that the GPU would be based on Polaris tech, and that's now confirmed officially https://twitter.com/PlayStation/status/773600156356866048 With how easily AMD has slotted in new GCN updates to their APUs over the years, I can't imagine that it's difficult or has any pitfalls they don't know about by now. GCN versions are all the same architecture overall, with evolutionary improvements. And these consoles are drat close to being just PCs, it's not like a minor GPU change is totally crazy for them. That's why these upgrades are happening now, and never did with previous generations. Commodity hardware = piggyback on commodity upgrades. Klyith fucked around with this message at 09:40 on Sep 8, 2016 |
# ? Sep 8, 2016 09:25 |
|
I'm one of those people who gets hit with sales tax on Jet, so I'm operating in percentages rather than raw dollar values. They have a Gigabyte RX470 4GB for 16.8% less than their best deal on a 4GB RX480, but unfortunately it looks to be some OEM fans thrown on a barely touched board. The raw dollar values, for me anyway, are about $186 for the 470 and $218 for the 480. My brain hurts from reading benchmarks, I need to go to bed. I should probably get that 470, though. Craptacular! fucked around with this message at 12:01 on Sep 8, 2016 |
# ? Sep 8, 2016 11:49 |
|
Everyone could also just ignore the first gen die shrink and wait for the rapture to happen so we no longer have to deal with earthly matters.
|
# ? Sep 8, 2016 12:04 |
|
When are we getting Freesync for consoles?
|
# ? Sep 8, 2016 13:25 |
|
Likely when tee vee freesync support is widespread enough?
|
# ? Sep 8, 2016 13:33 |
|
Definitely not before we see Adaptive Sync become an official HDMI extension and not an AMD-proprietary one. Not so much that it's necessary for the consoles to have it, but rather it's necessary for there to be actual TVs that support it.
|
# ? Sep 8, 2016 14:49 |
|
Craptacular! posted:I'm one of those people who gets hit with sales tax on Jet, so I'm operating in percentages rather than raw dollar values. They have a Gigabyte RX470 4GB for 16.8% less than their best deal on a 4GB RX480, but unfortunately it looks be some OEM fans thrown on a barely touched board. Uhh this sounds like the 480 is the winner here or am I missing something
|
# ? Sep 8, 2016 15:12 |
|
Ok I fired up Battlefield 1 last night for about 30 minutes (literally the only time I had free to try it during the Open Beta) and threw all the options to max, only to see it throw up on my 980Ti. The game doesn't look that much more impressive than Star Wars Battlefront, so why in the heck does it run at 15-30FPS at 1440p? I had to pull the scaling down to 50% to get 60FPS+ on that conquest map. (also I suck at BF after not playing it near as much since BF2, the last good Battlefield where you could semi-coordinate the pubbies. ) Also the latest Nvidia driver that was "optimized" for BF1 is a steaming pile. Installed it, and while I could tweak settings without issues in BF1 before, the new driver crashes the game each time I touch any option in Fullscreen mode. The performance difference was absolutely nil; if anything, it may have gotten worse.
|
# ? Sep 8, 2016 15:38 |
|
So basically it "can" do 4K30 medium settings, but gets way better results from a playability standpoint by using upscaling tricks, and can do 1080p Ultra fairly well. I dunno, this doesn't seem terrible at all, considering there'll be a ton of advantages from working inside the PS4 API to squeeze out extra performance here and there.
|
# ? Sep 8, 2016 15:40 |
|
EdEddnEddy posted:Ok I fired up Battlefield 1 last night for about 30 minutes (literally the only time I had free to try it during Open Beta) and threw all the options to max only to see it throw up on my 980Ti. The game doesn't look that much more impressive than Star Wars Battlefront, so why in the heck does it run at 15-30FPS at 1440P. I had to pull the scaling down to 50% to get 60FPS+ on that conquest map. (also I suck at BF after not playing it near as much since BF2, the last good battlefield you could semi coordinate the pubbies. ) The resolution scale option is a mess - you want 42% (the default) for native resolution. By setting it to 100% you were running it at 5K-ish internally
|
# ? Sep 8, 2016 15:41 |
|
EdEddnEddy posted:Ok I fired up Battlefield 1 last night for about 30 minutes (literally the only time I had free to try it during Open Beta) and threw all the options to max only to see it throw up on my 980Ti. The game doesn't look that much more impressive than Star Wars Battlefront, so why in the heck does it run at 15-30FPS at 1440P. I had to pull the scaling down to 50% to get 60FPS+ on that conquest map. (also I suck at BF after not playing it near as much since BF2, the last good battlefield you could semi coordinate the pubbies. ) Usually games get some optimization before final release/day 0 patches, and I expect the Nvidia drivers will also improve BF1 performance. I wouldn't sweat Open Beta performance quite yet. I remember the BF4 open beta ran at like half the perf the game had a week after release. repiv posted:The resolution scale option is a mess - you want 42% (the default) for native resolution. By setting it to 100% you were running it at 5K-ish internally I thought 0% meant native and...uhhh...it's not. However, it is cool to see what games would look like if we just kept using the Duke3D Build engine instead of making new ones.
|
# ? Sep 8, 2016 15:42 |
|
repiv posted:The resolution scale option is a mess - you want 42% (the default) for native resolution. By setting it to 100% you were running it at 5K-ish internally Oh what the gently caress.
|
# ? Sep 8, 2016 15:43 |
|
E:f;b by a long shot
|
# ? Sep 8, 2016 15:43 |
|
repiv posted:The resolution scale option is a mess - you want 42% (the default) for native resolution. By setting it to 100% you were running it at 5K-ish internally Oh drat. lol Well then, getting ~30 FPS on the previous driver wasn't terrible for the old 980Ti lol. However, is 42% better than 50% in this case? At 50% it looked like crap, like I was literally at half the 1440p res. And I totally understand, but I did have good performance in the Battlefront beta so I figured it would have been similar. I remember the BF3/4 days as well. One thing is for sure though, that Frostbite engine is drat pretty, if only I played the games more than a few hours. (Unlocks need to die in a fire of burning game boxes). Getting into the BF1 map and having everything just unlocked and ready to roll was the best thing they could do for BF. One thing I loved about Overwatch.
|
# ? Sep 8, 2016 15:46 |
|
EdEddnEddy posted:However is 42% better than 50% in this case? At 50% It looked like crap like I was literally 1/2 the 1440P res. Yeah: https://www.reddit.com/r/battlefield_one/comments/50b4m1/psa_42_resolution_scale_is_your_native_res/ - explains why native is 42% rather than 50% or 100%
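For anyone who wants to check the math on that: a minimal sketch of the slider mapping, assuming it runs linearly from a 25% to a 200% per-axis render scale (those endpoints come from the PSA's description, not anything official):

```python
# Hypothetical model of BF1's resolution-scale slider, assuming a linear
# mapping from 25% to 200% per-axis internal render scale (per the PSA).
def render_scale(slider_pct, lo=0.25, hi=2.00):
    """Map a 0-100 slider position to the per-axis internal render scale."""
    return lo + (slider_pct / 100.0) * (hi - lo)

def native_slider(lo=0.25, hi=2.00):
    """Slider position that yields a 1.0x (native) render scale."""
    return 100.0 * (1.0 - lo) / (hi - lo)

print(round(native_slider(), 1))  # -> 42.9, i.e. the "42%" default
print(render_scale(100))          # -> 2.0x per axis, 4x the pixels
```

Under those assumed endpoints, 100% on the slider means 2x per axis, which at 2560x1440 is 5120x2880 - hence "5K-ish internally".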
|
# ? Sep 8, 2016 15:49 |
|
Well it isn't final.
|
# ? Sep 8, 2016 16:08 |
|
well, that's a thing. e: not quite as scary at higher resolutions, but still: https://www.guru3d.com/articles_pages/deus_ex_mankind_divided_pc_graphics_performance_benchmark_review,7.html Truga fucked around with this message at 17:17 on Sep 8, 2016 |
# ? Sep 8, 2016 17:15 |
|
HMS Boromir posted:Oh what the gently caress. Yeah, it was quite a common issue in the in-game chat, with people getting 3 fps at 100% scaling. Funniest part is some thought people were messing with them when they said to set it to 42 or 44. Nice, Deus Ex loves AMD cards across the board
|
# ? Sep 8, 2016 17:49 |
|
THE DOG HOUSE posted:Yeah was quite a common issue in chat in game with people getting 3 fps at 100% scaling. Funniest part is some thought people were messing with them when they said set it to 42 or 44 I wonder if this is the long-speculated on "console games will be optimized for AMD" coming true.
|
# ? Sep 8, 2016 17:52 |
|
Reminder that DEMD is still heavily hosed in dx12, all cards still get better frames in dx11.
|
# ? Sep 8, 2016 18:07 |
|
Apparently the new Geforce Experience 3.0 requires you to log into a Nvidia, Facebook or Google account. Which loving rear end in a top hat thought that was a good idea? Guess I'm not going to experience the new Geforce Experience.
|
# ? Sep 8, 2016 18:16 |
|
Geemer posted:Apparently the new Geforce Experience 3.0 requires you to log into a Nvidia, Facebook or Google account. Which loving rear end in a top hat thought that was a good idea?
|
# ? Sep 8, 2016 18:17 |
|
Truga posted:Reminder that DEMD is still heavily hosed in dx12, all cards still get better frames in dx11. No? Comparing the light blue DX11 results for AMD cards with their DX12 results indicates a slight improvement across the board. Nvidia cards suffer regression on the Titan XP and 1080, but don't seem to change at all on the 1060, 980, 970 and 980Ti. EDIT: the sudden decrease in gains at 1440p and 2160p even for the Furies seems to indicate a ROP bottleneck for AMD cards. EmpyreanFlux fucked around with this message at 18:36 on Sep 8, 2016 |
# ? Sep 8, 2016 18:30 |
|
Geemer posted:Apparently the new Geforce Experience 3.0 requires you to log into a Nvidia, Facebook or Google account. Which loving rear end in a top hat thought that was a good idea? That's been in the works for a long while. After a little >:| I just made an Nvidia account because... well, ultimately it's fairly inconsequential in the end. Yes, it's retarded to have an account for my graphics card, but oh well, Shadowplay is way way way too good for me to pass up.
|
# ? Sep 8, 2016 20:05 |
|
THE DOG HOUSE posted:Uhh this sounds like the 480 is the winner here or am I missing something 480 is about 16% more expensive, for a performance gap of 6-11% in various game benchmarks. HardwareCanucks said the 470 was -8% overall. I'm not a particularly big fan of that Gigabyte SKU because I'd rather have removable fans than lights I have to install their driver for (I'm not gonna), but the MSI at B&H is only 8% less. I only thought about this stuff because someone said not to blindly trust TPU's Value chart, so I thought I would do my own apples to apples comparison of what I can buy and what it will actually cost me.
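As a quick sanity check on the value argument, using the thread's rough numbers (~$186 for the 470 at roughly 92% of the 480's performance, ~$218 for the 480 - all approximate and region-specific, not benchmark data):

```python
# Rough price-per-performance comparison using the thread's approximate
# figures; the prices and the ~8% perf gap are ballpark assumptions.
def price_per_perf(price_usd, relative_perf):
    """Dollars per unit of normalized performance; lower is better."""
    return price_usd / relative_perf

rx470 = price_per_perf(186, 0.92)  # 470 at ~92% of the 480's speed
rx480 = price_per_perf(218, 1.00)
print(round(rx470, 2), round(rx480, 2))  # -> 202.17 218.0
```

On those numbers the 470 edges it on dollars-per-frame, which is why the "480 is the winner" read only holds if you ignore the price gap.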
|
# ? Sep 8, 2016 20:22 |
|
Wow I did not realize the 470 was so close.
|
# ? Sep 8, 2016 20:23 |
|
THE DOG HOUSE posted:That's been in the works for a long while. After a little >:| I just made an nvidia account because... well ultimately, its fairly inconsequential in the end. Yes its retarded to have an account for my graphics card but oh well, shadowplay is way way way too good for me to pass up. The login also makes using Shield Streaming a ton easier than the old pair-device method, on home Wi-Fi and remote (though I can't test remote with my internet).
|
# ? Sep 8, 2016 21:15 |
|
Well, looks like no one wants to give their Facebook or Gmail address to nVidia and the account generator for their own account service is either incompetently implemented or is getting hosed. Can't get them to send an email confirmation link for the last hour and the Experience installer won't let me proceed without confirming my email. GG, nVidia, GG.
|
# ? Sep 8, 2016 21:57 |
|
Soooo I bet no one cares about this anymore but I found an answer to a question I had. Theater-quality ray tracing renderers make little to no use of graphics hardware. They do everything on the processor, and the hardware acceleration used for the scanline rendering that games use to go 60 FPS or higher is of no use to their calculations. Renderman, Arnold, and Mental Ray don't give two shits whether you've got a Titan X or a GTX 650. They don't touch the thing, or at least they didn't. Apparently it's hot poo poo that the newest version of Renderman can use graphics cards to do denoising. For renderers, it's all about total RAM and total core-GHz per hour. Spread the word, I guess.
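The "GHz per hour" point is basically how farm capacity gets budgeted: wall-clock time times core count. A toy sketch with made-up numbers (shot length and per-frame times are purely illustrative, not from any real production):

```python
# Back-of-envelope render-farm budgeting: CPU renderers scale with
# core-hours, not GPU count. All inputs here are made-up examples.
def render_core_hours(frames, minutes_per_frame, cores_per_node):
    """Total core-hours for a shot, given wall-clock minutes per frame
    on one node that uses all of its cores."""
    return frames * (minutes_per_frame / 60.0) * cores_per_node

# e.g. a 100-frame shot at 30 wall-clock minutes/frame on 16-core nodes
print(render_core_hours(100, 30, 16))  # -> 800.0 core-hours
```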
|
# ? Sep 9, 2016 01:01 |
|
THE DOG HOUSE posted:Wow I did not realize the 470 was so close. It also comes with 4 GB of RAM at 1650 MHz. Some 470s come with RAM clocked at 1750 MHz. The reference 480 4GB is supposed to have RAM clocked at 1750 MHz, but some ship with 2000 MHz RAM; the 480 8GB has 2000 MHz RAM. Basically you have an 11% shader cut and a roughly 17% memory bandwidth cut. The 1750 MHz models have ~12% less memory bandwidth than the 480 8GB. So they're pretty close depending on workload.
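Those percentages fall straight out of the usual GDDR5 bandwidth math, assuming the 256-bit bus both cards use and GDDR5's effective data rate being 4x the listed clock:

```python
# GDDR5 bandwidth for a given memory clock, assuming a 256-bit bus
# (RX 470/480) and GDDR5's quad data rate (effective Gbps = 4x clock GHz).
def gddr5_bandwidth_gbs(clock_mhz, bus_bits=256):
    effective_gbps = clock_mhz * 4 / 1000   # per-pin data rate
    return effective_gbps * bus_bits / 8    # total GB/s across the bus

bw_1650 = gddr5_bandwidth_gbs(1650)  # 211.2 GB/s
bw_1750 = gddr5_bandwidth_gbs(1750)  # 224.0 GB/s
bw_2000 = gddr5_bandwidth_gbs(2000)  # 256.0 GB/s
print(round(1 - bw_1650 / bw_2000, 3))  # -> 0.175, the ~17% cut
print(round(1 - bw_1750 / bw_2000, 3))  # -> 0.125, the ~12% gap
```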
|
# ? Sep 9, 2016 01:19 |
|
|
Wulfolme posted:Soooo I bet no one cares about this anymore but I found an answer to a question I had. They care a lot about I/O speed too.
|
# ? Sep 9, 2016 01:31 |