MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!

Josh Lyman posted:

If Pascal has no architectural improvements and is just a die shrink, how come Jen Hsun said it cost $2 billion to develop? :colbert:

The architecture is the same, but a lot of effort was put into path optimization and other things to get it to run at 2GHz+; if they had just done a die shrink it wouldn't be able to reach those clocks.


EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

THE DOG HOUSE posted:

Depends on the game. Some AAA titles had a special console mode below low settings. Some don't though and are probably "medium".

Then in theory the PS4 Pro should be able to play 4K30 with upscaling tricks? Wonder if someone could benchmark a 480 @ 911MHz and see what its behavior is for 4K @ medium settings.
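
A quick back-of-the-envelope on why a 480 downclocked to 911MHz is the stand-in of choice here: a full Polaris 10 has 2304 shaders, and at 911MHz that lands almost exactly on the 4.2 TFLOPS figure floating around this thread. A minimal sketch (the 2-FLOPs-per-shader-per-clock FMA convention is standard; the clock and TFLOP numbers are the ones quoted in the thread):

```python
# Rough single-precision throughput: shaders * 2 FLOPs per clock (FMA) * clock.
def tflops(shaders: int, clock_mhz: float) -> float:
    return shaders * 2 * clock_mhz * 1e6 / 1e12

print(round(tflops(2304, 911), 2))   # ~4.2  -> matches the "4.2 TFLOP part" figure
print(round(tflops(2304, 1266), 2))  # ~5.83 -> a stock RX 480 at its 1266 MHz boost clock
```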


eames posted:

If that is true (and I think it is) then Sony was probably caught with their pants down when Microsoft announced their next Xbox. The PS4 Pro looks rather lacklustre in my eyes. It seems like they primarily built it for the PSVR and they threw in a questionable non-native solution for 4K as an afterthought to keep the marketing department happy. I suspect we'll see a PS4 Pro slim shortly after the new Xbox is out.

310W... :crossarms:

I'm still trying to figure out the 310W part, since I know Hawaii/Grenada at 0.95V and 911MHz comparatively sips power, something close to 200W IIRC, and it'd stomp the 4.2 TFLOP part into the ground as well.
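
For anyone who wants to sanity-check the "sips power" claim, here is a first-order dynamic-power scaling sketch (power roughly proportional to frequency times voltage squared). The stock clock, voltage, and board-power figures below are rough assumptions for a Grenada card rather than official specs, and static/memory power doesn't scale this way, so treat it as a ballpark only:

```python
# First-order GPU power estimate: dynamic power scales ~ f * V^2.
# Stock figures below are assumed ballpark values for an R9 390 (Grenada),
# not official specs; memory and static power are ignored, so this is optimistic.
stock_clock_mhz, stock_voltage, stock_board_w = 1000, 1.15, 275

def scaled_power(clock_mhz: float, voltage: float) -> float:
    return stock_board_w * (clock_mhz / stock_clock_mhz) * (voltage / stock_voltage) ** 2

print(round(scaled_power(911, 0.95)))  # ~171 W -- same neighborhood as the ~200 W guess above
```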

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

Maybe it's the PSU spec and they're aiming for good efficiency?

Watermelon Daiquiri
Jul 10, 2010
I TRIED TO BAIT THE TXPOL THREAD WITH THE WORLD'S WORST POSSIBLE TAKE AND ALL I GOT WAS THIS STUPID AVATAR.
Yeah... It says max power. Assuming they are aiming for the standard 50% for efficiency reasons, that gives a normal TDP of ~155W, which is pretty much right on for Jaguar plus a good-enough GPU at a low clock. Is it a 14nm part or a 28nm one? If it's Polaris for the GPU part, I highly doubt they would back-port it from 14nm to 28nm given the huge process changes involved.
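
Spelling out the arithmetic behind that guess: the 50% loading target is the assumption from the post above, and the CPU/GPU split at the end is purely hypothetical, just to show the budget is plausible:

```python
# If 310 W is the PSU's max rating and you target ~50% load for best efficiency,
# the sustained power budget comes out around 155 W.
psu_max_w = 310
target_load = 0.5                     # assumed "sweet spot" loading, per the post above
typical_budget_w = psu_max_w * target_load
print(typical_budget_w)               # 155.0

# A purely hypothetical split of that budget (illustrative numbers only):
budget = {"Jaguar CPU cores": 30, "GPU": 105, "RAM/IO/drive/fan": 20}
print(sum(budget.values()))           # 155 -- consistent with the figure above
```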

Klyith
Aug 3, 2007

GBS Pledge Week
Watt figures for consumer electronics are worst-case numbers for the FCC label, not the TDP that gets reported for CPU and GPU chips. The PS4 Pro isn't gonna pull 300 watts in operation. That number is the sum of each part's absolute max plus some safety factor.
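
A tiny sketch of the worst-case bookkeeping being described: add up every component's absolute maximum, pad with a safety margin, and you get a label number the console will essentially never draw in practice. Every figure here is a made-up placeholder, not a measurement:

```python
# Worst-case label wattage: sum of each part's absolute max plus a safety margin.
# Every number here is a made-up placeholder, not a real PS4 Pro measurement.
component_max_w = {"SoC": 180, "GDDR5": 30, "optical drive": 25, "HDD": 10,
                   "fan + board": 15, "PSU conversion losses": 25}
safety_margin = 1.08                      # arbitrary ~8% pad

label_w = sum(component_max_w.values()) * safety_margin
print(round(label_w))                     # ~308 -- a "310 W"-style label, far above typical draw
```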

They're not using 28nm, it's on AMD / GloFo 14nm. That was pretty much confirmed months ago by an AMD investor disclosure. The Pro might well be pulling more watts than the original 28nm PS4, but not a ton more. (The new case with its 2 slots for venting might indicate a slightly bigger airflow requirement, but that's a guess on very little evidence. It might just be decor.)


The leaked developer docs about the Pro that have been out for months now already established that this was not doing true 4K rendering; it's just got a nice upsampler and a 4K-capable HDMI output. Both consoles already render to arbitrarily sized framebuffers, then scale to the TV output.

Klyith fucked around with this message at 03:17 on Sep 8, 2016

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
So they have Jaguar on 14nmFF but didn't move to Puma+? Just no advantage? I'd also expect 14nm Jaguar to be, well, faster. Moving from 28nm to 14nm shouldn't just be a 500MHz jump, unless of course Jaguar hits awful diminishing returns past 2.1GHz.

Klyith
Aug 3, 2007

GBS Pledge Week

FaustianQ posted:

So they have jaguar on 14nmFF but didn't move to Puma+?
Lots of engineering cost there to slot Puma into the console memory architecture. Puma doesn't have HSA as-is on AMD's own chips (Mullins & Beema).

quote:

Just no advantage? I'd also expect 14nm Jaguar to be well, faster. Moving from 28nm to 14nm shouldn't just be a 500Mhz jump, unless of course Jaguar hits awful diminishing returns past 2.1Ghz.
Could very well be. But also remember that the Pro has to be 100% compatible with the old PS4. Their developers are specifically forbidden from adding additional features or specific Pro-only gameplay for that version of the game, and there will never be games only made for Pro. Basically you can do better graphics or improvements to standard gameplay (a specific example was that a game with local co-op could support more players on Pro).

So when it comes to the CPU, what do you do with gobs more processing power that wouldn't violate those restrictions but would still be worth dev time to implement? There aren't a lot of great options. It's not like they can sell the game for more on the Pro; it's the same disc for both.

So my speculation would be the CPU only has enough of a bump to keep the GPU full, and otherwise is exactly the same. Less cost for the chip respin, less potential for compatibility issues, and less power use so the GPU part can go nuts.

filthychimp
Jan 2, 2006
Damned dirty ape
I guess I'll spell out exactly what's going on with the PS4pro.

As new process nodes become available, console makers typically migrate their original designs onto the new process. They spend minimal design cost in order to improve yields while cost-downing whatever components they can. Making the design exactly the same minimizes any extra work needed to make the old games run as if they're on the original chip. The PS4 Slim is exactly that: the old 28nm PS4 SoC design put onto a 14nm process. The previous console generation was so long that the PS3 and the 360 actually got many different design revisions as they migrated their original 90nm CPUs and GPUs over to a 65nm process and once again to 45nm.

The big "aha" moment Sony had with the PS4pro was: what if, instead of just reusing the same design, they tweaked it slightly? They reused the CPU portion of the slim model's SoC (which they were going to design anyway), then essentially doubled the size of the GPU portion. Give it faster RAM and slightly higher clocks, and you have the PS4pro.

What I'm wondering is if AMD took the chance to refresh the design of the GPU portion of the SoC. Theoretically, it's simple enough to slot in a new design there without mucking up compatibility (much less true for the CPU), but I suspect AMD took the safe option instead.

filthychimp fucked around with this message at 08:44 on Sep 8, 2016

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

FaustianQ posted:

Wonder if someone could benchmark a 480 @ 911Mhz and see what's behavior is for 4K @ medium settings.

Already done.

Klyith
Aug 3, 2007

GBS Pledge Week

filthychimp posted:

What I'm wondering is if AMD took the chance to refresh the design of the GPU portion of the SoC. Theoretically, it's simple enough to slot in a new design there without mucking up compatibility (much less true for the CPU), but I suspect AMD took the safe option instead.

It's been pretty well known for a while now that the GPU would be based on Polaris tech, and that's now officially confirmed:
https://twitter.com/PlayStation/status/773600156356866048


With how easily AMD has slotted new GCN updates into their APUs over the years, I can't imagine that it's difficult or has any pitfalls they don't know about by now. GCN versions are all the same architecture overall, with evolutionary improvements. And these consoles are drat close to being just PCs; it's not like a minor GPU change is totally crazy for them. That's why these upgrades are happening now and never did with previous generations. Commodity hardware = piggyback on commodity upgrades.

Klyith fucked around with this message at 09:40 on Sep 8, 2016

Craptacular!
Jul 9, 2001

Fuck the DH
I'm one of those people who gets hit with sales tax on Jet, so I'm operating in percentages rather than raw dollar values. They have a Gigabyte RX470 4GB for 16.8% less than their best deal on a 4GB RX480, but unfortunately it looks to be some OEM fans thrown on a barely touched board.

The raw dollar values, for me anyway, are about $186 for the 470 and $218 for the 480.
My brain hurts from reading benchmarks, I need to go to bed. I should probably get that 470, though.

Craptacular! fucked around with this message at 12:01 on Sep 8, 2016

wargames
Mar 16, 2008

official yospos cat censor
Everyone could also just ignore the first gen die shrink and wait for the rapture to happen so we no longer have to deal with earthly matters.

KingEup
Nov 18, 2004
I am a REAL ADDICT
(to threadshitting)


Please ask me for my google inspired wisdom on shit I know nothing about. Actually, you don't even have to ask.
When are we getting Freesync for consoles?

Truga
May 4, 2014
Lipstick Apathy
Likely when tee vee freesync support is widespread enough?

BurritoJustice
Oct 9, 2012

Definitely not before we see Adaptive Sync become an official HDMI extension and not an AMD-proprietary one. Not insofar as it's necessary for the consoles to have it, but rather because there need to be actual TVs that support it.

penus penus penus
Nov 9, 2014

by piss__donald

Craptacular! posted:

I'm one of those people who gets hit with sales tax on Jet, so I'm operating in percentages rather than raw dollar values. They have a Gigabyte RX470 4GB for 16.8% less than their best deal on a 4GB RX480, but unfortunately it looks be some OEM fans thrown on a barely touched board.

The raw dollar values, for me anyway, is about $186 for the 470 and $218 for the 480.
My brain hurts from reading benchmarks, I need to go to bed. I should probably get that 470, though.

Uhh this sounds like the 480 is the winner here or am I missing something

EdEddnEddy
Apr 5, 2012



Ok, I fired up Battlefield 1 last night for about 30 minutes (literally the only time I had free to try it during the Open Beta) and threw all the options to max, only to see it throw up on my 980Ti. The game doesn't look that much more impressive than Star Wars Battlefront, so why in the heck does it run at 15-30 FPS at 1440p? I had to pull the scaling down to 50% to get 60+ FPS on that conquest map. (Also I suck at BF after not playing it nearly as much since BF2, the last good Battlefield where you could semi-coordinate the pubbies. :( )

Also, the latest Nvidia driver that was "optimized" for BF1 is a steaming pile. Installed it, and while I was able to tweak settings and have no issues in BF1, the new driver crashes the game each time I touch any option in fullscreen mode. The performance difference was absolutely nil; if anything, it may have gotten worse.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

So basically it "can" do 4K30 medium settings, but gets way better results from a playability standpoint by using upscaling tricks, and can do 1080p Ultra fairly well. I dunno, this doesn't seem terrible at all, considering there'll be a ton of advantages from working inside the PS4 API to squeeze out extra performance here and there.

repiv
Aug 13, 2009

EdEddnEddy posted:

Ok I fired up Battlefield 1 last night for about 30 minutes (literally the only time I had free to try it during Open Beta) and threw all the options to max only to see it throw up on my 980Ti. The game doesn't look that much more impressive than Star Wars Battlefront, so why in the heck does it run at 15-30FPS at 1440P. I had to pull the scaling down to 50% to get 60FPS+ on that conquest map. (also I suck at BF after not playing it near as much since BF2, the last good battlefield you could semi coordinate the pubbies. :( )

The resolution scale option is a mess - you want 42% (the default) for native resolution. By setting it to 100% you were running it at 5K-ish internally :kingsley:
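
To put pixel counts on that: taking the two data points at face value (42% is roughly native 1440p, 100% is "5K-ish"), the 100% setting pushes about four times the pixels of native. The exact slider-to-resolution mapping isn't documented anywhere in this thread, so the sketch below uses only those two claimed points:

```python
# Pixel-count sanity check using only the two data points claimed in the thread:
# resolution scale 42% ~= native 2560x1440, and 100% ~= "5K-ish" (taken here as 5120x2880).
def pixels(width: int, height: int) -> int:
    return width * height

native = pixels(2560, 1440)           # 3,686,400
at_100_percent = pixels(5120, 2880)   # 14,745,600 (assumed reading of "5K-ish")

print(at_100_percent / native)        # 4.0x the shading work, hence the 15-30 FPS at "100%"
```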

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

EdEddnEddy posted:

Ok I fired up Battlefield 1 last night for about 30 minutes (literally the only time I had free to try it during Open Beta) and threw all the options to max only to see it throw up on my 980Ti. The game doesn't look that much more impressive than Star Wars Battlefront, so why in the heck does it run at 15-30FPS at 1440P. I had to pull the scaling down to 50% to get 60FPS+ on that conquest map. (also I suck at BF after not playing it near as much since BF2, the last good battlefield you could semi coordinate the pubbies. :( )

Also the latest Nvidia drive that was "optimized" for BF1, is a steaming pile. Installed it and while I was able to tweak settings and have no issues in BF1, the new driver crashes the game each time I touch any option in Fullscreen mode. Performance was absolutely nill for difference, if anything, it may have gotten worse.

Usually games get some optimization before final release/day 0 patches, and I expect the NVidia drivers will also improve for BF1 performance.

I wouldn't sweat Open Beta performance quite yet. I remember the BF4 open beta ran at like half the perf of the game a week after release.

repiv posted:

The resolution scale option is a mess - you want 42% (the default) for native resolution. By setting it to 100% you were running it at 5K-ish internally :kingsley:


I thought 0% meant native and...uhhh...it's not. However, it is cool to see what games would look like if we just kept using the Duke3d Build engine instead of making new ones.

HMS Boromir
Jul 16, 2011

by Lowtax

repiv posted:

The resolution scale option is a mess - you want 42% (the default) for native resolution. By setting it to 100% you were running it at 5K-ish internally :kingsley:

Oh what the gently caress.

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map
E:f;b by a long shot

EdEddnEddy
Apr 5, 2012



repiv posted:

The resolution scale option is a mess - you want 42% (the default) for native resolution. By setting it to 100% you were running it at 5K-ish internally :kingsley:

Oh drat. lol Well then getting 30~ FPS on the previous driver wasn't terrible for the old 980Ti lol.

However, is 42% better than 50% in this case? At 50% it looked like crap, like it was literally half the 1440p res. :barf:

And I totally understand, but I did have good performance in battlefront beta so I figured it would have been similar.

I remember the BF3/4 days as well. One thing is for sure though, that Frostbite engine is drat pretty, if only I played the games for more than a few hours. (Unlocks need to die in a fire of burning game boxes.) Getting into the BF1 map and having everything just unlocked and ready to roll was the best thing they could do for BF; it's one thing I loved about Overwatch.

repiv
Aug 13, 2009

EdEddnEddy posted:

However is 42% better than 50% in this case? At 50% It looked like crap like I was literally 1/2 the 1440P res. :barf:

Yeah: https://www.reddit.com/r/battlefield_one/comments/50b4m1/psa_42_resolution_scale_is_your_native_res/

:iiam: why native is 42% rather than 50% or 100%

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.
Well it isn't final.

Truga
May 4, 2014
Lipstick Apathy


well, that's a thing.

e: not quite as scary at higher resolutions, but still:
https://www.guru3d.com/articles_pages/deus_ex_mankind_divided_pc_graphics_performance_benchmark_review,7.html

Truga fucked around with this message at 17:17 on Sep 8, 2016

penus penus penus
Nov 9, 2014

by piss__donald

HMS Boromir posted:

Oh what the gently caress.

Yeah, it was quite a common issue in in-game chat, with people getting 3 fps at 100% scaling. The funniest part is some thought people were messing with them when they said to set it to 42 or 44.

Nice, Deus Ex loves AMD cards across the board.

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

THE DOG HOUSE posted:

Yeah was quite a common issue in chat in game with people getting 3 fps at 100% scaling. Funniest part is some thought people were messing with them when they said set it to 42 or 44

Nice dues ex loves AMD cards across the board

I wonder if this is the long-speculated on "console games will be optimized for AMD" coming true.

Truga
May 4, 2014
Lipstick Apathy
Reminder that DEMD is still heavily hosed in DX12; all cards still get better frames in DX11.

Geemer
Nov 4, 2010



Apparently the new GeForce Experience 3.0 requires you to log in with an Nvidia, Facebook, or Google account. Which loving rear end in a top hat thought that was a good idea?

Guess I'm not going to experience the new Geforce Experience. :shrug:

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Geemer posted:

Apparently the new Geforce Experience 3.0 requires you to log into a Nvidia, Facebook or Google account. Which loving rear end in a top hat thought that was a good idea?

Guess I'm not going to experience the new Geforce Experience. :shrug:
Thankfully they decided to walk back from requiring GFE to get driver updates and will still offer them for download from their website without a login.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

Truga posted:

Reminder that DEMD is still heavily hosed in dx12, all cards still get better frames in dx11.

No? Comparing the light blue DX11 bars for AMD cards with the DX12 ones indicates a slight improvement across the board. Nvidia cards suffer a regression on the Titan XP and 1080, but don't seem to change at all on the 1060, 980, 970, and 980 Ti.

EDIT: the sudden decrease in gains at 1440p and 2160p even for the Furies seems to indicate a ROP bottleneck for AMD cards.

EmpyreanFlux fucked around with this message at 18:36 on Sep 8, 2016

penus penus penus
Nov 9, 2014

by piss__donald

Geemer posted:

Apparently the new Geforce Experience 3.0 requires you to log into a Nvidia, Facebook or Google account. Which loving rear end in a top hat thought that was a good idea?

Guess I'm not going to experience the new Geforce Experience. :shrug:

That's been in the works for a long while. After a little >:| I just made an Nvidia account because... well ultimately, it's fairly inconsequential in the end. Yes, it's retarded to have an account for my graphics card, but oh well, Shadowplay is way way way too good for me to pass up.

Craptacular!
Jul 9, 2001

Fuck the DH

THE DOG HOUSE posted:

Uhh this sounds like the 480 is the winner here or am I missing something

480 is about 16% more expensive, for a performance gap of 6-11% in various game benchmarks. HardwareCanucks said the 470 was -8% overall.

I'm not a particularly big fan of that Gigabyte SKU because I'd rather have removable fans than lights I have to install their driver for (I'm not gonna), but the MSI at B&H is only 8% less.

I only thought about this stuff because someone said not to blindly trust TPU's Value chart, so I thought I would do my own apples to apples comparison of what I can buy and what it will actually cost me.
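
For anyone following along at home, the perf-per-dollar math behind that comparison, using the prices given above and the -8% HardwareCanucks figure. The relative-performance number is the assumption here; swap in whichever benchmark delta you trust:

```python
# Perf-per-dollar comparison using the thread's own numbers.
# Prices are the poster's tax-inclusive figures; 0.92 assumes the 470 is ~8% slower
# overall (HardwareCanucks' estimate) -- substitute a different delta if you prefer.
cards = {"RX 470 4GB": (186.0, 0.92), "RX 480 4GB": (218.0, 1.00)}

for name, (price, rel_perf) in cards.items():
    print(f"{name}: {rel_perf / price * 100:.3f} perf per $100")
# RX 470 4GB: 0.495 perf per $100
# RX 480 4GB: 0.459 perf per $100  -> the 470 is ~8% better value at these prices
```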

penus penus penus
Nov 9, 2014

by piss__donald
Wow I did not realize the 470 was so close.

EdEddnEddy
Apr 5, 2012



THE DOG HOUSE posted:

That's been in the works for a long while. After a little >:| I just made an nvidia account because... well ultimately, its fairly inconsequential in the end. Yes its retarded to have an account for my graphics card but oh well, shadowplay is way way way too good for me to pass up.

The login also makes using Shield Streaming a ton easier than the old device-pairing method, both on home WiFi and remote (though I can't test remote with my internet).

mcbexx
Jul 4, 2004

British dentistry is
not on trial here!



Well, looks like no one wants to give their Facebook or Gmail address to nVidia, and the account creation for their own account service is either incompetently implemented or is getting hosed.

Can't get them to send an email confirmation link for the last hour and the Experience installer won't let me proceed without confirming my email.

GG, nVidia, GG.

Twinty Zuleps
May 10, 2008

by R. Guyovich
Lipstick Apathy
Soooo I bet no one cares about this anymore but I found an answer to a question I had.

Theater-quality ray tracing renderers make little to no use of graphics hardware. They do everything on the processor, and the hardware acceleration used for the scanline rendering that games use to go 60 FPS or higher is of no use to their calculations. Renderman, Arnold, and Mental Ray don't give two shits whether you got a Titan X or a GTX 650. They don't touch the thing, or at least they didn't. Apparently it's hot poo poo that the newest version of Renderman can use graphics cards to do denoising.

For renderers, it's all about total RAM and total GHz-hours. Spread the word, I guess.
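
On the "total GHz-hours" point: that's the unit many CPU render farms bill in, and it falls straight out of cores times clock times wall-clock time. A minimal sketch with made-up job numbers and a hypothetical price, just to show the bookkeeping:

```python
# Render-farm style cost accounting: GHz-hours = cores * clock (GHz) * wall-clock hours.
# All job numbers and the price per GHz-hour below are made up for illustration.
def ghz_hours(cores: int, clock_ghz: float, hours: float) -> float:
    return cores * clock_ghz * hours

job = ghz_hours(cores=16, clock_ghz=3.2, hours=10)            # one batch on one node
print(job)                                                    # 512.0 GHz-hours
print(f"${job * 0.05:.2f} at a hypothetical $0.05/GHz-hour")  # $25.60
```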

Naffer
Oct 26, 2004

Not a good chemist

THE DOG HOUSE posted:

Wow I did not realize the 470 was so close.
The 470 is cut in two ways. It has 11% fewer shaders (2048/2304 = 89%).

It also comes with 4 GB of RAM at 1650 MHz.
Some 470s come with RAM clocked at 1750 MHz.
The reference 480 4GB is supposed to have RAM clocked at 1750 MHz, but some ship with 2000 MHz RAM.
The 480 8GB has 2000 MHz RAM.

Basically you have an 11% shader cut and a roughly 17% memory bandwidth cut. The 1750 MHz models have ~12% less memory bandwidth than the 480 8GB.

So they're pretty close depending on workload.
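
Spelling those percentages out: the shader count and memory clocks are the ones listed above; the 256-bit bus width and GDDR5's 4x effective data rate are standard Polaris 10 facts, included only to turn the clocks into absolute GB/s:

```python
# Cut-down math for the RX 470 vs RX 480, using the figures in the post above.
# Both Polaris 10 cards use a 256-bit GDDR5 bus; effective data rate = 4x memory clock.
def bandwidth_gbs(mem_clock_mhz: float, bus_bits: int = 256) -> float:
    return mem_clock_mhz * 4 * bus_bits / 8 / 1000   # GB/s

print(2048 / 2304)               # 0.889 -> ~11% fewer shaders
print(bandwidth_gbs(1650))       # 211.2 GB/s (base 470)
print(bandwidth_gbs(1750))       # 224.0 GB/s (better 470s / reference 480 4GB)
print(bandwidth_gbs(2000))       # 256.0 GB/s (480 8GB)
print(1 - 1650 / 2000)           # 0.175 -> the "roughly 17%" bandwidth cut
print(1 - 1750 / 2000)           # 0.125 -> the "~12%" cut for 1750 MHz models
```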


Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Wulfolme posted:

Soooo I bet no one cares about this anymore but I found an answer to a question I had.

Theater-quality ray tracing renderers make little to no use of graphics hardware. They do everything on the processor, and the hardware acceleration used for the scanline rendering that games use to go 60 FPS or higher is of no use to their calculations. Renderman, Arnold, and Mental Ray don't give two shits whether you got a Titan X or a GTX 650. They don't touch the thing, or at least they didn't. Apparently it's hot poo poo that the newest version of Renderman can use graphics cards to do denoising.

For renderers, it's all about total RAM and total GHz per hour. Spread the word, I guess.

They care a lot about I/O speed too.
