Purgatory Glory
Feb 20, 2005
Path of least resistance: if this thread becomes the best place for parts-picking questions, then this is where people will come to ask. I click the thread when there are new replies hoping to see some discussion about GPUs, not to see what somebody with a specific rig got recommended to them. I'd go to the parts-picking thread to see that. Having said that, use tact when directing people to the parts-picking thread.


Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Well, let's add some: AnandTech recently ran a deep dive on the Broadwell uarch as embodied in Core M, including details on the new HD Graphics architecture.

Brief review: Nvidia's shader building block is the SM (SMX in Kepler, SMM in Maxwell), consisting of 192 (Kepler) or 128 (Maxwell) shader ALUs with a frontend of general-purpose graphics hardware and scheduling and a backend of texture units. AMD's analogous unit is the Compute Unit (or lately, GCN Core), with 64 shader ALUs per CU. Intel's analogous unit is the sub-slice. In Haswell (Generation 7.5 of Intel graphics), the sub-slice has 80 shader ALUs organized as 10 Execution Units (EUs), and certain hardware (like rasterizers) lives outside the sub-slice in the common slice.
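
To keep the block sizes straight, here's a quick tally in Python (only using the counts above; Intel's figure is just 10 EUs times 8 ALUs each):

code:
# Shader ALUs per architectural building block, per the numbers above.
blocks = [
    ("Nvidia Kepler SMX",       192),
    ("Nvidia Maxwell SMM",      128),
    ("AMD GCN Compute Unit",     64),
    ("Intel Haswell sub-slice",  10 * 8),  # 10 EUs x 8 ALUs per EU
]

for name, alus in blocks:
    print("{:25s} {:4d} shader ALUs".format(name, alus))

The point being that the three vendors carve the same kind of hardware into very differently sized blocks, so "number of EUs" and "number of SMs" aren't directly comparable across vendors.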

There are a few different configurations for Intel IGPs in Haswell. GT2 is the basic configuration, consisting of 20 EUs (2 sub-slices). GT3 has 40 EUs. GT3e has 40 EUs plus an on-package high-bandwidth eDRAM cache, which also acts as an L4 cache for the CPU. GT3e, marketed as Iris Pro, is by far the fastest implementation.

GT2 has been a 2-sub-slice configuration since Sandy Bridge. Sandy Bridge (HD Graphics Gen 6) had 6 EUs per sub-slice, for 12 in GT2; Ivy Bridge (Gen 7) had 8 EUs per sub-slice, for 16 in GT2.

The biggest graphics architecture change in Broadwell is that the ratio of shader hardware to frontend, texture units, and L1 cache is changing. Previously, HD Graphics was very shader-heavy relative to its pixel-pushing capability compared to, e.g., AMD IGPs. In Broadwell, Haswell's 10 EUs per sub-slice are being rolled back to 8 EUs per sub-slice, but in return the GT2 configuration is being boosted to 3 sub-slices (24 EUs).
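
Laying the GT2 configurations out generation by generation (a quick sketch; the 8-ALUs-per-EU figure is carried over from the Haswell description above, so treat the ALU totals as approximate):

code:
# (sub-slices, EUs per sub-slice) for each generation's GT2 configuration,
# as described above. ALUs per EU assumed to stay at 8.
ALUS_PER_EU = 8

gt2 = [
    ("Sandy Bridge (Gen 6)",  2, 6),
    ("Ivy Bridge (Gen 7)",    2, 8),
    ("Haswell (Gen 7.5)",     2, 10),
    ("Broadwell (Gen 8)",     3, 8),
]

for gen, sub_slices, eus_per_ss in gt2:
    eus = sub_slices * eus_per_ss
    print("{:22s} {:2d} EUs ({} sub-slices x {}), ~{} shader ALUs".format(
        gen, eus, sub_slices, eus_per_ss, eus * ALUS_PER_EU))

The EU count only goes from 20 to 24, but because each sub-slice brings its own frontend and texture hardware, GT2 ends up with 50% more of that per GPU, which is the rebalancing being described.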

Sub-slice frontends and common-slice parts are receiving a uarch revision in Broadwell to improve fill rates. This should be really nice for gaming, as GT2 configurations in previous generations have struggled to go much past 768p resolution even at the lowest of low details. The excess of shader hardware couldn't help when the GPU just couldn't put pixels on screen fast enough. Haswell GT3 did a lot better in this respect, generally being able to push 1920x1080 without being too fill-limited (though it was bandwidth-limited without the GT3e config). But even with these uarch changes, it's improbable that Broadwell GT2 will seriously challenge Haswell GT3.
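
For a sense of what "putting pixels on screen fast enough" means, here's the bare minimum pixel throughput at a few common resolutions (ignoring overdraw, which multiplies these numbers in real games):

code:
# Pixels per second needed just to redraw the whole screen at 60 fps.
modes = [
    ("1366x768  @ 60 Hz", 1366,  768, 60),
    ("1600x900  @ 60 Hz", 1600,  900, 60),
    ("1920x1080 @ 60 Hz", 1920, 1080, 60),
]

for name, w, h, hz in modes:
    print("{}: {:6.1f} Mpixels/s minimum".format(name, w * h * hz / 1e6))

1080p needs roughly twice the fill rate of 768p before overdraw is even considered, which is why the shader-heavy but fill-limited older GT2 parts topped out where they did.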

The other big features of Broadwell's GPU changes are driven by the market the IGP is destined for - tablets and convertibles with tablet-style power and media needs.

On the power front, Intel is basically brute-forcing the issue. Already, Gen7 (Ivy Bridge, Bay Trail) GPUs can power down when displaying a static image, leaving a small DRAM cache to keep the display refreshing while the GPU shuts down. Broadwell will take this idea of "render what's needed and shut down" to an extreme by combining it with the idea of "race to sleep" from their CPU Turbo Boost logic. The result is Duty Cycle Control.

Duty Cycle Control in a nutshell: the GPU renders frames and shuts down, waking up again only when it's necessary to render more frames. The display controller, which is powered separately, will handle updating the screen as the frames are needed. For light workloads, the GPU can be shut down as much as 7/8 of the time even while the screen is being constantly updated.
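
A toy model of the idea, with made-up render times rather than anything Intel has published:

code:
# Toy model of Duty Cycle Control: the GPU races through a frame's work,
# then powers down until more rendering is needed, while the separately
# powered display controller keeps scanning out the finished frame.
FRAME_INTERVAL_MS = 1000.0 / 60  # ~16.7 ms per refresh at 60 Hz

def gpu_on_fraction(render_time_ms):
    """Fraction of each refresh interval the GPU must actually be powered."""
    return min(render_time_ms / FRAME_INTERVAL_MS, 1.0)

for render_ms in (2.0, 4.0, 8.0, 16.0):  # light to heavy workloads
    on = gpu_on_fraction(render_ms)
    print("{:4.1f} ms of GPU work per frame -> GPU on {:4.0%}, off {:4.0%}".format(
        render_ms, on, 1 - on))

In this model, roughly 2 ms of work per 16.7 ms refresh leaves the GPU powered down about 7/8 of the time, which lines up with the figure above for light workloads.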

On the media front, the video hardware is being updated to handle 4K screens and video from end to end. 4K displays will be supported even in ultra-low-voltage SKUs, and the video post-processing and QuickSync transcoding engines are getting a doubling in throughput. Also included will be hybrid hardware decode assist for the next-gen H.265 video codec. H.265/MPEG-H HEVC provides the same quality video as H.264/MPEG-4 AVC at half the bitrate. Broadwell's GPU will handle H.265 with so-called hybrid hardware decode: as on Nvidia Maxwell, some of the codec will run on the shader ALUs rather than in fixed-function hardware. It's not ideal from a power standpoint, but it's far better than CPU decoding.
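
If you want to poke at the half-the-bitrate claim yourself, one rough test is to encode the same clip with x264 and x265 at settings that are commonly held to be visually comparable and see what you get. A minimal sketch, assuming an ffmpeg build that includes both libx264 and libx265 (the CRF 23/28 pairing is a rule of thumb, not a guarantee of equal quality, and the filenames are just placeholders):

code:
import subprocess
import os

SRC = "sample_1080p.mkv"  # any test clip you have handy

encodes = [
    ("out_x264.mkv", ["-c:v", "libx264", "-preset", "medium", "-crf", "23"]),
    ("out_x265.mkv", ["-c:v", "libx265", "-preset", "medium", "-crf", "28"]),
]

for out, vopts in encodes:
    # -an drops audio so only the video streams are compared.
    subprocess.check_call(["ffmpeg", "-y", "-i", SRC] + vopts + ["-an", out])
    print("{}: {:.1f} MB".format(out, os.path.getsize(out) / 1e6))

Whether the smaller HEVC file actually looks as good at those settings is exactly the encoder-maturity question, not something the standard itself guarantees.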

Things Gen 8 HD Graphics will not do: Game well at 4K, support FreeSync (the bits are there with eDP 1.3a, but the external DP standard is only 1.2, not 1.2a).

--

In other news, speaking of power: Intel did a little demo of DirectX 12 to show that reducing CPU use in the API would greatly increase graphics performance by allotting more thermal budget to the GPU.

On the Team Green side, a research paper for scientific GPU computing let slip some expected performance-per-watt figures for Nvidia's roadmap. We're apparently about to see a Big Kepler refresh as GK210, and Big Maxwell GM200 should arrive by the end of the year (at least in Teslas). After that, Big Pascal, GP100, is slated for the beginning of 2016. The FLOPS/watt statistic in the paper suggests that Big Pascal will have three times the performance per watt of current Big Kepler.

Professor Science
Mar 8, 2006
diplodocus + mortarboard = party

Factory Factory posted:

On the Team Green side, a research paper for scientific GPU computing let slip some expected performance-per-watt figures for Nvidia's roadmap. We're apparently about to see a Big Kepler refresh as GK210, and Big Maxwell GM200 should arrive by the end of the year (at least in Teslas). After that, Big Pascal, GP100, is slated for the beginning of 2016. The FLOPS/watt statistic in the paper suggests that Big Pascal will have three times the performance per watt of current Big Kepler.
I wouldn't read anything into this. Looks like academics are extrapolating based on JHH's presentations at GTC, which are about as reliable as throwing darts at the wall.

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE
I suppose the confusion stems from the fact that when people want to find out what SSD to buy, they can go to the SSD thread; ditto the mouse thread, keyboard thread, monitor thread, etc. But the GPU thread is just for academic discussions.

I do get though that it probably is best to have one thread for all things related to buying pc stuff.

Besides, if this becomes nothing but a short response gpu part picking thread then where will Agreed be able to practice his run on sentences?

Lowen SoDium
Jun 5, 2003

Highen Fiber
Clapping Larry

Factory Factory posted:


On the media front, the video hardware is being updated to handle 4K screens and video from end to end. 4K displays will be supported even in ultra-low-voltage SKUs, and the video post-processing and QuickSync transcoding engines are getting a doubling in throughput. Also included will be hybrid hardware decode assist for the next-gen H.265 video codec. H.265/MPEG-H HEVC provides the same quality video as H.264/MPEG-4 AVC at half the bitrate. Broadwell's GPU will handle H.265 with so-called hybrid hardware decode: as on Nvidia Maxwell, some of the codec will run on the shader ALUs rather than in fixed-function hardware. It's not ideal from a power standpoint, but it's far better than CPU decoding.


This is big news for HTPCs and 4K media, and I have been waiting to hear some kind of news about H.265 for a while now.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Lowen SoDium posted:

This is big news for HTPCs and 4K media, and I have been waiting to hear some kind of news about H.265 for a while now.

So far I've been very unimpressed with h.265 demos; x264 has performed better. But I'm sure better encoders are to come.

Khagan
Aug 8, 2012

Words cannot describe just how terrible Vietnamese are.
Doesn't H.265 mean half the file size at the same bit rate as H.264?

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

Khagan posted:

Doesn't H.265 mean half the file size at the same bit rate as H.264?

Half the file size at same quality as h264. If it was the same bitrate it would be the same size.

roadhead
Dec 25, 2001

So twice the quality at the same bit-rate?

Malcolm XML
Aug 8, 2009

I always knew it would end like this.

HalloKitty posted:

So far I've been very unimpressed with h.265 demos; x264 has performed better. But I'm sure better encoders are to come.

It took years for x264 to get where it is. h265 just finished standardization a year or so back.

It's gonna be a while before we see a good h.265 encoder.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Don Lapre posted:

Half the file size at same quality as h264. If it was the same bitrate it would be the same size.

That's the marketing, but the reality isn't quite all that, unless I've missed some awesome advance

Star War Sex Parrot
Oct 2, 2003

Malcolm XML posted:

It took years for x264 to get where it is. h265 just finished standardization a year or so back.

It's gonna be a while before we see a good h.265 encoder.
Well, get to work, man!

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

HalloKitty posted:

That's the marketing, but the reality isn't quite all that, unless I've missed some awesome advance

Yeah, I was more pointing out that if the bitrate is the same, the file size would be the same.
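
To put numbers on the bitrate/size/quality distinction (example bitrates only):

code:
# File size follows directly from bitrate and duration, regardless of codec:
# size_bytes = bitrate_bps * seconds / 8
def size_gb(bitrate_mbps, minutes):
    return bitrate_mbps * 1e6 * minutes * 60 / 8 / 1e9

RUNTIME_MIN = 120  # a two-hour movie, for example

print("h.264 @ 8 Mbps: {:.1f} GB".format(size_gb(8, RUNTIME_MIN)))
print("h.265 @ 8 Mbps: {:.1f} GB (same bitrate -> same size)".format(size_gb(8, RUNTIME_MIN)))
print("h.265 @ 4 Mbps: {:.1f} GB (half the bitrate -> half the size; the claim is "
      "that quality stays the same)".format(size_gb(4, RUNTIME_MIN)))

So "half the file size" and "same quality at half the bitrate" are the same claim stated two ways; quality is the variable, not a bonus on top.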

Civil
Apr 21, 2003

Do you see this? This means "Have a nice day".
I suppose it's nice that h.265 is getting hardware assist, but we can still expect Broadwell integrated graphics to be equivalent to discrete cards from 10 years back, right? A lot of noise was made about HD4000 and Iris, but in use it's only passable for older games at very low resolution.

Proud Christian Mom
Dec 20, 2006
READING COMPREHENSION IS HARD
It's less impressive viewed in terms of raw graphics performance but when viewed as a whole with CPU processing power and overall performance/watt it's amazing. For form factors that can't viably fit a discrete GPU it is a godsend.

iastudent
Apr 22, 2008

Looks like nVidia is now including Borderlands: the Pre-Sequel download codes with their higher-end 700 cards (770, 780, 780 Ti) when purchased through select retailers. Amazon and Newegg at the very least are included.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

iastudent posted:

Looks like nVidia is now including Borderlands: the Pre-Sequel download codes with their higher-end 700 cards (770, 780, 780 Ti) when purchased through select retailers. Amazon and Newegg at the very least are included.

Still running on what by now must be a very optimized version of Unreal 3, they ought to open that bundle to at the very least the GTX 760. I can force obscene amounts of just raw SSAA without ever going below like 90fps with my 780Ti, and I could with my GTX 680 when I had that, too. I get that they're riding the hype train, but maybe it'll really take advantage of some cards if it's going to be a high end bundle only.

But probably it's just "oh, neat, more Borderlands, savin' bux over here!" from the user side, and ignoring the bigger picture, if AMD can bundle a game for Mantle features, bah gawd I guess nVidia can still get away with pushing PhysX. At least it looks rad as poo poo in Borderlands 2 and presumably this new one!

Lonny Donoghan
Jan 20, 2009
Pillbug
Thanks

Kazinsal
Dec 13, 2011


Note to self. Force installing drivers from a Catalyst package that refuses to install causes kernel mode bluescreens related to hardware accelerated video on 5850s.

I should ask my parents if the doctors told them I was retarded or something as an infant.

Aphrodite
Jun 27, 2006

Agreed posted:

Still running on what by now must be a very optimized version of Unreal 3, they ought to open that bundle to at the very least the GTX 760. I can force obscene amounts of just raw SSAA without ever going below like 90fps with my 780Ti, and I could with my GTX 680 when I had that, too. I get that they're riding the hype train, but maybe it'll really take advantage of some cards if it's going to be a high end bundle only.

But probably it's just "oh, neat, more Borderlands, savin' bux over here!" from the user side, and ignoring the bigger picture, if AMD can bundle a game for Mantle features, bah gawd I guess nVidia can still get away with pushing PhysX. At least it looks rad as poo poo in Borderlands 2 and presumably this new one!

The bundles have nothing to do with the cards that can play them. They just don't want to give them with the cards they're not making as much money on.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

It's just a bit of a long dry spell, I'm not even sure if I like graphics anymore

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE

Agreed posted:

It's just a bit of a long dry spell, I'm not even sure if I like graphics anymore

http://forums.somethingawful.com/showthread.php?threadid=3563643

Here's a link to the roguelike thread then so you can get some ASCII goodness.

Aphrodite
Jun 27, 2006

Nvidia and AMD pack-ins mean you can buy a code for a $50 game for half off in SA Mart anyway. No need to even buy a video card.

creatine
Jan 27, 2012




Can I buy the same arctic silver and use it for both CPU and GPU? Both are running hot (50-60 C idling) and I figured it's about time to replace them seeing as I haven't put new compound on since I built the computer in 2010.

future ghost
Dec 5, 2005

:byetankie:
Gun Saliva
Get MX-4 instead if you need something for the GPU since it's non-conductive/non-capacitive.


edit: That also works.
VVV

future ghost fucked around with this message at 18:42 on Aug 16, 2014

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

Pumpy Dumper posted:

Can I buy the same arctic silver and use it for both CPU and GPU? Both are running hot (50-60 C idling) and I figured it's about time to replace them seeing as I haven't put new compound on since I built the computer in 2010.

Use Gelid Extreme instead as it is non-conductive and better than AS. If you already have the AS then it's fine on your CPU, but I'd be careful on the GPU.

creatine
Jan 27, 2012




Don Lapre posted:

Use Gelid Extreme instead as it is non-conductive and better than AS. If you already have the AS then it's fine on your CPU, but I'd be careful on the GPU.

I haven't bought any yet. Wanted to get the general consensus on it. Tax free weekend in MA so I'm doing some PC shopping

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

Pumpy Dumper posted:

I haven't bought any yet. Wanted to get the general consensus on it. Tax free weekend in MA so I'm doing some PC shopping

Gelid extreme is basically the best until you get into liquid metal. Being non conductive is a bonus

creatine
Jan 27, 2012




Don Lapre posted:

Gelid extreme is basically the best until you get into liquid metal. Being non conductive is a bonus

Sweet. I'll see if they carry it at the electronics store.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
We should have a new thread title.

GPU Megathread - I got 99 problems but switchable graphics ain't one

netcat
Apr 29, 2008
Anyone have the MSI Radeon R9 280X? I'm having issues with some strange flickering. It happens occasionally that the screen flickers for a quarter of a second or so. Google tells me it's a common problem with this particular family of GPUs and that it probably has to do with it going up or down in frequency when it has to do more or less work.

I'm not sure if there is a solution or if I should just try to get a refund and get an equivalent nvidia card, though that's gonna cost me at least $100 more I think.

goodness
Jan 3, 2012

When the light turns green, you go. When the light turns red, you stop. But what do you do when the light turns blue with orange and lavender spots?
So I got my 290X a while ago, and ever since installing it and updating Catalyst and all that, Skype crashes within a second of calling or receiving a call.

Any ideas? Already tried reinstalling skype multiple times as well as Catalyst.

beejay
Apr 7, 2002

Someone answered you last time you asked that, did you try that?

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE

netcat posted:

Anyone have the MSI Radeon R9 280X? I'm having issues with some strange flickering. It happens occasionally that the screen flickers for a quarter of a second or so. Google tells me it's a common problem with this particular family of GPUs and that it probably has to do with it going up or down in frequency when it has to do more or less work.

I'm not sure if there is a solution or if I should just try to get a refund and get an equivalent nvidia card, though that's gonna cost me at least $100 more I think.

I happened to notice that a new radeon graphics driver came out recently and flickering was one of the highlighted issues being addressed.

kode54
Nov 26, 2007

aka kuroshi
Fun Shoe

Kazinsal posted:

Note to self. Force installing drivers from a Catalyst package that refuses to install causes kernel mode bluescreens related to hardware accelerated video on 5850s.

I should ask my parents if the doctors told them I was retarded or something as an infant.

I had an HP tower computer over 10 years ago, and I forcibly upgraded its BIOS with the Award BIOS for that model of motherboard, when it was an HP version that had half the L2 cache and Phoenix BIOS. This had to be fixed by pulling the BIOS chip and flashing it with an EPROM programmer using my older computer. Oh, but no, it didn't stop there. Within about a month or two, I got the same bright idea and tried it again, and had to perform the same exact fix a second time.

It didn't help that my only access to an EPROM programmer was a borrowed programmer that one of my older brothers had to bring over from his house each time, and this meant waiting a week or two after loving over my computer for a convenient time for my brother to take a break from his life to pay us a visit.

creatine
Jan 27, 2012




So I was asking earlier for upgrades on a gtx 460 since it wasn't running games as well as it used to. Well I decided that instead of dropping $150+ on a 660 or 750ti I would see if new thermal paste would help. Turns out my AMD Phenom II x4 was idling at 60-62C! Put new thermal paste on and now it's running 38C idle and only 55C under load. Also made my GPU idle temp drop a couple degrees since the CPU fan isn't working as hard to move the air.

Rakeris
Jul 20, 2014

Pumpy Dumper posted:

So I was asking earlier for upgrades on a gtx 460 since it wasn't running games as well as it used to. Well I decided that instead of dropping $150+ on a 660 or 750ti I would see if new thermal paste would help. Turns out my AMD Phenom II x4 was idling at 60-62C! Put new thermal paste on and now it's running 38C idle and only 55C under load. Also made my GPU idle temp drop a couple degrees since the CPU fan isn't working as hard to move the air.

Wow that was running a bit warm, pretty nice improvement for a few dollars.

1gnoirents
Jun 28, 2014

hello :)
Yeah, that's something-is-very-wrong high for Intel, but it's way worse for that CPU. I think the thermal limit is 72 degrees or something.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
AMD generally wants the cores no hotter than 62 C. Even on the wacky-tacky high TDP models.


goodness
Jan 3, 2012

When the light turns green, you go. When the light turns red, you stop. But what do you do when the light turns blue with orange and lavender spots?

beejay posted:

Someone answered you last time you asked that, did you try that?

Yes, I tried every answer that I could find by googling.
