Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

td4guy posted:

I'm a bit sad that this first-generation G-Sync does away with the hardware scaler. I know I'm in a small minority, but I like to plug consoles into my monitor too. Those often need scaling up.

The 360 and the PS3 both have on-board scaling for media content, though the PS3 does rely on the TV's scaler since it usually just outputs 720p natively. I'm embarrassed to admit I don't know off the top of my head whether the new consoles have scaling for games, although I do know the XB1 has a dynamic internal resolution scaler (to approach the "address the framerate problem" from a different, inferior angle), and I don't see any way that would function if it lacked an output scaler (also, not sure if it's going to see a lot of real use in games, but that's neither here nor there; depends on how hands-on coding for it is, I guess). With twice the ROPs of the XB1, the PS4 should be able to hit 1080p render targets without much trouble, especially as developers really get deep into optimizing for the virtually bare-metal access they get there.

I guess most people using this technology will be using it solely for PC gaming, although if they can improve the maximum frame duration to at least 41.6ms (the current limit is 30fps, or 33.33ms) that would be pretty swell for making cinema playback better on PC, if you use your PC for that. The PS3's had true 24p output for like six or seven years now, but that could be an application of interest to a different demographic than the conventional game lyfe :pcgaming: nuts with really expensive graphics cards and all that jazz...
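
For reference, both of those figures are just 1000 ms divided by the frame rate; a quick sketch of the arithmetic, in Python:

# frame time in milliseconds = 1000 / frames per second
print(1000 / 30)   # 33.33... ms, the longest hold first-gen G-Sync allows
print(1000 / 24)   # ~41.67 ms, what true 24p cinema playback would need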

For the record, I'm waiting to pick up a G-Sync monitor until the first 1440p/1600p model lands. There are too few games that can't hold perfect VSync at 1080p on my card to make it worth it, and of the older technologies, perfect VSync is the closest thing to the G-Sync experience. I expect Maxwell to make the 1440p/1600p experience even better, too, though I'm interested to see whether they release it the way they did Kepler: the small, efficient, comparatively easy-to-make chip first, with the gigantic pricey one following later once production is at levels that can justify letting some chips hit the consumer market. If they have whatever they'll be calling Maxwell's version of GK110 right out of the gate, it should be a substantial performance improvement over the 780 Ti. But nobody saw the GK104 launch coming in terms of "holy poo poo, this is the small one and it performs this well? What's in store with the BIG one? :aaa:", and that gave them a hell of a one-two punch that has kept them perfectly relevant in card performance and even price (yeah, they're a little on the wrong side of the curve, but right now they have a big feature advantage and they're certainly making the most of it), despite AMD launching their next-generation products with a more advanced GCN architecture and an also very high transistor count.

The one thing I worry about, being a G-Sync early adopter, is that I might end up getting a very pricey monitor only for them to solve some key issue with the hardware (like letting it go down to 24fps without tearing, for example - I don't need it for cinema, I've got my home theater setup in my living room, but big monitors plus all the shinies means sometimes you get a framerate drop even below 30fps, whatchagonnado?). The better-value choice might be to wait until we can see which monitors are compatible with their FPGA mod, pick one you'll be happy with for X years based on features, and then just upgrade the G-Sync module itself if they release a significantly improved revision.

The one troublesome thing, I guess, is that the tech is really more expensive than makes sense for mid-range card users; paying twice as much for the monitor as you paid for the card, just to play at 1080p? :raise: I hope the price comes down once it hits the market and they have more time to refine the technology further. As the article mentions, and as industry tech heads have said, this is really nVidia looking at a way to change the gaming experience on PC to make it better. 'Course, change can be expensive.




Unrelated note, so it looks like I may be getting an Oculus Rift soon as a gift from a good friend who unfortunately gets motion sickness and headaches from the technology. I've been wanting to test the technology for a long time, really excited to have that become a real possibility. Not set in stone, depends on some other factors, but I will definitely be happy to answer any questions about it if anyone is curious enough to ask but not curious enough to drop $300 on an in-development product. I am mega stoked, I guess I need to get the Doom 3 re-release since that's one of the games that apparently is perfectly tuned for it. :cry: / :holy:


Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



Agreed posted:

As the article mentions, and as industry tech heads have said, this is really nVidia looking at a way to change the gaming experience on PC to make it better. 'Course, change can be expensive.

Plus, sadly, proprietary change often doesn't take off, at least not to a large enough extent. That's why I could see the G-Sync concept going more universal over time, with panel manufacturers incorporating it as an ASIC into the panels - lower cost, universal support, everyone wins (in the long run...).

Double Punctuation
Dec 30, 2009

Ships were made for sinking;
Whiskey made for drinking;
If we were made of cellophane
We'd all get stinking drunk much faster!
Are IPS panels not able to hold the image long enough for G-Sync to work, or is there some other reason it's TN only? I would think if IPS monitors could handle it, they would have done it by now and have made TN completely obsolete.

Professor Science
Mar 8, 2006
diplodocus + mortarboard = party

Agreed posted:

Unrelated note, so it looks like I may be getting an Oculus Rift soon as a gift from a good friend who unfortunately gets motion sickness and headaches from the technology. I've been wanting to test the technology for a long time, really excited to have that become a real possibility. Not set in stone, depends on some other factors, but I will definitely be happy to answer any questions about it if anyone is curious enough to ask but not curious enough to drop $300 on an in-development product. I am mega stoked, I guess I need to get the Doom 3 re-release since that's one of the games that apparently is perfectly tuned for it. :cry: / :holy:
I got one pretty recently. I may have teared up a little bit the first time I used it because it was the first time I'd ever seen 3D (eye muscle disorder as a kid, lost normal stereo vision as a result). The resolution is fantastically bad, the lack of lateral tracking is incredibly annoying, there are very few apps, and I'm still almost completely sure that they're going to become the first new category of hardware that is required for PC gaming since 3D accelerators. It honestly reminds me of the first time I saw a Glide app.

Fuzz1111
Mar 17, 2001

Sorry. I couldn't find anyone to make you a cool cipher-themed avatar, and the look on this guy's face cracks me the fuck up.

Incredulous Dylan posted:

What's confusing me about G-Sync is what it offers in practical benefits over just having a 120Hz monitor. I've had one for years for 3D gaming and I can't remember the last time I actively saw tearing in a modern game. Maybe I'm just used to tearing, but I thought if I wasn't cranking fps around in the 120 area I wouldn't need to worry. I always have the quality on max, which keeps things in most modern releases at 60-80 fps.
As I posted earlier, 120Hz makes tearing a lot less noticeable because any given tear is only on-screen for 1/120th of a second. Like you, I only notice tearing when the framerate is near 120 (actually the only time I really notice it is in Quake Live, because it has a hard 125fps limit that almost any PC will hit and sit on at all times; the result is a single tear creeping up the screen).

And come to think of it: if your framerate drops below 80 on a 120Hz screen, then more than half the displayed frames (screen refreshes) won't have a tear at all, and it only gets better as framerate drops (a framerate under 48 means a tear will be present on less than 1/3 of displayed frames).

Zero VGS posted:

Wouldn't something like a 240hz IPS, if those come to market in time with G-Sync, eliminate a lot of the perceived tearing just as well?
In my opinion, yes: with vsync off you would need a framerate of over 160 to have tears present more often than not, and some fast eyes to see them.
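
One rough way to reproduce those fractions - strictly a back-of-the-envelope model, assuming frame completions land at effectively random points relative to scanout; the simpler "one tear per rendered frame" count gives the upper bound:

import math

def tear_odds(fps, refresh_hz):
    # Average frame completions per screen refresh (vsync off, fps below refresh rate).
    rate = fps / refresh_hz
    # If completions land at random points in the scanout, the chance a given
    # refresh catches at least one tear is about 1 - e^(-rate); the simple
    # one-tear-per-rendered-frame count is just the rate itself.
    return 1 - math.exp(-rate), min(rate, 1.0)

for fps, hz in [(80, 120), (48, 120), (160, 240)]:
    randomized, upper = tear_odds(fps, hz)
    print(f"{fps}fps @ {hz}Hz: ~{randomized:.0%} of refreshes torn (at most {upper:.0%})")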

fargom
Mar 21, 2007
I recently swapped from an AMD 5870 to an Nvidia GTX 670 and I'm having a few problems I can't seem to figure out. None of this happened before the swap, so I have to assume something is wrong with the card or I hosed up some required step when swapping cards.

When I turn on my computer, the screen stays black until Windows loads; if I turn the monitor off and back on, it stays black for an unusually long time (15-30 sec); and most annoyingly, if I try to launch an older game in fullscreen (Baldur's Gate 2, OpenXcom, Age of Wonders, etc.) it stays black for the same 15-30 seconds. Sometimes I can alt-tab back to Windows and back into the game to fix the issue, but for the life of me I can't figure this out.

I deleted every reference to AMD software I could find, including reg keys and all hidden folders, plus stuff in AppData, etc. Am I missing some step? Should I just format the drive and reinstall Windows 7 fresh?

LRADIKAL
Jun 10, 2001

Fun Shoe

Fuzz1111 posted:

As I posted earlier, 120Hz makes tearing a lot less noticeable because any given tear is only on-screen for 1/120th of a second. Like you, I only notice tearing when the framerate is near 120 (actually the only time I really notice it is in Quake Live, because it has a hard 125fps limit that almost any PC will hit and sit on at all times; the result is a single tear creeping up the screen).
Can't you just do "com_maxfps 120"?

I am wondering why triple buffering is never mentioned? The extra 1 frame of latency?

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
It reduces latency somewhat, since each frame is closer in game time to the moment it's displayed than in a double-buffered setup, but it still has stutter problems. It's not super distinct from double-buffered VSync.
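
A minimal sketch of that flavor of triple buffering (the "flip to the newest completed frame at vsync" kind, as opposed to a plain render-ahead queue); the buffer names and helper functions here are purely illustrative:

buffers = ["A", "B", "C"]
front, pending = "A", None   # front = being scanned out, pending = newest finished frame

def free_buffer():
    # Any buffer not on screen and not waiting to be flipped is free for the renderer.
    return next(b for b in buffers if b not in (front, pending))

def frame_finished(rendered_into):
    # The newest frame replaces any stale pending frame (which just gets reused),
    # so the renderer never blocks waiting on the display.
    global pending
    pending = rendered_into

def on_vsync():
    # At each refresh, flip to the newest completed frame, if there is one.
    global front, pending
    if pending is not None:
        front, pending = pending, None

# Render two frames between refreshes: only the newer one is shown at the flip.
frame_finished(free_buffer()); frame_finished(free_buffer()); on_vsync()
print(front)   # -> "C"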

Mad_Lion
Jul 14, 2005

Would it be possible for somebody to manufacture a device that sits between your video card and your regular monitor and does the same thing? That would be nice for those who like their current monitor but want to get in on G-Sync. I also wonder if this sort of tech will happen for AMD cards (possibly using a device like the one I described)?

I've got a 7850 that overclocks well (almost to 7870 GHz Edition levels of performance) and I'm about to upgrade to an i7 rig. I imagine I would sit squarely in the 30-60 fps range for most games with this setup, and I really like what this can do for smoothness. I really like my video card, though, and I'm certainly not going to buy a new monitor just yet.

EoRaptor
Sep 13, 2003

by Fluffdaddy

dpbjinc posted:

Are IPS panels not able to hold the image long enough for G-Sync to work, or is there some other reason it's TN only? I would think if IPS monitors could handle it, they would have done it by now and have made TN completely obsolete.

It should work fine on an IPS panel - one just hasn't been made or announced yet. All the 'magic' happens in the decoder hardware that takes a DP/DVI signal and turns it into a voltage map for the LCD panel; nothing about the LCD panel itself actually matters that much.

Gwaihir
Dec 8, 2009
Hair Elf

Mad_Lion posted:

Would it be possible for somebody to manufacture a device that sits between your video card and your regular monitor and does the same thing? That would be nice for those who like their current monitor but want to get in on G-Sync. I also wonder if this sort of tech will happen for AMD cards (possibly using a device like the one I described)?

I've got a 7850 that overclocks well (almost to 7870 GHz Edition levels of performance) and I'm about to upgrade to an i7 rig. I imagine I would sit squarely in the 30-60 fps range for most games with this setup, and I really like what this can do for smoothness. I really like my video card, though, and I'm certainly not going to buy a new monitor just yet.

A man-in-the-middle type device wouldn't work, from what I understand; it has to be a chip connected directly to the LCD panel in order to control the refresh timing and to report back to the video card / allow it to adjust timing to match frame render times.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy
Actually, there was a site saying that when you set max frames per second on a game, you should do one less than the refresh rate, i.e. 59fps cap on a 60hz monitor, because one is rolling over to the next second. Is that right? Math isn't my strong suit.

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride

EoRaptor posted:

It should work fine on an IPS panel - one just hasn't been made or announced yet. All the 'magic' happens in the decoder hardware that takes a DP/DVI signal and turns it into a voltage map for the LCD panel; nothing about the LCD panel itself actually matters that much.

Aren't they selling the board separately such that you could make it work with any monitor with some modification?

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Dogen posted:

Aren't they selling the board separately such that you could make it work with any monitor with some modification?

They are selling an FPGA that will replace the scaler unit on compatible monitors (can't remember where I saw that specific detail, but it was from an nVidia guy). We don't know which monitors will be compatible yet, though. There is absolutely nothing in the technology itself that constrains it to a particular panel type (well, unless the panel is incapable of refreshing faster than 30Hz, since G-Sync won't let the GPU hold a frame for longer than 33.3ms, so there'd be no real point there), but there could be a number of issues integrating a very nontraditional thing where the scaler normally goes, depending on all the surrounding tech.

P.S. I'm not RMAing the K95; it seems like MX keys with LEDs just croak a lot and that's all there is to it. Called it way back, gently caress it, I'm just going to get back to using the damned thing. Not surprised the blue ones have an even higher failure rate, either; their lumen ratings are off the charts these days and quality control is both very difficult and almost nonexistent among major parts manufacturers for that particular component :negative:

I'm going to just turn the LEDs off and pretend I meant to buy a keyboard with dark lettering :unsmith:

Wowporn
May 31, 2012

HarumphHarumphHarumph

Jago posted:

Can't you just do "com_maxfps 120"?

I am wondering why triple buffering is never mentioned? The extra 1 frame of latency?

I turned triple buffering on in Counter-Strike the other day because the insanely high FPS was causing some tearing, and it was like aiming a gun with a cinder block hanging off the front. I didn't think I'd really notice a teensy bit of input lag considering how long I used a TV instead of a monitor, but it made it really hard to play.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Ruin Completely posted:

I turned triple buffering on in Counter-Strike the other day because the insanely high FPS was causing some tearing, and it was like aiming a gun with a cinder block hanging off the front. I didn't think I'd really notice a teensy bit of input lag considering how long I used a TV instead of a monitor, but it made it really hard to play.

Yeah, I notice the poo poo out of added frames. Even 2-frame VSync is nuts. If y'all have Far Cry 3, go give that a shot, as it has both 1- or 2-frame VSync (... I think?) and adjustable buffered frames (definitely has that). Quick and easy way to test for yourself how much input lag you can pick up. I'm using a 60Hz monitor and it's still just startlingly noticeable, especially when you can directly compare how much one extra frame held in reserve adds to your sense of detachment from the action and dulls your ability to respond.

GrizzlyCow
May 30, 2011
Also, TechReport has some data on that R9 290X retailer vs test sample variance issue. It looks kind of bad for AMD.

Wowporn
May 31, 2012

HarumphHarumphHarumph
It was the same way in DE:HR for me, but if I didn't have vsync on it would tear so bad I could barely see anything when I was turning. I haven't played it in a couple months though so maybe some drivers have fixed it since then.

Grim Up North
Dec 12, 2011

Agreed posted:

They are selling an FPGA that will replace the scaler unit on compatible monitors (can't remember where I saw that specific detail, but it was from an nVidia guy). We don't know which monitors will be compatible yet, though.

To be honest, this whole time I've been wondering whether they really want to sell a board that requires you to crack open a (third-party) monitor that is not meant to be opened by end users. That seems fraught with a whole lot of liability concerns, and even if it's sold as an enthusiast, do-it-at-your-own-risk kit, I could see it leading to negative PR.

Anyways, offer me a Dell Ultrasharp compatible kit and I'll be interested.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

GrizzlyCow posted:

Also, TechReport has some data on that R9 290X retailer vs test sample variance issue. It looks kind of bad for AMD.

I think the industry press is banging this drum far more than is appropriate. The long and short of it seems to come down to considerable variance in chip production (which is totally normal - look at ASIC validation figures from the same vendor, they're all over the place for these gigantic GPUs; now that we're into very small and still-shrinking transistors making 6 billion or 7.1 billion transistor circuits, they're not all gonna be ideal) leading to some chips having more package leakage than others. So some chips are gonna take more cooling than others. They had the same problem with the R9 290 and fixed it by pushing a driver patch for the fan controller to make its behavior meet the apparent lowest-common-denominator requirement for adequate cooling.

It's solved by sticking the thing in Uber mode, which most gamers looking for high performance are going to do anyway, or by using custom cooling, which a really surprising number of people are doing considering that it's pretty time- and labor-intensive and voids some warranties.

Vendors need to get their aftermarket cooler designs out there stat to turn this minor issue into a non-issue. In the meantime, I think it's a bit unfair to AMD to lay into them for variance that isn't really performance related so much as cooling related... Or, at least, it's better to acknowledge that the problem isn't something new; it's just that their coolers suck and they probably sent their best cards out for reviews. nVidia does the same thing, they just aren't scrutinized like this. I believe TomsHardware got the ball rolling on this "scandal" but I really feel it's making a mountain out of a molehill. Their cooler is lovely and everybody already gets that, so that's what people should either be pissed about or accept as just part of the deal with reference cooling on the Hawaii cards. Just my opinion, of course.

Ruin Completely posted:

It was the same way in DE:HR for me, but if I didn't have vsync on it would tear so bad I could barely see anything when I was turning. I haven't played it in a couple months though so maybe some drivers have fixed it since then.

Yeah, input lag sucks, but tearing sucks more. I'll be getting G-Sync as soon as I can ensure I'm getting a nice monitor to strap the tech to (or pre-made, so long as it's user serviceable down the line if a better module comes out with superior features and requires hardware changes rather than some kind of firmware adjustment), because it's going to fix a bunch of issues that are as old as display technology as such. That's magical :allears:

Professor Science posted:

I got one pretty recently. I may have teared up a little bit the first time I used it because it was the first time I'd ever seen 3D (eye muscle disorder as a kid, lost normal stereo vision as a result). The resolution is fantastically bad, the lack of lateral tracking is incredibly annoying, there are very few apps, and I'm still almost completely sure that they're going to become the first new category of hardware that is required for PC gaming since 3D accelerators. It honestly reminds me of the first time I saw a Glide app.

I really can't wait to get it in and give it a look. I know it's not really a finished product or anything, but it feels like an absolutely fantastic stepping-off point into something amazing not far from now.

Overall, this is an extremely exciting time for people looking at how videogames are made, at the developer-consumer relationship, and at what the videogaming experience will be like next. Not just "more pixels!" but realizations of genuinely new and unique ways to interface with games and entertainment generally.

veedubfreak
Apr 2, 2005

by Smythe

GrizzlyCow posted:

Also, TechReport has some data on that R9 290X retailer vs test sample variance issue. It looks kind of bad for AMD.

It's almost as if all the cards vary.

I need a third loving 290 now.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Grim Up North posted:

To be honest, this whole time I've been wondering whether they really want to sell a board that requires you to crack open a (third-party) monitor that is not meant to be opened by end users. That seems fraught with a whole lot of liability concerns, and even if it's sold as an enthusiast, do-it-at-your-own-risk kit, I could see it leading to negative PR.

Anyways, offer me a Dell Ultrasharp compatible kit and I'll be interested.

According to the guy in the podcast, no, this is definitely not their primary avenue for getting people onto G-Sync. It's intended strictly for intensive DIY enthusiasts who have a compatible monitor and want in without having to buy a new one (because they're knowledgeable and skilled enough that they shouldn't need to spend that extra, unnecessary money, provided the FPGA solution exists). Their main distribution channel for the technology will be conventional retail outlets and partners who have integrated it from the factory. He used the figure "99.9%" or something like that for the projected split between people who bought a G-Sync monitor vs. modded one themselves.

deimos
Nov 30, 2006

Forget it man this bat is whack, it's got poobrain!

veedubfreak posted:

It's almost as if all the cards vary.

I need a third loving 290 now.

Pretty sure the 290 to 290X gravy train is getting harder to get into nowadays. :(

GrizzlyCow
May 30, 2011
Oops. I linked to the wrong article. I meant this one, where TR actually investigates. To be fair, MaximumPC and others found the difference to be smaller than TR did.

Sorry, I'm not trying to spread FUD or anything; I just want to know if this is a legitimate thing to worry about (until custom cards are released) or a complete non-issue. More people are happy with their R9 290(X) than not, so it's probably the latter.

edit: deimos, you're getting a bit of competition from Zotac.

GrizzlyCow fucked around with this message at 18:36 on Dec 13, 2013

Straker
Nov 10, 2005
Well, to me it's like the converse of the way CPUs (and previous GPUs) are validated and binned based on "will this run in an old cat lady's attic for 10 years without any maintenance?", and if you run it in a nicer environment than that with a nicer cooler than that, there's a good amount of headroom; some have more than others since their maximum performance is basically unknown to most people. You can get a huge overclock or run the cooler nearly silently or whatever.

With the 290s it's like AMD is trying to overclock them all as much as possible out of the box, since it's now easy enough to dial them back a little or cool them harder automatically if they can't handle it well (i.e. compared to 10 years ago where everything would have been crashing and starting on fire given thermal management at the time). Some chips get dialed back more and faster than others, so getting a 290 that needs its fan higher to maintain "stock" (really overclocked) performance is just not winning the chip lottery. I imagine AMD rationalized doing things this way because they were kind of desperate for performance and this is kind of the easy way out without a fundamentally better product, especially what with nVidia's cards already being out for a while. The whole approach seems a little sketchy but as long as you view it in terms of "well maybe they're just not very overclockable from the old perspective but at least they all work properly with fans close to 100%" then it doesn't seem like a huge deal to me, it's not like you need to replace their coolers just to get them working at stock speeds.

edit: more succinctly, if you cherrypick cards for reviewers, previously they would have been cards that OCed well, and with the way the 290s work they're just going to get cards that run well at "stock" speeds without fans blasting like crazy. AMD's mistake is/was running the blowers too gently on non-review cards and underclocking cards prematurely or too coarsely, assuming they're not totally scummy and everything currently shipping is capable of performing as advertised. It's not like literally only a few cards actually work and they sent them all to reviewers.

Straker fucked around with this message at 18:52 on Dec 13, 2013

Wistful of Dollars
Aug 25, 2009

I think 'mountain out of a molehill' is the best way to put it. My 290 fau(x)'s have been lovely and I haven't put them under water yet.

My only lingering doubt about the wisdom of going AMD is G-Sync, but I would be very surprised if an equivalent isn't developed for AMD within the lifetime of my cards.

deimos
Nov 30, 2006

Forget it man this bat is whack, it's got poobrain!

GrizzlyCow posted:

edit: deimos, you're getting a bit of competition from Zotac.

Goddamnit. But mine are the superduperclocked EVGA version!

I lowered the already ridiculous prices :smithicide:

Also gently caress me the ARC Mini R2 hits all my :pcgaming: buttons.

deimos fucked around with this message at 20:00 on Dec 13, 2013

EoRaptor
Sep 13, 2003

by Fluffdaddy

Grim Up North posted:

To be honest, this whole time I've been wondering whether they really want to sell a board that requires you to crack open a (third-party) monitor that is not meant to be opened by end users. That seems fraught with a whole lot of liability concerns, and even if it's sold as an enthusiast, do-it-at-your-own-risk kit, I could see it leading to negative PR.

Anyways, offer me a Dell Ultrasharp compatible kit and I'll be interested.

nVidia developed the tech, but couldn't really get any monitor makers to 'byte'. They knew it was a solution to a real problem, but manufacturers didn't get it.

So they took an example 'gaming' monitor and modified it to show G-Sync off, both to the OEMs and to end users (through reporters). They didn't want to wait for whatever hype they generated to die down, either, so they chose to sell this prototype 'mod' directly to end users to keep the word out there.

Asus is also building it into their monitors directly, but the price premium is huge, because it's a generic FPGA purposed for doing this, not a dedicated ASIC.

The second generation of G-Sync enabled devices should only be 30-50 dollars more. Once it's integrated into an ASIC, the cost will drop dramatically.

I hope a third generation appears soon after that, one that is an official part of the DP spec and can then be used by any video card that so chooses, with basically no cost premium. G-Sync could benefit more than just gamers (anything that slides information around rapidly could be helped: CAD/CAM, financial, etc.).

The eventual goal would be to un-bind the refresh rate from the display entirely and switch to a spec that can push a frame and a 'draw now' command out as packetized data.
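
Purely as an illustration of that idea - every name below is hypothetical and none of this is part of any real DisplayPort spec - the packetized model might look something like:

from dataclasses import dataclass

@dataclass
class FramePacket:
    """Hypothetical: one self-describing frame pushed over the link."""
    frame_id: int
    width: int
    height: int
    pixels: bytes

@dataclass
class DrawNow:
    """Hypothetical 'draw now' command: scan out this buffered frame immediately."""
    frame_id: int

# The display would scan out whenever a DrawNow arrives, rather than on a fixed
# refresh clock - which is the "un-bind the refresh rate" idea above.
link = [FramePacket(1, 2560, 1440, b"..."), DrawNow(1)]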

EoRaptor fucked around with this message at 20:05 on Dec 13, 2013

veedubfreak
Apr 2, 2005

by Smythe

Straker posted:

Well, to me it's like the converse of the way CPUs (and previous GPUs) are validated and binned based on "will this run in an old cat lady's attic for 10 years without any maintenance?", and if you run it in a nicer environment than that with a nicer cooler than that, there's a good amount of headroom; some have more than others since their maximum performance is basically unknown to most people. You can get a huge overclock or run the cooler nearly silently or whatever.

With the 290s it's like AMD is trying to overclock them all as much as possible out of the box, since it's now easy enough to dial them back a little or cool them harder automatically if they can't handle it well (i.e. compared to 10 years ago where everything would have been crashing and starting on fire given thermal management at the time). Some chips get dialed back more and faster than others, so getting a 290 that needs its fan higher to maintain "stock" (really overclocked) performance is just not winning the chip lottery. I imagine AMD rationalized doing things this way because they were kind of desperate for performance and this is kind of the easy way out without a fundamentally better product, especially what with nVidia's cards already being out for a while. The whole approach seems a little sketchy but as long as you view it in terms of "well maybe they're just not very overclockable from the old perspective but at least they all work properly with fans close to 100%" then it doesn't seem like a huge deal to me, it's not like you need to replace their coolers just to get them working at stock speeds.

edit: more succinctly, if you cherrypick cards for reviewers, previously they would have been cards that OCed well, and with the way the 290s work they're just going to get cards that run well at "stock" speeds without fans blasting like crazy. AMD's mistake is/was running the blowers too gently on non-review cards and underclocking cards prematurely or too coarsely, assuming they're not totally scummy and everything currently shipping is capable of performing as advertised. It's not like literally only a few cards actually work and they sent them all to reviewers.

Part of the problem is that the Hawaii chips were supposed to be 20nm. That explains both why they run hot as balls and why there are so many fully intact chips being sent out as 290s with simply a BIOS lock.

Acquilae
May 15, 2013

GrizzlyCow posted:

Also, TechReport has some data on that R9 290X retailer vs test sample variance issue. It looks kind of bad for AMD.
Eh, I thought GPU manufacturers had been doing this for a while, like Ferrari putting soft tires (higher grip) on their review cars. When I was researching the 780 Classified, it seemed as if reviewers all got the first-run cards with Samsung memory, along with only a small batch of release-day owners, before EVGA switched to Elpida memory.

Wistful of Dollars
Aug 25, 2009

deimos posted:


Also gently caress me the ARC Mini R2 hits all my :pcgaming: buttons.

I just started putting together my Mini R2. It's a lovely case. I like my 550D but I wanted to downsize to mATX. I'll take a photo of my Franken-Mini work-in-progress when I get home.

take the red pill
Dec 12, 2013

by T. Finninho
This madness must stop.

https://bitcointalk.org/index.php?topic=361992.msg3881993#msg3881993

I was really hankering for the r9-290 too.

Doyzer
Jul 28, 2013

take the red pill posted:

This madness must stop.

https://bitcointalk.org/index.php?topic=361992.msg3881993#msg3881993

I was really hankering for the r9-290 too.

AMD hopes not

deimos
Nov 30, 2006

Forget it man this bat is whack, it's got poobrain!

El Scotch posted:

I just started putting together my Mini R2. It's a lovely case. I like my 550D but I wanted to downsize to mATX. I'll take a photo of my Franken-Mini work-in-progress when I get home.

I need to know: assuming no crazy fins on the RAM, does it fit an 80mm-thick radiator on the top mount?

Schiavona
Oct 8, 2008

take the red pill posted:

This madness must stop.

https://bitcointalk.org/index.php?topic=361992.msg3881993#msg3881993

I was really hankering for the r9-290 too.

He spent Forty-three point two thousand dollars on video cards.

What the gently caress is wrong with these people.

Stanley Pain
Jun 16, 2001

by Fluffdaddy

GrizzlyCow posted:

Also, TechReport has some data on that R9 290X retailer vs test sample variance issue. It looks kind of bad for AMD.

This is so blown out of proportion it's not even funny. Variance in the manufacturing? Say it ain't so ;)

Wistful of Dollars
Aug 25, 2009

deimos posted:

I need to know: assuming no crazy fins on the RAM, does it fit an 80mm-thick radiator on the top mount?

Without using a measuring device, I'm 95% sure it would. I have a 60mm radiator with one set of fans and it overlaps/clears the non-frilly RAM just fine. I think the only part that could threaten a thicker radiator would be if the mounting hardware for the CPU block stuck out too far. I'm using an EK block and I think it would also be fine, but if other makes stick out farther then *maybe* you could have an issue. I'll take some photos of the 'so-far' when I get home.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Doyzer posted:

AMD hopes not

In some ways yes, but in others no. Think about it: there are fewer people gaming on Radeons because of it, and those people may be going with NVIDIA now.

On the flip side, it reaffirms the compute superiority of the cards, and in that sense it's pretty good advertising.

But advertising isn't much good if you can't actually get the product to the people who want it at a correct and reasonable price.

HalloKitty fucked around with this message at 22:52 on Dec 13, 2013

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

HalloKitty posted:

On the flip side, it reaffirms the compute superiority of the cards, and in that sense it's pretty good advertising.

This isn't really true, though. Don't confuse "better at doing bitcoin poo poo" with "just plain better at compute." The way AMD's front-end compute works happens to jibe very well with the kinds of things bitcoin and litecoin mining need. That is far from universal, and checking any of the HPC performance lists shows that the real contenders for perf:watt are nVidia with massively parallel Tesla systems and, lately, Intel with their Xeon Phi cards.


HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Agreed posted:

This isn't really true, though. Don't confuse "better at doing bitcoin poo poo" with "just plain better at compute." The way AMD's front-end compute works happens to jibe very well with the kinds of things bitcoin and litecoin mining need. That is far from universal, and checking any of the HPC performance lists shows that the real contenders for perf:watt are nVidia with massively parallel Tesla systems and, lately, Intel with their Xeon Phi cards.

That's a fair comment, but are there even any AMD GPU-based supercomputers to compare with?

  • Reply