K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
Yeah, it does something at unplayable framerates that you wouldn't be getting if you didn't waste a significant chunk of your budget on it.

G-Sync has always been a lovely deal for consumers and it'll be a good thing when Nvidia is eventually forced to admit it was a terrible idea and get in line with adaptive sync.


r0ck0
Sep 12, 2004
r0ck0s p0zt m0d3rn lyf

veedubfreak posted:

I could give a poo poo if the new 390 pulls 500 watts as long as it is the fastest card I can get my hands on. Why do people act like 100W is this giant deal? It's 1 loving light bulb. It might cost you an extra 50 cents a month in electricity. Also, gently caress Nvidia for being giant lovely assholes that overcharge for everything. The last Nvidia card I actually had for more than a month was my 690.

And for the record, from my personal actual use of the 980, 970, and 290: my 290 was the easiest card to actually work with and had the best performance at 11.5 million pixels.

I'm sure you of all people know it's not the cost of electricity but the cost of cooling.

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

veedubfreak posted:

I could give a poo poo if the new 390 pulls 500 watts as long as it is the fastest card I can get my hands on. Why do people act like 100W is this giant deal? It's 1 loving light bulb. It might cost you an extra 50 cents a month in electricity. Also, gently caress Nvidia for being giant lovely assholes that overcharge for everything. The last Nvidia card I actually had for more than a month was my 690.

And for the record, from my personal actual use of the 980, 970, and 290: my 290 was the easiest card to actually work with and had the best performance at 11.5 million pixels.

I agree, why would anyone care that their graphics card sounds like a jet engine and fills the room with the smell of burnt plastic? :downs:

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

veedubfreak posted:

I could give a poo poo if the new 390 pulls 500 watts as long as it is the fastest card I can get my hands on. Why do people act like 100W is this giant deal? It's 1 loving light bulb. It might cost you an extra 50 cents a month in electricity. Also, gently caress Nvidia for being giant lovely assholes that overcharge for everything. The last Nvidia card I actually had for more than a month was my 690.

And for the record, from my personal actual use of the 980, 970, and 290: my 290 was the easiest card to actually work with and had the best performance at 11.5 million pixels.

You don't think AMD would charge just as much if they could? The only reason they are selling 290s at $300 is that nobody will buy them otherwise now that mining is dead.

Kazinsal
Dec 13, 2011

AVeryLargeRadish posted:

I agree, why would anyone care that their graphics card sounds like a jet engine and fills the room with the smell of burnt plastic? :downs:

Why would anyone buy a reference card when non reference cards are the same price?

jkyuusai
Jun 26, 2008

homegrown man milk

K8.0 posted:

Yeah, it does something at unplayable framerates that you wouldn't be getting if you didn't waste a significant chunk of your budget on it.

Yes, you can't use the laptop if you don't buy it? I don't really know what you're getting at here.

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

Kazinsal posted:

Why would anyone buy a reference card when non reference cards are the same price?

SLI

BONESTORM
Jan 27, 2009

Buy me Bonestorm or go to Hell!

Kazinsal posted:

Why would anyone buy a reference card when non reference cards are the same price?

Yeah, I have a non-reference R290X simply because it was $100 off about a year ago, so it was a much better deal at the time than a 780 Ti (it was actually cheaper than most 780 models at the time of purchase here in Canada). I actually do prefer nVidia cards on the whole, but some of the stuff being posted here is a little absurd. My card doesn't sound like a jet engine, and I've never really noticed excessive heat generated by it, or any increase in my electricity bill over the 560 Ti I had prior.

edit: Good point on the above, though; reference 290X's would be terrible in CrossFire for heat and noise, I imagine, and the model I have is slightly larger than 2 slots, so I'm not even sure crossfiring them would be a viable option.

BONESTORM fucked around with this message at 17:42 on Feb 4, 2015

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
If Intel adopts FreeSync, it will instantly be widely supported and outrageously popular, even likely backported to a wide variety of current hardware. AMD might not be able to out-strongarm Nvidia, but Intel can. And not just because they used to do StrongARM.

Kazinsal
Dec 13, 2011

You can CrossFire different makes of similar AMD cards (e.g. a reference 290 and an MSI Twin Frozr 290), and I agree with Steiner: who the hell CrossFires top-end reference cards in the first place?

Sergeant Steiner posted:

Yeah, I have a non-reference R290X simply because it was $100 off about a year ago, so it was a much better deal at the time than a 780 Ti (it was actually cheaper than most 780 models at the time of purchase here in Canada). I actually do prefer nVidia cards on the whole, but some of the stuff being posted here is a little absurd. My card doesn't sound like a jet engine, and I've never really noticed excessive heat generated by it, or any increase in my electricity bill over the 560 Ti I had prior.

I bought an XFX non-reference 290 about a week ago because it was incredibly inexpensive (also here in Canada) compared to a 970. My computer under load is quieter than it was with a full water loop under idle.


I dunno. I suppose if you lived in some backwater country where power costs $0.30/kWh then yeah, dual 295X2s might be a bit on the expensive side to run, but drat guys, I've used ATI cards for ten years (with a bit of Nvidia patching up dead-card holes in between) and have never needed a small fission reactor to power them or a cooling stack to keep my house from burning down.

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

Kazinsal posted:

You can CrossFire different makes of similar AMD cards (e.g. a reference 290 and an MSI Twin Frozr 290), and I agree with Steiner: who the hell CrossFires top-end reference cards in the first place?


Ohh, I was more talking SLI, since reference cards will expel the heat out of the case without killing the card on top.

veedubfreak
Apr 2, 2005

by Smythe

Kazinsal posted:

Why would anyone buy a reference card when non reference cards are the same price?

Because waterblocks are always guaranteed to fit on a reference card :)

Kazinsal
Dec 13, 2011

veedubfreak posted:

Because waterblocks are always guaranteed to fit on a reference card :)

Okay, fair point. A reference cooler works best when it's replaced, doesn't it ;)

veedubfreak
Apr 2, 2005

by Smythe

Kazinsal posted:

Okay, fair point. A reference cooler works best when it's replaced, doesn't it ;)

Yep, there is almost always a waterblock ready to go at launch that will fit reference cards. One of the problems with the 970 was that there was no "reference" card released, so you had to guess and hope about what would fit until about two months after launch. The Asus Strix luckily would work with the old 670/760 Asus DCII blocks, but most of the other cards had no full-cover blocks until a few months later. I don't even watercool for heat at this point; it's mostly just for the noise. The 970 Strix was too loud for me, and I couldn't get a straight answer from any of the Asus reps I talked to about warranty if I broke the seal.

calusari
Apr 18, 2013

It's mechanical. Seems to come at regular intervals.
It's not fair to compare the power consumption of midrange cards like the 970/980 to the R9 380X. The 980Ti/Titan II or whatever has a 250W TDP after all.

SwissArmyDruid
Feb 14, 2014

by sebmojo

calusari posted:

It's not fair to compare the power consumption of midrange cards like the 970/980 to the R9 380X. The 980Ti/Titan II or whatever has a 250W TDP after all.

No, it's entirely fair. The 390X is still on the horizon, and THAT will likely be what contends with the GM200 Titan in that space.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy
I'm sure this has been asked a thousand times here but what if Intel took an honest shot at a discrete graphics card, or just made a giant die CPU with integrated graphics taking up an absurd amount of it?

Hace
Feb 13, 2012

<<Mobius 1, Engage.>>
Well, they'd lose a lot of time and money.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
If the 380X has good price/performance all of a sudden far fewer people are going to care about efficiency.

Rastor
Jun 2, 2001

Zero VGS posted:

I'm sure this has been asked a thousand times here but what if Intel took an honest shot at a discrete graphics card, or just made a giant die CPU with integrated graphics taking up an absurd amount of it?
Intel seems focused on muscling into the low power / mobile game right now at huge expense, I doubt they want to spend even more money to also fight for a piece of the discrete graphics card pie.

It's not necessarily going to be "a giant die", but Intel Skylake is expected to be available with "GT4e" graphics featuring 72 Execution Units, which by some estimates will bring it up to GTX 750 / 750Ti levels of performance.

Lowen SoDium
Jun 5, 2003

Highen Fiber
Clapping Larry
So between G-Sync and Adaptivesync, is one of them technically superior to the other? I mean, they both do the same thing, right?

veedubfreak
Apr 2, 2005

by Smythe

Lowen SoDium posted:

So between G-Sync and Adaptivesync, is one of them technically superior to the other? I mean, they both do the same thing, right?

G-Sync is superior to Adaptivesync because Nvidia can charge 100 dollars more for the chip.

Gwaihir
Dec 8, 2009
Hair Elf

Zero VGS posted:

I'm sure this has been asked a thousand times here but what if Intel took an honest shot at a discrete graphics card, or just made a giant die CPU with integrated graphics taking up an absurd amount of it?

Intel processor graphics already take up considerably more space on the die than the CPU cores do.

SwissArmyDruid
Feb 14, 2014

by sebmojo

Lowen SoDium posted:

So between G-Sync and Adaptivesync, is one of them technically superior to the other? I mean, they both do the same thing, right?

G-Sync is technically superior. Since it has a frame buffer on the monitor side, it can continue to refresh the last image indefinitely. With AdaptiveSync there is no frame buffer, so if the framerate drops too low the display eventually HAS to grab a new frame, because the effect of sitting on a frame for too long is a gradual washing out (on the scale of dozens of milliseconds) to white. Left unchecked, this would show up to the human eye as flickering, and that is worse than any kind of screen tearing. Therefore, yes, monitor makers will program their scalers so that if it really absolutely NEEDS it, an AdaptiveSync display WILL still tear before it flickers.

You can see an example of the tearing here: https://www.youtube.com/watch?v=hnBmjN-GVuw

Still better than no variable refresh rate, though. I suggest watching the entire video.
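
A rough sketch of the scaler decision being described here, in Python. The 25 ms hold limit is an assumed, illustrative number, and this is a toy model rather than any vendor's actual firmware:

```python
# Toy model of the monitor-side choice described above: a G-Sync-style module
# with its own frame buffer can silently re-scan the last frame, while a
# buffer-less Adaptive-Sync scaler must eventually pull a new frame from the
# GPU, tearing rather than letting the panel decay and flicker.

MAX_HOLD_MS = 25.0  # assumed limit before the LCD charge decays visibly


def gsync_like(new_frame_ready: bool, held_ms: float) -> str:
    """Module with an onboard frame buffer."""
    if new_frame_ready:
        return "scan new frame"
    if held_ms >= MAX_HOLD_MS:
        return "re-scan buffered frame"  # self-refresh: no tear, no flicker
    return "keep holding"


def adaptive_sync_like(new_frame_ready: bool, held_ms: float) -> str:
    """Scaler with no full frame buffer."""
    if new_frame_ready:
        return "scan new frame"
    if held_ms >= MAX_HOLD_MS:
        # Nothing stored to repeat, so grab whatever the GPU has right now:
        # visible as a tear, which is still preferable to flicker.
        return "force refresh from GPU (may tear)"
    return "keep holding"


if __name__ == "__main__":
    for held in (5.0, 30.0):
        print(held, gsync_like(False, held), "|", adaptive_sync_like(False, held))
```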

veedubfreak
Apr 2, 2005

by Smythe

Gwaihir posted:

Intel processor graphics already take up considerably more space on the die than the CPU cores do.



Imagine how awesome cpus would be if you could buy them without all the gpu bullshit on them.

SwissArmyDruid
Feb 14, 2014

by sebmojo

veedubfreak posted:

Imagine how awesome cpus would be if you could buy them without all the gpu bullshit on them.

You can. They use X99 motherboards, and cost an arm, leg, and spleen.

Rastor
Jun 2, 2001

veedubfreak posted:

Imagine how awesome cpus would be if you could buy them without all the gpu bullshit on them.
Yes, just imagine.

veedubfreak
Apr 2, 2005

by Smythe
Right right, server cpus. Forgot about those as I don't actually interact with em.

EoRaptor
Sep 13, 2003

by Fluffdaddy

jkyuusai posted:

Just to verify, you do realize that the laptop didn't work in 100% of the cases where a display with the GSync module has been shown to work, yes? There are issues with flickering, and sometimes the display blanking altogether, at low frame rates. These issues are reproducible on the ROG Swift, which has the GSync module, but it's noted that they're much less severe.

Link

I'm going to speculate, but I'd guess those situations are either when a vblank doesn't arrive before some other hardware 'timer' runs out and forces the panel to refresh without any valid data, or arrives when the panel controller cannot accept it.

The first is probably software fixable, with a maximum time between vblanks being set. The second is trickier, and probably comes about from the panel getting a vblank right when it is doing something else that it cannot interrupt, and the result essentially resets the panel to 'empty' until the next frame arrives. This might not be software fixable, but improved controller design+firmware can probably address it. A minimum time between vblanks or between end of data and the vblank might help.

The GSync hardware module had a bunch of onboard cache RAM, probably to allow it to hold enough data to avoid a no-data situation. Since it still exhibited the problem sometimes, it can be argued that the fix is trickier to get right than has been publicly talked about.
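
A minimal sketch of the driver-side guard speculated about above; the min/max vblank intervals are made-up illustrative limits, not real panel specs:

```python
# Hypothetical guard on the GPU/driver side: keep the time between vblanks
# inside the window the panel controller can tolerate. The limits below are
# illustrative assumptions, not real panel specifications.

MIN_VBLANK_MS = 7.0    # assume the panel can't accept frames much faster than ~144 Hz
MAX_VBLANK_MS = 33.3   # assume it must be refreshed at least every ~30 Hz


def schedule_vblank(ms_since_last: float, frame_ready: bool):
    """Return (action, delay_ms) for the next refresh sent to the panel."""
    if frame_ready:
        if ms_since_last < MIN_VBLANK_MS:
            # Frame finished too soon: hold it so the vblank never lands while
            # the controller is still busy with the previous scanout.
            return ("send frame", MIN_VBLANK_MS - ms_since_last)
        return ("send frame", 0.0)
    if ms_since_last >= MAX_VBLANK_MS:
        # No new frame in time: resend the previous frame rather than let the
        # panel's own timer expire with no valid data (the blank/flicker case).
        return ("repeat previous frame", 0.0)
    return ("wait", 0.0)


if __name__ == "__main__":
    print(schedule_vblank(3.0, True))    # ('send frame', 4.0)
    print(schedule_vblank(40.0, False))  # ('repeat previous frame', 0.0)
```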

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

SwissArmyDruid posted:

G-Sync is technically superior. Since it has a frame buffer on the monitor side, it can continue to refresh the last image indefinitely. With AdaptiveSync there is no frame buffer, so if the framerate drops too low the display eventually HAS to grab a new frame, because the effect of sitting on a frame for too long is a gradual washing out (on the scale of dozens of milliseconds) to white. Left unchecked, this would show up to the human eye as flickering, and that is worse than any kind of screen tearing. Therefore, yes, monitor makers will program their scalers so that if it really absolutely NEEDS it, an AdaptiveSync display WILL still tear before it flickers.

You can see an example of the tearing here: https://www.youtube.com/watch?v=hnBmjN-GVuw

Still better than no variable refresh rate, though. I suggest watching the entire video.

The tablet I'm using right now has a tiny framebuffer DRAM chip hooked to the screen to enable Panel Self-Refresh. DisplayPort: Pretty much already does everything G-Sync does.

sauer kraut
Oct 2, 2004

Rastor posted:

Intel seems focused on muscling into the low power / mobile game right now at huge expense, I doubt they want to spend even more money to also fight for a piece of the discrete graphics card pie.

It's not necessarily going to be "a giant die", but Intel Skylake is expected to be available with "GT4e" graphics featuring 72 Execution Units, which by some estimates will bring it up to GTX 750 / 750Ti levels of performance.

It's gonna be interesting to see how well GT4e/128MB cache (insert XBone joke here) fares with decent DDR4. The sub-$150 graphics card segment could just disappear.

VP9 hardware support is also pretty nifty. Will that work for playing YouTube videos in Chrome? I spend so many hours a day doing that.

SwissArmyDruid
Feb 14, 2014

by sebmojo

sauer kraut posted:

It's gonna be interesting to see how well GT4e/128MB cache (insert XBone joke here) fares with decent DDR4. The sub-$150 graphics card segment could just disappear.

VP9 hardware support is also pretty nifty. Will that work for playing YouTube videos in Chrome? I spend so many hours a day doing that.

I foresee AMD proceeding to take that segment right back away from Intel with the release of their first HBM products. For their Iris Pro products, Intel embeds 128 MB of eDRAM on the same package, shared, I believe, between graphics and victim-cache duty for the L3 (technically making it L4 cache). This makes it faster than system memory because it's still right next to the die.

With HBM, since it's stacked on top of its memory controller, I think there's the possibility that we could see a single added layer of silicon on top of the processor itself (I believe first-gen HBM has a capacity of 256 MB per layer?), shared between graphics and L3.

This is very exciting news when you consider that AMD's APUs are hitting roadblocks because of how bandwidth-and-latency starved they are with DDR3, forcing them to limit themselves to just eight compute units in Kaveri.

Add in the work that they're already doing with their whole OpenCompute and HSA, and things REALLY get interesting when you can access your L3/graphics memory right there on-die.
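
For a sense of the bandwidth gap behind that argument, a back-of-the-envelope comparison using ballpark public figures (DDR3-2133 dual channel vs. the commonly cited 128 GB/s first-gen HBM stack):

```python
# Rough peak-bandwidth comparison underlying the "bandwidth-starved APU" point.
# Figures are approximate: DDR3-2133 at 8 bytes per channel, dual channel, vs.
# a 1024-bit, ~1 Gbps-per-pin first-generation HBM stack.

def ddr3_dual_channel_gb_s(mt_per_s: int = 2133, bus_bytes: int = 8,
                           channels: int = 2) -> float:
    """Peak bandwidth of a conventional dual-channel DDR3 setup, in GB/s."""
    return mt_per_s * bus_bytes * channels / 1000


def hbm1_stack_gb_s(bus_bits: int = 1024, gbps_per_pin: float = 1.0) -> float:
    """Peak bandwidth of a single first-gen HBM stack, in GB/s."""
    return bus_bits * gbps_per_pin / 8


print(f"DDR3-2133 dual channel: ~{ddr3_dual_channel_gb_s():.0f} GB/s")  # ~34 GB/s
print(f"One HBM1 stack:         ~{hbm1_stack_gb_s():.0f} GB/s")         # ~128 GB/s
```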

Darkpriest667
Feb 2, 2015

I'm sorry I impugned
your cocksmanship.

veedubfreak posted:

I could give a poo poo if the new 390 pulls 500 watts as long as it is the fastest card I can get my hands on. Why do people act like 100W is this giant deal? It's 1 loving light bulb. It might cost you an extra 50 cents a month in electricity. Also, gently caress Nvidia for being giant lovely assholes that overcharge for everything. The last Nvidia card I actually had for more than a month was my 690.

And for the record, from my personal actual use of the 980, 970, and 290: my 290 was the easiest card to actually work with and had the best performance at 11.5 million pixels.

Let me straighten you out some. For those of us that fold 24/7 on our rigs (for science, my friend, not for porn), a 300W difference is a huge loving difference in your power bill. Also, dissipating heat is a big loving deal when you are running everything at load 24/7. I'm glad you're casual and turn your computer on twice a day and use maybe 4% of its actual computational capabilities. I use 100%. 300W 24/7 matters.


EDIT--- Run this for a month or two straight with a 500W GPU and let me know how you feel about it then.

http://folding.stanford.edu/

Darkpriest667 fucked around with this message at 23:45 on Feb 4, 2015

craig588
Nov 19, 2005

by Nyc_Tattoo
If it costs too much to donate computing power, just stop donating.

Kazinsal
Dec 13, 2011

Darkpriest667 posted:

I'm glad you're casual and turn your computer on twice a day and use maybe 4% of its actual computational capabilities.

Jesus, and I thought I was a loving sperg.

:frogout:


e: 500 W * 24 h/day = 12 kWh/day; 12 kWh/day * 30 days = 360 kWh/month; 360 kWh * $0.10/kWh (average US electricity cost) = $36/month

If you can't afford your $36/month in power for the good of the world through your charitable number-crunching donations then how the gently caress could you afford a loving theoretical 500W GPU?

Kazinsal fucked around with this message at 00:06 on Feb 5, 2015
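
The same arithmetic, parameterized so you can plug in your own wattage and electricity rate:

```python
# Kazinsal's arithmetic, generalized: continuous load (W) -> monthly cost ($).

def monthly_power_cost(watts: float, dollars_per_kwh: float = 0.10,
                       hours_per_day: float = 24, days: float = 30) -> float:
    """Electricity cost of running a constant load for a month."""
    kwh = watts / 1000 * hours_per_day * days
    return kwh * dollars_per_kwh


print(monthly_power_cost(500))        # ~36.0 -> the $36/month figure above
print(monthly_power_cost(300, 0.30))  # ~64.8 -> a 300 W delta at $0.30/kWh
```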

sauer kraut
Oct 2, 2004
Going for quality of life/silent operation and strict TDP limits like Intel does for CPUs was absolutely the correct choice.
If AMD goes full steam ahead with their Pirate Islands stuff, they're gonna look like fossils.

Ham Sandwiches
Jul 7, 2000

Darkpriest667 posted:

Let me straighten you out some. For those of us that fold 24/7 on our rigs (for science, my friend, not for porn), a 300W difference is a huge loving difference in your power bill. Also, dissipating heat is a big loving deal when you are running everything at load 24/7. I'm glad you're casual and turn your computer on twice a day and use maybe 4% of its actual computational capabilities. I use 100%. 300W 24/7 matters.

Hey guys, I choose to do this voluntary thing, but I'm really sanctimonious about it...

Darkpriest667
Feb 2, 2015

I'm sorry I impugned
your cocksmanship.
Hey guys, the point is wattage matters when you're doing that kind of stuff. So does heat. I'm sorry that isn't clear or relevant to you, but to a lot of folks it is. If AMD can get back into the game and dominate with power AND efficiency like Nvidia has, I'll be the first in line to buy one.

Kazinsal posted:

Jesus, and I thought I was a loving sperg.

:frogout:


e: 500 W * 24 h/day = 12 kWh/day; 12 kWh/day * 30 days = 360 kWh/month; 360 kWh * $0.10/kWh (average US electricity cost) = $36/month

If you can't afford your $36/month in power for the good of the world through your charitable number-crunching donations then how the gently caress could you afford a loving theoretical 500W GPU?


Yeah, exactly, you've just made my point. That money saved is a new GPU every year.

Kazinsal
Dec 13, 2011

Darkpriest667 posted:

Yeah, exactly, you've just made my point. That money saved is a new GPU every year.

No no, you've totally missed my point.

If that kind of money is prohibitively expensive for you to put towards the good of mankind, you probably need to stop spending money on GPUs and Folding@Home and put in a few hours of overtime at work per month.


wolrah
May 8, 2006
what?

Kazinsal posted:

No no, you've totally missed my point.

If that kind of money is prohibitively expensive for you to put towards the good of mankind, you probably need to stop spending money on GPUs and Folding@Home and put in a few hours of overtime at work per month.

There is of course the additional cost (assuming you live in a place with a summer) of cooling the building six months a year, but I agree with your general point since the power cost is about the same as a bar trip with a decent tip. I don't think the cooling cost could be any more than that and it's probably a lot less.
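
That hunch checks out on a napkin: assuming the heat is removed by an air conditioner with a coefficient of performance around 3 (an assumed figure, and only during cooling season), the overhead is roughly a third of the direct power cost:

```python
# Rough cooling overhead, assuming an AC with COP ~3 (i.e. ~3 W of heat moved
# per W of electricity). The COP and 24/7 duty cycle are assumptions, not
# measurements.

def cooling_overhead_cost(heat_watts: float, dollars_per_kwh: float = 0.10,
                          cop: float = 3.0, hours_per_day: float = 24,
                          days: float = 30) -> float:
    """Extra electricity cost of pumping the GPU's waste heat back outside."""
    kwh_of_heat = heat_watts / 1000 * hours_per_day * days
    return kwh_of_heat / cop * dollars_per_kwh


print(cooling_overhead_cost(500))  # ~12.0 -> about $12 on top of the $36/month
```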
