lDDQD
Apr 16, 2006

Cinara posted:

The biggest reason for the huge FPS jump in DOOM is that AMD's OpenGL support was godawful and had huge CPU overhead. Vulkan cut the CPU overhead way down for everything, which ends up being a large boost for AMD cards but also a large boost for anything NVIDIA + lovely CPU. The link Beautiful Ninja posted above shows a great example, where a Titan X + i7-920 gets double the FPS under Vulkan.

Dammit. Like a fool, I've invested money into a better CPU, thinking that a 1st-generation i7, despite being clocked at 4.2GHz, is just not enough CPU to drive my 980ti...
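
(The overhead gap Cinara describes comes down to where per-draw work happens. Below is a minimal, illustrative fragment rather than a runnable renderer: it assumes an already-initialized GL context on one side and already-created Vulkan handles and arrays (cmd, pipeline, queue, begin_info, submit_info, textures[], first_vertex[], vertex_count[]) on the other. The API calls themselves are real.)

code:

/* OpenGL-style submission: the driver re-validates state on the CPU
 * for every draw, every frame. Thousands of draws on a slow CPU
 * (say, an i7-920) and this loop becomes the bottleneck. */
for (int i = 0; i < draw_count; i++) {
    glBindTexture(GL_TEXTURE_2D, textures[i]);
    glDrawArrays(GL_TRIANGLES, first_vertex[i], vertex_count[i]);
}

/* Vulkan: record the same draws into a command buffer once... */
vkBeginCommandBuffer(cmd, &begin_info);
vkCmdBindPipeline(cmd, VK_PIPELINE_BIND_POINT_GRAPHICS, pipeline);
for (uint32_t i = 0; i < draw_count; i++)
    vkCmdDraw(cmd, vertex_count[i], 1, first_vertex[i], 0);
vkEndCommandBuffer(cmd);

/* ...then resubmitting it each frame is nearly free on the CPU,
 * which is why a weak CPU gains the most from the switch. */
vkQueueSubmit(queue, 1, &submit_info, fence);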


Gwaihir
Dec 8, 2009
Hair Elf

lDDQD posted:

Dammit. Like a fool, I've invested money into a better CPU, thinking that a 1st-generation i7, despite being clocked at 4.2GHz, is just not enough CPU to drive my 980ti...

On the other hand: you still have a 980ti, a very, very good card.

penus penus penus
Nov 9, 2014

by piss__donald

lDDQD posted:

Dammit. Like a fool, I've invested money into a better CPU, thinking that a 1st-generation i7, despite being clocked at 4.2GHz, is just not enough CPU to drive my 980ti...

It's still better lol

Hubis
May 18, 2003

Boy, I wish we had one of those doomsday machines...

sauer kraut posted:

It's just a catchy name for OpenGL 5 to run away from decades of clutter, proprietary hacks and bad word of mouth.
How many studios are gonna use AMD's donated code is impossible to know. So far only 2 Gaming Evolved games (Warhammer and that crappy RTS) use the fabled async, and you can bet your rear end they had engineers on site.

And Hitman 2016 (I believe).

Hieronymous Alloy
Jan 30, 2009


Why! Why!! Why must you refuse to accept that Dr. Hieronymous Alloy's Genetically Enhanced Cream Corn Is Superior to the Leading Brand on the Market!?!




Morbid Hound

Hubis posted:

And Hitman 2016 (I believe).

Deus Ex: MD is also coming in August.

PunkBoy
Aug 22, 2008

You wanna get through this?
Is the only real difference between this:

http://www.newegg.com/Product/Product.aspx?Item=N82E16814487248

and this:

http://www.newegg.com/Product/Product.aspx?Item=N82E16814487249

clock speeds? I'm willing to pay the $20 since it's finally in stock.

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

PunkBoy posted:

Is the only real difference between this:

http://www.newegg.com/Product/Product.aspx?Item=N82E16814487248

and this:

http://www.newegg.com/Product/Product.aspx?Item=N82E16814487249

clock speeds? I'm willing to pay the $20 since it's finally in stock.

Pretty much, and there's no real reason to pay more for an OC'd version of the same card when all GTX 10 series cards OC about the same, but if it's in stock, go for it.

PunkBoy
Aug 22, 2008

You wanna get through this?

Beautiful Ninja posted:

Pretty much, and there's no real reason to pay more for an OC'd version of the same card when all GTX 10 series cards OC about the same, but if it's in stock, go for it.

Awesome, thanks!

Anti-Hero
Feb 26, 2004

Hieronymous Alloy posted:

Deus Ex: MD is also coming in August.

On that note, since it's a cross-platform game, I wonder how much this will really matter. I missed DXHR on release; did it push the technical envelope at all?

I'm perfectly happy with my 980ti, but my brother is thinking of upgrading his aging GTX460 and this would be a nice opportunity to hand down the 980ti for a GTX1080...

Hieronymous Alloy
Jan 30, 2009


Why! Why!! Why must you refuse to accept that Dr. Hieronymous Alloy's Genetically Enhanced Cream Corn Is Superior to the Leading Brand on the Market!?!




Morbid Hound

Anti-Hero posted:

On that note, since it's a cross-platform game, I wonder how much this will really matter. I missed DXHR on release; did it push the technical envelope at all?

I'm perfectly happy with my 980ti, but my brother is thinking of upgrading his aging GTX460 and this would be a nice opportunity to hand down the 980ti for a GTX1080...

All indications are that it's basically using a modded Hitman engine, so most likely it'll matter as much as it does for Hitman and no more.

Drakhoran
Oct 21, 2012

sauer kraut posted:

It's just a catchy name for OpenGL 5 to run away from decades of clutter, proprietary hacks and bad word of mouth.
How many studios are gonna use AMD's donated code is impossible to know. So far only 2 Gaming Evolved games (Warhammer and that crappy RTS) use the fabled async, and you can bet your rear end they had engineers on site.

Rise of the Tomb Raider just got an asynchronous DX12 update, and that's a The Way It's Meant to Be Played game.

penus penus penus
Nov 9, 2014

by piss__donald

Drakhoran posted:

Rise of the Tomb Raider just got an asynchronous DX12 update, and that's a The Way It's Meant to Be Played game.

Deep within the furthest subterranean level of the NVidia Spire, a green light flickers on in the dark. "Drakhoran... noted." Then it flicks off, returning to its slumber state, which is powered by Pascal at a TDP unknown to their bitter enemy.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
On the other hand, NVIDIA cards don't burn down your computer. Always a plus.

sauer kraut
Oct 2, 2004
Yeah, Nixxes is the one company I'd trust to pull it off without AMD's help at this point.

computer parts
Nov 18, 2010

PLEASE CLAP

Paul MaudDib posted:

On the other hand, NVIDIA cards don't burn down your computer. Always a plus.

Outside of the year 2010 anyway.

PerrineClostermann
Dec 15, 2012

by FactsAreUseless

Paul MaudDib posted:

On the other hand, NVIDIA cards don't burn down your computer. Always a plus.

Craptacular!
Jul 9, 2001

Fuck the DH
I don't understand why grown-ups who look down on, say, console wars feel this need to turn graphics card manufacturers into heroes and villains.

I enjoy laughing at the idea of RX480 fires, but I'm not going to pretend that AMD is Yugo; nor is NVidia an evil empire because they want you to pay more for what they believe, and consumers agree, is a premium product.

At the end of the day, you're the guy who cares enough about framerates to spend 10% more and get 5% more frames.


I had a Galaxy 460 before my current 660, and I think it was a Fermi. I don't remember it being absurdly hot.

Craptacular! fucked around with this message at 04:48 on Jul 12, 2016

penus penus penus
Nov 9, 2014

by piss__donald
Sorry guys I'm just super high

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

THE DOG HOUSE posted:

Sorry guys I'm just super high

So were the GTX 480 engineers when they decided it was OK to ship a ~300W GPU

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

SwissArmyDruid posted:

Vulkan *has* to be biased towards AMD, by virtue of being built upon Mantle's bones, right? Isn't that to be expected? Or is the delta just way too large?

I'm not seeing bias; did you even see those results from the guy with the Nehalem + Titan X?
It's been long documented that AMD's drivers have more CPU overhead. Remove some of that, and clearly AMD has more to gain.

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️

HMS Boromir posted:

Yeah, barring unexpected developments, AdaptiveSync/FreeSync should slowly become a standard feature for monitors. The only reason it feels like it's a matter of "GPU-specific monitors" is that nVidia decided it wanted its own special snowflake version of adaptive sync, which carries a ridiculous price premium for no particularly good reason.

Unfortunately, nVidia has really good marketing and AMD isn't beating them on product quality right now either, so it's unlikely that the upcoming ubiquity of FreeSync is going to put any pressure on them. Adaptive sync tribalism is probably here to stay for a good while, so while FreeSync isn't particularly a money-to-blow feature right now and will be even less of one in the future, you'll still probably have to hitch your wagon to AMD to get to use it.

It'd be cool (though probably even more of a blow to poor AMD) if, once Ice Lake comes around, Intel drops a few APU-style SKUs with uncharacteristically beefy iGPUs that can actually run new games at low-to-medium settings at 1080p, with the sync picking up the slack to make it feel smooth.

You are vastly overestimating the impact of the whateversync monitor market, and vastly underestimating both the number of gamers who would rather just buy a better GPU and cap framerate at 60 than deal with it, and the sheer penny-pinching cost-cutting of display manufacturers/OEMs.

It's been almost a decade since the Q6600 and quad cores still haven't outnumbered duals in the Steam gaming demographic, so good luck weaning gamers off their crappy TN/VGA monitors, let alone the even slower-to-adapt segment, which is everybody else.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Craptacular! posted:

I had a Galaxy 460 before my current 660, and I think it was a Fermi. I don't remember it being absurdly hot.

I think most people are referring to the 480: http://www.techspot.com/review/263-nvidia-geforce-gtx-480/page13.html

techspot posted:

Despite being a single GPU graphics card it used slightly more power than the dual-GPU Radeon HD 5970 at both idle and load, and was comparable to a pair of Radeon HD 5870 graphics cards in Crossfire mode.

When it came time to remove the GTX 480 from our test system, I had to wear a pair of gloves to avoid any possible burns or dropping the extremely hot graphics card. The PCI Express power cables were amazingly soft from all the heat that had been thrown at them.

PerrineClostermann
Dec 15, 2012

by FactsAreUseless

Craptacular! posted:


I had a Galaxy 460 before my current 660, and I think it was a Fermi. I don't remember it being absurdly hot.

Where do you think this comic came from?

Desuwa
Jun 2, 2011

I'm telling my mommy. That pubbie doesn't do video games right!
Well, got my MSI Gaming X 1080; it seems to have done pretty well in the lottery, but I really wish nvidia would fix their loving idle clock bug already.

My 2139MHz overclock with 100% fans, which appeared to be stable in 3D, translated into 1430 in 2D mode, which was not stable and caused twitches or flashes in Firefox. Backing it down to 2114, or 1405 in 2D mode, fixed that. It's possible 2139 wasn't as stable as I thought, but I'll never get to know.

There's no way to force GPU Boost to one bin, is there? Or redefine the temperature ranges? Having a single degree Celsius of difference from tonight's high ambient temperature drop me down a boost bin is annoying, especially when I know the card was stable in 3D at those slightly higher temperatures with a higher overclock.


At least MSI's cooler is nice and quiet, given the premium they charge; at 100% fans it's roughly comparable to the G1 Gaming 980Ti I had at 70% fans. At 70% I can barely tell it's there without headphones, but 70% fans just isn't enough to hold the card at 2114 indefinitely tonight. I think if I ran the AC or opened up my case more it'd be fine. If I water cooled it'd be perfectly stable in the 2139 or 2126 boost bin, but I can't keep it there on air.
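
(Note the arithmetic in Desuwa's numbers: the 25MHz drop in 3D, 2139 to 2114, shows up as the same 25MHz drop in 2D, 1430 to 1405, because the overclock is an offset shifting every clock state together. If you want to watch the bin-stepping happen as the card warms up, NVML, the same library nvidia-smi is built on, exposes clocks and temperature. A minimal sketch; build with cc watch_boost.c -o watch_boost -lnvidia-ml:)

code:

/* watch_boost.c - poll graphics clock and core temperature via NVML
 * so you can see the card step down a boost bin (roughly 13MHz per
 * bin) as temperature crosses each threshold. */
#include <stdio.h>
#include <unistd.h>
#include <nvml.h>

int main(void)
{
    nvmlDevice_t dev;
    unsigned int clock_mhz, temp_c;

    if (nvmlInit() != NVML_SUCCESS) return 1;
    if (nvmlDeviceGetHandleByIndex(0, &dev) != NVML_SUCCESS) return 1;

    for (int i = 0; i < 600; i++) {       /* ~10 minutes of samples */
        nvmlDeviceGetClockInfo(dev, NVML_CLOCK_GRAPHICS, &clock_mhz);
        nvmlDeviceGetTemperature(dev, NVML_TEMPERATURE_GPU, &temp_c);
        printf("%u MHz @ %u C\n", clock_mhz, temp_c);
        sleep(1);
    }

    nvmlShutdown();
    return 0;
}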

Deathreaper
Mar 27, 2010
Thinking of getting a Fury X over a GTX 1070 for my 2560x1600 60Hz monitor. The Fury X is slightly cheaper, available, and seems to have improved substantially in benchmarks since its initial release. Plus DX12/Vulkan with async is benefiting AMD cards nicely. The only thing I'm worried about is the 4GB of VRAM. It would be replacing 2 R9 290s, which are just too hot/noisy in CF, and I don't want to deal with CF anymore. What do you guys think?

PerrineClostermann
Dec 15, 2012

by FactsAreUseless
Has anyone encountered this problem? Shadowplay is recording at the wrong resolution. My game is set to use DSR at something like 3414x1440, which is a multiple of my native resolution, 2560x1080. It's running in fullscreen. Shadowplay is set to record at the in-game resolution. The video output is in 1680x1050, which is the resolution of one of my secondary monitors.

How do I get it to not be retarded?

bull3964
Nov 18, 2000

DO YOU HEAR THAT? THAT'S THE SOUND OF ME PATTING MYSELF ON THE BACK.


Vulkan trip report on Doom with my GTX 1080 (at 4k).

FPS seems to be locked at 60 just like on OpenGL, but it doesn't feel as smooth. Something about it just isn't quite right. Loading times went up a ton with it too.

It does seem more stable though.

Craptacular!
Jul 9, 2001

Fuck the DH

PerrineClostermann posted:

Where do you think this comic came from?

There needs to be a reversed version of that for today.

Regardless, I love that Intel made a tiny amount of progress in the first two panels. :3:

Evil Fluffy
Jul 13, 2009

Scholars are some of the most pompous and pedantic people I've ever had the joy of meeting.

Twerk from Home posted:

He's not alone; I found DOOM kinda crashy on a 2500K / R9-290.

Edit: After thinking about it more, really crashy, and maybe DOOM is just finding bad memory or unstable overclocks for me? Maybe I'm due for a memtest and OC stability test; that 2500K is getting long in the tooth, and the 290 that used to be OK at 1050MHz might want to be slower nowadays.

Are you trying to run the game with some settings set to Nightmare? I've heard that the game can crash due to memory issues (either GPU or system) when some of the graphics settings are cranked fully.

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

Deathreaper posted:

Thinking of getting a Fury X over a GTX 1070 for my 2560x1600 60Hz monitor. The Fury X is slightly cheaper, available, and seems to have improved substantially in benchmarks since its initial release. Plus DX12/Vulkan with async is benefiting AMD cards nicely. The only thing I'm worried about is the 4GB of VRAM. It would be replacing 2 R9 290s, which are just too hot/noisy in CF, and I don't want to deal with CF anymore. What do you guys think?

Unfortunately the 4GB of VRAM will really hold things back at anything over 1080p.

LiquidRain
May 21, 2007

Watch the madness!

bull3964 posted:

Vulkan trip report on Doom with my GTX 1080 (at 4k).

FPS seems to be locked at 60 just like on OpenGL, but it doesn't feel as smooth. Something about it just isn't quite right. Loading times went up a ton with it too.

It does seem more stable though.

Had this problem with my GTX 970. Capped at 60 and something just felt off, even with G-Sync. Frame pacing is just wild.
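
(What bull3964 and LiquidRain are describing is frame pacing: the counter can report a steady 60 FPS while the individual frame-to-frame deltas are uneven, and it's the deltas you feel. A minimal sketch of how you'd check; swap_buffers() is a hypothetical stand-in for whatever present call the engine actually uses, so this is a fragment, not a complete program:)

code:

/* frame_pace.c - a 60 FPS average can hide bad frame pacing;
 * log the frame-to-frame deltas instead. */
#include <stdio.h>
#include <time.h>

extern void swap_buffers(void);   /* hypothetical present call */

static double now_ms(void)
{
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return ts.tv_sec * 1000.0 + ts.tv_nsec / 1e6;
}

void frame_loop(void)
{
    double prev = now_ms();
    for (;;) {
        swap_buffers();
        double t = now_ms();
        double delta = t - prev;
        prev = t;
        /* A locked 60 FPS should mean ~16.7ms every frame. A 12ms
         * frame followed by a 21ms frame still averages 60 FPS but
         * reads as stutter. */
        if (delta < 14.0 || delta > 19.0)
            fprintf(stderr, "pacing spike: %.1f ms\n", delta);
    }
}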

unpronounceable
Apr 4, 2010

You mean we still have another game to go through?!
Fallen Rib
How accurate is videocardz.com for news or rumours? I saw this article, which supposedly has the specs for the 470 and 460. It also offhandedly mentions that a source says the 470 will launch at the end of July and become available in early August.

Endymion FRS MK1
Oct 29, 2011

I don't know what this thing is, and I don't care. I'm just tired of seeing your stupid newbie av from 2011.

PerrineClostermann posted:

Where do you think this comic came from?



All the times I've seen that comic, I never noticed the Intel turtle until now.

And now I see it slowly moves! :3:

Craptacular!
Jul 9, 2001

Fuck the DH
Pascal VS Polaris

japtor
Oct 28, 2005

sauer kraut posted:

It's just a catchy name for OpenGL 5 to run away from decades of clutter, proprietary hacks and bad word of mouth.
Not exactly, barring changes in plans in the last year, I guess. OpenGL is (was?) supposed to continue as a higher-level API... and reduce its overhead to end up comparable to the low-level stuff like Vulkan and whatnot? v:shobon:v

exquisite tea
Apr 21, 2007

Carly shook her glass, willing the ice to melt. "You still haven't told me what the mission is."

She leaned forward. "We are going to assassinate the bad men of Hollywood."


I got some pretty significant performance improvements for Tomb Raider on my 980ti with the new Steam update + most recent Nvidia drivers. Typical performance drop areas like Geothermal Valley don't seem to hit my framerate as hard running DX12.

Pikey
Dec 25, 2004
Crossposting from SA Mart since it seems like a lot of cards are still on backorder:

Looking to sell my barely used GTX 1070, product details on the Newegg page here. Picked it up when rebuilding my PC and realized it's much more than I really need. Used for approximately 1 week.

I can post pictures from home later in the morning.

Looking to recoup cost, $410 shipped to the lower 48.

I don't have PM, but can email questions to danames22 at gmail

Ak Gara
Jul 29, 2005

That's just the way he rolls.
Thinking about that Zotac GTX 1080 AMP! Extreme some more: if its fans TURN OFF when under a benchmark program, you could set them to manual 20% and forget about it? That's loving insane.

Desuwa
Jun 2, 2011

I'm telling my mommy. That pubbie doesn't do video games right!
I forgot how much I hate DisplayPort's loving hotplug detection. It's part of the spec: if you turn a monitor off, it's treated as if it's been unplugged. I like to turn my secondary monitors off when watching something on my main screen, but with DisplayPort that causes the desktop to flicker and resize every time. Windows avoids doing this for the primary monitor, at least, or my monitor doesn't follow the worst parts of the spec.

The HDMI ports on my monitors are already full, so I need three DVI connections. On my GTX 1080 I've got the DVI port for one, I can use an HDMI-to-DVI cable for another monitor, and for the last monitor I'll try a DP->DVI cable, but I'm not too confident in that. It might end up being a DP -> HDMI -> EDID ghost (because they don't make these for DP, at least not yet) -> DVI Frankenstein chain.

I hate computers. I wish I could blame this on Windows, but Mac and Linux have the exact same behaviour without any software fix, though you can work around it with nvidia's workstation cards. It's like they never thought people might use more than one monitor.

At least 1200p60 is in the range of passive DP to HDMI/DVI converters.
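
(That 1200p60 ceiling checks out arithmetically: single-link DVI and early HDMI TMDS top out at a 165MHz pixel clock, and 1920x1200@60 with CVT reduced blanking needs about 154MHz, so it just squeaks under. A quick sanity check; the 2080x1235 total raster is the standard CVT-RB figure for 1920x1200, taken here as an assumption:)

code:

/* dvi_check.c - does a mode fit under single-link DVI's 165MHz
 * pixel clock? 2080x1235 is the CVT-RB total raster (active plus
 * blanking) for 1920x1200 - treat it as an assumption. */
#include <stdio.h>

int main(void)
{
    const double single_link_mhz = 165.0;  /* TMDS limit per link */
    const int h_total = 2080, v_total = 1235;
    const double refresh_hz = 60.0;

    double pixel_clock_mhz = h_total * v_total * refresh_hz / 1e6;
    printf("1920x1200@60 needs %.1f MHz (limit %.0f MHz): %s\n",
           pixel_clock_mhz, single_link_mhz,
           pixel_clock_mhz <= single_link_mhz ? "fits" : "too fast");
    return 0;
}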


Truga
May 4, 2014
Lipstick Apathy
If you have a dell monitor, you can sometimes correct the dumb "off=unplugged" poo poo somewhere in the OSD settings.
