Truga
May 4, 2014
Lipstick Apathy

Craptacular! posted:

This seems like a "nobody cares" kind of thing. If you've got a hotshit 1080, you can obviously afford the premium. If you refuse to spend a dollar over $250 on a video card, you're probably hooked up to a 1080p TV anyway. It's like pointing out to a guy spending $95,000 on a car that a Viper costs $5,000 less than a Porsche.

Actually, a guy who refuses to spend over $250 on a GPU will notice a huge increase in his game quality by paying $20 extra for a FreeSync model over a non-sync one, whereas the guy who spent $1000 on a Titan P won't really notice much, as his monitor is running at 144Hz constantly anyway.


spasticColon
Sep 22, 2004

In loving memory of Donald Pleasance

Phuzun posted:

There are some newer games that can't maintain 120 FPS on the GTX 1080 at 1080p (maxed). Either should be fine if you're okay with lowering details.

I just want "high" settings at 1080p. With quite a few games I've noticed that going from "high" to "ultra" settings, I don't see much of a difference myself other than the framerate shitting the bed.

Craptacular!
Jul 9, 2001

Fuck the DH

Truga posted:

Actually, a guy who refuses to spend over $250 on a GPU will notice a huge increase in his game quality by paying $20 extra for a FreeSync model over a non-sync one, whereas the guy who spent $1000 on a Titan P won't really notice much, as his monitor is running at 144Hz constantly anyway.

I guess I see these GPU-specific monitors as niche products for people with money to blow.

I admit I'm strange, though: I bought a fan-fucking-tastic 1080p TV waaaay back in 2009 and have been using that ever since, and I just can't find a classic monitor solution because I love integrated sound too much. I threw away two different 4.1 sound systems from the 2000s just a few weeks ago, and I don't ever, ever, ever want anything other than a stereo speaker under my display ever again. Positional sound was interesting when I was playing Quake 3, but these days I actually enjoy keeping it simple.

Craptacular! fucked around with this message at 08:25 on Jul 11, 2016

SwissArmyDruid
Feb 14, 2014

by sebmojo
A reminder that while Intel has signed on to AdaptiveSync (the VESA version of FreeSync) and has committed to making their iGPUs use the tech, the earliest that can conceivably arrive is 2018 with Ice Lake. (If there's anything that will drive FreeSync adoption rates and refinement, it would be Intel pushing it mainstream by virtue of market share and volume.)


VVVVVV

I got booted from the internet before I could finish editing my comment. TL;DR, in the end, all bow to Chipzilla.

SwissArmyDruid fucked around with this message at 08:56 on Jul 11, 2016

HMS Boromir
Jul 16, 2011

by Lowtax
Yeah, barring unexpected developments, AdaptiveSync/FreeSync should slowly become a standard feature for monitors. The only reason it feels like a matter of "GPU-specific monitors" is that nVidia decided it wanted its own special-snowflake version of adaptive sync, which carries a ridiculous price premium for no particularly good reason.

Unfortunately, nVidia has really good marketing and AMD isn't beating them on product quality either right now, so it's unlikely that the upcoming ubiquity of FreeSync is going to put any pressure on them. Adaptive sync tribalism is probably here to stay for a good while, so while FreeSync isn't exactly a money-to-blow feature right now and will be even less of one in the future, you'll still probably have to hitch your wagon to AMD to get to use it.

It'd be cool (though probably even more of a blow to poor AMD) if, once Ice Lake comes around, Intel drops a few APU-style SKUs with uncharacteristically beefy iGPUs that can actually run new games at low-medium settings at 1080p, with the sync picking up the slack to make it feel smooth.

HMS Boromir fucked around with this message at 08:56 on Jul 11, 2016

Truga
May 4, 2014
Lipstick Apathy

Craptacular! posted:

I guess I see these GPU-specific monitors as niche products for people with money to blow.

Sure, but so are $200 GPUs. The mainstream buys PS4s and plays LoL on Intel HD 4000 or on a Best Buy box with an R5 230 or a GTX 750. And while there's probably never going to be a G-Sync TV, there are already Korean TVs with FreeSync support. And at a $250 price point, the :20bux: you pay over the non-FreeSync version is going to give a shitload more smoothness to your game than an extra $20 of GPU ever could; I'd argue you'd need to splurge an extra $100 to get the same benefit there. If you're going to buy a monitor in the near future and you're buying a mid-range GPU right now, get a FreeSync one, IMO.

The only reason I'll probably never have *sync is that, hopefully, my monitor will last long enough for VR to reach a high enough resolution for me to ditch it. If VR weren't a thing, I'd probably already be looking for a new monitor (mine is 7-8 years old now). Adaptive sync is very, very good in practice, especially at lower frame rates, even if it sounds completely irrelevant on paper.

e: ofc if money is no object, go for an x34 gsync and 1080, $2000 right there :v:

Truga fucked around with this message at 09:11 on Jul 11, 2016
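
To see why adaptive sync matters most at low frame rates, consider what a fixed 60Hz display does to uneven frame times: with vsync, every frame is held until the next refresh boundary, so render-time jitter turns into visible judder, while an adaptive-sync panel simply refreshes when the frame is ready. A minimal Python sketch with invented frame times (a toy model, not a measurement):

code:
# Toy model: what a fixed 60 Hz vsync'd display shows for a game rendering
# with jittery ~20-35 ms frame times, vs. an adaptive sync panel that
# refreshes whenever the frame is ready. Numbers are invented for illustration.

REFRESH_MS = 1000 / 60                     # fixed 60 Hz scanout interval

render_ms = [20, 35, 21, 34, 22]           # per-frame render times in ms

def vsync_present(ms):
    """Hold the frame until the next vsync boundary after it finishes."""
    intervals = -(-ms // REFRESH_MS)       # ceiling division
    return intervals * REFRESH_MS

fixed = [vsync_present(t) for t in render_ms]
print("fixed 60 Hz :", [f"{t:.1f}" for t in fixed])       # 33.3 / 50.0 judder
print("adaptive    :", [f"{t:.1f}" for t in render_ms])   # tracks render time

On a 144Hz panel the quantization step is only ~6.9ms, which is exactly the point above: the Titan owner on a 144Hz monitor has far less to gain.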

NewFatMike
Jun 11, 2015

PerrineClostermann posted:

The way it's meant to be paid

:vince:

Craptacular!
Jul 9, 2001

Fuck the DH

Truga posted:

Sure, but so are $200 GPUs. Mainstream buys PS4s and plays lol on intel HD4000 or a best buy box with a R5 230 or a GTX 750.

I have a 750 equivalent (from 2013, when it was a $200 GPU), and my ability to feel mainstream died a little while ago when I saw how demanding GTA is, just about the most mainstream game there is.

Now that I watch TV over a LAN tuner and see that there are PC monitors with built-in speakers, I suppose I could buy a new monitor, but with the current ATI lineup I'd probably buy a FreeSync display, if I had the money to spend on one at all, then wait and see if their hardware ever gets better. Regardless, my point is that I'm not making any money and throwing irreplaceable cash at a video card, so this "you really should buy a display" business seems unnecessary to me. If I were to buy a display right now, it would mean going cardless.

I just want hot Overwatch framerates (though what I get now is acceptable) and all the little environmental details in GTA.

Craptacular! fucked around with this message at 09:23 on Jul 11, 2016

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
Yay, I have a GTX 1070 now.

However, is that power bug fixed yet? I'm driving two displays at 1440p and 120Hz, and my GPU and memory clocks went from 215MHz and 202.5MHz up to 800MHz and 2002MHz respectively. If I change one display to 60Hz, it goes back to the former. --edit: I guess not.

Combat Pretzel fucked around with this message at 12:18 on Jul 11, 2016
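
For anyone wanting to check whether their card shows the same behaviour, the clocks can be polled from a script. A small sketch assuming nvidia-smi is on the PATH (the query field names here are from contemporary drivers and may differ by version):

code:
# Poll current GPU core/memory clocks via nvidia-smi to spot the
# multi-monitor power bug: clocks pinned high (e.g. ~800/2002 MHz) while
# the desktop idles, instead of dropping to ~215/202 MHz.
import subprocess
import time

QUERY = [
    "nvidia-smi",
    "--query-gpu=clocks.gr,clocks.mem",   # current graphics/memory clocks
    "--format=csv,noheader",
]

for _ in range(5):
    print(subprocess.check_output(QUERY, text=True).strip())  # e.g. "810 MHz, 2002 MHz"
    time.sleep(2)                          # sample while the desktop sits idle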

GRINDCORE MEGGIDO
Feb 28, 1985


Combat Pretzel posted:

Yay, I have a GTX 1070 now.

However, is that power bug fixed yet? I'm driving two displays at 1440p and 120Hz, and my GPU and memory clocks went from 215MHz and 202.5MHz up to 800MHz and 2002MHz respectively. If I change one display to 60Hz, it goes back to the former. --edit: I guess not.

This is probably a really stupid question... but does the bug remain even if one of the displays is switched off?

I'm just wondering if having a Vive plugged in is gonna trigger it even if the Vive is switched off.

GRINDCORE MEGGIDO fucked around with this message at 12:47 on Jul 11, 2016

ASIC v Danny Bro
May 1, 2012

D&D: HASBARA SQUAD
CAPTAIN KILL


Just HEAPS of dead Palestinnos for brekkie, mate!
So I finally got my GTX 1080!

Had to remove the 750 Ti, which I've had for a few years (it only just started to die this week), and was able to grab the 1080 before the store closed... but I forgot to purchase the DVI cable. Oh well!

Fauxtool
Oct 21, 2008

by Jeffrey of YOSPOS

Truga posted:

Sure, but so are $200 GPUs. The mainstream buys PS4s and plays LoL on Intel HD 4000 or on a Best Buy box with an R5 230 or a GTX 750. And while there's probably never going to be a G-Sync TV, there are already Korean TVs with FreeSync support. And at a $250 price point, the :20bux: you pay over the non-FreeSync version is going to give a shitload more smoothness to your game than an extra $20 of GPU ever could; I'd argue you'd need to splurge an extra $100 to get the same benefit there. If you're going to buy a monitor in the near future and you're buying a mid-range GPU right now, get a FreeSync one, IMO.

The only reason I'll probably never have *sync is that, hopefully, my monitor will last long enough for VR to reach a high enough resolution for me to ditch it. If VR weren't a thing, I'd probably already be looking for a new monitor (mine is 7-8 years old now). Adaptive sync is very, very good in practice, especially at lower frame rates, even if it sounds completely irrelevant on paper.

e: ofc if money is no object, go for an x34 gsync and 1080, $2000 right there :v:

That TV seems awesome, but are there even any AMD products that push an average 60fps at 4K in demanding games without having to use medium settings?

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!

wipeout posted:

This is probably a really stupid question... but does the bug remain even if one of the displays is switched off?

I'm just wondering if having a Vive plugged in is gonna trigger it even if the Vive is switched off.
Display off short-term doesn't change a thing. I have to yank the cable for it to go into the proper power mode, or power it off with the physical power switch.

Maybe it would if I left it in standby long enough to enter deep sleep, but I haven't tried that yet.

Combat Pretzel fucked around with this message at 13:28 on Jul 11, 2016

CopperHound
Feb 14, 2012

Craptacular! posted:

I just can't find a classic monitor solution because I love integrated sound too much. I threw away two different 4.1 sound systems from the 2000s just a few weeks ago, and I don't ever, ever, ever want anything other than a stereo speaker under my display ever again. Positional sound was interesting when I was playing Quake 3, but these days I actually enjoy keeping it simple.
Is it so bad to bolt a sound bar to your monitor?

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Craptacular! posted:

I just can't find a classic monitor solution because I love integrated sound too much.

So, Dell monitors that support the Dell soundbar, then. They're actually pretty good, as opposed to the absolutely tiny speakers built into monitors.

Truga
May 4, 2014
Lipstick Apathy

Fauxtool posted:

That TV seems awesome, but are there even any AMD products that push an average 60fps at 4K in demanding games without having to use medium settings?

That TV actually sucks and is like a year old; I'm sure there are ones with way better sync ranges out there by now. I just linked the first Google hit for "freesync tv".

And, well, if you're willing to use a Sapphire Fury, you'll get decent FPS on high at 4K - just not in any good-looking game at ultra. And Fury eats power like hell.

You can couple a 4K telly with an RX 480, though, and play LoL or Overwatch at 4K and The Witcher etc. at 1080p. And FreeSync will make sure the gameplay stays quite smooth even when you drop below 50. I mean, we're talking about $200-range GPUs; nothing will do 4K@60 in all games yet, not even a GTX 1080.

Craptacular! posted:

Regardless, my point is that I'm not making any money and throwing irreplaceable cash at a video card, so this "you really should buy a display" business seems unnecessary to me.

Oh, certainly. I'm not saying you should buy a new screen because *sync is a must-have. Hell, I'm staying without it, because my monitor still works fine and I'm spending cash on VR instead. I'm just saying that if you're considering a monitor upgrade, *sync is too good to pass up IMO (:iiaca: it's like buying a new car in tyool 2016 and not getting airbags), and at mid-range prices G-Sync immediately breaks any kind of price/performance curve. It's unfortunate, but it is what it is. Maybe it'll even be what AMD needs to pull ahead a bit again.

Who am I kidding people will just buy nvidia anyway :v:

GRINDCORE MEGGIDO
Feb 28, 1985


Combat Pretzel posted:

Display off short-term doesn't change a thing. I have to yank the cable for it to go into the proper power mode, or power it off with the physical power switch.

Maybe it would if I left it in standby long enough to enter deep sleep, but I haven't tried that yet.

Thanks for trying that. Bugger. I'm already going to have to work around one 10x0 problem and run the Vive on HDMI :/

E- first-world problems, but I'd be pissed if a cheap AMD card had them, let alone a £630 one.

GRINDCORE MEGGIDO fucked around with this message at 16:28 on Jul 11, 2016

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

A lot of the cheaper gaming-oriented screens are FreeSync already, and more than once the best option I've seen has been a FreeSync screen, at which point, if you're in that range, it's great. You get a boost with AMD cards, and if for whatever reason you can't go with them, it remains a good screen in that price range. That's a lot of why I'm so happy with the $600 XR341CK refurb I got. It's got FreeSync to make running it from a 290 viable (which is a pretty strong statement on how much FreeSync can help at lower frame rates), and if I need an upgrade and have no AMD option, it's still going to be a 75Hz low-latency IPS ultrawide, and that's hardly a bad screen.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

Truga posted:

That TV actually sucks and is like a year old; I'm sure there are ones with way better sync ranges out there by now. I just linked the first Google hit for "freesync tv".

The UHD400 is a lot newer than the UHD420; it also has FreeSync, but is a 40" 4K Samsung PLS panel for $500-ish.

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!

wipeout posted:

Thanks for trying that. Bugger. I'm already going to have to work around one 10x0 problem and run the Vive on HDMI :/
Well, it depends on how the Vive handles the HDMI. If it disconnects from it entirely, the card may slow down properly. I've let the computer idle with one display turned off via the soft button, and it didn't seem to affect the clocks, so YMMV.

axeil
Feb 14, 2006
Do *-sync solutions really improve things at sub-60 fps? It's sort of hard to tell from all the reviews. I'm oscillating back and forth between upgrading my 380 2GB to an RX 480, and buying a 1080p 144Hz FreeSync monitor and running dual monitors, since the 380 can do FreeSync. I can't really figure out which is the better choice.

Boosted_C5
Feb 16, 2008
Probation
Can't post for 5 years!
Grimey Drawer
When can I expect aftermarket 1080s to be in stock for normal prices? This is getting absurd.

I FINALLY got an e-mail from NewEgg that the EVGA 1080 I wanted was in stock a few days ago. In the wee hours of the morning. Obviously they were out of stock by the time I saw the e-mail.

I have yet to see a single card available on Amazon, from Amazon, for the list price.

Is this typical for GPU launches? Launch them without adequate inventory?

After buying late and cheaper the last 2 generations, I want to buy the big dog card ASAP and enjoy it as long as possible. I've got a 144hz G-SYNC 1440p monitor and I want to play Hitman with everything on Ultra and absurdly high frame rates dammit!!!

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

Boosted_C5 posted:

When can I expect aftermarket 1080s to be in stock for normal prices? This is getting absurd.

I FINALLY got an e-mail from NewEgg that the EVGA 1080 I wanted was in stock a few days ago. In the wee hours of the morning. Obviously they were out of stock by the time I saw the e-mail.

I have yet to see a single card available on Amazon, from Amazon, for the list price.

Is this typical for GPU launches? Launch them without adequate inventory?

After buying late and cheaper the last 2 generations, I want to buy the big dog card ASAP and enjoy it as long as possible. I've got a 144hz G-SYNC 1440p monitor and I want to play Hitman with everything on Ultra and absurdly high frame rates dammit!!!

Use nowinstock

penus penus penus
Nov 9, 2014

by piss__donald

axeil posted:

Do *-sync solutions really improve things at sub-60 fps? It's sort of hard to tell from all the reviews. I'm oscillating back and forth between upgrading my 380 2GB to an RX 480, and buying a 1080p 144Hz FreeSync monitor and running dual monitors, since the 380 can do FreeSync. I can't really figure out which is the better choice.

My experience is pretty minimal, but yes: every time I used G-Sync it was with something the GPU could barely handle, rarely going above 60 fps. The difference is quite noticeable.

But I would probably get a 480 over running dual monitors on a 380, tbh. The best 33 fps is still just 33 fps, and sync can't fix plain old stuttering, of course. It would depend on what you're doing with it, though.

Boosted_C5 posted:

When can I expect aftermarket 1080s to be in stock for normal prices? This is getting absurd.

I FINALLY got an e-mail from NewEgg that the EVGA 1080 I wanted was in stock a few days ago. In the wee hours of the morning. Obviously they were out of stock by the time I saw the e-mail.

I have yet to see a single card available on Amazon, from Amazon, for the list price.

Is this typical for GPU launches? Launch them without adequate inventory?

After buying late and cheaper the last 2 generations, I want to buy the big dog card ASAP and enjoy it as long as possible. I've got a 144hz G-SYNC 1440p monitor and I want to play Hitman with everything on Ultra and absurdly high frame rates dammit!!!

It has been worse, lol (Fury), but this is easily the largest release with supply issues in recent memory. Unusual, though? No; any time anything new and cool releases, it sells out for at least a week or two. But with the 1080 we're solidly at five weeks, so that part of it is unusual.

penus penus penus fucked around with this message at 15:54 on Jul 11, 2016

penus penus penus
Nov 9, 2014

by piss__donald
*double post

NewFatMike
Jun 11, 2015

Re: monitor chat, I'm picking up a Wasabi Mango UHD 490 in the near future, probably after I get back from a business trip overseas.

It's a 49" 4K LG IPS panel with FreeSync. My biggest concern is that nobody mentions what the FreeSync range is, but we'll see! It caps at 4K60, so hopefully it'll be something like 42-60FPS, just to future-proof a bit. It's going to replace my projector as my media screen, though.

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

axeil posted:

Do *-sync solutions really improve things at sub-60 fps? It's sort of hard to tell from all the reviews. I'm oscillating back and forth between upgrading my 380 2GB to an RX 480, and buying a 1080p 144Hz FreeSync monitor and running dual monitors, since the 380 can do FreeSync. I can't really figure out which is the better choice.

My opinion: I have an XB270H (bought refurbed for $450) and a GTX 770 (1070 coming tomorrow). While G-Sync helps somewhat over 60 FPS, the best use for it is when frame rates start bouncing around between 35 and 60. Pre-G-Sync those bounces would be pretty noticeable; now it feels like an artificial increase in FPS because those dips don't feel nearly as jarring. A game like XCOM 2 (which is optimized like crap and really taxes my 2500K) immediately felt way better because the monitor just clocked itself down into the low 40s. If I play, like, CS:GO or something (FPS in the 80s), it still feels better, but not as dramatically as at lower FPS.

secret volcano lair
Oct 23, 2005

quote:

We regret to inform you that due to a system error the ordered merchandise is not in stock.

Shout out to B&H Photo for temporarily saving me from my poor financial decisions.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
DOOM has just gained Vulkan support:
https://bethesda.net/#en/events/game/doom-vulkan-support-now-live/2016/07/11/156

AMD's saying over 20% increase in performance on the 480 with Vulkan:
https://community.amd.com/community/gaming/blog/2016/07/11/radeon-graphics-takes-doom-to-the-next-level-with-vulkan-implementation

Finally, an actual game people would want to play has gained support for Vulkan. This should make for interesting comparisons.

Edit: I just loaded a saved game, stood on the spot staring in the direction it loaded, switched between the two APIs, and gained 10 FPS on my 290X (from 96 to 106). That's at 1920×1200 with every setting maxed (shadows on Ultra, of course; it won't let me set Nightmare), but I'm using SMAA (1TX) as opposed to the temporal AA everyone gushes about; I doubt it would make a performance difference. Obviously that's not a meaningful benchmark result, and someone else will do it better than I could, but it looks promising.

Double edit: I'm not even on the latest driver; that was on 16.6.1. Running a 2500K @ 4.6GHz, DDR3-2133.

HalloKitty fucked around with this message at 21:09 on Jul 11, 2016
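
For scale, the quick 290X test above works out as follows; the frame time saved per frame is arguably closer to what you feel than the FPS delta. Just arithmetic on the numbers quoted in the post:

code:
# The OpenGL -> Vulkan numbers from the post above, expressed both as a
# percentage gain and as the per-frame time saved.
def summarize(label, fps_before, fps_after):
    gain = (fps_after / fps_before - 1) * 100
    saved = 1000 / fps_before - 1000 / fps_after   # ms shaved off each frame
    print(f"{label}: {fps_before} -> {fps_after} fps "
          f"(+{gain:.1f}%, {saved:.2f} ms/frame saved)")

summarize("290X, OpenGL -> Vulkan", 96, 106)       # +10.4%, ~0.98 ms/frame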

Sininu
Jan 8, 2014

HalloKitty posted:

DOOM has just gained Vulkan support:
https://bethesda.net/#en/events/game/doom-vulkan-support-now-live/2016/07/11/156

AMD's saying over 20% increase in performance on the 480 with Vulkan:
https://community.amd.com/community/gaming/blog/2016/07/11/radeon-graphics-takes-doom-to-the-next-level-with-vulkan-implementation

Finally, an actual game people would want to play has gained support for Vulkan. This should make for interesting comparisons.

Vulkan is crazy good.

SinineSiil posted:

The game was unplayable before with OpenGL because of CPU spikes to ~40ms every couple of seconds, and the GPU graph looked like a sawtooth. Now the graphs are almost flat and performance is sublime!

That's on a GTX 970M, i7-4720HQ and 16GB RAM, all high settings. Previously it didn't even matter what settings I used; it would stutter like crazy even at 50% of 800x600 res with all settings low/off.

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
Can I assume that enabling DSR on NVIDIA cards excludes the composited desktop in Windows? --edit: Oh wait, it creates new resolution modes for games. Never mind then.

Combat Pretzel fucked around with this message at 16:45 on Jul 11, 2016

penus penus penus
Nov 9, 2014

by piss__donald
Tomb Raider released a patch with async compute

http://www.extremetech.com/gaming/231481-rise-of-the-tomb-raider-async-compute-update-improves-performance-on-amd-hardware-flat-on-maxwell

Although they only tested the fury vs 980ti so far it seems, gains are similar to other async compute things.



They do mention that the 480 probably won't see these gains due to what theyve seen with other benchmarks, does anybody know why this is? They reference their own AOTS which shows just a 3% increase between DX11 and DX12+async compute,

bull3964
Nov 18, 2000

DO YOU HEAR THAT? THAT'S THE SOUND OF ME PATTING MYSELF ON THE BACK.


Hopefully Vulkan makes it more stable. I don't decide when to quit playing Doom for the night; it decides for me.

bull3964 fucked around with this message at 16:51 on Jul 11, 2016

Anthony Chuzzlewit
Oct 26, 2008

good for healthy


Truga posted:

Actually, a guy who refuses to spend over $250 on a GPU will notice a huge increase in his game quality by paying $20 extra for a FreeSync model over a non-sync one, whereas the guy who spent $1000 on a Titan P won't really notice much, as his monitor is running at 144Hz constantly anyway.

Agreed; adding a G-Sync monitor to my 960 resulted in a huge quality improvement. I was able to increase quality settings in most games while everything still felt "smooth".

The G-Sync pricing does strike me as odd, since the people who would benefit the most are also the least likely to shell out the extra cash. I only did it because my old screen was getting too yellow.

Also, I know this isn't the monitor thread, but if you get a sync monitor, replace your decade-old mouse too. I play Overwatch on low settings (for maximum fps; with my old-man reflexes I need every advantage), and when I swapped my junky old Lenovo mouse for a new 1000Hz one, aiming got a LOT smoother.
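
The mouse point is easy to put numbers on: the polling interval is an input-staleness floor all by itself, and an old 125Hz mouse's 8ms interval is longer than an entire frame at 144Hz. The arithmetic:

code:
# Worst-case input staleness from mouse polling rate, vs. frame times.
for hz in (125, 500, 1000):
    print(f"{hz:>4} Hz poll -> up to {1000 / hz:.1f} ms stale input")
print(f"144 Hz frame = {1000 / 144:.1f} ms, 60 Hz frame = {1000 / 60:.1f} ms")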

repiv
Aug 13, 2009


lol i'm sure it's all due to async shaders and intrinsics, and not that your opengl implementation is hot garbage

THE DOG HOUSE posted:

They do mention that the 480 probably won't see these gains, based on what they've seen in other benchmarks; does anybody know why that is? They reference their own AotS testing, which shows just a 3% increase between DX11 and DX12 with async compute.

Probably because Hawaii/Tonga/Fiji had 8 async queues, but they dropped to 4 with Polaris.
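
As a toy model of why async compute helps at all, and why the gain shrinks when there is little idle time (or fewer queues) to fill: compute work hides in the graphics queue's idle bubbles instead of running serially after it. Invented numbers, purely illustrative and not how GCN's scheduler actually works:

code:
# Toy model of async compute: compute work hides in idle "bubbles" of the
# graphics queue instead of running serially after it. Invented numbers.
graphics_busy = 12.0   # ms/frame the graphics queue is actually working
graphics_idle = 3.0    # ms/frame of bubbles (stalls, sync points)
compute_work  = 4.0    # ms/frame of compute (e.g. post-processing)

serial = graphics_busy + graphics_idle + compute_work
hidden = min(compute_work, graphics_idle)            # overlapped into bubbles
overlapped = graphics_busy + graphics_idle + (compute_work - hidden)

print(f"serial : {serial:.1f} ms/frame ({1000 / serial:.0f} fps)")
print(f"async  : {overlapped:.1f} ms/frame ({1000 / overlapped:.0f} fps)")

If a GPU has fewer bubbles to fill, or fewer queues feeding them, there's less work to hide, which is one plausible reading of the small Polaris delta.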

Setzer Gabbiani
Oct 13, 2004

HalloKitty posted:

DOOM has just gained Vulkan support:
https://bethesda.net/#en/events/game/doom-vulkan-support-now-live/2016/07/11/156

AMD's saying over 20% increase in performance on the 480 with Vulkan:
https://community.amd.com/community/gaming/blog/2016/07/11/radeon-graphics-takes-doom-to-the-next-level-with-vulkan-implementation

Finally, an actual game people would want to play has gained support for Vulkan. This should make for interesting comparisons.

It's one hell of an upgrade. The Foundry - probably the most visually demanding mission - is where you want to be to see it in action. This is with both APIs running everything maxed, using Nightmare settings too:

http://imgur.com/50cxYXo

http://imgur.com/JE6cBBr

At least there's a next-gen alternative if DX12 suffers the same fate as DX10.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Setzer Gabbiani posted:

Nightmare settings too

Nightmare on a Fury? I guess there's some kind of hack to get around the 5GB VRAM limit in the menu.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Those benchmarks are outdated, since there was an enormous boost to FPS in an AMD driver released after the one used in that review (16.5.2.1 vs. 16.5.2 in the review):
https://www.youtube.com/watch?v=WvWaE-3Aseg

Hieronymous Alloy
Jan 30, 2009


Why! Why!! Why must you refuse to accept that Dr. Hieronymous Alloy's Genetically Enhanced Cream Corn Is Superior to the Leading Brand on the Market!?!

Morbid Hound

HalloKitty posted:

DOOM has just gained Vulkan support:
https://bethesda.net/#en/events/game/doom-vulkan-support-now-live/2016/07/11/156

AMD's saying over 20% increase in performance on the 480 with Vulkan:
https://community.amd.com/community/gaming/blog/2016/07/11/radeon-graphics-takes-doom-to-the-next-level-with-vulkan-implementation

Finally, an actual game people would want to play has gained support for Vulkan. This should make for interesting comparisons.

Edit: I just loaded a saved game, stood on the spot staring in the direction it loaded, switched between the two APIs, and gained 10 FPS on my 290X (from 96 to 106). That's at 1920×1200 with every setting maxed (shadows on Ultra, of course; it won't let me set Nightmare), but I'm using SMAA (1TX) as opposed to the temporal AA everyone gushes about; I doubt it would make a performance difference. Obviously that's not a meaningful benchmark result, and someone else will do it better than I could, but it looks promising.

Double edit: I'm not even on the latest driver; that was on 16.6.1.

Oh fuck yes, I'd been holding off on Doom waiting for this.


Hubis
May 18, 2003

Boy, I wish we had one of those doomsday machines...

bull3964 posted:

Hopefully Vulkan makes it more stable. I don't decide when to quit playing Doom for the night; it decides for me.

Just as a warning that you might want to examine your system for other issues: I have *never* had stability problems in Doom.

e: unless you mean you're on an AMD GPU? Because AMD OpenGL is a potential stability issue in itself, yeah.
