Rock Puncher
Jul 26, 2014

VelociBacon posted:

This is in response to my emailing them that this image implies that the card pictured has iCX2 cooling.

I think my next card is going to be an EVGA. A human wrote this, and the person who wrote it clearly knows what's going on and discloses where they don't have full information.


Thank you for posting.

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
People are speculating now that the RTX cards are intentionally overpriced to get rid of the Pascal stock that AIBs are being forced to eat.

Dr. Despair
Nov 4, 2009


39 perfect posts with each roll.

and yet it doesn't seem like there are any b-stock 1080s in evga's midweek madness, or any real slashed prices yet for new cards. Hopefully some good Labor Day sales pop up, I guess.

eames
May 9, 2009

Combat Pretzel posted:

People are speculating now that the RTX cards are intentionally overpriced to get rid of the Pascal stock that AIBs are being forced to eat.

Yes, GN mentioned that. I don't think Turing is bad; from what I can tell right now it just seems more overpriced at launch than previous generations were. The theory makes a lot of sense to me.

ufarn
May 30, 2009
It was also one of the major concerns leading up to the Turing announcement that this was going to be the outcome.

Aexo
May 16, 2007
Don't ask, I don't know how to pronounce my name either.
A few days ago I posted saying how Windows 7 wouldn't boot after I installed the motherboard's Intel video drivers.

After many hours of trial and error, including trying fresh installs of Windows 10 with no change in behavior, I came to the conclusion that it was either the video card or the power supply feeding the video card. This was because without the discrete GPU slotted (or slotted and not given additional power), everything worked fine.

I even broke out the multimeter and all the power looked good both on the video card and PSU and its connectors.

Continuing my furious Googling led me to a video (I'll edit the post when I'm back on desktop mode) pointing to a GPU ROM flash as a solution for an issue with similar symptoms. I somehow made it into Windows with it detecting my card but not crashing, and was able to flash the ROM on the GPU. Then I installed the drivers... and holy crap, I think that's exactly what I needed.

I thought for sure I had somehow blown something on the card, and I was pretty pissed that it took this long to figure out that it wasn't that, but hopefully it can serve as another troubleshooting step for future posters. I've never done a GPU ROM flash before and it was a little scary, as it came with a warning that you could brick your GPU, but the card already wasn't working so I just dove right in, and it seems to have fixed my issue.
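
For anyone who hits the same symptoms: the safe order of operations is to dump your existing vBIOS first, then flash the replacement. Here's a minimal Python sketch of that sequence wrapped around NVIDIA's nvflash tool; the exact flags and the ROM filenames are assumptions from memory, so check your nvflash version's help output before running anything.

import subprocess

def run(cmd):
    # Echo the command, then run it and raise if it fails.
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

# 1. Always dump the current vBIOS somewhere safe before touching anything.
run(["nvflash", "--save", "original_backup.rom"])

# 2. Flash the replacement ROM for the exact same card model.
#    Some nvflash builds want --protectoff first to unlock the EEPROM;
#    treat that flag as an assumption and verify it against --help.
run(["nvflash", "replacement.rom"])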

Harik
Sep 9, 2001

From the hard streets of Moscow
First dog to touch the stars


Plaster Town Cop
What makes no sense to me is they have containers full of 10xx that AIBs are trying to send back, but there were no cards on shelves because miners bought them all out.

What was the holdup in making more? Usually the bottleneck is the big-die silicon, and that clearly already existed. GDDR5(x) shortages?

Sacred Cow
Aug 13, 2007

Spaseman posted:

The EVGA ebay store has a 1070 Ti for $356.99 after promo code and I can't decide whether to buy it or wait for another 1080 sale. From what I see, the power jump between a 1070Ti and a 1080 is small enough that the Ti at this price is worth it. Am I right?

It might be too late but I compared my 1070ti against my friend's 1080 and found I could easily OC my card to match a base-clocked 1080. If you’re gaming at 1440p/60 it’s a great card.
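
Quick back-of-envelope on that claim, if anyone's curious - a Python snippet comparing cores x clock, using the Founders Edition specs as I remember them (treat the numbers as assumptions, and it ignores the 1080's GDDR5X bandwidth edge entirely):

specs = {
    "GTX 1070 Ti (boost)": (2432, 1683),  # CUDA cores, MHz (from memory)
    "GTX 1080 (base)":     (2560, 1607),
    "GTX 1080 (boost)":    (2560, 1733),
}

base_cores, base_mhz = specs["GTX 1080 (base)"]
for name, (cores, mhz) in specs.items():
    rel = (cores * mhz) / (base_cores * base_mhz)
    print(f"{name}: {rel:.0%} of a base-clocked 1080 (cores x clock)")

# The 1070 Ti only needs about 2560 * 1607 / 2432 ~= 1692 MHz on this crude
# metric to match a base-clocked 1080, which most cards hit with a mild OC.
print(f"1070 Ti clock needed: {base_cores * base_mhz / 2432:.0f} MHz")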

Krogort
Oct 27, 2013

Harik posted:

What makes no sense to me is they have containers full of 10xx that AIBs are trying to send back, but there were no cards on shelves because miners bought them all out.

What was the holdup in making more? Usually the bottleneck is the big-die silicon, and that clearly already existed. GDDR5(x) shortages?

Maybe they ramped up production to match the crypto demand. But I find that unlikely, considering that everybody knew that ASICs would eventually be developed for every big crypto and nvidia had the new generation on standby, ready for release when the demand slowed.
Or they stockpiled a bit to encourage scarcity and drive up prices, and it ended up biting them in the rear end.

Ragingsheep
Nov 7, 2009

Combat Pretzel posted:

Apparently AIBs aren't allowed to send you drivers with their review cards anymore, and to get the pre-release drivers from NVIDIA, you have to sign their lovely NDA. Well played I guess.

https://forums.guru3d.com/threads/nvidia-to-control-aibs-custom-rtx-2080-ti-reviews.422723/#post-5579179

Setzer Gabbiani
Oct 13, 2004

If the RTX cards perform either exactly the same or just somewhat better than the 1000s in DX11 - the API 90% of all current and future games use - then that's probably why there's a giant mass of stock that will go absolutely nowhere anytime soon, and probably why we're only seeing DX12 benchmarks and vague as gently caress DX11 speculation. Nvidia and co. aren't going to be sanctioning a firesale with fair prices anytime soon on stockpiles of cards that could potentially compete with a brand-new lineup, and since this stockpile probably only exists because of a bad bet on crypto, you've got some added spite in there too.

Most of the RTX cards are gonna find their way into an eternal Windows 7 installation at launch, and if the current DX12 situation continues, a lot of publishers/devs won't move either, because 7 is the protest OS like XP was in the Vista era, and that's still a sales risk no one's gonna take. Since nearly every DX12 game has come with a DX11 renderer the same way every DX10 game came with a DX9 fallback, the better deal is still going to be a discounted 1080/1080ti if you don't care about what's probably going to be rough first-gen raytracing, are one of the many who've been stuck on a 390X or 970 for what feels like years, or have correctly noticed that DX12 mostly just doesn't exist in the AAA space.

https://web.archive.org/web/20180617132107/https://en.wikipedia.org/wiki/List_of_games_with_DirectX_12_support

Risky Bisquick
Jan 18, 2008

PLEASE LET ME WRITE YOUR VICTIM IMPACT STATEMENT SO I CAN FURTHER DEMONSTRATE THE CALAMITY THAT IS OUR JUSTICE SYSTEM.



Buglord

Aexo posted:

A few days ago I posted saying how Windows 7 wouldn't boot after I installed the motherboard's Intel video drivers.

After many hours of trial and error, including trying fresh installs of Windows 10 with no change in behavior, I came to the conclusion that it was either the video card or the power supply feeding the video card. This was because without the discrete GPU slotted (or slotted and not given additional power), everything worked fine.

I even broke out the multimeter and all the power looked good both on the video card and PSU and its connectors.

Continuing my furious Googling led me to a video (I'll edit the post when I'm back on desktop mode) pointing to a GPU ROM flash as a solution for an issue with similar symptoms. I somehow made it into Windows with it detecting my card but not crashing, and was able to flash the ROM on the GPU. Then I installed the drivers... and holy crap, I think that's exactly what I needed.

I thought for sure I had somehow blown something on the card, and I was pretty pissed that it took this long to figure out that it wasn't that, but hopefully it can serve as another troubleshooting step for future posters. I've never done a GPU ROM flash before and it was a little scary, as it came with a warning that you could brick your GPU, but the card already wasn't working so I just dove right in, and it seems to have fixed my issue.

Most of the time you can just reflash with a different BIOS from DOS after you short some pins on the card, just an FYI.

wargames
Mar 16, 2008

official yospos cat censor

Setzer Gabbiani posted:

If the RTX cards perform either exactly the same or just somewhat better than the 1000s in DX11 - the API 90% of all current and future games use - then that's probably why there's a giant mass of stock that will go absolutely nowhere anytime soon, and probably why we're only seeing DX12 benchmarks and vague as gently caress DX11 speculation. Nvidia and co. aren't going to be sanctioning a firesale with fair prices anytime soon on stockpiles of cards that could potentially compete with a brand-new lineup, and since this stockpile probably only exists because of a bad bet on crypto, you've got some added spite in there too.

Most of the RTX cards are gonna find their way into an eternal Windows 7 installation at launch, and if the current DX12 situation continues, a lot of publishers/devs won't move either, because 7 is the protest OS like XP was in the Vista era, and that's still a sales risk no one's gonna take. Since nearly every DX12 game has come with a DX11 renderer the same way every DX10 game came with a DX9 fallback, the better deal is still going to be a discounted 1080/1080ti if you don't care about what's probably going to be rough first-gen raytracing, are one of the many who've been stuck on a 390X or 970 for what feels like years, or have correctly noticed that DX12 mostly just doesn't exist in the AAA space.

https://web.archive.org/web/20180617132107/https://en.wikipedia.org/wiki/List_of_games_with_DirectX_12_support

Does vulkan work on win7?

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

wargames posted:

Does vulkan work on win7?

Yes.

repiv
Aug 13, 2009

wargames posted:

Does vulkan work on win7?

yes

devs don't seem to care though, besides id software most are opting for dx12 over vulkan

Anime Schoolgirl
Nov 28, 2002

also mostly because a lot of what's on dx12 is done the same way on PS4/bone dev kits

kind of amusing that a lot of the cool performance-saving tricks just largely don't happen on PC for a long time due to adoption being more staggered, rather than an eventual wide switchover between console generations

you can see this most egregiously on ff15, with its comically poo poo performance on PCs compared to consoles of similar spec

Worf
Sep 12, 2017

If only Seth would love me like I love him!

9900k works with z370 you say eh


Well that's the worst piece of good news I've internalized lately

punk rebel ecks
Dec 11, 2010

A shitty post? This calls for a dance of deduction.

Statutory Ape posted:

9900k works with z370 you say eh


Well that's the worst piece of good news I've internalized lately

?

ufarn
May 30, 2009
Speaking of Vulkan, is Mantle still a thing?

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

ufarn posted:

Speaking of Vulkan, is Mantle still a thing?

Mantle, with some API tweaks, more or less became Vulkan.

https://www.pcworld.com/article/2894036/mantle-is-a-vulkan-amds-dead-graphics-api-rises-from-the-ashes-as-opengls-successor.html

GRINDCORE MEGGIDO
Feb 28, 1985



They mean they're going to spend a fair bit of money probably.

repiv
Aug 13, 2009

The NDA on the Battlefield V RTX preview just dropped and a bunch of videos are going out:

https://www.youtube.com/watch?v=8kQ3l6wN6ns

Key points:
>60fps at 1080p or 40-50fps at 1440p on a 2080ti in the current build :geno:
DICE isn't using the tensor cores yet, their denoising is done with a regular shader
Reflections are currently traced at full resolution, a half-res option would be much faster

repiv fucked around with this message at 18:11 on Aug 29, 2018

Arcturas
Mar 30, 2011

I’m super uninformed about graphics cards and want some complete speculation about the future to help me make a purchase-now-or-wait decision. I have a rig with an i5-8600k and a 1050ti. I would like to be able to try VR (don’t get me started on buying a rift vs waiting for the next gen on that side). What’s the current thinking on upgrading now vs waiting for the next generation of nVidia cards that have been announced to be released? Price is negotiable but I would like to avoid spending more than $400 on the card.

I’m not particularly interested in bleeding edge performance because my monitors are a 1920x1200 Dell and a 1920x1080 Acer, and most of what I play is lower-performance stuff like Hollow Knight, League, Slay the Spire, CK 2, EU4, civ, Witcher, etc. I am super lovely at FPS’s and don’t care about them. But I have been getting into Elite recently and playing that (particularly in VR at my brother’s place) has been great. Plus Beat Saber and the various VR games look awesome.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

repiv posted:

The NDA on the Battlefield V RTX preview just dropped and a bunch of videos are going out:

https://www.youtube.com/watch?v=8kQ3l6wN6ns

Key points:
>60fps at 1080p or 40-50fps at 1440p on a 2080ti in the current build :geno:
DICE isn't using the tensor cores yet, their denoising is done with a regular shader
Reflections are currently traced at full resolution, a half-res option would be much faster

Competitive online gaming seems like the wrong place to be showcasing performance-impairing raytracing poo poo, given that many people turn down their settings to N64 levels in those games, but I guess DICE is probably the hottest engine dev team around at this point.

At a minimum they need to get that half-res and tensor cores going. I think they need at least twice what they're getting right now for it to really be viable. 2080 Ti really needs to be pushing at least 60 fps at 4K and close to 100 fps at 1440p before it's even remotely an option in a competitive shooter.

Rexxed
May 1, 2010

Dis is amazing!
I gotta try dis!

Arcturas posted:

I’m super uninformed about graphics cards and want some complete speculation about the future to help me make a purchase-now-or-wait decision. I have a rig with an i5-8600k and a 1050ti. I would like to be able to try VR (don’t get me started on buying a rift vs waiting for the next gen on that side). What’s the current thinking on upgrading now vs waiting for the next generation of nVidia cards that have been announced to be released? Price is negotiable but I would like to avoid spending more than $400 on the card.

I’m not particularly interested in bleeding edge performance because my monitors are a 1920x1200 Dell and a 1920x1080 Acer, and most of what I play is lower-performance stuff like Hollow Knight, League, Slay the Spire, CK 2, EU4, civ, Witcher, etc. I am super lovely at FPS’s and don’t care about them. But I have been getting into Elite recently and playing that (particularly in VR at my brother’s place) has been great. Plus Beat Saber and the various VR games look awesome.

Benchmarks for the new cards are supposed to be available on the 14th. Any advice given before that is speculation unless it's advice like "if you have to buy a card now get an EVGA so you can use their step-up program if the new cards are good."

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

repiv posted:

The NDA on the Battlefield V RTX preview just dropped and a bunch of videos are going out:

https://www.youtube.com/watch?v=8kQ3l6wN6ns

Key points:
>60fps at 1080p or 40-50fps at 1440p on a 2080ti in the current build :geno:
DICE isn't using the tensor cores yet, their denoising is done with a regular shader
Reflections are currently traced at full resolution, a half-res option would be much faster

Interesting. From what the DICE team are saying, it looks like they could get a pretty huge performance gain from doing a number of things that were not possible on the Volta hardware they had been using before they got their hands on Turing. Just one tweak they want to make should boost ray tracing performance by 30% all by itself. They had also only had the hardware for two weeks at the time they talked to the press for that video, so they have a lot to learn and improve on. I really don't think the early performance we have seen in demos will be reflective of performance in games once they are released.

punk rebel ecks
Dec 11, 2010

A shitty post? This calls for a dance of deduction.
Why do people turn down graphics so much for competitive games? I used to be into competitive fighting games and there 60fps was the golden rule.

GRINDCORE MEGGIDO
Feb 28, 1985


punk rebel ecks posted:

Why do people turn down graphics so much for competitive games? I used to be into competitive fighting games and there 60fps was the golden rule.

They want the lowest latency they can get = highest framerate.
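
A quick frame-time sketch of why that equation holds (Python, purely illustrative - render time is only one slice of the full input-lag chain, alongside input polling, game tick and the display):

# Convert framerates to per-frame time in milliseconds.
for fps in (30, 60, 144, 240):
    frame_ms = 1000.0 / fps
    print(f"{fps:>3} fps -> {frame_ms:5.1f} ms per frame")

# Going from 60 to 144 fps shaves roughly 16.7 - 6.9 ~= 9.7 ms off every frame.
# That's small next to human reaction time, but it's added on top of every
# single reaction, which is the question asked a couple of posts down.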

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

Arcturas posted:

I’m super uninformed about graphics cards and want some complete speculation about the future to help me make a purchase-now-or-wait decision. I have a rig with an i5-8600k and a 1050ti. I would like to be able to try VR (don’t get me started on buying a rift vs waiting for the next gen on that side). What’s the current thinking on upgrading now vs waiting for the next generation of nVidia cards that have been announced to be released? Price is negotiable but I would like to avoid spending more than $400 on the card.

I’m not particularly interested in bleeding edge performance because my monitors are a 1920x1200 Dell and a 1920x1080 Acer, and most of what I play is lower-performance stuff like Hollow Knight, League, Slay the Spire, CK 2, EU4, civ, Witcher, etc. I am super lovely at FPS’s and don’t care about them. But I have been getting into Elite recently and playing that, (particularly on VR at my brothers place) has been great. Plus Beat Saber and the various VR games look awesome.

You won't be getting any of the new cards for $400, I'd expect the cheapest ones to be 2070s at ~$550 for quite a while after release. It sounds like for your usage a GTX 1070, 1070 Ti or 1080 would be more than enough to last you for years, or at least until better VR becomes available and needs faster video cards to go with it. I just bought a 1080 for $365 so if you look around for deals you should be able to find one of the cards I mentioned within your budget some time in the next few weeks.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

punk rebel ecks posted:

Why do people turn down graphics so much for competitive games? I used to be into competitive fighting games and there 60fps was the golden rule.

Attempting to get minimum fps as high as possible, because the minimums are going to occur at the worst possible times (when the most things are happening ingame/on-screen). Also because turning down the settings often removes visual clutter like foliage/etc that could make it more difficult to spot enemies. Can't hide in the grass when LOD makes foliage disappear past 100 feet! /pubg player taps forehead

punk rebel ecks
Dec 11, 2010

A shitty post? This calls for a dance of deduction.
[quote="GRINDCORE MEGGIDO post="487478846"]
They want the lowest latency they can get = highessot framerate.
[/quote]

Does that make any notable difference past a certain FPS? I know that input lag matters but at some point your reaction time will be the limiting factor.

[quote="Paul MaudDib" post=""4874788"] Also because turning down the settings often removes visual clutter like foliage/etc that could make it more difficult to spot enemies. Can't hide in the grass when LOD makes foliage disappear past 100 feet! /pubg player taps forehead
[/quote]

This sounds like it could ruin the experience.

Arcturas
Mar 30, 2011

AVeryLargeRadish posted:

You won't be getting any of the new cards for $400, I'd expect the cheapest ones to be 2070s at ~$550 for quite a while after release. It sounds like for your usage a GTX 1070, 1070 Ti or 1080 would be more than enough to last you for years, or at least until better VR becomes available and needs faster video cards to go with it. I just bought a 1080 for $365 so if you look around for deals you should be able to find one of the cards I mentioned within your budget some time in the next few weeks.

Thanks. I guess then the next questions are: will the prices on current gen cards drop much after next gen launches? From the past few pages of posts it sounds like we don’t know but probably not? And how do I tell the difference between the eight thousand varieties of 1080s out there on pcpartpicker and amazon?

Risky Bisquick
Jan 18, 2008

PLEASE LET ME WRITE YOUR VICTIM IMPACT STATEMENT SO I CAN FURTHER DEMONSTRATE THE CALAMITY THAT IS OUR JUSTICE SYSTEM.



Buglord

punk rebel ecks posted:

Does that make any notable difference past a certain FPS? I know that input lag matters but at some point your reaction time will be the limiting factor.


This sounds like it could ruin the experience.

You can only react from the moment you can make the visual determination to act, so getting that information sooner still helps.

https://www.youtube.com/watch?v=pUvx81C4bgs&t=22s

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

punk rebel ecks posted:

This sounds like it could ruin the experience.

"the experience" is different things for different people. A lot of people don't play games like Battlefield or Overwatch or TF2 (either of them) for the fancy graphics, they play it for the combat, and graphics can even get in the way of that combat.

Team Fortress 2, in particular, is notorious for people turning down the graphics to actual N64 levels even though this produces almost no appreciable performance gain over a more reasonable low-settings configuration.

Llamadeus
Dec 20, 2005
Quake 3 must be the earliest example of this; it had a console command to turn texture resolution down to basically nothing... which became standard in competitive play.

Rexxed
May 1, 2010

Dis is amazing!
I gotta try dis!

Llamadeus posted:

Quake 3 must be the earliest example of this; it had a console command to turn texture resolution down to basically nothing... which became standard in competitive play.

Quake had a setting to turn textures to basically a solid smear of color. I knew a guy who was really competitive and used this back in '96 or something. I forget the exact console command but I think it was some mipmap number. It looked awful and quake wasn't terribly pretty to begin with.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
Yeah mipmapping is the one that turns TF2 down to potato levels too. It doesn't even affect performance much at all, texturing is pretty much "free" on anything released in the last 10 years.

Same engine really, Source is a divergent evolution of Quake, just like Quake III. There is probably still code written in 1993 being used in Source 2 today.
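
Rough numbers on the mip thing, since it comes up a lot - a little Python illustration (the 1024x1024 source texture is just an assumed example):

base = 1024
for level in range(5):
    size = base >> level  # each mip level halves both dimensions
    print(f"mip level {level}: {size:4d} x {size:<4d} = {size * size:>9,} texels")

# Level 4 is 64x64 -- 1/256th of the original texel data -- which is why
# cranking the mip bias was the classic potato-mode trick, even though modern
# GPUs sample the full-res textures basically for free anyway.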

Paul MaudDib fucked around with this message at 19:19 on Aug 29, 2018

Llamadeus
Dec 20, 2005
Also standard in competitive play: forcing other players to render with the biggest, most visible model, coloured bright green.

dad on the rag
Apr 25, 2010

Paul MaudDib posted:

Yeah mipmapping is the one that turns TF2 down to potato levels too. It doesn't even affect performance much at all, texturing is pretty much "free" on anything released in the last 10 years.

I used to use a low res texture pack for the original Counter-Strike and it added around 10 fps and made it easier to spot people. Enemy models would just stand out so much from the plain background. I believe most competitive shooters/leagues don't allow this now.

E: ^yeah exactly like that

repiv
Aug 13, 2009

Paul MaudDib posted:

Competitive online gaming seems like the wrong place to be showcasing performance-impairing raytracing poo poo, given that many people turn down their settings to N64 levels in those games, but I guess DICE is probably the hottest engine dev team around at this point.

At a minimum they need to get that half-res and tensor cores going. I think they need at least twice what they're getting right now for it to really be viable. 2080 Ti really needs to be pushing at least 60 fps at 4K and close to 100 fps at 1440p before it's even remotely an option in a competitive shooter.

Well EA uses Frostbite for everything now, so whatever DICE comes up with will inevitably end up in non-competitive games. BFV just happens to be their next big release.

Half-res is definitely the way to go; it's already standard practice to run screen-space reflections at half-res. Frostbite's SSR only traces one ray per 2x2 pixel quad and still manages to filter it into a pretty good result by SSR standards.
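
The resolution math on that is pretty stark - a quick Python sketch (just pixel counts at 1440p, not measured Frostbite numbers):

width, height = 2560, 1440

full_res = width * height                # one reflection ray per pixel
half_res = (width // 2) * (height // 2)  # one ray per 2x2 pixel quad

print(f"full-res rays/frame: {full_res:,}")
print(f"half-res rays/frame: {half_res:,} ({half_res / full_res:.0%} of full)")

# ~3.7M vs ~0.9M primary reflection rays per frame: a 4x cut in ray count
# before denoising or reconstruction quality even enters the picture.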
