SwissArmyDruid
Feb 14, 2014

by sebmojo
Videocardz: Intel (Xe) DG1 spotted with 96 Execution Units

Anandtech: Analyzing Intel’s Discrete Xe-HPC Graphics Disclosure: Ponte Vecchio, Rambo Cache, and Gelato


Khagan
Aug 8, 2012

Words cannot describe just how terrible Vietnamese are.

apropos man posted:

I'm on the lookout for a 1660 Super or a Ti after Christmas is over. I take it the MSI cards aren't bandwidth limited?

This is me too. I'm waiting for AMD to release the RX 5600 series to see if Nvidia will lower prices.

ufarn
May 30, 2009
Anyone ever experienced some kind of weird GPU noise that sounds like something between coil whine and an obstructed fan when they run certain tasks like (specific) local videos and certain things in certain games?

Noticed it a week ago on my old 1070, and it's pretty weird. Doesn't sound like the coil whine I got when I tried overclocking the card a while back.

E: I guess it's possible it's something from the new CPU fan, but that would make it even weirder. Especially since it happens instantaneously and not with a ramp-up.

E2: Seems to be coming from the GPU.

ufarn fucked around with this message at 00:33 on Dec 27, 2019

Arzachel
May 12, 2012

ufarn posted:

Anyone ever experienced some kind of weird GPU noise that sounds like something between coil whine and an obstructed fan when they run certain tasks like (specific) local videos and certain things in certain games?

Noticed it a week ago on my old 1070, and it's pretty weird. Doesn't sound like the coil whine I got when I tried overclocking the card a while back.

E: I guess it's possible it's something from the new CPU fan, but that would make it even weirder. Especially since it happens instantaneously and not with a ramp-up.

E2: Seems to be coming from the GPU.

Might still be coil whine, my card makes something closer to a buzzing noise with the pitch changing depending on the load. It happening when watching video is kind of weird though.

apropos man
Sep 5, 2016

You get a hundred and forty one thousand years and you're out in eight!

ufarn posted:

Anyone ever experienced some kind of weird GPU noise that sounds like something between coil whine and an obstructed fan when they run certain tasks like (specific) local videos and certain things in certain games?

Noticed it a week ago on my old 1070, and it's pretty weird. Doesn't sound like the coil whine I got when I tried overclocking the card a while back.

E: I guess it's possible it's something from the new CPU fan, but that would make it even weirder. Especially since it happens instantaneously and not with a ramp-up.

E2: Seems to be coming from the GPU.

I had a sort of clicking noise on a card a couple of years ago. More like an electrical clicking than a physical clicking noise. It would happen sporadically but much more often when I had v-sync on.

repiv
Aug 13, 2009

Does the card have a 0db mode? Might be the fan spinning up if so.

I had to override the 0db mode on my EVGA because the fans make an obnoxious grinding noise when they spin up, and it was more annoying than just letting them idle at a very low rpm.
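The "idle at a very low rpm instead of 0" approach above amounts to giving the fan curve a nonzero floor. A minimal sketch of that idea in Python, with made-up illustrative temperature/duty points (not values from any real card's BIOS):

```python
# Hypothetical fan curve with a nonzero floor: the fans idle at 30% duty
# instead of 0 rpm, so they never have to spin up from a dead stop (and
# grind) when load arrives. Points are (temperature C, duty %).
CURVE = [(0, 30), (50, 30), (65, 50), (80, 80), (90, 100)]

def fan_duty(temp_c: float) -> float:
    """Linearly interpolate fan duty from the curve; clamp at both ends."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]
```

The flat segment from 0-50 C is the floor: anywhere below 50 C the fans hold a quiet constant speed rather than stopping, which trades a little idle noise for no spin-up transients.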

Sheep
Jul 24, 2003

ufarn posted:

Anyone ever experienced some kind of weird GPU noise that sounds like something between coil whine and an obstructed fan when they run certain tasks like (specific) local videos and certain things in certain games?

Noticed it a week ago on my old 1070, and it's pretty weird. Doesn't sound like the coil whine I got when I tried overclocking the card a while back.

E: I guess it's possible it's something from the new CPU fan, but that would make it even weirder. Especially since it happens instantaneously and not with a ramp-up.

E2: Seems to be coming from the GPU.

I had a Dell laptop with integrated graphics that did that. It was most definitely the GPU and not a fan. Only occurred during OpenGL though.

I think I've got a machine at work with an ATI card that does it too. Any time the screen is being actively drawn on, the GPU emits a weird sound that I can best describe as reminiscent of the old modem dialup handshake, but really, really quiet.

It's a pretty rare thing I've only encountered a handful of times in thirty years of messing with computers, but at any rate you're not crazy and it isn't just you.

Sheep fucked around with this message at 05:10 on Dec 27, 2019

ufarn
May 30, 2009
It's a Palit GameRock with 0 RPM functionality.

The video probably spins the card up because of my madVR settings or something, even though it's mainly WebM files, and it doesn't seem to happen with MKV. It's just weird that it happens so instantaneously. There's also a menu in Resident Evil 2 that makes it happen.

Assuming fan hysteresis is an issue, what minimum should I set the fan speeds to?

It's a used card I bought when the disappointing Turing line-up was announced, and it's run great so far. I don't *think* it's degradation over time but more likely some weird, specific strain that brings out the coil whine.

repiv
Aug 13, 2009

You can use GPU-Z to monitor the fan speed and see if the noise coincides with the fan speed going from 0% to >0%.
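The same check can be scripted instead of eyeballed in GPU-Z. A sketch: the spin-up detection is plain Python, while the polling helper assumes the third-party `pynvml` package (and an Nvidia driver) is available:

```python
def spinup_events(samples):
    """Return the indices where fan speed goes from 0% duty to spinning."""
    return [i for i in range(1, len(samples))
            if samples[i - 1] == 0 and samples[i] > 0]

def poll_fan_speeds(seconds=60):
    """Sample the GPU fan once per second via NVML.

    Assumes the third-party pynvml package (pip install nvidia-ml-py)
    and an Nvidia driver are installed.
    """
    import time
    import pynvml
    pynvml.nvmlInit()
    gpu = pynvml.nvmlDeviceGetHandleByIndex(0)
    samples = []
    for _ in range(seconds):
        samples.append(pynvml.nvmlDeviceGetFanSpeed(gpu))  # duty in %
        time.sleep(1)
    return samples
```

Run `poll_fan_speeds()` while reproducing the noise; if `spinup_events()` on the result lines up with when you hear it, it's the fans leaving 0 rpm rather than coil whine.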

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.
Also look at thermals and memory utilization, I bet if you looked at what was going on across your system the cause would jump out.

GRINDCORE MEGGIDO
Feb 28, 1985


ufarn posted:

It's a Palit GameRock with 0 RPM functionality.

The video probably spins up because of my madVR settings or something, even though it's mainly WebM files, and doesn't seem to happen with MKV. It's just weird that it happens so instantaneously. On top of a menu in Resident Evil 2 that also makes it happen.

Assuming fan hysteresis is an issue, what minimum should I set the fan speeds to?

It's a used card I bought when the disappointing Turing line-up was announced, and it's run great so far. I don't *think* it's degradation over time but more likely some weird, specific strain that brings out the coil whine.

Hmm. I haven't tried resident evil but it sounds like coil whine (I have a gamerock 1080, that luckily doesn't do it).

ufarn
May 30, 2009
First two quarters are baseline, the third quarter is the MP4 file, and the fourth is the WebM. The easiest way to spot it is in GPU Clock and Power Consumption.

I noticed some faint whine with parts of the MP4, so I think it's tied to some sort of load rather than some weird edge case.

I have to assume there's some degradation, since I don't recall noticing it before. It seems weird that I'd just start hearing it.

ufarn fucked around with this message at 01:16 on Dec 28, 2019

NickBlasta
May 16, 2003

Clearly their proficiency at shooting is supernatural, not practical, in origin.
I've always had the loudest coil whine with anything that displays at extremely high framerates, so loading screens, menus, and 3d accelerated videos.

ufarn
May 30, 2009
Ah that's a good point, uncapped frame rate in a certain menu could be the reason.

I remember when older games caused a lot of issues because of uncapped menus. WarCraft 3 or StarCraft 2 did iirc.
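The uncapped-menu effect is easy to reason about: with nothing limiting the loop, a trivial scene renders thousands of frames per second and hammers the VRMs. A sketch of a basic frame limiter (Python for illustration; `render()` is a stand-in for real draw calls):

```python
import time

def run_frames(n, cap_fps=None, render=lambda: None):
    """Render n frames, optionally capped to cap_fps; return elapsed seconds.

    The limiter sleeps out whatever remains of each frame's time budget
    after rendering, which is roughly what an in-game fps cap does.
    """
    budget = 1.0 / cap_fps if cap_fps else 0.0
    start = time.perf_counter()
    for _ in range(n):
        frame_start = time.perf_counter()
        render()
        if cap_fps:
            remaining = budget - (time.perf_counter() - frame_start)
            if remaining > 0:
                time.sleep(remaining)
    return time.perf_counter() - start
```

With a trivial `render()`, the uncapped loop finishes almost instantly (thousands of fps worth of load), while capping at 100 fps stretches 10 frames out to roughly 0.1 s, which is why menus with a cap don't whine.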

sauer kraut
Oct 2, 2004
The low video engine load is suspicious. What codec exactly is the webm file encoded in?
If it's VP9, the software you're using might not use the Nvidia hardware decoder correctly and use some hosed up hybrid concoction.
In case it's 10bit HDR VP9 that's a whole other can of worms.

sauer kraut fucked around with this message at 02:13 on Dec 28, 2019
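One way to answer the "what codec exactly" question is to ask ffprobe (part of FFmpeg, which must be installed separately) for the codec name and pixel format; 10-bit VP9 shows up as `codec_name` "vp9" with a `pix_fmt` like "yuv420p10le", which is the can-of-worms case for hardware decode. A sketch:

```python
import json
import subprocess

def describe_stream(ffprobe_json: str):
    """Pull (codec_name, pix_fmt) out of ffprobe's JSON for stream 0."""
    stream = json.loads(ffprobe_json)["streams"][0]
    return stream.get("codec_name"), stream.get("pix_fmt")

def probe_video(path: str):
    """Probe the first video stream of a file (assumes ffprobe on PATH)."""
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=codec_name,pix_fmt",
         "-of", "json", path],
        capture_output=True, text=True, check=True).stdout
    return describe_stream(out)
```

`probe_video("clip.webm")` returning `("vp9", "yuv420p10le")` would point at the hybrid/software decode path; `("vp8", "yuv420p")` or 8-bit VP9 should decode fully in hardware on a 1070.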

Lungboy
Aug 23, 2002

NEED SQUAT FORM HELP
A Redditor found an Asrock listing for the 5600XT specs.

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."
Looks good, can't wait to see how they gently caress this up with the price

mom and dad fight a lot
Sep 21, 2006

If you count them all, this sentence has exactly seventy-two characters.

Happy_Misanthrope posted:

Looks good, can't wait to see how they gently caress this up with the price

I was admittedly hyped with what the RX 5500 XT was promising. Oh how naive I was.

SwissArmyDruid
Feb 14, 2014

by sebmojo
Shrug, the 5600XT is gonna be what I hoped the 5500XT was going to be.

Maybe I'm just spoiled on what Nvidia X50 and X50ti cards are historically capable of within that power/price band?

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

SwissArmyDruid posted:

Shrug, the 5600XT is gonna be what I hoped the 5500XT was going to be.

Maybe I'm just spoiled on what Nvidia X50 and X50ti cards are historically capable of within that power/price band?

Well, the 1660 Super is around ~20% faster than, say, a Radeon RX 580 (along with 2 GB less memory). So that's a 20% uptick in the same price bracket after close to 3 years (more if you count the 480). Bitcoin really hosed everything up for a while, but this price segment hasn't exactly been on fire lately.

AMD will likely price it at 1660 Ti level or better, but the problem is the Ti serves little purpose with the $50 cheaper Super being basically identical in performance. It might be closer to the 2060 6 GB, but if so I would expect AMD to price it even higher.

Wistful of Dollars
Aug 25, 2009

What's the over/under on AMD coming up with something that actually competes with the future 3080?

Zarin
Nov 11, 2008

I SEE YOU

Wistful of Dollars posted:

What's the over/under on AMD coming up with something that actually competes with the future 3080?

In my completely uneducated opinion, lol

VelociBacon
Dec 8, 2009

Maybe in the year 3080 :smug:

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"
Pretty sure AMD would be elated putting out a part that favorably competes with the 2080 non-Ti, since the Radeon 7 wasn't it.

Malcolm XML
Aug 8, 2009

I always knew it would end like this.

Wistful of Dollars posted:

What's the over/under on AMD coming up with something that actually competes with the future 3080?

AMD can't beat Turing even with an entire full-node lead, so: lmao

Cavauro
Jan 9, 2008

Wistful of Dollars posted:

What's the over/under on AMD coming up with something that actually competes with the future 3080?

3.5

CrazyLoon
Aug 10, 2015

"..."
In the coming year? lolno. Best I'm willing to hope for is that they provide a solid alternative to a 3070.

In 2021 tho, I don't frikkin' know. Folks who talked to their GPU division got the impression that they're trying to do a gradual creep over the years with Navi, like they did with Ryzen in the CPU space, but it's easy enough to say that and another thing entirely to actually do it, when most of AMD's funding will likely be going into Ryzen to capitalize on this year's success. So I'd still temper my expectations, TBH.

B-Mac
Apr 21, 2003
I'll never catch "the gay"!
As good as my 5700 XT has been I don’t see AMD competing with nvidia high end parts anytime soon, especially the next series. 7nm nvidia is going to be good unless they start sandbagging it because they don’t have any high end competition.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
Zen wasn't a gradual creep, it was an acknowledgement that they were producing absolute trash and then five years of work to design a brand-new product that could compete with Intel at the low end, followed by a few years of refinement and a process jump to creep upward from there. I would estimate that it will take a similar 5 year span for AMD to completely rebuild their GPU division from the ground up to have any hope of ever competing against Nvidia.

The good news is that AMD is making tons of money right now, so they can afford to fund such a venture. The bad news is that Nvidia hasn't made a massive blunder like Intel's 10nm bet that leads to them doing nothing for 3+ years, and they aren't likely to.

The thing AMD should be doing, but probably isn't because their GPU division seems to always be run by idiots, is working to secure software advantages. Getting developers to port over a lot of the performance-enhancing features from console versions of games could give AMD a big leg up, with the ability to trade minor quality impacts for huge performance gains. It'd already be implemented in a way that favors AMD, and it'd reduce some of the drive to buy $1000+ GPUs in the first place, if you could get 95% of the image quality and 80% of the framerate from a $400-500 GPU.

K8.0 fucked around with this message at 04:24 on Dec 30, 2019

Fantastic Foreskin
Jan 6, 2013

A golden helix streaked skyward from the Helvault. A thunderous explosion shattered the silver monolith and Avacyn emerged, free from her prison at last.

The drive to buy $1000 GPUs (in the consumer space) is basically non-existent anyways. It's the $200-300 bracket that gets the majority of sales (where they're not competitive either, but maybe the 5600 can fix that?). Top-end parts are great advertising, but that's not where the economic battle is fought.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
People not buying them doesn't mean they don't want to. The drive is there because games are murdering GPUs right now. You have to spend more than ever to get decent performance in most games. For many years, $300-330 bought a GPU that would run almost every game at or close to max detail at the max res/refresh your average gamer's monitor supported. Now, the 2080 and Ti are the only products that can kinda manage that, and everything below the $350ish 5700s is just a "limp by playing old games" product.

The fact that Nvidia is the only company selling GPUs that can even come close to max detail gives them a new kind of mindshare dominance: AMD is just not relevant in people's minds. Many would rather buy an Nvidia card because at least they're buying into the winning team. That's why bringing performance up across the board would benefit AMD. If you could get close to max settings and ~100 FPS at 1440p on a 5700/XT, people would care a lot less that Nvidia dominates the high end of the market, and AMD could actually compete. With the new consoles coming out and doubtless making the GPU performance situation even worse, things are ripe for AMD to exploit, but I am almost certain they won't try to do so.

K8.0 fucked around with this message at 04:59 on Dec 30, 2019

Craptacular!
Jul 9, 2001

Fuck the DH

K8.0 posted:

The drive is there because games are murdering GPUs right now.

Nah, Nvidia has had poorly optimized DX12 for a while. In the days of Pascal vs Polaris, the general understanding was, "AMD is doing well on DX12, but nobody wants to use that because Microsoft is leveraging it to promote UWP adoption." Now that Microsoft has switched to using Game Pass to promote UWP/Metro and the Windows Store, regular rear end .exe games are going from DX11 to DX12 with a Vulkan fallback for Windows 7, Linux, Stadia, etc, and the old "owns bones at DX11" strength isn't as important as it was. Nvidia needs to optimize poo poo.

EDIT: Also, truly max settings are only for screenshots. One to two settings below whatever the most tricked-out profile is called is gonna be what people should (if not necessarily will) target. 100 FPS at 1440p is about the same amount of work as 60 fps at 4K, and since that's really the dream for most people for the next Assassin's Creed and the next Sekiro/Souls and the next Tomb Raider etc, whoever builds that card without being substantially more expensive than the card under it in the segmentation is going to sell a lot of poo poo.

Craptacular! fucked around with this message at 05:56 on Dec 30, 2019
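The "100 FPS at 1440p ≈ 60 fps at 4K" comparison can be sanity-checked with raw pixel throughput (pixels drawn per second, ignoring per-frame CPU cost). They land in the same ballpark, though 4K60 actually pushes roughly a third more pixels:

```python
# Raw fill-rate arithmetic for the 1440p/100 vs 4K/60 comparison.
def pixels_per_second(width, height, fps):
    """Pixels drawn per second at a given resolution and framerate."""
    return width * height * fps

qhd_100 = pixels_per_second(2560, 1440, 100)  # 368,640,000 px/s
uhd_60 = pixels_per_second(3840, 2160, 60)    # 497,664,000 px/s
ratio = uhd_60 / qhd_100                      # 1.35: 4K60 is ~35% more work
```

So a card that holds 100 FPS at 1440p is in the right class for 4K60, but not quite equivalent: by this simple measure, 4K60 needs about 35% more throughput.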

Fantastic Foreskin
Jan 6, 2013

A golden helix streaked skyward from the Helvault. A thunderous explosion shattered the silver monolith and Avacyn emerged, free from her prison at last.

People aren't buying them because they don't have a thousand bucks. And the average gamer's resolution is still 1080p, which you can max out in that overwhelmingly popular $200-300 range; the $300-400 range will max out 1440p settings at various degrees of >60 fps for the very serious gamers. You've got a seriously distorted view of what the market is.

That said, the halo effect is very real, and Nvidia's brand is way, way more valuable than AMD's. Putting out a top-performing GPU would be very good for them, though even if they could, they'd have to fight through a lot of inertia just to get it recognized.

Inept
Jul 8, 2003

K8.0 posted:

People not buying them doesn't mean they don't want to.

At the end of the day if no one is buying it, it doesn't matter what people would ideally get. Hardly anyone buys $1000 GPUs because that's an insane price to play video games that look a bit nicer than an xbonex at a glance.

Mr.PayDay
Jan 2, 2004
life is short - play hard

Inept posted:

At the end of the day if no one is buying it, it doesn't matter what people would ideally get. Hardly anyone buys $1000 GPUs because that's an insane price to play video games that look a bit nicer than an xbonex at a glance.

The Xbox's locked 30 fps 4K settings are hilariously easy to meet with newer lower- to mid-tier GPUs, as the RDR2 tests showed. The 1660 Ti already gets more fps than the Xbox One X in RDR2 at 4K (with 33 avg fps, IIRC).

The Xbox One X settings in RDR2 are far below Ultra PC settings, more a mixture of Low, Medium, and High.

lol @ "a bit nicer". The Xbox One X 4K screenshots are visibly washed out and have seriously less detail than the 4K Ultra PC settings.

[RDR2 Xbox One X vs. PC Ultra comparison screenshots]

Many of those are originally 1440p screenshots, by the way. A "bit nicer" than Xbox, you say? ;-) The PC Ultra settings are on a whole different level of detail (literally) and visual immersion.

Mr.PayDay fucked around with this message at 07:34 on Dec 30, 2019

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Mr.PayDay posted:

The Xbox's locked 30 fps 4K settings are hilariously easy to meet with newer lower- to mid-tier GPUs, as the RDR2 tests showed. The 1660 Ti already gets more fps than the Xbox One X in RDR2 at 4K (with 33 avg fps, IIRC).

The Xbox One X settings in RDR2 are far below Ultra PC settings, more a mixture of Low, Medium, and High.

lol @ "a bit nicer". The Xbox One X 4K screenshots are visibly washed out and have seriously less detail than the 4K Ultra PC settings.

[RDR2 Xbox One X vs. PC Ultra comparison screenshots]

Many of those are originally 1440p screenshots, by the way. A "bit nicer" than Xbox, you say? ;-) The PC Ultra settings are on a whole different level of detail (literally) and visual immersion.

Yup, it's a myth that modern consoles are performing better than their GPUs allow. Fact is, most people on PC aren't happy with 30 fps and sub-native res, and love to crank every last setting to the max, regardless of visual impact.

HalloKitty fucked around with this message at 11:27 on Dec 30, 2019

Worf
Sep 12, 2017

If only Seth would love me like I love him!

30 fps is a tough pill to swallow going from my PC at 144+

Graphics fidelity I care less about but playing at 30 on my Xbox feels like I'm playing in 1998

shrike82
Jun 11, 2005

Pretty keen to see the next gen consoles in action

lllllllllllllllllll
Feb 28, 2010

Now the scene's lighting is perfect!
Following current trends and priorities those will likely offer 8k with RTX on in... 30 FPS.


Indiana_Krom
Jun 18, 2007
Net Slacker

Craptacular! posted:

Nah, Nvidia has had poorly optimized DX12 for a while. In the days of Pascal vs Polaris, the general understanding was, "AMD is doing well on DX12, but nobody wants to use that because Microsoft is leveraging it to promote UWP adoption." Now that Microsoft has switched to using Game Pass to promote UWP/Metro and the Windows Store, regular rear end .exe games are going from DX11 to DX12 with a Vulkan fallback for Windows 7, Linux, Stadia, etc, and the old "owns bones at DX11" strength isn't as important as it was. Nvidia needs to optimize poo poo.

EDIT: Also, truly max settings are only for screenshots. One to two settings below whatever the most tricked-out profile is called is gonna be what people should (if not necessarily will) target. 100 FPS at 1440p is about the same amount of work as 60 fps at 4K, and since that's really the dream for most people for the next Assassin's Creed and the next Sekiro/Souls and the next Tomb Raider etc, whoever builds that card without being substantially more expensive than the card under it in the segmentation is going to sell a lot of poo poo.

I doubt that; Nvidia probably spends more than half of their resources optimizing other people's software. A huge part of Nvidia's continued success is the massive amount of developer support they provide. If you were paying attention, you'd have noticed that for a stretch they were shipping a new Game Ready driver every few days, so many that keeping up to date was getting to be a pain in the rear end.
