|
Videocardz: Intel (Xe) DG1 spotted with 96 Execution Units
Anandtech: Analyzing Intel’s Discrete Xe-HPC Graphics Disclosure: Ponte Vecchio, Rambo Cache, and Gelato
|
# ? Dec 26, 2019 11:00 |
|
|
# ? May 15, 2024 17:17 |
|
apropos man posted:I'm on the lookout for a 1660 Super or a Ti after Christmas is over. I take it the MSI cards aren't bandwidth limited? This is me too. I'm waiting for AMD to release the RX 5600 series to see if Nvidia will lower prices.
|
# ? Dec 27, 2019 00:09 |
|
Anyone ever experienced some kind of weird GPU noise that sounds like something between coil whine and an obstructed fan when they run certain tasks like (specific) local videos and certain things in certain games? Noticed it a week ago on my old 1070, and it's pretty weird. Doesn't sound like the coil whine I got when I tried overclocking the card a while back. E: I guess it's possible it's something from the new CPU fan, but that would make it even weirder. Especially since it happens instantaneously and not with a ramp-up. E2: Seems to be coming from the GPU. ufarn fucked around with this message at 00:33 on Dec 27, 2019 |
# ? Dec 27, 2019 00:21 |
|
ufarn posted:Anyone ever experienced some kind of weird GPU noise that sounds like something between coil whine and an obstructed fan when they run certain tasks like (specific) local videos and certain things in certain games? Might still be coil whine, my card makes something closer to a buzzing noise with the pitch changing depending on the load. It happening when watching video is kind of weird though.
|
# ? Dec 27, 2019 00:40 |
|
ufarn posted:Anyone ever experienced some kind of weird GPU noise that sounds like something between coil whine and an obstructed fan when they run certain tasks like (specific) local videos and certain things in certain games? I had a sort of clicking noise on a card a couple of years ago. More like an electrical clicking than a physical clicking noise. It would happen sporadically but much more often when I had v-sync on.
|
# ? Dec 27, 2019 00:48 |
|
Does the card have a 0db mode? Might be the fan spinning up if so. I had to override the 0db mode on my EVGA because the fans make an obnoxious grinding noise when they spin up, and it was more annoying than just letting them idle at a very low rpm.
|
# ? Dec 27, 2019 00:57 |
|
ufarn posted:Anyone ever experienced some kind of weird GPU noise that sounds like something between coil whine and an obstructed fan when they run certain tasks like (specific) local videos and certain things in certain games? I had a Dell laptop with integrated graphics that did that. It was most definitely the GPU and not a fan. Only occurred during OpenGL though. I think I've got a machine at work with an ATI card that does it too. Any time the screen is being actively drawn on, the GPU emits a weird sound that I can best describe as reminiscent of the old modem dialup handshake, but really, really quiet. It's a pretty rare thing I've only encountered a handful of times in thirty years of messing with computers, but at any rate you're not crazy and it isn't just you. Sheep fucked around with this message at 05:10 on Dec 27, 2019 |
# ? Dec 27, 2019 05:07 |
|
It's a Palit GameRock with 0 RPM functionality. The fan probably spins up because of my madVR settings or something, even though it's mainly WebM files, and it doesn't seem to happen with MKV. It's just weird that it happens so instantaneously. On top of that, a menu in Resident Evil 2 also makes it happen. Assuming fan hysteresis is an issue, what minimum should I set the fan speeds to? It's a used card I bought when the disappointing Turing line-up was announced, and it's run great so far. I don't *think* it's degradation over time but more likely some weird, specific strain that brings out the coil whine.
|
# ? Dec 27, 2019 14:25 |
|
You can use GPU-Z to monitor the fan speed and see if the noise coincides with the fan speed going from 0% to >0%.
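That check can also be automated. Here's a minimal sketch in Python, assuming an Nvidia card with `nvidia-smi` on the PATH (GPU-Z itself has no scripting interface, so this uses the driver's own query tool instead):

```python
import subprocess
import time

def parse_fan_percent(raw: str) -> int:
    """Parse nvidia-smi fan.speed output such as '35 %' or '35' into an int."""
    return int(raw.strip().rstrip("%").strip())

def read_fan_percent() -> int:
    """Query the driver for the current fan speed (requires an Nvidia GPU)."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=fan.speed", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_fan_percent(out)

def watch_transitions(interval_s: float = 1.0) -> None:
    """Log whenever the fan leaves or returns to 0% (the 0 RPM mode boundary)."""
    last = None
    while True:
        speed = read_fan_percent()
        if last is not None and (last == 0) != (speed == 0):
            print(f"fan transition: {last}% -> {speed}%")
        last = speed
        time.sleep(interval_s)

# Run watch_transitions() on the affected machine while playing the WebM.
```

If the noise lines up with the logged transitions, it's the fans spinning up out of 0 RPM mode; if it happens while the fan stays at 0%, coil whine is the better bet.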
|
# ? Dec 27, 2019 15:22 |
|
Also look at thermals and memory utilization; I bet if you looked at what was going on across your system, the cause would jump out.
|
# ? Dec 27, 2019 17:26 |
|
ufarn posted:It's a Palit GameRock with 0 RPM functionality. Hmm. I haven't tried Resident Evil, but it sounds like coil whine (I have a GameRock 1080 that luckily doesn't do it).
|
# ? Dec 27, 2019 17:39 |
|
The first two quarters are baseline, the third quarter is the MP4 file, and the fourth is the WebM. Easiest way to spot it is in GPU Clock and Power Consumption. I noticed some faint whine with parts of the MP4, so I think it's tied to some sort of load rather than some weird edge case. I have to assume there's some degradation, since I don't recall noticing it before. Seems weird I'd just start hearing it. ufarn fucked around with this message at 01:16 on Dec 28, 2019 |
# ? Dec 28, 2019 01:14 |
|
I've always had the loudest coil whine with anything that displays at extremely high framerates, so loading screens, menus, and 3d accelerated videos.
|
# ? Dec 28, 2019 01:22 |
|
Ah that's a good point, uncapped frame rate in a certain menu could be the reason. I remember when older games caused a lot of issues because of uncapped menus. WarCraft 3 or StarCraft 2 did iirc.
|
# ? Dec 28, 2019 01:39 |
|
The low video engine load is suspicious. What codec exactly is the webm file encoded in? If it's VP9, the software you're using might not use the Nvidia hardware decoder correctly and use some hosed up hybrid concoction. In case it's 10bit HDR VP9 that's a whole other can of worms. sauer kraut fucked around with this message at 02:13 on Dec 28, 2019 |
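One way to check what the file actually is, sketched in Python around ffmpeg's `ffprobe` (assumes ffmpeg is installed; the codec list and the 10-bit caveat are assumptions about a Pascal-era NVDEC like the 1070's, not an exact support matrix):

```python
import json
import subprocess

def probe_video(path: str) -> dict:
    """Return codec_name, profile, and pix_fmt of the first video stream."""
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=codec_name,profile,pix_fmt",
         "-of", "json", path],
        capture_output=True, text=True, check=True,
    ).stdout
    return json.loads(out)["streams"][0]

def decode_path_hint(stream: dict) -> str:
    """Rough guess at the decode path. The hardware-codec set below is an
    assumption for this GPU generation, not a verified support table."""
    hw_codecs = {"h264", "hevc", "vp8", "vp9"}  # assumed hardware-decodable
    codec = stream["codec_name"]
    pix_fmt = stream.get("pix_fmt", "")
    if codec not in hw_codecs:
        return "software decode likely"
    if "10" in pix_fmt or "12" in pix_fmt:
        return "10/12-bit stream: hybrid or software decode possible"
    return "full hardware decode likely"

# Example (hypothetical file name):
# print(decode_path_hint(probe_video("clip.webm")))
```

If the WebM turns out to be 10-bit VP9, that's the can-of-worms case above, and the player may be doing part of the work on the CPU or shaders rather than the dedicated decode block.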
# ? Dec 28, 2019 01:48 |
|
A Redditor found an Asrock listing for the 5600XT specs.
|
# ? Dec 29, 2019 15:09 |
|
Looks good, can't wait to see how they gently caress this up with the price
|
# ? Dec 29, 2019 18:57 |
|
Happy_Misanthrope posted:Looks good, can't wait to see how they gently caress this up with the price I was admittedly hyped with what the RX 5500 XT was promising. Oh how naive I was.
|
# ? Dec 29, 2019 21:29 |
|
Shrug, the 5600XT is gonna be what I hoped the 5500XT was going to be. Maybe I'm just spoiled on what Nvidia X50 and X50ti cards are historically capable of within that power/price band?
|
# ? Dec 30, 2019 01:13 |
|
SwissArmyDruid posted:Shrug, the 5600XT is gonna be what I hoped the 5500XT was going to be. Well the 1660 Super is around 20% faster than, say, a Radeon RX 580 (along with 2 gigs less memory). So that's a 20% uptick in the same price bracket after close to 3 years now (more if you count the 480). Bitcoin really hosed everything up for a while, but this price segment hasn't exactly been on fire lately. AMD will likely price it at 1660 Ti or better, but the problem is the Ti serves little purpose with the $50 cheaper Super being basically identical in performance. It might be closer to the 2060 6 GB, but if so then I would expect AMD to price it even higher.
|
# ? Dec 30, 2019 01:58 |
|
What's the over/under on AMD coming up with something that actually competes with the future 3080?
|
# ? Dec 30, 2019 02:37 |
|
Wistful of Dollars posted:What's the over/under on AMD coming up with something that actually competes with the future 3080? In my completely uneducated opinion, lol
|
# ? Dec 30, 2019 03:14 |
|
Maybe in the year 3080
|
# ? Dec 30, 2019 03:17 |
|
Pretty sure AMD would be elated putting out a part that favorably competes with the 2080 non-Ti, since the Radeon 7 wasn't it.
|
# ? Dec 30, 2019 03:17 |
|
Wistful of Dollars posted:What's the over/under on AMD coming up with something that actually competes with the future 3080? AMD can't beat Turing even with an entire full-node lead, so: lmao
|
# ? Dec 30, 2019 03:25 |
|
Wistful of Dollars posted:What's the over/under on AMD coming up with something that actually competes with the future 3080? 3.5
|
# ? Dec 30, 2019 03:48 |
|
In the coming year? lolno. Best I'm willing to hope for is that they provide a solid alternative to a 3070. In 2021, tho, I don't frikkin' know. Folks that talked to their GPU division got the impression that they're trying to do a gradual creep over the years with Navi, like they did with Ryzen in the CPU space, but it's easy enough to say that and another thing entirely to actually do it when most of AMD's funding will likely be going into Ryzen to capitalize on their success this year. So I'd still temper my expectations TBH.
|
# ? Dec 30, 2019 03:50 |
|
As good as my 5700 XT has been, I don't see AMD competing with Nvidia's high-end parts anytime soon, especially the next series. 7nm Nvidia is going to be good unless they start sandbagging it because they don't have any high-end competition.
|
# ? Dec 30, 2019 04:18 |
|
Zen wasn't a gradual creep, it was an acknowledgement that they were producing absolute trash, and then five years of work to design a brand-new product that could compete with Intel at the low end, followed by a few years of refinement and a process jump to creep upward from there. I would estimate that it will take a similar 5-year span for AMD to completely rebuild their GPU division from the ground up to have any hope of ever competing against Nvidia. The good news is that AMD is making tons of money right now, so they can afford to fund such a venture. The bad news is that Nvidia hasn't made a massive blunder like Intel's 10nm bet that leads to them doing nothing for 3+ years, and they aren't likely to. The thing AMD should be doing, but probably isn't because their GPU division seems to always be run by idiots, is working to secure software advantages. Getting developers to port over a lot of the performance-enhancing features from console versions of games could give AMD a big leg up, with an ability to trade minor quality impacts for huge performance gains. It'd already be implemented in a way that favors AMD, and it would reduce some of the drive to buy $1000+ GPUs in the first place, if you can get 95% of the image quality and 80% of the framerate from a $400-500 GPU. K8.0 fucked around with this message at 04:24 on Dec 30, 2019 |
# ? Dec 30, 2019 04:20 |
|
The drive to buy $1000 GPUs (in the consumer space) is basically non-existent anyways. It's the $200-300 bracket that gets the majority of sales (where they're not competitive either, but maybe the 5600 can fix that?). Top-end parts are great advertising, but it's not where the economic battle is fought.
|
# ? Dec 30, 2019 04:29 |
|
People not buying them doesn't mean they don't want to. The drive is there because games are murdering GPUs right now. You have to spend more than ever to get decent performance in most games. For many years, $300-330 bought a GPU that would run almost every game at/close to max detail at the max res/refresh your average gamer's monitor supported. Now, the 2080 and Ti are the only products that can kinda manage that, and everything below the $350ish 5700s is just a "limp by playing old games" product. The fact that Nvidia is the only company selling GPUs that can even come close to max detail gives them a new kind of mindshare dominance: AMD is just not relevant in people's minds. Many would rather buy an Nvidia card because at least they're buying into the winning team. That's why bringing performance up across the board would benefit AMD: if you could get close to max settings and ~100 FPS at 1440p on a 5700/XT, people would care a lot less that Nvidia dominates the high end of the market, and AMD could actually compete. With the new consoles coming out and doubtless making the GPU performance situation even worse, things are ripe for AMD to exploit, but I am almost certain they won't try to do so.
K8.0 fucked around with this message at 04:59 on Dec 30, 2019 |
# ? Dec 30, 2019 04:47 |
|
K8.0 posted:The drive is there because games are murdering GPUs right now. Nah, Nvidia has poorly optimized DX12 for a while. In the days of Pascal vs Polaris, the general understanding was, "AMD is doing well on DX12, but nobody wants to use that because Microsoft is leveraging it to promote UWP adoption." Now that Microsoft has switched to using Game Pass to promote UWP/Metro and the Windows Store, regular rear end .exe games are going from DX11 to DX12 with a Vulkan fallback for Windows 7, Linux, Stadia, etc, and the old "owns bones at DX11" strength isn't as important as it was. Nvidia needs to optimize poo poo. EDIT: Also, the truly max settings are only for screenshots. One to two settings below whatever the most tricked out profile is called is gonna be what people should (if not necessarily will) target. 100 FPS at 1440p is about the same amount of work as 60 fps at 4K, and since that's really the dream to most people for the next Assassin's Creed and the next Sekiro/Souls and the next Tomb Raider etc, whoever builds that card without being substantially more expensive than the card under it in the segmentation is going to sell a lot of poo poo. Craptacular! fucked around with this message at 05:56 on Dec 30, 2019 |
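The "about the same amount of work" claim is easy to sanity-check with raw pixel throughput (a crude proxy that ignores per-frame fixed costs):

```python
def pixels_per_second(width: int, height: int, fps: int) -> int:
    """Raw pixel throughput: a rough proxy for rendering work."""
    return width * height * fps

qhd_100 = pixels_per_second(2560, 1440, 100)   # 1440p @ 100 FPS
uhd_60  = pixels_per_second(3840, 2160, 60)    # 4K @ 60 FPS

print(f"1440p100: {qhd_100 / 1e6:.0f} Mpx/s")  # 369 Mpx/s
print(f"4K60:     {uhd_60 / 1e6:.0f} Mpx/s")   # 498 Mpx/s
print(f"ratio:    {uhd_60 / qhd_100:.2f}")     # 1.35
```

So 4K60 is roughly a third more pixels per second than 1440p100: same ballpark, with 4K60 the slightly heavier target.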
# ? Dec 30, 2019 05:50 |
|
People aren't buying them because they don't have a thousand bucks. And the average gamer's resolution is still 1080p, which you can max out settings at in that overwhelmingly popular $200-300 range, and the $300-400 range will max out settings at 1440p to various degrees of >60fps for the very serious gamers. You've got a seriously distorted view of what the market is. That said, the halo effect is very real and Nvidia's brand is way, way more valuable than AMD's; putting out a top-performing GPU would be very good for them, though even if they could, they'd have to fight through a lot of inertia just to get it recognized.
|
# ? Dec 30, 2019 06:37 |
|
K8.0 posted:People not buying them doesn't mean they don't want to. At the end of the day if no one is buying it, it doesn't matter what people would ideally get. Hardly anyone buys $1000 GPUs because that's an insane price to play video games that look a bit nicer than an xbonex at a glance.
|
# ? Dec 30, 2019 06:56 |
|
Inept posted:At the end of the day if no one is buying it, it doesn't matter what people would ideally get. Hardly anyone buys $1000 GPUs because that's an insane price to play video games that look a bit nicer than an xbonex at a glance. The Xbox locked 30 fps settings at 4K are hilariously easy to meet with new lower to mid tier GPUs, as the RDR2 tests showed. The 1660 Ti already gets more fps than the Xbox One X in RDR2 at 4K (with 33 avg fps IIRC). The Xbox One X settings are far below Ultra PC settings in RDR2, more a mixture of Low, Medium and High settings. lol @ "a bit nicer". The Xbox One X 4K screenshots are visibly washed out and have seriously less detail than the 4K Ultra PC settings. Many of those are originally 1440p screenshots, by the way. A "bit nicer" than Xbox, you say? ;-) The PC Ultra settings are on a whole different level of detail (literally) and visual immersion. Mr.PayDay fucked around with this message at 07:34 on Dec 30, 2019 |
# ? Dec 30, 2019 07:30 |
|
Mr.PayDay posted:The Xbox locked 30 fps settings at 4K are hilariously easy to meet with new lower to mid tier GPUs, as the RDR2 tests showed. The 1660 Ti already gets more fps than the Xbox One X in RDR2 at 4K (with 33 avg fps IIRC). Yup, it's a myth that modern consoles are performing better than their GPUs allow. Fact is, most people on PC aren't happy with 30 fps and sub-native res, and love to crank every last setting to the max, regardless of visual impact. HalloKitty fucked around with this message at 11:27 on Dec 30, 2019 |
# ? Dec 30, 2019 11:23 |
|
30 fps is a tough pill to swallow going from my PC at 144+. Graphics fidelity I care less about, but playing at 30 on my Xbox feels like I'm playing in 1998.
|
# ? Dec 30, 2019 12:22 |
|
Pretty keen to see the next gen consoles in action
|
# ? Dec 30, 2019 12:30 |
|
Following current trends and priorities, those will likely offer 8K with RTX on at... 30 FPS.
|
# ? Dec 30, 2019 12:39 |
|
|
Craptacular! posted:Nah, Nvidia has poorly optimized DX12 for a while. In the days of Pascal vs Polaris, the general understanding was, "AMD is doing well on DX12, but nobody wants to use that because Microsoft is leveraging it to promote UWP adoption." Now that Microsoft has switched to using Game Pass to promote UWP/Metro and the Windows Store, regular rear end .exe games are going from DX11 to DX12 with a Vulkan fallback for Windows 7, Linux, Stadia, etc, and the old "owns bones at DX11" strength isn't as important as it was. Nvidia needs to optimize poo poo. I doubt that; Nvidia probably spends more than half of their resources optimizing other people's software. A huge part of Nvidia's continued success is the massive amount of developer support they provide. If you were paying attention, you would have noticed that for a stretch they were shipping a new game-ready driver every few days, so many that it was getting to be a pain in the rear end keeping up to date.
|
# ? Dec 30, 2019 13:22 |