|
that would be a perfectly reasonable workstation-tier GPU though (like an RDNA2 Frontier Edition), so that's a great example of an April Fools' joke that is not really funny

edit: except GDDR7 I guess, wow so funny

Paul MaudDib fucked around with this message at 16:46 on Apr 2, 2020
# ? Apr 2, 2020 16:39 |
|
|
|
ShaneB posted:Someone talk me out of moving from an RX580 to an RTX 2060 Super... please? Seems like a reasonable upgrade?
|
# ? Apr 2, 2020 17:04 |
|
Lube banjo posted:It's probably nothing/false since :WCCFTECH: but 32GB of VRAM, what the hell
|
# ? Apr 2, 2020 17:15 |
|
lllllllllllllllllll posted:Seems like a reasonable upgrade? Yeah but then I am dropping $400. QUESTIONABLE.
|
# ? Apr 2, 2020 17:43 |
|
Happy_Misanthrope posted:dude, really Yeah sorry. I got like 3 hrs of sleep last night and I've been in zombie-mode playing Fire Emblem all day. It popped up in a news feed and I didn't see anyone talking about it. Seemed somewhat realistic but I did notice the blower fans were spinning in opposite directions. Thought it was kind of weird but hey
|
# ? Apr 2, 2020 17:49 |
|
So AMD's joke was "we made a good GPU"?
|
# ? Apr 2, 2020 18:38 |
|
32GB not a joke tho
|
# ? Apr 2, 2020 20:26 |
|
VostokProgram posted:So AMD's joke was "we made a good GPU"? I think it was a techbro website joke, not AMD's own
|
# ? Apr 2, 2020 20:30 |
|
ShaneB posted:Yeah but then I am dropping $400. QUESTIONABLE.
|
# ? Apr 2, 2020 20:40 |
|
april 1 - push bullshit to the limit to see what people will still click on and share
|
# ? Apr 2, 2020 20:48 |
|
every day is April 1st if you write for WCCF Tech
|
# ? Apr 2, 2020 21:52 |
|
TorakFade posted:Hmm, after the last couple of driver updates (Nvidia, I have an EVGA 1080 FTW bought at the end of 2018) RDR2 seems to have some weird graphics glitch, with random red and white flashes. My baseless guess/suggestion is try changing your render engine from dx12 to vulkan or vice versa and see what that does.
|
# ? Apr 3, 2020 00:08 |
|
The GTC talk on DLSS 2.0 went public if anyone wants a deeper dive: https://developer.nvidia.com/gtc/2020/video/s22698
|
# ? Apr 3, 2020 00:43 |
|
repiv posted:The GTC talk on DLSS 2.0 went public if anyone wants a deeper dive: https://developer.nvidia.com/gtc/2020/video/s22698 This is great, thanks for the link.
|
# ? Apr 3, 2020 01:31 |
|
Really it's more of a primer on the limitations of other upscalers, and a demonstration that DLSS avoids those problems, but they don't give any real details on the inner workings. Nvidia are playing this one close to the chest; it's their secret free performance juice and they're not sharing with the class.
|
# ? Apr 3, 2020 04:07 |
|
The one thing that I think is really striking in that presentation is the comparison shots with 32x supersampling. I was mindblown to find out that much of what I thought was oversharpening in DLSS 2.0 is actually just better conformance to a supersampled image. My assumption is that their algorithm must be incredibly good at using the motion data to re-use samples over time, letting them very effectively supersample anything that isn't in extreme motion.

I think the fact that they got the DLSS processing time down is actually very important. Being able to run at high base framerates matters a lot for anything like this working well, because for a given motion, a higher framerate means more samples, less difference between samples, and easier correction, and thus better image quality.

I wonder what AMD's response is. If they haven't been playing catch-up in secret, I can't imagine them not getting absolutely hosed next gen. Nvidia is almost certainly going to include tensor cores across the stack, so even if an AMD card benches better, in practice being able to use DLSS for maybe a minor IQ hit and a huge performance gain is going to make AMD largely irrelevant. It's going to be an interesting time: AMD will probably be able to point to benchmarks and truthfully claim that they are providing better 1:1 performance at a given price point, but Nvidia will probably be providing significantly better actual experience per dollar without any sufficiently objective way to quantify that.
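Nvidia doesn't publish the internals, but the basic motion-compensated sample reuse that any temporal upscaler builds on can be sketched in a toy 1D form (a generic illustration of the idea, not DLSS itself):

```python
def temporal_accumulate(history, motion, new_sample, alpha=0.1):
    """One step of a toy temporal accumulator (1D scanline for simplicity).

    history:    previously accumulated values, one per pixel
    motion:     per-pixel offsets saying where each pixel came from
    new_sample: this frame's raw (aliased/jittered) samples
    alpha:      blend weight for new data; small alpha = heavy reuse
    """
    n = len(history)
    out = []
    for i in range(n):
        # Reproject: fetch this pixel's history from where it was last frame.
        src = min(max(i - motion[i], 0), n - 1)
        # Exponential moving average: mostly history, a little new data.
        out.append((1 - alpha) * history[src] + alpha * new_sample[i])
    return out

# A static scene (zero motion) converges toward the true signal over many
# frames -- effectively supersampling spread out over time.
frame = [0.0] * 8
for _ in range(100):
    frame = temporal_accumulate(frame, [0] * 8, [1.0] * 8)
print(round(frame[0], 3))  # 1.0
```

This also shows why high base framerates help: more frames per unit of motion means smaller per-frame offsets and less history that has to be thrown away.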
|
# ? Apr 3, 2020 07:30 |
|
K8.0 posted:AMD will probably be able to point to benchmarks and truthfully claim that they are providing better 1:1 performance at a given price point, but Nvidia will probably be providing significantly better actual experience per dollar without any sufficiently objective way to quantify that. Well, there are a few parts to this. First, unless AMD has some sort of surprise that no one has seen coming, NVidia will likely dominate the upper-mid and high-end tiers again. I'd be real surprised if AMD could come out with a price-competitive card to match the xx70 on up without also accepting some silly compromises, like 300+W power use or something. So NVidia won't even have to bother with benchmark wars there; they'll be able to say that not only is the xx80(Ti) X% faster in benchmarks, it's X+50% faster with DLSS.

In the mid-tier, where AMD probably will be price-competitive, I don't think the story changes much in their favor from today: even at the price points where they are competitive, they often lose out to NVidia based on drivers, name recognition, etc. There's a reason the xx60 has been a massive seller, and it's not because the 580/590s are bad cards. Regardless of whether DLSS goes down to the xx50/60 cards, I don't think many people are going to care that the 680/690 bench 5% higher (if they even do, which I think is doubtful).

The low end... who the gently caress knows. That's always been a weird space with no rules and very strange decisions.
|
# ? Apr 3, 2020 16:15 |
|
The Xbox Series X (still a poo poo name) presents an interesting case, though. Ray tracing hardware, newer architecture, 52 CUs, 1.825GHz... 12 TFLOPS. It seems like it could be a serious contender if it were released as a normal card, maybe pushing the boost out even a little further, and possibly enabling the remaining 4 CUs for an XT part. It's a lot beefier than anything they're selling right now. HalloKitty fucked around with this message at 17:11 on Apr 3, 2020
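The 12 TFLOPS figure falls straight out of the CU count and clock; a quick sanity check, assuming RDNA's 64 FP32 lanes per CU and counting an FMA as two ops:

```python
def peak_fp32_tflops(cus, clock_ghz, lanes_per_cu=64, ops_per_lane=2):
    """Peak FP32 throughput: CUs * lanes * 2 ops (FMA = mul+add) * clock."""
    return cus * lanes_per_cu * ops_per_lane * clock_ghz / 1000.0

print(peak_fp32_tflops(52, 1.825))      # ~12.15 -- the quoted 12 TFLOPS
print(peak_fp32_tflops(52 + 4, 1.825))  # ~13.08 with all 56 CUs enabled
```

So a hypothetical XT part with the four binned-off CUs re-enabled would be pushing 13 TFLOPS before any clock bump.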
# ? Apr 3, 2020 17:07 |
|
repiv posted:Really it's more of a primer on the limitations of other upscalers, and a demonstration that DLSS avoids those problems, but they don't give any real details on the inner workings. I think the secret sauce is compute time; they needed 1.5 years after DLSS 1.0 to get the model trained with enough data.
|
# ? Apr 3, 2020 17:34 |
|
Now the question is, how quickly can Nvidia's devrel people crank out DLSS patches for games. DLSS 2.0 was just confirmed for Amid Evil, which doesn't sound particularly useful for such a lightweight game, but they're probably taking the shotgun approach and trying to get it into as many UE4 games as possible now that the integration work is done.
|
# ? Apr 3, 2020 18:00 |
|
Are you sure the integration takes nvidia assistance? Sounds like devs are implementing it themselves but I'm no expert.
|
# ? Apr 3, 2020 18:15 |
|
The developers can do it themselves if they want to (there's no need for NV to train a model for them anymore) but NV have a bunch of people whose job is to embed in game studios and assist with integrating NV tech into their engines. It's in NVs best interests to speed things along.
repiv fucked around with this message at 18:25 on Apr 3, 2020 |
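For anyone curious what the "integration work" actually involves: a temporal upscaler needs the engine to hand over a specific set of per-frame buffers, which is the plumbing the devrel folks help wire up. A rough sketch (the names here are illustrative, not the actual NGX API):

```python
# Per-frame inputs a temporal upscaler integration has to source from the
# engine's renderer (illustrative names, not any vendor's real API).
REQUIRED_INPUTS = [
    "color",           # current frame, rendered at reduced resolution
    "depth",           # per-pixel depth buffer
    "motion_vectors",  # screen-space motion, per pixel
    "jitter",          # subpixel camera offset applied this frame
]

def missing_inputs(frame: dict) -> list:
    """Return the buffers the engine forgot to provide."""
    return [k for k in REQUIRED_INPUTS if k not in frame]

print(missing_inputs({"color": object(), "depth": object()}))
# ['motion_vectors', 'jitter']
```

Motion vectors and jitter are the usual sticking points, since many engines don't expose them cleanly, which is why having it already plumbed into UE4 makes the shotgun approach cheap.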
# ? Apr 3, 2020 18:20 |
|
So I jumped the gun and upgraded to a 2560x1440 monitor before my hardware was probably ready for it and some games are struggling a bit. I am using the old and formerly very popular i5 4460 cpu and a GTX 1070 gpu. At this point I know I need a new CPU and that may be bottle-necking me, but is my GPU also good enough for 1440p or should I be considering a step up? If so, is there a current card in the sweet spot for that resolution right now?
|
# ? Apr 4, 2020 03:40 |
|
Murvin posted:So I jumped the gun and upgraded to a 2560x1440 monitor before my hardware was probably ready for it and some games are struggling a bit. I am using the old and formerly very popular i5 4460 cpu and a GTX 1070 gpu. At this point I know I need a new CPU and that may be bottle-necking me, but is my GPU also good enough for 1440p or should I be considering a step up? If so, is there a current card in the sweet spot for that resolution right now? 1070 should be fine for most games at 1440p as long as you aren't trying for 144hz. What games in particular are you struggling with?
|
# ? Apr 4, 2020 03:48 |
|
Murvin posted:So I jumped the gun and upgraded to a 2560x1440 monitor before my hardware was probably ready for it and some games are struggling a bit. I am using the old and formerly very popular i5 4460 cpu and a GTX 1070 gpu. At this point I know I need a new CPU and that may be bottle-necking me, but is my GPU also good enough for 1440p or should I be considering a step up? If so, is there a current card in the sweet spot for that resolution right now? 1070 should be fine so long as you don't mind turning some settings down.
|
# ? Apr 4, 2020 09:01 |
|
Yep, dollar for dollar you are still probably better off spending on a GPU, so I wouldn't say you're bottlenecked, but a 1070 should get you 60fps+ in most games with settings turned down a smidge.
|
# ? Apr 4, 2020 14:47 |
|
Murvin posted:So I jumped the gun and upgraded to a 2560x1440 monitor before my hardware was probably ready for it and some games are struggling a bit. I am using the old and formerly very popular i5 4460 cpu and a GTX 1070 gpu. At this point I know I need a new CPU and that may be bottle-necking me, but is my GPU also good enough for 1440p or should I be considering a step up? If so, is there a current card in the sweet spot for that resolution right now? A little insight from someone who went from a O/C’d 3770k to a 3800x + 3600 CL15. I was running a 2070 under both setups, and frame rates were pretty good in both cases, but there was a noticeable increase in the minimum FPS and stability of FPS in general with the upgrade to the 3800x. Prior to the 2070, I had a 1070 until it popped a capacitor. The 1070 is a decent 1440p card, and should handle most of what you throw at it with some compromises. You’ll definitely see better FPS with a vid card upgrade, but more consistent FPS with a CPU upgrade. I guess what I’m trying to say is, either way you’ll benefit.
|
# ? Apr 4, 2020 17:00 |
|
What’s the forecast look like for USB-C monitor/GPU connection in the near future? I’m trying to figure out how to share my peripherals between my MacBook and my next desktop, and it would be pretty nifty if I didn’t have to chain a bunch of moderately expensive converters and switches to do so.
|
# ? Apr 4, 2020 17:05 |
|
RTX cards can already do it, the USB-C VirtualLink port can also drive regular USB-C monitors. Make sure the SKU you choose has the port though, some non-reference designs omit it.
|
# ? Apr 4, 2020 17:10 |
|
Murvin posted:So I jumped the gun and upgraded to a 2560x1440 monitor before my hardware was probably ready for it and some games are struggling a bit. I am using the old and formerly very popular i5 4460 cpu and a GTX 1070 gpu. At this point I know I need a new CPU and that may be bottle-necking me, but is my GPU also good enough for 1440p or should I be considering a step up? If so, is there a current card in the sweet spot for that resolution right now? In addition to what other people have said, you can consider that right now new GPUs that present a big value increase will come sooner than new CPUs that present a big value increase. Unless you really need a GPU right now, I'd be going CPU upgrade first.
|
# ? Apr 4, 2020 17:16 |
|
repiv posted:RTX cards can already do it, the USB-C VirtualLink port can also drive regular USB-C monitors. Ah, good. I would just then need to add yet another criterion when shopping for my next monitor, which tbh may be a mercy given how confusing I found the process last time!
|
# ? Apr 4, 2020 19:04 |
|
repiv posted:RTX cards can already do it, the USB-C VirtualLink port can also drive regular USB-C monitors. Huh, I didn't know that. Is it DisplayLink over USB-C? I assumed it was just gonna be a useless plug nobody ever used, like SATA Express.
|
# ? Apr 4, 2020 19:13 |
|
The MSI Optix MAG251RX has a USB-C port, and it's looking like the best 1080p240 monitor when it comes out this Summer. Not sure how it works with video, especially Sync.
|
# ? Apr 4, 2020 19:42 |
|
Cygni posted:Huh, i didnt know that. Is it DisplayLink over USB-C? I assumed that was just gonna be a useless plug nobody ever used, like SATA Express. Isn't DisplayLink a fallback thing for cramming video down a data-only USB port? The port on the RTX cards has native DisplayPort-over-USB-C. It's also a general purpose USB 3.1 gen2 port you can use for whatever, and it supports USB-PD up to 27W so it can serve as a fast charger. VirtualLink seems to be stillborn though, Valve cancelled the VL cable for the Index and I don't think anyone else has adopted it.
|
# ? Apr 4, 2020 19:43 |
|
Yeah, Valve said they couldn't get it to work reliably enough.
|
# ? Apr 4, 2020 20:06 |
|
repiv posted:Isn't DisplayLink a fallback thing for cramming video down a data-only USB port? The port on the RTX cards has native DisplayPort-over-USB-C. Nice, I didn't know about the DisplayPort or general USB usage; I literally just thought it was a dumb dead port for me to ignore on my card forever. I guess it's slightly more useful than that!
|
# ? Apr 4, 2020 21:56 |
|
Lowen SoDium posted:Yeah, Valve said they couldn't get it work reliably enough. Is it just me or has every iteration of USB since 2 been less and less reliable?
|
# ? Apr 4, 2020 23:20 |
|
Ganondork posted:Is it just me or has every iteration of USB since 2 been less and less reliable? The actual reliability of USB 3 is perfectly fine. What's not has been the compatibility, as 3.x decided to introduce multiple layers of confusing and conflicting speeds, capabilities, etc. They'll all work as a generic 3.0 host, but then whether it supports super high speed, a given wattage for charging, video, etc., is all a poo poo-show.
|
# ? Apr 5, 2020 00:40 |
|
The selling point with USB-C in particular is that hey, it's one connector that does everything and you don't have to flip it three times when plugging in, sweet. The problem is that if you look at any random USB-C port there's absolutely no way of telling what it might support.

Does it supply power? Maybe, but at 5V, 12V or 20V, or any of them? How much power? Enough to charge your laptop? Who knows! Does it support power input, for that matter? Sometimes only one of the ports on a device does. Can you use it for display output? Quite possibly, but it's anyone's guess what resolutions and refresh rates are supported, or even which protocol is being used (HDMI is possible, AFAIK). It's quite common that 4K 60Hz doesn't work. And even if you just use it for data transfer, there's no consistency there either.

It's a loving mess.
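The mess is easy to quantify: treat each capability as an independent axis (a simplified, hypothetical breakdown; the real specs have even more dimensions) and count how many distinct behaviors can hide behind one identical-looking port:

```python
from itertools import product

# Simplified, hypothetical capability axes for a USB-C port; real ports
# mix and match these more or less freely, which is exactly the problem.
axes = {
    "data":      ["USB 2.0", "5 Gbps", "10 Gbps"],
    "alt_mode":  ["none", "DisplayPort", "HDMI"],
    "power_out": ["5W", "15W", "27W", "60W", "100W"],
    "power_in":  ["no", "yes"],
}

combos = list(product(*axes.values()))
print(len(combos))  # 3 * 3 * 5 * 2 = 90 visually identical port variants
```

Ninety variants from just four axes, and nothing on the connector itself tells you which one you're holding.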
|
# ? Apr 5, 2020 01:16 |
|
|
|
DrDork posted:The actual reliability of USB 3 is perfectly fine. What's not has been the compatibility, as 3.x decided to introduce multiple layers of confusing and conflicting speeds, capabilities, etc. They'll all work as a generic 3.0 host, but then whether it supports super high speed, a given wattage for charging, video, etc., is all a poo poo-show. We wanted so badly for USB 3 to be viable for the original Rift, because it would have reduced tracker latency and improved accuracy, but so many chipsets were broken in so many ways that we abandoned work on that mode. Poor guy on my team spending hours each day on the phone with firmware teams trying to get them to even acknowledge that they might want to be compatible at some point in the future. I forget what it was that we wanted to do, maybe something with isochronous mode, but pretty much every engagement with a vendor ended with “oh, yeah, the standard says that but we don’t do it. yeah, we advertise it because we do this other thing and some software expects to see both together, sorry about your life”. Maybe it’s better now, but I wouldn’t have the heart to ask Justin and find out!
|
# ? Apr 5, 2020 01:40 |