|
2K and 4K refer to horizontal resolutions (2048 and 4096 pixels) because the terms originate in film scanning, where it makes more sense to standardize scanning resolutions by the width of the film.
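To make the naming concrete, here's a quick comparison of the film-derived DCI container sizes against consumer UHD (a sketch; the labels and dict are just mine for illustration):

```python
# DCI standards (from digital cinema / film scanning) are named by width;
# consumer "4K" UHD is actually only 3840 pixels wide.
resolutions = {
    "DCI 2K": (2048, 1080),
    "DCI 4K": (4096, 2160),
    "UHD (consumer 4K)": (3840, 2160),
}
for name, (w, h) in resolutions.items():
    print(f"{name}: {w}x{h} (aspect {w / h:.2f})")
```

Note that DCI 4K is exactly double DCI 2K in each dimension, while UHD is double 1920x1080.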
|
# ¿ Nov 27, 2017 06:50 |
|
jonathan posted: Why are the centrifugal fan/blower cards considered worse? They don't seem to recirculate any of the warm air back into the case. Is it just a matter of them being noisy?
|
# ¿ Feb 2, 2018 19:04 |
|
My AMD card self-reports at ~35W idle (at 300 MHz too, so it's probably mostly VRAM?). I've read that having two monitors of differing resolution increases the idle power significantly, no idea how true this is though.
|
# ¿ Feb 20, 2018 20:43 |
|
Zedsdeadbaby posted: I thought ray tracing was supposed to be virtually impossible to do in realtime. Mind you, I read that around 10-15 years ago. Is it just the actual existence of raw computing power that's making it possible now?

Dadbod Apocalypse posted: For the olds with ancient aging bodies among us, can someone explain what ray-tracing brings to the gaming table that's different from what's going on now?

The softening of the shadow here as it extends further out would be expensive to implement in current engines, for instance. In practice I think the most immediate benefit of path tracing is more accurate reflections, since current games use either a cubemap or some screen-space effect, both of which can look pretty janky.
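To illustrate why soft shadows fall out of ray tracing almost for free: each shaded point fires shadow rays at random points on an area light, and the fraction of rays that reach the light gives the penumbra directly. A toy sketch (the scene, names, and numbers are all made up for illustration, not from any engine):

```python
import random

def ray_hits_sphere(origin, direction, center, radius):
    # Solve |origin + t*direction - center|^2 = radius^2 for the near root t > 0
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return False
    t = (-b - disc ** 0.5) / (2 * a)
    return t > 1e-6

def visibility(point, light_center, light_size, occluder, samples=256):
    # Fraction of jittered shadow rays that reach a square area light overhead
    rng = random.Random(0)
    visible = 0
    for _ in range(samples):
        target = [light_center[0] + (rng.random() - 0.5) * light_size,
                  light_center[1],
                  light_center[2] + (rng.random() - 0.5) * light_size]
        direction = [t - p for t, p in zip(target, point)]
        if not ray_hits_sphere(point, direction, *occluder):
            visible += 1
    return visible / samples

# Sphere occluder floating between a floor and a square light overhead
occluder = ([0.0, 1.0, 0.0], 0.5)
light_center, light_size = [0.0, 3.0, 0.0], 1.5
for x in (0.0, 0.6, 1.2, 2.0):
    v = visibility([x, 0.0, 0.0], light_center, light_size, occluder)
    # Fully shadowed under the sphere, fully lit far away, penumbra in between
    print(f"x={x}: {v:.2f} lit")
```

The penumbra widens with distance from the occluder automatically; rasterizers have to approximate this with filtered shadow maps.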
|
# ¿ Mar 19, 2018 21:21 |
|
You can do both at once: https://en.wikipedia.org/wiki/Path_tracing#Bidirectional_path_tracing
|
# ¿ Mar 19, 2018 21:27 |
|
Alpha Mayo posted: All you lose with AMD right now is single-threaded performance, which is important for gaming and almost nothing else.

https://www.anandtech.com/show/11859/the-anandtech-coffee-lake-review-8700k-and-8400-initial-numbers/10
https://uk.hardware.info/reviews/76...4-x265-and-flac

Using video encoding as an example: in the x264 tests Intel 6c/6t is about equal to Ryzen 6c/12t, and in x265 (which I think leans even more on AVX) the 6c/6t i5s can even match the 8c/16t Ryzens.
|
# ¿ Mar 21, 2018 19:51 |
|
orcane posted: The human eye is not a camera though, so DoF effects and motion blur are 100% about games trying to be movies. So yeah, art.

There are good reasons to have motion blur in video that are unrelated to movies: it's arguably less correct to treat each pixel as a point sample in the temporal dimension, and the conventional 180-degree shutter angle is a good compromise between sharpness and coverage. Movies have the option to reduce motion blur too, but they don't, because less blur looks unnaturally jerky.
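A quick sketch of that trade-off (my own toy function, not from any camera or engine): shutter angle maps directly to how long each frame integrates motion, so the same 180-degree convention produces less absolute blur at higher frame rates.

```python
def exposure_time(fps, shutter_angle_deg=180.0):
    """Exposure (motion blur) duration per frame for a given shutter angle.

    A 360-degree shutter exposes for the entire frame interval; 180 degrees
    exposes for half of it, leaving the other half as a temporal gap."""
    return (shutter_angle_deg / 360.0) / fps

print(exposure_time(24))         # 1/48 s, the conventional film look
print(exposure_time(60, 180.0))  # 1/120 s: same angle, much less blur per frame
```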
|
# ¿ Jun 5, 2018 20:52 |
|
DrDork posted: More like because everyone is conditioned to how movies at 24fps look, so anything higher "looks wrong" despite the higher rates actually being closer to how the eye would normally perceive and register moving scenes.

For a visual example: https://www.youtube.com/watch?v=xn1mpszEjZM
|
# ¿ Jun 5, 2018 21:23 |
|
1gnoirents posted: It's probably out there, but this is a pretty stark change from seriously 1 year ago, where you'd almost always see Sandy Bridge thrown into benchmarks when reviewing the newest CPU on at least some major sites. Now it's hard to even find Haswell.
|
# ¿ Jun 10, 2018 05:09 |
|
That depends on what's meant by foreseeable, I think? The next gen of consoles is expected to release in 2020, and I can imagine games needing more than a 1060 at 1080p60 after that (if you want all the fancy new effects and stuff).
|
# ¿ Jun 21, 2018 11:20 |
|
buglord posted: Is there a ray tracing/no ray tracing comparison video because all my smooth brain sees in these demos are just better shadows or reflections or something

Better reflections and shadows (and occlusion) are basically the whole point of it~
|
# ¿ Aug 16, 2018 04:34 |
|
Partial Octopus posted: So it seems like most 3-fan cards are in the 300mm range. My case can fit a max of around a 286mm card. Will I lose much performance if I go with a shorter card with fewer fans? Or is it just going to be a bit louder? Is it worth changing my case to just accommodate larger cards?

There are some outliers like MSI's Armor cooler where they tried to save too much money on the heatsink, or small form factor cards that intentionally give up performance to fit in a smaller size.
|
# ¿ Aug 20, 2018 16:24 |
|
The FE designs do look sorta cool
|
# ¿ Aug 20, 2018 20:08 |
|
It's kinda telling how many of the games demoed so far do only one effect at a time: shadows in Tomb Raider, reflections in Battlefield, diffuse GI for Metro. Control looks like it uses all of the above at once, though. I'm guessing it's even less performant in real time.
|
# ¿ Aug 25, 2018 23:22 |
|
BIG HEADLINE posted:It's examples like this that prove something definitive - if you're playing a competitive shooter, you're going to play with RTX off simply because of better visibility. Who cares if it means washed out, unnatural-looking scenes if it's the difference between getting a kill and getting killed.
|
# ¿ Aug 26, 2018 19:26 |
|
Quake 3 must be the earliest example of this: it had a console command (r_picmip) to turn texture resolution down to basically nothing, which became standard in competitive play.
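For reference, picmip-style settings just force the renderer to skip the top mip levels of each texture; a sketch of the arithmetic (the function name is mine):

```python
def mip_size(width, height, picmip):
    # Each picmip level drops one mip level, halving both texture dimensions;
    # clamp at 1x1 so deep levels stay valid.
    return max(1, width >> picmip), max(1, height >> picmip)

print(mip_size(256, 256, 0))  # (256, 256): full detail
print(mip_size(256, 256, 4))  # (16, 16): the flat-shaded competitive look
```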
|
# ¿ Aug 29, 2018 19:02 |
|
Also standard in competitive play: forcing other players to render with the biggest, most visible model, coloured bright green.
|
# ¿ Aug 29, 2018 19:21 |
|
Channel Well Technology does the RMx line.
|
# ¿ Aug 30, 2018 11:30 |
|
Zero VGS posted: My general impression was that "per transistor cost may increase" means in practice that a 2080 chip costs Nvidia $32 instead of $30 to manufacture, or something to that effect. Would the chip wholesale price difference of a couple bucks really scale exponentially to the MSRP? I would think MSRP is entirely based on what the market will bear and how much the competition is undercutting you.

https://semiengineering.com/big-trouble-at-3nm/ - this article suggests that design costs alone are expected to increase into the hundreds of millions of dollars per chip:

quote: Design costs are also a problem. Generally, IC design costs have jumped from $51.3 million for a 28nm planar device to $297.8 million for a 7nm chip and $542.2 million for 5nm, according to IBS. But at 3nm, IC design costs range from a staggering $500 million to $1.5 billion, according to IBS. The $1.5 billion figure involves a complex GPU at Nvidia.

Llamadeus fucked around with this message at 03:07 on Aug 31, 2018 |
# ¿ Aug 31, 2018 03:05 |
|
And the new 2070 & 2080 cards are going to be 8 GB too; you'll have to go up to a 1080 Ti or 2080 Ti to get 11 GB.
|
# ¿ Sep 14, 2018 05:43 |
|
AVeryLargeRadish posted: What you outlined is reasonable, but it's also not what I was complaining about. I'm seeing people say that Nvidia should not release cards with new tech unless there is software on the market that can use that new tech, to me at least that is ridiculous, someone has to take the first step and hardware makers are the ones that make the most sense for that.

Like, nobody releases game consoles with zero launch games, for the same reason.
|
# ¿ Sep 19, 2018 15:41 |
|
AVeryLargeRadish posted: Yes, it is an unreasonable expectation on the PC. Consoles are completely different, you can't play a console game at all without the console, you're basically saying that developers should put resources into features that might be used at some point in the future and no dev in their right mind will do that when there are other things they could put their resources towards that see immediate benefit now.

Nvidia just decided to launch the cards before a single implementation was ready.
|
# ¿ Sep 19, 2018 16:08 |
|
Zero VGS posted: It definitely smears stuff in some of those shots. In motion it does seem hard to notice. However there's sustained shots in that video where it's the same or even a frame or two slower than TAA, so while much faster on average, it's not exactly consistent.

I would like to see a comparison video like that, but showing DLSS, TAA, and no aliasing at all, so I can see how accurate they're both getting to the reference material.
|
# ¿ Sep 20, 2018 03:52 |
|
K8.0 posted: Real-time TAA still always looks like dogshit, which is specifically why Nvidia chooses it as a comparison point.

When we can compare DLSS to non-TAA images, then we'll see how it actually looks, and whether it makes 4k monitors a worthwhile purchase when worthwhile ones come out, or whether it's better to stick with 1440.
|
# ¿ Sep 21, 2018 14:32 |
|
It's worth pointing out that PSNR is not necessarily an effective way to judge encoding quality across different implementations. x264 contains a few optimizations that worsen PSNR and SSIM but help a lot in improving perceptual quality.
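A toy illustration of that limitation (my own example values): PSNR is a pure function of mean squared error, so a barely visible uniform offset and a single concentrated error can score identically while looking nothing alike.

```python
import math

def psnr(reference, distorted, max_value=255):
    # Peak signal-to-noise ratio over flat lists of 8-bit pixel values;
    # it measures average squared error only, with no perceptual model.
    mse = sum((a - b) ** 2 for a, b in zip(reference, distorted)) / len(reference)
    if mse == 0:
        return float("inf")
    return 10 * math.log10(max_value ** 2 / mse)

ref = [100, 150, 200, 250]
offset = [v + 2 for v in ref]        # every pixel off by 2: MSE = 4
spike = [100, 150, 204, 250]         # one pixel off by 4:   MSE = 4
print(psnr(ref, offset))  # identical scores...
print(psnr(ref, spike))   # ...for perceptually different distortions
```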
|
# ¿ Sep 24, 2018 19:19 |
|
Noctua do have black versions of their fans, as well as fully black versions of the D15 etc. coming in the next year.
|
# ¿ Oct 4, 2018 01:00 |
|
I run the latest version for a few hours (sometimes it crashes/errors when other stuff doesn't), which I can't imagine to be much more damaging than running "normal" 100% loads with the same overclock for hours at a time over years.
|
# ¿ Oct 6, 2018 01:45 |
|
Paul MaudDib posted: Asus's Haswell-E Overclocking guide:

In any case I'd expect anyone pushing the limits of >300W CPUs to just use some actual common sense in this case.
|
# ¿ Oct 6, 2018 02:06 |
|
Scionix posted: Isn't an overclocked 2700x about 8700k performance?

The 9900K will be ahead in both so they'll be able to charge a large premium for it, maybe even after the next generation of Ryzens launches.

Llamadeus fucked around with this message at 21:23 on Oct 12, 2018 |
# ¿ Oct 12, 2018 21:19 |
|
Truga posted: 40% upscaling means it's rendering 1.4 times pixels your monitor has.

Though a few do it by total pixel count instead.

Llamadeus fucked around with this message at 00:19 on Oct 21, 2018 |
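The two render-scale conventions, sketched (function names are mine): per-axis scaling at 140% nearly doubles the pixel count, while pixel-count scaling at 140% renders only 1.4x the pixels.

```python
def per_axis_scale(width, height, scale):
    # "140%" multiplies each dimension, so pixel count grows by scale squared
    return round(width * scale), round(height * scale)

def pixel_count_scale(width, height, scale):
    # "140%" multiplies total pixels, so each axis grows by sqrt(scale)
    axis = scale ** 0.5
    return round(width * axis), round(height * axis)

print(per_axis_scale(1920, 1080, 1.4))     # (2688, 1512): 1.96x the pixels
print(pixel_count_scale(1920, 1080, 1.4))  # ~1.4x the pixels
```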
# ¿ Oct 20, 2018 23:39 |
|
That single fan doesn't look any bigger than the dual fans though; it's just a smaller heatsink, to save costs and for small form factor builds. And heatsink surface area is the more important factor.
|
# ¿ Jan 17, 2019 07:11 |
|
There's one unused fan slot at the bottom
|
# ¿ Mar 3, 2019 19:44 |
|
Digital Foundry also had a weird outlier where the 5700 and 5700 XT really underperformed at 1080p and 4K in Metro Exodus (but not 1440p): https://www.eurogamer.net/articles/digitalfoundry-2019-amd-radeon-rx-5700-rx-5700-xt-review-head-to-head-with-nvidia-super?page=5 I'd guess this is something fixable in a driver update.
|
# ¿ Jul 7, 2019 23:02 |
|
The other disadvantage of the probe approach is that the resolution of the GI is limited by the density of the probes. The per-pixel approach resolves higher-frequency detail, so probe-based GI is more reliant on screen-space AO to make up for it. Still a pretty good trade-off if your main goal is to capture the general feel of the lighting in a space (and no worse than current baked probes).
|
# ¿ Jul 12, 2019 21:27 |
|
Zotix posted:How much better at 1440p is a 2070 Super than a 1070GTX? I know we are basically talking 2+ generations of video cards, but I'm building an entirely new rig with a 3900x in it, and I was going to carry over my 1070 for a year until the next gen is fully released. However, I'm not sure of the performance I'll see on my new rig with the 1070. I'm upgrading from a 3570k to a 3900x, so I'm assuming I'll have some performance increase.
|
# ¿ Jul 13, 2019 01:17 |
|
ZobarStyl posted:Having said that, I couldn't find anything on how the 3400G is laid out internally - does it use a distinct GPU chiplet or is it a single die with a CCX and GPU cores?
|
# ¿ Jul 16, 2019 15:06 |
|
Well, also keep in mind that that article is written more in the context of buying TVs for watching video, which generally has closer-to-ideal levels of aliasing to begin with.
|
# ¿ Nov 6, 2019 00:05 |
|
Enos Cabell posted:Weird, I have a 9700k but haven't noticed any stuttering at all. 1080ti @1440p mostly high settings, averaging around 65fps.
|
# ¿ Nov 7, 2019 14:38 |
|
Taima posted: Yeah definitely. What’s the deal with that? It was also sometimes like an overexposed photo.

The formats are futureproofed to the point that they specify much wider gamuts and higher peak brightnesses than any consumer display is capable of, so the display itself has to do an additional tone mapping operation to make the content look acceptable. And I assume HDR monitors are not as good at this as TVs are, if they're doing anything at all.

Llamadeus fucked around with this message at 17:28 on Dec 10, 2019 |
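A minimal sketch of that display-side step, assuming a simple Reinhard-style rolloff (real TVs use fancier, often proprietary curves): the PQ HDR signal can encode luminance up to 10,000 nits, but a display peaking at a few hundred has to compress that range, and highlights get squeezed hardest.

```python
def tone_map(scene_nits, display_peak=600.0):
    # Compress unbounded scene luminance into [0, display_peak).
    # Low values pass through nearly linearly; bright highlights roll off,
    # which is where clipped/washed-out looks come from when a display
    # handles this step badly.
    x = scene_nits / display_peak
    return display_peak * x / (1.0 + x)

for nits in (100, 600, 1000, 4000, 10000):
    print(f"{nits:>5} nits in -> {tone_map(nits):6.1f} nits out")
```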
# ¿ Dec 10, 2019 17:18 |
|
"PS5 = 5500" is beyond lowballing it though; the current Xbox One X already has 5500-level performance, and the fanboys are already mildly disappointed by the leaked 36 CU number.
|
# ¿ Jan 6, 2020 04:48 |