|
Warmachine posted:tbh I do lock my framerate because for what I do I don't need the thing putting out a hundred extra frames I'll never see just to suck down power and heat my apartment. It's also what you should be doing if you have a G-Sync monitor!
|
# ? Oct 2, 2020 18:07 |
|
K8.0 posted:I wouldn't be surprised if the announced 3070 is a farce that was never going to exist. I've been saying from the beginning that it looks like really poor value compared to the 3080. Replacing it with a better value GPU would be a perfect way to yank the rug out from under AMD. I guess I am going to have to eat poo poo on this one, good call. Looks like they are going for the "jebait" approach to fine-tune pricing after AMD announces the Navi cards. I still dunno about changing up the hardware spec itself; I still think they may be going for a yield strategy where they have GA102 3090/3080, GA103 3080/3070, GA104 3070/3060, so they can swap in one of two different dies for any given card but the chips are cut to the same configuration regardless of which die it is. They definitely look like they're going for price tweaks at a minimum, though. If nvidia announces that 2080 Ti performance is now only $449.99 then people are gonna lose their poo poo lol Paul MaudDib fucked around with this message at 18:11 on Oct 2, 2020 |
# ? Oct 2, 2020 18:07 |
|
The FEs have been removed from the nvidia store. Only AIB 3080s and 3090s show up. The speculation is currently going wild. What if it really was a limited release Fauxtool fucked around with this message at 18:09 on Oct 2, 2020 |
# ? Oct 2, 2020 18:07 |
|
Rolo posted:A CDW rep on Reddit just provided estimated arrival dates to their warehouses. Does CDW mark up their cards? I'm seeing their XC3 as $899 and their FTW3 as $936 Fauxtool posted:The FEs have been removed from the nvidia store. Only AIB 3080s and 3090s show up. The speculation is currently going wild. What if it really was a limited release
|
# ? Oct 2, 2020 18:08 |
|
Someone was asking about Amazon - I set up an updated version of one of the nvidia bots last night that polled for a few different kinds of cards on Amazon and used the add-all-to-cart link trick. Looks like it snagged a non-OC Asus TUF for me at 1:35 AM Pacific. Estimated delivery is currently Dec 7-9.
jkyuusai fucked around with this message at 18:12 on Oct 2, 2020 |
# ? Oct 2, 2020 18:09 |
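The polling-plus-cart-link approach described in that post can be sketched roughly like this. This is a hypothetical illustration only: the ASIN, the "Currently unavailable" page marker, and the cart-link parameter names are my assumptions about how such a bot might work, not a description of the actual bot or of Amazon's current markup.

```python
# Hypothetical sketch of a stock-polling bot like the one described above.
# ASINs, page markers, and cart-link parameters are assumptions for
# illustration; real pages and endpoints may differ.
import time
import urllib.request

ASINS = ["B0EXAMPLE1"]  # placeholder ASINs for the cards being watched


def in_stock(page_html: str) -> bool:
    """Crude availability check: assume out-of-stock pages carry a marker string."""
    return "Currently unavailable" not in page_html


def add_all_to_cart_url(asins, quantities=None):
    """Build an 'add-all-to-cart' style link: one URL that adds every listed
    item to the cart in a single click (parameter scheme assumed)."""
    quantities = quantities or [1] * len(asins)
    parts = []
    for i, (asin, qty) in enumerate(zip(asins, quantities), start=1):
        parts.append(f"ASIN.{i}={asin}&Quantity.{i}={qty}")
    return "https://www.amazon.com/gp/aws/cart/add.html?" + "&".join(parts)


def poll(interval_s=30):
    """Loop over the watch list until something looks purchasable."""
    while True:
        for asin in ASINS:
            html = urllib.request.urlopen(
                f"https://www.amazon.com/dp/{asin}"
            ).read().decode("utf-8", "ignore")
            if in_stock(html):
                print("In stock! ->", add_all_to_cart_url([asin]))
                return
        time.sleep(interval_s)  # don't hammer the server between checks
```

The one-click cart URL is the whole trick: it skips the product page entirely, which matters when stock evaporates in seconds.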
|
The reality is probably that they finally figured out a way to stop bots from pinging their stock and will add it back shortly before any drops
|
# ? Oct 2, 2020 18:10 |
|
Paul MaudDib posted:after a lot of tut-tutting about "never trust first-party benchmarks!" the NVIDIA benchmarks for the 3080 ended up being basically accurate. You have to carefully watch what they are choosing to show you, like the "1.9x perf/watt! (in a locked framerate scenario that strains the 2080 but lets the 3080 idle down)" figure, or all the benchmarks being done at 4K to hide poor scaling at 1080p, but they're not actually gimmicking the numbers themselves. They're not lying, just being deliberately misleading! Lies of omission don't violate advertising laws! Go back to r/Nvidia paul. Maybe you can grow a grassroots fanbase and go to forum war with r/amd.
|
# ? Oct 2, 2020 18:10 |
|
Fauxtool posted:The reality is probably that they finally figured out a way to stop bots from pinging their stock and will add it back shortly before any drops ahhh great, i love having even more f5'ing to do
|
# ? Oct 2, 2020 18:11 |
|
Fauxtool posted:The FEs have been removed from the nvidia store. Only AIB 3080s and 3090s show up. The speculation is currently going wild. What if it really was a limited release nvidia, summer 2020 "there's a 12 week lead time on these cards so we need to plan for launch. how many cards should we make?" "i dunno, like, a thousand?" "a thousand of them? god drat buddy i don't even know a thousand people. let's not get ahead of ourselves here." "you're right. let's start with 300" "ok, that sounds fair. 300 cards for launch and then we'll re-evaluate."
|
# ? Oct 2, 2020 18:12 |
|
Some Goon posted:They're not lying, just being deliberately misleading! Lies of omission don't violate advertising laws! someone woke up on the wrong side of the bed
|
# ? Oct 2, 2020 18:13 |
|
Paul MaudDib posted:You have to carefully watch what they are choosing to show you, like the "1.9x perf/watt! (in a locked framerate scenario that strains the 2080 but lets the 3080 idle down)" figure, or all the benchmarks being done at 4K to hide poor scaling at 1080p, but they're not actually gimmicking the numbers themselves. My favourite thing about this is that 1080p being largely CPU limited nowadays because GPUs are so powerful is somehow Nvidia's fault and a shameful secret they must hide.
|
# ? Oct 2, 2020 18:14 |
|
Sagebrush posted:nvidia, summer 2020 it's just fancy dirt, how much are people really gonna want them?
|
# ? Oct 2, 2020 18:14 |
|
It was a mistake to teach sand to think.
|
# ? Oct 2, 2020 18:15 |
|
Paul MaudDib posted:
He isn't wrong though. Misrepresentation is misrepresentation.
|
# ? Oct 2, 2020 18:18 |
|
8-bit Miniboss posted:It was a mistake to teach sand to think. thou shalt not make a machine in the likeness of a human mind
|
# ? Oct 2, 2020 18:23 |
|
MikeC posted:He isn't wrong though. Misrepresentation is misrepresentation. What was misrepresented, though? They hit pretty close to all the marks they talked about. It seems the biggest issue is that they've been using the 2080 as a base point for comparison with their metrics, while a lot of people are instead thinking of it compared to the 2080Ti. That's not misrepresentation at all, and if anything is the more reasonable way to do the comparison given the relative prices.
|
# ? Oct 2, 2020 18:23 |
|
Sagebrush posted:thou shalt not make a machine in the likeness of a human mind We didn't. These ones do only what they're told and never decide to wander off and peruse Facebook instead of finishing their work. They're very bad at thinking.
|
# ? Oct 2, 2020 18:24 |
|
AirRaid posted:My favourite thing about this is that 1080p being largely CPU limited nowadays because GPUs are so powerful is somehow Nvidia's fault and a shameful secret they must hide. also it’s the exact same people who spent literal years whining about “who buys a 2080 Ti to play at 1080p!?!” in CPU benchmarks who are suddenly getting the vapors about poor 1080p scaling on the 3070/3080 Paul MaudDib fucked around with this message at 18:35 on Oct 2, 2020 |
# ? Oct 2, 2020 18:30 |
|
1080p is the dark ages now. 2 megapixels??? haha no. get a 5-megapixel ultrawide please
|
# ? Oct 2, 2020 18:33 |
|
The funny thing is how Intel has harped on how their gaming performance sets apart their shitass 14nm++++++++++++++++++++++++++++++++++++++++++++++++ refresh vs. the Ryzen 3000 series, when all the benchmarks that show it are 1080p. It's like, who the poo poo is buying an i9-10900K atomic pile you have to cool with liquid helium to run 5% more FPS at 1080-loving-p?
|
# ? Oct 2, 2020 18:37 |
|
3080 FEs are still showing on the nvidia uk store (out of stock)
|
# ? Oct 2, 2020 18:39 |
|
sean10mm posted:The funny thing is how Intel has harped on how their gaming performance sets apart their shitass 14nm++++++++++++++++++++++++++++++++++++++++++++++++ refresh vs. the Ryzen 3000 series, when all the benchmarks that show it are 1080p. Because 1080p shows up CPU differences best: as stated literally on this page, games at that res are CPU limited, so CPU performance shines through more.
|
# ? Oct 2, 2020 18:39 |
|
Also showing on my phone in the US. Shogunner posted:Does CDW mark up their cards? I'm seeing their XC3 as $899 and their FTW3 as $936 They do mark up to make up for the customer service representatives they include with their sales, which I guess aren't really tailored for nerds buying one-off.
|
# ? Oct 2, 2020 18:40 |
|
"nvidia should have delayed their launch so they had more launch day stock" ok, we are delaying the 3070 release by 2 weeks to have more launch day stock "what is nvidia hiding?? its a conspiracy!!" Gamer Brain is real
|
# ? Oct 2, 2020 18:41 |
|
AirRaid posted:Because 1080p shows up CPU differences best, because, as stated literally on this page, games at that res are CPU limited and so CPU performance shines through more at that res. You're missing the point. No one gives a real gently caress whether CS:GO runs at 450 FPS or 460 FPS at 1080p. If that's the only place you can demonstrate your "superior CPU performance," you've missed the boat.
|
# ? Oct 2, 2020 18:42 |
|
the greatest shooter ever made, ut99, runs beautifully at 120hz in 3440x1440 on my 1060 so honestly i don't even know why i need a 3080 at this point
|
# ? Oct 2, 2020 18:44 |
|
DrDork posted:You're missing the point. Exactly.
|
# ? Oct 2, 2020 18:45 |
|
Can't wait to get a 3080 so I play indie treasures that run on anything and browse the SA forums with it
|
# ? Oct 2, 2020 18:47 |
|
DrDork posted:You're missing the point. But it's not just Intel. That is the standard for CPU benchmarking in games as far as I can tell. It's the way to show the greatest degree of difference between differing CPUs. It's not to say that those differences only show up at 1080p, but really if you're gaming at 4K then your CPU is not going to be the issue at all and all the different CPUs will perform similarly, so what's the point? It's the same for GPU testing. Benchmarks are moving away from 1080p GPU testing because the numbers are the same. Some games show the same numbers at 1080p from a 3090 and a 1080 Ti. Also, say you've got a 10% difference in CPU speed. You can show at 1080p the difference between 150 FPS and 165 FPS, or you can show the difference at 4K as maybe between 50 FPS and 55 FPS (numbers pulled literally out of my arse). The larger margin gives more room for error, and shows more clearly any differences outside standard deviation.
|
# ? Oct 2, 2020 18:49 |
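The margin argument in that post can be sketched numerically. A toy check, where the FPS figures are the post's own made-up numbers and the ±3 FPS run-to-run noise band is my assumption for the sake of the example:

```python
# Toy illustration of the benchmark-margin argument above. The FPS figures
# come from the post; the +/-3 FPS run-to-run noise is an assumed value.
NOISE_FPS = 3.0  # assumed run-to-run variance of one benchmark pass


def separable(fps_a: float, fps_b: float, noise: float = NOISE_FPS) -> bool:
    """True if the gap between two results exceeds the combined noise
    band of both measurements (noise on each side of the comparison)."""
    return abs(fps_a - fps_b) > 2 * noise


# 1080p, CPU-limited: a 10% CPU gap shows up as a 15 FPS gap
print(separable(150, 165))  # prints True: 15 FPS gap clears the 6 FPS band

# 4K, GPU-limited: the same 10% CPU gap is squeezed to 5 FPS
print(separable(50, 55))    # prints False: 5 FPS gap is inside the band
```

Same relative difference both times, but only the low-resolution run produces a gap you can distinguish from measurement noise, which is the whole reason reviewers test CPUs at 1080p.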
|
Sagebrush posted:the greatest shooter ever made, ut99, runs beautifully at 120hz in 3440x1440 on my 1060 so honestly i don't even know why i need a 3080 at this point https://www.youtube.com/watch?v=DV1fUwKMdAI
|
# ? Oct 2, 2020 18:50 |
|
AirRaid posted:But it's not just Intel. That is the standard for CPU benchmarking in games as far as I can tell. It's the way to show the greatest degree of difference between differing CPUs. It's not to say that those difference only show up at 1080p, but really if you're gaming at 4K then your CPU is not going to be the issue at all and all the different CPUs will perform similarly so whats the point? It used to be that the standard for CPU benchmarking was 480p. I'm sure if we brought that back it'd show even greater Intel domination! Intel has been sticking to 1080p high-fps benchmarks for their promotional material because that's about the only place their chips look noticeably better than AMD's right now, and even then it's only if you overclock them to the moon and compare vs non-OC'ed R7/9's. I'm not saying that reviewers shouldn't throw some 1080p benchmarks into their CPU reviews, if for no other reason than a lot of people still use 1080p, so it's nice to be able to verify that a new chip doesn't have some unexpected weirdness that would hold you back. What I am saying is that, in terms of corporate marketing and cherry picking benchmark results, Intel showcasing a $550 CPU at 1080p is hilarious because no one sane is spending $550 on a CPU and then playing at 1080p. They're intentionally avoiding the worksets and use-cases that a lot of people spending that much on a CPU are likely to actually use it for specifically because they don't look great there. The context here was "lol at NVidia for picking benchmarks that show stuff in ways we don't like," when they've actually been picking comparatively reasonable test sets this time around compared to a lot of other tech companies.
|
# ? Oct 2, 2020 19:05 |
|
For what it is worth, the other half of why I stuck with Intel for this build is that the games I play that are also CPU bound benefit from high clocks because they either don't take advantage of multithreading or do it really really badly. So breaking the 5GHz barrier actually does matter for me. FPS? Who loving cares as long as it meets or exceeds the refresh rate of your monitor consistently?
|
# ? Oct 2, 2020 19:11 |
|
What's the point of even looking at "average fps" in CPU benchmarks? Stuttering and 1% lows are the only thing relevant for anything but those idiotic 1080p 360hz monitors.
|
# ? Oct 2, 2020 19:25 |
|
Nothing I love more than a rock steady 15 fps.
|
# ? Oct 2, 2020 19:29 |
|
If anyone wants a deeper insight you can read GN's rationale for 1080p CPU testing here https://www.gamersnexus.net/guides/3577-cpu-test-methodology-unveil-for-2020-compile-gaming-more
|
# ? Oct 2, 2020 19:33 |
|
For Canadians that aren't in a rush, Memory Express is now allowing backorders.
|
# ? Oct 2, 2020 19:33 |
|
DrDork posted:It used to be that the standard for CPU benchmarking was 480p. I'm sure if we brought that back it'd show even greater Intel domination! if you seriously believe this has a single thing to do with intel then you have no idea why they're ahead in game performance metrics currently. please go inform yourself rather than spreading your nonsense everywhere, you're comically misinformed on, well, everything
|
# ? Oct 2, 2020 19:34 |
|
Man, no respect for anyone with a 240hz 1080p monitor just trying to get max fps with modern games, some of which are now CPU bound.
|
# ? Oct 2, 2020 19:37 |
|
slidebite posted:For Canadians that aren't in a rush, Memory Express is now allowing backorders. I hope you're really not in a rush, they've been taking them since the 19th.
|
# ? Oct 2, 2020 19:38 |
|
|
I've literally been going to their website every day or two just out of curiosity, and they've never shown anything available to add to cart, only in-store pickup. Today is the first day I noticed otherwise.
|
# ? Oct 2, 2020 19:47 |