space marine todd
Nov 7, 2014



Warmachine posted:

tbh I do lock my framerate because for what I do I don't need the thing putting out a hundred extra frames I'll never see just to suck down power and heat my apartment.

It's also what you should be doing if you have a G-Sync monitor!
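
If you're wondering what "locking the framerate" actually buys you with G-Sync: the usual advice is to cap a few fps below the monitor's refresh so the card never outruns the variable-refresh window. A frame limiter boils down to the sketch below; real limiters live in the driver, RTSS, or the game engine, and the refresh rate and margin here are just assumed example values.

code:
import time

REFRESH_HZ = 144
CAP_FPS = REFRESH_HZ - 3        # common advice: stay a few fps under refresh
FRAME_BUDGET = 1.0 / CAP_FPS    # seconds allotted per frame

def render_frame():
    pass  # stand-in for the actual game work

while True:
    start = time.perf_counter()
    render_frame()
    # sleep off whatever budget is left so we never exceed the cap
    leftover = FRAME_BUDGET - (time.perf_counter() - start)
    if leftover > 0:
        time.sleep(leftover)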

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

K8.0 posted:

I wouldn't be surprised if the announced 3070 is a farce that was never going to exist. I've been saying from the beginning that it looks like really poor value compared to the 3080. Replacing it with a better value GPU would be a perfect way to yank the rug out from under AMD.

I guess I am going to have to eat poo poo on this one, good call. Looks like they are going for the "jebait" approach to fine-tune pricing after AMD announces the Navi cards. I still dunno about changing up the hardware spec itself; I still think they may be going for a yield strategy with GA102 on the 3090/3080, GA103 on the 3080/3070, and GA104 on the 3070/3060, so they can swap in one of two different dies for any given card, with the chips cut to the same configuration regardless of which die it is. But they definitely look like they're going for price tweaks at a minimum.

if nvidia announces that 2080 ti performance is now only $449.99 then people are gonna lose their poo poo lol

Paul MaudDib fucked around with this message at 18:11 on Oct 2, 2020

Fauxtool
Oct 21, 2008

by Jeffrey of YOSPOS
The FEs have been removed from the nvidia store. Only AIB 3080s and 3090s show up. Speculation is running wild. What if it really was a limited release?

Fauxtool fucked around with this message at 18:09 on Oct 2, 2020

Shogunner
Apr 29, 2010

Ready to crash and burn.
I never learn.
I'm on the rapetrain.

Rolo posted:

A CDW rep on Reddit just provided estimated arrival dates to their warehouses.

For people like me, they mentioned that they’re getting just over 100 of the XC3 cards, so those who ordered late the other day will have to wait another round.



Sorry for the image, the Reddit app won’t let me just copy text.

Does CDW mark up their cards? I'm seeing their XC3 as $899 and their FTW3 as $936 :(

Fauxtool posted:

The FEs have been removed from the nvidia store. Only AIB 3080s and 3090s show up. Speculation is running wild. What if it really was a limited release?

:negative:

jkyuusai
Jun 26, 2008

homegrown man milk
Someone was asking about Amazon - I set up an updated version of one of the nvidia bots last night that polled for a few different kinds of cards on Amazon and used the add-all-to-cart link trick. Looks like it snagged a non-OC Asus TUF for me at 1:35 AM Pacific. Estimated delivery is currently Dec 7-9.
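
For the curious, the approach looks roughly like the sketch below (a minimal illustration, not the actual bot; the ASINs, poll interval, and availability check are all placeholder assumptions): poll a handful of product pages and, once one stops reading as unavailable, open Amazon's add-all-to-cart link in a logged-in browser.

code:
# Minimal stock-polling sketch -- ASINs and the availability check are assumptions.
import time
import webbrowser
import requests

ASINS = ["B08EXAMPLE1", "B08EXAMPLE2"]   # hypothetical card listings
HEADERS = {"User-Agent": "Mozilla/5.0"}  # bare requests tend to get blocked

def in_stock(asin):
    """Crude check: the page no longer says 'Currently unavailable'."""
    r = requests.get(f"https://www.amazon.com/dp/{asin}", headers=HEADERS, timeout=10)
    return r.ok and "Currently unavailable" not in r.text

def cart_link(asins):
    """The add-all-to-cart trick: one URL that carts every ASIN at once."""
    parts = [f"ASIN.{i}={a}&Quantity.{i}=1" for i, a in enumerate(asins, start=1)]
    return "https://www.amazon.com/gp/aws/cart/add.html?" + "&".join(parts)

while True:
    hits = [a for a in ASINS if in_stock(a)]
    if hits:
        webbrowser.open(cart_link(hits))  # hand off to the logged-in browser
        break
    time.sleep(30)  # be polite; don't hammer the site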

jkyuusai fucked around with this message at 18:12 on Oct 2, 2020

Fauxtool
Oct 21, 2008

by Jeffrey of YOSPOS

The reality is probably that they finally figured out a way to stop bots from pinging their stock and will add it back shortly before any drops

Fantastic Foreskin
Jan 6, 2013

A golden helix streaked skyward from the Helvault. A thunderous explosion shattered the silver monolith and Avacyn emerged, free from her prison at last.

Paul MaudDib posted:

after a lot of tut-tutting about "never trust first-party benchmarks!" the NVIDIA benchmarks for 3080 ended up being basically accurate. You have to carefully watch what they are choosing to show you, like the "1.9x perf/watt! (in a locked framerate scenario that strains the 2080 but lets the 3080 idle down)" figure, or all the benchmarks being done at 4K to hide poor scaling at 1080p, but they're not actually gimmicking the numbers themselves.

no reason to suspect that 3070 is where they suddenly started lying

They're not lying, just being deliberately misleading! Lies of omission don't violate advertising laws!

Go back to r/Nvidia paul. Maybe you can grow a grassroots fanbase and go to forum war with r/amd.
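
To see how a locked-framerate perf/watt figure flatters the new card, run the arithmetic with made-up wattages (a sketch; the numbers below are purely hypothetical): with fps pinned, performance per watt collapses into a straight power-draw ratio, so whichever card can idle down at the cap wins automatically.

code:
# Hypothetical wattages, chosen only to illustrate the locked-framerate trick.
LOCKED_FPS = 60.0     # both cards render the same capped framerate
power_2080 = 210.0    # watts: older card working hard to hold the cap
power_3080 = 110.0    # watts: newer card idling down at the same cap

ppw_2080 = LOCKED_FPS / power_2080
ppw_3080 = LOCKED_FPS / power_3080

# With fps fixed, the "perf/W gain" reduces to power_2080 / power_3080.
print(f"claimed gain: {ppw_3080 / ppw_2080:.2f}x")  # ~1.91x with these numbers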

Shogunner
Apr 29, 2010

Ready to crash and burn.
I never learn.
I'm on the rapetrain.

Fauxtool posted:

The reality is probably that they finally figured out a way to stop bots from pinging their stock and will add it back shortly before any drops

ahhh great, i love having even more f5'ing to do

Sagebrush
Feb 26, 2012

Fauxtool posted:

The FEs have been removed from the nvidia store. Only AIB 3080s and 3090s show up. Speculation is running wild. What if it really was a limited release?

nvidia, summer 2020

"there's a 12 week lead time on these cards so we need to plan for launch. how many cards should we make?"
"i dunno, like, a thousand?"
"a thousand of them? god drat buddy i don't even know a thousand people. let's not get ahead of ourselves here."
"you're right. let's start with 300"
"ok, that sounds fair. 300 cards for launch and then we'll re-evaluate."

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Some Goon posted:

They're not lying, just being deliberately misleading! Lies of omission don't violate advertising laws!

Go back to r/Nvidia paul. Maybe you can grow a grassroots fanbase and go to forum war with r/amd.

:whitewater:

someone woke up on the wrong side of the bed

AirRaid
Dec 21, 2004

Nose Manual + Super Sonic Spin Attack

Paul MaudDib posted:

You have to carefully watch what they are choosing to show you, like the "1.9x perf/watt! (in a locked framerate scenario that strains the 2080 but lets the 3080 idle down)" figure, or all the benchmarks being done at 4K to hide poor scaling at 1080p, but they're not actually gimmicking the numbers themselves.

My favourite thing about this is that 1080p being largely CPU limited nowadays because GPUs are so powerful is somehow Nvidia's fault and a shameful secret they must hide.

Fauxtool
Oct 21, 2008

by Jeffrey of YOSPOS

Sagebrush posted:

nvidia, summer 2020

"there's a 12 week lead time on these cards so we need to plan for launch. how many cards should we make?"
"i dunno, like, a thousand?"
"a thousand of them? god drat buddy i don't even know a thousand people. let's not get ahead of ourselves here."
"you're right. let's start with 300"
"ok, that sounds fair. 300 cards for launch and then we'll re-evaluate."

it's just fancy dirt, how much are people really gonna want them?

8-bit Miniboss
May 24, 2005

CORPO COPS CAME FOR MY :filez:
It was a mistake to teach sand to think.

MikeC
Jul 19, 2004
BITCH ASS NARC

Paul MaudDib posted:

:whitewater:

someone woke up on the wrong side of the bed

He isn't wrong though. Misrepresentation is misrepresentation.

Sagebrush
Feb 26, 2012

8-bit Miniboss posted:

It was a mistake to teach sand to think.

thou shalt not make a machine in the likeness of a human mind

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

MikeC posted:

He isn't wrong though. Misrepresentation is misrepresentation.

What was misrepresented, though? They hit pretty close to all the marks they talked about. It seems the biggest issue is that they've been using the 2080 as a base point for comparison with their metrics, while a lot of people are instead thinking of it compared to the 2080Ti. That's not misrepresentation at all, and if anything is the more reasonable way to do the comparison given the relative prices.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Sagebrush posted:

thou shalt not make a machine in the likeness of a human mind

We didn't. These ones do only what they're told and never decide to wander off and peruse Facebook instead of finishing their work. They're very bad at thinking.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

AirRaid posted:

My favourite thing about this is that 1080p being largely CPU limited nowadays because GPUs are so powerful is somehow Nvidia's fault and a shameful secret they must hide.

also it’s the exact same people who spent literal years whining about “who buys a 2080 Ti to play at 1080p!?!” in CPU benchmarks who are suddenly getting the vapors about poor 1080p scaling on the 3070/3080

Paul MaudDib fucked around with this message at 18:35 on Oct 2, 2020

Sagebrush
Feb 26, 2012

1080p is the dark ages now. 2 megapixels??? haha no. get a 5-megapixel ultrawide please

sean10mm
Jun 29, 2005

It's a Mad, Mad, Mad, MAD-2R World
The funny thing is how Intel has harped on how their gaming performance sets apart their shitass 14nm++++++++++++++++++++++++++++++++++++++++++++++++ refresh vs. the Ryzen 3000 series, when all the benchmarks that show it are 1080p.

It's like, who the poo poo is buying an i9-10900K atomic pile you have to cool with liquid helium to run 5% more FPS at 1080-loving-p?

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
3080 FEs are still showing on the nvidia uk store (out of stock)

AirRaid
Dec 21, 2004

Nose Manual + Super Sonic Spin Attack

sean10mm posted:

The funny thing is how Intel has harped on how their gaming performance sets apart their shitass 14nm++++++++++++++++++++++++++++++++++++++++++++++++ refresh vs. the Ryzen 3000 series, when all the benchmarks that show it are 1080p.

It's like, who the poo poo is buying an i9-10900K atomic pile you have to cool with liquid helium to run 5% more FPS at 1080-loving-p?

Because 1080p shows up CPU differences best: as stated literally on this page, games at that res are CPU limited, so CPU performance shines through more.

Rolo
Nov 16, 2005

Hmm, what have we here?
Also showing on my phone in the US.

Shogunner posted:

Does CDW mark up their cards? I'm seeing their XC3 as $899 and their FTW3 as $936 :(

They do mark up, to cover the customer service representatives they include with their sales, which I guess aren’t really tailored for nerds making one-off purchases.

Cygni
Nov 12, 2005

raring to post

"nvidia should have delayed their launch so they had more launch day stock"

ok, we are delaying the 3070 release by 2 weeks to have more launch day stock

"what is nvidia hiding?? its a conspiracy!!"

Gamer Brain is real

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

AirRaid posted:

Because 1080p shows up CPU differences best: as stated literally on this page, games at that res are CPU limited, so CPU performance shines through more.

You're missing the point.

No one gives a real gently caress whether CS:GO runs at 450 FPS or 460 FPS at 1080p. If that's the only place you can demonstrate your "superior CPU performance," you've missed the boat.

Sagebrush
Feb 26, 2012

the greatest shooter ever made, ut99, runs beautifully at 120hz in 3440x1440 on my 1060 so honestly i don't even know why i need a 3080 at this point

sean10mm
Jun 29, 2005

It's a Mad, Mad, Mad, MAD-2R World

DrDork posted:

You're missing the point.

No one gives a real gently caress whether CS:GO runs at 450 FPS or 460 FPS at 1080p. If that's the only place you can demonstrate your "superior CPU performance," you've missed the boat.

Exactly.

b0ner of doom
Mar 17, 2006
Can't wait to get a 3080 so I can play indie treasures that run on anything and browse the SA forums with it

AirRaid
Dec 21, 2004

Nose Manual + Super Sonic Spin Attack

DrDork posted:

You're missing the point.

No one gives a real gently caress whether CS:GO runs at 450 FPS or 460 FPS at 1080p. If that's the only place you can demonstrate your "superior CPU performance," you've missed the boat.

But it's not just Intel. That is the standard for CPU benchmarking in games as far as I can tell. It's the way to show the greatest degree of difference between differing CPUs. It's not that those differences only show up at 1080p, but if you're gaming at 4K then your CPU is not going to be the issue at all and all the different CPUs will perform similarly, so what's the point?

It's the same for GPU testing. Benchmarks are moving away from 1080p GPU testing because the numbers are the same. Some games show the same numbers at 1080p from a 3090 and a 1080 Ti.

Also, say you've got a 10% difference in CPU speed. You can show at 1080p the difference between 150 FPS and 165 FPS, or you can show the difference at 4K as maybe between 50 FPS and 55 FPS (numbers pulled literally out of my arse). The larger margin gives more room for error and shows more clearly any differences outside standard deviation.
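
Running AirRaid's made-up numbers through that logic: the relative gap is 10% at both resolutions, but against an assumed run-to-run noise of a few fps, the 1080p gap is several noise-widths wide while the 4K gap barely clears it.

code:
# AirRaid's made-up numbers: the same 10% CPU gap measured at two resolutions.
RUN_NOISE = 3.0  # fps -- assumed run-to-run benchmark variation

for res, (slow, fast) in {"1080p": (150.0, 165.0), "4K": (50.0, 55.0)}.items():
    rel_gap = fast / slow - 1  # 10% in both cases
    abs_gap = fast - slow      # 15 fps at 1080p, 5 fps at 4K
    print(f"{res}: {rel_gap:.0%} gap = {abs_gap:.0f} fps "
          f"= {abs_gap / RUN_NOISE:.1f}x the noise")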

b0ner of doom
Mar 17, 2006

Sagebrush posted:

the greatest shooter ever made, ut99, runs beautifully at 120hz in 3440x1440 on my 1060 so honestly i don't even know why i need a 3080 at this point



https://www.youtube.com/watch?v=DV1fUwKMdAI

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

AirRaid posted:

But it's not just Intel. That is the standard for CPU benchmarking in games as far as I can tell. It's the way to show the greatest degree of difference between differing CPUs. It's not to say that those difference only show up at 1080p, but really if you're gaming at 4K then your CPU is not going to be the issue at all and all the different CPUs will perform similarly so whats the point?

It used to be that the standard for CPU benchmarking was 480p. I'm sure if we brought that back it'd show even greater Intel domination!

Intel has been sticking to 1080p high-fps benchmarks for their promotional material because that's about the only place their chips look noticeably better than AMD's right now, and even then it's only if you overclock them to the moon and compare vs non-OC'ed R7/9's.

I'm not saying that reviewers shouldn't throw some 1080p benchmarks into their CPU reviews, if for no other reason than a lot of people still use 1080p, so it's nice to be able to verify that a new chip doesn't have some unexpected weirdness that would hold you back. What I am saying is that, in terms of corporate marketing and cherry picking benchmark results, Intel showcasing a $550 CPU at 1080p is hilarious because no one sane is spending $550 on a CPU and then playing at 1080p. They're intentionally avoiding the worksets and use-cases that a lot of people spending that much on a CPU are likely to actually use it for specifically because they don't look great there.

The context here was "lol at NVidia for picking benchmarks that show stuff in ways we don't like," when they've actually been picking reasonable test sets this time around compared to a lot of other tech companies.

Warmachine
Jan 30, 2012



For what it's worth, the other half of why I stuck with Intel for this build is that the CPU-bound games I play benefit from high clocks, because they either don't take advantage of multithreading or do it really, really badly. So breaking the 5GHz barrier actually does matter for me.

FPS? Who loving cares as long as it meets or exceeds the refresh rate of your monitor consistently?

Mercrom
Jul 17, 2009
What's the point of even looking at "average fps" in CPU benchmarks? Stuttering and 1% lows are the only things relevant for anything but those idiotic 1080p 360Hz monitors.
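
For anyone who hasn't seen the metric computed: reviewers log per-frame times (tools like CapFrameX or OCAT do this) and derive both numbers from the same log. One common definition takes the fps at the 99th-percentile frame time as the "1% low"; the frame times below are invented to show how an average hides hitching.

code:
# Invented frame times (ms); a real run logs tens of thousands of these.
frame_times_ms = [8.3] * 980 + [16.7] * 15 + [50.0] * 5  # mostly smooth, a few hitches

avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

# One common definition: 1% low = fps at the 99th-percentile frame time.
slowest = sorted(frame_times_ms)[int(len(frame_times_ms) * 0.99)]
one_pct_low_fps = 1000.0 / slowest

print(f"average: {avg_fps:.0f} fps, 1% low: {one_pct_low_fps:.0f} fps")
# -> average ~116 fps, 1% low ~60 fps: the average never sees the hitches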

Fantastic Foreskin
Jan 6, 2013

A golden helix streaked skyward from the Helvault. A thunderous explosion shattered the silver monolith and Avacyn emerged, free from her prison at last.

Nothing I love more than a rock steady 15 fps.

AirRaid
Dec 21, 2004

Nose Manual + Super Sonic Spin Attack
If anyone wants a deeper insight you can read GN's rationale for 1080p CPU testing here https://www.gamersnexus.net/guides/3577-cpu-test-methodology-unveil-for-2020-compile-gaming-more

slidebite
Nov 6, 2005

Good egg
:colbert:

For Canadians that aren't in a rush, Memory Express is now allowing backorders.

Wiggly Wayne DDS
Sep 11, 2010



DrDork posted:

It used to be that the standard for CPU benchmarking was 480p. I'm sure if we brought that back it'd show even greater Intel domination!

Intel has been sticking to 1080p high-fps benchmarks for their promotional material because that's about the only place their chips look noticeably better than AMD's right now, and even then it's only if you overclock them to the moon and compare vs non-OC'ed R7/9's.

I'm not saying that reviewers shouldn't throw some 1080p benchmarks into their CPU reviews, if for no other reason than a lot of people still use 1080p, so it's nice to be able to verify that a new chip doesn't have some unexpected weirdness that would hold you back. What I am saying is that, in terms of corporate marketing and cherry picking benchmark results, Intel showcasing a $550 CPU at 1080p is hilarious because no one sane is spending $550 on a CPU and then playing at 1080p. They're intentionally avoiding the worksets and use-cases that a lot of people spending that much on a CPU are likely to actually use it for specifically because they don't look great there.

The context here was "lol at NVidia for picking benchmarks that show stuff in ways we don't like," when they've actually been picking reasonable test sets this time around compared to a lot of other tech companies.

what worksets are they "intentionally avoiding"? what use-cases? can you substantiate a single one of your insane rants against real-world benchmarks of applications people use? the resolution used to be 480p, yes, and in a decade it'll be 4k - it's about being relative to the real world, not your favoured spec, and once you hit a high enough fps the bottleneck becomes the rest of the components rather than the cpu itself

if you seriously believe this has a single thing to do with intel then you have no idea why they're ahead in game performance metrics currently. please go inform yourself rather than spreading your nonsense everywhere, you're comically misinformed on, well, everything

Freakazoid_
Jul 5, 2013


Buglord
Man, no respect for anyone with a 240hz 1080p monitor just trying to get max fps with modern games, some of which are now CPU bound.

TerminalSaint
Apr 21, 2007


Where must we go...

we who wander this Wasteland in search of our better selves?

slidebite posted:

For Canadians that aren't in a rush, Memory Express is now allowing backorders.

I hope you're really not in a rush; they've been taking them since the 19th.

slidebite
Nov 6, 2005

Good egg
:colbert:

I've literally been going to their website every day or two just out of curiosity, and they've never shown anything for adding to cart - in-store only. Today is the first day I noticed otherwise.
