eames
May 9, 2009

Gamers Nexus has a good video that should cover this topic: 2200G / 2400G vs GT 1030 in esports titles including Dota 2.

https://www.youtube.com/watch?v=cNKxrzY9WDY

Munkeymon
Aug 14, 2003

Motherfucker's got an
armor-piercing crowbar! Rigoddamndicu𝜆ous.



FaustianQ posted:

which can be done on a wraith spire

Can't fit one in the case he picked.

KingEup
Nov 18, 2004
I am a REAL ADDICT
(to threadshitting)


Please ask me for my google inspired wisdom on shit I know nothing about. Actually, you don't even have to ask.

Craptacular! posted:

I don't know enough to know what DP++ is about, but Gigabyte's GA-AB350N-Gaming WIFI is the only ITX motherboard for Ryzen I know of that has a DisplayPort. Some MSI GT 1030s have DisplayPort; depending on prices it may or may not be worth it for you to rethink this plan and consider a Pentium G4560 with a low-profile 1030 instead.

EDIT: 2200G is $99 and that board is the same price so it's $200, while a Pentium is like $50 and a basic ASRock board with wifi is $60. That leaves you with enough money to get the card you want even at these prices (MSI new at $80 on Newegg), and your FPS will be higher. The downside is you'll need to find a case that has a PCI slot.
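Quick sanity check on those two builds, using only the prices quoted above (March 2018, USD):

```python
# Build-cost comparison with the prices quoted in the post above.
apu_build = {"Ryzen 2200G": 99, "GA-AB350N-Gaming WIFI": 99}
gt1030_build = {"Pentium G4560": 50, "ASRock ITX board w/ wifi": 60, "MSI GT 1030": 80}

print(sum(apu_build.values()))     # 198, i.e. "about $200"
print(sum(gt1030_build.values()))  # 190
```

So the discrete-card route comes out slightly cheaper even before counting the FPS advantage, with the case/slot caveat noted above.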

That performance is a lot worse than I thought it would be at 720p, thanks.

Rastor
Jun 2, 2001

KingEup posted:

That performance is a lot worse than I thought it would be at 720p, thanks.

As was already mentioned, the real tiny case / console-like games performance champion will actually be the Intel Hades Canyon NUCs.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

Munkeymon posted:

Can't fit one in the case he picked.

Wraith Spire is 54mm, max height for the E-W150 is 63mm, no?

Munkeymon
Aug 14, 2003

Motherfucker's got an
armor-piercing crowbar! Rigoddamndicu𝜆ous.



FaustianQ posted:

Wraith Spire is 54mm, max height for the E-W150 is 63mm, no?

The thing I found said it's 70mm :shrug:

KingEup
Nov 18, 2004
I am a REAL ADDICT
(to threadshitting)


Please ask me for my google inspired wisdom on shit I know nothing about. Actually, you don't even have to ask.
Can anyone shed some light on why the min FPS at 720p is worse than the min FPS at 1080p?

Craptacular!
Jul 9, 2001

Fuck the DH

KingEup posted:

Can anyone shed some light on why the min FPS at 720p is worse than the min FPS at 1080p?

In the video version of that review, he didn't mention that discrepancy at all. My guess is the drops at 720p are probably CPU bottlenecking, while at 1080p the Very High settings task the GPU enough that the CPU isn't outgunned by it. Techspot/Hardware Unboxed also eventually tried a 2200G with CSGO and Dota, watching a tournament in DotaTV at the highest settings; he wasn't impressed but thought it looked playable, and if frames were a serious concern you could crank the details down a little.

Craptacular! fucked around with this message at 03:48 on Mar 29, 2018

KingEup
Nov 18, 2004
I am a REAL ADDICT
(to threadshitting)


Please ask me for my google inspired wisdom on shit I know nothing about. Actually, you don't even have to ask.

Craptacular! posted:

My guess is the drops at 720p are probably CPU bottlenecking, while at 1080p the Very High settings task the GPU enough that the CPU isn't outgunned by it.

This is what I originally suspected but was advised that CPU bottlenecking was highly unlikely: https://forums.somethingawful.com/newreply.php?action=newreply&postid=482613870#post482585328

IMO there is serious room for error in reviews because no one ever replicates their benchmarks.

Mister Facetious
Apr 21, 2007

I think I died and woke up in L.A.,
I don't know how I wound up in this place...

:canada:
I prefer the Gamers Nexus APU benchmark results, as they go out of their way to specify RAM speed and CAS latencies used (which matters quite a bit).

Mister Facetious fucked around with this message at 07:44 on Mar 29, 2018

Craptacular!
Jul 9, 2001

Fuck the DH

KingEup posted:

This is what I originally suspected but was advised that CPU bottlenecking was highly unlikely

I think when it comes to widely popular but totally non-linear games like Dota, you can just look at YouTube videos with the frame counter running and see if you like what you see. This video uses -wtf to spawn Axe a bunch of times and have Tinker spam March of the Machines at a rate that would be illegal in real play, creating the most visually complex situation possible, and you can see the effect the quality settings have on the picture (which is also at 1080p). Another video watched a tournament in Vulkan, which tends to help AMD in games such as Doom. AMD graphics can generally be described as okay at DX9, weak in DX11, and well prepared for DX12/Vulkan. But most big-name gaming is stubbornly hanging onto DX11, the API they're furthest behind in.

Dota has also remained one of the most graphically intensive top-down MOBAs. Valve has done things like moving to Source 2 to make it easier for the game to meet the modest specs of cybercafe PCs in far-flung parts of the world (pre-Reborn Dota was much worse on low-end hardware), but we're not quiiiiiiite at the point where the very highest detail settings can be played at over 60FPS in all situations on integrated fanless hardware. Three years ago, though, the game would barely even launch on such a spec.

A 2400G, seemingly at stock settings, gets super close, though. Maybe a slight OC is needed to make it shine, or maybe launching in Vulkan will give it the extra oomph.

Craptacular! fucked around with this message at 08:49 on Mar 29, 2018

Arzachel
May 12, 2012

KingEup posted:

Can anyone shed some light on why the min FPS at 720p is worse than the min FPS at 1080p?

I can't find what settings they used or how they performed the benchmark (some other article mentions them using the tutorial) so I'm going to guess it's just variance in their bench method and nobody sanity checked the results.
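For what it's worth, "min"/"1% low" figures are easy to recompute yourself whenever a reviewer publishes raw frame times. A toy sketch (the frame-time numbers are invented to illustrate the effect, not taken from any review) showing how a run that is faster on average can still post worse lows:

```python
# Recompute "average FPS" and "1% low FPS" from raw frame times.
def fps_stats(frametimes_ms):
    """Return (average FPS, 1% low FPS) for a list of frame times in ms."""
    n = len(frametimes_ms)
    avg_fps = 1000.0 * n / sum(frametimes_ms)
    worst = sorted(frametimes_ms, reverse=True)[:max(1, n // 100)]  # slowest 1%
    one_pct_low = 1000.0 * len(worst) / sum(worst)
    return avg_fps, one_pct_low

run_720p = [8.0] * 990 + [30.0] * 10    # fast on average, a few big spikes
run_1080p = [12.0] * 990 + [16.0] * 10  # slower on average, more consistent

print(fps_stats(run_720p))   # higher average FPS, but worse 1% low
print(fps_stats(run_1080p))  # lower average FPS, better 1% low
```

A handful of spike frames barely moves the average but dominates the 1% low, which is why an unlucky benchmark pass can invert the 720p/1080p minimums without anyone noticing.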

ufarn
May 30, 2009
How deterministic are CS benchmarks? Is it a scripted fly-by?

eames
May 9, 2009

If AMD APUs work like Intel iGPUs (CPU/GPU on one die and dynamically sharing a limited TDP) then I can see some edge cases where 720p could cause lower minimums than 1080p.

Example: The higher average frame rates of 720p require more CPU performance, possibly to the point where rendering becomes CPU bottlenecked, so a higher percentage of the shared thermal envelope is assigned to the CPU.
Suddenly a very GPU-intensive event happens (e.g. lots of volumetric smoke) and the GPU stutters briefly until the power management detects the imbalance and shifts power/TDP from CPU to GPU.
At 1080p the GPU would always have more of the power envelope assigned, so the effect of such an event wouldn't be as severe (= higher mins).

I've observed oddities like this with Crystalwell all the time, certainly noticeable enough to affect 1% min FPS.
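That shared-budget lag is easy to caricature in code. A toy model (the wattages and regrant delay are invented for illustration; real APU firmware does not work like this) where the power manager hands TDP back to the GPU a few frames late:

```python
# Toy model of a shared CPU+GPU power budget: the power manager regrants
# the GPU's share of the TDP a few "frames" after CPU demand changes.
TDP = 65.0

def throttled_frames(cpu_demand, gpu_demand, shift_delay=3):
    """Count frames where the GPU wants more power than it has been granted."""
    gpu_budget = TDP - cpu_demand[0]
    pending = []            # queued (apply_at_frame, new_budget) updates
    stutters = 0
    for t, (c, g) in enumerate(zip(cpu_demand, gpu_demand)):
        pending.append((t + shift_delay, TDP - c))
        while pending and pending[0][0] <= t:
            gpu_budget = pending.pop(0)[1]  # late regrant takes effect
        if g > gpu_budget:
            stutters += 1   # GPU throttles: a frame-time spike
    return stutters

# 720p-ish: CPU hogs the budget, then a GPU-heavy event (smoke) hits
# before the regrant catches up.
cpu_720, gpu_720 = [45] * 10 + [30] * 10, [15] * 10 + [32] * 10
# 1080p-ish: CPU demand stays lower, so the GPU always has headroom.
cpu_1080, gpu_1080 = [30] * 20, [15] * 10 + [32] * 10

print(throttled_frames(cpu_720, gpu_720))    # 3 frames stutter at the handoff
print(throttled_frames(cpu_1080, gpu_1080))  # 0
```

The stutter count in the 720p case equals the regrant delay, which is the edge case described above: the same GPU event hurts only when the CPU was holding most of the envelope.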

Arzachel
May 12, 2012

eames posted:

If AMD APUs work like Intel iGPUs (CPU/GPU on one die and dynamically sharing a limited TDP) then I can see some edge cases where 720p could cause lower minimums than 1080p.

Example: The higher average frame rates of 720p require more CPU performance, possibly to the point where rendering becomes CPU bottlenecked, so a higher percentage of the shared thermal envelope is assigned to the CPU.
Suddenly a very GPU-intensive event happens (e.g. lots of volumetric smoke) and the GPU stutters briefly until the power management detects the imbalance and shifts power/TDP from CPU to GPU.
At 1080p the GPU would always have more of the power envelope assigned, so the effect of such an event wouldn't be as severe (= higher mins).

I've observed oddities like this with Crystalwell all the time, certainly noticeable enough to affect 1% min FPS.

I could buy this but the Pentium + 1030 also has lower minimums on 720p for some reason.

Kazinsal
Dec 13, 2011


The actual answer is "Source is a pile of crap engine that still has code from Quake 1 in core places".

Khorne
May 1, 2002

Kazinsal posted:

The actual answer is "Source is a pile of crap engine that still has code from Quake 1 in core places".

Why do you have to blame it on Q1 code? Q1 movement code, for example, is far better than Source's.

SwissArmyDruid
Feb 14, 2014

by sebmojo
Source is so Ship of Theseus'd at this point, it's not even fair to compare the two.

And that's not even accounting for the fact that the current Source itself is even more distantly related to the older, Quake-derived GoldSrc.

(and hell, Titanfall 1, maybe Titanfall 2, was built on a heavily modified version of Source, I wouldn't even hold Q1 movement against it at this point either.)

SwissArmyDruid fucked around with this message at 20:53 on Mar 29, 2018

Kazinsal
Dec 13, 2011


Khorne posted:

Why do you have to blame it on Q1 code? Q1 movement code, for example, is far better than source's.

The Q1 code is not necessarily the problem. The problem is that it's an engine that has parts dating back to 1995 and has been incrementally updated and rebranded over the course of 23 years.

The whole thing was designed and built long before modern x86 SoCs and n-core hyperthreaded x86-on-RISC CPUs, and four thousand core vector processors as graphics cards. Incremental work was built in a dozen different eras of CPU and GPU philosophy. It is entirely unsurprising that it runs incredibly weirdly and not well at all.

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
Old code isn't necessarily bad code.

wargames
Mar 16, 2008

official yospos cat censor

Combat Pretzel posted:

Old code isn't necessarily bad code.

But it doesn't reinvent the wheel, so we need to do that.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

One of the strongest predictors of defects in code is how recently it was modified. New code sucks.
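That recency observation is roughly what churn-based defect predictors operationalize: rank files by how often and how recently they were touched. A minimal sketch (in-memory commit log with made-up file names, not a real repository miner):

```python
# Rank files by recency-weighted change count, following the heuristic
# that recently modified code is the most defect-prone.
from collections import Counter

def churn_ranking(commits):
    """commits: list of (commit_id, [changed_files]), newest first."""
    scores = Counter()
    for age, (_, files) in enumerate(commits):
        weight = 1.0 / (1 + age)   # newest commit counts the most
        for f in files:
            scores[f] += weight
    return [f for f, _ in scores.most_common()]

log = [
    ("c3", ["net/socket.c", "net/tcp.c"]),
    ("c2", ["net/socket.c"]),
    ("c1", ["fs/inode.c"]),
]
print(churn_ranking(log)[0])  # net/socket.c: touched most often and most recently
```

In a real setting you'd feed this from `git log --name-only` and weight by bug-fix commits, but even the crude version tends to put the hot, recently rewritten files at the top of the risk list.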

isndl
May 2, 2012
I WON A CONTEST IN TG AND ALL I GOT WAS THIS CUSTOM TITLE

Subjunctive posted:

One of the strongest predictors of defects in code is how recently it was modified. New code sucks.

But the new code is to fix the old code! :v:

Old code is great until you need to do something it wasn't meant to do, then it sucks again.

Mr Shiny Pants
Nov 12, 2012

Subjunctive posted:

One of the strongest predictors of defects in code is how recently it was modified. New code sucks.

New code made my ThreadRipper work with KVM and GPU passthrough, some new code is awesome. :)

Khorne
May 1, 2002
Someone streamed a lotta 2700x stuff today. Including overclocking on a high end x370 board and benchmarks. The end result is: still buy an 8700k at the high end because the 2700x loses single core to ivy bridge just like previous gen. I suppose I can still wait for x470 overclocking, but the x470 improvements are mostly for people too lazy to overclock. On the lower end, whether you should get a 1600x or 2600 variant depends on the price you're getting it at, and they are probably better buys than the lower end intel.

bobfather
Sep 20, 2001

I will analyze your nervous system for beer money
Or, don’t support the Intel hegemony and buy the processor from the company that made Intel poo poo its pants and play its hand 18 months earlier than they wanted to play it.

Khorne
May 1, 2002

bobfather posted:

Or, don’t support the Intel hegemony and buy the processor from the company that made Intel poo poo its pants and play its hand 18 months earlier than they wanted to play it.

"Wow I dropped $1200 on this system and my fps is... identical" - me buying an AMD system in 2018 to replace my 2012 Intel system.

Khorne fucked around with this message at 02:50 on Apr 2, 2018

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️
Or, you can buy Intel now and encourage AMD to make a better CPU later

SamDabbers
May 26, 2003



Khorne posted:

I'm not going to scam myself by buying a side grade to my 6 year old processor.

It may be a side grade in single threaded performance, but double the cores. Multitasking is buttery smooth on Ryzen.

bobfather
Sep 20, 2001

I will analyze your nervous system for beer money

Khorne posted:

"Wow I dropped $1200 on this system and my fps is... identical" - me buying an AMD system in 2018 to replace my 2012 intel system.

$1200 on a system, of which $600 is the video card and $300 is the RAM. Processor is a $150 part that performs as well as the top of the line Intel processor from 2016, which performs identically to their top of the line processor from 2012. Right?

Khorne
May 1, 2002

SamDabbers posted:

It may be a side grade in single threaded performance, but double the cores. Multitasking is buttery smooth on Ryzen.
I tried to justify it to myself that way, because I genuinely want a ryzen. It just doesn't hold up. The only real justification is if AMD hits their 2020 launch goals, sticks with AM4, and an x370/x470 board works as well as the new one, and I can swap out the processor for a 5GHz+ ryzen on a 10nm/7nm process. There are a lot of ifs there vs buying an 8700k now, hitting 5GHz without issue, and getting a solid upgrade while planning on buying a new system in 2020-2024.

bobfather posted:

$1200 on a system, of which $600 is the video card and $300 is the RAM. Processor is a $150 part that performs as well as the top of the line Intel processor from 2016, which performs identically to their top of the line processor from 2012. Right?
Not buying a video card because I have a decent 1070 I picked up for under MSRP, and the RAM will cost more than $300, which also complicates the Ryzen purchase because I want 16GB sticks.

Khorne fucked around with this message at 02:59 on Apr 2, 2018

SamDabbers
May 26, 2003



I guess if your main use case is gaming and your current machine is CPU bottlenecking the games you want to play, then Ryzen might not be the best option for you.

Khorne
May 1, 2002

SamDabbers posted:

I guess if your main use case is gaming and your current machine is CPU bottlenecking the games you want to play, then Ryzen might not be the best option for you.

The only game I played where I had < 165 fps was H1Z1, and I had like 75 fps unless I OC'd my Ivy Bridge to 4.7-4.8GHz; then it was like 94-100 fps, which is okay I guess but a little low. Ryzen beats my processor IPC-wise by some small margin, but IPC isn't relevant if your processor lacks the 'c' part.

Most of the games I play are indie, old as hell, or have sufficiently low settings to reach an adequate competitive fps. Minecraft isn't even a good justification, because in 1.12.2 I get 140+ fps even in real big modded bases, versus having 40-60 fps in similar bases in 1.7.10 and prior.

The only other annoying thing to upgrading is picking between win7+wufuc even though support drops in 2020, using linux on the desktop and being unable to play certain games, or using win10 and being forced to play games in fullscreen (+other issues, due to how it handles >60Hz screens and doing stuff on secondary monitors). I don't like any of those options. I mean I like linux, but I don't like not being able to play whatever I feel like and I already have linux on my work laptop. I wish MS weren't lazy assholes who have left win8/win10 busted for gaming. The MS store is just another reason to boycott win10 for me, too.

Khorne fucked around with this message at 03:09 on Apr 2, 2018

Risky Bisquick
Jan 18, 2008

PLEASE LET ME WRITE YOUR VICTIM IMPACT STATEMENT SO I CAN FURTHER DEMONSTRATE THE CALAMITY THAT IS OUR JUSTICE SYSTEM.



Buglord
My 3.9ghz 1700 games pretty good, but if the 2700x can hit 4.6 well I better buy that too.

Khorne
May 1, 2002

Risky Bisquick posted:

My 3.9ghz 1700 games pretty good, but if the 2700x can hit 4.6 well I better buy that too.
It looks like it hits ~4.3.

SamDabbers
May 26, 2003



Yeah I have no complaints with my 4.0GHz 1700, but my monitors are 1080p/60 and I do more with it than just gaming. If only that sweet Samsung B-die wasn't so outrageously expensive...

Khorne posted:

The only other annoying thing to upgrading is picking between win7+wufuc even though support drops in 2020, using linux on the desktop and being unable to play certain games, or using win10 and being forced to play games in fullscreen (+other issues, due to how it handles >60Hz screens and doing stuff on secondary monitors). I don't like any of those options. I mean I like linux, but I don't like not being able to play whatever I feel like and I already have linux on my work laptop. I wish MS weren't lazy assholes who have left win8/win10 busted for gaming. The MS store is just another reason to boycott win10 for me, too.

My dude, Ryzen is excellent for virtualization including PCIe passthrough. Run the best OS for each game, at essentially native speeds!
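For anyone curious, the usual prep on a Linux host is reserving the guest GPU for vfio-pci; a sketch with example device IDs for a hypothetical GTX 1070 (10de:1b81 video, 10de:10f0 audio), so substitute whatever `lspci -nn` reports for your own card:

```shell
# Sketch of GPU passthrough prep on a Ryzen/Linux host. PCI IDs are examples.

# 1. Turn on the IOMMU via the kernel command line (in your GRUB config):
#      amd_iommu=on iommu=pt

# 2. Reserve the GPU's video and audio functions for vfio-pci at boot:
echo "options vfio-pci ids=10de:1b81,10de:10f0" | sudo tee /etc/modprobe.d/vfio.conf

# 3. After regenerating the initramfs and rebooting, confirm the binding:
lspci -nnk -d 10de:1b81 | grep "Kernel driver"
```

From there you hand the devices to libvirt/QEMU as hostdevs; the exact initramfs and module-load steps vary by distro.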

SamDabbers fucked around with this message at 03:38 on Apr 2, 2018

Khorne
May 1, 2002

SamDabbers posted:

Yeah I have no complaints with my 4.0GHz 1700, but my monitors are 1080p/60 and I do more with it than just gaming. If only that sweet Samsung B-die wasn't so outrageously expensive...
Well to be clear, if I had a 1700 or 1700x I would have no complaints either. It performs almost identically to my current processor or better depending on the use case. If I had something worse than my current processor I'd be far more inclined to buy one. The only reason I kind of want to buy something now is I can bite the bullet on an OS change, and I suppose if I were to apply spectre/meltdown patches the 2700x would perform better than my current system under quite a few actual use cases. The other issue is if I buy now I can sit on what I buy until 2020-2022 or whenever some big upgrades drop. I don't think my ivy bridge system is going to last beyond the next 2 years, and I don't think the next generation or two is going to be a big gap over the current one.

If I wait until 2020 to upgrade it will be like the i5 before my ivy bridge all over again. I'm going to buy something in 2020 and want to replace it in 2021/2022 if 7nm hits then.

I also play 1080p but at 144Hz. I've been debating picking up a nice 1440p monitor, but that's a somewhat difficult sell as well due to the price premium for 165Hz there. I don't even want to get started on thinking about IPS vs TN, because it's "the best monitor when you don't have to aim in fps/tps games vs the best monitor for fps that's not as good everywhere else". :(

SamDabbers posted:

My dude, Ryzen is excellent for virtualization including PCIe passthrough. Run the best OS for each game, at essentially native speeds!
Yeah. Instead of windows+linux vms on desktop I'll likely switch to linux+windows vms if it's viable when I do. If it isn't, there's always dual booting. But I really hate restarting. To the point where I usually only update windows at specific times, and it having to restart for every drat update is the bane of my existence.

I really would rather buy AMD. Maybe not a video card because I have to do CUDA stuff for contract work sometimes.

Khorne fucked around with this message at 04:16 on Apr 2, 2018

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!
If I were buying a new CPU I would seriously consider Ryzen, Intel is still probably "better" for my use case but only marginally so and I'd rather my money go to AMD than Intel. If only AMD made good video cards too. :sigh:

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Khorne posted:

I also play 1080p but at 144Hz. I've been debating picking up a nice 1440p monitor, but that's a somewhat difficult sell as well due to the price premium for 165Hz there. I don't even want to get started on thinking about IPS vs TN, because it's "the best monitor when you don't have to aim in fps/tps games vs the best monitor for fps that's not as good everywhere else". :(

This one is real easy to answer: TN like the XG270HU or S2716DG gives you about a 1-2 ms average pixel-response-time advantage over a XB270HU IPS or PG279Q; no human being is going to notice the remotest difference there. 27" is becoming too large for a flat TN panel, and you will want to be looking at it straight-on to avoid color shift. It's not the end of the world, but I did find it noticeable on a dual-screen setup, and IPS solves this problem with no real downside apart from being another $100 more expensive.

(more or less, TN panels are lying about their average response times while IPS are generally not... and VA are not lying about average, but have a huge problem with maximum response times, like that Eizo with an 8ms average/44ms maximum RTC... that's normal for VA panels.)

Paul MaudDib fucked around with this message at 06:24 on Apr 2, 2018

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️

Paul MaudDib posted:

This one is real easy to answer: TN like the XG270HU or S2716DG gives you about a 1-2 ms average pixel-response-time advantage over a XB270HU IPS or PG279Q, no human being is going to notice the remotest difference there. 27" is becoming too large for a flat TN panel and you will want to be looking at it straight-on to avoid color shift. It's not the end of the world, but I did find it noticeable on a dual-screen setup, and IPS solves this problem with no real downside apart from being another $100 more expensive.

(more or less, TN panels are lying about their average response times while IPS are generally not... and VA are not lying about average, but have a huge problem with maximum response times)

That S2716DG also has a ton more RTC overshoot error than the XB270HU IPS, making any response time advantage a hollow victory at best. But we have to keep that TN audiophile gaming myth alive because.
