  • Locked thread
Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

gwrtheyrn posted:

If my main use case is 11-boxing eve online, actively running another game, and watching some sort of streaming video (twitch, amazon, youtube), would ryzen be a better purchase over an intel hex-core with the information out so far?

For reference, I am currently running a 4690k more or less at stock, and I have to stop doing at least one of the things above to approach reasonable performance. Each eve client takes between 4 and 15% CPU in task manager depending on whether they're backgrounded (limited to 10fps), my second game usually takes 50%, and streaming video can take a lot, especially from amazon. This basically pegs me at 100% load at all times. I'm not in the biggest rush to buy, but I've been wanting to upgrade for ages because the 4690k felt like a downgrade from the 980x I had before it, so Skylake-X might be a little too far out to wait for. I could overclock a bit, but I doubt that will get me where I want to be

You are the usage case that 8+ core CPUs are for, congratulations.


HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

gwrtheyrn posted:

If my main use case is 11-boxing eve online, actively running another game, and watching some sort of streaming video (twitch, amazon, youtube), would ryzen be a better purchase over an intel hex-core with the information out so far?

For reference, I am currently running a 4690k more or less at stock, and I have to stop doing at least one of the things above to approach reasonable performance. Each eve client takes between 4 and 15% CPU in task manager depending on whether they're backgrounded (limited to 10fps), my second game usually takes 50%, and streaming video can take a lot, especially from amazon. This basically pegs me at 100% load at all times. I'm not in the biggest rush to buy, but I've been wanting to upgrade for ages because the 4690k felt like a downgrade from the 980x I had before it, so Skylake-X might be a little too far out to wait for. I could overclock a bit, but I doubt that will get me where I want to be

8-core Ryzen is definitely the CPU for you

Kazinsal
Dec 13, 2011


gwrtheyrn posted:

If my main use case is 11-boxing eve online, actively running another game, and watching some sort of streaming video (twitch, amazon, youtube), would ryzen be a better purchase over an intel hex-core with the information out so far?

For reference, I am currently running a 4690k more or less at stock, and I have to stop doing at least one of the things above to approach reasonable performance. Each eve client takes between 4 and 15% CPU in task manager depending on whether they're backgrounded (limited to 10fps), my second game usually takes 50%, and streaming video can take a lot, especially from amazon. This basically pegs me at 100% load at all times. I'm not in the biggest rush to buy, but I've been wanting to upgrade for ages because the 4690k felt like a downgrade from the 980x I had before it, so Skylake-X might be a little too far out to wait for. I could overclock a bit, but I doubt that will get me where I want to be

...You have 11 EVE accounts?

God drat.

You can definitely afford to go whole hog with Ryzen then. Post a trip report.

gwrtheyrn
Oct 21, 2010

AYYYE DEEEEE DUBBALYOO DA-NYAAAAAH!

Kazinsal posted:

...You have 11 EVE accounts?

God drat.

You can definitely afford to go whole hog with Ryzen then. Post a trip report.

Active ones. I haven't paid real money for eve in years

WhyteRyce
Dec 30, 2001

All this talk of streaming games makes me feel like an old man who just doesn't understand these youths

Klyith
Aug 3, 2007

GBS Pledge Week

Palladium posted:

Well, even AMD themselves are making their own SKUs above $320 look plenty bad already. Pay more for factory OC and XFR = lol

top of range parts are traditionally way out in the land of diminishing returns, ryzen 1800x is no different.

when you look at a situation where that isn't the case -- like the core i5s having hyperthreading disabled -- what you're seeing is a monopoly distortion, not "good value".

teagone
Jun 10, 2003

That was pretty intense, huh?

WhyteRyce posted:

All this talk of streaming games makes me feel like an old man who just doesn't understand these youths

Why do you watch the Kings play basketball? Because it's entertaining and they're better at basketball than you. Apply that logic to a streamer playing a game you might also enjoy and is probably better at it than you.

Sinestro
Oct 31, 2010

The perfect day needs the perfect set of wheels.

teagone posted:

better at basketball than you

Well, on a good day at least.

eames
May 9, 2009

AMD posted:

As a general guideline: a CPU voltage of up to 1.35 V is acceptable for driving everyday overclocks of the AMD Ryzen processor. Core voltages up to 1.45 V are also sustainable, but our models suggest that processor longevity may be affected. Regardless of your voltage, make sure you’re using capable cooling to keep temperatures as low as possible.

Most R7 CPUs seem to go up to 3.8 GHz at that voltage, which matches the "Critical 2" point in the voltage vs frequency chart linked earlier.
This also explains why an 1800X with XFR only boosts to 3.7 GHz with load on all 8 cores. It only goes up to 4.1 GHz nominal turbo with load on up to two cores. :sigh:

Somebody linked an overclocking guide. Open at your own risk; it could have a virus embedded for all I know.

http://www.mediafire.com/file/3knlj278nr6jdx9/C6H+XOC+Guide+v04.pdf
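AMD's quoted voltage guideline reduces to a simple threshold check. A toy sketch in Python, not official tooling; the cutoffs are taken straight from the statement quoted above:

```python
def vcore_rating(volts: float) -> str:
    """Classify a Ryzen core voltage per AMD's quoted guidance.

    Thresholds from the statement above: up to 1.35 V for everyday
    overclocks, up to 1.45 V sustainable but with possible longevity
    impact, anything higher is outside the guidance.
    """
    if volts <= 1.35:
        return "everyday overclock range"
    if volts <= 1.45:
        return "sustainable, longevity may be affected"
    return "outside AMD's guidance"

print(vcore_rating(1.30))  # everyday overclock range
print(vcore_rating(1.40))  # sustainable, longevity may be affected
```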

gwrtheyrn
Oct 21, 2010

AYYYE DEEEEE DUBBALYOO DA-NYAAAAAH!
Apparently x370 boards are basically nonexistent at this point :v:

I'm probably looking to do something like

CPU: 1700x
GPU: GTX1080 (already have)
Motherboard: asrock taichi or killer or something, everything is preorder, OoS, or backordered
RAM: 4x16gb g.skill 3000 cas 15
Cooler: NH-U12s
Case: Fractal design r5
PSU: evga supernova 750w
SSD: 960 evo 500gb

Not sure about going for a full 64gb of ram, but I do probably want more than the 32 I have right now. Total cost is ~$1500, excluding the gpu which I already have

WhyteRyce
Dec 30, 2001

teagone posted:

Why do you watch the Kings play basketball? Because it's entertaining and they're better at basketball than you. Apply that logic to a streamer playing a game you might also enjoy and is probably better at it than you.

If we are going with the Kings basketball analogy it would be like watching someone poorly play CoD and manage to staple their balls to their leg while doing it

teagone
Jun 10, 2003

That was pretty intense, huh?

Sinestro posted:

Well, on a good day at least.


WhyteRyce posted:

If we are going with the Kings basketball analogy it would be like watching someone poorly play CoD and manage to staple their balls to the their leg while doing it

I laughed, well played. Haha.

Anime Schoolgirl
Nov 28, 2002

eames posted:

Nice launch. :bravo:
:pwn:

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️

eames posted:

Most R7 CPUs seem to go up to 3.8 GHz at that voltage, which matches the "Critical 2" point in the voltage vs frequency chart linked earlier.
This also explains why an 1800X with XFR only boosts to 3.7 GHz with load on all 8 cores. It only goes up to 4.1 GHz nominal turbo with load on up to two cores. :sigh:

Somebody linked an overclocking guide. Open at your own risk; it could have a virus embedded for all I know.

http://www.mediafire.com/file/3knlj278nr6jdx9/C6H+XOC+Guide+v04.pdf

"If ratio is set above default (on 1800X = 36.25x), the CPU will enter “OC Mode” and disable CPU/XFR and any power saving or limitations."
"- CPU temperature will read ~60 °C in BIOS due to no power savings enabled."
"CPU Core Voltage reading from the SIO (BIOS/CPU-Z) fluctuates, use DMM for accurate readings."

My god, "rough around the edges" is an understatement for AM4, or at least this Asus Crosshair VI mobo. How is this excusable when Intel has had this right for almost 10 years?

Palladium fucked around with this message at 02:29 on Mar 4, 2017

Anime Schoolgirl
Nov 28, 2002

Alereon posted:

So to clarify, you're trying to use software to compress the video for the game you're playing in real-time? Yeah that'll be hard on a quad-core CPU but it seems like there's better solutions, like hardware encoding.

Hardware encoding shits itself pretty reliably at 60fps unless you're using AMD cards, hilariously.

Also, QuickSync duplicates your display when you encode with it, which is fine if you're playing at 1080p60, not so much when you're playing anything with a higher refresh rate or resolution

Toalpaz
Mar 20, 2012

Peace through overwhelming determination

Palladium posted:

"If ratio is set above default (on 1800X = 36.25x), the CPU will enter “OC Mode” and disable CPU/XFR and any power saving or limitations."
"- CPU temperature will read ~60 °C in BIOS due to no power savings enabled."
"CPU Core Voltage reading from the SIO (BIOS/CPU-Z) fluctuates, use DMM for accurate readings."

My god, rough around the edges is an understatement for AM4 or at least this Asus Crosshair VI mobo. How is this excusable when Intel got this right for almost 10 years.

I think for weeks before the release, people in the thread were talking about the mobo manufacturers dragging their feet on this release. In video game terms, the release BIOS is at best an alpha. Presumably because they didn't know Ryzen would be a serious (solid?) performer and weren't planning on releasing so soon. Also maybe because AMD didn't whip them into shape fast enough. I don't know how this works.

Who knows, maybe 9 months ago the r5/r3's release date was The release date.

wargames
Mar 16, 2008

official yospos cat censor

Palladium posted:

How is this excusable?

AMD has been broke as poo poo for a while and didn't have the resources. But if you look at the 480 at launch and where it is today, you will see improvements, because guess what, AMD doesn't just release a product and move on. The 480's DirectX 11 performance has improved a bunch since release and now equals a 1060. These are first-generation teething issues because, guess what, AMD hasn't really released a new chip in 5 years; they forgot how to do it.

Look at the first-gen i7s, the 860 and 920, vs the 2500K: the first-gen i7s were good, but the refreshes were better, and you can see Zen doing the same. Also SMT seems to be just stupid broken, so benchmarks seem off?

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
It's a hilariously rushed launch; errata, BIOS and coding problems abound. But if I understand correctly, it has forced board manufacturers to start pushing product, which is what AMD wanted. I mean, if the 1 million CPUs already sold figure is correct, an accompanying number of motherboards usually follows suit. AMD could be willing to eat this if it means they get volume up, and a staggered release (pairing R5 CPUs with the RX 500 series, and R3 CPUs in tandem with Raven Ridge) might mean the much more affordable and better-value-for-gamers R5 and R3 will have rosier reviews.

wargames
Mar 16, 2008

official yospos cat censor

FaustianQ posted:

I mean if the 1 million CPUs already sold are correct.

Quarterly report should be interesting if true.

FuturePastNow
May 19, 2014


This is no more chaotic than any previous AMD processor launch. If you're looking for perfection, you probably have some rose-tinted memories of Athlon 64. At least these problems are nothing compared to the hot garbage of Phenom I and Bulldozer.

FuturePastNow fucked around with this message at 03:52 on Mar 4, 2017

Munkeymon
Aug 14, 2003

Motherfucker's got an
armor-piercing crowbar! Rigoddamndicu𝜆ous.



VostokProgram posted:

Video encoding seems like it would be exactly the kind of workload that would saturate a core's execution units. Not a lot of dependencies and branches, just lots and lots of math.

I'm sure it's pretty cache efficient, yeah, but unless the AVX instructions block the whole core (would not be too surprised if they did) I don't see why it'd be a problem.

Boiled Water posted:

Very likely, but it looks like a decent amount of space is wasted compared to the good ol' LGA mounts.

If you're counting the heat sink mount as wasted space, I'm going to count the label as the real wasted space :)

Assepoester
Jul 18, 2004
Probation
Can't post for 10 years!
Melman v2

MaxxBot posted:

[H] did some VR stuff in their review, it looks stronger in VR than in the general gaming benchmarks.

http://www.hardocp.com/article/2017/03/02/amd_ryzen_1700x_cpu_review

It's losing in every single VR bench though; it says lower numbers are better: http://www.hardocp.com/article/2017/03/02/amd_ryzen_1700x_cpu_review/5



Truga posted:

GPU encoding produces poo poo quality unless you're willing to stream at higher bitrates and then everything goes to poo poo anyway, because youtube will recompress your stream to poo poo so people can actually watch it.

GPU encoding is great for dumping videos to disk, since there's basically no performance hit, and you can just give it a 30Mb/s bitrate. Internet streaming not so much, and at 5Mbit/s, which I stream at (max for twitch is 3.5, even, which is why I moved to youtube), x264 produces a vastly better stream that youtubes digest far better.

This is not borne out by the OBS tests: https://obsproject.com/forum/threads/comparison-of-x264-nvenc-quicksync-vce.57358/

Dante80
Mar 23, 2015

Some more news...

AMD SMT cores are mapped differently than Intel:
- Some websites claim that Intel's logical core mapping is: thread 1 of every CPU as 1, 2, 3, ..., 8 and thread 2 of every CPU as 9, 10, 11, ..., 16.
- AMD Ryzen logical cores are apparently mapped sequentially (one core at a time): CPU1 = 1,2, CPU2 = 3,4, ..., CPU8 = 15,16.
- This causes problems in game engines that core-lock their 6-8 worker threads (assuming a console port). A game engine like this would only use 3 or 4 cores (both SMT threads on each) on an 8-core Ryzen. This would explain the huge gains seen by some reviewers when disabling hyperthreading.
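The mapping difference described above can be illustrated with a small sketch. This is a hypothetical model of the two numbering conventions (8 physical cores, 2 threads each), not real topology detection:

```python
# Two conventions for numbering 16 logical CPUs on an 8-core SMT chip.
# "grouped": logical 0-7 are the first thread of each core, 8-15 the second.
# "sequential": each core's two threads get adjacent numbers (Ryzen-style pairs).

def physical_core(logical: int, convention: str, cores: int = 8) -> int:
    if convention == "grouped":      # the claimed Intel-style mapping
        return logical % cores
    if convention == "sequential":   # the Ryzen-style mapping
        return logical // 2
    raise ValueError(convention)

# A console port that pins its 6 worker threads to logical CPUs 0-5:
workers = range(6)
print({physical_core(t, "grouped") for t in workers})     # {0, 1, 2, 3, 4, 5}
print({physical_core(t, "sequential") for t in workers})  # {0, 1, 2}
```

Under the sequential convention the six workers land on only three physical cores, each running both SMT threads, which matches the performance loss described above.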

----------------

G.SKILL Announces Flare X Series and FORTIS Series DDR4 Memory for AMD Ryzen
https://www.gskill.com/en/press/view/g-skill-announces-flare-x-series-and-fortis-series-ddr4-memory-for-amd-ryzen



---------------

https://twitter.com/Dresdenboy/status/837219996166205442
(Context. Apparently, the reason AIDA64 is not reporting L2/L3 stats correctly is because they were not given a Ryzen sample before launch. Changes coming now.)

Platystemon
Feb 13, 2012

BREADS

Dante80 posted:

AMD SMT cores are mapped differently than Intel:
- Some websites claim that Intel's logical core mapping is: thread 1 of every CPU as 1, 2, 3, ..., 8 and thread 2 of every CPU as 9, 10, 11, ..., 16.
- AMD Ryzen logical cores are apparently mapped sequentially (one core at a time): CPU1 = 1,2, CPU2 = 3,4... CPU8 = 15,16.

Why would they do this? :psyduck:

PC LOAD LETTER
May 23, 2005
WTF?!

Twerk from Home posted:

Where is AVX2 expected to matter and do any of these reviews look at AVX2 256-bit wide commands?

HPC stuff. Doesn't really matter for games or desktop stuff. Maybe in several years that will change, but not a whole lot looks to benefit from it on the desktop. Maaaybe physics engines in games? That could just as easily get tossed at the GPU though.

Dante80
Mar 23, 2015

Platystemon posted:

Why would they do this? :psyduck:

How do you mean? The scheduling is the same that the consoles have, so it was pretty easy for AMD to do the same (it also makes logical sense due to the way the Uarch is arranged on both dies).

Microsoft and game engines on PC have different scheduling, and one of the jobs needed to port a game to PC from the newest consoles is to change said scheduling from what Jaguar does to what Intel does (AMD never had SMT before, so there was no need to do otherwise in the past).

Thus, without proper patches to Windows and the games themselves, we see Ryzen performance getting worse when SMT is enabled. Which also speaks a lot about how loving clown-ish and rushed the launch was...AMD stock has lost more than $1.5bn in two days due to their PR department lol.

Dante80 fucked around with this message at 12:33 on Mar 4, 2017

Platystemon
Feb 13, 2012

BREADS
I mean, why would they not go with Intel’s convention?

This is a foreseeable outcome of the failure to do so.

I don’t care if AMD engineers think that this ordering scheme is more logical. When in Rome, do as the Romans do.

Anime Schoolgirl
Nov 28, 2002

Intel cores have always been arranged as 1/H1/2/H2/3/H3... and so on in Windows, as Ryzen's are. I'm pretty sure that's just baseless theorycrafting, when the real problem is that SMT just doesn't work properly in the first place.

In Unix-based systems, Intel cores are arranged as 1/2/3/4/H1/H2/H3/H4. I can see how problems arise if Ryzen doesn't follow this convention.

Anime Schoolgirl fucked around with this message at 12:56 on Mar 4, 2017

repiv
Aug 13, 2009

Dante80 posted:

Thus, without proper patches to Windows and the games themselves, we see Ryzen performance getting worse when SMT is enabled.

It looks like Windows is already working correctly: The Stilt ran coreinfo against his Ryzen chip, and the logical cores are being detected as sequential pairs, as you described:

code:
Logical to Physical Processor Map:
**--------------  Physical Processor 0 (Hyperthreaded)
--**------------  Physical Processor 1 (Hyperthreaded)
----**----------  Physical Processor 2 (Hyperthreaded)
------**--------  Physical Processor 3 (Hyperthreaded)
--------**------  Physical Processor 4 (Hyperthreaded)
----------**----  Physical Processor 5 (Hyperthreaded)
------------**--  Physical Processor 6 (Hyperthreaded)
--------------**  Physical Processor 7 (Hyperthreaded)
On the other hand coreinfo shows that the chip presents itself as a single NUMA node - surely each CCX should be a separate node so the scheduler knows to avoid migrating threads between them?

repiv fucked around with this message at 13:43 on Mar 4, 2017
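The coreinfo map quoted above is easy to parse programmatically. A toy sketch that recovers the logical-to-physical pairing from the asterisk rows (assuming the exact format shown; abbreviated here to two cores):

```python
# Each coreinfo row describes one physical core; the '*' columns mark
# which logical CPUs belong to it.
coreinfo = """\
**--------------  Physical Processor 0 (Hyperthreaded)
--**------------  Physical Processor 1 (Hyperthreaded)
"""

mapping = {}
for line in coreinfo.splitlines():
    mask, _, rest = line.partition("  ")   # split mask from the label
    core = int(rest.split()[2])            # "Physical Processor N ..."
    for logical, ch in enumerate(mask):
        if ch == "*":
            mapping[logical] = core

print(mapping)  # {0: 0, 1: 0, 2: 1, 3: 1}
```

The sequential pairs (0,1 on core 0; 2,3 on core 1) fall straight out of the mask positions, consistent with the full 8-core map above.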

Dante80
Mar 23, 2015

So the performance disparity with SMT on could be attributed to the OS or the engine not seeing that the chip is two 4-core complexes, thus instead of utilizing the slaved L3 for each complex it puts the interconnect (the fabric or something) into overdrive?

That would give a lot of latency, right?

In other news, it seems that memory is pretty important to the platform.

Memory Speed Has a Large Impact on Ryzen Performance

Dante80 fucked around with this message at 13:22 on Mar 4, 2017

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Dante80 posted:

Some more news...

AMD SMT cores are mapped differently than Intel:
- Some websites claim that Intel's logical core mapping is: thread 1 of every CPU as 1, 2, 3, ..., 8 and thread 2 of every CPU as 9, 10, 11, ..., 16.

On my 3770K, core 1 is the second thread of core 0, core 2 is physical, core 3 is the second thread of core 2, and so on, alternating.

This launch has kind of sucked really badly for AMD. I think they should have delayed it until the Vega launch, so they could not only have worked out all the bugs in the firmware and so on (so nobody ends up with pre-release firmware on their board giving lovely performance), but also could generate Vega sales with new Ryzen/Vega systems as opposed to Ryzen/NVIDIA systems.

Klyith
Aug 3, 2007

GBS Pledge Week

Dante80 posted:

Which also speaks a lot about how loving clown-ish and rushed the launch was...AMD stock has lost more than $1.5bn in two days due to their PR department lol.

Well, it gained quite a bit of that 1.5 billion due to their PR department as well, by selectively handing out pre-release samples to hypesters and LN2 overclockers. :v:

PC LOAD LETTER
May 23, 2005
WTF?!
Says the difference from DDR4 2133 (CL10) to DDR4 3466 (CL14) is about a 10% increase in performance, which is noticeable.

Weird to see performance improve that much from RAM speed. I wonder if the firmware updates they're going to be doing will make a difference there. DDR4 3466 (CL14) isn't easy to achieve on Ryzen right now. Some AMD-specific DRAM is coming for Ryzen from G.Skill, I think, so maybe that will help.

HalloKitty posted:

This launch has kind of sucked really badly for AMD.

Nah, I think the Phenom TLB bug launch was worse. That thing caused a big performance hit that flat out wasn't fixable without redoing the hardware a bit, which took a long time. It seems like they've got a decent shot at fixing many of Zen's problems with firmware and actual (lol) Windows drivers, and without those fixes it does actually perform decently, if inconsistently.

Realistically they seem to be selling well if some of the rumors are anything to go by so people are at least giving them some benefit of the doubt.

But yeah delaying another month or 2 to let things get fixed would've been a drat good idea. I have no clue why they rushed things.

\/\/\/\/On the value stuff yeah it really could be that simple, the higher performance stuff might be a different story\/\/\/\/

PC LOAD LETTER fucked around with this message at 13:47 on Mar 4, 2017

Risky Bisquick
Jan 18, 2008

PLEASE LET ME WRITE YOUR VICTIM IMPACT STATEMENT SO I CAN FURTHER DEMONSTRATE THE CALAMITY THAT IS OUR JUSTICE SYSTEM.



Buglord
:airquote:AMD specific:airquote: being single-rank DIMMs

ConanTheLibrarian
Aug 13, 2004


dis buch is late
Fallen Rib

PC LOAD LETTER posted:

But yeah delaying another month or 2 to let things get fixed would've been a drat good idea. I have no clue why they rushed things.

AMD is still a loss making company. Cash flow problems could have dictated that they release Ryzen as soon as they could. There's a good chance that performance will improve with firmware updates and as games or whatever are optimised for the new architecture. Meanwhile they'll have more cash on hand for the Vega launch.

Truga
May 4, 2014
Lipstick Apathy

Are we reading the same thread?

"mse" is mean square error, i.e. "how much of this video is different from the lossless one"

x264 is noticeably better in general, and copes far better with high motion games like 1st/3rd person games, which I stream. Again, from that thread:

quote:

From this computer-generated rating with mainly the mse as criteria, you may come to the conclusion that NVENC is on par with x264 preset=veryfast (the default in OBS), or even a bit better, but unfortunately it isn't. At least for high motion scenes.

I was super hyped about GPU encoding some years ago, but it just can't be that good when you actually sit down and think about it. It's annoying and I wish it weren't the case, but there's not much that can be done about this, especially for high motion games. GPUs just aren't suited very well to encoding video: video encoding is highly sequential and thus isn't something that profits much from parallel threads after a certain point. You need the previous frame to encode the next one correctly, and the next frame might be completely different.

Highly simplified: by the time a frame gets through a GPU's longass pipeline, a CPU has already gone over it 4-5 times, choosing progressively better settings each time. GPUs, on the other hand, just throw n threads at the problem, with a rainbow of settings based loosely on the previous frames, and hope it produces a good frame. You can see how this might not be ideal in many scenarios.

There's a reason why a decent encoding card costs $1000 or more. :P
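The frame-dependency point above can be sketched abstractly. This is a deliberately toy model of inter-frame (P-frame style) encoding, nothing like a real codec, just illustrating why each frame depends on the previous reconstructed one and can't be started early:

```python
# Toy model: each frame is stored as a delta against the previous
# reconstructed frame, so encoding is inherently sequential --
# frame N can't be encoded before frame N-1 is finished.

def encode(frames):
    encoded, prev = [], 0
    for f in frames:
        delta = f - prev       # residual vs previous reconstruction
        encoded.append(delta)
        prev = prev + delta    # decoder-side reconstruction
    return encoded

def decode(deltas):
    out, prev = [], 0
    for d in deltas:
        prev += d
        out.append(prev)
    return out

frames = [10, 12, 11, 30]      # a "high motion" jump at the end
print(encode(frames))          # [10, 2, -1, 19]
print(decode(encode(frames)) == frames)  # True
```

The big residual at the jump is the toy version of why high motion scenes are where cheap one-pass hardware encoders fall behind a CPU making multiple refinement passes.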

Also, I've been getting the eve itch again, but that guy with 11 clients reminded me why I stopped playing. Thanks, goons!

Platystemon
Feb 13, 2012

BREADS
Beating x264 on “veryfast” is better than I expected for hardware encoding.

quote:

From this computer-generated rating with mainly the mse as criteria, you may come to the conclusion that NVENC is on par with x264 preset=veryfast (the default in OBS), or even a bit better, but unfortunately it isn't. At least for high motion scenes.

Well then.

Truga
May 4, 2014
Lipstick Apathy
Yeah, hardware encoding is getting pretty good, in about 5 more years I expect it to be on-par :v:

Malcolm XML
Aug 8, 2009

I always knew it would end like this.

Platystemon posted:

Why would they do this? :psyduck:

game developers are generally under crunch pressure and don't make the best decisions

https://msdn.microsoft.com/en-us/library/windows/desktop/ms683194(v=vs.85).aspx

is the right API to use, and I guess they didn't use it!


Platystemon
Feb 13, 2012

BREADS
This is a tangentially related question, but I don’t know of a thread it fits in better and we’re on the subject anyway:

What hardware do commercial video streaming services use for encoding?
