Anime Schoolgirl
Nov 28, 2002

jm20 posted:

It looks mighty hard to fit a PCIe card and use SATA on that board, inventive indeed.
If only someone would step up to the plate and bring us a single-slot, half-height RX 4x0/GTX 10x0


SwissArmyDruid
Feb 14, 2014

by sebmojo
With a single-slot cooler, amirite?

Anime Schoolgirl
Nov 28, 2002

Yes :smith:

SwissArmyDruid
Feb 14, 2014

by sebmojo
So, a while back, I tried to figure out what it would take to make a single-slot cooler for a low-profile video card. Long story short: the extruded heatsinks you get by taking a long-rear end bar and just chopping it every couple of inches have a minimum order price, they really don't cool all that well when you cut them down severely, and there's a reason why every modern enthusiast GPU moved to really thin fins. Even on the reference coolers, the fins are vastly thinner than I would feel comfortable machining.

It looks dire, I'm afraid. You're going to have to compromise on *one* thing. Even the Galax/Galaxy/KFA2 low-profile 750 Ti I have is a double-height cooler, and it uses an extruded heatsink with no heatpipes.



:toot: 50 cents on the day.

SwissArmyDruid fucked around with this message at 08:36 on Aug 20, 2016

eames
May 9, 2009

EdEddnEddy posted:


I continue to really hope for the best here though. Intel needs a kick in the nuts and AMD needs a winning architecture that can bring them back into the game full swing. If Zen ends up being great, and they make some APUs with HBM2 for the mobile market that can land within striking distance of, say, 25% slower than an Nvidia 1060 (would that be possible?), then they could really have some killer products on the market in the next year or two.


Since this seems to be a realistic scenario now: If this really happens, why the heck doesn't Apple just buy AMD? Surely a company like Apple has analysts and insiders that can gauge the performance of Zen by now.
They'd free themselves from Intel and probably get a boatload of useful GPU/CPU patents for their ARM chips as well.

SwissArmyDruid
Feb 14, 2014

by sebmojo
I'll save you the trouble of digging the answer out from many pages back: AMD's present arrangement with Intel contains a clause stating that if either party is acquired (which obviously makes it pertinent only to AMD, and not Chipzilla), AMD's x86 license is terminated, as is Intel's x86-64 license. This makes acquisition untenable for anyone desiring to enter the x86 space.

eames
May 9, 2009

SwissArmyDruid posted:

I'll save you the trouble of digging the answer out from many pages back: AMD's present arrangement with Intel contains a clause stating that if either party is acquired (which obviously makes it pertinent only to AMD, and not Chipzilla), AMD's x86 license is terminated, as is Intel's x86-64 license. This makes acquisition untenable for anyone desiring to enter the x86 space.

Oh I see, thanks. That changes everything, as AMD minus the x86 license and designs isn't even attractive to Apple, even though they're rumored to switch to ARM. (thinner! lighter!)

I think there's a decent chance we'll see Zen in future MacBooks, assuming the performance pans out, unless they really do switch to ARM before that.

Arzachel
May 12, 2012

EdEddnEddy posted:

I continue to really hope for the best here though. Intel needs a kick in the nuts and AMD needs a winning architecture that can bring them back into the game full swing. If Zen ends up being great, and they make some APUs with HBM2 for the mobile market that can land within striking distance of, say, 25% slower than an Nvidia 1060 (would that be possible?), then they could really have some killer products on the market in the next year or two.

I could picture a good bang-for-the-buck laptop being a 4C/8T Zen machine with a 460X-level APU with HBM on the die, but would they/could they make such a thing before Nvidia/Intel somehow beat them to the punch? (Intel sure won't; they have pretty much stopped updating the iGPU past what you see with the Iris Pro 580, from what I have seen.)

I've been waiting for something like this since Llano. :smith:

PC LOAD LETTER
May 23, 2005
WTF?!

Palladium posted:

Gotta love how, when Zen is rumored to be competitive with BDW-E, the minds of fanboys go straight to "gimme that for <$200", because apparently competitive performance = competitive pricing doesn't apply to AMD.

With a healthy fanbase like this, small wonder AMD has been bleeding money since forever.
Zen being competitive with Broadwell-E is better than most (myself included) realistically hoped for, but even if they manage to pull that off for general workloads, it means Zen will still lose by ~5-10% to Skylake and 15-20% to Purleylake on a per-clock basis.

For general desktop, gaming, and 'real world' use, that sort of performance difference won't mean much. Even if they can only get Zen to around 3.5GHz at stock clockspeeds, it won't matter much vs existing Skylake or future Purleylake chips.

But they'll lose the synthetic benchmark battles by significant amounts, and they probably won't have perf/watt as good as what Intel will be offering, which will matter when they go to try and sell to the server markets. Being less than the best typically means you have to reduce prices by quite a bit in the PC biz, unfortunately. The good news is that even selling their chips for hundreds less than Intel will still result in dramatically improved ASPs and revenue, so I'd hardly look at it as a 'bad' situation to be in.

What will be really interesting to see is whether Zen+ performs like AMD is saying. If Zen+ really does end up being ~15% faster per clock than Zen, it means they'll have essentially near-identical performance to Purleylake even in synth benches. I wouldn't be surprised if AMD sold Zen+ for just a tad ($20-50) less than Intel's prices if they pull off that degree of performance.
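
Back-of-envelope, in Python, using only the rough percentages guessed above (they're the post's estimates, not measurements):

# Rough per-clock (IPC) comparison using the post's guessed percentages.
skylake = 1.00                    # baseline
zen = skylake * (1 - 0.075)       # "loses ~5-10% to Skylake": split the difference at 7.5%
purley = skylake * 1.10           # "15-20% behind Purleylake" implies Purley ~10% above Skylake
zen_plus = zen * 1.15             # "Zen+ ~15% faster per clock than Zen"

print(f"Zen vs Skylake:  {zen / skylake:.2f}x")      # a bit over 7% behind per clock
print(f"Zen vs Purley:   {zen / purley:.2f}x")       # roughly 16% behind
print(f"Zen+ vs Purley:  {zen_plus / purley:.2f}x")  # within ~3%, i.e. near parity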

PC LOAD LETTER fucked around with this message at 12:16 on Aug 20, 2016

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

Twerk from Home posted:

Yes and no. It's massively memory bottlenecked and falls off at high resolution / AA / any other memory-intensive situations. I think on paper it's even faster than those!

So basically it's even more retarded on Intel's part than I thought, as integrating an Iris Pro 580 onto a PCB with GDDR5 and a memory controller would give them a very competitive product for single-slot/low-profile. I mean, a 580 is paired with 4 Skylake cores, so we're talking about an ASIC that pulls what, 20W at most in a standalone configuration, paired with 2-4GB of memory pulling 10-20 more watts. That's a pretty slam-dunk perfect product from my perspective, and it clearly has room for, say, a 108/144EU part that still fits inside the PCIe power spec. Intel could be selling bus-powered 380X/960s to OEMs and it boggles my mind why it's not happening.
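
For what it's worth, the slot-power arithmetic does seem to work out. A quick sketch (the 20W ASIC and 10-20W memory figures are the post's guesses, the 75W figure is the PCIe slot power limit, and the board overhead number is a made-up allowance):

# Back-of-envelope power budget for a hypothetical bus-powered Iris Pro 580 card.
PCIE_SLOT_LIMIT_W = 75       # max power a card may draw from the slot with no 6-pin connector
gpu_asic_w = 20              # guessed standalone draw of the Iris Pro 580 silicon
memory_w = 20                # guessed worst case for 2-4GB of GDDR5 plus its controller
board_overhead_w = 10        # assumed VRM/fan/misc losses (hypothetical figure)

total_w = gpu_asic_w + memory_w + board_overhead_w
print(f"Estimated board power: {total_w}W out of a {PCIE_SLOT_LIMIT_W}W slot budget")
print("Fits bus-powered" if total_w <= PCIE_SLOT_LIMIT_W else "Needs external power")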

EmpyreanFlux fucked around with this message at 22:18 on Aug 22, 2016

is that good
Apr 14, 2012
Because there's just not that much profit in a market as small as consumer graphics cards, especially compared to all the other stuff Intel does

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

Allstone posted:

Because there's just not that much profit in a market as small as consumer graphics cards, especially compared to all the other stuff Intel does

I'm not thinking consumer, though; GPUs still have their uses in professional and enterprise, and it's clear they could build an up-to-snuff GPU to challenge even Nvidia, so it's kind of baffling.

Anime Schoolgirl
Nov 28, 2002

Unfortunately, CUDA rules the roost in that. There's a small reason why Intel is instead stuffing lots of little Pentium 2s in a Tesla-like card.

Arzachel
May 12, 2012

Anime Schoolgirl posted:

Unfortunately, CUDA rules the roost in that. There's a small reason why Intel is instead stuffing lots of little Pentium 2s in a Tesla-like card.

Is CUDA that important for an AutoCAD/Maya/Illustrator workstation?

If I recall correctly, AMD's APUs have had a hybrid DDR3/GDDR5 memory controller for a few generations already, so there are likely some underlying issues behind why no one's selling a PS4/Xbone-style SoC.

Watermelon Daiquiri
Jul 10, 2010
I TRIED TO BAIT THE TXPOL THREAD WITH THE WORLD'S WORST POSSIBLE TAKE AND ALL I GOT WAS THIS STUPID AVATAR.
"So, what, this thing's basically a PS4/XBONE? Why not just buy one of those for a cheaper price and "just works"-ness?"

Arzachel
May 12, 2012
Consoles make for poor laptops :v:

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

Arzachel posted:

Consoles make for poor laptops :v:

If there were a market for it, I'm pretty sure a first-party console laptop is more feasible right now than it's ever been. The die-shrunk Xbone/PS4 have got to be relatively efficient, and they're just x86 + GCN.

mediaphage
Mar 22, 2007

Excuse me, pardon me, sheer perfection coming through

Twerk from Home posted:

If there were a market for it, I'm pretty sure a first-party console laptop is more feasible right now than it's ever been. The die-shrunk Xbone/PS4 have got to be relatively efficient, and they're just x86 + GCN.

I would unironically buy a Nintendo tablet that was basically just a web browser and a Nintendo game unit, with a Wii U Pro controller. Maybe buy an optional TV dock for improved graphical performance.

cbirdsong
Sep 8, 2004

Commodore of the Apocalypso
Lipstick Apathy

mediaphage posted:

I would unironically buy a Nintendo tablet that was basically just a web browser and a Nintendo game unit, with a Wii U Pro controller. Maybe buy an optional TV dock for improved graphical performance.

You're in luck! It seems like this is basically what the NX is, except it uses Tegra and not an AMD SoC. http://www.eurogamer.net/articles/2016-07-26-nx-is-a-portable-console-with-detachable-controllers

Rastor
Jun 2, 2001

WCCFTech slide dump for processor nerds:
http://wccftech.com/amd-zen-architecture-hot-chips/

KingEup
Nov 18, 2004
I am a REAL ADDICT
(to threadshitting)


Please ask me for my google inspired wisdom on shit I know nothing about. Actually, you don't even have to ask.

mediaphage posted:

Maybe buy an optional TV dock for improved graphical performance.

eGPU is the future.

SwissArmyDruid
Feb 14, 2014

by sebmojo
Still no word about how they're managing NUMA L3 between the two "core complex" units in an 8C/16T part. :argh:

Also, I personally am waiting for that final form of APU where they put 4C/8T Zen, enough GPU power to be around X60, and a couple dots of HBM2 into a thin-and-light. Up until earlier this month, there would have been no competition to such a product.

SwissArmyDruid fucked around with this message at 05:05 on Aug 23, 2016

mediaphage
Mar 22, 2007

Excuse me, pardon me, sheer perfection coming through

cbirdsong posted:

You're in luck! It seems like this is basically what the NX is, except it uses Tegra and not an AMD SoC. http://www.eurogamer.net/articles/2016-07-26-nx-is-a-portable-console-with-detachable-controllers

I like this idea, but I'm going to be pretty cross if the dock doesn't actually make it look nice on the TV (that is, true alternate GPU rendering rather than some upscaling nonsense). I'm honestly not sure if this console is a replacement for the Wii U or the 3DS, or maybe both. I am a little skeptical of their ability to retain AAA 3rd party titles if they once again choose to poo poo the graphics bed, but whatever.

I mean I'm still going to buy this almost no matter what so

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

mediaphage posted:

I like this idea, but I'm going to be pretty cross if the dock doesn't actually make it look nice on the TV (that is, true alternate GPU rendering rather than some upscaling nonsense). I'm honestly not sure if this console is a replacement for the Wii U or the 3DS, or maybe both. I am a little skeptical of their ability to retain AAA 3rd party titles if they once again choose to poo poo the graphics bed, but whatever.

I mean I'm still going to buy this almost no matter what so

Well they already don't have much in the way of third party AAA titles on the Wii U to begin with, so they're not really losing out. Most Wii U third party stuff is ports from last-gen console versions, when the Wii U is supported at all, due to Wii U basically being an Xbox 360 with 2 GB of RAM and a slightly faster GPU.

Gwaihir
Dec 8, 2009
Hair Elf

FaustianQ posted:

95W for 8C/16T, 150W for 24C/48T and 180W for 32C/64T. I'm not sure why there isn't a 16C/32T chip for what seems to be a 125W slot but I'm not AMD.

Basically they seem to trade blows with Broadwell at each turn, so I want to say this will all be a pricing issue. What's the die size on a Broadwell Xeon? Zen is supposed to be 143mm².

143mm² is suuuuuuuuper small for a many-core CPU; Broadwell Xeons range from 246 to 456mm² for the 10-core to 24-core versions.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

Gwaihir posted:

143mm² is suuuuuuuuper small for a many-core CPU; Broadwell Xeons range from 246 to 456mm² for the 10-core to 24-core versions.

I was going off the original guesses based on the original die-shot leak, but apparently it's been revised upward toward 200mm².

NewFatMike
Jun 11, 2015

SwissArmyDruid posted:

Still no word about how they're managing NUMA L3 between the two "core complex" units in an 8C/16T part. :argh:

Also, I personally am waiting for that final form of APU where they put 4C/8T Zen, enough GPU power to be around X60, and a couple dots of HBM2 into a thin-and-light. Up until earlier this month, there would have been no competition to such a product.

I still think there is no real competition to this tragically hypothetical product. The Pascal-based laptop solutions would probably be double the price of such a device. It would probably be the ideal college laptop as well. Powerful enough for your low-power games and emulators (CS:GO, Overwatch, WoW, etc.), drop a Freesync panel in there so it ages a little more gracefully, and get near-ultrabook-level battery life by eating up all that dGPU space with battery? Oh yeah, sign me right up.

You can probably do some decent rendering work in there as well, modeling, and so forth. Pretty good for just about any major.

This is the kind of stuff that makes me hnnnnngggggggg for Zen. AMD are so within striking distance of something incredible like this. But then again, it's up to the OEMs to make use of the capability if it's there. Even if it requires jamming a CPU-grade part into a laptop chassis.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

NewFatMike posted:

But then again, it's up to the OEMs to make use of the capability if it's there.

NewFatMike posted:

it's up to the OEMs to make use of the capability if it's there.

NewFatMike posted:

it's up to the OEMs

When has this ever bitten AMD in the rear end?

EdEddnEddy
Apr 5, 2012



mediaphage posted:

I like this idea, but I'm going to be pretty cross if the dock doesn't actually make it look nice on the TV (that is, true alternate GPU rendering rather than some upscaling nonsense). I'm honestly not sure if this console is a replacement for the Wii U or the 3DS, or maybe both. I am a little skeptical of their ability to retain AAA 3rd party titles if they once again choose to poo poo the graphics bed, but whatever.

I mean I'm still going to buy this almost no matter what so

I have held off getting a Wii U for mostly this reason. I want one, but outside of the Mario games, what else is the draw of the U?

My N3DS, however, I'm bummed I didn't get sooner. This thing is a ton of fun.

Potato Salad
Oct 23, 2014

nobody cares


Nintendo's next console going to the NX really doesn't help AMD, does it? Global Foundries is making their Tegras.

To my knowledge, this would make Nintendo the first console corp to include an actual, honest-to-god COTS GPU in a system. That's... possibly attractive given the performance of Tegra-accelerated devices over the Xbone/PS4. (I am only parroting articles citing stuff like SHIELD devices pushing an easy 1080p/60FPS for the new DOOM where the Xbone can barely deliver 720p at a very choppy 45-50 fps.) If that actually works out and the power of their full-handheld-but-dockable console isn't intentionally cut in the name of battery life, it would be hilarious to see Nintendo possibly force the rest of the console market forward with good hardware.

mayodreams
Jul 4, 2003


Hello darkness,
my old friend

EdEddnEddy posted:

I have held off getting a Wii U for mostly this reason. I want one, but outside of the Mario games, what else is the draw of the U?

My N3DS, however, I'm bummed I didn't get sooner. This thing is a ton of fun.

I have one mostly for the Mario games, which are really really good. The Zelda remakes are nice too.

Anime Schoolgirl
Nov 28, 2002

Potato Salad posted:

Nintendo's next console going to the NX really doesn't help AMD, does it? Global Foundries is making their Tegras.

To my knowledge, this would make Nintendo the first console corp to include an actual, honest-to-god COTS GPU in a system. That's... possibly attractive given the performance of Tegra-accelerated devices over the Xbone/PS4. (I am only parroting articles citing stuff like SHIELD devices pushing an easy 1080p/60FPS for the new DOOM where the Xbone can barely deliver 720p at a very choppy 45-50 fps.) If that actually works out and the power of their full-handheld-but-dockable console isn't intentionally cut in the name of battery life, it would be hilarious to see Nintendo possibly force the rest of the console market forward with good hardware.

That 1080/60fps demo was on Doom 3, actually

FuturePastNow
May 19, 2014


NewFatMike posted:


This is the kind of stuff that makes me hnnnnngggggggg for Zen. AMD are so within striking distance of something incredible like this. But then again, it's up to the OEMs to make use of the capability if it's there. Even if it requires jamming a CPU-grade part into a laptop chassis.

I don't think I've ever seen an AMD-powered laptop that wasn't a complete piece of poo poo.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

Potato Salad posted:

Nintendo's next console going to the NX really doesn't help AMD, does it? Global Foundries is making their Tegras.

To my knowledge, this would make Nintendo the first console corp to include an actual, honest-to-god COTS GPU in a system. That's... possibly attractive given the performance of Tegra-accelerated devices over the Xbone/PS4. (I am only parroting articles citing stuff like SHIELD devices pushing an easy 1080p/60FPS for the new DOOM where the Xbone can barely deliver 720p at a very choppy 45-50 fps.) If that actually works out and the power of their full-handheld-but-dockable console isn't intentionally cut in the name of battery life, it would be hilarious to see Nintendo possibly force the rest of the console market forward with good hardware.

The new Doom only runs on x86-64 systems. There are no ARM builds, so it can't run natively on the Shield or any Tegra device. You were either watching people stream it from a nice computer, or maybe a heavily stripped-down tech demo.

Like, you grossly overestimate the chipset Nvidia is offering here. It's just a particularly good tablet/smartphone SoC, maybe, which still puts it far behind the performance of the Xbox One or PS4, let alone the upgraded Xbox One and PS4 models that will be coming out next year. The current Nvidia Shield K1 tablet, for instance, has a quad-core 2.2 GHz 32-bit ARMv7-A CPU with a 192-core Kepler-based GPU setup and 2 GB of RAM for the system. The Xbox One's AMD APU has 8 x86-64 cores at 1.6 GHz, with 768 GPU cores that don't have to be downclocked to prevent overheating in a fanless mobile device, and there's 8 GB of total RAM. The PS4 is similar.
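
As a rough check on that gap, here is a sketch using the shader counts above plus assumed GPU clocks (roughly 850 MHz for the K1's Kepler GPU, 853 MHz for the Xbox One's GCN GPU) and the usual 2 FLOPs per shader per cycle for fused multiply-add; it ignores memory bandwidth, which only widens the gap:

# Very rough single-precision throughput comparison; clocks are assumptions.
def gflops(shaders, clock_ghz):
    return shaders * 2 * clock_ghz   # 2 FLOPs per shader per cycle (FMA)

tegra_k1 = gflops(192, 0.85)    # Shield K1 tablet, Kepler; ~850 MHz assumed
xbox_one = gflops(768, 0.853)   # Xbox One GCN GPU at 853 MHz

print(f"Tegra K1: ~{tegra_k1:.0f} GFLOPS, Xbox One: ~{xbox_one:.0f} GFLOPS")
print(f"Xbox One advantage: ~{xbox_one / tegra_k1:.1f}x, before counting the 2 GB vs 8 GB RAM gap")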

What Nintendo's on track to do is, for the third console in a row, end up with the slowest/least powerful system out there.

The Wii was of course noticeably faster than the previous generation of consoles, but it was still quite a bit behind the 360 and PS3: it could only render in standard def, it had only 91 MB of RAM total (the PS3 and 360 both have 512 MB), and it was a single-core PowerPC when the 360 was a triple-core/6-thread PowerPC and the PS3 was a weird setup with 1 main PowerPC core plus 7 additional cores available to software as well.

The Wii U is basically just the Xbox 360's design with more RAM - 2 GB specifically. The CPU is a similar triple-core PPC, although at ~1.25 GHz instead of the 360's 3.2 GHz; actual performance on most things is about the same because of improvements in the instruction set and the like. The GPU is only slightly more powerful than the 360's as well. So basically it launched as a 7-year-old design when it came out in 2012.

Now, if Nintendo's next regular console really will be the NX, and the NX really will be any sort of current or near-future Nvidia Tegra chipset, it's not even going to be as fast as the 2013 Xbox One and PS4. It'll be a decent bit faster than the Wii U, but that's a very low bar. Depending on which particular Tegra setup they use, since it'll supposedly end up handheld too, it might even be only at the performance of the Wii U! It's a very bad sign; their only real option for coming out ahead of the current Xbox One and PS4, let alone the upgraded models coming next year, would be securing a high-core-count Intel or AMD setup.

Anime Schoolgirl posted:

That 1080/60fps demo was on Doom 3, actually

If that's what he saw, then yeah, that's just showing the hardware can handle a 12-year-old game that ran in 720p on the original Xbox. Not exactly impressive!

fishmech fucked around with this message at 17:54 on Aug 24, 2016

EdEddnEddy
Apr 5, 2012



The current Tegra X1 in the Shield TV is an 8-core part (4 A57 + 4 A53 cores) with a 256-core Maxwell GPU that is actively cooled, and while it still isn't quite at Xbox One or PS4 level, it is drat close. It runs War Thunder's flying parts almost better than the PS4 does, and it can handle video like a champ up to 4K/60FPS H.265.

Since the PX1 is now out, which is an even meatier 12-core part with Pascal in it, capable of 8TF (though it uses 250W and is water-cooled for use in cars), it is reasonable to assume that whatever Nvidia tech the NX uses should probably be around the X1, but possibly with Pascal tech for the GPU, I would guess (hope?). Which would bring it pretty close to current-gen console level graphics-wise, if not better.

Setset
Apr 14, 2012
Grimey Drawer

EdEddnEddy posted:

The current Tegra X1 in the Shield TV is an 8-core part (4 A57 + 4 A53 cores) with a 256-core Maxwell GPU that is actively cooled, and while it still isn't quite at Xbox One or PS4 level, it is drat close. It runs War Thunder's flying parts almost better than the PS4 does, and it can handle video like a champ up to 4K/60FPS H.265.

Since the PX1 is now out, which is an even meatier 12-core part with Pascal in it, capable of 8TF (though it uses 250W and is water-cooled for use in cars), it is reasonable to assume that whatever Nvidia tech the NX uses should probably be around the X1, but possibly with Pascal tech for the GPU, I would guess (hope?). Which would bring it pretty close to current-gen console level graphics-wise, if not better.

The PX1 still only uses 256 CUDA cores, yes? Unless they are clocked a lot higher than the TX1, I don't see how the GPU would be much of an improvement, outside of power efficiency - which is kind of a big deal, for sure. Maxwell and Pascal IPC are virtually the same.

Potato Salad
Oct 23, 2014

nobody cares



I dramatically misunderstood the article, so I'm going to offer the lame excuse of "It was 12:15 am and I was trying to sleep with a cold"

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

EdEddnEddy posted:

The current Tegra X1 in the Shield TV is an 8-core part (4 A57 + 4 A53 cores) with a 256-core Maxwell GPU that is actively cooled, and while it still isn't quite at Xbox One or PS4 level, it is drat close. It runs War Thunder's flying parts almost better than the PS4 does, and it can handle video like a champ up to 4K/60FPS H.265.

Since the PX1 is now out, which is an even meatier 12-core part with Pascal in it, capable of 8TF (though it uses 250W and is water-cooled for use in cars), it is reasonable to assume that whatever Nvidia tech the NX uses should probably be around the X1, but possibly with Pascal tech for the GPU, I would guess (hope?). Which would bring it pretty close to current-gen console level graphics-wise, if not better.

It's nowhere close to the AMD GPU performance in the Xbox One or PS4, and being able to handle a video codec doesn't tell you much about games performance, it just tells you they have hardware codec support. Also, the 8 cores don't get used simultaneously; they switch between the sets of 4 cores based on system load. That's great at keeping battery draw or heat down when you're doing non-intensive things, but you can't use them all at once to increase performance.
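
For what it's worth, on a Linux-based device like the Shield you can watch that cluster switching happen by reading the standard CPU hotplug files in sysfs; an offlined A57 or A53 cluster just shows up as cores reporting 0. A minimal sketch:

# List which CPU cores are currently online (Linux sysfs CPU hotplug interface).
import glob, os

for path in sorted(glob.glob("/sys/devices/system/cpu/cpu[0-9]*")):
    cpu = os.path.basename(path)
    online_file = os.path.join(path, "online")
    if os.path.exists(online_file):              # cpu0 usually has no 'online' file
        with open(online_file) as f:
            state = "online" if f.read().strip() == "1" else "offline"
    else:
        state = "online (not hot-pluggable)"
    print(f"{cpu}: {state}")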

Plus, the NX is supposed to be usable portably if the same rumors listing it as a Tegra chipset are true, which places serious constraints on what sort of graphics performance the games can expect, unless you've got the most amazing cooling tech for a handheld system in the world and a really good battery on it. Consider all this stuff combined, and getting performance out of the thing that's as good as the 3-year-old XBO/PS4 is a distant hope, let alone anything better, and once again improved CPU/GPU XBO/PS4 models are due out next year. Both of those are expected to handle real-time gameplay at at least ~3K horizontal resolution if not full 4K horizontal resolution, and of course better performance at 1080p regardless.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

fishmech posted:

being able to handle a video codec doesn't tell you much about games performance, it just tells you they have hardware codec support.

Nothing reminds me more of this than when I ran a Pentium 120 as a DVD player. Software decoding? Absolutely unusable. MPEG2 card installed? Perfect, of course.


Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

HalloKitty posted:

Nothing reminds me more of this than when I ran a Pentium 120 as a DVD player. Software decoding? Absolutely unusable. MPEG2 card installed? Perfect, of course.

It's still like this if you've somehow got your hands on a 4K 60 fps video source and are trying to play it back without hardware acceleration.
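
If you want to see the difference for yourself, ffmpeg's -benchmark flag makes the comparison easy. A sketch (4k60.mp4 is a hypothetical test clip; which -hwaccel backend actually works depends on your platform):

# Compare software vs. hardware-assisted decode speed of a 4K60 clip with ffmpeg.
import subprocess

CLIP = "4k60.mp4"  # hypothetical 4K 60fps HEVC test file

# Pure software decode: decode every frame, discard the output, print timing stats.
subprocess.run(["ffmpeg", "-benchmark", "-i", CLIP, "-f", "null", "-"])

# Hardware-assisted decode (backend varies: vaapi, dxva2, videotoolbox, ...).
subprocess.run(["ffmpeg", "-benchmark", "-hwaccel", "auto", "-i", CLIP, "-f", "null", "-"])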
