|
redeyes posted:I'm sure no one cares at this point, but I got the opportunity to test an RX 580 against a GTX 1070 Founders Edition. The 1070 is so much faster it's insane, not to mention quieter, cooler-running, AND a lot less buggy. All hail our Nvidia overlord.
|
# ? Feb 12, 2018 21:28 |
|
|
|
GRINDCORE MEGGIDO posted:Really hoping they keep soldering the Ryzen+.
|
# ? Feb 12, 2018 21:30 |
|
Paul MaudDib posted:Real-time CPU encoding will put the hurt even on something as powerful as a 1700. I can speak from personal experience here: GPU > CPU for real-time compression, even using a capture card and playing on a console. My R7 1700 doesn't even come close.
|
# ? Feb 12, 2018 22:19 |
|
I still think the output of my 270X looks subpar compared to software encoding, but I wouldn't be surprised if Nvidia came up with a superior way to encode at 3,500 and 6,000 kbps, especially with all the algos madshi is cranking out.
|
# ? Feb 12, 2018 22:36 |
|
Can Raven Ridge stream Netflix 4K?
|
# ? Feb 12, 2018 22:37 |
|
ufarn posted:I still think the output of my 270X looks subpar compared to software encoding, but I wouldn't be surprised if Nvidia came up with a superior way to encode at 3,500 and 6,000 kbps, especially with all the algos madshi is cranking out. The 270X is GCN 1.0 (Pitcairn); the newer cards have better quality at a given bitrate. Kepler is pretty rough too.
|
# ? Feb 12, 2018 22:41 |
|
Malcolm XML posted:Can Raven Ridge stream Netflix 4K? Look up my previous post, please.
|
# ? Feb 12, 2018 22:50 |
|
redeyes posted:I'm sure no one cares at this point, but I got the opportunity to test an RX 580 against a GTX 1070 Founders Edition. The 1070 is so much faster it's insane, not to mention quieter, cooler-running, AND a lot less buggy. Isn't the 580 positioned against the 1060?
|
# ? Feb 12, 2018 22:57 |
|
sauer kraut posted:Look up my previous post, please. So, no.
|
# ? Feb 12, 2018 23:09 |
|
ufarn posted:I still think the output of my 270X looks subpar compared to software encoding, but I wouldn't be surprised if Nvidia came up with a superior way to encode at 3,500 and 6,000 kbps, especially with all the algos madshi is cranking out. I've streamed MK8D at 6 Mbps (pretty much the max I can spare out of my 10 Mbps upload) with my 1700 and my 1060, and the difference is stark. Even with no overhead except Windows and the streaming software, the CPU isn't as good.
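For anyone who wants to run the same comparison, here's a minimal sketch that encodes one clip twice with ffmpeg, once on the CPU with libx264 and once on the GPU with NVENC, at the 6,000 kbps discussed here. The input filename is a placeholder, and these are generic rate-control settings, not the exact OBS configuration from the post above:

```python
import subprocess

SOURCE = "mk8d_capture.mp4"  # placeholder: your own capture-card recording
BITRATE = "6000k"            # the 6 Mbps streaming budget mentioned above

# CPU encode with x264; -preset trades CPU time for quality at a fixed bitrate.
subprocess.run([
    "ffmpeg", "-y", "-i", SOURCE,
    "-c:v", "libx264", "-preset", "medium",
    "-b:v", BITRATE, "-maxrate", BITRATE, "-bufsize", "12M",
    "-c:a", "copy", "cpu_x264.mp4",
], check=True)

# GPU encode with NVENC (needs an Nvidia card and an ffmpeg build with nvenc).
subprocess.run([
    "ffmpeg", "-y", "-i", SOURCE,
    "-c:v", "h264_nvenc", "-preset", "slow",
    "-b:v", BITRATE, "-maxrate", BITRATE, "-bufsize", "12M",
    "-c:a", "copy", "gpu_nvenc.mp4",
], check=True)
```

Comparing the two output files side by side is a fairer test than eyeballing a live stream, since it takes dropped frames out of the equation.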
|
# ? Feb 12, 2018 23:13 |
|
That's the opposite of basically every other piece of data and anecdote available on the internet re: x264 vs GPU encoding, though...
|
# ? Feb 12, 2018 23:18 |
|
Assuming Twitch hasn't deleted it, I have video proof. Don't get me wrong, I heard the same thing; I tried my CPU first. It just wasn't as smooth or as clean. Mister Facetious fucked around with this message at 23:27 on Feb 12, 2018 |
# ? Feb 12, 2018 23:23 |
|
Malcolm XML posted:So, no. You'd probably be better off if you just got an Nvidia Shield, Apple TV 4K, Roku Ultra, or similar streambox.
|
# ? Feb 12, 2018 23:25 |
|
Mister Facetious posted:Assuming Twitch hasn't deleted it, I have video proof. Did you tune the settings on the CPU encoder, or just leave it on veryveryveryveryveryfast?
|
# ? Feb 12, 2018 23:39 |
|
I attempted to emphasize quality with every setting, since I was capturing footage from a console. And because artifacts are the Devil's pixel. Is it possible the capture card itself prefers a GPU (it's an Avermedia)?
|
# ? Feb 12, 2018 23:47 |
|
With a capture card you're not using the GPU for anything other than putting a flat texture on the overlay and encoding, so it can probably actually devote real amounts of resources to encoding in that case.
|
# ? Feb 13, 2018 00:01 |
|
Well, it lets me choose which one I want to encode with. I can do a test run tonight if you want. I'll try to get screencaps. Mister Facetious fucked around with this message at 00:41 on Feb 13, 2018 |
# ? Feb 13, 2018 00:37 |
|
Twerk from Home posted:Isn't the 580 positioned against the 1060? I was just testing cards I could get my hands on. Not like AMD has anything else you can buy.
|
# ? Feb 13, 2018 00:47 |
|
Seems like the 2400G is a winner with the general populace.
|
# ? Feb 13, 2018 01:37 |
|
Are the 2200G and 2400G APUs overclockable, and if so, by how much?
|
# ? Feb 13, 2018 01:50 |
|
spasticColon posted:Are the 2200G and 2400G APUs overclockable, and if so, by how much? Yes, and not much, according to Linus. They only got to 3.8 GHz all-core on the 2400G, which is less than its stock single-core turbo speed. I imagine this might get better with more mature BIOSes.
|
# ? Feb 13, 2018 01:51 |
|
Cygni posted:Yes, and not much, according to Linus. Not even with a very beefy aftermarket cooler? Bummer.
|
# ? Feb 13, 2018 01:52 |
|
spasticColon posted:Are the 2200G and 2400G APUs overclockable, and if so, by how much? Yes, and from what I've seen the review samples are hitting around 4.0 GHz?
|
# ? Feb 13, 2018 01:53 |
|
The biggest problems are that it's not soldered (which is basically the case for every non-K and maybe X part) and that the graphics itself takes a nice chunk of the power and thermal budget as well.
|
# ? Feb 13, 2018 01:57 |
|
I don't really care about Raven Ridge on the desktop, but I will jump on the laptop version if the stars align, aka OEMs not loving up on AMD offerings.
|
# ? Feb 13, 2018 03:44 |
|
ufarn posted:How many cores do you need before you're just being ridiculous? Big games only use up to 6 cores, and how many can OBS use for software encoding? I imagine there aren't any benchmarks where people "turn off" cores and compare, but it'd be fun to see 6C vs 8C vs 12C. Dark Shikari posted:You generally don't want (vertical resolution of video) / (threads) to be lower than 40-50, and definitely not lower than ~30. You can afford to let this number be a bit lower (maybe 25-50% lower) if you use B-frames. edit: just remembered that x264 doesn't do 1:1 threads/logical cores, but you get the idea
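To put rough numbers on the guideline quoted above, here's a quick sketch; this is just my arithmetic applied to Dark Shikari's rule of thumb, nothing taken from x264 itself:

```python
def max_useful_threads(vertical_res, b_frames=False):
    """Rough ceiling on x264 threads per the quoted rule: keep
    (vertical resolution) / (threads) around 40-50, never below ~30;
    B-frames let you stretch the lower bound somewhat."""
    rows_per_thread = 30 if b_frames else 40
    return vertical_res // rows_per_thread

for height in (720, 1080, 2160):
    print(height, max_useful_threads(height), max_useful_threads(height, b_frames=True))
# 720p  -> ~18 threads (24 with B-frames)
# 1080p -> ~27 threads (36 with B-frames)
# 2160p -> ~54 threads (72 with B-frames)
```

So by this rule even a 12C/24T part isn't wasted on a 1080p encode, and core-count scaling shouldn't fall off a cliff at 8C.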
|
# ? Feb 13, 2018 04:02 |
|
spasticColon posted:Are the 2200G and 2400G APUs overclockable, and if so, by how much? The CPU side struggles a bit with 4.0 GHz on a lot of review samples, but the GPU seems capable of hitting 1.6 GHz at 1.3 V, based on Hardware Unboxed. I've seen some reviewers unable to push the iGPU much past 1.35 GHz and am super confused by that. At 1.6 GHz with CL14 DDR4-3200, Vega 11 will compete with the RX 550 handily at 720p but falls behind at 1080p because of memory bandwidth bottlenecking. I'd suggest a Cryorig C7 or H7 for any overclocking on Raven Ridge. I can't wait for the 7nm Raven Ridge successor: maybe 14 CUs, 4C/8T, DDR4-4000 support, and improved color compression. Maybe HBM2 won't be ruinously expensive by then and they can add it to an R7 variant at $220.
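As a back-of-the-envelope check on the bandwidth point, standard dual-channel DDR4 arithmetic (peak theoretical numbers, not measurements):

```python
def ddr4_bandwidth_gbs(mt_per_s, channels=2, bytes_per_channel=8):
    """Peak bandwidth = transfers/s x channels x 8 bytes per 64-bit channel."""
    return mt_per_s * 1e6 * channels * bytes_per_channel / 1e9

print(ddr4_bandwidth_gbs(3200))  # ~51.2 GB/s, shared between the CPU and Vega 11
print(ddr4_bandwidth_gbs(4000))  # ~64.0 GB/s, why DDR4-4000 support would help
# For comparison, a stock RX 550 gets ~112 GB/s of GDDR5 all to itself,
# so the iGPU falling behind at 1080p tracks with the bandwidth gap.
```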
|
# ? Feb 13, 2018 04:27 |
|
spasticColon posted:Not even with a very beefy aftermarket cooler? Bummer. They're pasted like modern Intel. I saw at least one remark along the lines of "can't wait for the delidding videos". Presumably they're thinking that if you really care about OC you shouldn't get an APU.
|
# ? Feb 13, 2018 04:38 |
|
Craptacular! posted:They're pasted like modern Intel. I saw at least one remark along the lines of "can't wait for the delidding videos". Presumably they're thinking that if you really care about OC you shouldn't get an APU. Speaking of: https://www.youtube.com/watch?v=Qtiiu7m6V4M
|
# ? Feb 13, 2018 05:01 |
|
https://youtu.be/FntY5rYR4cE The Hardware Unboxed review really is the most comprehensive one I've seen today. Really interesting, and I can't wait for their overclocking videos. The Fortnite result was interesting, but getting nearly 30% extra out of each GPU has me salivating.
|
# ? Feb 13, 2018 05:27 |
|
Has anyone explored the least important question of them all: buttcoins?
|
# ? Feb 13, 2018 05:56 |
|
the Steves are all right
|
# ? Feb 13, 2018 06:47 |
|
Malloc Voidstar posted:a 2008 post from x264 lead dev says Is that a "maximum number of threads that x264 will use", or is that a performance recommendation? I'm guessing the former, but it's unclear at what settings, because if you up the preset you can pretty much generate an arbitrary amount of work. I'd guess that searching for the optimal video encoding with B-frames and poo poo is pretty much NP-complete - it's worth going a reasonable way down the rabbit hole (veryslow/slowest), but boy howdy can you waste some CPU time if you want to go from 99.9% to 100% optimal. Paul MaudDib fucked around with this message at 06:59 on Feb 13, 2018 |
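On the threads-vs-logical-cores point, the commonly cited default for x264's threads=auto is 1.5x the logical CPU count. A small sketch assuming that heuristic (an approximation of x264's auto setting, not a guarantee for every build):

```python
import os

def x264_auto_threads(logical_cpus=None):
    """Approximate x264's threads=auto: 1.5x logical CPUs."""
    if logical_cpus is None:
        logical_cpus = os.cpu_count() or 1
    return logical_cpus * 3 // 2

print(x264_auto_threads(16))  # an R7 1700's 16 logical cores -> 24 threads
# Cross-checking against the resolution rule quoted earlier: 24 threads at
# 1080p gives 1080 / 24 = 45 rows per thread, inside the 40-50 guideline,
# so the auto default reads as a sane cap rather than a performance target.
```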
# ? Feb 13, 2018 06:50 |
|
Palladium posted:Has anyone explored the least important question of them all: buttcoins? Newegg is selling the APUs at a $30 markup for this reason.
|
# ? Feb 13, 2018 15:03 |
|
it lives!
|
# ? Feb 13, 2018 15:10 |
|
.....Not sure if memeing, or genuine mistake. Also, source?
|
# ? Feb 13, 2018 15:17 |
|
https://www.youtube.com/watch?v=FntY5rYR4cE
|
# ? Feb 13, 2018 15:20 |
|
Anime Schoolgirl posted:Newegg is selling the APUs at a $30 markup for this reason. Why would buttcoin miners buy these APUs? Pentiums and Celerons are still cheaper if they need cheap CPUs for their mining rigs.
|
# ? Feb 13, 2018 21:12 |
|
spasticColon posted:Why would buttcoin miners buy these APUs? Pentiums and Celerons are still cheaper if they need cheap CPUs for their mining rigs. Because an APU has a decent built-in GPU.
|
# ? Feb 13, 2018 21:16 |
|
|
|
Xae posted:Because an APU has a decent built-in GPU. But discrete GPUs would still be a lot better for buttcoin mining. 704 SPs at 1250 MHz on slower DDR4 RAM ain't going to mine a bunch of buttcoins like a dedicated video card would. spasticColon fucked around with this message at 21:25 on Feb 13, 2018 |
# ? Feb 13, 2018 21:21 |