|
Shadowplay records at a high bitrate because that's really what GPU encoding is actually good at
|
# ? Oct 7, 2017 18:20 |
|
|
22 Eargesplitten posted:
To record the whole match you need to manually start and stop recording.
|
# ? Oct 7, 2017 18:24 |
|
Acer Predator XB271HU abmiprz 1440p 144hz g-sync trip report
|
# ? Oct 7, 2017 19:16 |
|
Kramjacks posted:To record the whole match you need to manually start and stop recording. Thanks. Let me know if there's a different thread for recording stuff, but a couple questions. Is there a way to change the recording quality (not broadcast quality)? I don't want to try to upload a 30gb file to Youtube, that seems excessive for a 30 minute game. Also, is there a way to set the push to talk button to a mouse button instead of keyboard? I'm not planning on being a ~pro streamer~ or anything, someone just requested I upload a game with me dropping someone at 400m in PUBG and then getting 2 more wiggling half in cover trying to revive him. I want the option of uploading.
|
# ? Oct 7, 2017 19:47 |
|
Prescription Combs posted:Acer Predator XB271HU abmiprz 1440p 144hz g-sync trip report Isn't that the TN version?
|
# ? Oct 7, 2017 21:12 |
|
22 Eargesplitten posted:Thanks. Let me know if there's a different thread for recording stuff, but a couple questions. Is there a way to change the recording quality (not broadcast quality)? I don't want to try to upload a 30gb file to Youtube, that seems excessive for a 30 minute game. Also, is there a way to set the push to talk button to a mouse button instead of keyboard? Yes. Press Alt+Z and select Recording, then Customize for resolution and bitrate. Click the mic symbol for those options. You really should record high quality footage though, because you can with no performance loss, then compress it afterwards with something like Handbrake. You can basically not see the difference and make it 10 times smaller. You won't ever see file sizes all that small for real time recording. The feature you're using is the "DVR"-like function that saves the last X amount of time to save highlights. Actually a very cool feature, but not for recording games. If you have good upload then stream directly to Youtube and save a lot of hassle. However the quality suffers accordingly (max bitrate is like 18 whereas recording starts at 50+), but it's free to try and see if it's acceptable. 1gnoirents fucked around with this message at 21:20 on Oct 7, 2017 |
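To put rough numbers on the "record high, compress after" advice above, here's a quick sketch of the file-size arithmetic (the ~50 Mbps recording bitrate and ~10x compression ratio are taken from this post; treat them as ballpark figures, not exact Shadowplay settings):

```python
# Ballpark file-size math for game recordings. The bitrate and the
# ~10x compression ratio are assumptions from the thread, not
# measured Shadowplay/Handbrake figures.

def recording_size_gb(bitrate_mbps: float, minutes: float) -> float:
    """Size in GB of a constant-bitrate recording."""
    total_bits = bitrate_mbps * 1_000_000 * minutes * 60
    return total_bits / 8 / 1_000_000_000

raw = recording_size_gb(50, 30)   # 30-min match at ~50 Mbps -> ~11.25 GB
compressed = raw / 10             # after a ~10x Handbrake pass -> ~1.1 GB
```

which is why a re-encode before uploading makes a 30-minute match far more YouTube-friendly.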
# ? Oct 7, 2017 21:15 |
|
lDDQD posted:Isn't that the TN version? It's "144hz". It's really 165hz if you turn on some built-in overclocking, I guess. MagusDraco fucked around with this message at 22:19 on Oct 7, 2017 |
# ? Oct 7, 2017 21:35 |
|
So I recently bought a 1080ti Strix and it does something really annoying: one fan spins up and then spins down. It sounds like it's revving, almost. It's really annoying, since I thought the fans were either off or running. Is this normal?
|
# ? Oct 7, 2017 21:50 |
|
havenwaters posted:Nah that model sounds like the IPS one. It's "144hz" It's really 165hz if you turn on some built in overclocking I guess. It's the TN version. abmiprz
|
# ? Oct 7, 2017 21:52 |
|
1gnoirents posted:Yes. I've got something like 6mb up, so not great. Good to know about Handbrake compression. It looks like for who knows what reason Shadowplay doesn't let you map PTT to a mouse button, just a keyboard. I just set my mouse button to do double duty in Logitech's software. I still have to test it out though.
|
# ? Oct 7, 2017 21:54 |
|
Simple Simon posted:So I recently bought a 1080ti Strix and it does something really annoying and that is that one fan spins up and then spins down. It sounds like it's revving almost. I think that's a "semi-passive" fan profile that shuts the fans down when the temp is below 60 C. You should be able to disable it with the Strix software.
|
# ? Oct 7, 2017 22:30 |
|
lDDQD posted:Isn't that the TN version? Yah it is. Really not as bad as people make it out to be and I'm coming from a U2415.
|
# ? Oct 7, 2017 23:51 |
|
What is the best way to make sure my video card is running optimally? Like, I am not OCing it, but I want to make sure it is doing everything right; this card feels like it's underperforming.
|
# ? Oct 7, 2017 23:53 |
|
Knifegrab posted:What is the best way to make sure my video card is running optimally. Like I am not OCing it but I want to make sure it is doing everything right, this card feels like its under performing. Just run 3DMark or Timespy or something and post your score
|
# ? Oct 7, 2017 23:57 |
|
Knifegrab posted:What is the best way to make sure my video card is running optimally. Like I am not OCing it but I want to make sure it is doing everything right, this card feels like its under performing. MaxxBot posted:Just run 3DMark or Timespy or something and post your score The Graphics score verifies GPU/driver performance; it's a subscore if you scroll down. The main composite score is also useful for suggesting whether there's a CPU bottleneck (it's also broken out as a subgroup score below), though 3DMark in general is much more useful for the GPU score. If you don't know what 3DMark really is: Time Spy and Fire Strike are common tests that the 3DMark program contains.
|
# ? Oct 8, 2017 00:02 |
|
sauer kraut posted:Anything with grass makes hardware encoders weep. I don't have any idea what you're talking about.
|
# ? Oct 8, 2017 00:59 |
|
Prescription Combs posted:Yah it is. Really not as bad as people make it out to be and I'm coming from a U2415. Good TN panels these days are really not that bad at all, suffering mainly in the blacks. Colour shift is much better and you won't notice it if you use your monitor in a normal way. The Dell 24" 165hz G-Sync TN monitor (S2417DG) is on my radar for a monitor upgrade. Don't want bigger than 24 which rules out IPS+GSync+144, and the TN quality is supposedly great.
|
# ? Oct 8, 2017 01:02 |
|
Shrimp or Shrimps posted:Good TN panels these days are really not that bad at all, suffering mainly in the blacks. Colour shift is much better and you won't notice it if you use your monitor in a normal way. The black levels are outright better than IPS on my Dell 27 gsync. It's the overall color quality that suffers on good TN panels, but all the other faults are alleviated these days. The only better black levels I've seen are on OLED displays. *Coating aside, glossy typically looks better in terms of black levels even if it technically is not
|
# ? Oct 8, 2017 01:11 |
|
Measly Twerp posted:I don't have any idea what you're talking about. Man, something seems wrong there. Linus had his people do a bunch of testing and they found nvenc looking reasonable next to x264 at 'faster', and keep in mind the average streamer using x264 is still running worse than that at veryfast. These screaming 6/8 core CPUs of 2017 will be running it at fast or even medium over the next few years, but if your stream isn't actually making money (Twitch won't give you a sub button, your YouTube channel has fewer than 10,000 views, etc) then why bother.
|
# ? Oct 8, 2017 01:48 |
|
Craptacular! posted:Man, something seems wrong there. Linus had his people do a bunch of testing and they found nvenc looking reasonable next to x264 at 'faster', and keep in mind the average streamer using x264 is still running worse than that at veryfast. These screaming 6/8 core CPUs of 2017 will be running it at fast or even medium over the next few years, but if your stream isn't actually making money (Twitch won't give you a sub button, your YouTube channel has fewer than 10,000 views, etc) then why bother. Maybe it's AMD VCE. AMD's hardware encoder is godawful, last time I checked it was somewhere below Kepler-generation NVENC quality. I personally think that unless this is a source of income for you and/or you have a standalone encoding box (a beefy fileserver or whatever), NVENC is pretty much the best overall option. As things start to scale across cores better, CPU encoding is going to cause bigger and bigger impacts on your framerate since you're no longer just filling wasted cycles, you're competing with the game for CPU time. Paul MaudDib fucked around with this message at 06:22 on Oct 8, 2017 |
# ? Oct 8, 2017 03:03 |
|
Measly Twerp posted:I don't have any idea what you're talking about. That pic looks as good as DayZ plays
|
# ? Oct 8, 2017 04:35 |
|
Paul MaudDib posted:Maybe it's AMD VCE. AMD's hardware encoder is godawful, last time I checked it was somewhere below Kepler-generation NVENC quality. for most people this is true but have you seen quicksync encoding on a 7700k? 1080p 60fps is super sharp with no dropped frames. My cpu load is about 6%. Its using the onboard graphics to do this right? Does that mean any intel CPU with HD 630 would have similar results? Fauxtool fucked around with this message at 06:01 on Oct 8, 2017 |
# ? Oct 8, 2017 05:55 |
|
Fauxtool posted:for most people this is true but have you seen quicksync encoding on a 7700k? 1080p 60fps is super sharp with no dropped frames. My cpu load is about 6%. Its using the onboard graphics to do this right? Does that mean any intel CPU with HD 630 would have similar results? That's the exact same processor that LTT used in the video I linked (the 6900X also used doesn't have an IGP). It looked like rear end. The real question is how he got x264 to incur less of a performance hit than the hardware encoders. Everything I've read suggests the opposite.
|
# ? Oct 8, 2017 06:11 |
|
x264 isn't QSE though, and settings matter. x264 at default settings looks like ultra rear end too. On my 7700k specifically, it works really well for live streaming to Twitch. I also watched that video when it first came out, along with the similar Gamers Nexus one, and my experience has been different. Maybe new drivers or OBS updates fixed something. To be fair, none of the encoders actually do a good job of replicating exactly what my monitor is showing. They are all sorta bad, but in different ways. I think the best one is the one that uses whatever part of your PC can spare the power, and everyone is different. x264 also uses less CPU than QSE for me. Maybe it's being compressed better? Fauxtool fucked around with this message at 06:34 on Oct 8, 2017 |
# ? Oct 8, 2017 06:16 |
|
Shrimp or Shrimps posted:The Dell 24" 165hz G-Sync TN monitor (S2417DG) is on my radar for a monitor upgrade. Don't want bigger than 24 which rules out IPS+GSync+144, and the TN quality is supposedly great. Literally just bought the s2417dg. It's pretty sweet. Lucked out with an open box from Microcenter so I don't think I could get anything anywhere near the price range with 1440, gsync, and 144(165 oc)hz. BF1 is hilariously awesome to play now.
|
# ? Oct 8, 2017 18:14 |
I don't know if this is the GPU or what. I have a 1070, and when I was streaming something (minimal load, so it can't be power/voltage, even if I didn't have a high quality one (Corsair SFX 600W)) both of my monitors displayed a single color (one grey, the other an unsaturated green), but with the bottom half doing this flickering/static-y line effect on both monitors. The computer itself was running as normal for the entire duration, and after 5 or so seconds it returned to normal. I can't really find anything on the internet about someone else encountering this, and I've never seen it before. While I technically did have overclocks on, like I said I wasn't running anything strenuous and there was only a 0-20% load (and sub-1V voltages) during the incident. Could it be bad VRAM? If so, why would it be only temporary? There's nothing in the logs that I can find either.
|
|
# ? Oct 8, 2017 20:19 |
|
I am only a computer enthusiast of several decades with no official certification, but I'd test your system memory first. If only because running memtest a couple times only takes a few minutes and then removes that from the equation. Otherwise, uhhh, assume it was a poltergeist for now?
|
# ? Oct 8, 2017 20:26 |
|
One correction, running memtest a couple times takes way more than a few minutes
|
# ? Oct 8, 2017 20:39 |
|
Phoronix noticed that Vega's primitive binning can be toggled on/off with the Linux driver, and compared performance with it enabled and disabled: it makes basically no difference.
|
# ? Oct 8, 2017 20:46 |
|
https://www.youtube.com/watch?v=8OpdyP7amv8 Interesting ASUS Strix Vega 64 tear down by Steve "The Brick" Burke.
|
# ? Oct 9, 2017 02:32 |
|
Any thoughts on the differences between the MSI NVIDIA GTX 1070 Armor 8G OC and the Asus ROG Strix GeForce GTX1070-O8G? There is almost a 100€ difference on amazon.de right now. I assume part of it is the 2 vs. 3 fan setup, but is there anything else?
|
# ? Oct 9, 2017 10:32 |
|
refleks posted:Any thoughts on differences between MSI NVIDIA GTX 1070 Armor 8G OC and Asus ROG Strix GeForce GTX1070-O8G. There is almost a 100€ difference om amazon.de right now. I assume part of it is 2 vs. 3 fan setup, but is there anything else? At this point, forget both of them and wait for the 1070Ti at the end of the month...or buy EVGA and use "Step Up" when you get the chance.
|
# ? Oct 9, 2017 17:10 |
|
I currently have a 980ti and am thinking about flipping that card to upgrade to a 1080ti. I recently got a 4K monitor and want some extra performance to render at 4K for modern games. My case is pretty tight, and according to the Nvidia reference specs the size of the two cards is the exact same, so I think that solves one of my concerns. The second is my PSU. I'm looking at the EVGA GTX 1080 Ti Black Edition (no need for fancy OC components, lowest power draw, etc). They recommend 600W of system power, but my current PSU is 550W. I have the following PSU (EVGA SuperNOVA 550 GS 220-GS-0550-V1 80+ GOLD). My recollection from when I was super into PC hardware 5-10 years ago is that the quality of the PSU matters more than the wattage. Will my current PSU stand up to the task, or do I need a new one? https://www.newegg.com/Product/Product.aspx?Item=N82E16817438049&nm_mc=TEMC-RMA-Approvel&cm_mmc=TEMC-RMA-Approvel-_-Content-_-text-_-
|
# ? Oct 9, 2017 17:33 |
ryan_woody posted:I currently have a 980ti and am thinking about flipping that card to upgrade to a 1080ti. I recently got a 4K monitor and want some extra performance to render at 4K for modern games. Your PSU is completely fine, in practice the 1080ti draws less power than the 980ti and the 600W recommendation is assuming that the PSU in question is some cut rate garbage that you bought for $15.
|
|
# ? Oct 9, 2017 18:14 |
|
On the other hand, it is a good idea to not keep a PSU for more than 5-7 or so years, as even the high quality ones can deteriorate. But 550 should be fine for what you want.
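For anyone wanting to sanity-check that 550 W figure themselves, here's a back-of-the-envelope worst-case sum (the per-component wattages below are rough assumptions for this kind of build, not measured values):

```python
# Rough worst-case DC load vs. PSU rating. All component wattages
# are ballpark assumptions, not measured figures.

gpu_w = 250    # stock GTX 1080 Ti board power (reference TDP)
cpu_w = 100    # a quad-core under heavy gaming load, roughly
rest_w = 50    # motherboard, RAM, drives, fans
psu_w = 550    # the EVGA SuperNOVA 550 GS from the post

load_w = gpu_w + cpu_w + rest_w   # ~400 W worst case
headroom_w = psu_w - load_w       # ~150 W to spare
```

Even with everything pegged, a quality 550 W unit sits around 70-75% of its rating, which is a comfortable place for it to run.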
|
# ? Oct 9, 2017 19:15 |
|
AVeryLargeRadish posted:Your PSU is completely fine, in practice the 1080ti draws less power than the 980ti and the 600W recommendation is assuming that the PSU in question is some cut rate garbage that you bought for $15. Volguus posted:On the other hand, it is a good idea to not keep a PSU for more than 5-7 or so years, as even the high quality ones can deteriorate. But 550 should be fine for what you want. I've only had it about 2.5 years, so it sounds like I'm good. Thank you, both!
|
# ? Oct 9, 2017 19:36 |
|
ryan_woody posted:I've only had it about 2.5 years, so it sounds like I'm good. Thank you, both! My total system power draw from the wall is 330 watts with max overclock on everything with a 1080ti and 6600k. Depending on your CPU you could run two 1080tis. I wouldn't recommend it, but it's kind of amazing
|
# ? Oct 9, 2017 20:01 |
|
My 7700k non-overclocked with AMD Rx480 while benchmarking runs around 450-500w at the plug.
|
# ? Oct 9, 2017 20:03 |
|
redeyes posted:My 7700k non-overclocked with AMD Rx480 while benchmarking runs around 450-500w at the plug. So probably 375-450w actual psu load.
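The wall-to-load conversion above can be sketched like this (the 90% efficiency figure is an assumption for an 80 Plus Gold unit at mid-range load; actual efficiency varies with load and unit):

```python
# Convert AC wall draw to the DC load the PSU actually delivers.
# The 0.90 efficiency is an assumed figure for an 80 Plus Gold
# unit around 50-70% load, not a measured value.

def dc_load_w(wall_watts: float, efficiency: float = 0.90) -> float:
    """DC power delivered to components for a given AC wall draw."""
    return wall_watts * efficiency

low = dc_load_w(450)    # ~405 W
high = dc_load_w(500)   # ~450 W
```

so a 450-500 W wall reading works out to roughly 405-450 W of actual PSU load, in the same ballpark as the estimate above.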
|
# ? Oct 9, 2017 20:14 |
|
|
The latest NVIDIA drivers fix Forza 7 performance. Computerbase has new benchmarks for the 1060 and 1080, basically back where you'd expect, with 1080 on par with Vega 64 and 1060 on top of Fury X and 580. Guess NVIDIA was dropping the ball there.
|
# ? Oct 9, 2017 20:42 |