tehinternet
Feb 14, 2005

Semantically, "you" is both singular and plural, though syntactically it is always plural. It always takes a verb form that originally marked the word as plural.

Also, there is no plural when the context is an argument with an individual rather than a group. Somfin shouldn't put words in my mouth.
Yeah, same. I’ve got an i5-4670K and have only felt the urge to upgrade now that I’ve gone from 1440p @ 60 Hz to 1440 UW @ 120 Hz. Still got 1333 MHz DDR3 RAM too. Probably gonna pull the trigger on a whole new motherboard/CPU/RAM whenever the replacement for the i9-9900 or whatever comes out.

E: oh yeah went from a 2GB 770 in 2013 to a 1070 to a 1080 Ti in 2017 (thanks buttcoin).


DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness
I went from a 2500k -> 5820k, and from all I've seen so far, I should be able to ride that guy for at least another 2-3 GPU generations.

EdEddnEddy
Apr 5, 2012



tehinternet posted:

Yeah, same. I’ve got an i5-4670K and have only felt the urge to upgrade now that I’ve gone from 1440p @ 60 Hz to 1440 UW @ 120 Hz. Still got 1333 MHz DDR3 RAM too. Probably gonna pull the trigger on a whole new motherboard/CPU/RAM whenever the replacement for the i9-9900 or whatever comes out.

E: oh yeah went from a 2GB 770 in 2013 to a 1070 to a 1080 Ti in 2017 (thanks buttcoin).

Jeeze, if anything, strap a good cooler on that CPU and OC it to ~4.6 GHz, and get some 2133 MHz DDR3; you should see a nice boost from that alone with your current setup.


Thinking about finding a home for my 980 Ti, and I remembered my wife's dad is an avid gamer, but on an older Ivy Bridge i5 K series and an AMD 370, I believe.

I happen to have a brand new H80 water cooler, my 980Ti, and if he needs it, a 650 or 680W PSU to spare. Might make for a solid upgrade on his end as he does enjoy some Destiny 2.

Crap, thinking about it, does anyone have an old X79 motherboard they aren't using that needs a new home? I have my old 3930K lying around after thinking I wore it out (it was just a bad PSU & UPS), and while the chip is good, the options for a motherboard are either Chinese garbage boards or overpriced used boards on eBay, unfortunately.

EdEddnEddy fucked around with this message at 19:42 on Mar 5, 2019

isndl
May 2, 2012
I WON A CONTEST IN TG AND ALL I GOT WAS THIS CUSTOM TITLE

EdEddnEddy posted:

Guess you can just say that the Core architecture was drat good.

Except for all the security holes popping up. They just found another one that affects the entire Core family and might not be fixable in software. :downs:

Or the other perspective, where Core looks great only because Bulldozer was so bad and Intel got to milk it for years.

EdEddnEddy
Apr 5, 2012



isndl posted:

Except for all the security holes popping up. They just found another one that affects the entire Core family and might not be fixable in software. :downs:

Or the other perspective, where Core looks great only because Bulldozer was so bad and Intel got to milk it for years.

True, but unlike the P4 days and older, where the OS itself made the system seem slow within a month or a year of purchase, from Core 2 on the performance has been good enough that you can still use, say, an E6600 system as a simple work PC as long as you have 4GB+ RAM and an SSD.

I do, however, dislike how Intel milked the crap out of it and made pretty much any upgrade after Sandy Bridge pretty negligible until only recently. Took AMD lighting a fire under their rear end, like I expected it would. :/

Cygni
Nov 12, 2005

raring to post

isndl posted:

Or the other perspective, where Core looks great only because Bulldozer was so bad and Intel got to milk it for years.

yes core was secretly super bad but nobody noticed cause every other cpu ever made was even worse and everyone coulda made a better cpu architecture but they were like too busy or something

EdEddnEddy
Apr 5, 2012



I know they aren't all great games, but drat, Nvidia now offers Anthem, BF5, and Metro Exodus with new 2080/2080 Ti purchases. Too bad it's through the stupid partners and not Nvidia themselves. Dang it Frys!

If anyone doesn't care for the first two games and/or has a spare key they would be willing to trade/sell, hit me up.

Zigmidge
May 12, 2002

Exsqueeze me, why the sour face? I'm here to lemon aid you. Let's juice it.
Thanks for the advice yesterday. I've figured some things out and have a new question. First, there was no throttling going on; I was just lazily misreading the clock and voltage outputs. I'll say what was happening in a minute. Second, 60C just happens to be a very large plateau of stability where my fan keeps up with the heat output. With the fans turned off they get up to about 68-69C and stabilize there. Did EVGA create really good heatsinks??

So, first, sorry for the bad original question. I do have a new one: what's going on with the clock and voltage is that they jitter between two numbers. Depending on the step, the jitter difference can be between 13 and 18 MHz. Given that you usually have to set the clock offset in increments of 15 to see any changes in reported clocks, is that just the sensors doing their best to report a clock/voltage that's actually somewhere in between, or something along those lines?
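That lines up with how these cards behave: boost clocks move in discrete ~15 MHz bins, so an operating point sitting between two bins gets reported as one neighbor or the other depending on when the sensor samples. A toy Python sketch of the idea (the 1911 MHz "true" clock is made up for illustration):

```python
STEP = 15  # approximate clock bin granularity on recent NVIDIA cards, in MHz

def snap(clock_mhz: float, step: int = STEP) -> int:
    """Snap a clock value to the nearest discrete bin."""
    return round(clock_mhz / step) * step

# An operating point sitting between two bins will be reported as either
# neighbor depending on the sampling instant, so the readout jitters.
true_clock = 1911.0
low = snap(true_clock - STEP / 2)
high = snap(true_clock + STEP / 2)
print(low, high, high - low)  # 1905 1920 15
```

The two reported values always land one bin apart, which is consistent with the ~13-18 MHz jitter you're seeing.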

fknlo
Jul 6, 2009


Fun Shoe

craig588 posted:

Cooling can surprisingly be affected. Just a few degrees, but given that people will spend over 100 dollars on coolers, a cheap stabilizing option might do well. There was that jack-like thing someone made for a little while that pushed up from the bottom of the case; apparently not enough people cared, because I can't find it now, but there are a ton of cheap options on Amazon: https://www.amazon.com/s?k=videocard+sag

I'll see how it does when raiding tonight, but my card is absolutely running a few degrees cooler while just farting around in WoW after installing the uphere brace. I was not expecting that.

Phone
Jul 30, 2005

親子丼をほしい。
Am I missing something with Precision X1?

I set the fan curve to be quiet, apply, save, leave the app open... everything is nice and quiet. Close the app, and the fans immediately spin up to 2500 rpm according to the stock profile. Do I have to run X1 all of the time in order to control the fans?

EdEddnEddy
Apr 5, 2012



Phone posted:

Am I missing something with Precision X1?

I set the fan curve to be quiet, apply, save, leave the app open... everything is nice and quiet. Close the app, and the fans immediately spin up to 2500 rpm according to the stock profile. Do I have to run X1 all of the time in order to control the fans?

If it is anything like MSI Afterburner or their Precision XOC or whatever app, yes.

You usually have it boot on startup, then start minimized, which hides it in the taskbar and handles both overclocking and fan curves for you.

orcane
Jun 13, 2012

Fun Shoe
There are ways to dump your GPU BIOS, edit the fan curve and flash it back. But uuuh, just put the OC-tools into autostart unless you want to risk producing an expensive piece of electronic scrap :v:

craig588
Nov 19, 2005

by Nyc_Tattoo
You can't edit the BIOS since Maxwell. I used to do that so I wasn't running any extra programs. To get really risky: I don't think a single Maxwell BIOS has gone bad from too many flashes. Sure, it's possible to wear out flash, but I wasn't able to find any reports of it happening, and the flash might be good for thousands of writes; even my 980, which I messed with a lot, only had like 2 dozen flashes on it as I tried lots of things.
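As a back-of-envelope check on those numbers (the endurance figure below is a deliberately conservative assumption for cheap SPI flash, not a datasheet value):

```python
# Rough wear estimate: fraction of a BIOS chip's rated write endurance
# consumed by repeated flashes. Endurance is assumed, not a vendor spec.
ENDURANCE_CYCLES = 1_000   # conservative guess; many parts are rated far higher
flashes = 24               # "like 2 dozen flashes" on the 980

fraction_used = flashes / ENDURANCE_CYCLES
print(f"{fraction_used:.1%} of rated endurance used")  # 2.4% of rated endurance used
```

Even with the pessimistic 1,000-cycle figure, a couple dozen flashes barely registers.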

craig588 fucked around with this message at 23:32 on Mar 5, 2019

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map
Even the cheapest of flash memory for long-term storage like that tends to be rated for dozens, if not hundreds, of writes if your data size is small in proportion to the package.

They're also often built with graceful failure modes if the unthinkable happens, like locking out writes only and continuing to be readable.

I was never really happy with Maxwell's scaling on my 980 Ti with unlocked voltage, though, FWIW. Could be because it was a refurb card.

EdEddnEddy
Apr 5, 2012



Has anyone actually hit that flash burn-out point on some old device by flashing it too many times? I know some SD cards or SSDs may have died from something like that, but it seems more firmware/controller related than actual flash death from what I have seen/heard.


Also, if anyone still has their Korean DVI 1440p special monitor and is concerned about using it with a 20XX series card without a DVI port, this DP to DVI adapter does work right out of the box. The included HDMI to DVI adapter that came with the card also works with a bit of pixel clock unlocking, but then if your other display is DP, you get no video on POST until you boot up, which can be an inconvenience. This dongle arrived today and so far works like a charm. You're stuck at 60 Hz, but for a secondary display it's better than nothing.

EdEddnEddy fucked around with this message at 03:38 on Mar 6, 2019

Phone
Jul 30, 2005

親子丼をほしい。
Thanks for the input re: PX1.

Another dumb question! Both MPC-HC and PotPlayer are dropping frames if I watch a movie on the second monitor while playing a game on the first (in borderless windowed)... I don't know how or why this is happening, unless I'm drastically underestimating how powerful the GTX 970 was? Dropping the resolution or disabling G-Sync didn't help with the dropped frames at all.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
Are you at 100% CPU or GPU utilization?

Capping your framerate is likely the only real solution, but for this specific scenario doing it with RTSS is bad because RTSS eats idle CPU to cap framerate.

Phone
Jul 30, 2005

親子丼をほしい。
I was hovering around 93% GPU utilization, but it happens when only the old monitor (24" 60Hz) is involved, too. But it's at 93% utilization whether I'm running the game at 1920x1200 or 3440x1440.

Looks like dropping the refresh rate of the ultrawide from 120Hz to 60Hz might have fixed it?

e: just dropping fewer frames and the jitter isn't all over the place; still stuttering a bit, but not a lot

Phone fucked around with this message at 04:37 on Mar 6, 2019

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
Oh I missed that you were using borderless. That's 100% the problem. Use exclusive fullscreen mode and it will probably work fine. Windows is not OK with a borderless fullscreen on one monitor and anything dynamic on another monitor.

Phone
Jul 30, 2005

親子丼をほしい。
I think it's more that I'm capping out performance on the 2070; reducing the refresh rate worked fine. I tried fullscreen and it didn't improve anything for video playback.

sauer kraut
Oct 2, 2004
Watch movies or browse on a cheapo Chromebook or something; running anything but the plainest-jane Windows apps on a side monitor next to games is asking for trouble.

sauer kraut fucked around with this message at 05:53 on Mar 6, 2019

Phone
Jul 30, 2005

親子丼をほしい。
I've been doing it for the past forever; it's just the double whammy of way more pixels to push and >60fps framerates. It also didn't help that I was looking at going to 2560x1440 and was reading benchmarks concerning that resolution... before going with a 3440x1440 monitor and straight up ignoring that 900-ish horizontal pixels is kind of a lot.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Phone posted:

I was hovering around 93% GPU utilization, but it happens when only the old monitor (24" 60Hz) is involved, too. But it's at 93% utilization whether I'm running the game at 1920x1200 or 3440x1440.

Looks like dropping the refresh rate of the ultrawide from 120Hz to 60Hz might have fixed it?

e: just dropping fewer frames and the jitter isn't all over the place; still stuttering a bit, but not a lot

I had a similar problem, and my fix was to go into Chrome/FF/VLC/etc and disable hardware GPU acceleration. It doesn't take much CPU to play normal (non-4k) videos, and it fixed the issue right up for me.

Phone
Jul 30, 2005

親子丼をほしい。

DrDork posted:

I had a similar problem, and my fix was to go into Chrome/FF/VLC/etc and disable hardware GPU acceleration. It doesn't take much CPU to play normal (non-4k) videos, and it fixed the issue right up for me.

This is going to be my next step. After years of hardware acceleration, I need to figure out how to disable it all.

EdEddnEddy
Apr 5, 2012



I had a similar problem with my 980 Ti using 100 Hz G-Sync and watching video on a non-G-Sync screen.

I believe bringing them both to matching refresh rates helped there; now on my 2080 I don't seem to have that issue, even though one is 100 Hz and the other is 60 Hz.


Another nice perk coming from the 980 Ti to the 2080 is that I can now run Shadowplay with practically no performance penalty in games like Sea of Thieves. Now if only I could get gud, or find PST goons.

Cygni
Nov 12, 2005

raring to post

$660 3-fan 2080, with BFV, Anthem, and Metro Exodus. Pretty tasty deal as far as the high end goes.

https://www.theverge.com/good-deals/2019/3/6/18253134/devil-may-cry-5-microsd-cards-rtx-2080-amazon-deals-sale

(you have to use the verge link to get the deal i guess)

moot the hopple
Apr 26, 2008

dyslexic Bowie clone

Cygni posted:

$660 3-fan 2080, with BFV, Anthem, and Metro Exodus. Pretty tasty deal as far as the high end goes.

https://www.theverge.com/good-deals/2019/3/6/18253134/devil-may-cry-5-microsd-cards-rtx-2080-amazon-deals-sale

(you have to use the verge link to get the deal i guess)

Thanks, this was about the price I was looking to pay for one of these :tipshat:

OtherworldlyInvader
Feb 10, 2005

The X-COM project did not deliver the universe's ultimate cup of coffee. You have failed to save the Earth.


Where/how do I actually buy a used 1070 Ti without getting scammed? I know EVGA, Gigabyte, and MSI have transferable warranties via serial number. What's a fair price for them these days?

iastudent
Apr 22, 2008

How much should I be looking to pay for a Vega 56/64?

Ihmemies
Oct 6, 2012

orcane posted:

There are ways to dump your GPU BIOS, edit the fan curve and flash it back. But uuuh, just put the OC-tools into autostart unless you want to risk producing an expensive piece of electronic scrap :v:

Nvflash is one BIOS flashing tool. I think you can just reflash the original BIOS if the modded one bricks the card? You just need another GPU for it.

https://www.techpowerup.com/download/nvidia-nvflash/
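For anyone trying this, a typical nvflash round-trip looks roughly like the sketch below. Flag spellings vary between nvflash versions and vendor builds, so treat this as a sketch and check `nvflash --help` on your copy first.

```shell
# 1. Back up the BIOS currently on the card before touching anything.
nvflash --save original.rom

# 2. Disable the EEPROM write protect, then flash the modified image.
#    -6 overrides the PCI subsystem ID mismatch check, which comes up
#    when the image was built for a different board vendor.
nvflash --protectoff
nvflash -6 modified.rom

# 3. If the mod misbehaves, boot from another GPU (or the iGPU)
#    and restore the backup the same way.
nvflash -6 original.rom
```

Keep the backup somewhere other than the machine you're flashing, and reboot after each flash.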

craig588
Nov 19, 2005

by Nyc_Tattoo
https://www.overclock.net/forum/71-...ti-titan-x.html
Here's the edited tool that allows modified BIOSes. It says Pascal support, but it's only partial: BIOSes still need to be signed, but it lets you swap manufacturers.

Cygni
Nov 12, 2005

raring to post

iastudent posted:

How much should I be looking to pay for a Vega 56/64?

You should prolly buy this one that keeps popping up on sale if you are after Vega. $259 after rebate:

https://www.newegg.com/Product/Product.aspx?Item=N82E16814137263

The blower fan isn't ideal, but that price IS ideal.

VelociBacon
Dec 8, 2009

Crossposting from the OC thread:

New build of MSI Afterburner (4.6.0) is out with some improvements.

MSI posted:


What's New:

  • Includes full RVII support, including overclocking (which was missing in the previous early b15 build).
  • Now you can use and + keys to select the next/previous point on the curve. This feature is useful for those who prefer to use keyboard / keys to fine-tune the selected point's frequency offset instead of adjusting the point with the mouse cursor. I'm not sure if many of you know that, but I'll remind you that you can also hold while fine-tuning the frequency offset with / keys to jump to the nearest frequency value rounded to 10 MHz.
  • Now you can press key to edit the selected point's frequency offset from the keyboard. Alternately, you may press + to specify an absolute target frequency value, so the editor will calculate the desired offset automatically.
  • Slightly changed the keyboard control interface for the AMD implementation of the V/F curve editor. Previously you could select a P-State and fine-tune frequency with / or +/+ keys, or fine-tune voltage with / or +/+ keys. Now both frequency and voltage are adjusted with / or +/+ keys, and voltage or frequency keyboard input focus is selected with / keys. Voltage or frequency keyboard input focus is now highlighted by the selected point's brightness on the curve. Keyboard input focus also affects the new / + functionality, allowing you to type in a target voltage or frequency in either offset or absolute form.
  • Similar to the NVIDIA implementation, now you may also hold when dragging a V/F curve point on AMD systems. This will result in moving the whole curve up/down while preserving each point's offset.
  • Added undo/redo support to the voltage/frequency curve editor:
    - The editor now saves up to the last 1024 states of the curve during editing and allows you to undo/redo changes with + or + / ++ keys.
    - Undo history is discarded when you apply the curve, forcibly reread it from hardware with key, or switch to a different GPU on a multi-GPU system.
    - The number of recorded undo/redo steps is displayed in square brackets in the editor window caption.
  • The application properties dialog window is now displayed with topmost style when the properties are activated from the detached monitoring window and "Always on top" mode is enabled for it.
  • NVIDIA Scanner components are updated to the latest version; the NVML.dll issue should be addressed now.

e: Editing to say that it now lets me increase the voltage on my EVGA 2080 Ti, something I couldn't do before with Afterburner. The way it refers to voltage is different than previously, but it seems to work; same vcore measured with X1 and Afterburner set up like the screenshot. Thank god I can finally get rid of Precision X1.

VelociBacon fucked around with this message at 20:21 on Mar 7, 2019

Worf
Sep 12, 2017

If only Seth would love me like I love him!

sauer kraut posted:

Try setting the 144fps monitor to 120 and reboot.

just got around to trying this and no dice :(

I should mention that I'm running a 1080p, a 1440p, and a 4K monitor (idk if that matters)

thank you for the advice though

Indiana_Krom
Jun 18, 2007
Net Slacker

Statutory Ape posted:

just got around to trying this and no dice :(

I should mention that I'm running a 1080p, a 1440p, and a 4K monitor (idk if that matters)

thank you for the advice though

Plug whatever monitor doesn't need your main GPU's power into the iGPU if you have one. Nvidia cards do not idle if they have 3 or more displays connected (refresh rate is irrelevant).
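If you want to confirm the card is actually dropping to idle after moving a display off it, you can ask the driver for its performance state; P8 is the deep idle state on most GeForce parts, while sticking at P0/P2 means something is holding the clocks up. (Field names below are from recent nvidia-smi builds.)

```shell
# Report performance state, current graphics clock, and power draw once per second.
nvidia-smi --query-gpu=pstate,clocks.gr,power.draw --format=csv -l 1
```

Watch it for a minute with the desktop idle; the pstate should fall to P8 and the graphics clock to a few hundred MHz.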

Worf
Sep 12, 2017

If only Seth would love me like I love him!

Indiana_Krom posted:

Plug whatever monitor doesn't need your main GPU's power into the iGPU if you have one. Nvidia cards do not idle if they have 3 or more displays connected (refresh rate is irrelevant).

Ok, so I did this, and the card settled down. I made sure the iGPU was enabled in the BIOS (and then plugged the monitor into the mobo), but the computer wasn't recognizing it. Just for full disclosure, it's a DVI (monitor side) to HDMI going into the mobo, not that I would think it should matter. I have an 8700K, which afaik has a UHD 630 or something for graphics, so :shrug:

E: literally as I typed this post the monitor came on on its own :shrug: lmao

E2: thank you btw. Also, now that I think back, I seem to remember that exact advice being given either ITT or in another thread I follow here

Worf fucked around with this message at 00:37 on Mar 8, 2019

Fuzz1111
Mar 17, 2001

Sorry. I couldn't find anyone to make you a cool cipher-themed avatar, and the look on this guy's face cracks me the fuck up.

EdEddnEddy posted:

Has anyone actually hit that Flash burn out point on some old device by flashing it too many times? I know some SD cards or SSD's may have died from something like that, but it seems more Firmware/Controller related then actual flash death from what I have seen/heard.
I've got 3 IP cameras streaming to an SD card in a BeagleBone Black (like a Raspberry Pi) and I've gone through 2 in 4 years. In both cases they would lock up if you wrote to them (and require power cycling the card), but reading was fine.

Wasn't all that graceful though, because both had damaged filesystems and I had to image them to a file, then repair the filesystem in the image file, in order to get all the data back.

disaster pastor
May 1, 2007


Statutory Ape posted:

E2: thank you btw. Also, now that I think back, I seem to remember that exact advice being given either ITT or in another thread I follow here

It might have been the advice given to me. Which is kinda funny, because I had to revert to not using the iGPU and plug the extra monitor back into the GPU: with it plugged into the motherboard, neither monitor would stay in power saver; they'd go off and immediately come back on.

Malloc Voidstar
May 7, 2007

Fuck the cowboys. Unf. Fuck em hard.
If you have an Asus ROG Strix 2080 Ti and you really like overclocking, a 1000W power limit BIOS leaked for it.


Seamonster
Apr 30, 2007

IMMER SIEGREICH
Don't forget the water. 1000W, yeesh.
