Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

particle9 posted:

I read that the 970 is hosed in some weird way, the memory is bullshit? What is the deal with that?

Maxwell lets NVIDIA disable the ROP/L2 partitions in 1/8th increments without disabling a whole processing unit (unlike previous chips). The side effect is that accessing the last 1/8th of memory totally fucks bandwidth (the last memory controller shares its crossbar/L2 port with one of the fast partitions, so servicing the slow 512MB ties up that port) - meaning you access EITHER the fast 3.5GB segment OR the slow 512MB segment, but NOT both at the same time. Potentially this can gimp memory bandwidth beyond belief if there's frequent random access into the slow segment - you should not do >3.5GB-sized compute on a 970. On the other hand, NVIDIA tweaks the game profiles to figure out which resources get accessed the least and stick those in the slow segment. Plus, like AMD, they hand-tweak all their drivers to fix the bullshit that game developers try to get away with. If you're not running a 1440p/SLI 1080p/4K/beyond rig that consumes >3.5GB of memory, this works pretty OK.
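
Not from the post above, but here's a rough sketch of the "don't do >3.5GB compute on a 970" advice in CUDA terms. The 3.5 GiB fast-segment size and the 256 MiB headroom are assumptions for a reference 970, not values you can query from the driver:

```cuda
// Minimal sketch (assumptions noted above): size a compute working set so it
// stays inside the fast 3.5 GiB segment of a GTX 970 instead of spilling
// into the slow 0.5 GiB segment.
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    size_t free_bytes = 0, total_bytes = 0;
    cudaMemGetInfo(&free_bytes, &total_bytes);   // what the driver reports right now

    const size_t fast_segment = 3584ull << 20;   // 3.5 GiB: GTX 970-specific assumption
    const size_t reserve      = 256ull << 20;    // headroom for display/driver (assumed)

    size_t budget = (fast_segment > reserve) ? fast_segment - reserve : 0;
    if (budget > free_bytes) budget = free_bytes; // never ask for more than is actually free

    printf("total %zu MiB, free %zu MiB, compute budget %zu MiB\n",
           total_bytes >> 20, free_bytes >> 20, budget >> 20);

    // Allocate against 'budget' and tile anything larger, so a >3.5 GB workload
    // gets chunked rather than landing in the slow segment.
    void* d_buf = nullptr;
    if (cudaMalloc(&d_buf, budget) == cudaSuccess) {
        // ... launch kernels over d_buf in tiles ...
        cudaFree(d_buf);
    }
    return 0;
}
```

Games obviously can't do this themselves - they go through the driver - which is why the profile tweaking described above matters.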

If you haven't read that link, it's a concise explanation of the driver problems and the effort that goes into optimizing them away.

Paul MaudDib fucked around with this message at 07:38 on Jun 5, 2015

Ragingsheep
Nov 7, 2009

particle9 posted:

I read that the 970 is hosed in some weird way, the memory is bullshit? What is the deal with that?

The last 0.5GB of VRAM is slower than the first 3.5GB (but still way faster than accessing system RAM). It has some impact if you're maxing out VRAM by rendering at 4K but minimal impact for regular users.

particle9
Nov 14, 2004
In the guide to getting dumped, this guy helped me realize that with time it does get better. And yeah, he did get his custom title.
Okay that's interesting. Will have to read that.

Also, for anyone who is set on the EVGA 980 Ti: Newegg is only listing them as a combo with this power supply: http://www.newegg.com/Product/ComboBundleDetails.aspx?ItemList=Combo.2383062 The card is still available if you don't mind buying a cheap PSU along with it.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Ragingsheep posted:

The last 0.5GB of VRAM is slower than the first 3.5GB (but still way faster than accessing system RAM). It has some impact if you're maxing out VRAM by rendering at 4K but minimal impact for regular users.

System RAM bandwidth doesn't really matter, since the amount of CPU <-> GPU traffic is normally negligible - at 4K you can go from CrossFired PCIe 3.0 x4 to CrossFired PCIe 3.0 x16 (i.e. a factor-of-4 increase, from garbage mATX-second-slot performance to top of the line) and see a negligible performance increase. During normal gameplay the amount of data changed per frame is just not that big. Until, of course, VRAM usage crosses the critical point and you start thrashing data transfers across the bus, at which point you go from [data changed per frame] to [a non-trivial chunk of the data required to render the frame + data changed per frame].
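
To put rough numbers on "data changed per frame is just not that big," here's a CUDA timing sketch. The 32 MB "per-frame update" and 1 GB "spilled working set" sizes are made-up illustrative figures, not measurements from anyone in this thread:

```cuda
// Rough sketch: time host->device copies of a small "per-frame update" vs. a
// large "thrashed working set" to see why PCIe bandwidth only starts to hurt
// once VRAM overflows. Buffer sizes are illustrative assumptions.
#include <cstdio>
#include <cuda_runtime.h>

static float time_h2d_copy(void* dst, const void* src, size_t bytes) {
    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);
    cudaEventRecord(start);
    cudaMemcpy(dst, src, bytes, cudaMemcpyHostToDevice);
    cudaEventRecord(stop);
    cudaEventSynchronize(stop);
    float ms = 0.0f;
    cudaEventElapsedTime(&ms, start, stop);
    cudaEventDestroy(start);
    cudaEventDestroy(stop);
    return ms;
}

int main() {
    const size_t small = 32ull << 20;   // ~32 MB of per-frame updates (assumed)
    const size_t large = 1ull << 30;    // ~1 GB spilled working set (assumed)

    void *h_buf, *d_buf;
    cudaMallocHost(&h_buf, large);      // pinned host memory for a fair measurement
    cudaMalloc(&d_buf, large);

    float ms_small = time_h2d_copy(d_buf, h_buf, small);
    float ms_large = time_h2d_copy(d_buf, h_buf, large);

    printf("per-frame-sized copy: %.2f ms (%.1f GB/s)\n",
           ms_small, small / ms_small / 1e6);
    printf("spilled-working-set copy: %.2f ms (%.1f GB/s)\n",
           ms_large, large / ms_large / 1e6);
    // At 60 fps you only have ~16.7 ms per frame; the large copy alone can blow
    // through that budget even though the bus "bandwidth" looks healthy.

    cudaFree(d_buf);
    cudaFreeHost(h_buf);
    return 0;
}
```

The point being: the bus is a non-issue until the working set stops fitting and the big copies start happening every frame.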

It also doesn't tell the whole story - bandwidth isn't the same thing as latency. The fact that system memory can sustain a given bandwidth (especially when it's mostly idle) says absolutely nothing about how a given system performs. Taken to an extreme, let's say the system memory is sitting in New York and we transfer the data to the GPU in LA in a 747 full of LTO-5 tapes. The overall bandwidth is 245 GB/s, which is more than the GTX 970 can do from its on-card memory (224 GB/s), but the 747 takes 6 hours to fly them over. Is that system going to perform better or worse at gaming than a GTX 970 sitting right on your desk?
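
For the curious, the 245 GB/s figure checks out with some assumed numbers (the tape count is my guess; the post only gives the end result):

```latex
% Back-of-the-envelope check of the 747 example; ~3,500 tapes is an assumption.
\[
\text{bandwidth} \approx \frac{3500 \times 1.5\ \text{TB (LTO-5, native)}}{6 \times 3600\ \text{s}}
 = \frac{5.25 \times 10^{15}\ \text{B}}{21\,600\ \text{s}}
 \approx 243\ \text{GB/s},
\qquad
\text{latency} = 6\ \text{h} \approx 1.3 \times 10^{6}\ \text{frames at 60 fps}.
\]
```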

This is an issue the 970 potentially wins out on - on-card memory beats system memory on latency. But having a slow segment of memory is a hard thing to manage, depending on how you report it. If you report 3.5GB to games and manage the extra 0.5GB behind their back as an L3-style cache, that's fine. If you expose it to them and are reasonably successful at steering the rarely-touched data into it, that's fine too. But if you gently caress up and start to thrash system memory, you're gonna have a bad time, because the latency is garbage on the scale required to render a frame. That's where the 970's microstutter thing starts to happen: you use more than 3.5GB and the high-level (pre-Mantle/Vulkan/DX12) APIs/drivers don't handle it well.

It's not a big deal with a single card driving 1080p, but it does become an issue at 1440p and above and with SLI rigs.

Paul MaudDib fucked around with this message at 08:45 on Jun 5, 2015

The Deadly Hume
May 26, 2004

Let's get a little crazy. Let's have some fun.
I guess the other thing for people to think about is if they're going to get the Rift when the consumer version finally pops out - they've stated the GTX 970 as the basic requirement which sounds reasonable, but I reckon that you're absolutely going to want more crank to get the FPS up to forestall nausea issues. The 980Ti will probably be in a good range for that.

LiquidRain
May 21, 2007

Watch the madness!

The Deadly Hume posted:

I guess the other thing for people to think about is if they're going to get the Rift when the consumer version finally pops out - they've stated the GTX 970 as the basic requirement which sounds reasonable, but I reckon that you're absolutely going to want more crank to get the FPS up to forestall nausea issues. The 980Ti will probably be in a good range for that.
We are potentially a full year away from a Rift. They've said 2016, or early 2016. For all we know Pascal may be around the corner by then. I'm not going to try future-proofing my decision now. I'll decide on what to get for an Oculus when the Oculus is out. Or I'll stick with my 970 and just turn down the options, god forbid.

I don't need everything on ultra and I'm perfectly happy saving my money and turning a few sliders down.

LiquidRain fucked around with this message at 09:03 on Jun 5, 2015

The Deadly Hume
May 26, 2004

Let's get a little crazy. Let's have some fun.

LiquidRain posted:

We are potentially a full year away from a Rift. They've said 2016, or early 2016. For all we know Pascal may be around the corner by then. I'm not going to try future-proofing my decision now. I'll decide on what to get for an Oculus when the Oculus is out. Or I'll stick with my 970 and just turn down the options, god forbid.

I don't need everything on ultra and I'm perfectly happy saving my money and turning a few sliders down.
Well, that's true as well.

:australia: PCCG have some more in for delivery next week. Won't last long. Cheapest is the EVGA version for AUD999.
http://www.pccasegear.com/index.php?main_page=index&cPath=193_1766&zenid=89693cbda40081d6f9158e4368eaed33

Truga
May 4, 2014
Lipstick Apathy
I, too, am waiting for vives/rifts before upgrading... as long as my 6950s hold up. They're really starting to get old, suddenly I can't have them overclocked as much without them crashing :shobon:

Hopefully they'll hold and I can get a real cheap pair of 970s or 290x or whatever ends up being best frames/$ on the rift.

Foxhound
Sep 5, 2007
Friend offered to trade a 980 Ti for my old electric bike lol. I am seriously considering it since I'm not using the bike anymore. Can I run it off a 650W PSU in an average gaming rig?

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

Foxhound posted:

Friend offered to trade a 980 Ti for my old electric bike lol. I am seriously considering it since I'm not using the bike anymore. Can I run it off a 650W PSU in an average gaming rig?

650 should be plenty.
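
A rough power budget shows why (the CPU and "everything else" figures are ballpark assumptions, not specs from Foxhound's rig):

```latex
% Ballpark, single-GPU gaming box; only the 250 W 980 Ti TDP is a published figure.
\[
\underbrace{250\ \text{W}}_{\text{980 Ti TDP}}
+ \underbrace{\sim 95\ \text{W}}_{\text{quad-core CPU}}
+ \underbrace{\sim 75\ \text{W}}_{\text{board, RAM, drives, fans}}
\approx 420\ \text{W} \ll 650\ \text{W}
\]
```

That leaves plenty of headroom for GPU Boost spikes or a mild overclock.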

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"
eVGA 980 Ti in stock for $669 + $9 at Newegg at the moment (for those waiting): http://www.newegg.com/Product/Product.aspx?Item=N82E16814487139

craig588
Nov 19, 2005

by Nyc_Tattoo

BobbyBaudoin posted:

Been wondering if I should just buy a 980 Ti with a reference cooler or be patient until the 12th for a possible non-reference EVGA one. If I'm not interested in doing any overclocking on my graphics card, is the reference cooler decent, considering I've got decent airflow and noise isn't an issue?

It seems like this question got skipped. Yes, that's exactly what the reference cooler is for. The only reason to get a 3rd party cooler is for overclocking or less noise.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

BobbyBaudoin posted:

Been wondering if I should just buy a 980 Ti with a reference cooler or be patient until the 12th for a possible non-reference EVGA one. If I'm not interested in doing any overclocking on my graphics card, is the reference cooler decent, considering I've got decent airflow and noise isn't an issue?

Non-reference is cooler and quieter unless you've got a situation where your case has no airflow or you have multiple cards. If you're not overclocking, the noise should be perfectly bearable though.

Also the Gigabyte 980ti should be getting a BIOS update to let it shut its fans off at idle like the MSI and ASUS offerings.

BIG HEADLINE posted:

I'd wait a bit longer if you decide to go with the 970, which you'll be forced to do anyway since everyone's sold out of the 980Ti, and you practically need to poopsock at your PC to get in on restocks. It's entirely possible that the base model 970s might hit ~$279-289 with rebates when the 3xx series is officially launched - they're already hitting $300 occasionally. The 970 was designed to feed a 1440p screen wonderfully, so save for certain games which offer retarded texture options, running one on a single 1080p display is slightly overkill.

My advice is to go with the 970 and consider the $300+ you 'saved' as going towards your new system build, since you're probably going to at least consider a 1440p screen.

This is the traditional way to get the most out of a limited budget, and for good reason: generational upgrades are big. A 970 can easily hold its own against a 780 Ti, and Pascal's getting a huge memory tech upgrade and a two-node process shrink.

I do think the 290X solution is a solid idea, especially now that the Witcher promo is done and especially if the 290X has bundled codes. Aftermarket 290s are probably the best performance for the price available, and the better coolers go a long way toward making them actually reasonably cool and quiet. The power draw is still a bit high, but you'd pay a good chunk more to lower that and only that.

The 970's split memory is primarily a longevity concern for higher resolutions and larger texture sizes. Considering how Kepler performance has been left behind, the card may lose performance over time and develop some issues. Then again, there are a lot of 970s in circulation and the internet hate machine got NV to try and squeeze more performance out of Kepler.

xthetenth fucked around with this message at 14:29 on Jun 5, 2015

BurritoJustice
Oct 9, 2012

xthetenth posted:

Considering how Kepler performance has been left behind, the card may lose performance over time and develop some issues.

The Kepler issue was identified and patched by Nvidia in the latest driver; it's business as usual on Kepler now. People are reporting 25-50%+ performance gains with Kepler cards in The Witcher 3.

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

BurritoJustice posted:

The Kepler issue was identified and patched by Nvidia in the latest driver; it's business as usual on Kepler now. People are reporting 25-50%+ performance gains with Kepler cards in The Witcher 3.

But what about my pitchfork

BurritoJustice
Oct 9, 2012

Don Lapre posted:

But what about my pitchfork

Something something Gameworks something something GSync?

:devil:

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

BurritoJustice posted:

The Kepler issue was identified and patched by Nvidia in the latest driver; it's business as usual on Kepler now. People are reporting 25-50%+ performance gains with Kepler cards in The Witcher 3.

The Kepler issues in PCars and Witcher 3 did get fixed, but the numbers I saw from people benching were a bit more conservative, and I could've sworn the gains generally weren't enough to catch the 780 Ti up to a 290X. Plus, in general it seems like GCN cards are performing better relative to Kepler than when they launched - the 770 is losing to the 280X, for example. I don't think whatever happened with W3 is that big a deal; there are too many 970 owners out there to forget to optimize for it.

Sorry, pitchfork, right. "forget".

MagusDraco
Nov 11, 2011

even speedwagon was trolled

Don Lapre posted:

But what about my pitchfork

Geralt's already been skewered once. Don't need to throw another one at him.

Bonby
Jan 13, 2008

Annoying Dog
Thanks for the answers! Ordered an EVGA reference 980 Ti this morning from newegg.ca, can't wait to receive it.

Edit: vv I knooooooooooooooooow

Bonby fucked around with this message at 17:27 on Jun 5, 2015

Kramjacks
Jul 5, 2007

BobbyBaudoin posted:

Thanks for the answers! Ordered an EVGA reference 980 Ti this morning from newegg.ca, can't wait to receive it.

Black Ops 2 will never have looked so good.

Parker Lewis
Jan 4, 2006

Can't Lose


BIG HEADLINE posted:

eVGA 980 Ti in stock for $669 + $9 at Newegg at the moment (for those waiting): http://www.newegg.com/Product/Product.aspx?Item=N82E16814487139

Got one, thanks!

Icept
Jul 11, 2001
I was playing some Witcher 3 and wondering why it was running like poo poo on my 970, even with everything on medium. Turns out the 60FPS cap was on in the graphics settings. Guys, don't get used to your 144Hz displays; it's a (delicious) curse.

Incredulous Dylan
Oct 22, 2004

Fun Shoe
I guess this is obvious, but the 980 Ti SC is just crushing GTA 5 and Witcher 3 at 3440x1440 so far, without extra overclocking. Most of the benchmarks out there focus on 4K, but these frame rates tend to be consistently high enough that I can even use v-sync. A minimum of 42 fps on GTA 5's benchmark with maxed settings (edit: actually, I forgot - just 2x MSAA with TXAA on). Now we just need to see if the new crop of 21:9 G-Sync monitors announced at Computex will be worth it :hellyeah:.

Incredulous Dylan fucked around with this message at 17:15 on Jun 5, 2015

Big Mackson
Sep 26, 2009
my 290x feels like maple syrup :negative:

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

3440x1440 is a sweet as hell resolution and I'd cheerfully take a lower max frame rate for it because of how awesome it is for non-gaming tasks, even compared to 2x 2560x1440. My work ultrawide is an LG with the super neat window management stuff, though it's probably still great without it.

AMD cards in Witcher advice: either turn off HairWorks or go into your drivers and reduce the tessellation factor from 32x to 16x. The tessellation factor can be knocked down one notch without losing much image quality, for a pretty big improvement in performance.

xthetenth fucked around with this message at 17:14 on Jun 5, 2015

Big Mackson
Sep 26, 2009

xthetenth posted:

3440x1440 is a sweet as hell resolution and I'd cheerfully take a lower max frame rate for it because of how awesome it is for non-gaming tasks, even compared to 2x 2560x1440. My work ultrawide is an LG with the super neat window management stuff, though it's probably still great without it.

AMD cards in Witcher advice: either turn off HairWorks or go into your drivers and reduce the tessellation factor from 32x to 16x. The tessellation factor can be knocked down one notch without losing much image quality, for a pretty big improvement in performance.

thx, i am downloading witcher 3 now and i want smooth gameplay and also gfx. My friend bought a 980 but does not want to upgrade his 1920x1200 dell ultrasharp to at least 2560. it is kind of funny though.

Parker Lewis
Jan 4, 2006

Can't Lose


xthetenth posted:

3440x1440 is a sweet as hell resolution and I'd cheerfully take a lower max frame rate for it because of how awesome it is for non-gaming tasks, even compared to 2x 2560x1440. My work ultrawide is an LG with the super neat window management stuff, though it's probably still great without it.

I have a 3440x1440 LG 34UM95 for work and a 2560x1440 Benq BL2710PT for gaming and the ultrawide has made it really, really hard to go back to 16:9 for gaming.

The tradeoff is that 3440x1440 is noticeably harder to drive than 2560x1440 (I've been using a 970 and it struggles with games like GTA V/Witcher 3, will be using a 980 Ti soon) and 21:9 support is still hit or miss and I end up having to use Flawless Widescreen hacks for quite a few games (while games like Heroes of the Storm just don't support 21:9 at all).

But yeah, when it all comes together running a game at 3440x1440 60fps is kind of mind blowing and I think I'm probably going to end up just using the 3440x1440 for everything once I get my 980 Ti. And then start budgeting for a second 980 Ti and a 3440x1440 g-sync monitor..

suddenlyissoon
Feb 17, 2002

Don't be sad that I am gone.
The EVGA SC 980 Ti with the blower is on Amazon now at the correct price. I'm leery to get the blower one... am I right to feel that way, or should I hold out for the ACX2 version? I'm using a Corsair Air 240 case right now.

suddenlyissoon fucked around with this message at 17:54 on Jun 5, 2015

veedubfreak
Apr 2, 2005

by Smythe

Parker Lewis posted:

I have a 3440x1440 LG 34UM95 for work and a 2560x1440 Benq BL2710PT for gaming and the ultrawide has made it really, really hard to go back to 16:9 for gaming.

The tradeoff is that 3440x1440 is noticeably harder to drive than 2560x1440 (I've been using a 970 and it struggles with games like GTA V/Witcher 3, will be using a 980 Ti soon) and 21:9 support is still hit or miss and I end up having to use Flawless Widescreen hacks for quite a few games (while games like Heroes of the Storm just don't support 21:9 at all).

But yeah, when it all comes together running a game at 3440x1440 60fps is kind of mind blowing and I think I'm probably going to end up just using the 3440x1440 for everything once I get my 980 Ti. And then start budgeting for a second 980 Ti and a 3440x1440 g-sync monitor..

Now try playing on 3 2560x1440 monitors in surround. The immersion is insane. I loves my triple setup. Unfortunately even a single Titan X can't quite get the job done if I want 60 fps.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

Michael Jackson posted:

thx, i am downloading witcher 3 now and i want smooth gameplay and also gfx. My friend bought a 980 but does not want to upgrade his 1920x1200 dell ultrasharp to at least 2560. it is kind of funny though.

With the way gaming monitors are improving in leaps and bounds, sticking with 1920x1200 and using DSR/VSR where applicable (especially for DSR, which handles non-integer-multiple scaling poorly), especially if you have a good IPS or *VA panel, is a totally viable and sensible way to do things. I'm very happy I hopped on the 1440p Korean IPS thing, but for gaming it's as much the IPS part as the resolution, if not more, and 1200 vertical pixels deal with the Y-height problem I have with 1080p for non-gaming tasks.


Parker Lewis posted:

I have a 3440x1440 LG 34UM95 for work and a 2560x1440 Benq BL2710PT for gaming and the ultrawide has made it really, really hard to go back to 16:9 for gaming.

The tradeoff is that 3440x1440 is noticeably harder to drive than 2560x1440 (I've been using a 970 and it struggles with games like GTA V/Witcher 3, will be using a 980 Ti soon) and 21:9 support is still hit or miss and I end up having to use Flawless Widescreen hacks for quite a few games (while games like Heroes of the Storm just don't support 21:9 at all).

But yeah, when it all comes together running a game at 3440x1440 60fps is kind of mind blowing and I think I'm probably going to end up just using the 3440x1440 for everything once I get my 980 Ti. And then start budgeting for a second 980 Ti and a 3440x1440 g-sync monitor..

Awesome. My 34UM95 lives at work, so I haven't gotten to try it for gaming. Glad to hear it's as cool as I'd hoped. Does it handle 16:9 aspects gracefully by just letterboxing them?

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

suddenlyissoon posted:

The EVGA SC 980 Ti with the blower is on Amazon now at the correct price. I'm leery to get the blower one... am I right to feel that way, or should I hold out for the ACX2 version? I'm using a Corsair Air 240 case right now.

If you're in a case where an ACX2 would do better, are the better thermals and noise worth waiting a few days to a month longer for it? This is entirely a personal preference thing.

Aquila
Jan 24, 2003

Paul MaudDib posted:

It could be argued that both of those cards are actually underpowered for what you're planning. The 970 is really the ideal card for 1080p, and beyond that the 980 Ti has become the dominant value - the performance of an SLI'd 970 rig in a single card at exactly the same price as the SLI rig. Linear performance-to-price increases don't happen that often in the high-end market. If you're not playing on crazy settings you'll probably be fine, though. That's an OK build if you're on a tight budget; I just wouldn't go less than that.

gently caress why do I do this to myself. I can afford a 970 and 980TI. I don't usually play release aaa 3d games, but every few years I find one I really like and end up getting a new graphics card for it, this might be the first time I'm not falling into that cycle. I also really want a 4k display for my desktop, so I know I really should get at least a... 980 for that? Is there any place left in the nvidia line for the plain 980? I think first step will be moving past my current 560ti 448 (200+ watt power draw!), my htpc isn't even built yet and my tv is still in a box.

Parker Lewis
Jan 4, 2006

Can't Lose


xthetenth posted:

Awesome. My 34UM95 lives at work, so I haven't gotten to try it for gaming. Glad to hear it's as cool as I'd hoped. Does it handle 16:9 aspects gracefully by just letterboxing them?

Games running at 2560x1440 get pillarboxed, yeah.

I'm sure there are many others, but of the games I've tried I think only Diablo 3, StarCraft 2, and Heroes of the Storm are unable to run at 3440x1440; everything else has either had native support or a Flawless Widescreen hack available.

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

suddenlyissoon posted:

The EVGA SC 980ti with the blower is on Amazon now at the correct price. I'm leary to get the blower one...am I right to feel that way or should I hold out for the ACX2 version? I'm using a Corsair Air 240 case right now.

Get the blower and then a hybrid kit

Incredulous Dylan
Oct 22, 2004

Fun Shoe
DOTA 2, TF2 and CS:GO run zoomed in at 3440x1440 due to the competitive nature of those games, so be aware. Just use 2560x1440 for those - you won't get all of the available information otherwise, plus DOTA 2 doesn't scale the HUD properly, so it takes up like 30% of your screen and is horrible. Battlefield 4 doesn't care at all and will just give you the extra view (and it looks awesome).

Aquila, I wouldn't bother building towards 3D gaming right now. Wait at least until we see what the performance impact of the Rift looks like. I used to run the 3D gaming threads and I can tell you there hasn't been a worthwhile 3D title in years. The last games I remember being properly developed for 3D Vision were... Dead Space 2 and Metro 2033?

Incredulous Dylan fucked around with this message at 18:26 on Jun 5, 2015

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Don Lapre posted:

Get the blower and then a hybrid kit

That's what I'm doing, but the hybrid kits are back ordered a month or more, boo.

Truga
May 4, 2014
Lipstick Apathy

Aquila posted:

gently caress why do I do this to myself. I can afford a 970 and 980TI. I don't usually play release aaa 3d games, but every few years I find one I really like and end up getting a new graphics card for it, this might be the first time I'm not falling into that cycle. I also really want a 4k display for my desktop, so I know I really should get at least a... 980 for that? Is there any place left in the nvidia line for the plain 980? I think first step will be moving past my current 560ti 448 (200+ watt power draw!), my htpc isn't even built yet and my tv is still in a box.

I think a 980 is pretty much out at this point. If you want 4k, you'll get basically the same performance out of a 290x for a much lower price (though it won't overclock nearly as much if that's your thing). And either way, that performance will be unsatisfactory. It's also barely faster than a 970 at pretty much any lower resolution, for a rather big price increase.

Get a 980 Ti for 4K. Since you don't seem to be in a hurry, wait until AMD puts their poo poo out in two weeks; it could end up being competitive and knock prices down a few bucks.

Incredulous Dylan posted:

DOTA 2, TF2 and CS:GO run zoomed in at 3440x1440 due to the competitive nature of those games

I know the DOTA guys are gigantic idiots when it comes to dumb decisions (hey, guys, fighting with the loving camera/view is totally a game skill and not a sign of the shittiest UX ever imagined), and it's the main reason I stopped playing those kinds of games, but last time I played TF2 I was able to set my own FoV within fairly loose limits? Has the DOTA retardation spread to the rest of Valve?

Incredulous Dylan
Oct 22, 2004

Fun Shoe
There have been restrictions on your FOV in TF2 for a good while now, but you can modify the FOV of weapons and stuff. There's no way around getting the top and bottom cut off at 3440x1440, AFAIK. I totally understand it with DOTA, since being able to select something in the jungle while still being able to monitor your lane and check enemy items, etc. would be a big advantage over the other players.

vvv You might as well grab the new GPU if it's a good deal and wait for the new generation of Intel chips, which is a few months out. Doubly so if you'll just be gaming at 1080p, since a 980 will crush that for you. They have released some new chips in the last few months, but it's worth waiting to see what they put out then, since there will be some new manufacturing processes behind them.

Incredulous Dylan fucked around with this message at 18:51 on Jun 5, 2015

Pierson
Oct 31, 2004



College Slice
Is there ever a point at which it ISN'T a good idea to upgrade my GPU and instead upgrade a different component? I'm using an i5-2500K system with 8GB of RAM, and my GPU is a GTX 770 that is getting kind of cranky - one of the fans is making a frankly annoying noise. A friend is offering me a cheap-ish 980 and I really want to take it off his hands. This is a 90% dumb question but I need to be sure because this is a big chunk of change; there isn't going to be some kind of weird thing where my other components are too old/outdated to run the thing properly, right?

Truga
May 4, 2014
Lipstick Apathy

Incredulous Dylan posted:

being able to select something in the jungle while still being able to monitor your lane and check enemy items, etc. would be a big advantage over the other players.

If the minimap didn't exist, I'd say yes. As it is, it's minor at best - DOTA players will keep screaming at you to look at your minimap at all times, so why does my zoom matter?

It's dumb, and it makes everything absolutely huge and feel super cramped on my 30", to the point of being frustrating. I know, first world problems, but it's just dumb design. I played Heroes of Newerth for a bit, and there was a console command to uncap the zoom that made it better in every single way, but then idiots with 11" monitors started complaining about it, they removed it, and I stopped playing again.
