|
Paul MaudDib posted:AMD has traditionally held the cryptocurrency crown for some architectural reason I don't know. I think it might be that AMDs focus on integer performance while NVIDIA focuses on floating-point performance? Or instructions that AMD implements that NVIDIA doesn't? If I am remembering correctly, you hit the nail on the head. There are a few integer operations in particular implemented on AMD chips that make them much better at SHA-256 algorithms than their NV equivalents. Also, and I may be mistaken, AMD consumer chips have more compute units than their NV counterparts for some reason (shaders, of course, being vastly more relevant for games). I am not that sophisticated in this arena, so someone may correct me.
|
# ? Apr 26, 2015 03:59 |
|
Paul MaudDib posted:AMD has traditionally held the cryptocurrency crown for some architectural reason I don't know. I think it might be that AMDs focus on integer performance while NVIDIA focuses on floating-point performance? In one.
|
# ? Apr 26, 2015 04:11 |
|
Yudo posted:Also, and I may be mistaken, AMD consumer chips have more compute units than their NV counterparts for some reason (shaders, of course, being vastly more relevant for games). Shaders [shader programs] run in parallel (SIMD) on compute units (driving lockstep groups of 32 or 64 individual compute cores, branded stream processors or CUDA cores) in modern GPU architectures. There's no such thing as a dedicated shader unit anymore; it's all just programs that run on a general-purpose SIMD processor. AMD sometimes refers to the compute units as "shader engines", but this refers to the SIMD compute processors, or an aggregation of them, along with their associated support hardware (memory units, texture units, ROPs, etc). This is just an architectural decision; having your processors subdivided into smaller groups isn't necessarily better. e: I'm guessing you know this, but for those reading along, the raw number of cores isn't a comparable number either. NVIDIA's CUDA cores are significantly more complex/powerful than AMD's stream processors on an individual basis; AMD just uses more of them. You can only compare core/stream-engine counts within an architecture. I would guess that's why NVIDIA does better at lower resolutions and AMD catches up at higher resolutions - a greater degree of parallelism (higher resolution) will better exploit a greater number of processors. Paul MaudDib fucked around with this message at 05:43 on Apr 26, 2015 |
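For those reading along, the "shader is just a program on a SIMD processor" idea can be sketched in a few lines. This is only a toy illustration (NumPy standing in for the hardware, a made-up brightness curve standing in for a shader), not how a real driver works: one instruction stream is applied in lockstep across a whole group of lanes, like a 64-wide AMD wavefront.

```python
import numpy as np

def shade(pixels):
    """A toy 'shader': one program applied to every lane at once.
    On a GPU the same instruction stream drives 32/64 lanes in lockstep."""
    return np.clip(pixels * 1.2 + 0.05, 0.0, 1.0)

wavefront = np.linspace(0.0, 1.0, 64)   # 64 lanes, like an AMD wavefront
result = shade(wavefront)               # one instruction stream, many data elements
```

The point being that core counts only tell you how many lanes there are, not how much work each lane does per clock.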
# ? Apr 26, 2015 05:13 |
|
Yudo posted:If I am remembering correctly you hit the nail on the head. There are a few integer operations in particular implemented on AMD chips that make them much better at SHA-256 algorithms than their NV equivalent. Also, and I may be mistaken, AMD consumer chips have more compute units than their NV counterparts for some reason (shaders, of course, being vastly more relevant for games).
|
# ? Apr 26, 2015 05:35 |
|
Paul MaudDib posted:AMD has traditionally held the cryptocurrency crown for some architectural reason I don't know. I think it might be that AMDs focus on integer performance while NVIDIA focuses on floating-point performance? Or instructions that AMD implements that NVIDIA doesn't? EDIT: Nevermind, it's actually because the mining algorithm requires good performance in a 32-bit integer shifting operation that is faster on AMD cards than on Nvidia's. MaxxBot fucked around with this message at 06:20 on Apr 26, 2015 |
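For the curious: the operation usually pointed at is the 32-bit right-rotate, which SHA-256's round functions use heavily and which has to be emulated with two shifts and an OR on hardware without a native rotate/bit-align instruction. A rough Python sketch (not actual miner code) of what that emulation looks like:

```python
MASK32 = 0xFFFFFFFF

def rotr32(x, n):
    """Right-rotate a 32-bit value by n bits using two shifts and an OR.
    Hardware with a native rotate/bit-align op collapses this into one instruction."""
    return ((x >> n) | (x << (32 - n))) & MASK32

def sigma0(x):
    """One of SHA-256's small-sigma functions: two rotates and a shift, XORed."""
    return rotr32(x, 7) ^ rotr32(x, 18) ^ ((x >> 3) & MASK32)
```

SHA-256 does dozens of these per round, so a one-instruction rotate versus a three-instruction emulation adds up fast.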
# ? Apr 26, 2015 06:14 |
|
Recently Adobe released Lightroom 6 and one of the key improvements is that it can better leverage the GPU for faster editing. I am currently using a 6850 and I am happy with it for the games I play (Diablo III and LoL), but it seems like this card does not meet the minimum requirements for GPU acceleration in LR 6, and hence I am not really seeing any improvements in performance after upgrading to LR 6. I feel like getting a GTX 970 but have been unable to justify it as I am not really a hardcore gamer, but if it gives a significant boost to LR 6 performance then I would give it serious consideration. Just wanted to ask: if anyone here has a GTX 970 and has upgraded from LR5 to LR6, what has your experience been like? Thanks in advance!
|
# ? Apr 26, 2015 10:05 |
|
Remo posted:Recently Adobe released Lightroom 6 and one of the key improvements is that it can better leverage on the GPU for faster editing. http://protogtech.com/adobe-lightroom/do-i-need-a-powerful-video-card-for-lightroom-cc-6/
|
# ? Apr 26, 2015 11:49 |
|
Just bought a couple of 980s after having used AMD/ATI my whole life. I think the last nvidia card I used was that one after the Riva TNT 2. I am really appreciating how much better the software is and can see why people have always preferred nvidia. I was having some trouble getting the DSR to work. I've watched some tutorials and the option used to switch it on simply isn't there. I did read that it will only pop up on supported games but I have tried a whole bunch and none of them allow it. Are games that support it rare? I mainly want it for Pillars of Eternity to increase the detail on the player models. Also in my googling I found that it does support 1440p monitors but it was an old article - maybe that has changed? edit: running Windows 8.1. Latest drivers and everything is updated.
|
# ? Apr 26, 2015 12:04 |
|
Enabling it should be easy: set the factors in the Nvidia control panel's 3D settings and those resolutions should be selectable in-game. It's not a DSR setting in-game, just a resolution. Only works on real fullscreen apps.
|
# ? Apr 26, 2015 12:25 |
|
fuckpot posted:Just bought a couple of 980s after having used AMD/ATI my whole life. I think the last nvidia card I used was that one after the Riva TNT 2. I am really appreciating how much better the software is and can see why people have always preferred nvidia. I was having some trouble getting the DSR to work. I've watched some tutorials and the option used to switch it on simply isn't there. I did read that it will only pop up on supported games but I have tried a whole bunch and none of them allow it. Are games that support it rare? I mainly want it for Pillars of Eternity to increase the detail on the player models. Also in my googling I found that it does support 1440p monitors but it was an old article - maybe that has changed? There's no "DSR - Factors" under Manage 3D settings?
|
# ? Apr 26, 2015 13:49 |
|
DSR and custom resolutions are incompatible with each other for some reason, so if you're using custom resolutions to overclock your display DSR gets disabled.
|
# ? Apr 26, 2015 14:27 |
|
The real problem is that EDIT: You're using a 1440p monitor? Is it a G-Sync one? https://forums.geforce.com/default/topic/784180/geforce-drivers/system-configurations-currently-supporting-dsr-as-of-driver-344-65/1/ The link is a table of which setups are supposed to be able to support DSR.
Sidesaddle Cavalry fucked around with this message at 18:42 on Apr 26, 2015 |
# ? Apr 26, 2015 18:30 |
|
Has anyone had issues playing any video with VLC after installing the latest Nvidia driver (for GTA V)? I'm not 100% sure it's related, but since around the same time I cannot play a single video past 5-10 minutes before the video locks up/goes black while the audio keeps going, and if I try to interact with the program it just shuts off entirely. I tried reinstalling to no avail, so I'm thinking it may be driver related.
|
# ? Apr 27, 2015 00:23 |
|
wolrah posted:DSR and custom resolutions are incompatible with each other for some reason, so if you're using custom resolutions to overclock your display DSR gets disabled. This would be it, I am running a custom resolution to overclock my monitor. Sorry should have mentioned that. Thanks.
|
# ? Apr 27, 2015 04:44 |
|
fuckpot posted:This would be it, I am running a custom resolution to overclock my monitor. Sorry should have mentioned that. Thanks. You have a qnix fuckpot?
|
# ? Apr 27, 2015 20:17 |
|
I'm considering upgrading from my Radeon HD 7770 to a 2GB card at some point in the nearish future, but I don't know much about hardware - specifically I'm concerned about accidentally blowing out my PSU by installing a card that draws too much power. Would upgrading from a 1GB card to a 2GB card offer any major performance increase in and of itself? I checked the OP but it seems to be a few years out of date.quote:Case: Corsair Carbide Series 300R Case
|
# ? Apr 27, 2015 22:45 |
|
Dr Snofeld posted:I'm considering upgrading from my Raedon HD7770 to a 2GB card at some point in the nearish future, but I don't know much about hardware - specifically I'm concerned about accidentally blowing out my PSU by installing a card that drains too much power. Would upgrading from a 1GB card to a 2GB card offer any major performance increase in and of itself? I checked the OP but it seems to be a few years out of date. You want the parts picking thread, but it's worth noting that VRAM is way less relevant than how powerful a graphics card actually is on its own. Get a 970 if you can afford it, or a ~$230 after-rebate R9 290 if you can't. Below that you're not getting fantastic value for your money.
|
# ? Apr 27, 2015 22:50 |
|
Sounds like a GTX 960 would be perfect for you. Gives twice the performance for an extra 40 watts. e: nevermind, you have a 600 watt PSU - get an R9 290 or GTX 970
|
# ? Apr 27, 2015 22:52 |
|
eggyolk posted:Sounds like a GTX 960 would be perfect for you. Gives twice the performance for an extra 40 watts. Well, I guess that future won't be as nearish as I thought it might be, but I'll certainly make a note of these suggestions for when I am better able to upgrade. Thanks very much.
|
# ? Apr 27, 2015 22:57 |
Dr Snofeld posted:I'm considering upgrading from my Raedon HD7770 to a 2GB card at some point in the nearish future, but I don't know much about hardware - specifically I'm concerned about accidentally blowing out my PSU by installing a card that drains too much power. Would upgrading from a 1GB card to a 2GB card offer any major performance increase in and of itself? I checked the OP but it seems to be a few years out of date. These days you want a 4GB card, especially as that seems to be the new standard and lots of future console ports will want over 2GB. The cards already suggested are good choices, I would avoid the 4GB 960s since they have a very narrow memory bus that chokes the whole card, the 970 is well worth it especially if you want to get The Witcher 3 since they are running an offer where if you buy a 970 you get TW3 free when it comes out.
|
|
# ? Apr 28, 2015 01:32 |
|
Dr Snofeld posted:I'm considering upgrading from my Raedon HD7770 to a 2GB card at some point in the nearish future, but I don't know much about hardware - specifically I'm concerned about accidentally blowing out my PSU by installing a card that drains too much power. Would upgrading from a 1GB card to a 2GB card offer any major performance increase in and of itself? I checked the OP but it seems to be a few years out of date. 600w is enough PSU for any *single* GPU on the market, you've got nothing to worry about. For 1080p-and-below gaming I'll echo everyone else, buy yourself a GTX 970. There are tons of them open-box on Newegg lately if you feel like taking a chance on the accessories. You will not get the Witcher bundle if you do that however. If you are going to do something oddball like 1440p / 144hz / multi-monitor / 4K / etc then you should consider aiming higher, or waiting to see what the 390X is like when it comes out. There's no real reason to buy a 2GB card unless you're reeeeally scraping bottom. A used 3GB 7950 is like $110 nowadays and that's the bare minimum I'd buy right now. The general consensus is that GPU memory usage of games is headed for a pretty steep incline in the near future. Something in the GTX 970 / R9 290 / R9 290X class will give you a nice bump in performance and a reasonable degree of future proofing. Again, you may want to wait a month and see what the 390X release does to prices, it should drive them down somewhat across the board I'd think. Paul MaudDib fucked around with this message at 02:19 on Apr 28, 2015 |
# ? Apr 28, 2015 02:10 |
|
This may be a crazy question, but I have a 2560x1440 monitor and a GTX 970. I've been playing Elite but it doesn't seem to have anti-aliasing that works. If I use DSR then I get really really bad framerates. Is there a way to use a DSR mode for 1080p with a non-1080p monitor? It only gives me multiples of the monitor's native resolution.
|
# ? Apr 28, 2015 02:19 |
|
TomR posted:This may be a crazy question, but I have a 2560x1440 monitor and a GTX 970. I've been playing Elite but it doesn't seem to have anti-aliasing that works. If I use DSR the I get really really bad framerates. Is there a way to use a DSR mode for 1080p with a non-1080p monitor? It only gives me multiples of the monitors native resolution. "hello i'd like to order a happy meal, also super size it" What other methods of AA have you tried so far outside of the in-game options? E: There are third-party tools like nvidiainspector that allow manual forcing of MSAA and the like into games. Sidesaddle Cavalry fucked around with this message at 05:15 on Apr 28, 2015 |
# ? Apr 28, 2015 05:09 |
|
You could also tool around with SMAA injector.
|
# ? Apr 28, 2015 13:59 |
|
I've only changed settings in the NV control panel and it didn't seem to do much of anything. In game ranges from jaggies to sparkly jaggies when I change settings. I'll go read up on injectors, thanks.
|
# ? Apr 28, 2015 14:58 |
|
DSR is a multiple of native resolution in the sense that aspect ratio is preserved, but it doesn't have to be an integer multiple. I suspect that relatively few people use DSR at 2x rather than 1.25 or 1.5. You can select which DSR resolutions are available in the NVIDIA control panel.
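Worth spelling out: the DSR factor multiplies the total pixel count, so each axis only scales by the square root of the factor. A quick sketch of the arithmetic (the driver rounds to its own supported modes, so treat the exact numbers as approximate):

```python
import math

def dsr_resolution(width, height, factor):
    """Approximate the render resolution for a given DSR factor.
    The factor scales total pixel count; each axis grows by sqrt(factor)."""
    scale = math.sqrt(factor)
    return round(width * scale), round(height * scale)

# e.g. 4.00x on a 1440p panel renders at 5120x2880,
# while 2.25x on a 1080p panel renders at 2880x1620
```

That's why the lighter factors like 1.25x or 1.50x are so much cheaper than 2x or 4x: the per-axis growth is modest even when the factor sounds big.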
|
# ? Apr 28, 2015 15:02 |
|
There's no way to trick your GPU into outputting at 1080p but really rendering at something else when your monitor is 1440p. When you set up DSR, Windows actually thinks your monitor runs at that resolution. So if I set it up so I can run at 4K, I can actually set my Windows resolution to 4K. Windows and any game that supports that resolution will render at it. The GPU itself will downscale back to 1440p for the monitor.
|
# ? Apr 28, 2015 21:47 |
|
Some engines (Unreal for one) allow the internal rendering resolution to differ from the display resolution, meaning that you could have it basically do 1080p worth of work and then scale it up to display at 1440p. Roughly the opposite of DSR, perhaps "dynamic sub resolution"!
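The appeal is easy to see from the pixel counts alone - a rough sketch of the arithmetic:

```python
def render_cost_fraction(internal, display):
    """Fraction of full-resolution pixel-shading work when rendering
    internally at `internal` (w, h) and upscaling to `display` (w, h)."""
    iw, ih = internal
    dw, dh = display
    return (iw * ih) / (dw * dh)

# Rendering 1080p internally on a 1440p display is about 56% of the
# pixel work of rendering at native 1440p.
```

Geometry and CPU-side costs don't shrink with resolution, so the real-world savings are smaller than the raw pixel ratio, but for shader-bound games it's a big lever.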
|
# ? Apr 28, 2015 22:02 |
|
Subjunctive posted:Some engines (Unreal for one) allow the internal rendering resolution to differ from the display resolution, meaning that you could have it basically do 1080p worth of work and then scale it up to display at 1440p. Roughly the opposite of DSR, perhaps "dynamic sub resolution"! This is how all the consoles work, and I think it will be the trick that makes gaming possible on high-end integrated graphics. I'd gladly play at upscaled 720p on HD 6000 graphics if the UI could somehow be rendered at the proper scale.
|
# ? Apr 28, 2015 22:17 |
|
I'm reeeeealy curious where the 390X is going to clock in. If it has the rumored 4096 cores, that would put it about 45% ahead of the 290X (2816 cores) even ignoring any architectural improvements. At 4K that would basically be Titan X territory.
|
# ? Apr 29, 2015 02:09 |
|
Subjunctive posted:Some engines (Unreal for one) allow the internal rendering resolution to differ from the display resolution, meaning that you could have it basically do 1080p worth of work and then scale it up to display at 1440p. Roughly the opposite of DSR, perhaps "dynamic sub resolution"! Isn't this basically what happens when you select "GPU Scaling" in the control panel for resolutions that differ from native? The GPU delivers a native res signal to the monitor but the OS thinks it's dealing with something smaller.
|
# ? Apr 29, 2015 03:41 |
|
wolrah posted:Isn't this basically what happens when you select "GPU Scaling" in the control panel for resolutions that differ from native? The GPU delivers a native res signal to the monitor but the OS thinks it's dealing with something smaller. I actually commend Nvidia for their GPU resolution scaling options. If I am remembering correctly, you can scale old 4:3 games at their original aspect ratio instead of stretching them to widescreen. Playing things like Fallout and Diablo stretched to widescreen was weird.
|
# ? Apr 29, 2015 04:02 |
|
is there a comparable benchmark I can run to get an idea of how well I will run Witcher 3? Is Witcher 2 on the same engine? If I need to upgrade my graphics, I would rather do it now while I can still get a free copy of the game. If there is a better thread to ask this in, please let me know and I'll move it. Fauxtool fucked around with this message at 04:11 on Apr 29, 2015 |
# ? Apr 29, 2015 04:07 |
|
The only way to tell how well your system will run Witcher 3 is to play Witcher 3. You've got a GTX 970 and an i5, so you're probably fine. When it comes out, you can check out Guru3d or HardOCP to see if they have a performance review. This is probably a better question for the parts picking thread, but you won't get a good answer until Witcher 3 is out.
|
# ? Apr 29, 2015 05:58 |
|
IIRC the game was said to run maxed out 1080p on a 980, but it wouldn't surprise me if a 970 will do basically the same thing.
|
# ? Apr 29, 2015 06:05 |
|
GrizzlyCow posted:Only way to tell how well your system will run Witcher 3 is to play Witcher 3. You got a GTX 970 and an i5, so you're probably fine. cool. The last game I had to upgrade for was ffxiv ARR. They had an excellent benchmarking tool released way before the game launch. I wish more games would do that kind of thing. It totally nailed how much power the game needed. Fauxtool fucked around with this message at 09:23 on Apr 29, 2015 |
# ? Apr 29, 2015 09:20 |
|
BurritoJustice posted:IIRC the game was said to run maxed out 1080p on a 980 Kinda, they disabled Nvidia HairWorks on their demo boxes to get (mostly) stable 60fps on a 980. That poo poo is a huge hog. But everything else is fine for 970/980's.
|
# ? Apr 29, 2015 10:17 |
|
Fauxtool posted:cool. I agree, that benchmark tool was spot on. Shame they don't put that much thought and effort into their other games on PC...
|
# ? Apr 29, 2015 19:38 |
|
So I'm upgrading from an old GTX560Ti to a GTX970. I noticed that the 970 uses PCIe 3.0 where my current motherboard is still PCIe 2.0. I know that it's backwards-compatible, but will I notice any performance hit at all?
|
# ? Apr 29, 2015 22:21 |
|
You won't notice it. PCIe 2.0 is already a huge amount of bandwidth on even an 8x link. A 16x 2.0 link has the same bandwidth as an 8x 3.0 link, and you generally do SLI on 4x or 8x links.
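The back-of-envelope math, if anyone wants to check it (per-lane signaling rates and line codings are from the PCIe specs; this ignores protocol overhead beyond the encoding):

```python
def pcie_gb_per_s(gen, lanes):
    """Approximate one-direction PCIe bandwidth in GB/s.
    Gen 2: 5 GT/s per lane with 8b/10b encoding;
    Gen 3: 8 GT/s per lane with 128b/130b encoding."""
    bits_per_lane = {2: 5e9 * 8 / 10, 3: 8e9 * 128 / 130}
    return bits_per_lane[gen] * lanes / 8 / 1e9  # bits -> bytes -> GB

# x16 gen 2 ≈ 8.0 GB/s each way; x8 gen 3 ≈ 7.9 GB/s each way
```

So a gen 2 x16 slot and a gen 3 x8 slot land within a couple of percent of each other, which is why a single card can't tell the difference in practice.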
|
# ? Apr 29, 2015 22:25 |