|
Agreed posted:Basically, yeah. The review site had to use a third-party hacked driver to obtain those test results at all, and in my opinion that is an extremely charitable way to go about reviewing. Sort of like if every Haswell review just casually mentioned that you have to take the IHS off, scrape the glue out, and reseat it all while causing no harm to the now-exposed, very sensitive on-die components, no big deal, check out this 4.5GHz overclock we trivially got! Just not quite that simple

I bought a laptop around the start of December last year for the final year of some heavy graphics programming University work. I had a choice between a 7970M and a 680M. Looking at performance versus cost (I think the 680M would have cost an extra ~£400 at the time, or something equally insane), I went with AMD. I have never made a worse loving decision in my life. Since day one, the drivers have been loving horrible, and any improvements at all have been marginal at best. lovely performance, awful graphical glitches, Enduro refusing to do what it's supposed to whenever it feels like it. Yeah, I hate this thing more than the last Team-Green GPU I bought, the (amazingly) shittastic 9800 GX2.

Add to that some very odd performance issues. My final year project revolved around tessellation; the University lab computers (Q6600s, 560 Tis) managed to run my project at four times the frame rate. I'd have really loved to get into the nitty-gritty of why that was, but unfortunately AMD's performance debugging tools are either hosed, broken, or just don't work with their mobile GPUs (or possibly all three). PerfStudio can't capture anything other than very basic frame data, and PerfAPI won't even initialize, let alone reach a point where I could capture anything with it.
|
# ? Aug 13, 2013 00:39 |
|
AMD has been known for shoddy tooling and APIs for developers compared to nVidia, while being somewhat OK about IP sharing/licensing; the opposite seems true for nVidia (see: Linus Torvalds' rants on nVidia).
|
# ? Aug 13, 2013 15:53 |
|
So I got a private live demo of GRID and got to tool around with it. Pretty fun. Also Nvidia HQ really loves their green color. And the cafe food isn't bad either.
|
# ? Aug 14, 2013 09:01 |
|
AMD is ~kinda~ not poo poo on desktops, but on laptops they basically do an Android-style "your OEM is responsible for your drivers" thing, and it ends in a shitshow. When I had an HP with low-end AMD graphics a few years back, the display drivers had a build date six months before my purchase and were never updated; it's terrible. They also give OEMs the ability to disable features artificially in order to segment the market, which is the main reason they don't offer "stock" drivers.
|
# ? Aug 15, 2013 01:18 |
|
AMD does offer Catalyst Mobility drivers for laptops now, though some brands are blacklisted per OEM demand.
|
# ? Aug 15, 2013 02:31 |
|
SRQ posted:AMD is ~kinda~ not poo poo on desktops, but on laptops they basically do an android "Your OEM is responsible for your drivers" thing and it ends in a shitshow. Jesus, that sounds like a massive shitshow.
|
# ? Aug 15, 2013 04:43 |
|
It's something they've been trying to undo, but the last I heard on that front was last year. The end goal is to have stock drivers Just Work for both IGP and dGPU, regardless of vendor and customizations (TDP, clocks, config, etc.). I think that works with Intel and Nvidia combinations right now? But if it does, it's only recent.
|
# ? Aug 15, 2013 04:48 |
|
I know this might open a can of worms, but IMHO AMD should just give up on the desktop CPU market and focus on the GPU and mobile APU markets instead. And of course continue to make the custom APUs for the PS4 and Xbox One. What would be interesting is if, in response to Nvidia's Maxwell (ARM CPU + Kepler 2.0 GPU), AMD released a graphics card with an APU (Jaguar CPU + GCN 2.0 GPU) similar to what's going into the PS4, except more powerful of course.
|
# ? Aug 15, 2013 06:10 |
|
spasticColon posted:a graphics card with an APU (Jaguar CPU + GCN 2.0 GPU) similar to what's going into the PS4 except more powerful of course. To what end? The CPU would perform worse than whatever's in the host system unless someone made a very dumb decision.

System-on-a-card devices like that and variations on the theme have existed before and do exist today, but the problem is that there's no systems architecture in place on the consumer side that can make that work without it completely replacing the main system. PCI and PCI Express are just 100% not suitable for being an interconnect between multiple processor sockets. There isn't enough bandwidth until you get to full PCIe 3.0 x16, and even then the latency is way too high. It's problematic enough for GPU workloads, never mind fused GPU/CPU.

The ARM CPU in Maxwell isn't there to do work the main CPU couldn't do faster. It's there because PCIe is so drat slow to transfer back and forth that even a slow ARM core can do a small single-threaded workload faster than it would take to transfer that work to the CPU, have it do that work quickly, and then transfer the result back. It's there for HPC and compute and no other reason.
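The transfer-versus-compute tradeoff described above can be sketched with a toy model. Every number here is a hypothetical placeholder (roughly 12 GB/s effective PCIe 3.0 x16 bandwidth, a 10 µs round-trip latency, a 4x-slower ARM core), just to show why a tiny serial task can be cheaper to run on-card than to ship across the bus:

```python
def offload_time_s(bytes_each_way, host_cpu_time_s,
                   bw_bytes_per_s=12e9, latency_s=10e-6):
    """Ship work across PCIe to the host CPU, run it there, ship the result back."""
    transfer = 2 * (latency_s + bytes_each_way / bw_bytes_per_s)
    return transfer + host_cpu_time_s

def on_card_time_s(host_cpu_time_s, arm_slowdown=4.0):
    """Run the same work on a slower on-card ARM core, with no transfer at all."""
    return host_cpu_time_s * arm_slowdown

# A tiny serial task: 5 us on the fast host CPU, 1 KiB of data each way.
task = 5e-6
print(f"offload: {offload_time_s(1024, task) * 1e6:.1f} us")  # dominated by latency
print(f"on-card: {on_card_time_s(task) * 1e6:.1f} us")        # slower core, no transfer
```

With these placeholder figures the on-card core wins for small tasks, while for a long task (milliseconds of CPU time) the transfer cost becomes negligible and the fast host CPU wins, which is exactly the tradeoff the post describes.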
|
# ? Aug 15, 2013 06:25 |
|
Factory Factory posted:The ARM CPU in Maxwell isn't there to do work the main CPU couldn't do faster. It's there because PCIe is so drat slow to transfer back and forth that even a slow ARM core can do a small single-threaded workload faster than it would take to transfer that work to the CPU, have it do that work quickly, and then transfer the result back. It's there for HPC and compute and no other reason. But what if the CPU on an APU did the same thing that the slow ARM CPU does on Maxwell? Aren't ARM CPUs still slower than a low-power AMD CPU? spasticColon fucked around with this message at 06:49 on Aug 15, 2013 |
# ? Aug 15, 2013 06:42 |
|
1) That still has nothing to do with gaming. 2) AMD has an ARM license; they can just stick their own ARM cores in like Nvidia does. 3) It depends on the ARM core and what AMD wants out of it. Jaguar is faster, yes, but it also uses a lot more power. That's bad for every workload, because power being used by the integrated CPU is power that can't be used by the GPU. The purpose of the integrated core, again, isn't to do heavy lifting; it's just to avoid Amdahl's Law bottlenecks without going to the main CPU.
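Since Amdahl's Law is doing the work in point 3, here's a quick sketch of why even massive parallelism can't hide a small serial fraction (the fractions are purely illustrative):

```python
def amdahl_speedup(serial_fraction, n_workers):
    """Overall speedup when only the parallel part of a workload scales."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_workers)

# A GPU with effectively 1000-wide parallelism still chokes on the serial part:
# with just 5% serial work, the overall speedup stays under 20x.
for serial in (0.01, 0.05, 0.10):
    print(f"{serial:.0%} serial -> {amdahl_speedup(serial, 1000):.1f}x speedup")
```

Shaving that serial fraction down is what a small on-card core is for; it doesn't need to be fast, it just needs to keep the serial step from stalling the wide part.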
|
# ? Aug 15, 2013 07:03 |
|
Factory Factory posted:System-on-a-card devices like that and variations on the theme have existed before and do exist today, but the problem is that there's no systems architecture in place on the consumer side that can make that work without it completely replacing the main system. PCI and PCI Express are just 100% not suitable for being an interconnect between multiple processor sockets. There isn't enough bandwidth until you get to full PCIe 3.0 x16, and even then the latency is way too high. It's problematic enough for GPU workloads, never mind fused GPU/CPU. AMD card, AMD system, they could bring back the HTX slot. Connect that fucker with HyperTransport. HalloKitty fucked around with this message at 12:05 on Aug 15, 2013 |
# ? Aug 15, 2013 12:02 |
|
spasticColon posted:I know this might open a can of worms but IMHO AMD should just give up on the desktop CPU market and focus on the GPU and mobile APU market instead. And of course continue to make the Custom APUs for the PS4 and Xbox One. What would be interesting is if in response to Nvidia's Maxwell (ARM CPU + Kepler 2.0 GPU) AMD would release a graphics card with an APU (Jaguar CPU + GCN 2.0 GPU) similar to what's going into the PS4 except more powerful of course. AMD often has better pricing at certain levels outside the US than Nvidia. Edit: Oh, you said desktop CPU. All the acronyms blurred together. I was just talking about GPUs. Aphrodite fucked around with this message at 14:12 on Aug 15, 2013 |
# ? Aug 15, 2013 13:45 |
|
And it's worse in both instructions per second and instructions per joule against similarly-clocked Intel parts with half the logic units. Hard to imagine the discounts being worth the drawbacks anywhere. On GPUs I can see AMD being competitive, because performance-wise they are, at least when their driver people are on the ball. Then again, no one's graphics drivers are on the ball lately.
|
# ? Aug 15, 2013 14:00 |
|
Edit: I moved this post to the AMD CPU and Platform megathread where it belonged.
|
# ? Aug 16, 2013 00:28 |
|
EDIT: Disregard, I hosed up my fan curve like a moron. :P
Gonkish fucked around with this message at 01:41 on Aug 16, 2013 |
# ? Aug 16, 2013 01:21 |
|
I've just got a free HD 5770 from a friend to complement my existing HD 5770, but there's a weird snag. Both cards are exactly identical: http://news.softpedia.com/newsImage/ASUS-Radeon-HD-5770-CuCore-Almost-Here-2.jpg but the new one has some serious temperature issues. In FurMark my existing card tops out at 75°C, where the new card gets to 104°C and I kill the app in a panic. They idle within 2 degrees of each other, though being the secondary CrossFire card, the new one sits at half the clocks of my main one. They do run the same clocks in-game (800MHz core / 1200MHz memory), but the temperature difference under load is mind-boggling. Things I have tried, with no difference:
- Open case with a fan blowing at them
- Putting in a front case fan that blows across the cards
I'm going to swap them tonight to see if it's the location, but surely the bottom card will always be the coolest? The friend also denies ever taking off the heatsink but says they gave it a clean before I got it. Cancelbot fucked around with this message at 08:31 on Aug 16, 2013 |
# ? Aug 16, 2013 08:29 |
|
If swapping positioning doesn't do it you probably need to remove the heatsink and do a thorough cleaning, including the GPU die and heatsink base, with a new application of thermal paste before remounting the heatsink.
|
# ? Aug 16, 2013 12:43 |
|
Alereon posted:If swapping positioning doesn't do it you probably need to remove the heatsink and do a thorough cleaning, including the GPU die and heatsink base, with a new application of thermal paste before remounting the heatsink. Yeah, I have some cleaner + Arctic Silver to fix it up, which I'll test out. But I've fixed my problem by buying an HD 7870; I will try to fix the other card so I can give it to someone at work instead.
|
# ? Aug 16, 2013 13:21 |
|
Thought I'd share this with any Britgoons, looks like an excellent deal: http://www.scan.co.uk/products/2gb-msi-gtx-670-overclocked-28nm-pcie-30-(x16)-6008mhz-gddr5-gpu-965mhz-boost-1045mhz-cores-1344-2xd 670 for £179, free postage
|
# ? Aug 17, 2013 18:24 |
|
I sold a 6950 to a bitcoiner a while back and got $200 for it, then got a GTX 660 for 160. My point? Bitcoin spergs are dumb and will buy old video cards for more than they are worth, esp. that one because apparently it's easy to flash.
|
# ? Aug 18, 2013 07:54 |
|
Sadly that's not likely to work any more, since it's not profitable to mine using graphics cards.
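That's easy to sanity-check with a back-of-envelope mining model. Every figure below is a hypothetical placeholder (GPU hashrate, network hashrate, BTC price, power draw, electricity rate), not real market data; the point is just that a single GPU's share of an ASIC-scale network barely covers its own power bill:

```python
def daily_mining_profit_usd(card_ghs, network_ghs, btc_price_usd,
                            card_watts, usd_per_kwh,
                            block_reward_btc=25.0, blocks_per_day=144):
    """Expected daily profit: pro-rata share of block rewards minus electricity."""
    revenue = (card_ghs / network_ghs) * blocks_per_day * block_reward_btc * btc_price_usd
    electricity = (card_watts / 1000.0) * 24 * usd_per_kwh
    return revenue - electricity

# Hypothetical: a ~0.6 GH/s GPU against a 300,000 GH/s network,
# $100/BTC, 250 W under load, $0.12/kWh -- roughly break-even
# before hardware wear is even counted.
print(daily_mining_profit_usd(0.6, 300_000, 100, 250, 0.12))
```

With these placeholder numbers the GPU barely breaks even on electricity alone, and any bump in the power rate or the network hashrate pushes it underwater.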
|
# ? Aug 18, 2013 10:19 |
|
So Apple is doing a replacement program for the HD 6970M GPU used in the 27" iMac. It's kind of surprising that they're only just now finding an issue in a multi-year-old GPU. 9to5Mac has the program details. Apparently the issue is real, but they only now figured out it's a hardware problem rather than a driver problem.
|
# ? Aug 19, 2013 01:43 |
|
SRQ posted:I sold a 6950 to a bitcoiner a while back and got $200 for it, then got a GTX 660 for 160. Huh. Wonder what I could get out of a 7950? I am not well versed in its mining strengths. Edit: Mainly because it'd be cool to sidegrade to a 760/670 and nearly break even, if possible.
|
# ? Aug 19, 2013 02:31 |
|
Put it up on Craigslist; you might get lucky. I sure as heck did. The 660 is at least as fast, smaller, cooler, and I can run OS X now.
|
# ? Aug 19, 2013 06:35 |
|
http://www.xbitlabs.com/news/multimedia/display/20130807232531_AMD_Reveals_Official_Name_for_Next_Gen_Never_Settle_Bundle.html So AMD cards might be bundled with Battlefield 4 and something else coming up soon. These pack-ins appear to be getting better and better.
|
# ? Aug 19, 2013 17:31 |
|
Oh hey Nvidia actually fixed the TDR issue. It'll be ready when the next WHQL drivers come out. quote:ManuelG said: e: From guru3d, which was from somewhere on the GeForce developer forums. MagusDraco fucked around with this message at 17:36 on Aug 19, 2013 |
# ? Aug 19, 2013 17:33 |
|
Yeah, he posted a list of fixes that should be in the next driver: ManuelG posted:Here is a list of some bugs which are fixed in our upcoming driver:
|
# ? Aug 19, 2013 22:45 |
|
havenwaters posted:Oh hey Nvidia actually fixed the TDR issue. It'll be ready when the next WHQL drivers come out. gently caress yes! They finally fixed it!?!? GIVE ME THE drat DRIVER NOW GOD DAMNIT! I didn't pay for a 770 just to have it crash every other day!
|
# ? Aug 20, 2013 01:19 |
|
AMD has new beta drivers out right now for SRIV: http://support.amd.com/us/kbarticles/Pages/AMDCatalyst13-8WINBetaDriver.aspx They are supposed to increase performance by 25% at 1920x1080 on Ultra settings.
|
# ? Aug 20, 2013 04:38 |
|
Agrajag posted:gently caress yes! They finally fixed it!?!? GIVE ME THE drat DRIVER NOW GOD DAMNIT! I didn't pay for a 770 just to have it crash every other day! Yeah seriously, can one of you nvidia dudes post this poo poo? It's pretty inconvenient!
|
# ? Aug 20, 2013 04:46 |
|
Oh my god, I'm so glad that's a known issue. My GTX 770 crashes every other day like clockwork. The computer isn't idle that entire time, but it always crashes after an extended break.
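While waiting on that driver, a stopgap some people use (it papers over the symptom rather than fixing it) is raising Windows' TDR timeout so the watchdog gives the GPU longer before resetting the driver. The key and value name below are the documented Windows ones; the 8-second figure is just an example value, and you should back up your registry before touching it:

```
Windows Registry Editor Version 5.00

; TdrDelay: seconds the GPU may be unresponsive before Windows resets the driver.
; The default is 2. Raising it hides hangs rather than curing them, so revert
; this once a fixed driver ships.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\GraphicsDrivers]
"TdrDelay"=dword:00000008
```

A reboot is needed for the change to take effect.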
|
# ? Aug 20, 2013 07:08 |
|
So I've been using a GTX 275 for the past 3 years and it has done fine for me. However, I feel it might finally be nearing the end of its life, since I've had a couple of issues where my screen froze while the audio continued. The only PC game I see myself playing in the near future is Final Fantasy XIV, and my card runs it fine right now. Should I pick up another cheap GTX 275 or actually upgrade to a budget Nvidia card?
|
# ? Aug 20, 2013 08:26 |
|
The 326.80 beta that fixes the TDR issue seems to be up. Kuai posted:So I've been using a GTX 275 for the past 3 years and it has done fine for me. However, I feel it might finally be nearing the end of its life, since I've had a couple of issues where my screen froze while the audio continued. The only PC game I see myself playing in the near future is Final Fantasy XIV, and my card runs it fine right now. Should I pick up another cheap GTX 275 or actually upgrade to a budget Nvidia card? What's your budget? Also, getting another 275 is probably not wise unless we are talking like, $10. Dogen fucked around with this message at 13:26 on Aug 20, 2013 |
# ? Aug 20, 2013 13:23 |
|
I've been out of the loop for a month or so, is the 320.18 corruption issue fixed?
|
# ? Aug 20, 2013 13:45 |
|
Kuai posted:So I've been using a GTX 275 for the past 3 years and it has done fine for me. However, I feel it might finally be nearing the end of its life, since I've had a couple of issues where my screen froze while the audio continued. The only PC game I see myself playing in the near future is Final Fantasy XIV, and my card runs it fine right now. Should I pick up another cheap GTX 275 or actually upgrade to a budget Nvidia card? Largely depends on your budget. If you can afford it, I'd recommend a 760. You should be able to ride that for a couple of years, too.
|
# ? Aug 20, 2013 17:01 |
|
My budget for a new vid card would be 200ish. I honestly only play MMOs which don't seem to require beefy vid cards.
|
# ? Aug 20, 2013 21:21 |
|
Kuai posted:My budget for a new vid card would be 200ish. I honestly only play MMOs which don't seem to require beefy vid cards. You're partially correct: newer MMOs require beefy CPUs instead.
|
# ? Aug 20, 2013 21:27 |
|
Two suggestions might be either a 7950 or a 7870 GHz Edition. Both would give a significant boost in GPU power and should give a couple of years of use.
|
# ? Aug 20, 2013 21:33 |
|
So, is there any confirmation that 326.80 beta driver from Nvidia fixes the crashing issue?
|
# ? Aug 21, 2013 04:07 |