|
Dominoes posted:Is that where the cursor will turn into staticy vertical lines? I didn't realize that was a video card glitch. It happens with multiple monitors, though I haven't had any corruption for a few months; at the very least, nothing long-lasting. You can wiggle the mouse back and forth between two of the screens to fix it. Otherwise a restart will restore it, or you can open up Windows Magnifier at 1x.
|
# ? Dec 14, 2012 20:34 |
|
Dominoes posted:I'm switching to nvidia because ATI can't get their poo poo together on vsync with multiple monitors. Are you using analog connectors?
|
# ? Dec 14, 2012 23:40 |
|
kuddles posted:I was going to post that. Of course both the AMD and Nvidia fanboys are going nuts but the real point of the article is that current benchmarking comparisons of just listing the average or maximum FPS don't tell the whole story about how well a card works on particular games. I'm not an Nvidia or AMD fanboy, and both of my HD7850 cards are doing this in both of my systems (single card in each machine; no crossfire) in the same games. Even with the newest beta driver and Catalyst application profile I still get stuttering in Farcry 3 if I enable AA. With Skyrim I get stuttering and graphical glitches. With Arkham City I get stuttering even if I disable the DX11 features and AA. And it's not like these games use OpenGL the way Rage does; these games all use DirectX, although I have read reports online that the new drivers break some DX9 games (or games in DX9 mode) for some reason, Skyrim being one of them. Is it the hardware itself that is causing this issue or did they just botch the drivers?
|
# ? Dec 15, 2012 00:28 |
|
spasticColon posted:Is it the hardware itself that is causing this issue or did they just botch the drivers? Sadly, that's not even known at this point. I've read through the Guru3D post that was linked earlier, and it does seem to vary wildly between games and drivers. I didn't really have any issues when I first got the 7850s, and it's been getting noticeably worse over time, so it could be drivers or just hardware degrading. I'll be running some more tests, although it's hard to test anything without having a precise repro case.
|
# ? Dec 15, 2012 01:04 |
|
Jan posted:Sadly, that's not even known at this point. I've read through the Guru3D post that was linked earlier, and it does seem to vary wildly between games and drivers. I didn't really have any issues when I first got the 7850s, and it's been getting noticeably worse over time, so it could be drivers or just hardware degrading. The problem started for me with the 12.10 drivers, and the subsequent beta drivers really haven't improved anything other than Farcry 3 running smoother as long as I keep AA disabled. If they issue a recall of these cards I'm going to cry and, depending on how much Xmas money I get, order a few nvidia cards for my systems.
|
# ? Dec 15, 2012 01:14 |
|
Rawrbomb posted:For anecdotal evidence the other way, I had 6 NVidia cards blow out in 2007-9 between me (2) and my friends (4). I also had non-stop crashing issues. I moved to ATi/AMD and never looked back Yeah see, and I'd take this over software issues any day. If there are software issues it costs me time to gently caress with it to find a workaround or fix. If it's hardware I can return/RMA/use it as an excuse to upgrade. This is further mitigated by buying from manufacturers with better reputations and warranties. I can see where others might feel differently, but my experience with nvidia cards has been great overall. The one problem I ever had was a DOA card I just sent back to Amazon and used the refund toward a better card.
|
# ? Dec 15, 2012 01:19 |
|
I had constant driver crashing issues as well. I've not had major video driver issues for ~2 years now with my AMD cards, aside from cursor corruption every so often. It's really personal preference / price point now. I've stuck with AMD to date due to better DisplayPort support (which didn't arrive until this generation on NVidia). But I also drive 3 monitors, and have since the DisplayPort option came about on the 5000 series 2+ years ago.
|
# ? Dec 15, 2012 01:24 |
|
After having great luck with two Nvidia cards in a row (8800 GTS, then a GTX 570) I decided to try out a GTX 680. I got an EVGA model from Amazon and plugged it right in. Almost right away I noticed this weird hitching / stuttering that was present both during gaming and video playback. If I tried watching youtube videos or regular videos, about every 30 seconds there would be a noticeable stutter and then it would resume smooth playback. In games my FPS was much higher than with the 570 but again, same issue. About every 30-60 seconds it would seem like I was getting 4 FPS - but Fraps did not bear this out; it claimed my framerate was steady. The effect was super noticeable and super annoying. Even my roommate commented on it when we were watching some TV shows together. I tried the whole works - clean reinstalls, registry sweepers, driver cleaners, changing settings (disabling vsync) - and none of it helped. I went back to the 570 and the problem completely went away. I have a 600 watt PSU and an i7-2600K and I don't believe either is the issue - it was when the card was not under load or almost completely idle that it would happen most often. When I started googling '680 stutter' it seemed this is a very common issue with 670/680 series cards, and (my speculation here) from what I read it seems likely related to the power management / voltage throttling that Nvidia introduced with this series of cards. I would like to upgrade my GPU but am kinda lost here. I could try a 670 but that's less of an upgrade than the 680 and still seems to suffer from the same problem - although less often, it seems. I was looking at a 7970 but I have concerns about ATI's drivers and overall stability, with issues like the frame latency being top of mind. Does anyone have either any more info about the stuttering issue with the 670 / 680 cards, or suggestions on how the 7970 performs based on their experience with it?
|
# ? Dec 15, 2012 01:29 |
|
I've surprisingly never encountered the cursor corruption issue, and I've been using ATI/AMD cards for about 10 years now (9800 -> X1950XT -> 6950). I'm not biased against nVidia per se; I remember I couldn't get a Riva TNT to work with an old VIA computer of mine, so I ended up going with a Voodoo4 4500 back in the day (good timing, that - the 9800 followed when I was able to afford it). Also, I put a Geforce in my brother's build 5 years ago, which works fine - though into a computer with an nForce chipset, which has its own issues. That series of articles has really brought out the worst in the red/green partisans. The article's advice was pertaining to the newest games coming out during the holiday season at the highest resolutions and eyecandy settings - if it turns out to be a driver issue, the 7950 will be the better card hands down once it's fixed. Instead you have people jumping all over AMD like hyenas (with a particularly vicious subset crowing for the company's death), people jumping all over TR (pretty sure the article was only intended to be a "uh, AMD, you need to look into this", but people have been savaging them because they blasted their PR department recently over the Trinity review schedule), people ripping into each other, etc. The one thing nobody wants (well, almost nobody) is an nVidia monopoly.
|
# ? Dec 15, 2012 02:22 |
|
McCoy Pauley posted:Can anyone help me figure out the difference between two different EVGA 660 ti cards -- the 660 ti FTW Signature2 and the regular 660 ti FTW?
|
# ? Dec 15, 2012 03:19 |
|
Navaash posted:people jumping all over TR (pretty sure the article was only intended to be a "uh, AMD, you need to look into this", but people have been savaging them because they blasted their PR department recently over the Trinity review schedule)
|
# ? Dec 15, 2012 03:22 |
|
hobbesmaster posted:Are you using analog connectors?
|
# ? Dec 15, 2012 05:12 |
|
Even as much as I personally go for nvidia, I certainly don't want to see them have a monopoly. In fact, I hope Intel's work on iGPUs lights fires under asses to keep things innovative and competitive.
|
# ? Dec 15, 2012 08:44 |
|
So, on the polygon warping issue, I managed to get a frame capture of the problem with Intel GPA by setting it to capture every 10 seconds and hoping it eventually triggered at the right moment. I can look at the frame screenshot in the capture browser and see the issue. But then when I load the capture to try and see which draw call is freaking out, nothing shows up. I should have known better; of course it's not going to show. It's playing back the frame and getting the expected results, not those of the hardware being lovely. Oh well, looks like my awesome debugging process ends here.
|
# ? Dec 15, 2012 18:46 |
|
Rakthar posted:After having great luck with two Nvidia cards in a row (8800 GTS, then a GTX 570) I decided to try out a GTX 680. I got an EVGA model from Amazon and plugged it right in. Almost right away I noticed this weird hitching / stuttering that was present both during gaming and video playback. If I tried watching youtube videos or regular videos, about every 30 seconds there would be a noticable stutter and then it would resume smooth playback. Did you fully clean drivers before upgrading? Do you have any OCing software (like EVGA Precision) installed?
|
# ? Dec 15, 2012 19:43 |
|
This might be common knowledge to many of you, but I foolishly didn't do enough research beforehand. Avoid any 7950 with the boost BIOS like the plague. It's a total nightmare to use. Because the boost state goes to 1.25v, you hit the power threshold for the card instantly, so the card flips between 850 and 925, causing huge fluctuations in GPU load. You can overcome this by putting the power slider to 16% and above, but this doesn't completely alleviate the problem, as you now have a card that gets incredibly hot and eats power. Changing the voltage also doesn't work, as every piece of software only alters the base voltage; the boost voltage stays at 1.25v. In the end I've flashed a 7970 BIOS (couldn't find a 7950 BIOS for my card that would stick) and I've now got a card with lower voltage, outperforming the boost BIOS in every way. Now that I've got the card performing like I want, it's excellent and I'd recommend it to anyone; just avoid the boost versions. I have no idea what AMD were thinking when they put that BIOS out.
|
# ? Dec 15, 2012 21:54 |
|
Been following this thread since my last post in it and, good lord, the last few pages basically read like "save a headache and buy Nvidia". I want to build a computer by the end of next year and really wish I could go for the underdog and build a full AMD system, since I don't want a company monopoly either, but the research I've been reading for the past month or so makes a system like that less than ideal.
|
# ? Dec 15, 2012 23:26 |
|
Charles Martel posted:Been following this thread since my last post in it and, good lord, the last few pages basically read like "save a headache and buy Nvidia". I've had a lot of AMD systems and they've been great, but currently it's not worthwhile to buy one of their new CPUs as they're just outperformed by similarly priced Intel CPUs. The latest generation of video cards has a few issues as you've seen, but they'll likely be sorted out. I tend to buy video cards based on the best price vs. performance I can get among the mid-level cards I usually buy. This has changed whether I buy ATI/AMD or Nvidia over the years, but I think it's the most sound way to make a purchase rather than buying based on brand name. Both companies have had hits and misses in their releases, and at any point one is usually a little bit ahead if you're looking at price vs. performance. Reliability often comes down to the third party manufacturer that actually assembles the card.
|
# ? Dec 15, 2012 23:32 |
|
Well, to be fair, my 7850s are doing the job very nicely, not counting the polygon corruption issues. And on that particular note, I went ahead and tried the newest 12.11 beta11 drivers (I was at beta4), and they pretty much entirely get rid of the problem. I think I noticed a flicker or two in Skyrim, but it could easily have been my imagination -- nothing like the psychedelicfest from my video capture. The periodic high latency issue (or whatever you could call it) mentioned in that TechReport article has been acknowledged by AMD, and hopefully it shouldn't be too long before they address it. The only other issue I've had is the occasional dud driver release that actually worsens performance in some games, but that happened when I had nVidias as well. The negative issues just happen to be more apparent in these discussions.
|
# ? Dec 15, 2012 23:36 |
|
Yeah, AMD is still a winner for me. My 7950 is doing nicely, handling anything I throw at it. My only complaint being a lack of voltage control, which is more Gigabyte's fault than anything. I'll just flash to a 7970 bios and go from there. If it is still locked at least I have more voltage to play with (1.175v compared to 1.090v) and avoid the boost bios insanity.
|
# ? Dec 15, 2012 23:49 |
|
I never did find the fault that made the fans on my MSI 7850 spin up when the card idled (PITA), but I fitted an Accelero S1 Rev. 2 with a ~600rpm fan and it's inaudible, with better temps too. Just in case anyone was wondering if those old coolers fit 7850s: YMMV, but it fitted a Twin Frozr IV 7850. I had to replace one of the included ramsinks with a low-profile one from the bits box, all fine. Impressed by this old cooler; used it as it was cheap and was more or less the biggest that would fit in this ITX box. All crammed in nicely. GRINDCORE MEGGIDO fucked around with this message at 00:56 on Dec 16, 2012 |
# ? Dec 16, 2012 00:37 |
|
So I was just reading this article on high framerate (HFR) filmmaking and The Hobbit; there's hubbub that 48 FPS showings of the movie are triggering Uncanny Valley effects in a good number of people by being effectively faster than the brain's Conscious Moment Per Second rate - about 40 at rest state, 80 to 100 during intense activity. While the eye can distinguish motion at a faster rate, again varying by person but on average 66 Hz, apparently that 40 Moments per Second rate is killing the suspension of disbelief, because 48 FPS film no longer looks sufficiently different from reality for a lot of folks.

It makes me wonder: Why doesn't this happen with video games? Or will it, in the future? For many people, the difference between 30 FPS and 60 FPS in gaming is not only incredibly noticeable, but it's incredibly desirable, even if we have no problem with 30/25/24 FPS video. What distinguishes these situations? Is it the stylized aesthetics or other cues for non-reality, so that the uncanny valley isn't approached for other reasons? Or does the interactivity make a game sufficiently "real" in terms of our brain function that the imagery itself has different standards for uncanniness? Is it simply a remaining enormous gulf between real-time generated imagery and photoreality? Or does the higher framerate reduce latency between action and reaction in a way that reduces unreality more than the increased framerate increases it by itself? Or are all video game players hyperspergs who don't find Japanese robotics incredibly weird?

A lot of TVs with temporal interpolation - like, upscaling the framerate to 60 or 120 FPS from 30p or 60i content - look great to some people and awful to others. Yet I'm not sure those categories map or correlate with video gaming - I know I love a solid 60 FPS in games, but motion interpolated video has a kind of hyperreal queasiness to me, like the difference between a 24p film and a 60i football game taken five steps too far.
Brains man. What gives with brains? Factory Factory fucked around with this message at 01:22 on Dec 16, 2012 |
# ? Dec 16, 2012 01:19 |
|
I think when we talk about video games, there is active input: when we do something, we expect an instant reaction. When framerates are lower, there is a perceived slowness. I think we tolerate lower frame rates on movies/TV/etc. because it's not interactive; everything syncs well with what we're watching. At the very least, that is my perspective on it. GF wants to go watch The Hobbit in the high-FPS theater, so I'll deal and have fun with it regardless.
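The perceived-slowness point can be put in rough numbers. A quick sketch: the "two frame times" worst case is my simplifying assumption (up to one frame waiting for the input to be sampled, one more to render and display it); real pipelines add buffering and display latency on top.

```python
# Back-of-the-envelope frame timing: how long each frame lasts, and a
# crude worst-case input-to-screen delay of roughly two frame times.
def frame_time_ms(fps):
    """Duration of a single frame in milliseconds."""
    return 1000.0 / fps

for fps in (24, 30, 60):
    ft = frame_time_ms(fps)
    print(f"{fps:>2} fps: {ft:4.1f} ms/frame, ~{2 * ft:5.1f} ms worst-case input lag")
```

Even under this crude model, 30 fps puts you around 67 ms between pressing a button and seeing the result, while 60 fps halves that, which is one reason the difference is so much more obvious with a controller in hand than on a screen you're passively watching.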
|
# ? Dec 16, 2012 01:29 |
|
Charles Martel posted:Been following this thread since my last post in it and, good lord, the last few pages basically read like "save a headache and buy Nvidia". I've had very small issues with AMD cards over the past 3-4 years, and they were mostly resolved very quickly. I'm currently running a HD6990 and it's perfectly fine and performs exceptionally well in almost every game I own. While AMD does get a lot of flak, their cards are still quite good. The frame latency issue is a bit of a poo poo problem with the 7000 series though, which hopefully gets resolved soon.
|
# ? Dec 16, 2012 03:42 |
|
Alright, so I got over my voltage lock barrier on my Gigabyte 7950 by flashing an older BIOS. However, changing voltage in Afterburner produces a different result in both HWiNFO64 and GPU-Z. I have Afterburner set to do 1100/1350 @1.2v, but HWiNFO64 still reports the card's stock 1.090v, while GPU-Z reports 1.175v. It seems wherever I put the voltage in Afterburner, GPU-Z reports 0.025v less. I know the voltage change has to be working, because I just played Call of Pripyat for 20 minutes on those settings, whereas before I could only change the core clock to 1000 and either not launch at all or hang at the main menu with a driver crash upon exiting. How do I know what my real voltage is? Edit: And now Afterburner refuses to let my GPU go into idle clock mode. Nothing running, still at 1100/1350/1.2v Endymion FRS MK1 fucked around with this message at 07:39 on Dec 16, 2012 |
# ? Dec 16, 2012 07:28 |
|
Factory Factory posted:Is it simply a remaining enormous gulf between real-time generated imagery and photoreality? This is a large part of it, mostly because videogames still don't have realistic motion blur, which is a huge part of why 24 FPS works as well as it does in film but looks like a jerky mess in games.
|
# ? Dec 16, 2012 07:44 |
|
Factory Factory posted:So I was just reading this article on high framerate (HFR) filmmaking and The Hobbit; there's hubbub that 48 FPS showings of the movie are triggering Uncanny Valley effects in a good number of people by being effectively faster than the brain's Conscious Moment Per Second rate - about 40 at rest state, 80 to 100 during intense activity. While the eye can distinguish motion at a faster rate, again varying by person but on average 66 Hz, apparently that 40 Moments per Second rate is killing the suspension of disbelief, because 48 FPS film no longer looks sufficiently different from reality for a lot of folks. Maybe this will finally stop the "But the human eye can't see faster than 24 FPS anyways!" crap that I hear on a daily basis. Probably not, but one can hope.
|
# ? Dec 16, 2012 08:43 |
|
As somebody who has actively tried to keep an even-handed approach to recommendations, I have to admit it is getting harder to say that ATI's driver issues are just bad press taken too far. It seems like for about a year, year and a half now, nVidia's had one showstopper issue: the 560 Ti and Battlefield 3. I don't even want to list the showstoppers ATI cards have experienced in the same time frame. Apart from that, nVidia's cards have had one or two substantial driver-based game rendering smoothness improvements (e.g. Skyrim, which saw a double-whammy that wrought over 50% greater overall performance from Fermi and Kepler hardware, due to initial driver difficulty with the game that the driver team later resolved), but for the most part they don't even have to fluff a narrative; their story has been uneventful and good. Whereas now, if I want to convince someone who is skeptical of ATI's driver quality, there's too much stuff to explain away. I personally think it has to be part of the overall financial situation AMD unluckily finds itself in, forcing an already somewhat reactive driver update process (due to reduced access to in-development titles) to be even more out in the open about playing catch-up and having significant launch issues. Part of what has made nVidia's high-performance lineup so apparently solid is that it's all derived from one chip with extremely similar layouts. Any time you can get that degree of homogeneity in hardware, it's going to be smoother sailing than the alternative. I conjecture that a big problem is that, combined with limited resources, ATI has a much greater diversity of products with unique hardware configurations in broad usage across price categories and in diverse setups, requiring significantly more attention and work on the drivers side of things... where their capability is suffering from the sickness of the company of which they are just one part, so it's one thing after another at the worst times for 'em.
nVidia's had its fair share of driver problems in the past, but as someone who doesn't want to see one of the (surprisingly narrowly, but...) still profitable parts of AMD suffer, it really sucks that the prevalent idea that their drivers are problematic or just outright broken with many new games has been pretty accurate lately.
|
# ? Dec 16, 2012 09:32 |
|
sethsez posted:This is a large part of it, mostly because videogames still don't have realistic motion blur, which is a huge part of why 24 FPS works as well as it does in film but looks like a jerky mess in games. Ding ding ding, we have a winner. Motion blur is either completely missing or looks nothing like the real thing in all games I've ever seen. Lower frame rates just make games look jerky since there's nothing smoothing them out. If you want uncanny valley in games then Heavy Rain might be a contender, along with those nVidia Dawn demos (not games, but I'm going to consider all real-time rendering).
|
# ? Dec 16, 2012 17:11 |
|
Additionally, camera panning in games can be rapid, while it's generally slow in film. It's much easier to notice low framerates with a fast camera pan, or fast-moving objects.
|
# ? Dec 16, 2012 17:14 |
|
Longinus00 posted:Ding ding ding, we have a winner. Motion blur is either completely missing or looks nothing like the real thing in all games I've ever seen. Lower frame rates just make games look jerky since there's nothing smoothing it out. I don't know much about film/graphics/human perception, but it seems that these are all pretty loving basic and fundamental things to be discovering this late. I recall reading in the spring, I think, about when those first clips of the Hobbit movie were shown at some film festival at the higher FPS rate. It seemed to have caused quite a stir. Why? Hasn't stuff like "FPS vs. latency" for games and "48fps vs. 24fps" for film been known/researched for years?
|
# ? Dec 16, 2012 17:20 |
|
I'm running 2x 6950s on 3820x1920 resolution. Performance is marginal in some games, and vsync will not work. I get tons of tearing. I finally convinced myself that waiting for the next gen of cards is folly. Should I buy 2 680s or a 690? Performance and price appear similar. What other things should I look for? Even the AnandTech benchmarks for sound and temp are similar. AnandTech doesn't have crossfire 6950s in their 2012 benchmarks, but comparing a 6950 to a 680 shows a doubling in FPS. Which specific model/brand should I buy? I use HDMI audio. My current card has 4 DPs, and 2 DVIs that can also be HDMI with an adapter. I use DPs for the monitors and HDMI for audio. The nvidia cards have 3x DVI and 1x miniDP. The only way that would work is 2x DVI monitors, 1x DP monitor, 1x DVI-HDMI for audio. Will this work? Dominoes fucked around with this message at 18:46 on Dec 16, 2012 |
# ? Dec 16, 2012 17:33 |
|
Local Resident posted:Why? Hasn't stuff like "FPS v latency" for games and "48fps v 24fps" for film been known/researched for years?
|
# ? Dec 16, 2012 18:10 |
|
Factory Factory posted:It makes me wonder: Why doesn't this happen with video games? Or will it, in the future? For many people, the difference between 30 FPS and a 60 FPS in gaming is not only incredibly noticeable, but it's incredibly desirable, even if we have no problem with 30/25/24 FPS video. What distinguishes these situations?

There's a lot more movement information per frame in a 24hz movie than in a 60hz game. The movie has been sampled from light at 24hz intervals and each one of those samples has accumulated light for the duration of the exposure time, where movement has been happening continuously. Subtle movements faster than 1/24th of a second still show because of this.

A frame from a game is a rendering of an instant in time without motion. Games don't have motion; they produce the illusion by teleporting objects over sufficiently small distances that it's not too distracting. Unlike a movie they aren't a sampling of a continuum, they're truly discrete, and hiding this through brute force would mean rendering at frequencies in the high hundreds.

Some games approximate motion blur. Most only do it for camera movement, but some do it for everything, and these do look better at lower framerates. Crysis for example does motion blur both on objects and relative camera movement when at max settings. It's still only an approximation; some real-life visual effects happen precisely because you see a sampling of a continuum, and these can't be simulated merely by blending discrete frames.

If we had super-duper computational power it wouldn't be an issue. You could produce nearly identical motion characteristics to a movie by updating the screen at 24hz while rendering the game internally at >1000hz and then averaging groups of frames together as appropriate for the exposure time. This is [in effect] how CG effects are able to blend with live action scenes so effectively. There's a bunch of other issues too.
But that's the big one as far as smoothness of motion goes, and I've gone on long enough.
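The "render internally at >1000hz and average" idea can be sketched in a few lines. This is a toy illustration, not anyone's actual renderer: the 1-D scanline, the moving dot, and all the numbers are made up for the example; real engines approximate the same effect far more cheaply with velocity buffers.

```python
# Toy brute-force motion blur: render the scene many times within one
# displayed frame's interval, then average the renders, the way a film
# frame accumulates light while the shutter is open.

WIDTH = 64        # pixels in the toy 1-D scanline
SUBFRAMES = 40    # internal renders per displayed frame

def render_subframe(t):
    """Render the scene at instant t in [0, 1): a single lit pixel."""
    frame = [0.0] * WIDTH
    frame[int(t * WIDTH) % WIDTH] = 1.0
    return frame

def average(frames):
    """Blend sub-frames into one displayed frame (a box-filter shutter)."""
    n = len(frames)
    return [sum(col) / n for col in zip(*frames)]

# A discrete game-style frame: one instant, one sharp pixel.
sharp = render_subframe(0.0)

# An "exposed" frame: the dot crosses a quarter of the scanline while
# the shutter is open, so its energy smears across many pixels.
blurred = average([render_subframe(0.25 * i / SUBFRAMES) for i in range(SUBFRAMES)])

print("lit pixels, sharp:  ", sum(1 for p in sharp if p > 0))
print("lit pixels, blurred:", sum(1 for p in blurred if p > 0))
```

The discrete frame lights one pixel at full brightness; the averaged frame spreads the same total energy across the dot's whole path, which is exactly the smear a camera records and games mostly lack.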
|
# ? Dec 16, 2012 19:25 |
|
I've seen Quake 3 videos where they got the engine to crank out a 300 fps video file, which is then downsampled to a 60 fps video with motion blur. It looks better than normal motion blur, but it still doesn't look natural.
|
# ? Dec 16, 2012 21:43 |
|
My wife's machine has a 7850 and while it runs games absolutely fine, for some reason Chrome's built-in Flash always causes a TDR; Firefox is fine with hardware acceleration turned off, though. Everything else works perfectly: no corruption of any kind in games, and it gives nice performance. We're past the point where we could RMA now, as we originally thought it was just a Chrome/Flash issue, but later versions (and drivers) didn't fix the problem and no one else with a similar card seems to have this issue. Makes me wish we'd gone with Nvidia, but it could just be we were hit with the one faulty card out of thousands. That said, it runs everything perfectly other than the bizarre Flash issue, but next time I upgrade I'll probably go with Nvidia, while someone who had a bad 680 will probably swear they're a Radeon person for life from now on.
|
# ? Dec 17, 2012 00:35 |
|
What the hell is a TDR?
|
# ? Dec 17, 2012 03:40 |
|
Local Resident posted:I don't know much about film/graphics/human perception, but it seems that these are all pretty loving basic and fundamental things to be discovering this late. I recall reading in the spring, I think, when those first clips of the Hobbit movie or whatever were shown at some movie festival at the higher FPS rate. It seemed to have caused quite a stir. Most of the people complaining are idiots and all I can determine is that they hate change. In the past any fast panning in 24 fps films has been rather noticeable to me. I watched The Hobbit in 3D, 48 fps, and Dolby Atmos at the weekend. It was noticeable as a higher frame rate than I'm used to in film, but I got used to it after about 5 minutes. Perhaps the preview was actually too short for people to adjust, because most of the moaning was about the frame rate. My experience by the end of the film is that I'd rather see more films at 48 fps or higher.
|
# ? Dec 17, 2012 04:39 |
|
Jan posted:What the hell is a TDR? Timeout Detection and Recovery: when Windows detects that the GPU has stopped responding, it resets the display driver, usually with a "display driver has stopped responding and has recovered" message.
|
# ? Dec 17, 2012 08:29 |
|
Lord Dekks posted:My wife's machine has a 7850 and while it runs games absolutely fine, for some reason chromes built in flash always causes a TDR, Firefox is fine with hardware acceleration turned off though. Everything else works perfectly, no corruption of any kind in games, gives nice performance. I knew someone who would get TDRs non-stop with his nvidia card, and it was driving him crazy until he finally figured out it was because he was streaming audio over HDMI. Once he turned that off and moved to using the sound card, the problem disappeared.
|
# ? Dec 17, 2012 17:29 |