|
movax posted:As a product of the late 90s, the NV3 packed some bitchin' features. (Also does anyone else remember the Riva128 vs Voodoo2 flamewars? So epic. So dumb.)
|
# ¿ May 15, 2012 20:25 |
|
|
|
movax posted:Hm, for some reason I thought I read that the Riva 128 was still limited to 16-bit color depth in 3D operations (I think Wiki says that), but if that isn't the case I will totally update that post! Man, nostalgia. I never had good computers when I was growing up despite being desperately obsessed with them. No c64, no amiga, a 286 long into the 486 and early pentium days, etc. Getting that college computer was like the apotheosis of all my boyhood computer lust. Boten Anna posted:Since this was the 90s, by both of them do you mean you would actually open up your computer and swap them out? Or since this was the 90s, do you mean they both just sat on the uniform AGP/PCI slots and you just plugged in the one you wanted to use at the time?
|
# ¿ May 15, 2012 21:25 |
|
So if we're taking a trip down memory lane to the GPU technology of the 90s, let's make a quick stop to mention S3. At the same time as nvidia was making the Riva 128, S3 made the ViRGE, which was groundbreaking in its own way. The ViRGE was another combined 2d/3d accelerator, but it was a bit... weak at that 3d thing. In fact it was so bad that as soon as you asked it to do anything more complex than render a single unfiltered texture on a poly, its performance dropped off a cliff. The average cpu of the time could play quake better in software rendering than a ViRGE could in hardware; it was mockingly called a "video de-accelerator" on usenet and the early internet. However, the silver lining to the dark cloud of S3 failure was their mighty effort to find *some* way to make their followup, the Savage 3D, perform faster. Nothing inspires engineers like having a bunch of nerds mock your work for years. So they invented two things that are still in use today: 1-cycle trilinear filtering and a little thing called S3TC, which Microsoft licensed to use in DirectX as DXTC: DirectX Texture Compression. It wasn't enough at all. The Savage 3D had average performance but couldn't be made cheaply due to horrendous yields. (Wikipedia sez: "S3's yield problems forced Hercules to hand pick usable chips from the silicon wafers.") The Savage 4 fixed the yield problems but didn't get much faster, and played cheapo second fiddle to the TNT2 and ATI Rage. The Savage2000 was kinda better, with hardware performance nearly equal to the new GeForce, but drivers so atrocious that most games were unplayable. S3 quit the video chip market and sold everything to VIA, then merged with Diamond Multimedia, with whom they had worked together on a cool little product: the Rio PMP, one of the first portable mp3 players. I had a PMP500 (a successor model), with 64mb of internal flash plus a 64mb MMC card.
It's impossible to explain how baller it was to walk around with that when everyone else still had big, skipping CD walkmen. Later they made the first time-shifting digital TV recorder, the ReplayTV. Both of these products got them sued, by the RIAA and MPAA respectively, and they wasted all their time and money fighting two huge lawsuits that eventually killed them even though they won. Which is why the media companies to this day still sue any technology they hate and fear, and why Apple rules the world instead of S3.
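For the curious: the S3TC scheme mentioned above still ships in every GPU today as DXT1/BC1. Each 4x4 pixel block is stored in 8 bytes (two RGB565 endpoint colors plus sixteen 2-bit palette indices), a fixed 8:1 ratio against 32-bit RGBA. A toy decoder, going by the publicly documented block layout rather than anything S3-internal:

```python
def rgb565_to_rgb(c):
    """Expand a 16-bit RGB565 value to an (r, g, b) tuple of 8-bit channels."""
    r = (c >> 11) & 0x1F
    g = (c >> 5) & 0x3F
    b = c & 0x1F
    # Replicate the high bits into the low bits to fill the 8-bit range.
    return ((r << 3) | (r >> 2), (g << 2) | (g >> 4), (b << 3) | (b >> 2))

def decode_dxt1_block(block):
    """Decode one 8-byte DXT1/S3TC block into a list of 16 RGB tuples."""
    assert len(block) == 8
    c0 = block[0] | (block[1] << 8)   # two base colors, little-endian RGB565
    c1 = block[2] | (block[3] << 8)
    col0, col1 = rgb565_to_rgb(c0), rgb565_to_rgb(c1)
    if c0 > c1:  # 4-color mode: two interpolated colors between the endpoints
        palette = [col0, col1,
                   tuple((2 * a + b) // 3 for a, b in zip(col0, col1)),
                   tuple((a + 2 * b) // 3 for a, b in zip(col0, col1))]
    else:        # 3-color mode plus transparent black
        palette = [col0, col1,
                   tuple((a + b) // 2 for a, b in zip(col0, col1)),
                   (0, 0, 0)]
    bits = block[4] | (block[5] << 8) | (block[6] << 16) | (block[7] << 24)
    # 16 pixels, 2 bits each, row-major from the low bits up.
    return [palette[(bits >> (2 * i)) & 0b11] for i in range(16)]
```

Real decoding lives in the texture units; the clever part of the design is that any texel can be fetched with a single aligned 8-byte read.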
|
# ¿ May 15, 2012 23:56 |
|
HalloKitty posted:poo poo, why didn't I know about this already? You can even set up a toggle to see the performance impact and the difference in how it looks. Thanks! I think the weakness of MLAA/FXAA is that they're serving two masters. They're a good supplement to MSAA for getting rid of obvious aliasing from specular lighting or other things that MSAA doesn't touch. But they're also a low-cost AA for modern inexpensive graphics cards that have plenty of shader power but less memory bandwidth and raster power. As a result they're a bit blurrier than they'd otherwise need to be. And your issue with MLAA blurring text and such is because the game didn't have hints for the driver to not post-process those bits, either because it's old or made by people who didn't know how to do that. Most new-ish games don't have that issue. The loss of detail on regular textures still happens though. TheRationalRedditor posted:SweetFX has a stand-alone GUI that can create custom override profiles for every game independent of driver suites, I do believe. Nvidia Inspector is cool but it doesn't have SMAA. That has to be done via some type of injection because it's not implemented in drivers (yet). I just grabbed RadeonPro, and it's running a background service that I guess is doing the injection.
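To illustrate the "blurs everything contrasty" complaint: post-process AA works purely on the finished image. Here's a deliberately dumbed-down single-pass sketch; real FXAA/MLAA do directional edge searches and sub-pixel coverage estimation, so treat this as a cartoon of the idea, not either algorithm:

```python
def luma(px):
    """Perceptual luminance of an (r, g, b) pixel, 0-255 range."""
    r, g, b = px
    return 0.299 * r + 0.587 * g + 0.114 * b

def postprocess_aa(img, threshold=32.0):
    """Toy post-process AA: where a pixel's luma contrast against its
    horizontal neighbors exceeds a threshold, blend it toward them."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(h):
        for x in range(1, w - 1):
            l, c, r = luma(img[y][x - 1]), luma(img[y][x]), luma(img[y][x + 1])
            if max(abs(c - l), abs(c - r)) > threshold:
                # Weighted average: half the center pixel, half the neighbors.
                out[y][x] = tuple(
                    (2 * cc + lc + rc) // 4
                    for cc, lc, rc in zip(img[y][x], img[y][x - 1], img[y][x + 1]))
    return out
```

Since text and HUD elements are exactly the kind of high-contrast edges this detects, they get softened too, unless the game draws them after the AA pass or hints the driver to skip them.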
|
# ¿ Apr 19, 2013 03:17 |
|
kuddles posted:Huh. People keep saying this to me but when I try doing that, the screen still tears like crazy for me. To answer your original question: yes, RadeonPro has an option in its vsync controls to force triple buffering. So you can have that and SMAA at the same time.
|
# ¿ Apr 19, 2013 21:00 |
|
The Lord Bude posted:In what parallel universe do you need to do anything in particular to install beta graphics drivers? In seven years of owning my own PC running XP, then later vista and now windows 7 I have only ever installed beta drivers.
|
# ¿ Apr 22, 2013 04:12 |
|
Dogen posted:I was installing the new nvidia betas and the installer slideshow indicated a bundle is coming with Metro Last Light Metro is definitely an improvement for nvidia though. That poo poo-tier F2P credit was the main reason I bought my first AMD/ATI a few months ago, after a 10 year run of nvidias (not a fanboy thing, I just never happened to buy a video card in the generations that ATI had the upper hand).
|
# ¿ Apr 23, 2013 14:00 |
|
Space Racist posted:Nothing justifies the cost of a $200-$300 card like ~$135 worth of free, new AAA games. Of course, I'm curious how significantly this promo is eating into AMD's profit margins. Now that we know both the PS4 and MS consoles will use AMD chips, I think their strategy for the past year has been to keep their head just barely above water. Margins on the new console chips will be sweet gently caress all, but it improves their revenue and solves that fab capacity problem. They're still in a dicey position.
|
# ¿ Apr 24, 2013 06:35 |
|
I don't think that 1080p inherently needs more memory. Older video cards were completely capable of 1600x1200, which is approximately the same number of output pixels as 1080p. It's the high resolution textures, and multiple textures per surface, that are eating most of the memory. Plus lots of pixel shaders need memory buffers. Also, the trend in games toward open-world design and large, diverse levels means keeping a much larger working set in memory. e: Jago posted:Frame buffers and AA are still not free. Look at any benchmarks with a bunch of cards and watch them scale as resolution is increased and decrease. Framebuffers at this point have been so outstripped by memory growth that they might as well be free. Triple buffered 1080p 32bit + zbuffer is like 50 mb. If you're supersampling it balloons up, but even then the bandwidth requirements kill you first. Klyith fucked around with this message at 00:07 on Apr 26, 2013 |
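The back-of-envelope version of that framebuffer claim (exact totals depend on what you count: stencil bits, MSAA surfaces, extra render targets, so this is just a ballpark):

```python
def framebuffer_mb(width, height, color_buffers=3, bytes_per_pixel=4,
                   depth_bytes=4):
    """Rough swap chain footprint: N color buffers plus one depth/z buffer."""
    pixels = width * height
    total = pixels * (bytes_per_pixel * color_buffers + depth_bytes)
    return total / (1024 * 1024)

print(round(framebuffer_mb(1920, 1080), 1))  # triple-buffered 1080p: ~32 MB
print(round(framebuffer_mb(3840, 2160), 1))  # 4x supersampled: ~127 MB
```

Either way it lands in the tens of megabytes, which on a 2 GB card is indeed close to a rounding error next to textures.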
# ¿ Apr 25, 2013 23:32 |
|
Mad_Lion posted:Is it just that games have gotten more complicated, or what? Now huge amounts of the GPU work are higher-level programs that don't react well to being split across scan lines. When you split them up, every border is a division across which the GPUs have to pass data to stay synced. Which is a problem, because the fewer borders you have, the worse the load balancing gets; you end up trading sync overhead against uneven loads.
|
# ¿ Apr 27, 2013 00:35 |
|
e: ^^^ My understanding is that if you enable vsync it's relatively fine and always has been. I'd think the main benefit of a SLI/CF setup is so you can enable vsync and never drop below 60 frames. Mad_Lion posted:Anandtech's newest review of the 7990 makes it look pretty good vs. the 690. http://www.anandtech.com/show/6915/amd-radeon-hd-7990-review-7990-gets-official But this isn't the thread for upgrade questions / talk.
|
# ¿ Apr 27, 2013 05:00 |
|
dpbjinc posted:They probably don't have anything new to release yet. As was just mentioned, AMD's working on hUMA. NVIDIA's working on Project Denver, which is probably going to end up connected with their GPUs in some way. Neither are going to be ready for mainstream use before 2014. In other news, I just got a code for Blood Dragon from AMD, which is pretty awesome since I originally bought just the Tomb Raider and BS:I version. God knows when I'll end up playing it -- I'll have to sign up to Ubistore to get it, but that's not AMD's fault.
|
# ¿ May 2, 2013 05:45 |
|
Rigged Death Trap posted:Why would it be laughable? Unless "competing with eyefinity" is just the ability to put Windows apps on 3 monitors, in which case whatever. Plenty of office workers get multiple monitors so it's nice they won't have to spec a basic video card for their Dells, but that's not what Eyefinity is all about.
|
# ¿ May 2, 2013 15:38 |
|
Your issues sound very similar to a thing that happened to me about a year ago, except mine was with my nvidia card. At some point I installed new drivers over the top of old ones without fully uninstalling. Windows XP used to be ok with that, but 7 gets hosed up. Basically every time after that when I tried to update drivers, windows would dig out this old version from the depths of WinSXS and try to install its files instead. Driver cleaning programs would remove the active files, but not the backups from whatever buttcrack they were lodged in. Finally I had to remove any new drivers and let windows go back to the old ones, so I could find which exact version they were. Then I got that exact version from nvidia's driver archive, installed it, and used the uninstaller properly. That finally got rid of everything, and I could update again with no problems.
|
# ¿ May 5, 2013 02:08 |
|
Jan posted:Also, take this with a grain of salt since I haven't ever worked with WinSXS, but I'm pretty sure GPU drivers cannot be run from SXS. The kernel mode driver framework doesn't work in a way that allows running the same driver twice in parallel. Of course, with the amount of non-kernel level libraries that drivers come with nowadays, I wouldn't be surprised if that stuff ends up causing trouble with SXS copies. quote:All in all, what I do know of driver programming is painful enough that I wouldn't wish it on my worst enemy. Game engines are bad enough.
|
# ¿ May 5, 2013 21:44 |
|
TheRationalRedditor posted:IIRC those after-market GPU coolers all reviewed favorably well, the only drawbacks being price and sometime difficulty of install. One thing to try before spending on an aftermarket heatsink: look at your vsync settings. I was a bit annoyed that my new 7870, which has top fans rather than a blower, was making a lot more noise than my old GTX460. In particular it was spinning the fans on high in relatively simple games like Kerbal Space Program, where I'm not sealed into my headphones to block the noise. But I figured out that RadeonPro disabled vsync by default, so the card was running max speed to render 140 kerbals per second.
|
# ¿ May 7, 2013 13:14 |
|
subx posted:That's surprising. I haven't kept up with video cards much the past couple of years, but when I did after 2-3 years you could buy something quite a bit faster for ~100-150. The budget market is where I usually pointed people to when they are building a first gaming pc or whatever, so that's pretty disappointing that my 3 year old $350 card is still in that same range. (edit - not disappointing for me so much since it's been a fantastic card, but for people getting into the hobby) 1) The 58xx / 4xx generation was a pretty fantastic one. The two since then haven't exactly been stagnant, but the extended console cycle pushed some design attention from raw fps power to stuff like GPU compute, new AA methods, etc. 2) The $100-150 budget card bracket is always pretty crappy. Those cards take forever to catch up to the high performance cards from previous years, because they're built with a lot of compromises for price. At $100 a lot more of your money is going to fixed costs like the pcb & assembly, packaging, transport, etc. The $200-300 midrange is normally where you get the best value for money across the spread of price points, and you start entering diminishing returns after that. Pointing people at a $150 budget card for their first gaming pc is ok if they're really price conscious, a fairly casual gamer, or a WoW junkie. But someone who plays big AAA titles on a 360/PS3 will probably be disappointed.
|
# ¿ May 7, 2013 22:08 |
|
Fauxtool posted:I have a 7970 that gets as hot as 96C when playing mwo on a triple monitor set up. Most every other game stays below 80C. Is that safe? The fan only ramps up to 66% automatically when I wouldnt mind it going 100%. The noise is a non issue most of the day but I cant just leave it at 100% all the time. Can I set a more aggressive fan profile? As for whether you need it, temps in the 90s are hot but within acceptable limits for the silicon. They design GPUs to run hot. It might reduce the statistical life of the card a bit, but it's not like you're running it 24/7 mining bitcoins. I wouldn't worry about it too much personally. But it's pretty bizarre that the card isn't running the fans at 100% at that temperature. Some 7900s have really slack fan profiles that let the chip get pretty hot, especially ones with blower coolers because they're so loud. Not running full speed in the 90s is pretty unusual.
|
# ¿ May 12, 2013 03:16 |
|
Killer robot posted:I've had this happen too, and have assumed it to be a driver thing since it's seemed to come and go with driver installs.
|
# ¿ May 12, 2013 23:57 |
|
Urzza posted:I was browsing LGA 2011 motherboards, and while most of them have PCI-E 3.0, I haven't found any processors in that socket type that support 3.0. Is it just that I haven't found a processor that supports 3.0, does it not matter except for high end cards, or is it just all marketing on the part for the motherboard manufactures? But PCIe 3.0 doesn't matter for any video card, even the high end. It might just barely be measurable for a SLI setup. Video cards are not starved for bandwidth across the PCIe bus. It's been a consistent thing, all the way back to the AGP era, that you can run Card X on the bus from Generation X-2 and have less than 5% impact to performance. Buying a desktop LGA 2011 system right now would be a massive waste of money, what with Haswell right around the corner. Unless you really need a 6 core chip for some reason.
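Rough numbers behind that claim, using the well-known per-lane spec rates (real-world transfers top out a bit lower than these):

```python
# Usable bandwidth per lane in MB/s, after 8b/10b (gens 1-2) or
# 128b/130b (gen 3) encoding overhead.
PCIE_LANE_MBPS = {1: 250, 2: 500, 3: 985}

def slot_bandwidth_gbs(gen, lanes=16):
    """One-direction bandwidth of a PCIe slot in GB/s."""
    return PCIE_LANE_MBPS[gen] * lanes / 1000.0

for gen in (1, 2, 3):
    print(f"PCIe {gen}.0 x16: {slot_bandwidth_gbs(gen):.2f} GB/s")
```

Even a gen-2 x16 slot moves ~8 GB/s in each direction, and games mostly upload textures and geometry once and then render out of VRAM, which is why running a new card on a two-generation-old bus barely shows up in benchmarks.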
|
# ¿ May 18, 2013 08:11 |
|
Animal posted:The only way it could get damaged is if you have a crap PSU, like OCZ. A psu would also have to be impressively hosed up, not just cheap trash, to permanently damage components. It's hard to say why Stumpus's computer is running badly because "slow as poo poo" isn't very descriptive, but on a box you haven't used for a few years the BIOS battery might have run down and reset everything to safe defaults. Aside from that, it's probably just the OS being full of old crap; after a reinstall it would be snappy again.
|
# ¿ May 22, 2013 22:19 |
|
Endymion, it's probably fine. Even the bad post-OCZ PCP&C is only bad in comparison to what PCP&C used to represent. If you're worried, a backup PSU that sits on a shelf is nice to have around. PSUs can damage other components when they fail, but it's really not very likely. There are a number of failsafes to prevent that. I suspect a lot of anecdotal "psu killed my cpu / video card / whatever" stories are explained by people being idiots and not seeing how they hosed up their system. Srebrenica Surprise posted:I'd love to hear about the multitude of OCZ units being sold based on a Seasonic design. Animal posted:Why should anyone buy their brand when Seasonic is the complete opposite and has similar prices?
|
# ¿ May 23, 2013 04:09 |
|
NickelSkeleton posted:So I'm freaking out after my new Radeon 7950 HD stopped working properly. If you want to test it out, you could try using the overclocking panel of CCC to lower the GPU speed in 100mhz chunks and see if it can load a game without crashing. If it works at some lower clock speed, that's good evidence the card is defective. quote:another weird thing, when running with the catalyst 13.5 beta2 driver check out my GPU clock:
|
# ¿ May 23, 2013 04:24 |
|
Joink posted:On that tab it doesn't update the GPU clock speed, only shows its max setting. Under sensors it shows exactly whats going on with the card. For my card, its running at 300MHz when doing 2D. Zorro KingOfEngland posted:Has anyone tried out the new Geforce Experience thing nVidia just released? Being able to record the past 20 minutes of gameplay footage sounds really cool, but there's no way it comes without a performance hit. Still, it's a baller app. Being available on all Kepler based cards, not just the $650 fuckoff-expensive ones, is awesome. Finally a way to do high-quality recording that doesn't require a high-end computer.
|
# ¿ May 23, 2013 19:55 |
|
Alereon posted:It'll work, but that's a lower-end power supply and you're pushing it pretty hard, so expect higher noise levels and temperatures. Definitely upgrade power supplies if you consider overclocking. I don't know how a 60-70% load is pushing pretty hard, and I also can't see how a decent seasonic is a lower-end PSU.
|
# ¿ May 24, 2013 03:51 |
|
Last page someone asked about replacing PSUs after X years: I think every 3 years is probably being over-cautious, but they don't last forever. PSUs have large liquid electrolytic capacitors with a limited lifespan, and 6 years starts getting close if you're like many of us who run our desktops 24/7. Alereon posted:You want the load on the power supply to be 50% or lower, beyond that you get progressively worse noise levels, efficiency, and power delivery quality. On that Seasonic power supply, the fan is ramping up to noticeable levels at 60%, and 80% is about the point where the noise and heat output are excessive. That power supply isn't BAD, it'll keep supplying functional power up to its limit, but it was a low-cost power supply when it was new and is a bit behind the curve today. It was a great choice for a lower-draw system, the problem comes when you try to use that same cheap power supply for a system with a top-end videocard. I don't think there's any real reason to buy an 800w PSU for a 300w system. Even overclocking a CPU only adds a few tens of watts if you're doing normal, non-heroic OC. But I do think that if someone is going to spend 650 dollars on a graphics card, they shouldn't complain about buying a PSU that's over $100.
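For concreteness, the arithmetic behind the disagreement (the 300 W draw figure is illustrative, not measured from any particular system):

```python
def psu_load_pct(system_draw_w, psu_rating_w):
    """Percentage of the PSU's rated output that a given system draw uses."""
    return 100.0 * system_draw_w / psu_rating_w

# A ~300 W gaming load (CPU + top-end GPU, both busy) on common PSU sizes:
for rating in (450, 550, 650, 800):
    print(f"{rating} W PSU: {psu_load_pct(300, rating):.0f}% load")
```

By the 50%-or-lower rule you'd want 600 W and up; by the other view, a quality 450-550 W unit sitting at 55-65% load is well within spec, just a bit louder.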
|
# ¿ May 24, 2013 23:26 |
|
Shaocaholica posted:As long as you're not VRAM bound, a Titan or 780 should smoke a Quadro K5000 at GPGPU tasks given the number of cores on the latter right? No real GPGPU special sauce for the Quadro K5000? Zotix posted:Well my question is, does that seem like it's an issue with the card directly? Like if I purchase a 780 GTX, and lets say it's perfect out of the box, will I likely have high temps similar to the 580, or should they be normal? Checking your video card for dust is not some kind of rocket surgery. If you have one with a blower style cooler, there will be a couple obvious screws on the top side of the card that you can remove to pop off the plastic shroud. With the shroud off, you will probably see a gently caress-ton of dust blocking the airflow. Blowers are really susceptible to dust; in any case without filters they will eventually clog up. If it's a top-mounted fan (or dual fan), seeing the dust will be even easier, though you still may need to remove the shroud to really clean it well. It's less likely to be a thermal paste issue; the card isn't very old and they slather that stuff on pretty good at the factory.
|
# ¿ May 27, 2013 04:05 |
|
Don Lapre posted:770 is so god drat huge. Hopefully the 8 series has a new architecture and we can get a 870 in a smaller card. w00tazn posted:For next gen, I can imagine the target framebuffer resolution going up, but not much more than that. I doub't we'll be seeing any huge leaps in fidelity anytime soon as we haven't really seen anything that really pushes the boundaries on PCs today and doing so would just mean increased costs for developers who already operate on razor thin margins. But if you think next-gen games aren't going to have improved graphics, you are high. There are ways to use that power that don't cost exponentially more money; there are also players ready to spend more money at the start of a new cycle to establish a lead or new IP.
|
# ¿ May 31, 2013 05:54 |
|
Sidesaddle Cavalry posted:I'm graphics-effects-dumb, how much of that can be taken up by the various methods of anti-aliasing? And where does memory bus size come into all of that? I've read a few comments on various 770 reviews somewhat lamenting missing out on the 384-bit bandwidth that came with GK110 cards like the 780 and Titan. Does the blistering 7 Ghz memory clock on the 770 somewhat make up for it? Buying a >2gb card is mostly about the larger textures available in the big PC releases. Personally I'd say it's only a major requirement if you're buying an expensive card that you plan to keep for years. With a midrange 2gb card you might have to lower the game's texture setting a notch from ultra quality, meh. Not a huge deal, I'll keep going with cheaper cards more frequently myself. I'm sure by the time 2gb feels constricting I'll be ready for a new one.
|
# ¿ Jun 4, 2013 09:17 |
|
FetalDave posted:That's not true. I've had issues with stuttery graphics from my 3870, to my 5870, and now to my 7970 (see my post above/youtube video). That's almost 5? years of different generations of AMD/ATI cards with the same issue. --- AMD drivers are still 2nd best compared to nvidia, but as a long time nvidia user who just switched to AMD, I'll say they're not a reason to avoid all AMD cards these days. If you have two competing choices that are tied on price, performance, and pack-ins, the drivers are certainly a good tie-break for nvidia. If you don't give a poo poo about paying more, you could just go with nvidia all the time. The microstutter thing, for example, they've made good strides toward fixing. Nvidia didn't have the problem because they had already done work on it before it became general knowledge. If TR had started their new method of benchmarking frame delay instead of average FPS a year earlier, both would have had the stutters. Finally, Nvidia isn't perfect themselves. They haven't had a major fuckup for a while, but when they have it's often been the hardware, which IMHO is worse than bad drivers. Software can be fixed, but things like poo poo analog quality (back when people still had VGA monitors) or a bad heatsink standard resulting in cards that cooked themselves after 18 months are forever. Those examples are from a long time ago, but at the time there was plenty of complaining about it.
|
# ¿ Jun 5, 2013 23:58 |
|
FetalDave posted:This. Microstutter is only something that occurs with a dual GPU setup. I have a single GPU setup and it still happens. quote:Vsync actually makes it worse. On top of the jitteryness, there's now a distortion bar that runs horizontal and moves from the top of the screen to the bottom every 5 seconds. They may not be as good as nvidia, but I'm pretty sure they can catch a bug that obvious. I feel quite bad for them that in situations where poo poo is broken and their card happens to be present, people will just say "their drivers suck, get an nvidia card".
|
# ¿ Jun 6, 2013 04:29 |
|
Daysvala posted:Yeah I have one of those reference blowers that exhaust out the back. My case seems to have sufficient airflow, but I can't check to see if the blower is clogged because one of the screws that secures the blower to the card came improperly manufactured and I can't get a screwdriver head to catch. Blower style heatsinks are very prone to getting clogged by dust if you don't have a well-filtered case (or a very clean house).
|
# ¿ Jul 5, 2013 05:13 |
|
Jan posted:On a different subject, does anyone know of any issues with AMD, DisplayPort and turning off monitors? I sometimes get issues where if I turn off my DisplayPort connected monitor and leave it off for a fair amount of time but leave the computer running. When I come back and turn the monitor back on, the monitor doesn't detect any signal. Sometimes I can get it back by cycling the monitor power a few times, but I've also had one or two cases where the drivers freak out, cause a reboot and then Windows is in VGA mode with the device manager saying the card was disable because it was causing issues. (If you want manual control of turning off your screen, there's a great little utility called nircmd that does a whole bunch of useful little things, including power saving the monitor. It's easy to put a shortcut in quicklaunch or someplace to just instantly sleep the monitor whenever you leave the pc.) quote:MSI 7850 TwinFrozrs I saw lots of forum posts about dead twinfrozr fans when I googled about it.
|
# ¿ Jul 10, 2013 23:50 |
|
z06ck posted:I'm sorry you decided to fix/replace the fan on something that was still in warranty, cause it ain't now. Though if I had known then what I know now, I would have gone directly to a solution that ditched both stock fans. (Also, warranty violations, pshaw. If you're patient and meticulous, you can solve that just by being better at reassembling the thing than the guy who put it together in the first place.)
|
# ¿ Jul 11, 2013 12:58 |
|
The Lord Bude posted:You cannot game adequately at 1080p with a $100 gpu, so building a gaming pc with one in it is a pointless waste of money, edit VVVV Only if the sole purpose of his computer is playing games. Given how he's speccing the computer, the dude probably isn't playing games all the time. Klyith fucked around with this message at 15:11 on Jul 18, 2013 |
# ¿ Jul 18, 2013 12:27 |
|
quote:Oculus in 1080p / 1440p / 4k The other thing is that rendering at native resolution is not critical for the oculus. It's nice if you have the horsepower to do 90hz at very high rez, but the reason they're pushing to the highest PPI displays they can source is to minimize the screendoor / subpixel effects. Because of the distortions applied to the image, it doesn't have an exact pixel to pixel output even when you're running native. So don't run off to buy a super expensive gpu just because you think the oculus needs 4k type horsepower. (The oculus is gonna be expensive enough on its own.)
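The warp in question is a radial distortion applied as a final pass over the rendered frame, something like this (the k coefficients here are made-up illustrative values, not the actual Rift numbers):

```python
def barrel_distort(x, y, k1=0.22, k2=0.24):
    """Map a normalized screen coordinate (0,0 = lens center) outward by a
    radial polynomial, the classic lens pre-distortion shape."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

# The center is untouched, but a pixel near the edge samples from a
# non-integer source position, so the output never maps 1:1 onto the
# rendered pixels even at "native" resolution.
print(barrel_distort(0.0, 0.0))
print(barrel_distort(0.9, 0.0))
```

Because every displayed pixel is already a resampled blend of rendered pixels, rendering somewhat below or above panel resolution just shifts quality smoothly instead of hitting a sharp native/non-native cliff.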
|
# ¿ Jun 20, 2014 16:53 |
|
jm20 posted:How much more will AIBs charge for actual heatsinks on an RX480? Not much. Even though 150w is disappointing, it's not crazy by videocard standards; any of the completely ordinary 3 or 4 heatpipe coolers that the OEMs make by the thousands will work. And I can't think of any time when the first-wave cards with crappy blower coolers were cheaper; this isn't a new thing. I think it's an artifact of the extra step in the supply chain (the chipmaker handing whole cards to OEMs rather than the OEMs doing their own assembly). Only get one of these AMD blower units if you're planning to replace the heatsink with watercooling or something. xthetenth posted:CPUs did. Now there's just the canonical one big core architecture, and Intel iterates on it very slowly. Which is a good lesson for why being a "fan" of any of these companies is punch-yourself-in-the-dick stupid. The only thing that keeps them from giving you the shaft is a competitor. I'm already kinda annoyed at nvidia for flattening the performance/price curve into a straight line over their last couple generations, which they had the leeway to do based on AMD's weak results.
|
# ¿ Jun 30, 2016 01:03 |
|
NewFatMike posted:I'm definitely interested in DX 12 performance in the future. Hopefully it's not just because of rebrands that AMD cards have aged rather gracefully. DX12 and Vulkan, if that gets adoption by PC games, will maybe be an ace for them; it's certainly a place where they make up a lot of ground vs nvidia right now. But that could very well fade; it's quite possible that nvidia just hasn't put the same amount of work into DX12 yet because there are hardly any games that use it. I wouldn't buy a 480 just on that expectation if the 1060 ends up being better in current games (relative to $). xthetenth posted:The reason everyone's super mad is because it's using a ton of watts to do it which means that all the people who put too much money into their computer (most of the thread) are really worried that they won't be able to fit much performance into 300W for a top end card. I'm kinda disappointed by the power use because I like a quiet computer, even when playing a game if I can get it, and I was hoping the process shrink would fix the biggest flaw in the AMD lineup of the last few years. For Vega, they better hope to have either their process problems un-hosed or some miraculous performance magic pulled from HBM, because 300+ watt cards are a tough sell.
|
# ¿ Jun 30, 2016 05:21 |
|
Cream posted:I'm on a 7950, the price of the 480 is looking pretty good. Would it be worth the upgrade? Yes, but a) don't buy these first ones with the bad reference blowers, and b) wait until the price & reviews of the 1060 come out, which is only a bit over a week away. Siets posted:What is the difference between 144Hz and GSync? quote:I know that's the refresh rate, but can GSync also work at those refresh speeds? tl;dr = even without *sync, 50fps on a 144hz monitor will look slightly better than 50fps on 60hz quote:I thought the eye couldn't really tell a difference beyond 60Hz or so? The overall "framerate" of our vision seems to be 45-90hz. But eyes are not computer devices; they perceive certain things very quickly, and yet detail vision likely takes multiple passes through the brain. Biological systems are non-linear.
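A sketch of the tl;dr, simulating when each frame actually reaches the screen with plain vsync on (no G-Sync/FreeSync, and frame times assumed perfectly even, which real games aren't):

```python
import math

def visible_durations_ms(fps, refresh_hz, frames=8):
    """With vsync, a finished frame waits for the next refresh tick.
    Returns how long each frame stays on screen, in milliseconds."""
    frame_t, refresh_t = 1000.0 / fps, 1000.0 / refresh_hz
    durations, ready, shown = [], 0.0, 0.0
    for _ in range(frames):
        ready += frame_t                                  # frame finished rendering
        flip = math.ceil(ready / refresh_t) * refresh_t   # next refresh after that
        durations.append(flip - shown)
        shown = flip
    return durations

print([round(d, 1) for d in visible_durations_ms(50, 60)])   # lurches between 16.7 and 33.3
print([round(d, 1) for d in visible_durations_ms(50, 144)])  # steady ~20.8
```

At 50fps on a 60hz panel, frames alternate between one and two refreshes on screen, which is the judder you see; at 144hz the refresh ticks are fine-grained enough that every frame displays for almost exactly the same time.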
|
# ¿ Jun 30, 2016 16:32 |
|
|
|
Spatial posted:Human vision doesn't have a framerate. It's a continuous analog system. b) It's definitely not continuous. Flicker in rapidly rotating objects, persistence of vision, and a number of optical illusions work because there are discrete events. Nothing with nerve cells is continuous; neurons have a "tick rate" just like transistors, though different types run at different speeds. But it's not a CCD camera either. For one thing, the optic nerve doesn't have enough bandwidth to work like a CCD camera. A few years ago I read a really neat article about scientists trying to extract images from the optic nerve and decipher the biological data compression; the pictures they were able to reconstruct were mostly just outlines.
|
# ¿ Jun 30, 2016 17:27 |