|
I love how they sold adaptive v-sync as something new - surely it's nothing more than a framerate cap, unless I'm misunderstanding it.
|
# ¿ Sep 3, 2012 20:58 |
|
Factory Factory posted:With lowered detail settings and AA in some titles, sure. Triple screen and S3D is a ridiculous workload, though. If you want the "full details, AA, solid 60" experience, then you have to talk quad SLI. Also, I'd be looking at 7970 GEs for the 3GB VRAM, or 680 4GBs, for sure. Not that I would recommend this, because of the crazy requirements and heat, but it is what it is.
|
# ¿ Sep 13, 2012 17:15 |
|
spasticColon posted:I got a 7850 for my main rig back in April and one for my HDTV rig just last week, and both cards overclock to 1GHz without issue, so that probably gets them close to 7870 speeds I would think. I have a Sapphire 7850 that has coil whine like a motherfucker. Does yours make any noise when you put it under load?
|
# ¿ Sep 14, 2012 22:50 |
|
Endymion FRS MK1 posted:This is a rumor, but supposedly leaked specs of the Radeon 8870 and 8850 Pretty sure the 7870 is faster than the 6970, but I get your point: the mid-range Radeon targets the high-end performance of the previous generation. Still, the 8000 series already? Seems like it's time to clean up the branding mess the 7000 series became, with all the different sub-versions of cards. Things could be getting very interesting.
|
# ¿ Sep 16, 2012 22:44 |
|
wipeout posted:Sapphire Trixx is good, no low limit if the card can be controlled by it. Yeah, there are definitely benchmarks out there showing 7850s clocked high enough to exceed a stock 7870's speed, or at least come very close to it. HalloKitty fucked around with this message at 09:04 on Sep 17, 2012 |
# ¿ Sep 17, 2012 09:02 |
|
Wozbo posted:Isn't this not true with at least the latest Nvidia generation because they all had to follow a certain baseline spec to the letter? I'm getting a Gigabyte 670 (with the 3 fans) because it looks to perform pretty awesomely. I thought it was the exact opposite with the newest NVIDIA generation: you can't really tell what clocks the cards will boost to, so there's more room for variation than ever before. But I could be way off with that thought. HalloKitty fucked around with this message at 18:36 on Sep 17, 2012 |
# ¿ Sep 17, 2012 18:32 |
|
Yeah, I've always been a little disappointed in XFX. I had a 4890 and noticed it was an extremely cut-down version of the reference card (no Volterra digital VRMs, and what was there ran uncooled), and it did not overclock one bit. That said, your anecdote is a clusterfuck beyond belief. Why they couldn't send out a BIOS update is anyone's guess. Incompetence, most likely.
|
# ¿ Oct 12, 2012 17:36 |
|
Charles Martel posted:Wow, what an OP. I didn't see this thread before, and it would be better suited for the question I posted in the parts-picking megathread earlier: They have completely different architectures, but those details aren't important for a general overview. There are performance differences, of course, varying across the ranges, but this isn't the thread for posting a pile of graphs - try AnandTech Bench or some other reputable site. No point repeating all the game performance differences here, but in the low-to-mid range of card prices AMD is a good bet right now, and at the top end most would say NVIDIA has the edge - though AMD recently boosted the clocks of its top-end cards, which helps in some situations. In a very basic sense, the difference is that NVIDIA exclusively has CUDA and hardware-accelerated PhysX, but if you're into compute that runs under OpenCL, the newest Radeon generation is in general faster than the current NVIDIA cards at the same price. HalloKitty fucked around with this message at 13:32 on Nov 9, 2012 |
# ¿ Nov 9, 2012 13:28 |
|
2GB is a healthy enough amount, but how on earth did they require 1GB for the OS? Isn't the Xbox 360 OS limited to 32MB?
|
# ¿ Nov 25, 2012 21:27 |
|
Joink posted:I find those tech demos completely inferior to something like this https://www.youtube.com/watch?v=6lTJrmGzpEI in every way. The actual exes: Panic Room (64KiB), Elevated (4KiB). HalloKitty fucked around with this message at 22:56 on Dec 7, 2012 |
# ¿ Dec 7, 2012 20:30 |
|
bull3964 posted:I can't even pretend to follow the Radeon line recently. Easily more logical than the NVIDIA line, which is saying something, because neither naming system is great. At least the Radeon line tends to have fewer rebrands, whereas NVIDIA often throws in a lot of old chips with new names. AMD also culled the extra letters and crap after the model number, for the most part.
|
# ¿ Dec 19, 2012 18:46 |
|
Grim Up North posted:Radeon 4K 2870. I guess at the moment the cards could also be interpreted as the 27xxx series, but instead of 2 they put HD. Radeon 27970. Ah yes, the numbers, the numbers!
|
# ¿ Dec 20, 2012 10:49 |
|
78°C isn't a temperature to worry about on a graphics card. They are just power-hungry, hot beasts. Enjoy the performance.
|
# ¿ Dec 20, 2012 11:23 |
|
DrSunshine posted:It came with a Pentium D 3.0 GHz processor, when I got it from the used computer store. I also plugged in more RAM, and a better power supply. Leave the side off until you get a newer machine. Looks like the CPU shroud will keep the CPU cool, and the side being off will help the GPU a lot.
|
# ¿ Dec 31, 2012 17:56 |
|
unpronounceable posted:You know the random frame time spikes that the AMD cards were having? Well, the new (and as-yet-unreleased) beta drivers help smooth them out a lot. In the Tech Report article, they tested a 7950 with new and old drivers against a 660 Ti in Skyrim, Borderlands 2, and Guild Wars 2. In all games, there was a marked improvement with the new drivers. Wow, damn, they've really improved the frame latencies in just one driver revision. I guess they can fix the problem after all - excellent!
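For anyone wondering what "improved frame latencies" actually measures: reviews like Tech Report's look at the worst frames rather than the average FPS, because a handful of long frames reads as stutter even when the average looks healthy. A rough stdlib-Python sketch of that kind of analysis - the frame times below are invented purely for illustration:

```python
# Sketch of a Tech Report-style frame-time analysis.
# These frame times are made up for illustration only.

def avg_fps(frame_times_ms):
    """Average FPS over the run: frames divided by total time."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def percentile(frame_times_ms, pct):
    """Frame time (ms) below which pct% of frames fall."""
    ordered = sorted(frame_times_ms)
    idx = min(len(ordered) - 1, int(len(ordered) * pct / 100))
    return ordered[idx]

smooth = [16.7] * 99        # 99 frames at a steady ~60 fps
spiky = smooth + [120.0]    # ...plus one nasty 120 ms spike

print(round(avg_fps(spiky), 1))  # average FPS barely notices the spike
print(percentile(spiky, 99))     # the 99th percentile is dominated by it
```

The point: one 120 ms frame only pulls the average from ~60 down to ~56 fps, but it owns the 99th percentile, which is why a card can "win" the FPS chart and still feel worse.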
|
# ¿ Jan 17, 2013 19:22 |
|
Endymion FRS MK1 posted:I don't know if you're running Windows 8, but according to the release notes, well... Their CPUs suck, the GPU division is actually not bad, and they need to make sure they're making every effort to fix shit, which is vastly preferable to going under. So I imagine so.
|
# ¿ Jan 18, 2013 22:15 |
|
Alereon posted:News is spreading that VGLeaks has posted what they claim to be final Xbox 720 specs, featuring a CPU with eight 1.6GHz Jaguar cores (the descendant of the Bobcat cores used in the E-series low-power APUs) and Radeon HD 8770 graphics. I'm rather skeptical of this because it seems like giving up on per-thread CPU performance and relying totally on many slow cores is a proven-wrong approach, but we shall see. Similar rumors are spreading about the Playstation 4, including that it is a fully-integrated APU based on the Radeon HD 7870. A 7870 wins a matchup against an 8770, but by how much will depend on clockspeeds, power, and efficiency. The 7870 has 67% more shaders and up to twice the memory bandwidth, but we'll have to see what the actual deployed configuration is. I'd be skeptical too. Eight weak cores? This sounds like the exact opposite of what you'd want in a games console. How will backwards compatibility be handled? .. and so on.
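As a sanity check on that 67% figure: the 7870 (Pitcairn XT) really does have 1280 stream processors, while 768 for the 8770 comes from the same leak, so treat that number as an assumption:

```python
# Sanity-checking the "67% more shaders" claim from the leaked specs.
# 1280 is the real 7870 (Pitcairn XT) count; 768 is only the rumoured 8770 count.
hd7870_shaders = 1280
hd8770_shaders_rumoured = 768

extra = (hd7870_shaders / hd8770_shaders_rumoured - 1) * 100
print(f"{extra:.0f}% more shaders")
```

So the quoted percentage is internally consistent with the leak, whatever the leak itself is worth.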
|
# ¿ Jan 22, 2013 00:43 |
|
Alereon posted:Digital Foundry also says their "trusted sources" confirm the PS4 and Xbox 720 are Jaguar-based, but I have no idea how legit they are. If it's true, they'd better have implemented amazing Turbo, or I can't see this going well. I just figured it would be foolish in the extreme, given the massive install base of 360 games. Xbox to 360 wasn't so crucial, as the Xbox reached fewer people in the long run. Tezzeract posted:It'll be hard for Sony to do backward compatibility because they're weaning off of Cell processors. Microsoft could have an easier time. But yeah streaming is one way to handle backward compatibility. Thing is, we've had the 360/PS3 generation for so long that people still think it's "current", even though we're talking about machines from 2005. But people are still wowed by the graphics. Indeed, GTAV is soon to hit these very ancient machines. No matter how amazing the newer consoles' graphics are, I think we've gotten past the point where people think the 3D is primitive and cheesy (see 3D PlayStation games with their warped geometry and grainy everything), and people will probably still think 360/PS3 graphics of the last couple of years are acceptable, so they'd probably like to play their games for a bit longer without having multiple machines. It's not like going from SNES to N64. But I could be wrong; consumers may lap any old thing up. I was personally hoping for BC because 360s are notoriously unreliable (I have two: one died, was repaired, then died again, and the other died out of warranty), and I don't want to spend money on an obsolete console in a shinier box with a smaller-process chipset just to sit alongside a new Xbox. The old leaked document did suggest it would have a 360 subsystem in the box, but it seems a bit naïve to believe it will - clearly the costs would mount up quickly. I guess we just wait for E3. HalloKitty fucked around with this message at 19:01 on Jan 22, 2013 |
# ¿ Jan 22, 2013 18:51 |
|
TheRationalRedditor posted:That may be true but not in this case. The 7850 is a no-frills kinda thing so nearly all of them ever produced have that ATi reference single-fan leafblower. Nah, I've bought two 7850s (Sapphire and MSI) and both had their own custom coolers. They weren't special, amazingly expensive edition cards or anything, either. Both are extremely quiet, even under load. (Although the Sapphire had godawful coil whine, but that's a different issue.)
|
# ¿ Feb 11, 2013 18:08 |
|
Fanelien posted:So I have a 2 card SLI 570 1280mb set at the moment, I just upgraded to a surround setup and I am noticing some absolutely insane temps in some games like Euro Truck Simulator 2 (80°C+) and World of Tanks (95°C) when run in surround resolutions, even with AA off. It's frightened me off turning on surround in Borderlands 2 etc because I didn't expect cards that basically never got above 75°C to shoot for the moon on temps when exposed to a surround setup. I have a massive amount of air flowing through the case from two 180mm fans in the bottom but I just can't keep the temps under control, the only good thing I guess is that the cards haven't started to throttle yet. Other than going to water is there anything I can do to decrease the heat? Instead of wasting time on fans and water while still fighting high temps and power consumption, just get a single new high-end card and eBay the two 570s.
|
# ¿ Mar 2, 2013 14:22 |
|
coffeetable posted:Because once they've released the perfectly-tuned recording of the tech demo, releasing the demo itself can only make their tech look worse. I don't know - I tried Epic Citadel on my Android handsets, and it looked good on both my RAZR MAXX HD and my old Desire HD. Of course, that's without any real elements that make a game.. a game, but it looked pretty and ran smoothly.
|
# ¿ Mar 30, 2013 12:06 |
|
Tacier posted:Meanwhile I bought a 1gb 7850 thinking it'd be obsolete before I had any real need for 2+gb of video memory. How wrong I was... I'd be annoyed if I'd bought a card with "1 gram-bit" as well... but it is certainly unique. Sorry for being a cunt.
|
# ¿ Apr 3, 2013 00:44 |
|
Jan posted:So, with Haswell coming out, have there been any recent news on its integrated GPU? Last I heard, it was surprisingly competitive (compared to previous offerings at least), and I've been holding off building my HTPC to see if I could also make it a HTPC that can half-decently play some games. I literally wouldn't bother. The best version is only rumoured to be competitive with the GeForce GT 650M. If you're looking to make a half-decent gaming HTPC, you have a higher power envelope to play with, and I'd simply get the best possible passive card - or honestly, one of many mid-range cards today; any custom ones with multiple fans are usually extremely quiet, or can have fan ramps sorted out in something like MSI Afterburner. Edit: also, if your use case was genuine (OK gaming, no add-in card), AMD has already offered that solution with their APUs. Double edit: in summary, Haswell's GPU is of interest mainly in expansion-limited situations - laptops. HalloKitty fucked around with this message at 22:45 on Apr 7, 2013 |
# ¿ Apr 7, 2013 22:43 |
|
Rashomon posted:Does anyone have any info on when the tiny ASUS 670 will be released? I am planning a mini Haswell build but might pick up a graphics card earlier since this year is looking pretty boring for graphics cards and I'd like to play Bioshock Infinite on pretty settings. Anecdotally, two people I know said they cranked their settings to Ultra, presumably at 1920x1080, and it wasn't too heavy on their machines - midrange cards, older CPUs.
|
# ¿ Apr 9, 2013 11:32 |
|
spasticColon posted:How the hell is this game coming out on the PS3 and 360 and not the PS4 and the next Xbox. There's no reason the game would need all those specs while still being available on PS3 and 360 with their 7-year-old GPUs. What makes me laugh are the feeble RAM requirements! Start putting out 64-bit executables and use up RAM! RAM is cheap; Titans are not.
|
# ¿ Apr 18, 2013 13:16 |
|
Endymion FRS MK1 posted:Honestly for AA I just force SMAA through RadeonPro. Allows me to stretch my 7950's legs without wasting it on more expensive forms of AA. Shit, why didn't I know about this already? You can even set up a toggle to see the performance impact and the difference in how it looks. Thanks! Edit: damn, I've been wasting my time on everything else. This is amazing, and with no discernible impact. Christ. I've used MLAA before but it mightily ballsed up text and sharp areas of contrast such as the HUD. This just.. doesn't seem to do that. HalloKitty fucked around with this message at 23:12 on Apr 18, 2013 |
# ¿ Apr 18, 2013 23:00 |
|
Klyith posted:It's much more GPU intensive than MLAA. SMAA is not a purely post-process filter Yeah, I had a skim of the whitepaper, but I've tried it with RadeonPro, set a key to toggle it on and off, and according to the built-in FPS counter, the framerate doesn't seem to drop at all. I understand it's more intensive, but I guess a half-decent card blows through it in no time. Edit: this is on a 6950 unlocked to a 6970.
|
# ¿ Apr 19, 2013 08:53 |
|
kuddles posted:The main reason I never use the SMAA Injector is because you can't use D3DOverrider at the same time with it. Does anyone know if RadeonPro allows you to turn on Triple Buffering and SMAA at the same time? Currently on Nvidia but that might change with a new build in the fall. Well, right now I have SMAA set to "Ultra" quality and Dynamic V-sync at the same time, which forces triple buffering.
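Quick aside on why forcing triple buffering matters: with plain double-buffered v-sync, a GPU that just misses a refresh has to sit idle until the next one, so the frame rate snaps down to an integer divisor of the refresh rate; a third buffer lets it keep rendering. A toy model of the idea, with purely illustrative numbers:

```python
import math

# Toy model: 60 Hz display, GPU takes 20 ms per frame (illustrative numbers).
REFRESH_MS = 1000 / 60
RENDER_MS = 20.0

def fps_double_buffered():
    """With two buffers and v-sync, the GPU stalls until the next refresh,
    so each frame occupies a whole number of refresh intervals."""
    intervals = math.ceil(RENDER_MS / REFRESH_MS)  # 20 ms -> 2 refreshes
    return 1000 / (intervals * REFRESH_MS)

def fps_triple_buffered():
    """With a third buffer the GPU never stalls: it renders flat out and the
    display grabs the newest completed frame at each refresh."""
    return 1000 / RENDER_MS

print(round(fps_double_buffered(), 1))  # snaps down to 30 fps
print(fps_triple_buffered())            # runs at the full 50 fps
```

So a card that "just misses" 60 fps drops to 30 with double buffering, while triple buffering lets it deliver whatever it can actually manage.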
|
# ¿ Apr 19, 2013 18:55 |
|
roadhead posted:http://arstechnica.com/information-technology/2013/04/amds-heterogeneous-uniform-memory-access-coming-this-year-in-kaveri/ This is not only an excellent read, but quickly explains why AMD is such a natural choice for the next generation consoles. Sure, they had decent enough x86 cores and GPUs tossed onto a single package before, but this gives it some special spice that will be lacking on a general PC platform, without going down crazy routes like the Cell in the PS3. vv Ah yes, it makes AMD a more attractive choice on the desktop too HalloKitty fucked around with this message at 22:05 on Apr 30, 2013 |
# ¿ Apr 30, 2013 21:53 |
|
subx posted:Is there anywhere I can see how an ~$100 current card would compare to my 5850? I'd rather wait for the next gen to do a larger upgrade, but was hoping I could get a small boost until then. Unless you get a deal on a used card somewhere, no. The 5850 is somewhat more powerful than current $100 cards.
|
# ¿ May 7, 2013 17:07 |
|
Alereon posted:Seasonic is a good brand but every brand has a low-end line, and below 600W or so component and design quality generally also drops off a lot. Not exactly - they have a 560W X-Series which is built every bit as well as its bigger brothers, and the fanless Platinum models from 400-520W.
|
# ¿ May 24, 2013 19:28 |
|
Miffler posted:Interesting. PSU's are a much more interesting component than I ever imagined prior to reading SH/SC. If you're interested in PSUs: http://www.jonnyguru.com/ OklahomaWolf's reviews are not only entertaining but thorough. The forums are also a wealth of PSU-minded people.
|
# ¿ May 24, 2013 21:56 |
|
Agreed posted:Factory Factory, my circumstances have changed pretty drastically as a result of the severe worsening of my preexisting injury, and the surgery isn't going to change it back. Suddenly a graphics card is really just a graphics card. Titan doesn't mean much for me anymore, I think. What's that, you want a 780 that's faster than Titan for less cost than Titan? http://hexus.net/tech/reviews/graphics/55725-evga-geforce-gtx-780-superclocked-acx/
|
# ¿ May 26, 2013 15:35 |
|
HalloKitty posted:What's that, you want a 780 that's faster than Titan for less cost than Titan? http://hexus.net/tech/reviews/graphics/55725-evga-geforce-gtx-780-superclocked-acx/ Just to quote myself to point this out, read another review http://www.techpowerup.com/reviews/EVGA/GTX_780_SC_ACX_Cooler/ Do you think the conclusion was different? No. It really is a card that's faster than Titan for gaming, and with that cooler, quieter under load than a stock 780 or Titan. Damn. The wonder card.
|
# ¿ May 27, 2013 12:11 |
|
Un-l337-Pork posted:Am I missing something about these charts (http://www.tomshardware.com/reviews/geforce-gtx-780-performance-review,3516-28.html) or does the 7970GE smoke the poo poo out of Titan for OpenCL performance? There's a reason all bitcoin mining rigs sport AMD GPUs.
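On the mining point: Bitcoin hashing is just double SHA-256, which boils down to 32-bit integer rotates, shifts, XORs, and adds - exactly the ops those OpenCL charts reward. A sketch of the kind of operation a mining kernel executes billions of times, written in Python purely for illustration (real miners run this as OpenCL):

```python
# The core integer ops of SHA-256 that a mining kernel hammers.
# Shown in Python for illustration; real miners run these as OpenCL kernels.
MASK32 = 0xFFFFFFFF

def rotr(x, n):
    """Rotate a 32-bit word right by n bits."""
    return ((x >> n) | (x << (32 - n))) & MASK32

def big_sigma0(x):
    """One of SHA-256's mixing functions: three rotates and two XORs."""
    return rotr(x, 2) ^ rotr(x, 13) ^ rotr(x, 22)

print(hex(rotr(0x80000000, 1)))   # -> 0x40000000
print(hex(big_sigma0(0x12345678)))
```

A full SHA-256 round is 64 iterations of functions like this over a message block; it's pure integer throughput, which is why the GPUs that win OpenCL integer benchmarks also win at mining.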
|
# ¿ May 27, 2013 15:03 |
|
Hollis Brown posted:How important is the amount of VRAM above 2GB? I am considering a GTX 770, and Gigabyte has a 4GB version that is out of stock everywhere for ~$460 (http://tinyurl.com/n373wp3), where the other options are 2GB and ~$400 (http://www.newegg.com/Product/Product.aspx?Item=N82E16814127741). I occasionally do multi-monitor gaming (2 1920x1200) is the extra 2GB of VRAM worth waiting or paying for? You can get a Radeon 7970 GHz Edition for $400 or even a little less after rebates. It's also generally faster. Of course, it has 3GiB. HalloKitty fucked around with this message at 21:03 on Jun 3, 2013 |
# ¿ Jun 3, 2013 21:00 |
|
Agreed posted:It's not fair, but who wants to buy a high end graphics card and get some crappy stuttery mess? But they aren't. Only CrossFire setups, or cards that are CrossFire on a single board, have this problem. Single GPU, the frame times are perfectly fine.
|
# ¿ Jun 5, 2013 12:09 |
|
FetalDave posted:That's not true. I've had issues with stuttery graphics from my 3870, to my 5870, and now to my 7970 (see my post above/youtube video). That's almost 5? years of different generations of AMD/ATI cards with the same issue. Fair enough - I just remember looking at the FCAT analysis and it didn't show anything wrong with the single-card setups, as far as I recall. Edit: yeah, it's visible in the video.
|
# ¿ Jun 5, 2013 20:51 |
|
LCD Deathpanel posted:RadeonPro's pretty cool, although it'd be nice if you didn't have to configure everything per-application. You.. don't. Just click "Global" at the top of the window.
|
# ¿ Jun 6, 2013 12:48 |
|
InstantInfidel posted:edit: Nevermind on the PS4, only the Xbox One will support 4K games. No games will run in 4K unless they are extremely simplistic ones. The PS4 has a far faster GPU than the Xbox One, for a start. But we're still talking about the PS4 being somewhere between a 7850 and a 7870, and the Xbox One being something more like a 7790++
|
# ¿ Jun 9, 2013 22:07 |