|
I like the difference in the reviews. Overclock3D: AMD pulled a rabbit out of a hat. Guru3D: meh, look for cheap 290/Xs.
|
# ? Jun 18, 2015 16:07 |
|
|
Yeah, both are true in a way. The 390X is a high-performance card positioned well for that performance. The thing is, so is the 290X, and that one is priced for the performance of a throttling, loud mess of a card. Sucks for AMD that they had so much inventory in the channel they couldn't do the rebrand when the 980 came out to dampen the enthusiasm for it. Oh, and MoraleHazard, in case you missed my edit on the last post: if you want to check out a noise comparison video, computerbase.de has one if you search for "Elf Nvidia GeForce GTX 970 im Vergleich", go down to the nav bar, click Seite 1/6, and then Lautstärke & Temperatur. (Sorry, at work and posting from phone.) xthetenth fucked around with this message at 16:17 on Jun 18, 2015 |
# ? Jun 18, 2015 16:14 |
|
I really don't get this: these benchmarks/comparisons are a good bit faster than the 290/290X reviews at launch. Is that all thermal throttling? Is my MSI 290 that overclocks to 1100 actually comparable to a 970?
|
# ? Jun 18, 2015 16:17 |
|
Slightly worse at 1080p, slightly better above.
|
# ? Jun 18, 2015 16:19 |
|
Twerk from Home posted:I really don't get this, these benchmarks / comparison are a good bit faster than the 290 / 290X reviews at launch. Is that all thermal throttling? Is my MSI 290 that overclocks to 1100 actually comparable to a 970? Higher frame rate than a reference-clocked 970, probably evenish with, say, a G1 Gaming 970. The 970 will most likely have slightly smoother frames. For reference, hardware.fr got a Tri-X 290 that held a steady 1000 MHz; their reference one went as low as 845 and probably averaged around 900. The Tri-X is 5% slower than a 970 G1 Gaming and 4% faster than a reference-clocked 970. You've got up to 10% more clocks than that Tri-X. Numbers are for 1440p; the 290 probably does a bit worse in comparison at 1080p. AMD shot themselves in the foot with the 290(X) blower. It literally never got above 947 MHz in that test. xthetenth fucked around with this message at 16:26 on Jun 18, 2015 |
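The back-of-envelope math above can be sketched out. The clock figures and the 1440p relative numbers are from the post; treating performance as scaling linearly with core clock is my own (optimistic) assumption, since real scaling is usually somewhat worse:

```python
# Rough estimate of an 1100 MHz MSI 290 against the two 970s,
# assuming performance scales linearly with core clock
# (an optimistic simplification).

TRIX_CLOCK = 1000       # MHz, hardware.fr's Tri-X 290 held this steady
USER_CLOCK = 1100       # MHz, the overclocked MSI 290 in question

# hardware.fr's 1440p results for that Tri-X 290:
REL_VS_REF_970 = 1.04   # 4% faster than a reference-clocked 970
REL_VS_G1_970 = 0.95    # 5% slower than a 970 G1 Gaming

scale = USER_CLOCK / TRIX_CLOCK  # up to 10% more clock

print(f"vs reference 970: ~{REL_VS_REF_970 * scale:.2f}x")  # ~1.14x
print(f"vs G1 Gaming 970: ~{REL_VS_G1_970 * scale:.2f}x")
```

So on these assumptions the overclocked 290 lands a bit ahead of even a G1 Gaming 970 at 1440p, which matches the "slightly worse at 1080p, slightly better above" summary earlier in the thread.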
# ? Jun 18, 2015 16:23 |
|
THE DOG HOUSE posted:They use it for other games. Very weird to me they'd have it off for BF4 since that was the headline Mantle game. Not using Mantle sets a level playing field between the two cards. It takes the "but they're using Mantle" card away from the negative nancies. Also, Mantle is being deprecated, so....
|
# ? Jun 18, 2015 16:48 |
|
SwissArmyDruid posted:Not using Mantle sets a level playing field between the two cards. Takes away the "but they're using Mantle" card away from the negative nancies. Also, Mantle is being deprecated, so.... But they did use Mantle in the Sniper Elite test, so it feels more like cherry-picking whichever render path works best for them in each test.
|
# ? Jun 18, 2015 17:06 |
|
Running Mantle in Sniper Elite 3 but not BF4 makes me think they wanted to avoid using it if they still won outright, but weren't willing to turn it off if that meant a loss (also, Sniper Elite is newer, so it may be a real mess for them in DX). Sniper Elite also has many more options for Mantle to hide in. Could also be that they wanted one game where they win with it off and one where they show it making a huge difference. I'd say cherry-picking, but I'm pretty sure BF4 is faster in Mantle. xthetenth fucked around with this message at 17:14 on Jun 18, 2015 |
# ? Jun 18, 2015 17:11 |
|
xthetenth posted:I'd say cherry picking but I'm pretty sure bf4 is faster in mantle. There are a lot of anecdotes out there about Frostbite's Mantle path performing worse than the DX11 path on fast CPUs; nobody seems to know what causes it. Maybe AMD got bit by that issue.
|
# ? Jun 18, 2015 17:32 |
|
repiv posted:There's a lot of anecdotes out there about Frostbites Mantle path performing worse than the DX11 path on fast CPUs, nobody seems to know what causes it. Maybe AMD got bit by that issue That might be the case, and it would make sense, since cherry-picking and using every advantage is expected in manufacturer benchmarks. I wouldn't know about Mantle render path issues; I've never owned an AMD card.
|
# ? Jun 18, 2015 18:09 |
|
R9 390X reviews went up a few hours ago. Doesn't seem to be worth the $50 premium at all, although it uses 30 fewer watts than the equivalent 290X according to one benchmark. http://www.guru3d.com/articles-pages/msi-radeon-r9-390x-gaming-8g-oc-review,1.html
|
# ? Jun 18, 2015 18:20 |
|
eggyolk posted:R9 390X reviews went up a few hours ago. Doesn't seem to be worth the $50 premium at all, although it uses 30 less watts than the equivalent 290X according to one benchmark. Pretty much. Aftermarket 290s are priced according to the performance of much worse reference 290s, and it looks like the end of the fire sale on Hawaii cards is coming soon. Heck, I think some 290s are getting price bumps because the 390 reviews are good coverage of what they can really do. Also, the AnandTech review makes the 8 GB 390 change make sense. Apparently that lets them move from 2 Gb chips to the more common 4 Gb chips, and it also helps with the memory speed boost because the new chips have better timings. xthetenth fucked around with this message at 18:43 on Jun 18, 2015 |
# ? Jun 18, 2015 18:33 |
|
It's almost like AMD threw a bunch of engineers at a problem or two
|
# ? Jun 18, 2015 18:44 |
|
I'm surprised that the 300 series lacks bundled games.
|
# ? Jun 18, 2015 19:25 |
|
So I was a little bit confused by the announcement and forgot to ask here, but are they just putting the Fury X2 inside their little VR box, or will they be selling it separately? A Fury X2 card would be pretty badass.
|
# ? Jun 18, 2015 19:51 |
|
They're selling it but not until the end of the year.
|
# ? Jun 18, 2015 19:52 |
|
Second question I've been trying to figure out: is the 295X2 just two 290Xs on one card, or is there some difference that makes it a "295"?
|
# ? Jun 18, 2015 19:55 |
|
SwissArmyDruid posted:Not using Mantle sets a level playing field between the two cards. Takes away the "but they're using Mantle" card away from the negative nancies. Also, Mantle is being deprecated, so.... They used it for other games in that same benchmark. Or at least another game; I worded it poorly. Plus, I get what you're saying, but it'd be just as valid to use Mantle for the benchmark, since that's what would really be used IRL, and I'd care more about the Mantle results than an improbable comparison. I'm glad the 300 series is getting somewhat positive words, but I'm taking a very glass-half-empty stance on all of it. Can't get over the 3-year-old release, 10 months late... but anyway, here's hoping the Fury stuff is great.
|
# ? Jun 18, 2015 19:55 |
|
xthetenth posted:I'm pretty sure that one's excellent and one of the best 970. Basically EVGA half-assed their cooler on release, so their stuff is a mixed bag, with some early coolers having one heatpipe not actually functional and so on and getting beat by everyone else's coolers. I'd wait for someone who stayed paying attention to 970 coolers to be sure of it though. Thanks for the site. Google translated the German for me. I'm going to go with the MSI 970. I just want to measure dimensions and whatnot and check the connectors before ordering. But I have a tall tower, so I don't think there will be issues.
|
# ? Jun 18, 2015 20:02 |
|
Bleh Maestro posted:Second question I've been trying to figure out: is the 295x2 just a 290x x2 or is it some difference that makes it "295" Dual 290X card, no special sauce other than a stock cooler that keeps it from throttling all the time. ^^^ Glad I could help. Computerbase is great for those roundups with video of the coolers in action. xthetenth fucked around with this message at 20:04 on Jun 18, 2015 |
# ? Jun 18, 2015 20:02 |
|
So much for hoping that AMD can actually beat Nvidia cleanly. Does anyone know where the gently caress WCCFT is getting their info from? They've got specs on the Fury Nano saying it has a 175W TDP? http://wccftech.com/amd-radeon-r9-nano-detailed-features-fiji-gpu-175w-tdp-single-8pin-connector-sff-design-faster-hawaii/
|
# ? Jun 18, 2015 20:03 |
|
Bleh Maestro posted:Second question I've been trying to figure out: is the 295x2 just a 290x x2 or is it some difference that makes it "295" It does have slightly higher clocks, but there's nothing significant. That liquid cooler does an excellent job of keeping the noise levels down and the cores cool, though.
|
# ? Jun 18, 2015 20:03 |
|
SwissArmyDruid posted:So much for hoping that AMD can actually beat Nvidia cleanly. I think that's based on the plugs. There was a little quip during the press release about being twice as efficient per watt (presumably compared to their very own 390X), so it's not a stretch to imagine. Not sure there is any hard data, though. edit: should have looked at the link, but it says it right there. penus penus penus fucked around with this message at 20:24 on Jun 18, 2015 |
# ? Jun 18, 2015 20:18 |
|
The thought of a Fury X2 in an ITX machine is tempting... I really need to stop being a lazy rear end and sell the cards I have lying around.
|
# ? Jun 18, 2015 20:50 |
|
THE DOG HOUSE posted:I think that's based on the plugs. There was a little quip during the press release about being twice as efficient per watt (presumably compared to their very own 390x) so its not a stretch to imagine. Not sure there is any hard data though Yeah, it was based on the PCIe spec with regards to the connectors on the card, if I recall there was a slide from AMD about it.
|
# ? Jun 18, 2015 20:53 |
|
Huh, seems Grenada isn't a straight Hawaii rebrand after all. Witcher 3 with HairWorks on runs dramatically faster on the 390X than the 290X, so at the very least they've beefed up the tessellation engine. http://www.hardocp.com/article/2015/06/18/msi_r9_390x_gaming_8g_video_card_review/3 EDIT: Actually, maybe not; they're using different drivers for the 290X and 390X. Maybe it's just optimization, or a Witcher 3 profile that forces the tessellation scale down by default. repiv fucked around with this message at 21:58 on Jun 18, 2015 |
# ? Jun 18, 2015 21:24 |
|
Ak Gara posted:I'm assuming you can also overclock the G1's factory OC? I wonder if G1's binning makes them better at customer OCing or would they be no better or worse due to silicon lottery? I have a G1 980 that sits at about 1550 boost. I initially thought it wasn't stable there, but it turned out something was up with the ShadowPlay streaming service (nvstream or something like that), even when not using ShadowPlay, that was crashing the gently caress out of games. I manually disabled that service and it's been smooth sailing ever since. Anecdata go!
|
# ? Jun 18, 2015 21:39 |
|
Gwaihir posted:I have a G1-980 that sits at about 1550 boost. I initially thought it wasn't stable there, but it turned out something was up with the shadowplay streaming service (nvstream or something like that), even when not using shadowplay, that was crashing the gently caress out of games. I manually disabled that service and it's been smooth sailing ever since. Shadowplay is rough on OC in my experience. Same with streaming. If I'm on the edge of stability, turning on Shadowplay will crash it for me.
|
# ? Jun 18, 2015 21:46 |
|
repiv posted:Huh, seems Grenada isn't a straight Hawaii rebrand after all. Witcher 3 with Hairworks on runs dramatically faster on 390X than 290X, so at the very least they've beefed up the tessellation engine. Huh, so maybe it's new TSMC silicon they've been holding onto for the 300 release? At least the 390 has a legitimate performance advantage over the 290.
|
# ? Jun 18, 2015 22:04 |
|
It annoys the gently caress out of me that HardOCP doesn't use the same settings to compare graphics cards. edit: ohh, further down they do do that.
|
# ? Jun 18, 2015 22:15 |
|
Don Lapre posted:It annoys the gently caress out of me that hardocp doesn't use the same settings to compare graphics cards. And there are dozens of other sites that do use the same settings, who cares?
|
# ? Jun 18, 2015 22:16 |
|
It'd be super cool if they did a test of an 8GB 290X against a 390X and a BIOS-flashed 8GB 290X. And honestly, this is the same [H] that managed to conclude that a 295X2 and 980 SLI beating a 980 Ti without framerate dips meant that 6 GB was a "MINIMUM" for 4K gaming, so not the highest standards there.
|
# ? Jun 18, 2015 22:38 |
|
xthetenth posted:And honestly this is the same [H] that managed to conclude that a 295X and 980 SLI beating a 980 Ti without framerate dips meant that 6 GB was a "MINIMUM" for 4K gaming, so not the highest standards there. You do have to wonder how they'd notice that discrepancy then not verify it in a vacuum using Tessmark
|
# ? Jun 18, 2015 22:43 |
|
Don Lapre posted:It annoys the gently caress out of me that hardocp doesn't use the same settings to compare graphics cards. Actually, it's kind of nice to get a different perspective, to see performance at settings you might actually use. They do also have 'apples to apples' comparisons if you want the same settings, yeah.
|
# ? Jun 18, 2015 22:44 |
|
repiv posted:You do have to wonder how they'd notice that discrepancy then not verify it in a vacuum using Tessmark Yeah, that's really promising data; it's like the guy talking about how he didn't follow up on the 970 giving different performance from the 980 in a synthetic. That sort of thing is huge: having an explanation for something nobody else does. On the subject of things I really want to see regarding the 290/390s: 290X CF vs 290X 8 GB CF. Really see what it takes to make them diverge.
|
# ? Jun 18, 2015 22:46 |
|
Gigabyte G1 980 Ti vs Gigabyte G1 970s in SLI. Happy to not deal with SLI for a while, but I dunno, I kinda expected better. The 980 Ti only hit 62 C during the test, though. KS fucked around with this message at 03:21 on Jun 19, 2015 |
# ? Jun 19, 2015 03:17 |
|
KS posted:Gigabyte G1 980 Ti vs Gigabyte G1 970s in SLI. You expected a 980 Ti to, like, crush SLI 970s in a synthetic? I think it's crazy that a card as good as the 970 in SLI is almost dead even with a 980 Ti, a card that "only" costs twice as much. That's an even performance:dollar ratio all the way to the top, unheard of. And in reality the 980 Ti is better in so many practical ways that you could even say the 980 Ti is actually a better value choice than the 970. Also a little jealous you snagged that card. I haven't seen any aftermarket 980 Tis in stock, period. penus penus penus fucked around with this message at 03:30 on Jun 19, 2015 |
# ? Jun 19, 2015 03:21 |
|
It really is a better choice. First, it's a single card, so you don't have any of the frame time issues with SLI (if you thought the 290X was bad, wait till you see two-card graphs). Second, no profiles. Third, you don't have to worry about VRAM size or that weird partition. Fourth, you only have to pay for the aftermarket model once if you want a fancy high-performance cooler and quieter part. Fifth is much better power use. All that for the same performance and similar price? There is no dual-card solution that makes sense that isn't multiple 980 Tis or maybe four Titan Xs, including the 295X2, even when it does higher framerates than the 980 Ti and already has a very good cooler.
|
# ? Jun 19, 2015 03:33 |
|
THE DOG HOUSE posted:That's like an even performance:dollar ratio to the top, unheard of. And in reality the 980ti is better in so many practical ways that you could even say the 980ti is actually a better value choice than the 970. Had not thought of it that way. Actually, that's pretty nuts that you can get nearly equal FPS/$ but not need an SLI motherboard, beefy PSU, etc. Had I not already bought those things, I would have come out well ahead.
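To put rough numbers on that perf-per-dollar point: the prices and the SLI scaling factor below are my own assumptions for mid-2015, not figures from the thread, so treat this as a sketch of the argument rather than a measurement:

```python
# Perf-per-dollar sketch: two 970s in SLI vs one 980 Ti.
# PRICE_970 and SLI_SCALING are assumptions; SLI rarely delivers
# a full 2x, so an average scaling factor is used instead.

PRICE_970 = 330.0    # USD each, typical aftermarket 970 (assumed)
PRICE_980TI = 650.0  # USD, 980 Ti launch MSRP

SLI_SCALING = 0.85   # assumed fraction of ideal 2x scaling

perf_970_sli = 2 * SLI_SCALING  # ~1.7x a single 970
perf_980ti = 1.7                # roughly even with 970 SLI, per the posts

perf_per_dollar_sli = perf_970_sli / (2 * PRICE_970)
perf_per_dollar_ti = perf_980ti / PRICE_980TI

# Nearly identical perf/$ -- and the single card also skips the
# SLI motherboard, bigger PSU, profiles, and frame-pacing issues.
ratio = perf_per_dollar_ti / perf_per_dollar_sli
print(f"980 Ti perf/$ vs 970 SLI: {ratio:.3f}x")  # ~1.015x
```

On these assumptions the two options land within a couple percent of each other in FPS/$, which is the unusual part: the flagship normally costs a steep premium per frame.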
|
# ? Jun 19, 2015 03:33 |
|
|
KS posted:Had not thought of it that way. Actually, that's pretty nuts that you can get nearly equal FPS/$, but not need an SLI motherboard, beefy PSU, etc. Had I not already bought those things I would have come out well ahead. Plus it is just plain better, you will see.
|
# ? Jun 19, 2015 03:41 |