|
Massasoit posted:I got the gigabyte g1 gaming because I stopped at local microcenter and they happened to have it in stock and I put it on "EcoMode" which I think underclocks it. Not sure it really does much. It also has an overclocking mode Bleh Maestro posted:Ign's daily deals page said there was an $80 off Gigabyte G1 1070 on sale today. It sold out but did anyone see this? Was it $80 off $429 or some higher price? The Gigabyte G1 is probably the worst AIB card, outside of blower cards.
|
# ? Jul 26, 2016 18:41 |
|
snuff posted:The Gigabyte G1 is probably the worst AIB card, outside of blower cards. How so?
|
# ? Jul 26, 2016 18:44 |
|
Massasoit posted:How so? Loud at load compared to the competition.
|
# ? Jul 26, 2016 18:46 |
|
snuff posted:Loud at load compared to the competition. I can agree with this. My EVGA FTW blows it out of the water in terms of full load cooler noise.
|
# ? Jul 26, 2016 18:47 |
|
Peanut3141 posted:I can agree with this. My EVGA FTW blows it out of the water in terms of full load cooler noise. It is a bit loud but I'm glad I don't have coil whine. Every EVGA card I've had has had coil whine - 660 Ti, 760, 970.
|
# ? Jul 26, 2016 18:50 |
|
I dunno I don't really notice it. My case isn't right next to my head. Then again central air is blasting and I couldn't give too much poo poo about a little extra air noise.
|
# ? Jul 26, 2016 18:51 |
|
Admiral Ray posted:Should still be better than that i5-760, though: Comparison between 760, 8350, and 2600k The FX-8350 has a much better single-thread rating in Passmark. My understanding is that games can be relatively low-IPC code with lots of cache misses, pipeline flushes, and branching, which means that long-pipeline CPU designs like the Pentium 4 and AMD Bulldozer aren't able to perform as well as their specs suggest. Benchmarks like Passmark tend to be set up to show off the CPU performing at its best; heavily branching code can bring them to their knees. Another issue confounding things is that with Bulldozer vs Nehalem/Sandy Bridge etc., you've got one that's integer-math heavy and one that's comparatively FPU heavy. CPUs have never been as simple as "X CPU is better than Y CPU"; we've just gotten used to having things be good enough that we don't have to think about these things.
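The branchy-code point above can be seen even in a toy experiment: counting values past a threshold over random vs sorted data exercises the branch predictor very differently. A minimal Python sketch (all data invented; CPython's interpreter overhead blunts the timing effect compared to compiled code, where it is dramatic):

```python
import random
import timeit

data = [random.randrange(256) for _ in range(100_000)]

def count_high(values):
    # On random data this branch goes both ways roughly 50/50, the
    # worst case for a branch predictor; on sorted data it flips
    # exactly once and predicts almost perfectly.
    total = 0
    for v in values:
        if v >= 128:
            total += 1
    return total

sorted_data = sorted(data)  # sort once, outside the timed region
t_random = timeit.timeit(lambda: count_high(data), number=20)
t_sorted = timeit.timeit(lambda: count_high(sorted_data), number=20)
print(f"random: {t_random:.3f}s  sorted: {t_sorted:.3f}s")
```

On a compiled long-pipeline CPU the sorted run is typically several times faster; the counts are identical either way, which is the point: same work, very different branch behavior.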
|
# ? Jul 26, 2016 18:51 |
|
i've never noticed the noise from my G1
|
# ? Jul 26, 2016 18:54 |
|
Phlegmish posted:I'm a bit surprised, looking at PassMark it has a decent score: Yes, that's the shortest summary of AMD CPU problems. A $60 Intel Pentium could outperform it in games, and in fact in a lot of things (though some specific recent games have been bottlenecked by an actual 2-core Intel, but that's just a reference point). CPU synthetic benchmarks are unfortunately highly misleading for average use, especially for games. It's actually kind of difficult to find benchmarks with AMD CPUs anymore, at least comprehensive ones with a focus on the CPU's effect on things. There was a lot of hope early on that DirectX 12 and Vulkan would close the gap between the common 8-core FX and the respective common 4-core non-HT Intel, but as far as I can tell it's not even remotely close. The AMD CPUs benefit, but so do the Intels, and the gap remains. I tried to look this up now out of interest but I actually couldn't find a whole lot, which is kind of surprising. Here's another thing that really shocked me when I first experienced it. If you play games online, especially ones where latency is important or lots of things are happening, the difference between AMD and Intel is amazing. Even locked at exactly the same framerate, the AMD acts like a stuttering mess compared to an Intel. I went from an AMD Phenom II 965 BE with a maxed-out OC, a chip I highly regarded for a very long time, to a bone-stock 4670K and was absolutely blown away at the difference. In single player the fps boost was minor at the time, but once I went online it was clear as day. These were games like BF3 and the like. I thought the Intel hype was half bullshit too; I was having some pretty strong buyer's remorse before I booted it up. FX CPUs actually got worse, at least the ones right after the Phenoms. Good luck finding benchmarks about the difference between CPUs in single player vs multiplayer; they just didn't happen. Zen CPUs might fix the huge gap, but I'm rambling too much about CPUs.
Just want anybody here to know: if you're riding on an old FX CPU and wondering why your fps doesn't exactly match what your new GPU says it should do, that's almost certainly why.
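One reason the stutter described above never shows up in review charts: average FPS hides frame-time spikes. A quick sketch with two invented frame-time traces shows how a 99th-percentile frame time separates cases that average FPS calls identical:

```python
# Two made-up frame-time traces (milliseconds) with the SAME average
# but very different consistency -- the second is the "stuttering
# mess" case even though an average-FPS bar chart calls them equal.
smooth = [16.7] * 100
stuttery = [10.0] * 90 + [77.0] * 10  # also averages 16.7 ms

def avg_fps(frame_times_ms):
    return 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))

def percentile(frame_times_ms, p):
    # Simple nearest-rank percentile, good enough for illustration.
    ordered = sorted(frame_times_ms)
    idx = min(len(ordered) - 1, int(p / 100 * len(ordered)))
    return ordered[idx]

print(avg_fps(smooth), avg_fps(stuttery))                # both ~59.9 fps
print(percentile(smooth, 99), percentile(stuttery, 99))  # 16.7 vs 77.0 ms
```

Frame-time percentile reviews did eventually become common, but in 2016 most outlets still reported averages only, which is why the online-play gap was so hard to find in benchmarks.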
|
# ? Jul 26, 2016 19:00 |
|
PerrineClostermann posted:...looks like I'm crashing at more than +80 in Afterburner... Did you increase the voltage? I didn't get much headroom on the core with my EVGA SC 1070, but I did add a few more clocks on the memory. Overall not a huge change.
|
# ? Jul 26, 2016 19:19 |
|
wicka posted:i've never noticed the noise from my G1 That's great, but it's unfair for potential buyers not to mention that it's one of the objectively loudest AIBs.
|
# ? Jul 26, 2016 19:21 |
|
snuff posted:That's great, but it's unfair for potential buyers not to mention that it's one of the objectively loudest AIBs. Is there a comparison of noise levels somewhere? I have a G1 and haven't heard anything out of it, at least not over any other ambient noise. e: What's the deal with voltage control in MSI Afterburner? It's incredibly non-intuitive.
|
# ? Jul 26, 2016 19:47 |
|
THE DOG HOUSE posted:Here's another thing too that really shocked me when I first experienced it. If you play games online, especially ones where latency is important or lots of things are happening, the difference between AMD and Intel is amazing. Even if you locked it at exactly the same framerate the AMD acts like a stuttering mess compared to an Intel. I went from an AMD Phenom II 965 BE with a maxed-out OC, a chip I highly regarded for a very long time, to a bone-stock 4670K and was blown away at the difference. In single player the fps boost was minor at the time, but once I went online it was clear as day. These were for games like BF3 and the like. I thought the Intel hype was half bullshit too, I was having some pretty strong buyer's remorse before I booted it up. FX CPUs actually got worse, at least the ones right after the Phenoms. Good luck finding benchmarks about the difference between CPUs in single player vs multiplayer, they just didn't happen. The long and short of it is that a 6600K utterly destroys everything in the AMD stable, even on threaded tasks, even before we get into hyperthreading, hexacore, etc.
|
# ? Jul 26, 2016 20:05 |
|
snuff posted:That's great, but it's unfair for potential buyers not to mention that it's one of the objectively loudest AIBs. that's great, where are these objective comparisons?
|
# ? Jul 26, 2016 20:20 |
|
wicka posted:that's great, where are these objective comparisons? https://www.computerbase.de/2016-07/geforce-gtx-1080-partnerkarten-vergleich-test/
|
# ? Jul 26, 2016 20:28 |
This was about the 1070, not the 1080.
|
|
# ? Jul 26, 2016 20:35 |
|
Barry posted:Is there a comparison of noise levels somewhere? http://www.eteknix.com/palit-gamerock-premium-gtx-1070-graphics-card-review/11/ That's just off the top of my head, I have heard it from several sources and owners in this thread.
|
# ? Jul 26, 2016 20:38 |
|
AVeryLargeRadish posted:This was about the 1070, not the 1080. probably going to be very similar results
|
# ? Jul 26, 2016 20:46 |
|
snuff posted:http://www.eteknix.com/palit-gamerock-premium-gtx-1070-graphics-card-review/11/ I think it's obvious that it's noisier under load than the other cards (which may or may not be all that important to you) but I don't know if that's enough to immediately mark it as the worst of the AIB cards, especially at a potential $80 off.
|
# ? Jul 26, 2016 20:49 |
|
RX 480 announced at $200, third party is basically $260-270 and still has extremely high demand :iamafag: The value against a 1060 seems completely eroded unless you truly believe in the brand or DX12/Vulkan.
|
# ? Jul 26, 2016 20:50 |
|
I believe in FreeSync. It also helps that I am not in a hurry to upgrade RIGHT NOW.
|
# ? Jul 26, 2016 21:09 |
|
wolrah posted:I'm just not seeing where the performance advantage is over accessing those same SSDs installed in the host system itself. It's two M.2 slots as currently implemented, so the absolute best case is 4 PCIe lanes per slot worth of bandwidth each. Assuming you're not trying to build a budget compute system on a PCIe-limited platform like LGA115x there's basically no bandwidth advantage. Depending on what motherboard you have and where you put the GPUs and SSDs there may be a latency advantage. It's been confirmed to be nothing more than a PCIe switch on the card, so in theory a pair of SSDs installed on the same PCIe controller as the GPU in a normal PC could be used exactly the same way. Even if there is a latency difference from the onboard SSD versus in the host, the latency difference between RAM and SSD is so large that it seems like the difference between the two SSD configurations would be a fart in the wind by comparison. Accessing an SSD on the system depends on where it's located in the system. No matter what, the call would need to go out over the PCIe bus to the CPU to get in the queue. Then it depends on where/how the drive is hooked up. Best case scenario you'd be talking to a PCIe NVMe drive in another slot with a direct link to the CPU, so you'd just have to go back through the bus to the different slot. Otherwise the signal would then have to be routed to the chipset and then out to either a M.2/SFF-8643 port or (god forbid) a SATA port. With the drive hooked up directly to the video card the call can be routed directly through the on-board switch chip without having to go anywhere else. It certainly doesn't make sense for consumers but could be beneficial for big data clients who have multi-GB data sets.
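Both sides of this exchange can be put into a back-of-the-envelope model. The per-hop latencies below are entirely made up (illustrative orders of magnitude, not measurements); they show why the on-card switch path is shorter, and also why the drive's own access time dwarfs the routing difference either way:

```python
# Hypothetical per-hop latencies in microseconds -- invented,
# order-of-magnitude-ish numbers purely for illustration.
HOP_US = {
    "gpu_to_card_switch": 0.3,   # one hop through the on-card PCIe switch
    "pcie_to_cpu_root": 1.0,     # across the PCIe bus to the CPU root complex
    "cpu_root_to_chipset": 1.0,  # extra hop for a chipset-attached M.2 slot
}
NVME_ACCESS_US = 80.0            # the drive itself dominates either path

def path_latency_us(hops):
    # Total round-trip-ish cost: routing hops plus the drive access.
    return sum(HOP_US[h] for h in hops) + NVME_ACCESS_US

# SSD on the card's own PCIe switch: one local hop.
on_card = path_latency_us(["gpu_to_card_switch"])

# SSD hanging off the chipset: GPU -> CPU root -> chipset -> drive.
via_chipset = path_latency_us(["pcie_to_cpu_root", "cpu_root_to_chipset"])

print(f"on-card: {on_card:.1f} us, via chipset: {via_chipset:.1f} us")
```

With these assumed numbers the chipset detour adds a couple of microseconds against ~80 µs of drive latency, so the shorter route only matters for workloads issuing enormous numbers of small transfers, which fits the "big data clients" caveat.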
|
# ? Jul 26, 2016 21:09 |
|
jm20 posted:RX 480 announced at $200, third party is basically $260-270 and still has extremely high demand :iamafag: The value against a 1060 seems completely eroded unless you truly believe in the brand or DX12/Vulkan. It seems likely that the 470 is going to be the real ~$200 card. Someone posted a link here a few days ago that had it rumored at $180 which would be ok if you didn't give up too much performance coming from the 480.
|
# ? Jul 26, 2016 21:15 |
|
I wonder how the 470 would compare to the 970/980/390, but we will have to wait for benchmarks or word that such a chip is being released before October.
|
# ? Jul 26, 2016 21:16 |
|
THE DOG HOUSE posted:Yes thats the shortest summary of AMD cpu problems. A $60 Intel Pentium could outperform it in games, and in fact a lot of things (however some specific recent games have been bottlenecked an actual 2 core Intel, but just as a reference point). That almost sounds like a good Reviewer Site niche to fill.
|
# ? Jul 26, 2016 21:59 |
|
Just wanted to pitch in my personal experience: the G1 was noticeably loud and I'm very glad I pulled the trigger to return it for a Gaming X. Performance across all AIBs seems to be nearly identical, so noise/temps become pretty important when selecting a board.
|
# ? Jul 26, 2016 22:12 |
|
Ugh, motherfucker; earlier today I saw 1060s in stock on Newegg, but by the time I'd got through the checkout process, it had already gone out of stock.
|
# ? Jul 26, 2016 22:13 |
|
SlayVus posted:So I've always like big cases for their expandability, but with the Vive I want to build something like an itx build with a Titan XP. Just so that I can demo it around and have some thing not as flashy in the living room where I have my PC setup. Posted this on the pc building thread and got no traction there. Anyone have any thoughts on how I can go about doing this?
|
# ? Jul 26, 2016 22:14 |
|
SlayVus posted:Posted this on the pc building thread and got no traction there. Anyone have any thoughts on how I can go about doing this? You could keep your PC in the living room with your Vive and then Steam Link to a lovely PC with your ultrawide. That isn't so great for twitch shooters and the like, but it's fine if you want to play, like, Stardew Valley or something. Or you could keep your PC in the living room and run a 100' HDMI with some keyboard/mouse switching solution. Not cheap either. I think the Titan is better on the Vive than on the monitor, to be honest. I'd rather get a 1070 or 1080 and lose some frames on the monitor and have the Vive rock solid.
|
# ? Jul 26, 2016 22:22 |
|
If you're building it in ITX, maybe lug it upstairs/downstairs when needed? If it's directly upstairs then a wireless keyboard and mouse may work alongside an HDMI cable? Not sure how great wireless HDMI is, but it may be worth looking into. I know my Logitech keyboard works from most rooms in my house, certainly in adjacent rooms.
|
# ? Jul 26, 2016 23:03 |
|
New gaming laptop with a 1070M and leave the desktop where you want it? The 1070M looks to be almost equal to the non-M 1070 thanks to the core-count increase with only a minor clock reduction, but we will have to wait and see what they do when they are finally out to know for sure. Until then, I lug my huge desktop around to demo the Vive because it is just drat worth it. VR must be shared!
|
# ? Jul 26, 2016 23:14 |
|
SlayVus posted:Posted this on the pc building thread and got no traction there. Anyone have any thoughts on how I can go about doing this? Thunderbolt is the best overall solution. You can do everything with a single cable. You can also pull a pair of HDMI/DP cables and a USB 2.0 cable down to your living room. You can probably get about 15 meters before you will need "active extension"/"repeater" cables. This isn't the 100 ft you mentioned, but it might be enough if you pull a run through your walls. You can plug multiple HDMI/DP/USB active extensions together, but each link is basically a separate hub that adds latency and at some point you will have issues with the Vive. It might be good enough for desktop gaming though, since that's not as latency-sensitive. Depending on how far you want to go, this is not particularly cheap either. You can stream your display from your living room up to your computer room over a network (ethernet/powerline/wifi in order of preference) but you will incur a pretty decent latency penalty at the other end, and the streaming solutions are not as good as having a physically connected PC. You can also use HDMI-over-ethernet and USB-over-ethernet adapters. There don't seem to be DP-over-ethernet solutions so you would have to use HDMI. I imagine the Vive wouldn't like the latency and you run into many of the same problems in terms of latency sending up to your gaming PC (in particular you won't get *sync and possibly not even your full refresh rate). However, since it's a physical approach it would probably beat software-based streaming solutions. Paul MaudDib fucked around with this message at 23:48 on Jul 26, 2016 |
# ? Jul 26, 2016 23:44 |
|
Been reading up on the AMD news out of Siggraph. And um. I know we kind of semi-mocked the whole SSD-on-GPU thing, but an 8K timeline realtime scrubbing at 92 FPS? That's.... not nothing. That's actually kind of absurd.
|
# ? Jul 27, 2016 00:06 |
|
EdEddnEddy posted:New gaming laptop with a 1070M and leave the desktop where you want it? Even with sufficient power, there are kinks to work out with VR and notebooks. They tend to get goofed up by Optimus.
|
# ? Jul 27, 2016 00:26 |
|
bull3964 posted:Even with sufficient power, there's kinks to work out with VR and notebooks. They tend to get goofed up by Optimus. Just get a G-Sync laptop, none of them have Optimus
|
# ? Jul 27, 2016 00:35 |
|
AMD didn't make those SSD video cards on a whim. They almost certainly had at least one large scale customer that knew the approach would give dramatic performance benefits. I imagine these probably make those servers full of GPUs easier to manage. The individual GPUs don't have to fight over PCIe bandwidth.
|
# ? Jul 27, 2016 00:44 |
|
Paul MaudDib posted:The real benefit of buying a more expensive version is usually getting better stock clocks, but most Pascal cards are boosting to 2000-2100 MHz on their own anyway because of GPU Boost 3.0 so the differences on paper don't exist in real life. And especially on a $250 card it's really not worth spending a bunch extra anyway, because it's more efficient to use that money to step up to a 1070 that's 40% faster instead of a high-end 1060 that's 5% faster. If you want a reasonable card with an open cooler just look for the basic EVGA ACX 3.0 card (no SC or FTW in the name) and be done with it. I appreciate your helping; forgive my lack of understanding. I am, I suppose, a 'filthy casual' when it comes to performance. It doesn't help that my gaming isn't that intensive. But half of that is words. Currently, the 1060s are showing up as £250-300 (something like this: https://www.amazon.co.uk/Asus-NVIDIA-GeForce-1060-DUAL-GTX1060-6G/dp/B01IUA9Z52/ref=sr_1_5?ie=UTF8&qid=1469577469&sr=8-5&keywords=gtx+1060 or this: https://www.amazon.co.uk/NVIDIA-GeF...ywords=gtx+1060 ). Mostly out of stock, too. The 1070s are £430+ (something like this: https://www.amazon.co.uk/EVGA-Nvidi...ywords=gtx+1070 or this: https://www.amazon.co.uk/Gigabyte-G...ywords=gtx+1070). About the same popularity. The major difference seems to be 'the 1060s have 6 GB of video RAM, the 1070s have 8 GB'. Is that all, and how important is that? It's a big purchase for me, so I'm basically terrified (as I always am). Certainly, I have enough money for either. Still, I would know what my money's getting. Bloodly fucked around with this message at 01:11 on Jul 27, 2016 |
# ? Jul 27, 2016 01:06 |
|
PBCrunch posted:AMD didn't make those SSD video cards on a whim. They almost certainly had at least one large scale customer that knew the approach would give dramatic performance benefits. They probably *did*. Of course, in the time-honored tradition, let's pull an "but can it run Doom" and see what happens when you just copy an entire game directory to the onboard SSD (with the appropriate extensions and APIs to take advantage of it being there, of course)
|
# ? Jul 27, 2016 01:28 |
|
Bloodly posted:Currently, the 1060s are showing up as £250-300(something like this: https://www.amazon.co.uk/Asus-NVIDIA-GeForce-1060-DUAL-GTX1060-6G/dp/B01IUA9Z52/ref=sr_1_5?ie=UTF8&qid=1469577469&sr=8-5&keywords=gtx+1060 Previously, you had to overclock by hand to fully wring all the performance out of the card. Maxwell and Pascal are both really stable and reach consistent clocks, but with Maxwell you had to overclock by hand. The "base/boost" numbers didn't represent the maximum performance you could actually squeeze out of a card, and most of the lower cards would overclock almost as far as the high-end clocks. With Pascal, there's basically an "auto-overclock". The card knows when it can apply more power and stay stable, and it will basically take itself to within about 5% of its max stable overclock all by itself (and there's a diminishing return/rapidly increasing heat past there). So basically, the "base/boost" clock numbers marked on the box are meaningless with Pascal. The card will boost itself past those numbers if the chip can actually stay stable. So the only differentiations on the better cards are better cooling and better power delivery that might increase the stability a bit. This is a super minor difference, the difference between a GBP 250 card and a GBP 350 card might be 5%. On the other hand, the 1070 is a legitimately faster card with a lot more cores on it (about 40% faster). It costs more, but you're actually getting something for your money and at least in the US it's more cost efficient to save your pennies and step up to the 1070 rather than try and chase that last 5% of performance. And at least here, if you chase around good sales you can knock another chunk off the typical retail prices since those are overpriced at the moment relative to demand. Paul MaudDib fucked around with this message at 02:10 on Jul 27, 2016 |
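The step-up argument above can be put in numbers using the post's own rough figures (relative performance and GBP prices are approximations from the discussion, not benchmark results):

```python
# (relative performance, price in GBP) -- rough figures from the
# discussion above, purely illustrative.
cards = {
    "basic GTX 1060":   (1.00, 250),
    "premium GTX 1060": (1.05, 350),  # ~5% faster factory OC
    "GTX 1070":         (1.40, 430),  # ~40% faster, bigger chip
}

for name, (perf, price) in cards.items():
    # Scale by 1000 just to get readable numbers.
    print(f"{name}: {perf / price * 1000:.2f} perf per GBP (x1000)")
```

Under these assumptions the basic 1060 is still the best pure value, but the 1070 delivers more performance per pound than the premium 1060, which is exactly the "don't chase the last 5%" point.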
# ? Jul 27, 2016 01:53 |
|
|
Paul MaudDib posted:Previously, you had to overclock by hand to fully wring all the performance out of the card. Maxwell and Pascal are both really stable and reach consistent clocks, but with Maxwell you had to overclock by hand. The "base/boost" numbers didn't represent the maximum performance you could actually squeeze out of a card, and most of the lower cards would overclock almost as far as the high-end clocks. That explains why my Zotac 1070 AMP was able to clock itself all the way up to 1922MHz in Witcher 3 even though the boost spec is only 1797MHz. Still I wonder who would be *that* dumb to buy something like a $330 Asus Strix 1060 instead of any 1070 when available at ~$400.
|
# ? Jul 27, 2016 03:09 |