Eletriarnation
Apr 6, 2005

People don't appreciate the substance of things...
objects in space.


Oven Wrangler

El Scotch posted:

As I'm home visiting for the holidays I find my old man still has a BFG 9800 GT, a Diamond HD3850 and some old Zotac card sitting in drawers. :iiam:

Hey, it's probably good enough to play MGSV at least. I installed it on my dad's Haswell desktop, running on integrated graphics, to see if it would even start and was shocked to find it actually runs pretty well at 720p/low-med detail. I still upgraded him to a 7750 I found on Newegg for $40, but I came away pretty impressed with how far Intel has come.


Eletriarnation

PirateBob posted:

Cool. I'm pretty sure it is OC'ed by at least 10%. Stock clock is 3.3 GHz unless I remember incorrectly.

Yes, but you can probably go higher. I have a 2500K with a standard CM Hyper 212 at stock voltage, and it has been running stable at 4.4GHz for years.

Eletriarnation

Malloc Voidstar posted:

How in the world did AMD manage to do this

Isn't it more likely to be bad coding on the part of the game designers, or is that :thejoke:?

Eletriarnation
Play your games on a TV and sit across the room from your PC, that's my solution. Can't hear case fans either.

May not work if you ever want to go over 60Hz, though.

Eletriarnation
2002-2004: Geforce 4 MX440 upgrade to a refurb Compaq P2 beige box that I got for $50
2004-2006: Radeon 9600 XT that came with an Alienware P4 desktop
2006-2008: Geforce 6800GS AGP (unlocked to Ultra spec) upgrade for the Alienware, ran like poo poo until I got an Arctic Cooling HSF for it; later sold on SA-Mart
2008-2011: Radeon 4850 with a new i7-920 system
2011-present: Radeon 7850 with a new i5-2500K system

I'm pretty sure I'll move up to Polaris or Pascal when they come out.

Eletriarnation
Yeah, although you haven't said anything specific to lead me in that direction, I'd try a spare PSU next if you can get hold of one. Old PSUs are odd; I've had a unit work fine on a 2008 Core 2 Quad desktop with a graphics card but fail to spin up a mini-ITX Braswell system that should use a quarter of the power. It could be that yours has a marginal voltage somewhere that isn't causing problems under load but that the GPU just won't accept.

Eletriarnation
If it really was 150B transistors including the memory, you can assume that DRAM runs roughly 1:1 bits to transistors, so after the 16GB/128Gb of RAM you'd have 22B left for the processor.

That's still a huge chip, roughly 3x the size of the 22-core Broadwell Xeon, so I'm a bit skeptical. Maybe something on the board other than the core and RAM is being counted too.
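For what it's worth, the arithmetic works out in a couple of lines of Python. The 1:1 bits-to-transistors DRAM figure is just a rule of thumb, and this uses the loose "128Gb = 128 billion bits" reading:

```python
# Napkin math: transistors left for the GPU core if the quoted 150B
# figure includes the DRAM at ~1 transistor per bit.
total_transistors = 150e9
dram_bits = 128e9              # 16GB/128Gb, read loosely as 128 billion bits
core_transistors = total_transistors - dram_bits
print(core_transistors / 1e9)  # -> 22.0 billion left for the processor
```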

Eletriarnation
SRAM is usually 6 transistors per bit. I don't know what kind of cache GPUs typically work with, but going back to the 22-core Broadwell Xeon: it has 55MB of cache, which would be 440Mb and ~2.6B transistors. I'm sure some substantial portion of what's left after you take out the DRAM is cache, but it still has to fit on the same die (unless Nvidia is pulling a Pentium II), and I don't think they have 22B transistors in one die.
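Spelling out that cache estimate (using decimal megabytes, which is how the ~2.6B figure falls out):

```python
# 6 transistors per SRAM bit, applied to the 22-core Broadwell Xeon's
# 55MB of cache (decimal MB, to match the 440Mb figure above).
cache_bits = 55e6 * 8            # -> 440 Mb
cache_transistors = cache_bits * 6
print(cache_transistors / 1e9)   # -> 2.64, i.e. ~2.6B transistors
```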

feedmegin posted:

ROM/flash? Other random support ASICs?

Could be anything I guess, but the ROM is going to be insignificant unless that thing has a huge firmware image. I just downloaded a 980Ti BIOS out of curiosity and it's only 222KB.

Eletriarnation fucked around with this message at 19:45 on Apr 5, 2016

Eletriarnation

MaxxBot posted:

The actual GPU has 15.3 Billion transistors and they even managed to bump up the boost clocks by a considerable amount too. The P100 boost clock is as high as a Maxwell card at maximum overclock.

https://devblogs.nvidia.com/parallelforall/inside-pascal/

Yeah, I just realized that 128Gb isn't 128B bits - it's 2^37 bits, which is ~137B. This plus a 15B transistor core puts us at the nice round 150B transistor number.

Ed: 37th power, not 27th.
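The corrected arithmetic, with the 15.3B GP100 transistor count from the linked Nvidia post:

```python
# Corrected reading: 128Gb of memory is 128 * 2**30 bits (2**37),
# not 128 billion bits.
mem_bits = 128 * 2**30           # 137,438,953,472 bits, ~137.4B
core_transistors = 15.3e9        # GP100, per the Nvidia devblog
total = mem_bits + core_transistors
print(round(total / 1e9, 1))     # -> 152.7, roughly the quoted 150B
```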

Eletriarnation fucked around with this message at 20:08 on Apr 5, 2016

Eletriarnation

SlayVus posted:

Well, consumer Pascal probably won't be announced until June by the looks of things. Even then, we won't be getting big Pascal. I personally wanted a big Pascal right off the bat for my system, but oh well.

Isn't it possible that even if fully-functional big Pascal chips are all being reserved for Teslas, there will be enough flawed chips that need to be cut down that the hypothetical 1080 Ti will be released in volume before the quoted Q1 2017 date? The 980 Ti is just a cut-down Titan X so we're expecting a similar pattern here, right?

Eletriarnation

EdEddnEddy posted:

What has pissed me off is the efficiency has gone up sure, but even with OC'ing, the sheer performance of a single core hasn't really moved much more than 10% per generation which is pretty sad even coming from the jump we got from Core 2 to the Core i Series.

I know new architectures take a lot of time and engineering to create (look at AMD I guess), but dammit if Intel isn't due for a new chip that isn't just an Atom. Nothing announced since Sandy Bridge-E has really gotten me excited for a CPU, outside of some of the tech that the chipsets going with those CPUs have to offer. (Sandy Bridge-E brought back what Sandy Bridge's chips seemed to have lost from the X48/X58 platform: >16/20 PCI-E lanes and quad-channel RAM rather than triple/dual channel.)

Now the new Z170 series has some neat bells and whistles that even X99 doesn't have natively, but nothing is quite pushing me to need to upgrade my aging X79 as outside of the more cores that even Haswell - E brought, my 4.6Ghz 6 core can still hit in the same ballpark as a mildly overclocked 8 core, which is great for the old tech, but sad considering how old it is in comparison now.

I feel like the reason Intel hasn't improved this isn't for lack of trying; it's that after 35 years of scaling up, die-shrinking, and optimizing their processors, they're finding it really hard to make things much better than they already are. Expecting big gains like clockwork assumes that a design capable of those gains exists, and we can't really assume that except by projecting past performance into the future.

A 22-core Broadwell that runs at 150W is really cool, but it's not that useful for games and it's insanely expensive, so it doesn't have the tangible consumer benefit that a 5GHz Kaby Lake or whatever would.

Eletriarnation fucked around with this message at 17:40 on May 20, 2016

Eletriarnation

Animal posted:

drat, I didn't realize it had that turd. I hate that snake oil company. But the model with an Intel chip is not as good at other things. It's so hard to find a good mITX board.

This has come up at least twice now in the parts-picking thread. Bigfoot Networks, which makes the Killer NICs, was bought by Qualcomm, and the Killer NICs are all standard Qualcomm Atheros GigE chips now. They work with a normal driver, and all the Killer-specific stuff is optional utilities that you don't have to install.

Eletriarnation

Animal posted:

I hate to derail the GPU thread, but how's the Atheros comparing to Intel's NIC?

Either way the ASRock board seems to have more solid reviews than the Gigabyte.

I don't know, my last two boards have used cheap Realtek chips. My source is this article: http://techreport.com/review/29144/revisiting-the-killer-nic-eight-years-on

Eletriarnation

repiv posted:

The napkin math doesn't look good

GTX 1070 (150W) ≈ Titan X (250W)
-> Pascal ~40% more efficient than Maxwell

RX 480 (150W) ≈ GTX 980 (185W actual)
-> Polaris ~19% more efficient than Maxwell

e: whoops

Just a few posts back someone said that AMD and Nvidia might not be using the same definition of TDP so the assumption that it's this simple is questionable to me.

Moreover, isn't it possible that the 1070 and the 480 have different amounts of OC headroom and can't be taken as comparably representative of their generations? If the 480 is a smaller chip running close to its limits while the 1070 is larger but more reined in, the former might draw a disproportionate amount of power at stock settings yet still be capable of similarly efficient operation at lower settings: lower performance, but much lower power draw than it currently has, producing a higher perf/watt.

Basically, we don't really have the kind of data needed to make these conclusions yet and I don't know how we could until both cards are released or at least have some really comprehensive reviews published.
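For reference, the napkin math in dispute appears to be computing the fractional power savings at (assumed) equal performance. The TDP equivalences are the quoted claims, not measurements:

```python
# The disputed napkin math, spelled out: fraction less power drawn
# for assumed-equal performance between each pair of cards.
def power_savings(watts_new, watts_old):
    return 1 - watts_new / watts_old

print(power_savings(150, 250))            # 1070 vs Titan X -> ~0.40
print(round(power_savings(150, 185), 2))  # RX 480 vs GTX 980 -> 0.19
```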

Eletriarnation fucked around with this message at 22:37 on Jun 1, 2016

Eletriarnation

repiv posted:

The Titan X and 1080 have been measured and shown to respect their TDPs, so I'm assuming the 1070 does as well. I'm using the 980's actual power draw rather than its :airquote: TDP :airquote:

And is the scenario you describe any better for AMD? If they're having to redline their chips to get competitive price:performance then Nvidia wins the overclocker market by default (again).

I'm well aware this is :speculate: but what else is there to do until the 29th? :v:

Having a big chip with a bad performance/watt curve does seem worse on some level than having a smaller one being pushed hard into an inefficient part of its curve, because the second scenario implies you could make a bigger chip on the same architecture that would be more efficient at a higher performance level. I agree it doesn't look great for AMD at the high end right now, since their top card sits a substantially lower tier than Nvidia's second-fastest and they basically don't have a high end as a result, but the conclusions we draw from Polaris will help us make guesses about Vega, which is the real future of AMD in that segment.

I'm also wondering just how much the high end matters outside of marketing and perception, because I would be willing to believe a lot of people never consider cards over a certain price point and delivering good performance at something like $200 might bring a lot more volume. Granted, the higher end cards almost certainly have better margins to make up for that.

It's fine to speculate, I just thought that extrapolating two specific models' power consumption + pre-release marketing performance numbers into an architectural efficiency comparison was a bit of a leap.

Eletriarnation fucked around with this message at 00:14 on Jun 2, 2016

Eletriarnation

Im_Special posted:

I'm not really sure what this is, a quick google lookup, make it look to me like it's just a versatile display converter for hdmi/dvi/vga? Is there any advantages over the hdmi to dvi cable I linked to above? I assume I'll still be limited to DVI standards or whatever like no audio only digital, etc.

Most DisplayPort outputs can also send a DVI/HDMI-compatible signal that a passive adapter can break out into a DVI connection, so the DisplayPort adapter should work just fine. HDMI and DVI use identical signaling, so the HDMI-to-DVI cable definitely will.

Eletriarnation

Paul MaudDib posted:

It's weird, but baking it sometimes helps too; it reflows the solder. If you're totally out of warranty and underclocking doesn't help, that can be your plan C.

Yeah, I have revived a PS3 mainboard and a couple of old GPUs this way. I'd check around to confirm, but from what I recall it was 275°F for 15 minutes, propping the card up off the baking sheet with balled-up bits of aluminum foil on clear spots of PCB so it's not directly touching. You definitely want to remove the HSF and any other removable bits too, so nothing plastic gets warped or melted. The PS3 died again after one last boot and one of the old GPUs lasted just a few months, but the other lasted until it was so old and useless I recycled it. If you're about to recycle your card anyway, you've got nothing to lose.

Eletriarnation

wicka posted:

maybe use a toaster oven? i assume a video card would fit

Maybe I'm just used to lovely cheap toaster ovens, but I wouldn't trust one to hold an even enough temperature throughout to reflow the solder without damaging anything.

Eletriarnation

Phlegmish posted:

I don't understand how baking your graphics card helps, I thought overheating your hardware was bad.

Overheating a chip or circuit while current is running through it makes electricity go places it's not supposed to, or in amounts it's not supposed to, and that destroys things. However, even normal temperature cycles can, over time, weaken and eventually break solder bonds through repeated expansion and contraction.

If you take the card out and bake it on a sheet, unpowered, you can get it up to a temperature that softens the solder without being hot enough to cook the other components on the board outright. Once it has cooled back down, hopefully the reflowed solder joints will be functional enough to give you a few more months of life.

Eletriarnation

dissss posted:

At work our supplier can only do one sort of cheap DisplayPort cable which has connectors that fall apart completely after 12-18 months of use so we've been buying murderously expensive HP branded passive DisplayPort to DVI converters and connecting up that way. Unfortunately the newer HP displays have completely dropped DVI so that won't work for much longer.

The latching mechanism on DisplayPort really seems to cause more harm than good - they should have just gone with an HDMI style push in.

I don't get why DisplayPort is used at all anymore now that Mini DisplayPort exists. Does it have any advantages other than the secure latching if you want that for some reason?

Eletriarnation

Gonkish posted:

I'm waiting on the AIB non-reference 480s. I think that means that I have iron discipline compared to the rest of this thread.

I'm using a non-reference 7850 that I paid $220 for in 2012. I have been waiting to be able to get a drop-in replacement that's solidly at least twice as good for around the same price, and I think it may be here with the 480 but am waiting to see what the reviews say.

Eletriarnation
Reference 480 isn't looking like a great proposition compared to stepping up to the 1070 or waiting for the 1060, but I didn't expect it to. I'll see what the aftermarket cards look like, by which time there will be more stock of the 1070 and more information about the 1060. The last two somewhat intensive games I got were Witcher 3 and Overwatch and both of those work fine on my 7850 at 1080p medium+ detail, but I am thinking an upgrade to a 4K TV is in the near future so that might push me towards the 1070. I don't think I'll feel compelled to run all my games at 4K but the potential would be nice.

Eletriarnation
Beyond the inferiority of the reference blower, from what I read on Tom's Hardware's RX480 power analysis the card is actually set up so that it has more power draw on the slot than on the supplementary connector when the phases are evenly balanced. This suggests that running the slot under 75W with a reference design requires either unbalanced phases or running the entire card well under 150W.

For that reason,

Gonkish posted:

I think pretty much all of us are waiting on the non-reference boards, and considering that they're taking their sweet time with that...

This, because I'm hoping to see a reasonably priced AIB model with an 8-pin and an open cooler. I'm now also considering a 1060, since the 6GB @ $250 price point is appealing and maybe we won't have to wait as long for non-blower coolers. I could stretch for the 1070, but I don't like how the MSRP is totally fictional, so I'm hoping to see more good ~$400 AIB open-cooler models there.

Eletriarnation
Yeah, 980 would be about where the RX480/1060 is.

Eletriarnation

KillHour posted:

I had a 120hz monitor as my last one, and I had a hard time telling the difference between 60 and 120 in games anyways. Maybe I'm just broken. So 1080 is the way to go, even though I'll miss out on freesync?

The 1080 is far faster than anything AMD is offering or will offer in the near future, probably about twice as fast as the RX480 in typical cases. If you want maximum speed for 4K right now then you need to go Nvidia. It also depends somewhat on what games you plan on playing, since a 480 might well run older ones at 4K and decent settings, but I don't think it can get anywhere near 60fps in new, intensive games at 4K/high.

Eletriarnation fucked around with this message at 17:28 on Jul 16, 2016

Eletriarnation

spasticColon posted:

Edit: Honestly I just don't understand the 4K gaming circlejerk. Movies and TV do look better on 4K sets but for gaming I'm satisfied with 1080p high/ultra settings at 60fps.

I'm struggling to understand why this would be the case. Are you saying that there's a quality improvement from 4K video that you don't see in gaming, or that you care about the improvement for video but don't for games for some reason? Have you seen games running at 4K at the same settings otherwise that you run, or are you just saying that you can't miss what you've never experienced?

MaxxBot posted:

I would argue the opposite, the difference between 1080p and 4k is much more visible at typical monitor viewing distances than it is at typical TV viewing distances.

Past the clear tautology of "you can always see more detail on an object if you're closer to it", you kind of have to define typical distance and your monitor/TV sizes to make a statement like this.

Eletriarnation fucked around with this message at 21:17 on Jul 19, 2016

Eletriarnation
I wish I had known that the small Zotac cards would pop up as in stock during the afternoon. I got up 2 hours after launch this morning and saw that everything on Newegg/Amazon was already sold out. I figured that was the end for a week or so, so I said gently caress it and ordered an FE despite it being almost too long to fit in my case and not really wanting a blower cooler. Now I see that I could have gotten what I wanted for $20 less, but it's too late unless I want to just order both and try to flip the FE.

Eletriarnation

Yeah, I thought about it but by the time I revisited the page they were out of stock. It's not a big deal, I was mostly just irritated at how everything else sold out in an hour. I haven't seen any reviews complaining about the noise of the FE and at least one directly said it was quiet, so if the worst that happens is paying an extra 20 bucks and having to move a hard drive I'll be OK.

Eletriarnation
If anyone's wavering about the FE because they're not sure when Nvidia will ship it, I ordered one late yesterday morning and just got a tracking number now with an estimated delivery of Saturday. I'm not really thrilled about the price but I'm sure it will be enough of an improvement over my 7850 that it won't bother me in the end.

Eletriarnation

.jpg posted:

Yeah I was looking at that one, though they recommend 300W+. I have a feeling what I'm looking for might not exist and I'll have to try and get budget for replacing all the PCs...

The 710 only uses 19W by itself if you check the spec sheet. The 300W PSU recommendation is so people don't push their lovely redlined OEM PSUs over the edge, but if you do the math and determine that you have that much headroom plus a little extra to account for component age then you should be fine.

e:f,b
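The headroom math above looks something like this. Only the GT 710's 19W is from a spec sheet; every other wattage here is a made-up example:

```python
# Illustrative PSU headroom check. All figures except the GT 710's
# 19W spec-sheet number are assumed example values.
psu_rating = 300      # W, label on the hypothetical OEM supply
rest_of_system = 180  # W, assumed worst-case draw of everything else
gpu = 19              # W, GT 710 per its spec sheet
aging_margin = 0.8    # treat an old PSU as good for ~80% of its label
fits = rest_of_system + gpu <= psu_rating * aging_margin
print(fits)           # -> True for these example numbers (199W vs 240W)
```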

Eletriarnation

Naffer posted:

Any adapter should be fine. The 5770 doesn't have an HDMI port already?

I think he means that it has DVI, DP and HDMI and he plans to use the DVI and HDMI for 2 HDMI monitors already, so he's wondering if he can turn the DP into a third HDMI for the TV.

Eletriarnation

mango sentinel posted:

I'm considering going two hundred dollars over budget just because I can actually walk into a store and buy a 1070.

Someone please tell me this is stupid.

Doesn't sound stupid to me, but maybe that's because I started with an RX480 budget and ended up strongly considering 1070 myself when I saw how bad availability is for the 480 and the 1060.

Ended up getting an FE 1060 since I figured I can flip it without too much loss if I dislike it, but I'm still not certain I made the best decision.

Eletriarnation

Sidesaddle Cavalry posted:

If it's a cheapo PSU that came with a pre-built machine that's several years old already and you're upgrading it, it's never good enough and will blow up your computer, guaranteed. But yes 450W is fine.

Yeah, if it's "a new Seasonic 450W SFX supply" then you're good. If it's "a 10 year old Antec that came with my Sonata II" or "it's included in the case, they don't say what kind" then not so much.

e: Yeah, that sounds like it should be fine.

Eletriarnation

KingEup posted:

I honestly think it's crazy to buy a 1060 for marginally better frame rates and forgo Freesync.

If your main screen is a TV and has no hope of supporting any kind of *sync then you might as well go for whatever performs and is available. I wanted an RX480 but you can't find them anywhere, and after I bought a 4K TV I felt like I was wasting potential every day that passed without my PC having an HDMI 2.0 card.

Eletriarnation

EoRaptor posted:

Nah, UHD video will probably never make it to mainstream use, but the pixel counts for VR video will be similar (or higher) so a solution to the editing problem does need to be found.

I feel like you could have said this about full HD 15 years ago. You can buy a 4K TV for well under $1000 and a Netflix subscription allows you to stream 4K to it for $12/month, so while adoption isn't widespread yet I'm curious - why do you think it's likely that it never will be?

Eletriarnation

ghetto wormhole posted:

Would getting a real cheap second GTX 970 for SLI be worth it if I'm only running 2560x1080?

The standard recommendation for when you're considering SLI/Crossfire is almost always to sell your card and buy a bigger one instead. The exceptions are when you're already at the top card or when you only care about performance in a particular game that is known to scale very well with multi-GPU. This may change in the future if engines based on async compute or other more flexible multi-GPU methods catch on, but for now I think it still holds.

If you know your PSU can handle it and you know that games you care about play well with SLI and you aren't able to run them at max settings now, then it might be worth considering but I would probably look at 1070s or used 980Tis instead.

Eletriarnation

Gonkish posted:

Yeah, poor phrasing on my part. No games, only desktop stuff.

I think the answer is the same, because you need HDMI 2.0 for a 4K TV and that's only available on AMD's 400 series and GeForce 950 or newer. You could try one of those DP-to-HDMI 2.0 active adapters that cost around $30 and get a cheaper DP card, but I'm not sure how well those work, and with the cost of the adapter you wouldn't save much.

Eletriarnation

Klyith posted:

their price for a generic 1060 was $299... a fictional card with the stock performance but OC price.

You mean this thing?:

Eletriarnation
Reviews that I've seen indicate that the 3GB 1060 is only around 5% worse than the 6GB model for most things (the whole core isn't cut down 10%, just a subset of the functional units), and the RAM difference only really matters for current content if you're trying to run at settings the fully-enabled model can't handle either.

I would definitely buy it over an RX 470 if you don't care about FreeSync.


Eletriarnation
I feel like it's the same problem SLI always has, developers have to do significant extra work that will only ever benefit a tiny share of the userbase. VR might offer some low-hanging fruit when it comes to getting substantial performance gains out of it, but it doesn't change those fundamental facts.

Not to mention, Pascal has some substantial tricks for rendering the same scene from multiple eye-perspectives so I'm not sure how much of that low-hanging fruit has already been picked.
