xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

EoRaptor posted:

The target market includes UltraHD and up video editing, something where nvidia doesn't have a specific product, so if they can make a go of it, it could be good money.

Yeah, any market they can win by default would be a pretty nice prize.

DonkeyHotay
Jun 6, 2005

Truga posted:

Not while a 3.5gb card is the most popular card ever ever.


Do you have your monitor plugged into DVI and overclocked? If yes, this is a known issue and I think fixed in the latest driver.

Nope, no OC and using DisplayPort

Hiowf
Jun 28, 2013

We don't do .DOC in my cave.

wargames posted:

But are they ultrawide 1440?

No, he was asking about 2560x.

As for affordable, they're a bit more than a GTX 1070 here, but monitors do tend to last longer than GPUs.

Gray Matter
Apr 20, 2009

There's something inside your head..

xthetenth posted:

If they have that 2.5x range, they do it, if they don't they don't.

So are you saying that on this monitor that lists "60Hz" as its refresh rate instead of a range, I won't see any benefit from FreeSync?

Peanut3141
Oct 30, 2009

Bioshuffle posted:

It seems like NewEgg has most of the 1070 graphics cards in stock. If I'm looking to spend under 500, which is my best option currently? I am eyeing the http://www.newegg.com/Product/Product.aspx?Item=N82E16814127947&cm_re=1070-_-14-127-947-_-Product as it has silent mode, and that sounds appealing to me.

I'd highly recommend the EVGA FTW. I got one to replace my defective G1 Gaming 1070. It's so much quieter that I'm actually grateful the G1 had problems and I lived without for a week and a half while waiting for the RMA.

It even turns the fans off while under load at times, like AVLR mentioned. Though when being taxed with Geralt's hair for hours on end, the fans hang around an inaudible 500-800 RPM.

TPU has its SC little brother as the quietest 1070 they've tested. https://www.techpowerup.com/reviews/EVGA/GTX_1070_SC/29.html

GRINDCORE MEGGIDO
Feb 28, 1985


AVeryLargeRadish posted:

All of them turn their fans off when not needed and all of the ones I have seen reviewed so far are pretty quiet. Also all of them will clock to a max of 2-2.1GHz so there are very little difference in performance between them, your best bet is probably going with one made by a company that has good support and RMA service, EVGA is tops there but MSI is good too. The only card I have seen that might be significantly quieter than the others is the Zotac AMP! Extreme because it has such a massive cooler that even under full load the fans will switch on and off because they are sometimes not needed. The other type of really quiet card are the liquid cooled ones like the MSI Seahawk, those don't have a silent mode but they are whisper quiet at both idle and under load, however they are pretty expensive.

The Palit game rock has a huge, quiet cooler as well.

snuff
Jul 16, 2003

wipeout posted:

The Palit game rock has a huge, quiet cooler as well.

It's the 1070 to get, cheap and silent.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

xthetenth posted:

Yeah, any market they can win by default would be a pretty nice prize.

This also factors into AMD's quest to be a one-stop shop, as they sell licensed SSDs as well. So if they can get this rolling, AMD would be selling someone an AM4 MB, Zen CPU, AMD-brand DDR4, AMD-brand SSD(s), and a Radeon card. All AMD would need at this point is to absorb someone like Raidmax for branding purposes to get cases and power supplies.

GRINDCORE MEGGIDO
Feb 28, 1985


snuff posted:

It's the 1070 to get, cheap and silent.

My gamerock 1080 arrives this Saturday. Upgrading from a 7850. Stoked!

japtor
Oct 28, 2005
Radeon Pro WX announced:
http://www.tomshardware.com/news/amd-radeon-pro-wx-series,32324.html

...I guess Fire Pro versions of 480/470/460 or do the specs not match up (like 8GB on the middle one)?

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

japtor posted:

Radeon Pro WX announced:
http://www.tomshardware.com/news/amd-radeon-pro-wx-series,32324.html

...I guess Fire Pro versions of 480/470/460 or do the specs not match up (like 8GB on the middle one)?

It matches but it's an RX470 for the first (2048SPs), no known counterpart for the second (possibly mobile?) at 1792SPs, and the standard full Polaris 11 (RX460) for the third.

Also

quote:

Koduri said that he is very passionate about the color blue (even going so far as to name his daughter "blue" in Sanskrit). After taking control of the Radeon Technologies Group last year, he set on a mission to change the color of the Radeon Pro. The company researched different color pigments and decided on Yinmn Blue for its special properties. Yinmn blue is the only non-toxic blue pigment known to man. The color does not fade, it is highly reflective, and it's highly energy efficient. Koduri said the entire Radeon Pro WX series will be painted in this color.

:siren::byodood::siren:Intel buyout of RTG confirmed.:siren::byodood::siren:

On a more serious note, do you think team red could do with a change of colors?

Bloodly
Nov 3, 2008

Not as strong as you'd expect.
I am confused.

On a recommendation to upgrade from an Nvidia GTX 660, I'm vaguely looking around for 1060s. There's a bunch, with multiple fans (some with one, some with two, some with three), buzzwords, and of course multiple prices. I can't tell better from worse. Can you help me understand or give any advice?

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

Bloodly posted:

I am confused.

On a recommendation to upgrade from an Nvidia GTX 660, I'm vaguely looking around for 1060s. There's a bunch, with multiple fans (some with one, some with two, some with three), buzzwords, and of course multiple prices. I can't tell better from worse. Can you help me understand or give any advice?

That confusion is intentional BTW, but try to buy from MSI, EVGA or Zotac - these guys have the best warranty terms, customer service and good overall PCB and cooler designs. For the 1060, there doesn't seem to be a point to get really expensive versions, like all Pascal cards they seem to have similar performance and maximum overclocks, so get something fairly cheap from the aforementioned AIBs. Make sure it fits your case and case style as well, open air coolers are best for cases with good airflow, while blowers are best for low air flow cases or water setups due to the reference PCB.

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️

FaustianQ posted:

It matches but it's an RX470 for the first (2048SPs), no known counterpart for the second (possibly mobile?) at 1792SPs, and the standard full Polaris 11 (RX460) for the third.

Also


:siren::byodood::siren:Intel buyout of RTG confirmed.:siren::byodood::siren:

On a more serious note, do you think team red could do with a change of colors?

If I were in RTG's shoes I would also very much rather get bought out by Intel than suffer under consistently incompetent and parasitic management and marketing.

Imagine the powerhouse AMD/NV would be now if both had agreed to merge with JHH at the helm a decade ago.

PBCrunch
Jun 17, 2002

Lawrence Phillips Always #1 to Me
Nvidia has made some missteps when it comes to non-GPU products. Their motherboard chipsets were pretty much total garbage, and their "Denver" ARM SoCs have been pretty terrible as well.

It's hard to imagine any company having success while having to deal with GlobalFoundries.

JHH might not have been able to fix all of AMD's problems.

wicka
Jun 28, 2007


Palladium posted:

If I'm in RTG shoes I would also very much get bought out by Intel then to suffer under consistently incompetent and parasitic management and marketing.

Imagine the powerhouse AMD/NV now would be if both agreed to merge with JHH at the helm a decade ago.

there's not much good marketing can do to overcome horrendously bad reference cards

penus penus penus
Nov 9, 2014

by piss__donald

Truga posted:

There's a 4gb sapphire nitro+ floating for preorders out there, I thought that was pretty drat good for the price:
https://www.overclockers.co.uk/sapphire-radeon-rx-480-nitro-4096mb-gddr5-pci-express-graphics-card-gx-37d-sp.html

It's not like these cards have much to profit from going more than 4gb right?

I personally do not believe this gpu requires 8gb and by the time even 5 gb is the norm (for 1080p) the fps will just be too poor to care. This has generally been the case for every card I can remember that had two different memory specs since 2gb was standard. Before that vram was a much bigger issue iirc.

Things can change though. But I've heard that for years too...

The worst case scenario in this timeframe I can think of was the 2gb 770, but that was still more on paper than real life problems.

But in any case the 4gb price is the price it needs to be

dad on the rag
Apr 25, 2010
Asus GTX 1070 Turbo for $409 - $25 PayPal code (PP2016BTS) and in stock at Newegg. So tempting if I didn't have to pay tax.

incorporeal
May 6, 2006

So I'm thinking of getting a 1080 for my next build. Whenever they come back in stock, which ones should I be keeping an eye out for and which should I avoid?

Stanley Pain
Jun 16, 2001

by Fluffdaddy

PBCrunch posted:

Nvidia has made some missteps when it comes to non-GPU products. Their motherboard chipsets were pretty much total garbage, and their "Denver" ARM SoCs have been pretty terrible as well.

It's hard to imagine any company having success while having to deal with GlobalFoundries.

JHH might not have been able to fix all of AMD's problems.


What? Nforce and Nforce2 were solid chipsets :colbert:

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Gray Matter posted:

So are you saying that on this monitor that lists "60hz" as its refresh rate instead of a range, I won't see any benefit from FreeSync?

You'll still see some benefit when you stay inside the sync range, but when you drop below the minimum refresh rate you'll either see some tearing or some judder.

With LFC, the drivers would send each frame twice to keep it inside the sync range, but you need the maximum sync refresh rate to be at least 2.5x the minimum for LFC to be enabled.

https://www.amd.com/Documents/freesync-lfc.pdf
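The rule described above can be sketched in a few lines. This is a rough illustration of the whitepaper's logic, not AMD's actual driver code; the function names are made up:

```python
def lfc_supported(min_hz: float, max_hz: float, ratio: float = 2.5) -> bool:
    """LFC needs the panel's max refresh to be at least ~2.5x its minimum."""
    return max_hz >= ratio * min_hz

def effective_refresh(fps: float, min_hz: float, max_hz: float) -> float:
    """If the frame rate falls below the sync floor and LFC is available,
    the driver repeats each frame enough times to stay inside the range."""
    if fps >= min_hz:
        return min(fps, max_hz)
    if lfc_supported(min_hz, max_hz):
        multiplier = 2
        while fps * multiplier < min_hz:
            multiplier += 1
        return fps * multiplier
    return fps  # below range with no LFC: tearing or judder

print(lfc_supported(30, 144))            # a wide 30-144 Hz range qualifies
print(lfc_supported(48, 60))             # a narrow "48-60 Hz" range does not
print(effective_refresh(25, 48, 144))    # 25 fps gets frame-doubled to 50
```

A fixed "60Hz" spec with no advertised range can't satisfy the 2.5x condition, which is why LFC never kicks in on those panels.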

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Bloodly posted:

I am confused.

On a recommendation to upgrade from an Nvidia GTX 660, I'm vaguely looking around for 1060s. There's a bunch, with multiple fans (some with one, some with two, some with three), buzzwords, and of course multiple prices. I can't tell better from worse. Can you help me understand or give any advice?

The real benefit of buying a more expensive version is usually getting better stock clocks, but most Pascal cards are boosting to 2000-2100 MHz on their own anyway because of GPU Boost 3.0 so the differences on paper don't exist in real life. And especially on a $250 card it's really not worth spending a bunch extra anyway, because it's more efficient to use that money to step up to a 1070 that's 40% faster instead of a high-end 1060 that's 5% faster. If you want a reasonable card with an open cooler just look for the basic EVGA ACX 3.0 card (no SC or FTW in the name) and be done with it.

e: On that note, how in the world are AIB partners not pissed as hell about GPU Boost 3.0? Like, Maxwell would reliably OC to similar clocks, but at least you had to take an hour and figure out your peak overclock, so the increased clocks still meant something to users too scared/lazy to do that. Now the various models are literally indistinguishable because GPU Boost automatically squeezes nearly everything out of the card and the only real differentiation is a high-end aftermarket PCB and the cooler.

Paul MaudDib fucked around with this message at 15:35 on Jul 26, 2016

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

Stanley Pain posted:

What? Nforce and Nforce2 were solid chipsets :colbert:

Nforce 2 was epic. It could take any audio source and do on-the-fly Dolby Digital encoding.

EdEddnEddy
Apr 5, 2012



FaustianQ posted:

This also factors into AMDs quest to be a one stop shop, as they sell licensed SSDs as well. So if they can get this rolling, AMD would be selling someone an AM4 MB, Zen CPU, AMD brand DDR4, AMD brand SSD(s), and a Radeon card. All AMD would need at this point is to absorb someone like Raidmax for branding purposes to get cases and power supplies.

Watch as AMD tries to sort of become Canadian Apple. :v:

Overall, that GPU/SSD combo could be enticing for the consumer market if it ends up having enough bandwidth for the GPU plus 2 M.2 SSDs without the chipset bottleneck. If they get mounted somewhere the GPU can cool them as well, that could help with those early M.2s that overheat on motherboards. Also good for those that may be size constrained: instead of having an ITX board lose space trying to fit 1 or 2 M.2s onto it, just throw them on your GPU in the only x16 slot it has. It could be a new niche no one expected, but now could want...

Also, the 780i and 790i I played with back in the 8800GTX days were pretty bomb; the only problem was how hot the drat chipsets got when you started overclocking with full RAM+SLI+quad-core setups. Made my X48 look like it was running super chill.

Hubis
May 18, 2003

Boy, I wish we had one of those doomsday machines...

EdEddnEddy posted:

Watch as AMD tries to sort of become Canadian Apple. :v:

Overall that GPU/SSD combo though could be enticing for the consumer market if it ends up having enough bandwidth for whatever GPU + 2 M.2 SSD's without the chipset bottleneck. If they get mounted somewhere the GPU can cool them as well, that could help with those early M.2's that overheat on MB's. Also good for those that may be size constrained and instead of having a ITX board loose space trying to fit 1 or 2 M.2's onto it, just throw it on your GPU in the only X16 slot it has. It could be a new niche no one expected, but now could want...

Also the 780i and 790i I played with back in the 8800GTX days was pretty bomb, the only problem is how hot the drat chipsets got when you started overclocking with full RAM+SLI+Quad Core setups. Made my X48 look like it was running super chill.

I wonder what the MTBF on that flash memory will be like.

Enigma
Jun 10, 2003
Raetus Deus Est.

Paul MaudDib posted:

The real benefit of buying a more expensive version is usually getting better stock clocks, but most Pascal cards are boosting to 2000-2100 MHz on their own anyway because of GPU Boost 3.0 so the differences on paper don't exist in real life. And especially on a $250 card it's really not worth spending a bunch extra anyway, because it's more efficient to use that money to step up to a 1070 that's 40% faster instead of a high-end 1060 that's 5% faster. If you want a reasonable card with an open cooler just look for the basic EVGA ACX 3.0 card (no SC or FTW in the name) and be done with it.

e: On that note, how in the world are AIB partners not pissed as hell about GPU Boost 3.0? Like, Maxwell would reliably OC to similar clocks, but at least you had to take an hour and figure out your peak overclock, so the increased clocks still meant something to users too scared/lazy to do that. Now the various models are literally indistinguishable because GPU Boost automatically squeezes nearly everything out of the card and the only real differentiation is a high-end aftermarket PCB and the cooler.

Can someone link me to a straightforward, up-to-date guide to overclocking? I picked up the Zotac GTX 1060, but the base clock is ~1550, so it sounds like there's quite a bit of room for improvement. As I understand it, the steps are to run GPU Boost 3.0 and a benchmark tool like Heaven, then incrementally up the clock (waiting between increases) until the benchmark tool shows anomalies, then back it down to the last stable clock speed? Is there anything more to it, and how do voltage and memory speed factor in?

I'm not looking to min/max, but if it really is as safe and easy as it's made out to be to squeeze extra performance out I'm interested to know how.

Stanley Pain
Jun 16, 2001

by Fluffdaddy

Don Lapre posted:

Nforce 2 was epic. It could take any audio source and do on-the-fly Dolby Digital encoding.

Yup. Up there with Aureal 3d as the best/most innovative sound tech to come out. Pretty sure Nforce2 was one of the first consumer level DD encoders available.

Naffer
Oct 26, 2004

Not a good chemist

axeil posted:

Sapphire NITRO RX 480 now listed on NewEgg too (but it's already on backorder)

http://www.newegg.com/Product/Product.aspx?Item=N82E16814202223

Why are so many non-reference coolers so much taller than the bracket? It's like they're determined to make people buy new cases for a mid-grade GPU. That cooler has to be greater than 50% larger than the board to which it's attached.

penus penus penus
Nov 9, 2014

by piss__donald

Naffer posted:

Why are so many non-reference coolers so much taller than the bracket? It's like they're determined to make people buy new cases for a mid-grade GPU. That cooler has to be greater than 50% larger than the board to which it's attached.

It'd cost more to make it smaller and perform the same (copper instead of aluminum, etc), but yes the board is relatively small compared to the cooler. Even the factory blower is significantly longer. At some point 150 (160 170..) watts of heat is just a physics problem to dissipate.

Also Sapphire and case manufacturers are owned by the same company, Foxconn, who are run by the Illuminati who control the economy through currency manipulation that requires consumers to constantly buy metal boxes (which, ironically, the same metal GPU coolers are made of...)

wolrah
May 8, 2006
what?

xthetenth posted:

I'd imagine there's some compute folks jerking off to the thought of being able to fit their working set in there, and a DDR4 SODIMM isn't the sort of order of magnitude thing that would do that.

I'm just not seeing where the performance advantage is over accessing those same SSDs installed in the host system itself. It's two M.2 slots as currently implemented, so the absolute best case is 4 PCIe lanes per slot worth of bandwidth each. Assuming you're not trying to build a budget compute system on a PCIe-limited platform like LGA115x there's basically no bandwidth advantage. Depending on what motherboard you have and where you put the GPUs and SSDs there may be a latency advantage. It's been confirmed to be nothing more than a PCIe switch on the card, so in theory a pair of SSDs installed on the same PCIe controller as the GPU in a normal PC could be used exactly the same way. Even if there is a latency difference from the onboard SSD versus in the host, the latency difference between RAM and SSD is so large that the difference between the two SSD configurations would be a fart in the wind by comparison.

At least with RAM you'd be able to have something faster and lower latency, though obviously significantly lower capacity.

Who knows, maybe that Intel XPoint/Optane stuff is as good as they claim and will eliminate the distinction between SSD and RAM in the long term.
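The bandwidth side of that argument is easy to put in numbers. A rough sketch using nominal PCIe 3.0 figures (these are spec line rates, not measured throughput):

```python
PCIE3_GT_PER_LANE = 8.0          # 8 GT/s raw signaling per PCIe 3.0 lane
ENCODING_EFFICIENCY = 128 / 130  # 128b/130b line coding overhead

def pcie3_gbytes_per_sec(lanes: int) -> float:
    """Approximate usable GB/s for a PCIe 3.0 link of the given width."""
    return lanes * PCIE3_GT_PER_LANE * ENCODING_EFFICIENCY / 8

print(round(pcie3_gbytes_per_sec(4), 2))   # each M.2 x4 slot: ~3.94 GB/s
print(round(pcie3_gbytes_per_sec(16), 2))  # the GPU's x16 link: ~15.75 GB/s
```

An x4 slot behind the card's switch has the same ceiling as an x4 host-attached NVMe slot, which is the point: the on-card SSDs buy locality, not bandwidth.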

EoRaptor posted:

If it turns out to be a non-starter, they can drop the logic from the vega gpu. It does raise a question about the wasting die space on the 4x0 series given the yield problems they have, but it was likely decided to test this long before that became an issue.

On the plus side, since they've confirmed it's just a PCIe switch on the board rather than some previously unknown extra interface on the GPU, and that you need to use special APIs, it seems like the magic likely happens more in drivers than in silicon. There's probably not much, if any, die space dedicated to this capability compared to what would already be required for "swapping" to system RAM.

EdEddnEddy
Apr 5, 2012



Enigma posted:

Can someone link me to a straightforward, up-to-date guide to overclocking? I picked up the Zotac GTX 1060, but the base clock is ~1550, so it sounds like there's quite a bit of room for improvement. As I understand it, the steps are to run GPU Boost 3.0 and a benchmark tool like Heaven, then incrementally up the clock (waiting between increases) until the benchmark tool shows anomalies, then back it down to the last stable clock speed? Is there anything more to it, and how do voltage and memory speed factor in?

I'm not looking to min/max, but if it really is as safe and easy as it's made out to be to squeeze extra performance out I'm interested to know how.

You just need a tool such as EVGA Precision, MSI Afterburner, or Nvidia Inspector, and start fiddling with the sliders.

Without BIOS mods or hard mods on the card itself, you pretty much can't hurt your card like in the old days. Depending on how far you push it, you can still go past stable at times, which will normally just end in a driver crash, unless you get too carried away and it goes high enough to hard crash the card/your system, which just requires a reboot to fix (so don't set any overclock to apply at boot until you find a stable setting).

The Heaven benchmark method works to see the FPS numbers move directly with settings, but it may not be enough to really show a stable max OC, so I'm a fan of setting your OC, then running 3DMark and seeing if it makes it through the Extreme test. Rarely do I seem to get artifacts like in the old GPU OC days; it's more driver crashes than anything, where I will drop it back a little bit and usually be just fine.

Usually you will want to start by turning the heat and power limits up (but not necessarily the voltage) and see what max turbo it hits compared to stock, then move up the core and memory sliders to get the max turbo closer to what you are seeing around the web. Most cards can hit over 2000 MHz turbo on the core and what, 8800 MHz on the memory? Nvidia Inspector is a good tool IMO as it is simple with a no-frills interface, but the graphs you can have it show, along with the simple sliders and the easily savable OC shortcuts, are rather handy.
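The raise-until-it-crashes, back-off-one-step procedure described in the posts above is a simple search loop. A sketch, where `is_stable` stands in for "run a benchmark pass without a driver crash or artifacts" (there is no real scripting API here; the names are made up):

```python
def find_stable_offset(is_stable, start=0, step=25, limit=300):
    """Raise the core-clock offset (MHz) in small steps until the
    stability check fails, then keep the last offset that passed."""
    offset = start
    while offset + step <= limit:
        candidate = offset + step
        if not is_stable(candidate):
            break  # crashed or artifacted: stick with the previous offset
        offset = candidate
    return offset

# Toy stand-in: pretend this particular card is stable up to +120 MHz.
print(find_stable_offset(lambda mhz: mhz <= 120))  # settles at +100
```

With a 25 MHz step the loop lands on +100, the last multiple of the step below the card's real +120 limit, which is why smaller steps (and longer benchmark runs per step) find a tighter overclock at the cost of time.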

SwissArmyDruid
Feb 14, 2014

by sebmojo

FaustianQ posted:

It matches but it's an RX470 for the first (2048SPs), no known counterpart for the second (possibly mobile?) at 1792SPs, and the standard full Polaris 11 (RX460) for the third.

Also


:siren::byodood::siren:Intel buyout of RTG confirmed.:siren::byodood::siren:

On a more serious note, do you think team red could do with a change of colors?

IDGAF, but it's not like they have any other primary colors they can shift to. What are they going to rebrand to, yellow? They already use the black version of their logo freely on their website.

PerrineClostermann
Dec 15, 2012

by FactsAreUseless
What's a good clock speed to start at with the 1070? I got EVGA's SC card.

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

PerrineClostermann posted:

What's a good clock speed to start at with the 1070? I got EVGA's SC card.

2200 MHz would be a great clock speed.

Otherwise start at whatever it boosts to stock.

The Slack Lagoon
Jun 17, 2008



I got the Gigabyte G1 Gaming because I stopped at my local Microcenter and they happened to have it in stock. I put it on "EcoMode", which I think underclocks it. Not sure it really does much. It also has an overclocking mode.

Bleh Maestro
Aug 30, 2003
IGN's daily deals page said there was an $80-off Gigabyte G1 1070 on sale today. It sold out, but did anyone see this? Was it $80 off $429 or some higher price?

If so, that's a good deal to look out for.

Phlegmish
Jul 2, 2011



Twerk from Home posted:

You'd get a lot out of a CPU upgrade. The FX-8350 was already behind its contemporaries 4 years ago, but the good news is CPUs have only gotten incrementally faster since then. In CPU heavy games like Bethesda stuff, you stand to get a lot of performance. Sweet spot right now is the i5-6600K, this comparison below is against CPUs several generations old.



Are you at 2560x1440 or 3840x2160? It's drat hard to get much past 100fps at 4K, you might have to nudge some things down to get there. I'm getting about 80fps with everything maxed on a 2500K / R9-290 at 2560x1440 after the Vulkan patch.

I'm a bit surprised, looking at PassMark it has a decent score:

https://www.cpubenchmark.net/cpu.php?cpu=AMD+FX-8350+Eight-Core

I guess maybe per core it's not very good?

Hiowf
Jun 28, 2013

We don't do .DOC in my cave.

Phlegmish posted:

I guess maybe per core it's not very good?

Yes. It's an 8 (integer) core CPU, but the individual cores aren't very fast. For everything that isn't heavily multi-threaded, it's behind modern CPUs.

Admiral Ray
May 17, 2014

Proud Musk and Dogecoin fanboy

Phlegmish posted:

I'm a bit surprised, looking at PassMark it has a decent score:

https://www.cpubenchmark.net/cpu.php?cpu=AMD+FX-8350+Eight-Core

I guess maybe per core it's not very good?

Should still be better than that i5-760, though: Comparison between 760, 8350, and 2600k. The FX-8350 has a much better single-thread rating on Passmark.

PerrineClostermann
Dec 15, 2012

by FactsAreUseless

Don Lapre posted:

2200 MHz would be a great clock speed.

Otherwise start at whatever it boosts to stock.

...looks like I'm crashing at more than +80 in Afterburner...
