Bleh Maestro
Aug 30, 2003

PC LOAD LETTER posted:

I agree in principle with you, but my post was more about pointing out that TSMC's 10nm isn't going to be anything to get too excited about, rather than suggesting they were going to dunk on Intel or anyone else.

This is true but people poo poo all over AMD when buttcoiners were driving Hawaii's prices through the roof too. I remember people were PO'd back when the 8800GTX was selling consistently for well over MSRP as well.

I guess AMD or nV could always up supply if they wanted to, in order to kick the legs out from under the price gougers, but they always seem unwilling to risk overproducing parts and just let things play out.

More like 80-90mm if rumors are correct. The pessimistic view is it'll offer 390/X levels of performance for less cost and less power. The optimistic view is it'll land somewhere between 980 and 980 Ti levels of performance for less cost and power. No one really knows exactly what it's supposed to achieve at this point.

AMD seemed to be targeting a price point of more like ~$320 if some of their earlier slides are anything to go by. There have been some rumors that it might sell for $280, but that could be for the 480 version, with the 480X maybe still going for ~$320. If they want to sell lots of them that price seems a tad high to me, though it wouldn't be a poor value at all vs the 1070, so it should still do well.

I imagine somewhere someone has a warehouse full of previous-generation GPUs and computer parts of all kinds that never sold. Or maybe it's like the Atari E.T. graveyard.


PC LOAD LETTER
May 23, 2005
WTF?!
There are small PC parts stores in my area that up until mid last year were still selling 7xx Nvidia and 6xxx AMD GPUs for near-launch MSRPs, brand new.


edit:\/\/\/\/\/\/\/\/ Vulkan/DX12 are supposed to be capable of more efficiency, so yes, if developers focus on lower power usage you'll see CPU/GPU power use go down. They could just as well decide to spend the extra performance that the higher efficiency allows on other things (i.e. better and more accurate game physics) that make the CPU/GPU use even more power than DX11 did.

My guess/hope is they build their games to scale down and up as necessary to provide the best of both worlds, depending on the circumstances and the platform they're targeting.\/\/\/\/

PC LOAD LETTER fucked around with this message at 16:47 on May 20, 2016

HMS Boromir
Jul 16, 2011

by Lowtax

EdEddnEddy posted:

Now they are going all in with IoT and 5G, and while their CPU stuff isn't going away, it just continues to be super unexciting where the GPU side of things for Nvidia/AMD has the potential to be not only exciting, but also needing more CPU power now than ever (If DX12/Vulkan is to be believed).

Shouldn't DX12/Vulkan need less CPU power than ever? I thought one of the big improvements they bring is less single threaded overhead - that hardly seems like an impetus for Intel to squeeze more blood out of that particular stone. It might improve games' ability to use 4+ threads, but increasing core/thread count isn't really that exciting either, is it? I doubt people bored by Ivy Bridge / Haswell thought the 5820K was a revelation.
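For what it's worth, the overhead argument can be sketched with a toy example: the headline DX12/Vulkan change is that command recording parallelizes across worker threads rather than serializing in the driver. A rough Python sketch of the idea — the thread and mesh counts are just illustrative, and this is not the real D3D12/Vulkan API:

```python
# Toy sketch of the DX12/Vulkan threading model (not a real graphics API):
# each worker thread records its own command list, instead of all draw
# submission funnelling through a single driver thread as in DX11.
from concurrent.futures import ThreadPoolExecutor

def record_command_list(objects):
    # Stand-in for recording draw commands for one slice of the scene.
    return [f"draw({obj})" for obj in objects]

scene = [f"mesh_{i}" for i in range(8)]
chunks = [scene[i::4] for i in range(4)]  # split the scene across 4 workers

with ThreadPoolExecutor(max_workers=4) as pool:
    command_lists = list(pool.map(record_command_list, chunks))

# The recorded lists are then submitted to the GPU queue together.
submitted = [cmd for cl in command_lists for cmd in cl]
print(len(submitted), "draw commands recorded across", len(command_lists), "threads")
```

Whether that shows up as less total CPU use or more work done per frame is up to the engine, which is basically the disagreement in this thread.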

EdEddnEddy
Apr 5, 2012



HMS Boromir posted:

Shouldn't DX12/Vulkan need less CPU power than ever? I thought one of the big improvements they bring is less single threaded overhead - that hardly seems like an impetus for Intel to squeeze more blood out of that particular stone. It might improve games' ability to use 4+ threads, but increasing core/thread count isn't really that exciting either, is it? I doubt people bored by Ivy Bridge / Haswell thought the 5820K was a revelation.

Like the edit in the post above yours, being able to use more than four threads in the rendering pipeline has the potential to use more CPU power than current games have been coded for (which is why OC'ing/clock speed has been a bigger boost than just throwing cores at the problem). At the same time, with GPU performance needing to scale up in the next year or two to handle not only 4K but whatever comes after 4K and VR, CPUs are soon going to get hit with the need to feed these things, plus other number crunching that just throwing more cores at the problem isn't going to fix. Latency between the CPU and GPU(s) and frame time are only going to become more important too, with VR really being the driving force, and not only for games. Considering we're starting VR slowly, with simple graphics and such, it will only be a matter of time before they start pushing photorealism into VR the way they have been on flat screens in recent years, and the only way to do that is massively improved hardware, if Project CARS/Elite Dangerous is anything to go by. When it looks that good in VR, it is downright breathtaking and immersive as all hell.

What has pissed me off is that sure, the efficiency has gone up, but even with OC'ing, the sheer performance of a single core hasn't really moved much more than 10% per generation, which is pretty sad compared to the jump we got from Core 2 to the Core i series.

I know new architectures take a lot of time and engineering to create (look at AMD, I guess), but dammit, Intel is due for a new chip that isn't just an Atom. Nothing since Sandy Bridge-E was announced has really gotten me excited for a CPU, outside of some of the tech in the chipsets that go with those CPUs. (Sandy Bridge-E brought back what Sandy Bridge's chips seemed to have lost from the X48/X58 platform: more than the mainstream's 16/20 PCIe lanes, and quad-channel RAM rather than triple/dual channel.)

Now the new Z170 series has some neat bells and whistles that even X99 doesn't have natively, but nothing is quite pushing me to upgrade my aging X79. Outside of the extra cores that Haswell-E brought, my 4.6GHz 6-core can still hit the same ballpark as a mildly overclocked 8-core, which is great for the old tech, but sad considering how old it is in comparison now.
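For scale, those ~10% single-core steps do compound, for what little consolation that is. A quick sketch (the five-generation count is just illustrative):

```python
# Compounding a fixed per-generation single-thread gain.
def cumulative_gain(per_gen, generations):
    return (1 + per_gen) ** generations

# Hypothetical: five generations at ~10% each.
print(f"{cumulative_gain(0.10, 5):.2f}x")  # ~1.61x total
```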

feedmegin
Jul 30, 2008

EdEddnEddy posted:

What has pissed me off is that sure, the efficiency has gone up, but even with OC'ing, the sheer performance of a single core hasn't really moved much more than 10% per generation, which is pretty sad compared to the jump we got from Core 2 to the Core i series.

Well yes, that's physics I'm afraid. The days of massive jumps in core frequency from year to year are now behind us barring some really blue sky technical breakthrough that resurrects Moore's Law.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

If you can't get clock speed, and you can't get IPC out of transistors because you pretty much have all the ILP it's possible to extract, then welcome to being a post-2010 CPU maker. Single-thread IPC gains are basically gone. AMD haven't reached that cliff yet, Apple might have, and ARM's other designers are probably still closing in on it.

penus penus penus
Nov 9, 2014

by piss__donald

PC LOAD LETTER posted:



This is true but people poo poo all over AMD when buttcoiners were driving Hawaii's prices through the roof too. I remember people were PO'd back when the 8800GTX was selling consistently for well over MSRP as well.
$280, but that could be for the 480 version, with the 480X maybe still going for ~$320. If they want to sell lots of them that price seems a tad high to me, though it wouldn't be a poor value at all vs the 1070, so it should still do well.

I dunno, I don't think anybody is confused about buttcoiners driving up the cost there... but regardless, too much money for a product sucks.

Eletriarnation
Apr 6, 2005

People don't appreciate the substance of things...
objects in space.


Oven Wrangler

EdEddnEddy posted:

What has pissed me off is that sure, the efficiency has gone up, but even with OC'ing, the sheer performance of a single core hasn't really moved much more than 10% per generation, which is pretty sad compared to the jump we got from Core 2 to the Core i series.

I know new architectures take a lot of time and engineering to create (look at AMD, I guess), but dammit, Intel is due for a new chip that isn't just an Atom. Nothing since Sandy Bridge-E was announced has really gotten me excited for a CPU, outside of some of the tech in the chipsets that go with those CPUs. (Sandy Bridge-E brought back what Sandy Bridge's chips seemed to have lost from the X48/X58 platform: more than the mainstream's 16/20 PCIe lanes, and quad-channel RAM rather than triple/dual channel.)

Now the new Z170 series has some neat bells and whistles that even X99 doesn't have natively, but nothing is quite pushing me to upgrade my aging X79. Outside of the extra cores that Haswell-E brought, my 4.6GHz 6-core can still hit the same ballpark as a mildly overclocked 8-core, which is great for the old tech, but sad considering how old it is in comparison now.

I feel like the reason Intel hasn't improved this isn't for lack of trying; it's that after 35 years of scaling up, die-shrinking, and optimizing their processors, they're finding it's really hard to make things much better than they already are. Expecting big gains like clockwork assumes there's a possible design that will get you those gains, but we can't really make that assumption except by projecting past performance into the future.

I feel like the fact that there's a 22-core Broadwell that runs at 150W is really cool, but it's not that useful for games and it's insanely expensive, so it doesn't have the tangible consumer benefit that a 5GHz Kaby Lake or whatever would.

Eletriarnation fucked around with this message at 17:40 on May 20, 2016

SwissArmyDruid
Feb 14, 2014

by sebmojo
Which is why I'm looking forward, almost with a bit of desperation, to the 8-core Zens. The promise of high clocks (and commensurate IPC), eight cores, and not having to shell out an arm and a leg for a Xeon means I *can* build that multiheaded Linux box I've been yawping about for more than a year now, and not have to worry that I'm leaving performance behind because of lower server chip clocks or whatever, or paying $1000 for a shard of silicon.

Seriously, piecemeal chips cannot loving come fast enough for AMD. I have big expectations for it, both on the CPU and GPU side.

SwissArmyDruid fucked around with this message at 18:42 on May 20, 2016

Animal
Apr 8, 2003

Is PCIe x8 gonna bottleneck the GeForce 1080? I am building a new rig and thinking of putting in an M.2 SSD, which is gonna drop the video card from x16 to x8. If it's gonna reduce performance, then I'll just get a regular SATA SSD.

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!

SwissArmyDruid posted:

Which is why I'm looking forward, almost with a bit of desperation, to the 8-core Zens. The promise of high clocks (and commensurate IPC), eight cores, and not having to shell out an arm and a leg for a Xeon means I *can* build that multiheaded Linux box I've been yawping about for more than a year now, and not have to worry that I'm leaving performance behind because of lower server chip clocks or whatever, or paying $1000 for a shard of silicon.

Seriously, piecemeal chips cannot loving come fast enough for AMD. I have big expectations for it, both on the CPU and GPU side.

Yeah, I had hopes for Broadwell-E having an affordable 8-core, but it looks like they're not decreasing their per-core prices at all, and the top-end Broadwell-E part is like 75% as expensive as a Xeon :negative:. Given Intel's pricing trends I'll probably be looking to Zen as well; Intel is just getting too greedy.

BOOTY-ADE
Aug 30, 2006

BIG KOOL TELLIN' Y'ALL TO KEEP IT TIGHT

Animal posted:

Is PCIe x8 gonna bottleneck the GeForce 1080? I am building a new rig and thinking of putting in an M.2 SSD, which is gonna drop the video card from x16 to x8. If it's gonna reduce performance, then I'll just get a regular SATA SSD.

It *might* at higher resolutions, but a cursory search didn't turn up any PCIe scaling tests for the 10x0 series cards. I did find some info on the previous generation, where x8 vs x16 on a GTX 980 made a negligible difference even at higher resolutions:

https://www.techpowerup.com/reviews/NVIDIA/GTX_980_PCI-Express_Scaling/5.html

I doubt it would be a problem at 1440p and below, but you may see some differences at 4K and up.
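As a back-of-the-envelope check on why x8 rarely matters yet: PCIe 3.0 runs at 8 GT/s per lane with 128b/130b line encoding, so even the halved link still has several GB/s of one-way bandwidth. A quick sketch of the arithmetic:

```python
# Theoretical one-way PCIe 3.0 bandwidth: 8 GT/s per lane, 128b/130b encoding.
PCIE3_GTS = 8e9          # transfers per second per lane
ENCODING = 128 / 130     # usable fraction after line encoding

def pcie3_bandwidth_gbps(lanes):
    """One-way bandwidth in GB/s for a PCIe 3.0 link of the given width."""
    return PCIE3_GTS * ENCODING * lanes / 8 / 1e9

print(f"x8:  {pcie3_bandwidth_gbps(8):.2f} GB/s")   # ~7.88 GB/s
print(f"x16: {pcie3_bandwidth_gbps(16):.2f} GB/s")  # ~15.75 GB/s
```

Real frames rarely stream anywhere near 8 GB/s over the bus, which lines up with the negligible x8-vs-x16 deltas in that TechPowerUp test.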

Naffer
Oct 26, 2004

Not a good chemist

MaxxBot posted:

Yeah, I had hopes for Broadwell-E having an affordable 8-core, but it looks like they're not decreasing their per-core prices at all, and the top-end Broadwell-E part is like 75% as expensive as a Xeon :negative:. Given Intel's pricing trends I'll probably be looking to Zen as well; Intel is just getting too greedy.

We can't really blame them. They don't have much incentive to lower prices when AMD isn't nipping at their heels and overall PC sales are flat.
If I had to guess, they're probably going to be pretty conservative for a while now that they've pulled out of phone SoCs. There's some risk that ARM chips in Chromebooks will cannibalize the low end, but that might actually endanger AMD more than Intel.

Ika
Dec 30, 2004
Pure insanity


Down to 790 across the board, and for every other companies founders edition as well.

Animal
Apr 8, 2003

BOOTY-ADE posted:

It *might* at higher resolutions, but a cursory search didn't turn up any PCIe scaling tests for the 10x0 series cards. I did find some info on the previous generation, where x8 vs x16 on a GTX 980 made a negligible difference even at higher resolutions:

https://www.techpowerup.com/reviews/NVIDIA/GTX_980_PCI-Express_Scaling/5.html

I doubt it would be a problem at 1440p and below, but you may see some differences at 4K and up.

Thanks. It would be running one of these monitors. Supposedly the 1080 can run Witcher 3 at 3440x1440, and I wanna make sure of that.

repiv
Aug 13, 2009

Looks like the retailer embargo ended: https://www.overclockers.co.uk/news/get-notified-on-the-nvidia-gtx-1080-founders-edition-graphics-cards-99.html

lol at them charging a £30 premium to subject yourself to asus customer service

EdEddnEddy
Apr 5, 2012



Animal posted:

Is PCIe x8 gonna bottleneck the GeForce 1080? I am building a new rig and thinking of putting in an M.2 SSD, which is gonna drop the video card from x16 to x8. If it's gonna reduce performance, then I'll just get a regular SATA SSD.

If you are going VR, it might, and the same goes for higher-res stuff.

Forget SLI as well, unfortunately. With the higher-end 900 series stuff and now the 1080, we may finally be starting to saturate x8 lanes, and for VR we already have (you need x16 3.0 for VR SLI especially).




Also, on a separate note, I wonder if it would be possible to build your own Razer Core for an external GPU. Talking with some Intel guys at the LAN, they laugh at how simple the Core really is, and how it's currently just 99% markup for what it provides.

You would think, if that's the case, that they and everyone else would get in on the external GPU box bandwagon. I'm guessing there may be some form of Thunderbolt licensing BS to jump through before you can bring one to market.

fozzy fosbourne
Apr 21, 2010

Do review sites typically get 3rd party non-reference boards before they are for sale or after?

EdEddnEddy
Apr 5, 2012



https://twitter.com/OC3D/status/733723998853324800

What is this? I like how the graph looks HUGE when the difference is only in the hundreds of points. And a "mobile" chip that passes up a Titan X?


I really do look forward to upgrading from my ASUS G73JH and its 5870M, and it's looking like waiting until after the 900 series was the right choice.

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

fozzy fosbourne posted:

Do review sites typically get 3rd party non-reference boards before they are for sale or after?

Completely dependent on the 3rd party in question and the review site; it's very hit-and-miss, and all you can really do is keep an eye out for reviews.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

EdEddnEddy posted:

https://twitter.com/OC3D/status/733723998853324800

What is this? I like how the graph looks HUGE when the difference is only in the hundreds of points. And a "mobile" chip that passes up a Titan X?

So the successor to the "gently caress it we're stuffing a desktop 980 in this heatsink barge of a laptop", then? Desktop 1080 being 30% faster because of a TDP limit sounds plenty reasonable.

LogicalFallacy
Nov 16, 2015

Wrecking hell's shit since 1993


Animal posted:

Is PCIe x8 gonna bottleneck the GeForce 1080? I am building a new rig and thinking of putting in an M.2 SSD, which is gonna drop the video card from x16 to x8. If it's gonna reduce performance, then I'll just get a regular SATA SSD.
I'm sure if you look, you can find mobos with M.2 slots that don't share bandwidth with your PCIe x16 slot. I've got a Gigabyte GA-Z170X-Gaming 5 that has two M.2 slots, and if I use one rather than the other, I only lose my PCIe x4 slot. No big loss, since I doubt I'll ever need 3-way SLI.
There is an mITX version as well, if that's what you're going for.

EdEddnEddy
Apr 5, 2012



xthetenth posted:

So the successor to the "gently caress it we're stuffing a desktop 980 in this heatsink barge of a laptop", then? Desktop 1080 being 30% faster because of a TDP limit sounds plenty reasonable.

Yeah, that's what I was thinking, and if they use the desktop-980 laptop chassis designs, they'd have lots of room to OC, so all that R&D doesn't go to waste.

Anime Schoolgirl
Nov 28, 2002

I'm so happy that somehow in this decade 15 pound laptops are still a thing

Animal
Apr 8, 2003

EdEddnEddy posted:

If you are going VR, it might, and the same goes for higher-res stuff.

Forget SLI as well, unfortunately. With the higher-end 900 series stuff and now the 1080, we may finally be starting to saturate x8 lanes, and for VR we already have (you need x16 3.0 for VR SLI especially).

Also, on a separate note, I wonder if it would be possible to build your own Razer Core for an external GPU. Talking with some Intel guys at the LAN, they laugh at how simple the Core really is, and how it's currently just 99% markup for what it provides.

You would think, if that's the case, that they and everyone else would get in on the external GPU box bandwagon. I'm guessing there may be some form of Thunderbolt licensing BS to jump through before you can bring one to market.

SLI is out of the question because it's a mITX build, which is why I'm getting the 1080 instead of 2x 1070s. I'll definitely be getting into VR, so if PCIe 3.0 @ x8 is gonna have a performance hit, then screw the M.2 drive.

ItBurns
Jul 24, 2007

Anime Schoolgirl posted:

I'm so happy that somehow in this decade 15 pound laptops are still a thing

But I need to play Dota and compile my CS101 homework.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

ItBurns posted:

But I need to play Dota and compile my CS101 homework.

Get an i7 for your surface pro.

Slider
Jun 6, 2004

POINTS
That graph is straight out of fox news

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

Slider posted:

That graph is straight out of fox news

Computer component manufacturer graphs.png
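The trick those graphs pull is just a cropped baseline, and a couple of lines of arithmetic show how a few-hundred-point gap gets inflated (the scores below are made up for illustration):

```python
# Visual bar-height ratio when the y-axis starts at `baseline` instead of 0.
def apparent_ratio(a, b, baseline=0):
    return (b - baseline) / (a - baseline)

honest = apparent_ratio(9000, 9400)          # axis from 0: bars differ by ~4%
cropped = apparent_ratio(9000, 9400, 8900)   # axis cropped at 8900: 5x taller bar
print(f"honest {honest:.2f}x, cropped {cropped:.1f}x")
```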

PerrineClostermann
Dec 15, 2012

by FactsAreUseless
Your favorite resident accent has released a "review" of the Founder's Edition

https://www.youtube.com/watch?v=myDYnofz_JE

penus penus penus
Nov 9, 2014

by piss__donald
:lol: that fuckin graph

teh_Broseph
Oct 21, 2010

THE LAST METROID IS IN
CATTIVITY. THE GALAXY
IS AT PEACE...
Lipstick Apathy

PerrineClostermann posted:

Your favorite resident accent has released a "review" of the Founder's Edition

https://www.youtube.com/watch?v=myDYnofz_JE

Don't have time to summarize, but this is a pro click

Ika
Dec 30, 2004
Pure insanity

EdEddnEddy posted:

https://twitter.com/OC3D/status/733723998853324800

What is this? I like how the graph looks HUGE when the difference is only in the hundreds of points. And a "mobile" chip that passes up a Titan X?


I really do look forward to upgrading from my ASUS G73JH and its 5870M, and it's looking like waiting until after the 900 series was the right choice.

Looks like somebody enjoyed their copy of "How to Lie with Statistics."

PerrineClostermann
Dec 15, 2012

by FactsAreUseless

teh_Broseph posted:

Don't have time to summarize, but this is a pro click

I generally find his videos enlightening

Krailor
Nov 2, 2001
I'm only pretending to care
Taco Defender

Animal posted:

SLI is out of the question because it's a mITX build, which is why I'm getting the 1080 instead of 2x 1070s. I'll definitely be getting into VR, so if PCIe 3.0 @ x8 is gonna have a performance hit, then screw the M.2 drive.

If you're buying new, then Skylake added 4 PCIe lanes (20 total), so you should be able to run an NVMe SSD at x4 and still have 16 lanes left over for your video card.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

Krailor posted:

If you're buying new then Skylake added 4 PCIe lanes (20 total) so that you should be able to run a NVME SSD @x4 and still have 16 lanes left over for your video cards.

The 170 chipsets also have a ton of 3.0 lanes.

Animal
Apr 8, 2003

Krailor posted:

If you're buying new then Skylake added 4 PCIe lanes (20 total) so that you should be able to run a NVME SSD @x4 and still have 16 lanes left over for your video cards.

So this motherboard combined with an i7 6600k should still give me 16x PCIe lanes?

future ghost
Dec 5, 2005

:byetankie:
Gun Saliva

xthetenth posted:

MSI is held to be one of the good ones, so hopefully it works right, and you've got what might be the single most gracefully aging card in history, so not trading in seems like a viable option.
Just heard back from MSI: apparently they can't just replace the fan on the card, sadly, and they're out of 290s. They're going to replace it with a 390 8GB though, so there's that :black101:

Tanreall
Apr 27, 2004

Did I mention I was gay for pirate ducks?

~SMcD

Animal posted:

So this motherboard combined with an i7 6600k should still give me 16x PCIe lanes?

Are Killer NICs better now, or are they still unstable?


Animal
Apr 8, 2003

Tanreall posted:

Are Killer NICs better now or are they unstable still?

drat, I didn't realize it had that turd. I hate that snake oil company. But the model with an Intel chip is not as good at other things. It's so hard to find a good mITX board.
