Anime Schoolgirl
Nov 28, 2002

Truga posted:

So kaby lake is out and none of the desktop chips have edram on them. lol

Wasn't that thing in 5775c a pretty hefty boost in FPS in basically any game that's even slightly cpu bound, to the point of skylake looking extremely anemic as an upgrade? Do they just not want nerd money?
considering all of the desktop chips are scraps that are too leaky to use in laptops, not really

Aging Millenial
Nov 24, 2016

by zen death robot
How close is a 1050ti to a 970?

PerrineClostermann
Dec 15, 2012

by FactsAreUseless

Truga posted:

So kaby lake is out and none of the desktop chips have edram on them. lol

Wasn't that thing in 5775c a pretty hefty boost in FPS in basically any game that's even slightly cpu bound, to the point of skylake looking extremely anemic as an upgrade? Do they just not want nerd money?

Didn't that chip just have a much higher minimum framerate than expected?

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

Aging Millenial posted:

How close is a 1050ti to a 970?

The 970 still beats it on raw power stats (fill rate, memory bandwidth, etc.), but the 1050ti offers about 60-80% of 970 performance (depending on what you're running on it) with a full 4GB frame buffer pool at a 75W TDP.

buglord
Jul 31, 2010

Cheating at a raffle? I sentence you to 1 year in jail! No! Two years! Three! Four! Five years! Ah! Ah! Ah! Ah!

Buglord

Anime Schoolgirl posted:

considering all of the desktop chips are scraps that are too leaky to use in laptops, not really

What does this mean? Like...power efficiency?

SwissArmyDruid
Feb 14, 2014

by sebmojo
Short answer: There are two computing use cases where power efficiency is typically valued above raw performance.

One is in a server environment, where hundreds or thousands of chips, multiplied by a fractional decrease in energy consumption, multiplied by 24/7/365 equals a significant drop in power consumption, and therefore lower costs.

The other is on a laptop, running off of battery.
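
To put rough numbers on the server case (every figure below is invented, purely to show the scale):

# toy math: small per-chip savings multiply fast across a fleet running 24/7
chips = 2000                     # sockets in a modest deployment
watts_saved_per_chip = 10        # from landing a more efficient bin
hours_per_year = 24 * 365

kwh_per_year = chips * watts_saved_per_chip * hours_per_year / 1000
print(kwh_per_year)              # 175200.0 kWh/year saved
print(kwh_per_year * 0.10)       # ~17,520 USD/year at an assumed $0.10/kWh, before cooling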

Those chips which don't meet whatever mandated efficiency targets, or leak current heavily under those conditions, or can't sustain clocks without additional voltage cranking, or have whatever other problems that can be salvaged by throwing more power at them, are still used, but they are used in desktops, where they can pull all the extra power they need from the wall, or have the extra 120mm fans they need to stay cool inside a higher TDP, if need be.

This is Intel's binning strategy. They make an 'ideal' server part and an 'ideal' mobile part, and then they bin them into E7, E5, E3, and mobile i7, i5, and i3 based on how close they get to that ideal. Any server parts that pass obviously get sold as server parts for a massive markup. Any mobile parts that pass also get sold to OEMs at a markup. Any chips that fail either of those two categories get further sorted into your desktop i7s, i5s, i3s, Pentiums and Celerons, and then sold at "normal price". Then the only thing that's left is truly junk silicon.

I've glossed over a lot of detail, but that's the gist of what Anime Schoolgirl means by "desktop parts are scraps".
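
If it helps, here's a toy sketch of that sort-into-buckets flow (the function, thresholds, and cutoffs are all made up for illustration, not Intel's actual test criteria):

def bin_die(leakage_mw, volts_for_target_clock, max_stable_ghz):
    # pretend wafer-test measurements; the best dies go to the premium buckets first
    if leakage_mw < 50 and volts_for_target_clock < 1.00:
        return "server (Xeon E7/E5/E3, massive markup)"
    if leakage_mw < 80 and volts_for_target_clock < 1.05:
        return "mobile (i7/i5/i3 sold to OEMs at a markup)"
    if max_stable_ghz >= 2.5:
        return "desktop (i7/i5/i3/Pentium/Celeron at 'normal price')"
    return "scrap (truly junk silicon)"

print(bin_die(leakage_mw=95, volts_for_target_clock=1.15, max_stable_ghz=4.2))  # -> desktop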

SwissArmyDruid fucked around with this message at 10:51 on Jan 4, 2017

Setset
Apr 14, 2012
Grimey Drawer

SwissArmyDruid posted:

Those chips which don't meet whatever mandated efficiency targets, or leak current heavily under those conditions, or can't sustain clocks without additional voltage cranking, or have whatever other problems that can be salvaged by throwing more power at them, are still used, but they are used in desktops, where they can pull all the extra power they need from the wall, or have the extra 120mm fans they need to stay cool inside a higher TDP, if need be.

This is Intel's binning strategy. They make an 'ideal' server part and an 'ideal' mobile part, and then they bin them into E7, E5, E3, and mobile i7, i5, and i3 based on how close they get to that ideal. Any server parts that pass obviously get sold as server parts for a massive markup. Any mobile parts that pass also get sold to OEMs at a markup. Any chips that fail either of those two categories get further sorted into your desktop i7s, i5s, i3s, Pentiums and Celerons, and then sold at "normal price". Then the only thing that's left is truly junk silicon.


if that were true then enthusiasts would just buy mobile chips and plop them into an atx motherboard to get the l33t sp33ds

Seamonster
Apr 30, 2007

IMMER SIEGREICH
We did. Athlon XP-Ms.

Droo
Jun 25, 2003

I have an old Samsung laptop with a GeForce 650M in it. Does anyone know the max resolution I can get if I use the mini displayport output from that card to drive a single external monitor? The card itself claims to support 3840x2160, but I have seen a lot of people saying that it can really only support 2560x1600, or that it can technically support 3840 but only at 30 hertz. I'm also not sure if the specific laptop I have limits it in some way.

I don't have anything over 1920 that I can use to test it, and I don't want to order a 4k screen if the laptop won't be able to use it.

Laptop: http://www.samsung.com/us/computer/pcs/NP700Z7C-S01US-specs
GPU: http://www.geforce.com/hardware/notebook-gpus/geforce-gt-650m/specifications

Edit to add: apparently it's one of those weird setups where it has two graphics cards to save power, and the other one is an intel HD 4000 which has a limit of 2560x1600. So that's where my confusion comes from - I guess i'll just get a 2560 monitor and not try to mess with it too much.

Droo fucked around with this message at 14:41 on Jan 4, 2017

JnnyThndrs
May 29, 2001

HERE ARE THE FUCKING TOWELS

Lube banjo posted:

if that were true then enthusiasts would just buy mobile chips and plop them into an atx motherboard to get the l33t sp33ds

Can't. None of the mobile parts are 'K' models, so no overclocking, and the mobile parts typically have a lower base clock with proportionately higher Turboboost.

I did end up getting a 4790S really cheap somewhere and use it in one of my desktops. It runs at a 65 watt power envelope instead of the 95 watt (I think) rating on the non-S model. It's quite noticeably cooler, and the speed difference is only measurable when encoding video.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
The thing to keep in mind, though, is that Intel's binning is extremely exacting, so "junk" in this case is stuff that doesn't make the 95th percentile, and there is a sharp cutoff between a functioning chip and actual garbage they have to throw away. Intel is offloading their less-than-stellar chips onto desktop, but don't think they're just dropping garbage onto the market either.


Truga posted:

So kaby lake is out and none of the desktop chips have edram on them. lol

Wasn't that thing in 5775c a pretty hefty boost in FPS in basically any game that's even slightly cpu bound, to the point of skylake looking extremely anemic as an upgrade? Do they just not want nerd money?

PerrineClostermann posted:

Didn't that chip just have a much higher minimum framerate than expected?

No, the eDRAM really did make a significant difference, with up to a 55% increase in performance from the 6100 to the 6200. The 6200 is comparable to an R7 250, the 6100 to the R7 240. APUs of any stripe are hella bandwidth-bottlenecked. This is why there was huge hype about Kaveri having a GDDR5 memory controller (you'd get similar results) and why people were salivating at Raven Ridge having HBM of any kind (it'd crush eDRAM), mostly because AMD has gotten nearly the same performance as the 6200 without eDRAM (6200 vs Kaveri R7). AMD APUs with a similar solution would be a no-contest AMD win (and would have made FM2+ solutions worth buying).

The latest Iris Pro 580 sits between an R7 250X and an R7 360, but of course it's mostly a mobile part, is ridiculously overpriced for desktop, and good luck finding it in NUCs *grumbles forever*.

JnnyThndrs
May 29, 2001

HERE ARE THE FUCKING TOWELS

Droo posted:

I have an old Samsung laptop with a GeForce 650M in it. Does anyone know the max resolution I can get if I use the mini displayport output from that card to drive a single external monitor? The card itself claims to support 3840x2160, but I have seen a lot of people saying that it can really only support 2560x1600, or that it can technically support 3840 but only at 30 hertz. I'm also not sure if the specific laptop I have limits it in some way.

I don't have anything over 1920 that I can use to test it, and I don't want to order a 4k screen if the laptop won't be able to use it.

Laptop: http://www.samsung.com/us/computer/pcs/NP700Z7C-S01US-specs
GPU: http://www.geforce.com/hardware/notebook-gpus/geforce-gt-650m/specifications

DisplayPort 1.1 can't go any higher than 2560x1440 at 75Hz; DP 1.2 can do 4K (at usable refresh rates), so if you can find out which one your laptop has, that'll give you your answer. I don't see the version listed on your spec sheet, so you might have to do some sleuthing.
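
Back-of-envelope on why, ignoring blanking overhead (so real requirements run a bit higher):

def gbps(width, height, hz, bits_per_pixel=24):
    return width * height * hz * bits_per_pixel / 1e9

print(gbps(2560, 1600, 60))   # ~5.9 Gbps  -> fits DP 1.1's ~8.6 Gbps of usable bandwidth
print(gbps(3840, 2160, 60))   # ~11.9 Gbps -> needs DP 1.2's ~17.3 Gbps
print(gbps(3840, 2160, 30))   # ~6.0 Gbps  -> why some older links can manage 4K at 30Hz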

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

Lube banjo posted:

if that were true then enthusiasts would just buy mobile chips and plop them into an atx motherboard to get the l33t sp33ds

Along with what others have brought up, the mobile chips use FCBGA1440, a mobile-only BGA package that gets soldered to the board, so this is just plain old impossible.

Droo
Jun 25, 2003

JnnyThndrs posted:

DisplayPort 1.1 can't go any higher than 2560x1440 at 75Hz; DP 1.2 can do 4K (at usable refresh rates), so if you can find out which one your laptop has, that'll give you your answer. I don't see the version listed on your spec sheet, so you might have to do some sleuthing.

Thanks for the reply, I figured out that the computer also has an Intel HD4000 adapter and it switches between the two, and the HD4000 is 1.1. Do you happen to know if 2560x1600 is possible with 1.1, or is the vertical resolution limited to 1440?

Edit: Nevermind, now that I'm shopping I see the 2560x1600 is a weird expensive niche and I will end up getting a 1440 anyway.

Droo fucked around with this message at 14:58 on Jan 4, 2017

Shrimp or Shrimps
Feb 14, 2012


Aging Millenial posted:

How close is a 1050ti to a 970?

The 1050 Ti measures up better against a desktop GTX 960. It's about on par with a 970M.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

AVeryLargeRadish posted:

Along with what others have brought up, the mobile chips use FCBGA1440, a mobile-only BGA package that gets soldered to the board, so this is just plain old impossible.

Honestly the worst offender here has to be AMD's AM1, which has the same pins as their mobile socketed lineup, FS1, so you can drop a mobile chip in and fry both the chip and the board.

repiv
Aug 13, 2009

Droo posted:

Edit to add: apparently it's one of those weird setups where it has two graphics cards to save power, and the other one is an intel HD 4000 which has a limit of 2560x1600. So that's where my confusion comes from - I guess i'll just get a 2560 monitor and not try to mess with it too much.

Graphics switching works by feeding the discrete GPU's output through the integrated GPU, which then drives the monitor, so you're limited to display modes the HD4000 supports even when the 650M is enabled. 1440p should work fine though.

Truga
May 4, 2014
Lipstick Apathy

FaustianQ posted:

No, the eDRAM really did make a significant difference, with up to a 55% increase in performance from the 6100 to the 6200. The 6200 is comparable to an R7 250, the 6100 to the R7 240. APUs of any stripe are hella bandwidth-bottlenecked. This is why there was huge hype about Kaveri having a GDDR5 memory controller (you'd get similar results) and why people were salivating at Raven Ridge having HBM of any kind (it'd crush eDRAM), mostly because AMD has gotten nearly the same performance as the 6200 without eDRAM (6200 vs Kaveri R7). AMD APUs with a similar solution would be a no-contest AMD win (and would have made FM2+ solutions worth buying).

The latest Iris Pro 580 sits between an R7 250X and an R7 360, but of course it's mostly a mobile part, is ridiculously overpriced for desktop, and good luck finding it in NUCs *grumbles forever*.

I'm talking about gaming on a dGPU though.

Any heavily CPU-bound game (of which I play many) benefits a lot from having a giant pool of L4 cache, which is what the eDRAM behaves like when you're not using it for the iGPU.
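
A toy way to see why that matters, nothing like a real game's access pattern, just showing what happens when the hot working set finally fits in a cache level (the 5775C's eDRAM is 128MB acting as a victim cache):

from collections import OrderedDict
import random

def miss_rate(cache_lines, working_set, accesses=500_000):
    # uniform random accesses over a working set, filtered through an LRU cache
    random.seed(0)
    cache, misses = OrderedDict(), 0
    for _ in range(accesses):
        addr = random.randrange(working_set)
        if addr in cache:
            cache.move_to_end(addr)
        else:
            misses += 1
            cache[addr] = True
            if len(cache) > cache_lines:
                cache.popitem(last=False)
    return misses / accesses

print(miss_rate(cache_lines=8_000, working_set=32_000))    # ~0.75: set doesn't fit, constant misses
print(miss_rate(cache_lines=64_000, working_set=32_000))   # ~0.06: only cold misses once warmed up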

SwissArmyDruid
Feb 14, 2014

by sebmojo

Lube banjo posted:

if that were true then enthusiasts would just buy mobile chips and plop them into an atx motherboard to get the l33t sp33ds

Historically, this has happened, yes. It was called Athlon XP-M.

However, Intel does do one thing: mobile parts use FCBGA1440 or another BGA package, rather than the desktop socket.

e:fb
ee:fb

SwissArmyDruid fucked around with this message at 20:12 on Jan 4, 2017

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
Vega XT seems to sit between the 1080 and the Titan XP. It's super hard to tell because they had Vsync on. Don't mind the FOV; that's the default and it's measured vertically (so about 85 horizontal), so it doesn't skew the performance comparison. It's using the 8C/16T Ryzen running at 3.4GHz though, and 4K is very favorable to AMD, so grain of salt, will know more tomorrow.

Even considering a Pro version (which would sit between a 1070 and a 1080), that's still an enormous performance gap from their second-fastest chip, Polaris 10. So unless Polaris 10XT2 is revised to pick up whatever performance benefits Vega got from the uarch update, they're likely announcing another Vega as well, since that's a huge gap in their pricing structure (jumping from the ~$250 RX 480 to a $450 Vega Pro to a $600-700 Vega XT).

Seamonster
Apr 30, 2007

IMMER SIEGREICH
Guessing in the dark here but I thought it was just power consumption and therefore potentially clockspeed increases, not a huge amount of change in the uarch for the Polaris "refresh."

SwissArmyDruid
Feb 14, 2014

by sebmojo
1080 Ti paper launch confirmed via leak ahead of time: http://videocardz.com/65353/watch-nvidia-ceo-keynote-geforce-gtx-1080-ti-announcement-here

Battlefront demoed @ 4K60 on Ryzen and Vega again. Strangely, no FreeSync? http://videocardz.com/65343/amd-demos-star-wars-battlefront-on-ryzen-and-vega-at-ces2017

EdEddnEddy
Apr 5, 2012



Truga posted:

So kaby lake is out and none of the desktop chips have edram on them. lol

Wasn't that thing in 5775c a pretty hefty boost in FPS in basically any game that's even slightly cpu bound, to the point of skylake looking extremely anemic as an upgrade? Do they just not want nerd money?

That has been asked like a dozen times in here. I think the 5775C looked better in early benchmarks vs Skylake, then a BIOS fix came out that pretty much removed the bottleneck Skylake launched with and made Broadwell's boost disappear.

Since there are pretty much only two articles total, from before and after the fix, I doubt there is much benefit to getting a 5775C over a Skylake/Kaby Lake chip at this point either.

Ah found the article that talks about the fix


Also, on the mobile-chip-for-desktop front, that was what a lot of people were doing in the P4D/Core 2 days with the mobile variants. You could OC the mobile chips to some stupid levels with good cooling.

I kinda miss the old FSB OC'ing days.


*edit* Added Skylake fix article.

EdEddnEddy fucked around with this message at 19:01 on Jan 4, 2017

Truga
May 4, 2014
Lipstick Apathy
Oh. Didn't know that, thanks.

PerrineClostermann
Dec 15, 2012

by FactsAreUseless
FSB OCing was pretty hardcore; you were effectively overclocking every major bus on your system. Unlocked multipliers belonged to the stupidly expensive Extreme processors, sitting at around $1000...

Truga
May 4, 2014
Lipstick Apathy
FSB OCing was great, because it also OC'd the main memory bus by a lot, and that was huge back then. I think it's less of an issue now?

EdEddnEddy
Apr 5, 2012



PerrineClostermann posted:

FSB OCing was pretty hardcore, you were effectively overclocking every major bus on your system. Unlocked multipliers belonged to the stupidly expensive Extreme processors, sitting at around 1000 USD...

Yep, it was interesting figuring out how the other parts synced up with it, and there was some math to make the RAM work right on X38/X48, but once you got everything synced up properly, man, things flew (and the NBs got hot!)

This Article was the main kicker that taught me what I was doing wrong and how to fix it.

I was able to go from a mostly stable 3.4GHz to 3.84GHz, took the DDR2 from around 900MHz to 1081 (stock is 1066 and I didn't feel like pushing it much harder with the timings I was running), and dropped the vcore from 1.35V to 1.25 (nearly stock), with all the C-states and even standby working now. It was nuts.
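
The arithmetic behind those numbers, roughly (the 5:6 divider is my guess at the BIOS setting; the multiplier and stock FSB are the Q9550's actual specs):

multiplier = 8.5                 # the Q9550 is multiplier-locked at 8.5x
fsb = 452                        # MHz, up from the stock 333
core_mhz = multiplier * fsb      # 3842 MHz, i.e. roughly the 3.84GHz above
ddr2_mts = fsb * 2 * 6 / 5       # 5:6 FSB:DRAM divider -> ~1085 MT/s, near the 1081 above
print(core_mhz, ddr2_mts)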

I am still amazed how well that Q9550 on an X48 runs at 1GHz over stock on 1.25V vcore. The RAM latency dropped like a rock and made everything noticeably faster and much more stable. Nowhere near a modern gaming rig, but with RAID-0 SSDs it flat-out zips along for normal usage as fast as anything new today. Complete overkill for my sis and her husband, who only seem to use it for torrenting these days. :/

Anime Schoolgirl
Nov 28, 2002

FaustianQ posted:

The thing to keep in mind, though, is that Intel's binning is extremely exacting, so "junk" in this case is stuff that doesn't make the 95th percentile, and there is a sharp cutoff between a functioning chip and actual garbage they have to throw away. Intel is offloading their less-than-stellar chips onto desktop, but don't think they're just dropping garbage onto the market either.
Most desktop chips are perfectly fine, they just don't run at the ridiculously low voltages Intel wants out of chips like that. Also, Intel doesn't really frequency-bin: Skylake BCLK overclocking showed some i5-6400s hitting frequencies about as high as i5-6600Ks, whereas with Kaveri's 7700 vs 7850 APUs you pretty much know what you're getting, since both overclock by the same margins (not that that would help their issues any)

Anime Schoolgirl fucked around with this message at 20:02 on Jan 4, 2017

penus penus penus
Nov 9, 2014

by piss__donald
I'm not sure I'd use the word "junk" to describe the situation but it does make perfect sense overall

Eletriarnation
Apr 6, 2005

People don't appreciate the substance of things...
objects in space.


Oven Wrangler

Lube banjo posted:

if that were true then enthusiasts would just buy mobile chips and plop them into an atx motherboard to get the l33t sp33ds

Yes, they would. (although in this case it wasn't just badly-performing chips on desktop but a whole badly-performing architecture)

They don't anymore because the mobile chips are all BGA and all multiplier-locked too, and probably other reasons.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
Reminder guys: NVIDIA's presentation starts in a half hour, and it's already leaked that they're (paper) launching the 1080 Ti.

If anyone wants to watch live with other tech goons - let's chill and ill in SynIRC channel #shsc so we don't poo poo up the thread too bad.

Edit: I would go with Twitch's stream unless Youtube is also running one: https://www.twitch.tv/nvidia

Eletriarnation posted:

Yes, they would. (although in this case it wasn't just badly-performing chips on desktop but a whole badly-performing architecture)

They don't anymore because the mobile chips are all BGA and all multiplier-locked too, and probably other reasons.

It's funny, in the Intel thread I was just bemoaning the AMD fanclub on Reddit (and note that I think people on here are 1000% less insufferable as cheerleaders pushing THEIR TEAM), but man, were those the glory days. I owned AMD computers for 25 years of my life, from the K6-2 through the Athlon, A64, A64 X2, all the way to the Phenom II, and I only bought my first Intel about two years ago. And the P3 was pretty glorious back then too. I will never find it not funny that Intel went back to the drawing board and turned a mobile P3 into the Core architecture, while AMD tripped on a comedically placed banana peel and put out Bulldozer, which made the exact same mistakes Intel did, and now can't afford to turn their (decent) mobile architecture into a desktop powerhouse, and then oops I made myself sad :smith:

Nowadays computers just get like 5% faster per year; the only real performance gain you get is from going to more cores. I sure hope AMD kills it with Zen, because Intel's sure as hell not going to bother unless they have to compete.

Paul MaudDib fucked around with this message at 03:19 on Jan 5, 2017

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
It's time ladies and gents! Will Tom have a job tomorrow? Tune in and find out!

SwissArmyDruid
Feb 14, 2014

by sebmojo
Stream going live late, Tom's getting fired. Again.

Edit: Goddamnit, Paul.

SwissArmyDruid
Feb 14, 2014

by sebmojo
Fat finger apple ding count: 5

Tom's getting fired.

SwissArmyDruid fucked around with this message at 03:47 on Jan 5, 2017

repiv
Aug 13, 2009

SwissArmyDruid posted:

Fat finger apple ding count: 5

Tom's getting fired.

that's videocardz liveblog dinging, not the stream

stop cyberbullying tom :smith:

SwissArmyDruid
Feb 14, 2014

by sebmojo
Oh, is THAT what that was. Well that's dumb.

Omnicarus
Jan 16, 2006

I'm pretty sure I have those pants.

Edit: They are pretty good. From costco.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
Alright ladies and gents, last chance to lay your bets, what's the over/under on cores, memory, and launch price?

10 GB, Titan XP minus 1 SMX engine, $750

Comedy option: 12 GB, same cores, $800, but 970-style gimped memory controller or GDDR5 instead of GDDR5X :unsmigghh:

edit: GeForce Now (premium?) streaming service launch confirmed

edit2: Tom's laptop is 1366x768, pleb status confirmed.

Paul MaudDib fucked around with this message at 04:04 on Jan 5, 2017

SwissArmyDruid
Feb 14, 2014

by sebmojo
Tom apparently never made it back, Dave is now the sacrificial lamb.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
I'm calling it though, this is going to segue into the fact that they're streaming from the ALL NEW GTX 1080 Ti.
