Chuu
Sep 11, 2004

Grimey Drawer

Blackfyre posted:

Well, realistically speaking and based on past experience, I would expect Nvidia to push ahead when they do push out a new round of cards. I'd expect something around Sept/Oct.

The demand for the compute version of Pascal is so ridiculously high right now that I'd expect the consumer versions to take significantly longer than normal to appear, or to launch with extremely limited supplies. People have been throwing around Q1 2017 for quantity shipments, but I wouldn't be surprised if supplies are tight even that far out.

fozzy fosbourne
Apr 21, 2010

Trying here since I didn't get a response in the monitor thread and hopefully some turbo goon has tried both here:
Has anyone played the same game at 120Hz/120fps with ULMB (maybe at 1080p), as well as at an ultrawide resolution at 60-100fps with G-Sync? How did the experience compare?

I'm curious whether driving a 1080p monitor at 120Hz with motion blur reduction is a better experience than 21:9 at a variable, lower fps propped up a bit by G-Sync. I sometimes daydream about the old days of playing arena shooters on CRTs. I'm also intensely curious about gaming in 21:9.

Also, there isn't a general PC Gaming thread in games anymore? Sorry if this is off-topic

Josh Lyman
May 24, 2009


Nvidia should lease some production capacity from Intel.

SwissArmyDruid
Feb 14, 2014

by sebmojo

Josh Lyman posted:

Nvidia should lease some production capacity from Intel.

Intel's production capacity is for Intel. That is not going to change any time in the near or foreseeable future. Their fabs are something they guard very jealously; realistically, the only other company out there that's as vertically integrated from design to end product is Samsung.

Owning their own fabs and controlling their own production is something they will guard very jealously.

SwissArmyDruid fucked around with this message at 03:26 on Apr 11, 2016

Josh Lyman
May 24, 2009


It was tongue-in-cheek, especially since the media is proclaiming the death of the PC.

SwissArmyDruid
Feb 14, 2014

by sebmojo

Josh Lyman posted:

It was tongue-in-cheek, especially since the media is proclaiming the death of the PC.

The media has been proclaiming the death of the PC for a decade. It's not about to die any time soon, least of all on the eve of proper VR.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

Josh Lyman posted:

It was tongue-in-cheek, especially since the media is proclaiming the death of the PC.

There is no such thing as the cloud, just somebody else's computer.

Josh Lyman
May 24, 2009


I've never tried a VR headset, but it seems to me that 960x1080 per eye isn't nearly enough resolution.

Regardless, it's at least $1500 for the PC + headset and I'm not sure the market is that large.

Josh Lyman
May 24, 2009


xthetenth posted:

There is no such thing as the cloud, just somebody else's computer.
Sure, but every server blade with a Xeon supplants, what, maybe 10 consumers?

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

Josh Lyman posted:

Sure, but every server blade with a Xeon supplants, what, maybe 10 consumers?

Supplants some, supplements others.

LiquidRain
May 21, 2007

Watch the madness!

Josh Lyman posted:

I've never tried a VR headset, but it seems to me that 960x1080 per eye isn't nearly enough resolution.

Regardless, it's at least $1500 for the PC + headset and I'm not sure the market is that large.
Not to mention the screens are PenTile, not proper RGB stripe. (The PlayStation VR screen is only 1080p, but it's RGB stripe, so it's a toss-up.) The OLEDs are manufactured with some particular properties, though: decreased distance/wire width between pixels (which reduces the "screen door effect"), as well as global refresh, where the whole panel updates at once rather than being scanned out line by line.
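As a rough illustration of what the PenTile vs. RGB-stripe distinction means for subpixel counts, here's a minimal sketch; it assumes the usual ~2-subpixels-per-pixel approximation for PenTile RGBG versus 3 for a full RGB stripe, and uses the 1080x1200-per-eye figure that comes up later in the thread:

```python
# Rough subpixel-count comparison: PenTile (RGBG) vs. full RGB stripe.
# PenTile shares red/blue subpixels, so it averages ~2 subpixels per pixel,
# versus 3 for an RGB-stripe panel. Approximation only, not a layout model.

def subpixels(width, height, per_pixel):
    return width * height * per_pixel

rift_vive_eye = subpixels(1080, 1200, 2)   # PenTile OLED, per eye
psvr_panel    = subpixels(1920, 1080, 3)   # 1080p RGB stripe, whole panel (shared by both eyes)

print(f"Rift/Vive, per eye (PenTile): {rift_vive_eye / 1e6:.2f} M subpixels")
print(f"PSVR, whole panel (RGB):      {psvr_panel / 1e6:.2f} M subpixels")
```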

Watermelon Daiquiri
Jul 10, 2010
I TRIED TO BAIT THE TXPOL THREAD WITH THE WORLD'S WORST POSSIBLE TAKE AND ALL I GOT WAS THIS STUPID AVATAR.

SwissArmyDruid posted:

Intel's production capacity is for Intel. That is not going to change any time in the near or foreseeable future. Their fabs are something they guard very jealously; realistically, the only other company out there that's as vertically integrated from design to end product is Samsung.

Owning their own fabs and controlling their own production is something they will guard very jealously.

It's funny, I was interviewing with Samsung a few weeks ago, and one thing they really wanted to hype up was how superior they were to Intel, process- and fab-wise.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

Watermelon Daiquiri posted:

It's funny, I was interviewing with Samsung a few weeks ago, and one thing they really wanted to hype up was how superior they were to Intel, process- and fab-wise.

It hasn't been that long since they bought a jump start by poaching one of TSMC's engineers, right?

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

Watermelon Daiquiri posted:

It's funny, I was interviewing with Samsung a few weeks ago, and one thing they really wanted to hype up was how superior they were to Intel, process- and fab-wise.

Isn't Intel's 14nm process superior? I'm not sure how Samsung can hype something that's pretty checkable like that, unless the higher-end measurements for Intel's 14nm are correct (70x60nm vs 78x64nm) and Samsung has far fewer defects on 14LPP?
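For what it's worth, the two cell-size figures quoted above can be turned into a quick area comparison. A minimal sketch, taking the post's 70x60nm and 78x64nm numbers at face value rather than as official foundry data:

```python
# Compare the two "14nm" cell footprints given in the post above
# (70x60 nm attributed to Intel vs. 78x64 nm attributed to Samsung 14LPP).
# These are the thread's numbers, so treat the result as illustrative only.

intel_cell   = 70 * 60   # nm^2
samsung_cell = 78 * 64   # nm^2

print(f"Intel cell:   {intel_cell} nm^2")
print(f"Samsung cell: {samsung_cell} nm^2")
print(f"Samsung cell area is {samsung_cell / intel_cell - 1:.1%} larger")
```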

Anime Schoolgirl
Nov 28, 2002

Intel had (and probably still has, given the number of Skylake fuckups we've seen) teething issues with the 14nm node right when Samsung was rolling production on 14LPP.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Josh Lyman posted:

I've never tried a VR headset, but it seems to me that 960x1080 per eye isn't nearly enough resolution.

Regardless, it's at least $1500 for the PC + headset and I'm not sure the market is that large.

Not that it's a huge difference, but the shipping Rift and Vive use two (PenTile) OLED panels, each 1080x1200.

Yes, of course you can notice the matrix, but after you've been playing for a bit, it's surprisingly immersive. (The Vive is my reference here.)

penus penus penus
Nov 9, 2014

by piss__donald

FaustianQ posted:

Isn't Intel's 14nm process superior? I'm not sure how Samsung can hype something that's pretty checkable like that, unless the higher-end measurements for Intel's 14nm are correct (70x60nm vs 78x64nm) and Samsung has far fewer defects on 14LPP?

It was also an interview, and that's what every company does. The interviewer is either regurgitating things they know nothing about, has really drunk the Kool-Aid, or is just fibbing a little.

SwissArmyDruid
Feb 14, 2014

by sebmojo

FaustianQ posted:

Isn't Intel's 14nm process superior? I'm not sure how Samsung can hype something that's pretty checkable like that, unless the higher-end measurements for Intel's 14nm are correct (70x60nm vs 78x64nm) and Samsung has far fewer defects on 14LPP?

Anime Schoolgirl posted:

Intel had (and probably still has, given the number of Skylake fuckups we've seen) teething issues with the 14nm node right when Samsung was rolling production on 14LPP.

There's also that question of "what qualifies as 14nm", because there is no standardizing body that regulates what a 14nm process or a 20nm process actually is.

I think it was Intel that's been basically making mountains on their dies, with base layer transistors having a 65nm feature size, then the next layer up being 45nm, and so forth all the way up to the very tops of these silicon mountains where they've got a 14nm feature. So yes, it's got 14nm features, but not all the way through.

(Partly because getting 14nm features all the way through isn't actually possible. If you tried to have 14nm power delivery features, for example, you'd slag them in an instant, the decrease in CPU voltages from Sandy to Haswell notwithstanding.)

Watermelon Daiquiri
Jul 10, 2010
I TRIED TO BAIT THE TXPOL THREAD WITH THE WORLD'S WORST POSSIBLE TAKE AND ALL I GOT WAS THIS STUPID AVATAR.
Yeah, one of the managers I was talking to was quite certain that Intel's was 16nm, which for all I know it could be, with them just calling it 14nm (which is really more of an academic difference, I would think).

Gwaihir
Dec 8, 2009
Hair Elf

Josh Lyman posted:

Sure, but every server blade with a Xeon supplants, what, maybe 10 consumers?

Sure, and each Xeon (of which you probably have 2-4 per box) costs 10 times as much as whatever the current-generation i5 is, or over 20 times more if it's a quad-socket version. The largest server chips are $6-7k alone.


fozzy fosbourne posted:

Trying here since I didn't get a response in the monitor thread and hopefully some turbo goon has tried both here:
Has anyone played the same game at 120Hz/120fps with ULMB (maybe at 1080p), as well as at an ultrawide resolution at 60-100fps with G-Sync? How did the experience compare?

I'm curious whether driving a 1080p monitor at 120Hz with motion blur reduction is a better experience than 21:9 at a variable, lower fps propped up a bit by G-Sync. I sometimes daydream about the old days of playing arena shooters on CRTs. I'm also intensely curious about gaming in 21:9.

Also, there isn't a general PC Gaming thread in games anymore? Sorry if this is off-topic

I don't have an ultrawide, but I do have the Acer XB270HU and a Dell 3014 (somewhat similar huge dimensions, if not resolution, to a 34" 21:9). With the caveat that I'm not really very sensitive to any but the most extreme tearing, the experience is extremely good with nothing else but running the monitor at 120-144Hz, even if you aren't actually pushing that many FPS. G-Sync adds to it for sure, but the effect is far more subtle than what I consider the super obvious increase in perceived smoothness from hopping up from 60Hz to 100+.

I would say how you rank an ultrawide depends on what types of games you play. Primarily sims or racing games? Definitely get something like the X34 and go crazy. FPS? I think the typical ultrawide is just slightly too large for comfortable FPS play, while something like the 27" XB270HU is just about perfect. (Super CS pros or w/e would probably say that 24" is ideal, but I'm not nearly pro enough to care about that.)

Given that choice, I would definitely prefer G-Sync at a higher resolution varying between 75-100 fps over a lower-resolution 1080p screen with no G-Sync at 120. There's just not enough perceptible gain (for me) going from 75-100 variable fps to a locked 120 or w/e to be at all worth it.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

FaustianQ posted:

So wait, would DDR4 in fact make a decent enough performance boost over DDR3? Say a 2GB DDR4 @ 4000MHz card with a 128-bit bus; I'm likely miscalculating, but that should be ~64GB/s? There should be at least some PCB area and power savings, right?
Yes, DDR4 would be better than DDR3 for graphics, but since it requires new memory controllers, don't expect to see it in discrete video cards.
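The back-of-the-envelope number in the quoted post does check out, for what it's worth. Here's the arithmetic as a minimal sketch (peak theoretical bandwidth, ignoring efficiency losses):

```python
# Peak theoretical bandwidth = effective transfer rate * bus width in bytes.
# "4000MHz" DDR4 in marketing terms means 4000 MT/s effective.

def peak_bandwidth_gb_s(transfer_rate_mt_s, bus_width_bits):
    return transfer_rate_mt_s * 1e6 * (bus_width_bits / 8) / 1e9

print(peak_bandwidth_gb_s(4000, 128))  # -> 64.0 GB/s, matching the ~64GB/s estimate
```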

Durinia
Sep 26, 2014

The Mad Computer Scientist
It's all academic at this point, in terms of "xx nm". The real numbers are around density, power, and reliability, which aren't examined in much detail publicly (especially for Intel).

The overarching story is that Intel had a huge fab lead for a while, but the lead is shrinking very quickly as they're the first company to slam full-on into the technological scaling cliff. Both TSMC and Samsung are getting much closer, and they've got a much more robust ecosystem of tools for third parties to design chips. Intel actually tried to offer their fabs out for contract via OpenSilicon, but the effort didn't go anywhere, as they're not used to giving info/tools to anyone outside Intel.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

For the *sync question it's worth remembering that the higher the framerate the less *sync does. It's really just moving frames to when they should be displayed, and the faster the screen is going the less difference there can be between where the frame is and where the frame should be.

Then again, I seem to be really insensitive to framerates above ~45, so my XR341CK is kind of wasted on me and I'm not the right guy to ask. It seems a bit better, but not the huge deal it is for some.
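One way to see why adaptive sync matters less as the refresh rate climbs: the most it can ever save you is the wait for the next fixed refresh, and that window shrinks with the refresh rate. A minimal sketch of that worst-case wait, ignoring render queues and scanout details:

```python
# Worst-case added delay when a frame just misses a fixed refresh:
# without adaptive sync you wait up to one full refresh interval for the
# next vblank. Adaptive sync can display the frame right away instead,
# so the maximum possible saving shrinks as the refresh rate goes up.

for hz in (60, 100, 120, 144):
    refresh_interval_ms = 1000 / hz
    print(f"{hz:>3} Hz: worst-case wait for next refresh = {refresh_interval_ms:.1f} ms")
```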

EdEddnEddy
Apr 5, 2012



Alereon posted:

Yes, DDR4 would be better than DDR3 for graphics, but since it requires new memory controllers, don't expect to see it in discrete video cards.

They did it on one card back in the day.

Ended up being too expensive but it was interesting to see for a time.

Anime Schoolgirl
Nov 28, 2002

*GDDR4

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

EdEddnEddy posted:

They did it on one card back in the day.

Ended up being too expensive but it was interesting to see for a time.

I've already been wrong on this once, but I'm pretty sure that one was actually GDDR4, which is based on DDR3.

fozzy fosbourne
Apr 21, 2010

xthetenth posted:

For the *sync question it's worth remembering that the higher the framerate the less *sync does. It's really just moving frames to when they should be displayed, and the faster the screen is going the less difference there can be between where the frame is and where the frame should be.

Then again I seem to be really insensitive to framerates above ~45 so my XR341CK is kind of wasted on me and I'm not the right guy to ask. It seems a bit better but not the huge deal it makes to some.

Do games still have a connection between mouse polling and framerates? Also, does this affect analog sticks too? I seem to remember this being the case unless the game engine had a "raw input" override (which never seemed to be on by default). This information could be way out of date, but I remember looking into this back in the Quake days, and it made a big difference in aiming at higher framerates.
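On the polling question, here's a toy simulation (not a claim about any particular engine) of one way input can end up coupled to framerate: if an engine only reads the most recent mouse delta once per frame instead of accumulating everything since the last frame, motion gets dropped at lower framerates, which is roughly what raw-input/accumulation paths avoid. The numbers are made up for illustration:

```python
# Toy illustration of frame-coupled input sampling vs. accumulating deltas.
# Hypothetical setup: a 1000 Hz mouse reporting 1 count per poll for one
# second, read by a game loop running at 125 fps.

mouse_hz, frame_hz = 1000, 125
deltas = [1] * mouse_hz                      # one simulated second of motion

polls_per_frame = mouse_hz // frame_hz       # 8 mouse reports per frame

# Legacy/buggy pattern: only the most recent delta is read each frame.
latest_only = sum(deltas[i] for i in range(polls_per_frame - 1, mouse_hz, polls_per_frame))

# Accumulation pattern (what raw-input handling enables): nothing is dropped.
accumulated = sum(deltas)

print(f"latest-delta-per-frame: {latest_only} counts seen")   # 125
print(f"accumulated raw input:  {accumulated} counts seen")   # 1000
```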

Fragrag
Aug 3, 2007
The Worst Admin Ever bashes You in the head with his banhammer. It is smashed into the body, an unrecognizable mass! You have been struck down.
I don't really understand what the current situation is after the presentation last week. So basically Nvidia is now focusing on first releasing chips for the lucrative niche market of complex simulations and calculations run by universities and large corporations, and consumer cards for graphics will be announced and released later, possibly around June?

I've been delaying building a new system for a while now; what would be the best bang-for-buck GPU to buy new that would hold me over until the new GPUs are released and their prices stabilize?

Josh Lyman
May 24, 2009


Fragrag posted:

I've been delaying building a new system for a while now; what would be the best bang-for-buck GPU to buy new that would hold me over until the new GPUs are released and their prices stabilize?
A 970 or a 290/390 with an aftermarket cooler, I think.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

The best deals on 290s seem to have gone away, so it's not the screaming buy it had been, although it's been making up for that by doing great in recent games.

There are two big considerations. Price/performance is a big deal, and the lower-priced 290/390/970s are probably the best bet there, although I've seen a 290X or two that haven't gotten the memo and are still in the 300-400 dollar range despite being seriously capable chips. The other is depreciation. Especially here, where you're buying a card with the explicit intention of letting its price get slammed, it's a big deal, and used value is worth considering. Cards like the 950 and 380(X) are priced reasonably for their performance, and a relatively huge drop in value puts you out less money than the higher bracket. More extreme: saying screw it and working on a game backlog means you're eating even less depreciation.

Also, NV cards generally sell better used, but apparently Ethereum mining is a thing, so an AMD card might do better than expected.

Also, 290s or 390s with the blower cooler (one fan in a rectangular box) are to be avoided unless you intend to put an aftermarket cooler on the thing.

xthetenth fucked around with this message at 19:47 on Apr 11, 2016

Edmond Dantes
Sep 12, 2007

Reactor: Online
Sensors: Online
Weapons: Online

ALL SYSTEMS NOMINAL
I was thinking about giving my old 660 Ti to a friend of mine, but I'm not quite sure his power supply will be up to the task. I have an old one here that came with my case; it's a 450W unit with a PCI-e 6-pin connector, but according to the calculator in the OP (I'm going by memory on the exact specs of his PC, unfortunately), I'd need 510W to power everything. I had the card running with a 520W Seasonic PSU, but I'm also pretty sure I have a lot more stuff in my case than he does (one 7200RPM drive, one 5400RPM drive, one SSD, two fans).

Should I take the safe route and tell him to get a new PSU, or would the 450W be enough for this?

Cheers.
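For what it's worth, those PSU calculators are basically doing a budget like the sketch below. Every wattage here is a ballpark assumption rather than a measurement of the actual parts, so it only shows why 450W is borderline rather than giving a real answer:

```python
# Very rough system power budget in the spirit of the OP's PSU calculator.
# All wattages are ballpark assumptions, not specs for the machine in question.

budget_watts = {
    "GTX 660 Ti (board power)": 150,
    "quad-core CPU under load":  95,
    "motherboard + RAM":         50,
    "drives, fans, USB, misc":   40,
}

total = sum(budget_watts.values())
target_utilisation = 0.7   # keep sustained load around 70% of the PSU rating

print(f"Estimated sustained load: ~{total} W")
print(f"Comfortable PSU rating:   ~{total / target_utilisation:.0f} W or more")
```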

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

What brand is the PSU? Coming with the case is a warning bell that it's likely not worth using at all.

Edmond Dantes
Sep 12, 2007

Reactor: Online
Sensors: Online
Weapons: Online

ALL SYSTEMS NOMINAL

xthetenth posted:

What brand is the PSU? Coming with the case is a warning bell that it's likely not worth using at all.

Vistuba; it's a local brand. Yeah, I'll tell him to get a new one; there's a reason I took it out of my case in the first place.

GokieKS
Dec 15, 2012

Mostly Harmless.

Paul MaudDib posted:

I've already been wrong on this once, but I'm pretty sure that one was actually GDDR4, which is based on DDR3.

Correct. The initial DDR4 standard hadn't even been published at the time of the 4870, and that was years before DDR4 actually came into use.

Laslow
Jul 18, 2007
It was even earlier than that: GDDR4 was used in the 3870. This was back when DDR3 DIMMs were only used on one Intel chipset, the X48, since they were so new.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

Alereon posted:

Yes, DDR4 would be better than DDR3 for graphics, but since it requires new memory controllers, don't expect to see it in discrete video cards.

Well, the original thought experiment was that, since you'd have a limit to the financial return on heavily cut P11 dies, AMD would fill out the lowest end of the 400-series lineup by lifting APU iGPUs and putting them on their own PCB. For potentially minimal fuss they'd use DDR4, since a different memory standard would require a redesign. As Paul pointed out, the DDR4 memory controller being attached to the CPU could complicate the task, and you'd end up back at square one, producing two separate GPUs with functionally identical performance.

EdEddnEddy
Apr 5, 2012



Laslow posted:

It was even earlier than that: GDDR4 was used in the 3870. This was back when DDR3 DIMMs were only used on one Intel chipset, the X48, since they were so new.

Damn, I remember that, and I'd forgotten they did that with the 3870 as well. Glad I got the X48 with DDR2 back then, though, as DDR3 was stupidly expensive at the time and not much faster.

I miss my old X48 setup a bit. Once that thing was dialed in and you got the SLI hack working (which tricked the drivers into seeing it as a 680i board, I believe), it worked fantastically with SLI 560 Tis.

I messed with SLI for the first time with 8800 GTXs (first two, then three), and while the return from the third was less than from the first two, it was neat seeing the Crysis benchmarks improve with each card added (and seeing how a 4870X2 was faster and smoother than the three 8800 GTXs, and later how the 560 Ti/HD 5870 was as fast as all three). Crazy times back then. Getting a 780i board with three 8800 GTX cards to run a stable OC was a completely different dance, and while it could be as fast as or faster than an X48, the cooling requirement was stupid.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

FaustianQ posted:

Well, the original thought experiment was that, since you'd have a limit to the financial return on heavily cut P11 dies, AMD would fill out the lowest end of the 400-series lineup by lifting APU iGPUs and putting them on their own PCB. For potentially minimal fuss they'd use DDR4, since a different memory standard would require a redesign. As Paul pointed out, the DDR4 memory controller being attached to the CPU could complicate the task, and you'd end up back at square one, producing two separate GPUs with functionally identical performance.
AMD APUs already support GDDR5; it was supposed to be offered on DIMMs, but that never came about. I wonder what prices are like for current-gen DDR4 compared to GDDR5?

Edmond Dantes
Sep 12, 2007

Reactor: Online
Sensors: Online
Weapons: Online

ALL SYSTEMS NOMINAL
Huh. A few posts ago I mentioned my new card was running a bit warm while idling (60C at some points), but since the temps under load were fine (67C) I wasn't worrying about it. I had noticed it was clocking up, but I thought it was some hung process.

Well, turns out it's fucking Hangouts. I was watching a YouTube video and faffing about on the forums when I noticed both my memory and GPU clocks were at max. I thought maybe it was YouTube, but then I started closing stuff, and the first thing I closed was a single Hangouts chat window that wasn't even on-screen at the time (it was minimized), and:


I tried it again a few more times; just opening the contacts list clocks everything up again, and it goes back to almost-idle as soon as I close it.

What the fuck, Google? :psyduck:

/edit: this is just with the Chrome extension; it doesn't happen if you're using the chat function in Gmail.

Edmond Dantes fucked around with this message at 03:35 on Apr 12, 2016

PC LOAD LETTER
May 23, 2005
WTF?!

THE DOG HOUSE posted:

It was also an interview, and that's what every company does. The interviewer is either regurgitating things they know nothing about, has really drunk the Kool-Aid, or is just fibbing a little.
It's reasonable to assume it's marketing BS because a) it's coming from the source, so of course they'll talk themselves up, and b) no process is good at everything, and it's a given there will be some things Samsung is better at and some things Intel is better at.

If Samsung said they could produce higher-quality RAM or flash memory at a lower price and in higher quantity than Intel, I wouldn't doubt it for a second. If they said they could produce the same MPUs Intel does for a lower price, in higher quantity, and/or at higher quality (clocks, lower power, whatever), I'd assume they're full of shit until proven otherwise.
