Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Craptacular! posted:

My point is many of these AMD loyalists aren't blind. They're waiting for Vega 20 at least. And if Ryzen and Coffee Lake cost the same thing they'd buy the better product. To use a personal anecdote, I used to use $350 Nexus phones and argue with iPhone-using friends who talked about how they don't mind spending twice as much to have the cushy, premium, industry-best smartphone experience. And as soon as the $350 Nexus became the $700 Pixel and flagship Android phones cost as much as iPhones do, here I am with an iPhone. I was accepting crashes, random reboots, etc. in order to have a phone half the price and felt it was an acceptable compromise. At the same price, forget it.

That's the same math I'm doing. I want a good low-light camera so it's Pixel or iPhone, and welp, I'll go for the one that actually has security updates. That or a $150 unlocked moto-whatever. I made my first-gen Moto G last almost 4 years.

I also have an X34 and a 1080, so I guess that doesn't say much as an a priori factor. I don't want to fuck with it.

ZobarStyl
Oct 24, 2005

This isn't a war, it's a moider.

Xae posted:

It would only work with a Xeon(tm) Processor and Optane(tm) brand SSDs.
It won't be that brazen, but this is definitely a chance for Intel to muscle its way into dGPUs with cards that just happen to take advantage of Intel HyperConnectPlus(tm) or somesuch proprietary bullshit. They wanted Raja so that they could leapfrog ahead of 2-3 generations of mistakes and get right to the business of leveraging their advances in memory and silicon interconnects in a way AMD can't. If their GPUs just happen to access Optane drives as another layer of cache, who are they to say no?

1gnoirents
Jun 28, 2014

hello :)

Paul MaudDib posted:

He's the tech version of Alex Jones. NVIDIA YOU DEVIL!

Charlie Demerjian too. Like he's an actual crazy person who will detail all the myriad ways he hates NVIDIA and/or Intel in every article.

It's gonna be super interesting to see who in the fandom goes which way... Lisa vs Raja cagematch go.

I used to outright feel bad for the guy, because I never really came across much he said that was misguided; rather, he was just very optimistic but plausible. However, it was all ultimately wrong and not his fault. But I kind of cooled on that when I watched the second half of the history-of-GPUs thing he put out. Though not technically wrong, it was easy to read between the lines.

But now... I mean...


Anyhoo, what a rollercoaster ride for the gpu world this week.

Craptacular!
Jul 9, 2001

Fuck the DH
I’d have to think this is at least a little bad for AMD because it’s very easy for laptops with Polaris on board to opt for Freesync displays.

I feel like the sheer ubiquity of GeForce cards combined with the premium, luxury-good pricing of Gsync has to be holding back adaptive sync somewhat, right? You give people iPads that can speed up their refresh rates dynamically and laptops (and soon maybe TVs?) with Freesync and you increase the audience of people who expect this stuff as standard, and they're going to balk at how much Nvidia's solution adds to the price of a 1440p monitor.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Craptacular! posted:

I’d have to think this is at least a little bad for AMD because it’s very easy for laptops with Polaris on board to opt for Freesync displays.

I feel like the sheer ubiquity of GeForce cards combined with the premium, luxury-good pricing of Gsync has to be holding back adaptive sync somewhat, right? You give people iPads that can speed up their refresh rates dynamically and laptops (and soon maybe TVs?) with Freesync and you increase the audience of people who expect this stuff as standard, and they're going to balk at how much it adds to the price of a 1440p monitor.

Oh god yes, Intel getting access to Freesync-capable tech immediately fucks NVIDIA even in the short term. We'll now see FreeSync-compatible iGPUs no later than a year out, and there's a solid use-case for plebs buying a shit-tier FreeSync monitor and then upgrading right through the entry-level market on FreeSync-compatible GPUs. Microsoft will be pushing it in the living-room market and the XB1X supports it.

How long do you think NVIDIA is going to stall on that? Volta will have it (but not Pascal).

Paul MaudDib fucked around with this message at 05:42 on Nov 9, 2017

1gnoirents
Jun 28, 2014

hello :)
Nvidia seems pretty good about giving features to legacy cards imo if they do open it up

Fauxtool
Oct 21, 2008

by Jeffrey of YOSPOS

1gnoirents posted:

Nvidia seems pretty good about giving features to legacy cards imo if they do open it up

they love to give their old cards features when it makes sense and there is real competition

1gnoirents
Jun 28, 2014

hello :)

Fauxtool posted:

they love to give their old cards features when it makes sense and there is real competition

yes, but say Volta does freesync because they finally changed the 0 to a 1 in the gpu.cfg after lots of hard work by a vast team of dedicated engineers, I'd expect it to backport to quite a few gens

Anime Schoolgirl
Nov 28, 2002

1gnoirents posted:

yes, but say Volta does freesync because they finally changed the 0 to a 1 in the gpu.cfg after lots of hard work by a vast team of dedicated engineers, I'd expect it to backport to quite a few gens
primitive discard made its way back to kepler and fermi for a 15-20% performance improvement across the board so this won't be surprising

Fuck, the laptop chips do the functional equivalent of freesync already

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Anime Schoolgirl posted:

primitive discard made its way back to kepler and fermi for a 15-20% performance improvement across the board so this won't be surprising

Fuck, the laptop chips do the functional equivalent of freesync already

lol the nvidia devs are actually pretty great. and you know what? Who's going to teach people how to program your uarch if it isn't you? I learned CUDA on an NVIDIA cluster because... CUDA was where all the support was at that time. I could go to an NVIDIA presentation, learn a thing, run it on university hardware (provided by NVIDIA), have a question, email it to a library author/mailing list, and get a prompt response back. Forget the library support... NVIDIA has a massive support base going on. The social factor is super hard to break here. The semi-custom business is actually super important for AMD right now, and in the world of DX12/Vulkan low-level programming it only gets more important.

Like I said... I bet the NVIDIA devs could smack like 50-100% improvement out of the AMD hardware with their DX11 MT-queue driver and shit like that. From what I've heard the problem isn't that AMD doesn't do a good job threading and optimizing... they just don't do it.
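
For flavor, this is roughly the kind of hello-world you'd write at one of those sessions - a generic SAXPY sketch of my own, not any particular NVIDIA sample:

code:
// saxpy.cu - build with: nvcc saxpy.cu -o saxpy
// One thread per element: y = a*x + y, the canonical CUDA starter kernel.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;
    float *x, *y;
    cudaMallocManaged(&x, n * sizeof(float));   // unified memory, no manual copies
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; i++) { x[i] = 1.0f; y[i] = 2.0f; }

    saxpy<<<(n + 255) / 256, 256>>>(n, 3.0f, x, y);  // grid sized to cover all n
    cudaDeviceSynchronize();

    printf("y[0] = %f (expect 5.0)\n", y[0]);
    cudaFree(x); cudaFree(y);
    return 0;
}
The code is trivial and that's the point - the value was that when it broke, there was a whole apparatus of people whose job was to tell you why.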

Paul MaudDib fucked around with this message at 06:36 on Nov 9, 2017

SwissArmyDruid
Feb 14, 2014

by sebmojo

1gnoirents posted:

yes, but say Volta does freesync because they finally changed the 0 to a 1 in the gpu.cfg after lots of hard work by a vast team of dedicated engineers, I'd expect it to backport to quite a few gens

People did testing; NVIDIA laptop chips already support freesync in all but name. (VBlank being the basis for Freesync.)

Fauxtool
Oct 21, 2008

by Jeffrey of YOSPOS
if only AMD scared them enough to force it out

Craptacular!
Jul 9, 2001

Fuck the DH

1gnoirents posted:

yes, but say Volta does freesync because they finally changed the 0 to a 1 in the gpu.cfg after lots of hard work by a vast team of dedicated engineers, I'd expect it to backport to quite a few gens

As a guy still gaming at 60hz, Fast Sync has become my favorite feature of the past few years, and they announced it with Pascal but put it on the 900 series, too. (Shrug)

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
Keplers below GK110 (780/780 Ti) did age significantly worse than the flagships. GK110 is a 7950 now at best - but it's a 7950 with GSync, while GCN 1.0 lacked FreeSync. And GK104 did not age well, it's probably below a 7770 at this point.

And Fermi sucks shit nowadays, even in DX11 but especially in DX12. It's a compatibility mode, not a performant option. Of course I suppose the AMD equivalent is TeraScale...

Paul MaudDib fucked around with this message at 07:30 on Nov 9, 2017

Arzachel
May 12, 2012

SwissArmyDruid posted:

Because that's where the paywall cut off the rest of the article.


Finally found the question to reply to: that SemiAccurate article says PCIe from CPU to GPU, EMIB from GPU to HBM.

The Anandtech article was updated a while back; they think the HBM is only directly connected to the GPU and that the CPU->GPU connection is on-package PCIe. Also, the CPU die most likely still has its iGPU, so expect some form of switching.

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
I really do wish they would all just sit down and accept a standard for adaptive sync and eventually just make it baseline in all tech, forever. It's barmy having hardware need to match display output and not the other way around. HDMI 2.1 is a good step forward. I play all my PC games on my TV because it's a hugeass 4K 65" and it looks glorious, but it's a slave to vsync with all the associated performance downsides and input delays. I hope a third GPU vendor doesn't complicate things with their own intelvision sync 9000 implementation and instead just forces everyone to draw a line under the whole thing.

Every other week or so since late 2015 I do a quick search for TVs that support adaptive sync/gsync/freesync and it's nearly 2018 and there's still literally nothing whatsoever. It's maddening. Hurry up!!!
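
To put numbers on the "slave to vsync" thing: with a fixed 60 Hz scan-out, a frame that misses a vblank gets held for the next one, so your framerate quantizes down to 60, 30, 20, 15... while adaptive sync just refreshes whenever the frame is ready. Quick toy calculation (my own made-up frame times, host-only code that builds with nvcc or any C++ compiler):

code:
// vsync_math.cu - toy numbers for fixed 60 Hz v-sync vs adaptive sync.
#include <cstdio>
#include <cmath>

int main() {
    const double vblank_ms = 1000.0 / 60.0;              // 16.67 ms per refresh
    const double frame_ms[] = {10.0, 16.0, 17.0, 25.0, 34.0};
    const int n = sizeof(frame_ms) / sizeof(frame_ms[0]);

    for (int i = 0; i < n; i++) {
        double ft = frame_ms[i];
        // v-sync: a finished frame waits for the next vblank, so each frame
        // occupies a whole number of refresh intervals
        int slots = (int)ceil(ft / vblank_ms);
        double vsync_fps = 1000.0 / (slots * vblank_ms);
        // adaptive sync: the panel refreshes when the frame is done,
        // capped at its max refresh rate
        double adaptive_fps = fmin(1000.0 / ft, 60.0);
        printf("frame %5.1f ms: v-sync %5.1f fps, adaptive %5.1f fps\n",
               ft, vsync_fps, adaptive_fps);
    }
    return 0;
}
The 17 ms row is the killer: miss the vblank by half a millisecond and vsync halves your framerate to 30, while adaptive sync barely notices.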

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
My take is that Intel first wants Raja to scale up Gen10 EUs into a high-performance dGPU for AI/ML, and then with his input finally push something like a Gen12 (Gen11 likely already in the pipeline) solution as a higher-performance replacement and for general consumer use. Again, we're still looking at, like, early 2019 for this to start to matter with even the most basic of products, but Intel needs/wants something now.

Generic Monk
Oct 31, 2011

Craptacular! posted:

As a guy still gaming at 60hz, Fast Sync has become my favorite feature of the past few years, and they announced it with Pascal but put it on the 900 series, too. (Shrug)

is there an actual writeup on what fast sync actually does under the hood? when i had a gaming pc i forced it on for everything and there really was a palpable improvement in input latency and general... smoothness, but pretty much every mention i saw of it was just quoting the nvidia line of it being a 'supercharged version of vsync' or some shit like that. i totally expected it to be snake-oil from that but it was actually really great, i've just got no idea what it actually does

Arivia posted:

Yeah AdoredTV was going on about how the AMD iGPU deal was going to kill nVidia, which is just like what the fuck.

well it's going to put the screws on them at any rate, and more competition in the pretty languid world of consumer gpus is always good. honestly i find adoredtv relatively salient and informed, he's just prone to making massive sweeping generalisations, and probably panders to the amd fanboy caucus more than he really should

Paul MaudDib posted:

That's the same math I'm doing. I want a good low-light camera so it's Pixel or iPhone, and welp, I'll go for the one that actually has security updates. That or a $150 unlocked moto-whatever. I made my first-gen Moto G last almost 4 years.

yeah i find it pretty hard to justify android unless you're looking to save money. i'm ok with the informed tradeoff of a lower entry price for ropier hardware and your data probably being mined by google and your chosen oem, but neither of those dissipate as much as i'd like as you go up in price. also the iphone can scroll without being a stuttery mess

Generic Monk fucked around with this message at 13:23 on Nov 9, 2017

Riflen
Mar 13, 2009

"Cheating bitch"
Bleak Gremlin

Generic Monk posted:

is there an actual writeup on what fast sync actually does under the hood? when i had a gaming pc i forced it on for everything and there really was a palpable improvement in input latency and general... smoothness, but pretty much every mention i saw of it was just quoting the nvidia line of it being a 'supercharged version of vsync' or some shit like that. i totally expected it to be snake-oil from that but it was actually really great, i've just got no idea what it actually does

Nvidia "Fast Sync" is triple buffered V-Sync for DirectX games employed at the driver level. Nvidia designed it to cater to the Overwatch/CSGO crowd who often play at 300 fps on a 60 Hz display. Fast Sync allows the game engine to run at max rate, avoiding back pressure that can come with standard V-Sync while maintaining synchronisation with the display. Back pressure causes latency on your controls and is horrible for games like that.

Say you're playing at 300 fps on a 60 Hz display with Fast Sync. Basically the Fast Sync algorithm is discarding most frames created by the GPU and choosing an appropriate frame to send to the display 60 times a second. This satisfies the display's requirement for V-Sync while the game processes your inputs at 300 fps.
The trade-off is judder. As you're only shown 1/5 of the frames generated, you might see some slight hitching or stuttering. Whether you notice this will probably depend on your exposure to high hz + high frames gameplay.
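
If it helps to see the mechanism, here's a toy event-loop version of the idea - purely my own illustration, nothing to do with NVIDIA's actual driver internals:

code:
// fastsync_sim.cu - toy event-loop model of the Fast Sync idea.
// Host-only code; builds with nvcc or any C++ compiler.
#include <cstdio>

int main() {
    const double render_hz = 300.0;     // game engine running flat out
    const double display_hz = 60.0;     // fixed-rate display
    const double sim_seconds = 1.0;

    // Triple buffering is abstracted to "id of the newest finished frame":
    // the renderer never blocks, it just keeps overwriting the spare buffer.
    int newest_done = -1, last_shown = -1;
    int rendered = 0, shown = 0;
    double next_render = 0.0, next_vblank = 0.0;

    double t = 0.0;
    while (t < sim_seconds) {
        if (next_render <= next_vblank) {
            t = next_render;
            newest_done = rendered++;            // finish a frame, no waiting
            next_render += 1.0 / render_hz;
        } else {
            t = next_vblank;
            if (newest_done != last_shown) {     // scan out newest complete frame
                last_shown = newest_done;
                shown++;
            }
            next_vblank += 1.0 / display_hz;
        }
    }
    printf("rendered %d frames, displayed %d, discarded %d\n",
           rendered, shown, rendered - shown);
    return 0;
}
It prints roughly "rendered 300 frames, displayed 60, discarded 240", which is the 1/5 figure above. The judder comes from the shown frames not being evenly spaced in game-time unless the two rates divide evenly.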

orcane
Jun 13, 2012

Fun Shoe

Generic Monk posted:

yeah i find it pretty hard to justify android unless you're looking to save money. i'm ok with the informed tradeoff of a lower entry price for ropier hardware and your data probably being mined by google and your chosen oem, but neither of those dissipate as much as i'd like as you go up in price. also the iphone can scroll without being a stuttery mess

Not really for this thread, but I'm wondering WTF you do with your phones or what shit bricks you've used, because other than the security patch thing, none of these issues have been present in any of the Android phones I've used or seen friends and relatives use in the past five years

:iiam:

Also the analogy doesn't really work for video cards but hey, it's Hyperbole Paul so whatever.

GRINDCORE MEGGIDO
Feb 28, 1985


orcane posted:

Not really for this thread, but I'm wondering WTF you do with your phones or what shit bricks you've used, because other than the security patch thing, none of these issues have been present in any of the Android phones I've used or seen friends and relatives use in the past five years

:iiam:

Also the analogy doesn't really work for video cards but hey, it's Hyperbole Paul so whatever.

Scrolling smoothly on my old Nexus 5X right now with Oreo; meanwhile my friend is raging about his iPhone 6 being a bag of shit on the latest iOS.

Arzachel
May 12, 2012
There are people that don't turn off all the scrolling animation crap the second they get a new phone?

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Arzachel posted:

There are people that don't turn off all the scrolling animation crap the second they get a new phone?

These are the same people who leave their brand new TV at the default settings and wonder why the colors are all fucked up and the "motion looks wrong." They are a pox upon all of us.

orcane
Jun 13, 2012

Fun Shoe

Arzachel posted:

There are people that don't turn off all the scrolling animation crap the second they get a new phone?

Same but in Windows.

Stanley Pain
Jun 16, 2001

by Fluffdaddy

Enos Cabell posted:

Is this when Nvidia announces their entrance into the CPU market?

Nvidia announces Ai86: using the power of AI, Nvidia can decode x86 instruction sets into GPGPU instruction sets at greater-than-native x86 speeds. :getin:

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️

DrDork posted:

These are the same people who leave their brand new TV at the default settings and wonder why the colors are all fucked up and the "motion looks wrong." They are a pox upon all of us.

Then there are those who put max brightness of a million suns on everything and then are completely surprised when they fail shortly after the warranty period.

Volguus
Mar 3, 2009
It is funny how attached and defensive people get about their overpriced mobile computer. Regardless of what you're paying for it, that device is not worth 10% of that money.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Volguus posted:

It is funny how attached and defensive people get about their overpriced mobile computer. Regardless of what you're paying for it, that device is not worth 10% of that money.

I'll agree with you about people getting all stupid fanboy defensive over their various bits of tech.

But if you're trying to argue that an unlimited connection to cat memes and nudie pics isn't worth at least $50, I think you need to re-evaluate your idea of what "worth" means. :colbert:

Maxwell Adams
Oct 21, 2000

T E E F S
monday: AMD's hardware is going into Intel CPUs
wednesday: Intel is going to make dGPUs
friday: The Playstation 5 is a video card, made by Apple

Eletriarnation
Apr 6, 2005

People don't appreciate the substance of things...
objects in space.


Oven Wrangler

GRINDCORE MEGGIDO posted:

Scrolling smoothly on my old Nexus 5X right now with Oreo; meanwhile my friend is raging about his iPhone 6 being a bag of shit on the latest iOS.

Nexus 5X is great while it works; I was a big fan of mine up until last weekend. Then LG's manufacturing defect caused my CPU die to start separating from the BGA and whoops, it won't boot anymore. Oh, the Pixel is twice as expensive? Wonderful...

At least Project Fi SIMs will work with any GSM phone after being activated, even if Google doesn't advertise it. I can use my old spare while waiting to see what LG support does for me.

wolrah
May 8, 2006
what?

Eletriarnation posted:

At least Project Fi SIMs will work with any GSM phone after being activated, even if Google doesn't advertise it. I can use my old spare while waiting to see what LG support does for me.

AFAIK while they work, they effectively become T-Mobile SIMs when used with an unsupported device. You lose the ability to roam onto Sprint and US Cellular, as well as the SIM-authenticated access to certain public wifi networks.

tehinternet
Feb 14, 2005

Semantically, "you" is both singular and plural, though syntactically it is always plural. It always takes a verb form that originally marked the word as plural.

Also, there is no plural when the context is an argument with an individual rather than a group. Somfin shouldn't put words in my mouth.
What model GTX 1070 has the best/quietest cooling? I've got an EVGA SC2 that I'm pretty happy with, but my buddy I'm doing a build for is pretty spergy about sound.

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

tehinternet posted:

What model GTX 1070 has the best/quietest cooling? I've got an EVGA SC2 that I'm pretty happy with, but my buddy I'm doing a build for is pretty spergy about sound.

The MSI Gaming X, iirc.

Eletriarnation
Apr 6, 2005

People don't appreciate the substance of things...
objects in space.


Oven Wrangler

wolrah posted:

AFAIK while they work, they effectively become T-Mobile SIMs when used with an unsupported device. You lose the ability to roam onto Sprint and US Cellular, as well as the SIM-authenticated access to certain public wifi networks.

Ah, interesting; I had figured I'd lose CDMA and keep LTE across all carriers, and hadn't yet noticed anything to disprove that assumption. My old spare Zenfone 2 Laser (lol) actually has noticeably better cell reception than my 5X though, to the point that even while the 5X worked I was already using the Zenfone instead with the data-only SIM some of the time and just dialing over Hangouts. Shame it's 2.4GHz and 1A charging only, but better than my backup backup Nexus 4, which doesn't even have LTE.

Anyway, it's just a stopgap until I hopefully get a replacement 5X. That will probably be subject to the same manufacturing defect and fail after another year and a half.

SwissArmyDruid
Feb 14, 2014

by sebmojo

Stanley Pain posted:

Nvidia announces Ai86: using the power of AI, Nvidia can decode x86 instruction sets into GPGPU instruction sets at greater-than-native x86 speeds. :getin:

After this week, Transmeta suddenly rising from the dead would be the LEAST weird thing so far.

Kazinsal
Dec 13, 2011


SwissArmyDruid posted:

After this week, Transmeta suddenly rising from the dead would be the LEAST weird thing so far.

Yes, please. :pray:

Generic Monk
Oct 31, 2011

Volguus posted:

It is funny how attached and defensive people get about their overpriced mobile computer. Regardless of what you're paying for it, that device is not worth 10% of that money.

depends what you do with it. my 10bit anime porn mkvs have rather depreciated the value yeah

future ghost
Dec 5, 2005

:byetankie:
Gun Saliva

tehinternet posted:

What model GTX 1070 has the best/quietest cooling? I've got an EVGA SC2 that I'm pretty happy with, but my buddy I'm doing a build for is pretty spergy about sound.
I can't complain about the cooling on my G1 1070, even with overclocking and a fairly aggressive fan profile. The rare times I've heard it, there's no high-pitched noise or anything.

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!
I have an Asus STRIX 1070, which has excellent thermals and noise due to being fuckoff huge; looking at some reviews it's probably one of the best cards in that area. The Gigabyte G1 and MSI Gaming X are both within a dB or two though, so any of those cards would probably be fine.

MaxxBot fucked around with this message at 22:12 on Nov 9, 2017

GRINDCORE MEGGIDO
Feb 28, 1985


The Palit Gamerock 1070 must own thermally. The 1080 does.
