1gnoirents
Jun 28, 2014

hello :)

DoctorTristan posted:

Well at least they’re giving you time to cancel once the review embargo breaks?

Seems like it, which is nice in case something goes horribly wrong. I'm sure they have a regular return policy as well, should the worst happen.

Chances are it'll be pretty good and I'll want it, but it'll be delayed for 4 months, because of course it will.


Partial Octopus
Feb 4, 2006



Does anyone know what date the review embargo ends?

B-Mac
Apr 21, 2003
I'll never catch "the gay"!

Partial Octopus posted:

Does anyone know what date the review embargo ends?

Rumors say Sept 14th.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
The more I think about it, the more I think NVIDIA is really going to lean on DLSS going forward. Like, this isn't just a throwaway gimmick: it lets them land a permanent ~30% speedup in any game they get DLSS into, for ~5% extra die area plus some deep-learning guys they already have on staff. It's good synergy for them, since it lets them leverage their DL staff into further software optimizations, and NVIDIA is all about software optimizations when they make their hardware faster/cheaper/cooler.

Very soon, display transport standards (DP/HDMI) are going to start using "visually lossless" compression (the DSC standard), and in principle this is really no different. Done right, the loss of detail should be unnoticeable unless you are literally flicking back and forth between screenshots. The question is going to be "is it bothersome enough that you are going to buy the next higher card up instead?", and the answer to that one is probably "no" for most people.

That probably means Turing is close to a 2x speedup over Pascal for most future titles if you use DLSS.
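
Back-of-the-envelope with assumed numbers (not anything NVIDIA has published): the win comes from shading fewer pixels and letting the network fill in the rest.

code:
# Toy model of a DLSS-style speedup. The internal resolution and upscale
# overhead below are assumptions for illustration, not NVIDIA's figures.
native = 3840 * 2160        # target output resolution (4K)
internal = 2560 * 1440      # assumed internal render resolution
upscale_cost = 0.15         # assumed fraction of a frame spent on the network

shading = internal / native            # ~0.44x the pixels to shade
frame_time = shading + upscale_cost    # relative to a native-res frame
print(f"effective speedup ~{1 / frame_time:.2f}x")

Even if real-world overheads eat most of that, a reliable ~30% is very plausible.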

Paul MaudDib fucked around with this message at 19:05 on Aug 23, 2018

Bloody Antlers
Mar 27, 2010

by Jeffrey of YOSPOS

SwissArmyDruid posted:

You mean, "exactly what Nvidia does right now by running GeForce Experience in the background" except with no credit transferability?

Yeah :(

From what I understand, the term "Girlfriend Experience" refers to a situation in which a prostitute maximizes her profit from an emotionally needy client by charging him extra for some kind of imaginary benefit beyond fucking him and taking his money.

With this in mind, the phrase "GeForce Experience" seems like inspired branding.

1gnoirents
Jun 28, 2014

hello :)

Bloody Antlers posted:

Yeah :(

From what I understand, the term "Girlfriend Experience" refers to a situation in which a prostitute maximizes her profit from an emotionally needy client by charging him extra for some kind of imaginary benefit beyond fucking him and taking his money.

With this in mind, the phrase "GeForce Experience" seems like inspired branding.

I really shouldn't have googled that

Unsinkabear
Jun 8, 2013

Ensign, raise the beariscope.





AVeryLargeRadish posted:

All the 1440p monitors I have seen use dual link DVI

Damn, I was hoping to pick up a 1440p monitor to match my new Thinkpad's resolution, but it only has HDMI out. Why are all 1440p screens DVI?

repiv
Aug 13, 2009

Paul MaudDib posted:

The more I think about it, the more I think NVIDIA is really going to lean on DLSS going forward. Like, this isn't just a throwaway gimmick: it lets them land a permanent ~30% speedup in any game they get DLSS into, for ~5% extra die area plus some deep-learning guys they already have on staff. It's good synergy for them, since it lets them leverage their DL staff into further software optimizations, and NVIDIA is all about software optimizations when they make their hardware faster/cheaper/cooler.

Very soon, display transport standards (DP/HDMI) are going to start using "visually lossless" compression (the DSC standard), and in principle this is really no different. Done right, the loss of detail should be unnoticeable unless you are literally flicking back and forth between screenshots. The question is going to be "is it bothersome enough that you are going to buy the next higher card up instead?", and the answer to that one is probably "no" for most people.

That probably means Turing is close to a 2x speedup over Pascal for most future titles if you use DLSS.

DLSS does raise the question of how we're supposed to benchmark cards going forward. Is DLSS cheating if the results look correct unless put under a microscope? How do you quantify "correct enough"?

If deep learning fuckery takes off then reviewers might have to start using perceptual error metrics like SSIM and plotting performance on two axes (FPS vs perceptual quality).
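
For what it's worth, SSIM is easy to compute if you have aligned captures. A minimal sketch with scikit-image (the file names are made up):

code:
# Compare a native-res capture against a DLSS capture of the same frame.
# Filenames are hypothetical; any two pixel-aligned screenshots work.
from skimage import io
from skimage.metrics import structural_similarity

native = io.imread("native_4k.png")
dlss = io.imread("dlss_4k.png")

score = structural_similarity(native, dlss, channel_axis=-1, data_range=255)
print(f"SSIM: {score:.4f}")  # 1.0 means identical; plot this against FPS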

repiv fucked around with this message at 20:54 on Aug 23, 2018

EdEddnEddy
Apr 5, 2012



Unsinkabear posted:

Damn, I was hoping to pick up a 1440p monitor to match my new Thinkpad's resolution, but it only has HDMI out. Why are all 1440p screens DVI?

Just the cheap Korean overclockable ones are DVI-only.

A normal 1440p/60 screen should work fine over HDMI.

1gnoirents
Jun 28, 2014

hello :)

Unsinkabear posted:

Damn, I was hoping to pick up a 1440p monitor to match my new Thinkpad's resolution, but it only has HDMI out. Why are all 1440p screens DVI?

As EdEddnEddy said, that's generally reserved for Korean overclockable 1440p panels with a controller board that only accepts dual-link DVI. Basically any other 1440p screen has a more advanced controller and will take regular inputs (at the expense of being locked to 60 Hz unless you really start shelling out money). HDMI also adapts easily to regular single-link DVI, should you actually find a screen that doesn't take HDMI for some reason. Just keep in mind you cannot easily adapt HDMI to dual-link DVI without spending a lot of money on active adapters that might not work anyway.
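
The dual-link requirement is just pixel-clock math. Roughly, using approximate reduced-blanking totals:

code:
# Why 2560x1440@60 needs dual-link DVI: single-link tops out at a
# 165 MHz pixel clock. Blanking totals below are rough CVT-RB values.
def pixel_clock_mhz(h, v, hz, h_blank=160, v_blank=40):
    return (h + h_blank) * (v + v_blank) * hz / 1e6

print(f"1920x1080@60: {pixel_clock_mhz(1920, 1080, 60):.0f} MHz")  # ~140, fits single link
print(f"2560x1440@60: {pixel_clock_mhz(2560, 1440, 60):.0f} MHz")  # ~242, needs dual link (~330 MHz ceiling)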

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

Unsinkabear posted:

Damn, I was hoping to pick up a 1440p monitor to match my new Thinkpad's resolution, but it only has HDMI out. Why are all 1440p screens DVI?

Sorry, I phrased that badly: I meant that if they have DVI, it's generally dual-link. HDMI and DisplayPort work just fine on them as long as they have the ports for it.

Unsinkabear
Jun 8, 2013

Ensign, raise the beariscope.





Ah cool, thanks! This laptop has an MX150 and I am beyond broke, so going above 60 Hz is not a thing I get to worry about in the next few years. Most likely I'll be running games at 1080p and low/med detail just to hit 60 fps, and using the 1440p for work and browsing.

Aeka 2.0
Nov 16, 2000

:ohdear: Have you seen my apex seals? I seem to have lost them.




Dinosaur Gum

1gnoirents posted:

As ededdneddy said, thats generally reserved for korean overclockable 1440p panels with a controller board that only accepts dual link dvi. However basically any other 1440p screen will be able to use regular DVI as they have a more advanced controller (at the expense of being locked in at 60hz unless you really start shelling out money). HDMI would easily adapt to regular DVI in those cases too should you actually find one that doesnt take HDMI for some reason. However just keep in mind you cannot easily adapt HDMI to dual link DVI without spending a lot of money on active adapters that might not work anyways

Do we know of an active adapter that actually works? I've seen a few for around 100 dollars, but who knows which ones actually work, and I'm not down to throw god knows how much money at a new 120Hz+ 1440p IPS monitor.
I've got the Korean one known as the "OVERLORD TEMPEST", which didn't last long on the market because the guy running it was shady as fuck.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

repiv posted:

DLSS does raise the question of how we're supposed to benchmark cards going forward. Is DLSS cheating if the results look correct unless put under a microscope? How do you quantify "correct enough"?

If deep learning fuckery takes off then reviewers might have to start using perceptual error metrics like SSIM and plotting performance on two axes (FPS vs perceptual quality).

Yeah, I was wondering that too. Reviewers are going to hate "it's indistinguishable, trust us!", so at best I'd expect to see a lot of charts with separate entries for ultra and ultra+DLSS, and a bunch of reviewers are going to flat-out refuse to do it. Long term, we are going to have to come to grips with whether "visually lossless" is OK, and with some way to quantify just how "visually lossless" something really is (nice euphemism; it's not lossless bitwise, that's for sure).

(I remember reading somewhere that NVIDIA is really giving reviewers the hustle on this one too, like review done and hardware shipped back within 2 weeks. I don't have a source for that, so take it with a massive grain of salt, but they definitely didn't get cards before the launch event.)

TBH I'm more OK with this than DSC. Compression in my display transport is just... no. At least here there is a very obvious performance justification for doing it.

The monitor world is fuuuucked. DP1.3 is hardly in consumer hands and we already need something better. The only way past 4K120 right now is chroma subsampling or compression. I wish we could do a dual-cable option or something.
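
The raw numbers make the wall obvious. A rough sketch of 4K120 against what DP 1.3/1.4 can actually carry:

code:
# 4K120 vs DisplayPort 1.3/1.4 usable payload (~25.92 Gbit/s after
# 8b/10b coding on HBR3 x4). Blanking overhead is ignored here.
def gbit_per_s(h, v, hz, bits_per_pixel):
    return h * v * hz * bits_per_pixel / 1e9

DP_HBR3_PAYLOAD = 25.92
for bpp, label in [(24, "8-bit RGB"), (30, "10-bit RGB")]:
    need = gbit_per_s(3840, 2160, 120, bpp)
    print(f"4K120 {label}: {need:.1f} Gbit/s of {DP_HBR3_PAYLOAD} available")

8-bit barely squeaks under before blanking overhead; 10-bit doesn't fit at all, hence chroma subsampling or DSC.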

Paul MaudDib fucked around with this message at 20:56 on Aug 23, 2018

1gnoirents
Jun 28, 2014

hello :)

Aeka 2.0 posted:

Do we know of an active adapter that actually works? I've seen a few for around 100 dollars, but who knows which ones actually work, and I'm not down to throw god knows how much money at a new 120Hz+ 1440p IPS monitor.
I've got the Korean one known as the "OVERLORD TEMPEST", which didn't last long on the market because the guy running it was shady as fuck.

StarTech or Monoprice, but you are looking at that price.

https://www.amazon.com/StarTech-com-DisplayPort-Active-Adapter-Converter/dp/B00A493CNY

I should have said something along the lines of "it probably won't work unless it costs more than $100". Personally that's hard for me to swallow, and it eventually pushed me to get rid of my Qnix. I had to give up IPS for it though, which was no small loss.

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.

punk rebel ecks posted:

Do games use tessellation still?

Yeah, it's very common nowadays. It's actually a success story of sorts; I remember everyone being very sceptical of it, but it's got its place now. Even the consoles use a lot of tessellation. At modern resolutions, old-fashioned level-of-detail model transitions would look even more obvious, so tessellation fills the gap.

Craptacular!
Jul 9, 2001

Fuck the DH
People are skeptical because TSMC's process shrink is coming up, and it's a bit unbelievable that Nvidia has left this much performance sitting on the table on this process, particularly through a year of crypto boom.

punk rebel ecks
Dec 11, 2010

A shitty post? This calls for a dance of deduction.

Zedsdeadbaby posted:

Yeah, it's very common nowadays. It's actually a success story of sorts; I remember everyone being very sceptical of it, but it's got its place now. Even the consoles use a lot of tessellation. At modern resolutions, old-fashioned level-of-detail model transitions would look even more obvious, so tessellation fills the gap.

I see. I just rarely see games with a setting to turn it on/off anymore.

repiv
Aug 13, 2009

VCZ just leaked another slide (and placed their watermark poorly)



Turing's cache is massive compared to Pascal.

teh_Broseph
Oct 21, 2010

THE LAST METROID IS IN
CATTIVITY. THE GALAXY
IS AT PEACE...
Lipstick Apathy

punk rebel ecks posted:

I see. I just rarely see games with a setting to turn it on/off anymore.

Yeah, it's common enough that they pack it in with some of the other general settings nowadays instead of giving it its own toggle; wanna say it's usually the one labeled something like "Object Detail".

Aexo
May 16, 2007
Don't ask, I don't know how to pronounce my name either.
Will using the DP port on my motherboard to drive a second monitor for out-of-game stuff (web browsing, youtube, etc) affect the performance of my GPU on my primary monitor? The mobo video is driven by the CPU, right?

1gnoirents
Jun 28, 2014

hello :)

repiv posted:

VCZ just leaked another slide (and placed their watermark poorly)



Turing's cache is massive compared to Pascal.

Oh baby those are big cache differences

Aexo posted:

Will using the DP port on my motherboard to drive a second monitor for out-of-game stuff (web browsing, youtube, etc) affect the performance of my GPU on my primary monitor? The mobo video is driven by the CPU, right?


Yes, though a few years ago this didn't work because it ate up too many PCIe lanes or something, so it may depend on your chipset. These days it's pretty common; I do it all the time on pretty shitty computers. If it doesn't plug and play, check the BIOS and make sure it's enabled; it may be called Multi Monitor Support or simply Integrated Graphics enable/disable. Sometimes this is disabled by default, especially on prebuilt machines. If it still doesn't work you may have to manually reinstall the Intel graphics drivers. I don't know why, but I've run into that a few times even though the drivers were already installed. This is a common streamer thing to do as well.
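
If you want a quick sanity check after flipping the BIOS setting, something like this (a hypothetical snippet; wmic ships with Windows) will show whether Windows sees both adapters:

code:
import subprocess

# List the video adapters Windows currently sees; expect both the Intel
# iGPU and the discrete GPU once the BIOS setting is enabled.
out = subprocess.run(
    ["wmic", "path", "Win32_VideoController", "get", "Name,Status"],
    capture_output=True, text=True, check=True,
)
print(out.stdout)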

1gnoirents fucked around with this message at 22:05 on Aug 23, 2018

Broose
Oct 28, 2007
But what do the numbers mean for those of us who love us some computer but barely know what an integrated circuit is? Computer go faster? Computer not crash? Mega Textures now Wimp Textures? WHAT

Worf
Sep 12, 2017

If only Seth would love me like I love him!

Aexo posted:

Will using the DP port on my motherboard to drive a second monitor for out-of-game stuff (web browsing, youtube, etc) affect the performance of my GPU on my primary monitor? The mobo video is driven by the CPU, right?


1gnoirents posted:

Oh baby those are big cache differences



Yes, though a few years ago this didn't work because it ate up too many PCIe lanes or something, so it may depend on your chipset. These days it's pretty common; I do it all the time on pretty shitty computers. If it doesn't plug and play, check the BIOS and make sure it's enabled; it may be called Multi Monitor Support or simply Integrated Graphics enable/disable. Sometimes this is disabled by default, especially on prebuilt machines. If it still doesn't work you may have to manually reinstall the Intel graphics drivers. I don't know why, but I've run into that a few times even though the drivers were already installed. This is a common streamer thing to do as well.

is this perchance also a way i can get around the maximum resolution displayable by my gpu?

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Broose posted:

But what do the numbers mean for those of us who love us some computer but barely know what an integrated circuit is? Computer go faster? Computer not crash? Mega Textures now Wimp Textures? WHAT

The GPU processor can keep twice as much stuff on-chip before having to go out to its memory (VRAM), and it can read from it twice as fast.

Computer go faster.
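
If you want to feel what cache hits are worth, here's a toy CPU-side demo (timings are machine-dependent; the shuffled gather loses because it misses cache constantly):

code:
import time
import numpy as np

# Same work, different access pattern: sequential indices stay cache- and
# prefetcher-friendly, the shuffled ones miss constantly.
n = 20_000_000
data = np.ones(n, dtype=np.float32)
for name, idx in [("sequential", np.arange(n)), ("random", np.random.permutation(n))]:
    t0 = time.perf_counter()
    total = data[idx].sum()   # gather through the index array, then sum
    print(f"{name}: {time.perf_counter() - t0:.2f}s (sum={total:.0f})")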

Paul MaudDib fucked around with this message at 22:14 on Aug 23, 2018

Aexo
May 16, 2007
Don't ask, I don't know how to pronounce my name either.

1gnoirents posted:

Yes, though a few years ago this didn't work because it ate up too many PCIe lanes or something, so it may depend on your chipset. These days it's pretty common; I do it all the time on pretty shitty computers. If it doesn't plug and play, check the BIOS and make sure it's enabled; it may be called Multi Monitor Support or simply Integrated Graphics enable/disable. Sometimes this is disabled by default, especially on prebuilt machines. If it still doesn't work you may have to manually reinstall the Intel graphics drivers. I don't know why, but I've run into that a few times even though the drivers were already installed. This is a common streamer thing to do as well.

Thanks, I'm finishing the build tonight and I'll check the BIOS settings if it doesn't come on. I was hoping it wouldn't affect performance, but I'll have to see how bad it is.
edit: Mostly I just didn't want the secondary display to use cycles on the GPU. I don't really care if I lose a few frames because the CPU (i7-8700K @ 5GHz) is somehow busy dealing with the secondary display.

For what it's worth I'm on an EVGA Z370 Classified K motherboard. Home built.

Aexo fucked around with this message at 22:38 on Aug 23, 2018

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

repiv posted:

VCZ just leaked another slide (and placed their watermark poorly)



Turing's cache is massive compared to Pascal.

People have been speculating they doubled the register file size too.

Wait a minute guys, it sounds like NVIDIA didn't spend the entire transistor budget on magic beans after all. Maybe they... actually know what they're doing? :thunk:



If true, Volta may be a huge beneficiary here, because it sounds like most of these changes were in Volta too. They must have just tremendously half-assed the Titan V drivers.

Paul MaudDib fucked around with this message at 22:35 on Aug 23, 2018

1gnoirents
Jun 28, 2014

hello :)

Statutory Ape posted:

is this perchance also a way i can get around the maximum resolution displayable by my gpu?

Though I've never tried it, Windows 10 is very forgiving about mixing wacky resolutions between different displays. This is something you can very likely set in the standard display settings once you have the displays going. I'm curious what GPU you have that has a lower maximum resolution than your iGPU, to be honest. (I don't know if Windows 7 was ever updated to be as versatile as Windows 10 in this regard.)


Aexo posted:

Thanks, I'm finishing the build tonight and I'll check the BIOS settings if it doesn't come on. I was hoping it wouldn't affect performance, but I'll have to see how bad it is.
edit: Mostly I just didn't want the secondary display to use cycles on the GPU. I don't really care if I lose a few frames because the CPU (i7-8700K @ 5GHz) is somehow busy dealing with the secondary display.

For what it's worth I'm on an EVGA Z370 Classified K motherboard. Home built.

It shouldn't affect performance at all unless you try to play a game on the iGPU at the same time, but that's a different matter. One thing I'm unsure of is whether you have to use borderless windowed mode on the game side. I always use that, so I don't know if using fullscreen on one side would mess things up. The borderless window performance hit, in my experience, is virtually unnoticeable, and it's otherwise a major convenience.

Worf
Sep 12, 2017

If only Seth would love me like I love him!

1gnoirents posted:

Though I've never tried it, Windows 10 is very forgiving about mixing wacky resolutions between different displays. This is something you can very likely set in the standard display settings once you have the displays going. I'm curious what GPU you have that has a lower maximum resolution than your iGPU, to be honest. (I don't know if Windows 7 was ever updated to be as versatile as Windows 10 in this regard.)

Not what I meant.

My GPU says it supports a total resolution that amounts to 2x 4K screens; I have a 4K and a 1440p. It would be neat to grab a 1080p to throw up vertically.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
I wonder what doubled cache size and bandwidth means for mining performance, heh heh.

It won't matter; mining is dead. ASICs own Ethereum now, the price is plummeting, and Vitalik is proposing cutting block rewards by 1/3 to create deflation and try to get the price under control, which will further centralize control with the ASICs and kick out anyone else who might have been interested in the Eth economy. Every other crypto is pretty much owned by ASICs now too.
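
The issuance math on that proposal, with rough era numbers (the block time and rewards below are approximations):

code:
# Rough Ethereum issuance before/after a 3 ETH -> 2 ETH block reward cut.
# ~14.5 s average block time is an approximation for this era.
SECONDS_PER_DAY = 86_400
blocks_per_day = SECONDS_PER_DAY / 14.5   # ~5,960 blocks

for reward in (3, 2):
    print(f"{reward} ETH/block -> ~{reward * blocks_per_day:,.0f} ETH minted/day")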

Paul MaudDib fucked around with this message at 23:08 on Aug 23, 2018

repiv
Aug 13, 2009

It wouldn't matter much anyway, right? I thought most cryptos access memory effectively at random, so the chance of a cache hit is negligible.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

repiv posted:

It wouldn't matter much anyway, right? I thought most cryptos access memory effectively at random, so the chance of a cache hit is negligible.

Monero pretty much lives and dies by cache. Vega has a shitload of cache, which is why Monero runs well on that card.

It's a slightly different hardness strategy: instead of trying to make it consume a lot of VRAM, you make it consume a lot of cache instead.
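
The core trick looks something like this toy loop (illustrative only, not the real CryptoNight): every address depends on the data you just read, so the scratchpad has to live in cache to go fast.

code:
import numpy as np

# Toy cache-hard kernel: a dependent random walk over a 2 MiB scratchpad.
words = (2 * 1024 * 1024) // 8
pad = np.random.randint(0, 2**62, size=words, dtype=np.int64)

idx, acc = 0, 0
for _ in range(1_000_000):
    v = int(pad[idx])
    acc ^= v
    pad[idx] = acc          # write back keeps the scratchpad hot
    idx = v % words         # next address depends on the data itself
print(hex(acc & 0xFFFFFFFF))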

1gnoirents
Jun 28, 2014

hello :)

Statutory Ape posted:

Not what I meant.

My GPU says it supports a total resolution that amounts to 2x 4K screens; I have a 4K and a 1440p. It would be neat to grab a 1080p to throw up vertically.

Man, that would be interesting to know. I've never hit that GPU limit, so I didn't even think that's what you meant. I don't see why not, but who knows, some obscure thing might stop it. As far as I've seen in the past few years, the iGPU operates very independently from any discrete GPU.

Surprise Giraffe
Apr 30, 2007
1 Lunar Road
Moon crater
The Moon
Oh god, I ordered a 2080 Ti. There's no way it will be 200% of a 1080 Ti, holy fuck

afkmacro
Mar 29, 2009



Surprise Giraffe posted:

Oh god, I ordered a 2080 Ti. There's no way it will be 200% of a 1080 Ti, holy fuck

All the giga rays my friend.

1gnoirents
Jun 28, 2014

hello :)

Surprise Giraffe posted:

Oh god, I ordered a 2080 Ti. There's no way it will be 200% of a 1080 Ti, holy fuck

You're the only other one who's admitted this so far :downs: I wonder who's being quiet

Icept
Jul 11, 2001
We need to know who ordered two 2080 Ti Founders Editions for that sick SLI setup.

Cygni
Nov 12, 2005

raring to post

I have an EVGA 2080 non-Ti on preorder with Newegg, since they don't charge till it ships, so I'll cancel it if the reviews suck.

I didn't have the chutzpah to pull the trigger on a Ti because I am a weak gamer soyboy.

EdEddnEddy
Apr 5, 2012



Icept posted:

We need to know who ordered two 2080 Ti Founders Editions for that sick SLI setup.

I have to admit, I would like to see what the scaling turns out to be. Pushing 8K could be possible with NVLink if it scales well and works independently of game support.

Guess we will just have to wait and see.

Also, the ultrawide may be my "big stupid purchase" of the month, so I am going to be good and wait to see how the Ti ends up performing before I pull any sort of trigger.

Now to sell some stuff to make $ for future big stupid purchases...


Scionix
Oct 17, 2009

hoog emm xDDD

Icept posted:

We need to know who ordered two 2080 Ti Founders Editions for that sick SLI setup.

I pre-ordered two 1080 Tis and was going to sell them, get a 2080 Ti, and go ITX. (SLI scaling is really good on Nvidia's side, btw; it's developers never supporting it that makes it shit.)

welp

e: I will probably sell these mid next year when Zen 2 comes out either way, so if anyone is interested in calling dibs I will give a good deal (they have water blocks on them)
