|
DoctorTristan posted:Well at least they’re giving you time to cancel once the review embargo breaks? Seems like it, which is nice, in case something goes horribly wrong. I'm sure they have a regular return policy as well should the worst happen. Chances are it'll be pretty good though and I'll want it, but it'll be delayed for 4 months because of course it will
|
# ? Aug 23, 2018 17:42 |
|
Does anyone know what date the review embargo ends?
|
# ? Aug 23, 2018 17:49 |
|
Partial Octopus posted:Does anyone know what date the review embargo ends? Rumors say Sept 14th.
|
# ? Aug 23, 2018 17:55 |
|
The more I think about it, the more I think NVIDIA is really going to lean on DLSS going forward. Like, this isn't just a throwaway gimmick, this lets them land a permanent 30% speedup in any game they get DLSS into, for ~5% extra die area plus some deep-learning guys they already have on staff. It's a good synergy for them, lets them leverage their DL staff into further software optimizations, and NVIDIA is all about the software optimizations when it makes their hardware faster/cheaper/cooler. Very soon, display transport standards (DP/HDMI) are going to start using "visually lossless" compression (the DSC standard); this is really no different in principle. Done right, the loss of detail should be unnoticeable unless you are literally flicking back and forth between screenshots. The question is going to be "is it bothersome enough that you are going to buy the next higher card up instead?" and the answer to that one is probably "no" for most people. That probably means Turing is close to a 2x speedup over Pascal for most future titles if you use DLSS. Paul MaudDib fucked around with this message at 19:05 on Aug 23, 2018 |
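Rough napkin math on where that kind of speedup could come from, assuming DLSS renders internally at a lower resolution and upscales to the output resolution (the internal resolution below is an assumption for illustration, not anything NVIDIA has confirmed):

```python
# Back-of-the-envelope pixel counts: if DLSS renders internally at 1440p
# and upscales to a 4K output, shading work per frame drops by ~2.25x.
# (The 1440p internal resolution is an assumed figure, not a spec.)
native = 3840 * 2160      # 4K output resolution
internal = 2560 * 1440    # assumed internal render resolution

ratio = native / internal
print(f"pixel-count ratio: {ratio:.2f}x")  # 2.25x fewer pixels shaded
```

Actual frame-time savings would be smaller than the raw pixel ratio, since not all GPU work scales with resolution and the upscale itself costs something.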
# ? Aug 23, 2018 19:02 |
|
SwissArmyDruid posted:You mean, "exactly what Nvidia does right now by running GeForce Experience in the background" except with no credit transferability? Yeah From what I understand, the term "Girlfriend Experience" refers to a situation in which a prostitute maximizes her profit from an emotionally needy client by charging him extra for some kind of imaginary benefit beyond loving him and taking his money. With this in mind, the phrase "GeForce Experience" seems like inspired branding.
|
# ? Aug 23, 2018 19:46 |
|
Bloody Antlers posted:Yeah, I really shouldn't have googled that
|
# ? Aug 23, 2018 19:47 |
|
AVeryLargeRadish posted:All the 1440p monitors I have seen use dual link DVI drat, I was hoping to pick up a 1440p monitor to match my new Thinkpad's resolution, but it only has HDMI out. Why are all 1440p screens DVI?
|
# ? Aug 23, 2018 19:51 |
|
Paul MaudDib posted:The more I think about it, the more I think NVIDIA is really going to lean on DLSS going forward. Like, this isn't just a throwaway gimmick, this lets them land a permanent 30% speedup in any game they get DLSS into, for ~5% extra die area plus some deep-learning guys they already have on staff. It's a good synergy for them, lets them leverage their DL staff into further software optimizations, and NVIDIA is all about the software optimizations when it makes their hardware faster/cheaper/cooler. DLSS does raise the question of how we're supposed to benchmark cards going forward. Is DLSS cheating if the results look correct unless put under a microscope? How do you quantify "correct enough"? If deep learning fuckery takes off then reviewers might have to start using perceptual error metrics like SSIM and plotting performance on two axes (FPS vs perceptual quality). repiv fucked around with this message at 20:54 on Aug 23, 2018 |
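For the curious, the SSIM score itself is cheap to compute. Here's a minimal numpy sketch of the standard single-window formula; real reviews would use a sliding-window implementation like scikit-image's, this just shows what the metric measures:

```python
import numpy as np

def ssim_global(x, y, dynamic_range=255.0):
    """Single-window SSIM over two whole images (the standard formula,
    minus the usual sliding Gaussian window)."""
    c1 = (0.01 * dynamic_range) ** 2   # stabilizer for the luminance term
    c2 = (0.03 * dynamic_range) ** 2   # stabilizer for the contrast term
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + c1) * (2 * cov + c2)) / \
           ((mx**2 + my**2 + c1) * (vx + vy + c2))

rng = np.random.default_rng(0)
frame = rng.integers(0, 256, (64, 64)).astype(np.float64)
noisy = frame + rng.normal(0, 10, frame.shape)  # simulated reconstruction error

print(ssim_global(frame, frame))  # identical images score ~1.0
print(ssim_global(frame, noisy))  # a degraded copy scores below 1.0
```

A reviewer could run this per-frame between native and DLSS output and plot the distribution alongside FPS.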
# ? Aug 23, 2018 19:52 |
|
Unsinkabear posted:drat, I was hoping to pick up a 1440p monitor to match my new Thinkpad's resolution, but it only has HDMI out. Why are all 1440p screens DVI? Just the Korean cheap overclockable ones are DVI-only. A normal 1440p/60 screen should work fine with HDMI.
|
# ? Aug 23, 2018 20:02 |
|
Unsinkabear posted:drat, I was hoping to pick up a 1440p monitor to match my new Thinkpad's resolution, but it only has HDMI out. Why are all 1440p screens DVI? As ededdneddy said, that's generally reserved for Korean overclockable 1440p panels with a controller board that only accepts dual-link DVI. Basically any other 1440p screen will be able to use regular single-link DVI, as they have a more advanced controller (at the expense of being locked in at 60Hz unless you really start shelling out money). HDMI adapts easily to regular DVI in those cases too, should you actually find one that doesn't take HDMI for some reason. Just keep in mind you cannot easily adapt HDMI to dual-link DVI without spending a lot of money on active adapters that might not work anyway
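The reason those panels demand dual link comes down to raw pixel clock: single-link DVI tops out at 165 MHz, and 1440p60 needs well over that even with reduced blanking. A quick sketch using approximate CVT-RB timings (the blanking figures below are ballpark assumptions, just for the arithmetic):

```python
# Approximate CVT-RB (reduced blanking) timings for 2560x1440 @ 60 Hz.
# Blanking intervals are ballpark figures for illustration.
h_total = 2560 + 160   # active width + horizontal blanking
v_total = 1440 + 41    # active height + vertical blanking
refresh = 60           # Hz

pixel_clock = h_total * v_total * refresh   # Hz
single_link_limit = 165e6                   # single-link DVI max pixel clock
dual_link_limit = 2 * single_link_limit     # dual link doubles the TMDS pairs

print(f"{pixel_clock / 1e6:.0f} MHz needed")  # well past 165 MHz single link
```

So 1440p60 lands roughly in the 240 MHz range: too fast for one link, comfortably inside two.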
|
# ? Aug 23, 2018 20:06 |
Unsinkabear posted:drat, I was hoping to pick up a 1440p monitor to match my new Thinkpad's resolution, but it only has HDMI out. Why are all 1440p screens DVI? Sorry, I phrased that badly, I meant that if they have DVI it's generally dual link. HDMI and DisplayPort can work just fine on them as long as they have the ports for it.
|
|
# ? Aug 23, 2018 20:18 |
|
Ah cool, thanks! This laptop has an MX150 and I am beyond broke, so going above 60hz is not a thing I get to worry about in the next few years. Most likely I'll be running games at 1080p and low/med detail to even get 60fps, and just using the 1440p for work and browsing.
|
# ? Aug 23, 2018 20:22 |
|
1gnoirents posted:As ededdneddy said, thats generally reserved for korean overclockable 1440p panels with a controller board that only accepts dual link dvi. However basically any other 1440p screen will be able to use regular DVI as they have a more advanced controller (at the expense of being locked in at 60hz unless you really start shelling out money). HDMI would easily adapt to regular DVI in those cases too should you actually find one that doesnt take HDMI for some reason. However just keep in mind you cannot easily adapt HDMI to dual link DVI without spending a lot of money on active adapters that might not work anyways Do we know of an active adapter that actually does work? I've seen a few for around 100 dollars, but who knows which ones actually work, and I'm not down to throw god knows how much money at a new 120hz+ 1440p IPS monitor. I've got the Korean one known as the "OVERLORD TEMPEST", which didn't last long on the market because the guy running it was shady as gently caress.
|
# ? Aug 23, 2018 20:39 |
|
repiv posted:DLSS does raise the question of how we're supposed to benchmark cards going forward. Is DLSS cheating if the results look correct unless put under a microscope? How do you quantify "correct enough"? Yeah I was wondering that too. Reviewers are going to hate "it's indistinguishable, trust us!", so at best I'd expect to see a lot of charts with separate entries for ultra and ultra+DLSS, and a bunch of reviewers are going to flat-out refuse to do it. Long term we are going to have to come to grips with whether "visually lossless" is OK and some way to quantify just how "visually lossless" something really is (nice euphemism, it's not lossless bitwise, that's for sure). (I remember reading somewhere that NVIDIA is really giving reviewers the hustle on this one too, like review done and hardware shipped back in 2 weeks. Don't have a source for that, so take that with a massive grain of salt, but they definitely didn't get cards before the launch event.) TBH I'm more OK with this than DSC. Compression in my display transport is just... no. At least here there is a very obvious performance justification for doing it. The monitor world is fuuuucked. DP1.3 is hardly in consumer hands and we already need something better. The only way past 4K120 right now is chroma subsampling or compression. I wish we could do a dual-cable option or something. Paul MaudDib fucked around with this message at 20:56 on Aug 23, 2018 |
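The napkin math on why 4K120 is the wall: ignoring blanking overhead, 8-bit RGB at that rate only just squeezes inside DP1.3/1.4's HBR3 payload, and 10-bit blows past it. (Link rate and 8b/10b overhead are the published DP figures; blanking is ignored to keep the sketch simple, which flatters the numbers slightly.)

```python
# Uncompressed bandwidth for 3840x2160 @ 120 Hz, blanking ignored.
pixels_per_sec = 3840 * 2160 * 120

bits_8bpc = pixels_per_sec * 24    # 8-bit RGB (24 bits per pixel)
bits_10bpc = pixels_per_sec * 30   # 10-bit RGB (30 bits per pixel, HDR)

# DP1.3/1.4 HBR3: 32.4 Gbps raw across four lanes, minus 8b/10b encoding.
hbr3_payload = 32.4e9 * 8 / 10

print(f"8-bit : {bits_8bpc / 1e9:.1f} Gbps")
print(f"10-bit: {bits_10bpc / 1e9:.1f} Gbps")
print(f"HBR3  : {hbr3_payload / 1e9:.1f} Gbps")
```

Add real blanking and 8-bit no longer fits cleanly either, which is exactly why the options on the table are chroma subsampling or DSC.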
# ? Aug 23, 2018 20:52 |
|
Aeka 2.0 posted:Do we know of an active adapter that actually does work? I've seen a few for around 100 dollars but who knows which ones actually work and I'm not down to throw god knows how much money for a new 120hz+ 1440p IPS monitor. Startech or Monoprice, but you are looking at that price. https://www.amazon.com/StarTech-com-DisplayPort-Active-Adapter-Converter/dp/B00A493CNY I should have said something along the lines of "it probably won't work unless it costs more than $100". Personally that's hard to swallow, and it eventually pushed me to get rid of my Qnix. I had to give up IPS for it though, which was no small loss
|
# ? Aug 23, 2018 20:53 |
|
punk rebel ecks posted:Do games use tessellation still? Yeah, it's very common nowadays. It's actually a success story of sorts; I remember everyone being very sceptical of it, but it's got its place now. Even the consoles use a lot of tessellation. With modern graphics and high resolutions, old-fashioned level-of-detail model transitions would look even more obvious, so tessellation fills the gap.
|
# ? Aug 23, 2018 21:04 |
|
People are skeptical because TSMC’s process shrink is coming up, and it’s a bit unbelievable that Nvidia has left this much power sitting on the table on this process, particularly through a year of crypto boom.
|
# ? Aug 23, 2018 21:28 |
|
Zedsdeadbaby posted:Yeah it's very common nowadays. It's actually a success story of sorts, I remember everyone being very sceptical of it, but it's got its place now. Even the consoles use a lot of tessellation. With modern graphics and high resolutions nowadays, old fashioned level of detail model transitions would look even more obvious so tessellation fills the gap. I see. It's just that I rarely see games now that let you turn it on/off in the settings.
|
# ? Aug 23, 2018 21:31 |
|
VCZ just leaked another slide (and placed their watermark poorly). Turing's cache is massive compared to Pascal.
|
# ? Aug 23, 2018 21:38 |
|
punk rebel ecks posted:I see. I just rarely see games now to turn it on/off in the settings. Yeah, it's common enough that they pack it in with some of the other general settings nowadays instead of giving it its own toggle; wanna say the ones labeled something like "Object Detail".
|
# ? Aug 23, 2018 21:42 |
|
Will using the DP port on my motherboard to drive a second monitor for out-of-game stuff (web browsing, youtube, etc) affect the performance of my GPU on my primary monitor? The mobo video is driven by the CPU, right?
|
# ? Aug 23, 2018 21:45 |
|
repiv posted:VCZ just leaked another slide (and placed their watermark poorly) Oh baby those are big cache differences Aexo posted:Will using the DP port on my motherboard to drive a second monitor for out-of-game stuff (web browsing, youtube, etc) affect the performance of my GPU on my primary monitor? The mobo video is driven by the CPU, right? Yes, though a few years ago this didn't work because it ate up too many PCIe lanes or something, so it may depend on your chipset. These days this is pretty common, I do it all the time on pretty lovely computers. If it doesn't plug and play, check the BIOS and make sure it's enabled; it may be called Multi Monitor Support or simply Integrated Graphics enable/disable. Sometimes this is disabled by default, especially on prebuilt machines. If it still doesn't work you may have to manually install Intel graphics drivers. I don't know why, but I've run into that a few times even though the drivers were already installed. This is a common streamer thing to do as well. 1gnoirents fucked around with this message at 22:05 on Aug 23, 2018 |
# ? Aug 23, 2018 21:45 |
|
But what do the numbers mean for those of us who love us some computer but barely know what an integrated circuit is? Computer go faster? Computer not crash? Mega Textures now Wimp Textures? WHAT
|
# ? Aug 23, 2018 22:10 |
|
Aexo posted:Will using the DP port on my motherboard to drive a second monitor for out-of-game stuff (web browsing, youtube, etc) affect the performance of my GPU on my primary monitor? The mobo video is driven by the CPU, right? 1gnoirents posted:Oh baby those are big cache differences is this perchance also a way I can get around the maximum resolution displayable by my GPU?
|
# ? Aug 23, 2018 22:11 |
|
Broose posted:But what do the numbers mean for those of us who love us some computer but barely know what an integrated circuit is? Computer go faster? Computer not crash? Mega Textures now Wimp Textures? WHAT The GPU processor can keep twice as much stuff on-chip before having to go out to its memory (VRAM), and it can read from it twice as fast. Computer go faster. Paul MaudDib fucked around with this message at 22:14 on Aug 23, 2018 |
# ? Aug 23, 2018 22:12 |
|
1gnoirents posted:Yes though a few years ago this didn't work because it ate up too many PCIe lanes or something so it may depend on your chipset. These days this is pretty common, I do it all the time on pretty lovely computers. If it doesnt plug and play check the BIOS and make sure its enabled, it may be called Multi Monitor Support or simply Integrated Graphics enable or disable. Sometimes this is disabled by default especially on prebuilt machines. If it still doesn't work you may have to manually install Intel graphics drivers. I dont know why but I've run into that a few times even though the drivers were already installed. This is a common streamer thing to do as well. Thanks, I'm finishing the build tonight and I'll check the BIOS settings if it doesn't come on. I was hoping it wouldn't affect performance but I'll have to see how bad it'll be. edit: Mostly I just didn't want secondary display to use cycles on the GPU. I don't really care if I lose a few frames because the CPU (i7-8700K @ 5Ghz) is somehow busy dealing with the secondary display. For what it's worth I'm on an EVGA Z370 Classified K motherboard. Home built. Aexo fucked around with this message at 22:38 on Aug 23, 2018 |
# ? Aug 23, 2018 22:31 |
|
repiv posted:VCZ just leaked another slide (and placed their watermark poorly) People have been speculating they doubled the register file size too. Wait a minute guys it sounds like NVIDIA didn't spend the entire transistor budget on magic beans after all. Maybe they... actually know what they're doing? If true, Volta may be a huge beneficiary here, because it sounds like most of these changes were in Volta too. They must have just tremendously halfassed the Titan V drivers. Paul MaudDib fucked around with this message at 22:35 on Aug 23, 2018 |
# ? Aug 23, 2018 22:32 |
|
Statutory Ape posted:is this per chance also a way i can get around the maximum resolution displayable by my gpu? Though I never tried, Windows 10 is very forgiving on having wacky resolutions between different displays. This is something you very likely can set in the standard display settings after you have the displays going. I'm curious what GPU you have that has a lower maximum resolution than your iGPU, to be honest. * I don't know if Windows 7 was ever updated to be as versatile as Windows 10 in this regard Aexo posted:Thanks, I'm finishing the build tonight and I'll check the BIOS settings if it doesn't come on. I was hoping it wouldn't affect performance but I'll have to see how bad it'll be. It shouldn't affect performance at all unless you try to play a game on the iGPU at the same time, but that's a different matter. One thing I'm unsure of is whether you have to use borderless windowed mode on the game side. I always use that, so I don't know if using fullscreen on one side would mess things up. The borderless window performance hit, in my experience, is virtually unnoticeable though and is otherwise a major convenience.
|
# ? Aug 23, 2018 22:40 |
|
1gnoirents posted:Though I never tried, Windows 10 is very forgiving on having wacky resolutions between different displays. This is something you very likely can set in the standard display settings after you have the displays going. I'm curious what GPU you have that has a lower maximum resolution than your iGPU to be honest * I dont know if Windows 7 was ever updated to be as versatile as Windows 10 in this regard Not what I meant. My GPU says it displays a total resolution of what amounts to 2x 4K screens; I have a 4K and a 1440p. It would be neat to grab a 1080p to throw up vertically
|
# ? Aug 23, 2018 22:59 |
|
I wonder what doubled cache size and bandwidth means for mining performance, heh heh. It won't matter: mining is dead, ASICs own Ethereum now, the price is plummeting, and Vitalik is proposing cutting block rewards to 1/3 to create deflation and try to get the price under control, which will further centralize control with the ASICs and kick out anyone else who might have been interested in the Eth economy. Every other crypto is pretty much owned by ASICs now too. Paul MaudDib fucked around with this message at 23:08 on Aug 23, 2018 |
# ? Aug 23, 2018 23:03 |
|
It wouldn't matter much anyway, right? I thought most cryptos accessed effectively random memory so the chance of a cache hit is negligible.
|
# ? Aug 23, 2018 23:07 |
|
repiv posted:It wouldn't matter much anyway, right? I thought most cryptos accessed effectively random memory so the chance of a cache hit is negligible. Monero pretty much lives and dies by cache. Vega has a shitload of cache, which is why Monero runs well on that card. Slightly different hardness strategy, instead of trying to make it consume a lot of VRAM, you make it consume a lot of cache instead.
|
# ? Aug 23, 2018 23:08 |
|
Statutory Ape posted:Not what i meant Man, that would be interesting to know. I've never hit that GPU limit, so I didn't even think that's what you meant. I don't see why not, but who knows, some obscure thing might stop it. As far as I've seen in the past few years, the iGPU does operate very independently from any discrete GPU.
|
# ? Aug 23, 2018 23:10 |
|
Oh god I ordered a 2080ti. There's no way it will be 200% of a 1080ti holy gently caress
|
# ? Aug 23, 2018 23:18 |
|
Surprise Giraffe posted:Oh god I ordered a 2080ti. There's no way it will be 200% of a 1080ti holy gently caress All the giga rays my friend.
|
# ? Aug 23, 2018 23:46 |
|
Surprise Giraffe posted:Oh god I ordered a 2080ti. There's no way it will be 200% of a 1080ti holy gently caress You're the only other one who's admitted this so far. I wonder who's being quiet
|
# ? Aug 23, 2018 23:50 |
|
We need to know who ordered two 2080 Ti Founders Editions for that sick SLI setup.
|
# ? Aug 23, 2018 23:56 |
|
I have an EVGA 2080 non-Ti on preorder with Newegg since they don't charge till it ships, so I'll cancel it if the reviews suck. I didn't have the chutzpah to pull the trigger on a Ti 'cause I am a weak gamer soyboy.
|
# ? Aug 23, 2018 23:58 |
|
Icept posted:We need to know who ordered two 2080 Ti Founders Editions for that sick SLI setup. I have to admit, I would like to see what the scaling turns out to be. Pushing 8K would be possible with NVLink if they can scale well and be game independent. Guess we will just have to wait and see. Also, the ultrawide may be my "big stupid purchase" of the month, so I am going to be good and wait to see how the Ti ends up performing before I pull any sort of trigger. Now to sell some stuff to make $ for future big stupid purchases...
|
# ? Aug 24, 2018 00:01 |
|
Icept posted:We need to know who ordered two 2080 Ti Founders Editions for that sick SLI setup. I pre-ordered 2 1080 Tis and was going to sell them and get a 2080 Ti and go ITX (SLI scaling is really good on Nvidia's side, btw; it's the developers never supporting it that makes it poo poo) welp e: I will probably sell these mid next year when Zen 2 comes out either way, so if anyone is interested in calling dibs I will give a good deal (they have water blocks on them)
|
# ? Aug 24, 2018 00:04 |