Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

FaustianQ posted:

Maybe it's personal bias, but experience in later X-ray engine games vs Gamebryo leads me to believe X-ray is superior/potentially superior in every way.

Agreed, the engine definitely reached maturity with Call of Pripyat and it was the most polished out of all of them.

I just think Shadow of Chernobyl was the most fun and "STALKER-y" out of all of them. Shame it was obviously rushed out the door, crashes and all. It was great with AMK mod and some accuracy/damage mods.

Groza Tunder 5.45 :circlefap:

Fauxtool
Oct 21, 2008

by Jeffrey of YOSPOS

Paul MaudDib posted:

No, you need to buy a new cable. That's the warning sign, the cable doesn't get healthier from here on out.

any reason not to use monoprice or amazon? Is there such a thing as a high quality cable or a cheap one being too cheap?

Fauxtool fucked around with this message at 02:09 on Jan 18, 2016

GrizzlyCow
May 30, 2011
Yes. Amazon cables are good, but check the reviews for the Monoprice cable.

The_Franz
Aug 8, 2003

Paul MaudDib posted:

Because it makes games run faster. There's a lot of challenges to actually getting good speedup scaling out of it, and for things like indie games it's not really necessary. But for AAA games that can seriously tax a single card, the logic is to use more cards.

Programming a game engine in general is about to get a lot harder with DX12 since it's a much lower-level API, so the idea is that instead of coding a one-off special-snowflake engine for your game you build on top of something like Unity, Unreal, etc, and those guys have already done the hard work to make things (including SLI) behave properly.

APIs like Vulkan/DX12/Metal really aren't that hard to use despite all of the "IT TAKES 600 LINES TO DRAW A TRIANGLE IN VULKAN :byodood:" paranoia floating around. They are, however, explicit and require you to declare state up front and won't fix garbage rendering code in the driver for you. If you have an existing codebase that doesn't map at all to the way these APIs work, then, yes, you are in for some pain. If you followed good practices such as those in the "approaching zero driver overhead" presentations or have the engine working on a console then you are basically ready to go since many of the concepts that the new APIs force you to use are general best practices on the older ones too.
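For what it's worth, the "declare state up front" idea boils down to something like the toy C++ sketch below. The types and functions are hypothetical, not the real Vulkan/DX12 API; the point is just that all the validation cost is paid once when an immutable pipeline object is built at load time, and draw time is "bind and go".

```cpp
// Toy illustration of "declare state up front" -- hypothetical types, not the
// real Vulkan/DX12 API.
#include <cstdio>
#include <string>

// Everything the GPU needs to know up front: shaders, depth test, blending...
struct PipelineDesc {
    std::string vertexShader;
    std::string pixelShader;
    bool        depthTest  = true;
    bool        alphaBlend = false;
};

// An immutable, pre-validated pipeline object built once at load time.
struct Pipeline { PipelineDesc desc; };

Pipeline createPipeline(const PipelineDesc& desc) {
    // In a real API this is where all the validation/compilation cost is paid,
    // instead of the driver guessing and re-validating state on every draw.
    std::printf("compiling pipeline: %s + %s\n",
                desc.vertexShader.c_str(), desc.pixelShader.c_str());
    return Pipeline{desc};
}

void draw(const Pipeline& p, int meshId) {
    // At draw time there is nothing left to validate: bind and go.
    std::printf("draw mesh %d (depthTest=%d, blend=%d)\n",
                meshId, p.desc.depthTest, p.desc.alphaBlend);
}

int main() {
    // Load time: build every state combination you'll ever need.
    Pipeline opaque      = createPipeline({"mesh.vert", "lit.frag", true,  false});
    Pipeline transparent = createPipeline({"mesh.vert", "lit.frag", false, true});

    // Frame time: no glEnable()/SetRenderState() soup, just pre-baked objects.
    for (int mesh = 0; mesh < 3; ++mesh) draw(opaque, mesh);
    draw(transparent, 99);
}
```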

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Fauxtool posted:

any reason not to use monoprice or amazon? Is there such a thing as a high quality cable or a cheap one being too cheap?

A good Monoprice or Amazon Basics (i.e. Amazon-branded product) cable will be just fine. I've used StarTech and they're fine too (but not the cheapest).

I'd personally go for the "premium" / thicker type cable because they're less likely to break the first time you yank a little too hard.

penus penus penus
Nov 9, 2014

by piss__donald
I used to poo poo on Monoprice (and Startech for that matter) and to an extent some of my gripes are still true, but they've really got it together the last few years from what I've seen. I've never seen an issue with DVI cables from them either for what it's worth. Honestly DVI has been the most reliable and predictable cable out of all of them by a real margin, any brand, from my point of view.

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!

The_Franz posted:

APIs like Vulkan/DX12/Metal really aren't that hard to use despite all of the "IT TAKES 600 LINES TO DRAW A TRIANGLE IN VULKAN :byodood:" paranoia floating around.
Those stupid assholes don't remember the old times of Win32 before MFC. Took a hundred lines to start a blank application window.
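He's not exaggerating by much, for reference. A do-nothing Win32 window in C++ still takes roughly this much boilerplate (a minimal sketch, error handling mostly omitted):

```cpp
// A bare-bones Win32 "blank window" in C++ (link against user32.lib).
#include <windows.h>

LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam) {
    if (msg == WM_DESTROY) {        // window closed: end the message loop
        PostQuitMessage(0);
        return 0;
    }
    return DefWindowProc(hwnd, msg, wParam, lParam);
}

int WINAPI WinMain(HINSTANCE hInstance, HINSTANCE, LPSTR, int nCmdShow) {
    // 1. Register a window class describing how our windows behave.
    WNDCLASS wc = {};
    wc.lpfnWndProc   = WndProc;
    wc.hInstance     = hInstance;
    wc.hCursor       = LoadCursor(nullptr, IDC_ARROW);
    wc.hbrBackground = reinterpret_cast<HBRUSH>(COLOR_WINDOW + 1);
    wc.lpszClassName = TEXT("BlankWindowClass");
    RegisterClass(&wc);

    // 2. Create and show an instance of that class.
    HWND hwnd = CreateWindow(TEXT("BlankWindowClass"), TEXT("Blank Window"),
                             WS_OVERLAPPEDWINDOW, CW_USEDEFAULT, CW_USEDEFAULT,
                             800, 600, nullptr, nullptr, hInstance, nullptr);
    if (!hwnd) return 1;
    ShowWindow(hwnd, nCmdShow);

    // 3. Pump messages until the window is destroyed.
    MSG msg;
    while (GetMessage(&msg, nullptr, 0, 0) > 0) {
        TranslateMessage(&msg);
        DispatchMessage(&msg);
    }
    return static_cast<int>(msg.wParam);
}
```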

PirateBob
Jun 14, 2003

SwissArmyDruid posted:

Eh, as a casual overview, you can't really go wrong with a techquickie video: https://www.youtube.com/watch?v=jsxn93Wb7vk

In short:

* DX12 is going to be a thinner API, which means the onus will be on developers to define exactly what they want done, which allows for much more fine-grained optimization.
* It will also be much better at parallelizing workloads, letting applications spread work more evenly across available cores, instead of mostly on one core and a little bit on all the others.
* Reduction in the number of draw calls. This ties into the previous point, as issuing too many draw calls (the CPU telling the GPU to draw a thing on the screen) can be a bottleneck.

(Editorial mode: The onus being on the developer to optimize their game makes me feel we will see a lot fewer "game ready" patches, which may undercut Nvidia's driver team by leaving them little, or at least less, to do; that's a big swing in AMD's favor.)

Thank you. So very tl;dr, DX12 is mostly about increased performance and not any groundbreaking new effects, you don't need a new next-gen card to take advantage of it, and AMD cards will benefit more than Nvidia.
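For illustration, the parallelization point above amounts to this pattern: each core records its own command list and a single cheap submission happens at the end. A toy C++ sketch with a placeholder command-list type, not the real DX12 API:

```cpp
// Toy sketch of parallel command recording -- the "CommandList" here is a
// placeholder (a vector of strings), not ID3D12GraphicsCommandList.
#include <cstddef>
#include <cstdio>
#include <string>
#include <thread>
#include <vector>

using CommandList = std::vector<std::string>;

// Each worker records draw commands for its slice of the scene independently.
CommandList recordSlice(int firstObject, int lastObject) {
    CommandList cl;
    for (int obj = firstObject; obj < lastObject; ++obj)
        cl.push_back("draw object " + std::to_string(obj));
    return cl;
}

int main() {
    const int numObjects = 1000;
    const int numThreads = 4;
    std::vector<CommandList> lists(numThreads);
    std::vector<std::thread> workers;

    // Older APIs funnel every draw call through one thread; here each core
    // builds its own command list with no shared driver lock in the way.
    for (int t = 0; t < numThreads; ++t) {
        const int begin = t * numObjects / numThreads;
        const int end   = (t + 1) * numObjects / numThreads;
        workers.emplace_back([&lists, t, begin, end] { lists[t] = recordSlice(begin, end); });
    }
    for (auto& w : workers) w.join();

    // One submission at the end, in a well-defined order.
    std::size_t total = 0;
    for (const auto& cl : lists) total += cl.size();
    std::printf("submitted %zu draw commands recorded on %d threads\n", total, numThreads);
}
```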

Seamonster
Apr 30, 2007

IMMER SIEGREICH

THE DOG HOUSE posted:

Honestly DVI has been the most reliable and predictable cable out of all of them by a real margin, any brand, from my point of view.

Because it's been around the longest? DVI is like 8-9 years older than DisplayPort and at least a couple years older than HDMI. I'm sure if given 8 years, DP will be quite mature as well.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

DP is actually making a real effort to stuff bits through the cable as opposed to DVI, and kind of HDMI.

Incidentally, any recommendations for a 10 foot or so DP cable that actually connects with all 4 lanes so I can daisy chain or run an ultrawide at 75 Hz?

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
New Maxwell mobility GPUs, maybe first sign of 960ti class product?

AMD's yields on the 14nm node have been such that they're confident in releasing Polaris Mobility by Q2, yet Nvidia is going to release GM204 variants at the same time. Any precedent for a similar old-gen-plus-new-gen release, or am I reading too much into this to think it feels like Fermi again/Pascal is 2017?

penus penus penus
Nov 9, 2014

by piss__donald

Seamonster posted:

Because it's been around the longest? DVI is like 8-9 years older than DisplayPort and at least a couple years older than HDMI. I'm sure if given 8 years, DP will be quite mature as well.

Yes, probably. My issues with other cables have been more physical. VGA has its color issues with pins (ohhh man this is where I get a lot of Monoprice hate), HDMI breaks under relatively little force at the wrong angle, and DisplayPort tends to stop working seemingly randomly (among other practical issues that seem code-related). DVI has been the most reliable overall for me. Most issues are due to cheap construction, and most of it is moot if you aren't dealing with thousands of these things. When you start unplugging and plugging DP a whole lot, from different computers going into different monitors, you start to resent the difficulty it adds, but more often than not it's the only choice these days and I don't doubt they will iron out the issues.

Ironically one of the fastest ways to fix a DP issue is to use a DP -> DVI adapter lol

penus penus penus fucked around with this message at 18:55 on Jan 18, 2016

EoRaptor
Sep 13, 2003

by Fluffdaddy

THE DOG HOUSE posted:

Yes, probably. My issues with other cables have been more physical. VGA has its color issues with pins (ohhh man this is where I get a lot of Monoprice hate), HDMI breaks under relatively little force at the wrong angle, and DisplayPort tends to stop working seemingly randomly (among other practical issues that seem code-related). DVI has been the most reliable overall for me. Most issues are due to cheap construction, and most of it is moot if you aren't dealing with thousands of these things. When you start unplugging and plugging DP a whole lot, from different computers going into different monitors, you start to resent the difficulty it adds, but more often than not it's the only choice these days and I don't doubt they will iron out the issues.

Ironically one of the fastest ways to fix a DP issue is to use a DP -> DVI adapter lol

DisplayPort requires both ends to handshake on the connection and agree on how data is going to be exchanged. This lets you do much more stuff with the connection, but if anything disrupts the connection, both sides need to detect this and re-handshake, which doesn't always happen. Experience with the specification should help to reduce this as time passes.

DVI (and HDMI 1.X) just has one side say 'here is the clock rate and some data, good luck' and either the receiving end can accept the data or it can't, and that's it. The only data returned* to the source is along the EDID pin, and it barely matters if that arrives or not.

HDMI 1.x is basically a DVI signal with audio and other extras (Ethernet, audio return) multiplexed in, depending on the cable/port and version. HDMI 2.0 switches to a packet-based system like DisplayPort, so expect early HDMI 2.0 implementations to have some of the same problems.

It's probably also worth pointing out that DP and HDMI plugs are designed to be easy to use and compact, so they tend to be much more fragile than DVI (and VGA) connectors, which were designed to be durable and cheap to manufacture.

* There are some extended DVI cabling specs for running USB over it, but those are very rare.
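To make the handshake point concrete, here's a toy C++ model of the difference (hypothetical states and functions, not the actual DP spec): DisplayPort has to retrain the link after any disruption, while DVI just keeps blasting pixels.

```cpp
// Toy model: DisplayPort must re-run link training after any disruption;
// DVI has no handshake to get stuck on. Hypothetical, not the real spec.
#include <cstdio>

enum class LinkState { Disconnected, Training, Active };

struct DisplayPortLink {
    LinkState state = LinkState::Disconnected;

    void plugIn()     { state = LinkState::Training; train(); }
    void disruption() { state = LinkState::Training; train(); }  // cable wiggle, sleep/wake...

    void train() {
        // Source and sink negotiate lane count, link rate, drive levels, etc.
        // If either side misses this exchange, the screen just stays black.
        const bool bothSidesResponded = negotiate();
        state = bothSidesResponded ? LinkState::Active : LinkState::Disconnected;
    }
    bool negotiate() { return true; }  // pretend the handshake always succeeds
};

struct DviLink {
    // No negotiation: the source picks a clock and sends; the sink either
    // locks onto the signal or it doesn't. Nothing to re-handshake.
    void plugIn()     { std::puts("DVI: sending pixels"); }
    void disruption() { std::puts("DVI: still sending pixels"); }
};

int main() {
    DisplayPortLink dp;
    dp.plugIn();
    dp.disruption();  // e.g. the monitor slept: retrain or stay black
    std::printf("DP link active: %s\n", dp.state == LinkState::Active ? "yes" : "no");

    DviLink dvi;
    dvi.plugIn();
    dvi.disruption();
}
```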

japtor
Oct 28, 2005

EoRaptor posted:

It's probably also worth commenting on the fact that DP and HDMI plugs are designed to be easy to use and compact, so tended to be much more fragile than DVI (and VGA) connectors, which were designed to be durable and cheap the manufacture.
Any clue how USB-C is on the durability front or is it just way too early to tell?

SwissArmyDruid
Feb 14, 2014

by sebmojo
Apple are jerks and have a stranglehold on the design that USB-C SHOULD have been.

EoRaptor
Sep 13, 2003

by Fluffdaddy

japtor posted:

Any clue how USB-C is on the durability front or is it just way too early to tell?

Much, much more durable than mini or micro USB, probably as good as Lightning in practice (shoddy implementations notwithstanding)

We can do a bunch of pre-testing of cable performance in software, and those simulations are now actually useful in the real world, which is benefiting USB-C a lot.

slidebite
Nov 6, 2005

Good egg
:colbert:

Is there a way to definitively tell whether an issue is with the monitor or the GPU?

With the DP cable hooked up from my 980 Ti to my new U3415W, the monitor will not wake up from sleep on a cold boot. Well, it actually will if I've just turned the computer off, but if it's been off for more than a few minutes it won't. No BIOS screen, nothing in Windows, nothing.

However, it works perfectly fine through HDMI 2.0. Wakes up just great, I can see the BIOS when I boot, and no issues at all. I would just like to track down what the problem is with the DP.

I don't think it's a Windows/driver issue with the GPU, because even the BIOS splash doesn't show, and that should come up before any video-related settings are loaded in the course of Windows startup.

The only other computer I have is an MSI gaming laptop with mDP out, so I was thinking of trying that. Is an mDP-to-DP cable directional? I'd have to use the one that came with the monitor, which was meant to plug into the mDP port on the monitor; I'd like to run it from the mDP port on the laptop to the standard full-size DP input on the monitor and see if the sleep problem shows up there too.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

mDP to DP isn't directional; I've got that cable connecting my Surface Pro to the full-size DP input on another screen.

Automata 10 Pack
Jun 21, 2007

Ten games published by Automata, on one cassette
Uh, does anyone else's EVGA Hybrid-cooled card have its blower fan (still on the card) randomly rev up in bursts?

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride

xthetenth posted:

mDP to DP isn't directional; I've got that cable connecting my Surface Pro to the full-size DP input on another screen.

Also don't forget it could be a lovely DP cable!

slidebite
Nov 6, 2005

Good egg
:colbert:

Dogen posted:

Also don't forget it could be a lovely DP cable!

That was actually my first hunch, but I have tried two with the same result: one "well-reviewed" cable from Amazon and the other the Dell pack-in.

I've also downgraded to DP1.1 on the monitor side with no change.

Like I said, I'm not stuck at this point; it works great with HDMI 2.0. I would just like to troubleshoot it.

Automata 10 Pack
Jun 21, 2007

Ten games published by Automata, on one cassette
Alright, it seems like my Titan X's controller broke and I have to RMA it.

This is the third time I have to RMA my Titan X. I can't stop screaming.

Fauxtool
Oct 21, 2008

by Jeffrey of YOSPOS
My DVI cable went bad again. Thanks for telling me it would only get worse; I ordered the replacement right then, and my new one should arrive in the morning!

I was apparently using a single-link cable on a dual-link connection. I wasn't missing any resolution, but it seems odd that's what it shipped with. Did single-link cables use to be much cheaper than dual?

penus penus penus
Nov 9, 2014

by piss__donald

Mutation posted:

Alright, it seems like my Titan X's controller broke and I have to RMA it.

This is the third time I have to RMA my Titan X. I can't stop screaming.

Sell it and buy a 980ti and pocket the rest for next gen

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy
I know you can't run SLI on a motherboard that doesn't specifically support it without some elaborate hacks, but is the same true for Crossfire?

In the interest of science, I want to see if I can get Crossfire going with my ITX motherboard and one of these: http://www.ebay.com/itm/390755166399

SlayVus
Jul 10, 2009
Grimey Drawer
CFX will run on just about any motherboard that has dual PCI-E slots; motherboards have to be certified for SLI.

But I doubt that wouldn't work for what you want.

repiv
Aug 13, 2009

"For Supermicro proprietary board H8DGG-QF and some Mini-ITX boards that support bifurcation"

Sounds like the PCI-E splitting itself might not work even if CFX theoretically could :(

repiv fucked around with this message at 03:36 on Jan 19, 2016

SwissArmyDruid
Feb 14, 2014

by sebmojo
Samsung has begun mass production of HBM2.

quote:

The company says it's fabricating these 4GB dies on its 20-nm process. Each of these packages comprises four 8Gb core dies atop a buffer die at the base of the stack. Consistent with JEDEC's specifications, each of these HBM2 dies will offer 256 GB/s of bandwidth. For comparison, Samsung says that figure is a little over seven times the bandwidth of its through-silicon via (TSV) 4Gb GDDR5 dies. HBM2 chips are claimed to deliver twice the bandwidth per watt of Samsung's GDDR5 solutions, and the company notes that its HBM2 chips come with ECC support built in, as well.

Was I wrong to expect that HBM2 would be made on a 14nm process as well?

http://techreport.com/news/29614/samsung-begins-mass-production-of-4gb-hbm2-memory-chips

SwissArmyDruid fucked around with this message at 04:19 on Jan 19, 2016
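As a back-of-envelope check on the 256 GB/s figure, assuming the JEDEC HBM2 numbers of a 1024-bit interface per stack at 2 Gbps per pin:

```cpp
// Sanity check on the quoted per-stack bandwidth, assuming JEDEC HBM2 figures.
#include <cstdio>

int main() {
    const double busBits    = 1024.0;                      // interface width per HBM2 stack
    const double gbpsPerPin = 2.0;                          // per-pin signalling rate
    const double perStack   = busBits * gbpsPerPin / 8.0;   // bits -> bytes: 256 GB/s
    std::printf("per stack: %.0f GB/s\n", perStack);
    std::printf("four stacks on one GPU: %.0f GB/s (~1 TB/s)\n", 4 * perStack);
}
```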

Automata 10 Pack
Jun 21, 2007

Ten games published by Automata, on one cassette

THE DOG HOUSE posted:

Sell it and buy a 980ti and pocket the rest for next gen

An RMA'd Titan X probably won't have a warranty, so that devalues it by a lot.

But yeah, I'm done sticking Titan Xes into my computer.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

Luckily the Titan X has a lot of room for devaluation before hitting 980Ti prices.

mikemelbrooks
Jun 11, 2012

One tough badass

Fauxtool posted:

My DVI cable went bad again. Thanks for telling me it would only get worse; I ordered the replacement right then, and my new one should arrive in the morning!

I was apparently using a single-link cable on a dual-link connection. I wasn't missing any resolution, but it seems odd that's what it shipped with. Did single-link cables use to be much cheaper than dual?

Dual-link adds a second set of TMDS data pairs, which is what lets monitors run at resolutions and refresh rates beyond what a single link's 165 MHz pixel clock can handle (roughly 1920x1200 at 60 Hz). I guess at one point single-link cables were cheaper, but I would think now it would pay to make and stock only one type of cable. If you're not going to go over 1920x1080, the single-link cable should be OK.
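As rough pixel-clock arithmetic (assuming roughly 20% blanking overhead rather than exact timings; single-link DVI tops out at a 165 MHz pixel clock):

```cpp
// Rough estimate of which modes fit single-link DVI's 165 MHz pixel clock.
// The 20% blanking overhead is an approximation, not an exact CVT calculation.
#include <cstdio>

double pixelClockMHz(int width, int height, double refreshHz) {
    const double blankingOverhead = 1.20;  // extra "invisible" pixels per frame
    return width * height * refreshHz * blankingOverhead / 1e6;
}

int main() {
    struct Mode { int w, h; double hz; };
    const Mode modes[] = { {1920, 1080, 60}, {2560, 1440, 60}, {1920, 1080, 120} };

    for (const Mode& m : modes) {
        const double clk = pixelClockMHz(m.w, m.h, m.hz);
        std::printf("%dx%d @ %.0f Hz: ~%.0f MHz -> %s\n", m.w, m.h, m.hz, clk,
                    clk <= 165.0 ? "single link is fine" : "needs dual link");
    }
}
```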

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

SwissArmyDruid posted:

Samsung has begun mass production of HBM2.


Was I wrong to expect that HBM2 would be made on a 14nm process as well?

http://techreport.com/news/29614/samsung-begins-mass-production-of-4gb-hbm2-memory-chips

I thought it's not all that important what process the RAM is on, because RAM/flash only consumes a fraction of what a CPU/GPU does; that's why Samsung actually went backwards to a larger process when it suited 3D NAND yields.

Automata 10 Pack
Jun 21, 2007

Ten games published by Automata, on one cassette
How do you sell something as expensive as a Titan X without putting a target on yourself for scammers?

Fauxtool
Oct 21, 2008

by Jeffrey of YOSPOS

Mutation posted:

How do you sell something as expensive as a Titan X without putting a target on yourself for scammers?

Sell it in person on Craigslist at a public place during the day.

I have sold several watches worth much more than the Titan on eBay and foiled plenty of scam attempts. Don't be dumb: have your serial numbers recorded, insist on the product being returned if anything is claimed to be wrong with it, and get shipping insurance for the full retail value of a new Titan X.

Fauxtool fucked around with this message at 07:39 on Jan 19, 2016

Rosoboronexport
Jun 14, 2006

Get in the bath, baby!
Ramrod XTreme

FaustianQ posted:

New Maxwell mobility GPUs, maybe first sign of 960ti class product?

AMD's yields on the 14nm node have been such that they're confident in releasing Polaris Mobility by Q2, yet Nvidia is going to release GM204 variants at the same time.

Might be different target markets; Nvidia is probably clearing out borked GM204 chips from its inventory. This time apparently without memory-bus wizardry: 192-bit and 3/6 GB. Neat package for ~100 W TDP.

quote:

Any precedent for a similar old-gen-plus-new-gen release, or am I reading too much into this to think it feels like Fermi again/Pascal is 2017?

R9 3xx series release? Fermi is closer though, as the only things Nvidia could squeeze out at 40 nm were small G210/214/216 chips to counter AMD's Evergreen line.

penus penus penus
Nov 9, 2014

by piss__donald

Mutation posted:

How do you sell something as expensive as a Titan X without putting a target on yourself for scammers?

Scammers are usually the sellers, not the buyers. If someone is buying a used Titan X, they are the ones worried about being scammed. Just don't accept anything shady like Western Union, and don't bother reading any emails about how their daughter has cancer and they'll send you your asking price + $200 ... via check in the mail.

You can just take the hit and sell it on eBay too. Always have a tracking number, and if you're concerned, take a video of yourself putting it into the box, tape it, sign the box across the seam on both sides, then make another video of yourself handing it over at the drop-off, clearly showing the seal wasn't broken. That's more than PayPal requires to settle a dispute.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

Mutation posted:

How do you sell something as expensive as a Titan X without putting a target on yourself for scammers?

Use all those other tips like they are saying, and you might also consider SA-Mart. I've sold several 980 Tis and entire PCs off there with no problems, plus you don't have to pay 10% eBay fees or get stabbed via Craigslist.

Durinia
Sep 26, 2014

The Mad Computer Scientist

SwissArmyDruid posted:

Samsung has begun mass production of HBM2.


Was I wrong to expect that HBM2 would be made on a 14nm process as well?

http://techreport.com/news/29614/samsung-begins-mass-production-of-4gb-hbm2-memory-chips

DRAM processes and logic (ASIC/CPU) processes are completely different. As such, adding lots of logic to a DRAM chip sucks, and putting DRAM capacitors directly on a CPU chip is hard. 20nm is actually the cutting edge for DRAM processes, and Samsung is the only one shipping much of any volume at that feature size at this point.

Durinia
Sep 26, 2014

The Mad Computer Scientist

Paul MaudDib posted:

Like I said with the four-dies-on-an-interposer thing, you could actually build a crossbar switch right onto the interposer so that all the dies can talk to any memory module (meaning your card looks like a single GPU die instead of 4 CrossFire/SLI units). I think you'd pretty much have to have some if not all of the memory controller on the interposer to do that (or else on a separate die).

You'd probably also want to put the work-unit dispatcher thing onboard there too. It's not like that's a particularly performance-critical part in terms of requiring high transistor density and fast switching speed. So overall your GPU dies just become literally the SMX units (the actual SIMD processor banks) and nothing else.

An active interposer is tough. They're actually fabbed in a significantly older process (which keeps them cheap enough to not blow the budget), meaning getting logic compatibility between them and the base GPU would be tricky. Further, even with a small amount of logic, heat dissipation is a big issue - you've got a 200+ Watt behemoth of a GPU glued on top of you. There is still a reasonable case for "non performance-sensitive bits" to get moved to an interposer, but things like I/O connections that have almost no bearing on performance (and don't take a lot of power) might be better candidates.

The good news is that you can still connect multiple GPUs together with something fairly efficient (like NVLINK) and it doesn't require an interposer to do - base package substrate or even short PCB runs work and are reasonably efficient. If you actually wanted to go further and connect together all of the GPU NoCs so that the SMXs could be in a single pool, the requirements are way higher - like melt your chip just from I/Os level higher.

Paul MaudDib posted:

I also like the idea of having the interposer be an interface between different memory types. The thing is that the interposer and the multi-die assembly may be a non-trivial part of the fabrication cost and it may well not be worth it on lesser parts (which seems to be what the response to WCCFtech suggests). On the other hand, depending on how far you go with moving the support circuitry off-die that might mean having 2 different chip architectures.

Yes, this is a cost argument. Interposers are still extra ASICs, and thus not free by any measure. They also follow chip yield curves, meaning that massive ones (like those that are larger than a GPU and 4 stacks of HBM) are probably already pushing the cost boundaries for the high-end market.

Paul MaudDib posted:

I really wouldn't be surprised; they seem to have made a good partnership with Samsung this time around. Samsung is probably the #2 chip maker right now behind Intel, and Intel doesn't play nicely with others. Samsung is also hugely vertically integrated - unlike Intel, they can put a chip together from input materials to finished package.

Yeah, Samsung is sitting in a nice place for these guys - all the bits you need (logic, DRAM, packaging, etc.) all in one place is a BIG benefit for tightly integrated stuff like this.

The_Franz
Aug 8, 2003

Tab8715 posted:

Why is playing borderless windowed mode such a performance killer?

Borderless fullscreen windows still have to go through the desktop compositor, which usually means an extra copy of the entire screen-sized window takes place. On a 1920x1080 screen at 60 Hz, that's roughly 500 MB of data per second that needs to be copied around.

Some non-Windows desktops address this by detecting and unredirecting fullscreen windows so they are used directly as the display buffer instead of being fed to the compositor. Basically auto-detecting and enabling exclusive fullscreen for one display.
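A quick sanity check on that figure: one extra copy of a 1920x1080, 32-bit framebuffer, 60 times a second:

```cpp
// Bandwidth of one extra full-screen copy per refresh at 1080p60.
#include <cstdio>

int main() {
    const double width = 1920, height = 1080;
    const double bytesPerPixel = 4;      // 8-bit RGBA
    const double refreshHz = 60;
    const double bytesPerSecond = width * height * bytesPerPixel * refreshHz;
    std::printf("~%.0f MB/s of extra copying\n", bytesPerSecond / (1024 * 1024));
}
```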
