EoRaptor
Sep 13, 2003

by Fluffdaddy

Well, the actual killer was Vulkan. It's also, honestly, a much better solution.

There is also the argument that without Mantle demonstrating the performance overhead of outdated standards/implementations, nothing would have been done.

SwissArmyDruid
Feb 14, 2014

by sebmojo

Mantle is dead, long live Mantle.

The TechReport posted:

Huddy told us AMD has done a "great deal of work" with the Khronos Group, the stewards of the OpenGL spec, on OpenGL Next. AMD has given the organization unfettered access to Mantle and told them, in so many words, "This is how we do it. If you want to take the same approach, go ahead." Khronos is free to take as many pages as it wants out of the Mantle playbook, and AMD will impose no restrictions, nor will it charge any licensing fees.

http://techreport.com/news/26922/amd-hopes-to-put-a-little-mantle-in-opengl-next

EDIT: Note that this does not mean OpenGL will be able to run Mantle code. It's just that some of the ways Mantle does things may or may not make their way into OpenGL. This could mean one line of code, or it could mean entire libraries. At the earliest, we won't know until... oh hey, today. I'd almost forgotten GDC was now.

Edit the second: 3/3 at 3:00, and Valve is going to be on stage with Khronos. The irony would be palpable if you-know-what happened.

EoRaptor posted:

Well, the actual killer was Vulkan. It's also, honestly, a much better solution.

There is also the argument that without Mantle demonstrating the performance overhead of outdated standards/implementations, nothing would have been done.

Now, let's see if Microsoft continues to give OpenGL the screwjob on Windows, even after rejoining the Khronos Group and playing nice on WebGL. (http://www.theregister.co.uk/2014/08/11/hell_freezes_over_microsoft_joins_khronos/)

SwissArmyDruid fucked around with this message at 21:25 on Mar 3, 2015

1gnoirents
Jun 28, 2014

hello :)
I should be a gaming prophet. Everything I say offhand, months beforehand, based on absolutely nothing relevant, comes true.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Nah, it'll still be a vector for AMD-specific improvements, just not for optimizing away draw-call overhead since that's solved in new D3D/GL.
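
(For anyone who wants a feel for what "draw-call overhead" means here, a toy cost model in plain Python, with invented numbers and names rather than anything from a real driver: older APIs pay a fixed validation/translation cost on every single draw call, while batching or instancing pays that fixed cost once for many objects.)

    # Toy cost model of draw-call overhead. Invented numbers; no real API involved.
    PER_CALL_OVERHEAD_US = 20.0   # hypothetical fixed driver cost per draw call
    PER_OBJECT_GPU_US = 2.0       # hypothetical GPU time per object drawn

    def naive_draw(num_objects):
        """One draw call per object: overhead scales with object count."""
        return num_objects * (PER_CALL_OVERHEAD_US + PER_OBJECT_GPU_US)

    def instanced_draw(num_objects):
        """One draw call for all objects: overhead is paid once."""
        return PER_CALL_OVERHEAD_US + num_objects * PER_OBJECT_GPU_US

    for n in (100, 1_000, 10_000):
        print(f"{n:>6} objects: naive {naive_draw(n):>9.0f} us, "
              f"instanced {instanced_draw(n):>9.0f} us")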

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

EoRaptor posted:

There is also the argument that without Mantle demonstrating the performance overhead of outdated standards/implementations, nothing would have been done.

I don't think there's much debate; Microsoft would have absolutely kept their thumbs up their asses if Mantle hadn't shown up. I think AMD can be proud of shaming everyone else into taking optimizations seriously, even if competitors are, uh, picking up the Mantle, so to speak.

EoRaptor
Sep 13, 2003

by Fluffdaddy

Zero VGS posted:

I don't think there's much debate; Microsoft would have absolutely kept their thumbs up their asses if Mantle hadn't shown up. I think AMD can be proud of shaming everyone else into taking optimizations seriously, even if competitors are, uh, picking up the Mantle, so to speak.

The Xbox 360 and Xbox One both featured a hugely modified version of DirectX that had many of these optimizations already done, so Microsoft clearly knew there was an issue and how to fix it. I think they also knew the cost of bringing those changes to windows would be pretty steep if they had to upgrade existing versions.

My basic thinking is the development looked like this:

1. MS develops improved DX for consoles
2. AMD, the supplier of console CPUs and GPUs for the then-current and the future Xbox, says they should port that to Windows.
3. MS says they aren't willing to discuss that at this time
4. AMD takes everything they learned during Xbox driver development and begins the same process on their Windows drivers.
5. AMD partners with some dev studios, and hammers out how to access these driver improvements to benefit games, brands it Mantle at some point.
6. Mantle is publicly announced along with game support
7. Turns out, MS and nVidia have been working on DirectX improvements for windows for a while now, covered by an NDA that excluded AMD.
8. MS announces the 'new' DirectX version as part of a new Windows version
9. nVidia instantly (in hardware timeline terms) announces DirectX 12 support for their brand new video cards, even though the spec isn't final and there is not even a beta sdk to test with.

The timing involved is just too weird. There is no way MS could turn around an announcement with a feature set so quickly if they didn't already have it in the pipeline, and nVidia was really, really quick off the mark to guarantee DX12 compatibility for the 9x0 series cards.

nVidia's and MS's actions here really smell like AMD was being deliberately shut out to try to cost it an entire hardware cycle without DX12 support. AMD has plenty of problems, but this feels like just nasty behavior against them.

1gnoirents
Jun 28, 2014

hello :)
But didn't AMD also say it would support DX12 pretty much right off the bat?

Harik
Sep 9, 2001

From the hard streets of Moscow
First dog to touch the stars


Plaster Town Cop

1gnoirents posted:

That for real sucks, but I would personally hunt for something else (monitor). Or perhaps a $10 GPU someone doesn't want anymore
I spent an afternoon reading Intel datasheets on the Core i5 and the H97 PCH, so I've got a pretty good understanding of how they're hooked together now!

For posterity, the PCH (*97 chipset) only handles the analog signal and provides the clock lines for the digital interfaces - the data lines come directly off the CPU. As for analog, in Lynxpoint it's coming from the CPU via two 2.7-gigabit 'Flexible Display Interface' lines (180 megapixels/s), whereas I guess in Panther/Cougar it had a much faster connection - looks like 350 MHz for Panther and 400 for Cougar.
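
(If anyone wants to sanity-check that 180 megapixels/s number: it appears to fall straight out of the two FDI lanes if you assume 30 bits per pixel, which is my assumption here rather than something from the datasheet. Quick Python back-of-the-envelope:)

    # Back-of-the-envelope check of the FDI bandwidth figure quoted above.
    # The 30 bits/pixel (10 bits per RGB channel) framing is an assumption.
    lanes = 2
    lane_rate_bps = 2.7e9          # 2.7 gigabits per second per FDI lane
    bits_per_pixel = 30            # assumed 10-bit-per-channel RGB

    pixels_per_second = lanes * lane_rate_bps / bits_per_pixel
    print(f"{pixels_per_second / 1e6:.0f} megapixels/s")   # -> 180 megapixels/s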

In the end, I bought a BenQ 1440p. I just hope my motherboard documentation is wrong - it says a max of 1080p over all connections, but Intel's datasheets are extremely clear that HDMI, DVI and DP all support 4K@60Hz on the 4590 - including simultaneous 4K on 3 monitors via HDMIx2+DP, DVIx2+DP, HDMI+DVI+DP or DPx3.

I'm resistant to buying a video card because both nVidia and AMD really need their binary drivers to work decently, and since part of my job is kernel development, their delays in supporting new kernel versions really make life a pain for me.

Wiggly Wayne DDS
Sep 11, 2010



EoRaptor posted:

The Xbox 360 and Xbox One both featured a hugely modified version of DirectX that had many of these optimizations already done, so Microsoft clearly knew there was an issue and how to fix it. I think they also knew the cost of bringing those changes to windows would be pretty steep if they had to upgrade existing versions.

My basic thinking is the development looked like this:

1. MS develops improved DX for consoles
2. AMD, the supplier of console CPUs and GPUs for the then-current and the future Xbox, says they should port that to Windows.
3. MS says they aren't willing to discuss that at this time
4. AMD takes everything they learned during Xbox driver development and begins the same process on their Windows drivers.
5. AMD partners with some dev studios, and hammers out how to access these driver improvements to benefit games, brands it Mantle at some point.
6. Mantle is publicly announced along with game support
7. Turns out, MS and nVidia have been working on DirectX improvements for windows for a while now, covered by an NDA that excluded AMD.
8. MS announces the 'new' DirectX version as part of a new Windows version
9. nVidia instantly (in hardware timeline terms) announces DirectX 12 support for their brand new video cards, even though the spec isn't final and there is not even a beta sdk to test with.

The timing involved is just too weird. There is no way MS could turn around an announcement with a feature set so quickly if they didn't already have it in the pipeline, and nVidia was really, really quick off the mark to guarantee DX12 compatibility for the 9x0 series cards.

nVidia's and MS's actions here really smell like AMD was being deliberately shut out to try to cost it an entire hardware cycle without DX12 support. AMD has plenty of problems, but this feels like just nasty behavior against them.
Do you have some sort of blog where I can read more of these... creative interpretations of technology?

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Wiggly Wayne DDS posted:

Do you have some sort of blog where I can read more of these... creative interpretations of technology?

Yeah, that's some good stuff. (NVIDIA and AMD both announced support for DX12 the same day that MSFT revealed it, right?)

b0lt
Apr 29, 2005

Subjunctive posted:

Yeah, that's some good stuff. (NVIDIA and AMD both announced support for DX12 the same day that MSFT revealed it, right?)

The truth comes out; it's actually a massive conspiracy against Intel!

sauer kraut
Oct 2, 2004
Did we find out yet which cards have full feature hardware DX12 support?

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

sauer kraut posted:

Did we find out yet which cards have full feature hardware DX12 support?

IIRC NVIDIA said that all their DX11 cards would also support DX12 (which says interesting things about their architecture in terms of the pre-emption stuff), and AMD as well.

E: AMD says all GCN parts, and NVIDIA says Fermi, Kepler, Maxwell

Subjunctive fucked around with this message at 01:55 on Mar 4, 2015

SwissArmyDruid
Feb 14, 2014

by sebmojo

Khronos Group Press Release posted:

About Vulkan:

Vulkan is a unified specification that minimizes driver overhead and enables multi-threaded GPU command preparation for optimal graphics and compute performance on diverse mobile, desktop, console and embedded platforms.

Emphasis mine.

I believe this to mean that GLNext will come to the PS4.

Since we know that DX12 will run on existing Xbox One hardware, from a purely scientific standpoint I am now curious to see how much additional graphical fidelity a console will be able to output relative to today.
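
(To make "multi-threaded GPU command preparation" a bit more concrete, here's a rough sketch in plain Python, with made-up CommandBuffer/record names rather than anything from the actual Vulkan API: each worker thread records its own command list independently, and only the final submission step is serialized.)

    # Rough sketch of multi-threaded command recording. Made-up names, not a real API.
    import threading

    class CommandBuffer:
        """Toy stand-in for a per-thread command buffer."""
        def __init__(self):
            self.commands = []

        def draw(self, mesh_id):
            self.commands.append(("draw", mesh_id))

    def record(meshes, out, index):
        # Each worker records into its own buffer: no shared mutable state here.
        cb = CommandBuffer()
        for m in meshes:
            cb.draw(m)
        out[index] = cb

    work = [range(0, 100), range(100, 200), range(200, 300)]
    buffers = [None] * len(work)
    threads = [threading.Thread(target=record, args=(m, buffers, i))
               for i, m in enumerate(work)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

    # Only this final submission step is serialized.
    submitted = [cmd for cb in buffers for cmd in cb.commands]
    print(len(submitted), "commands submitted")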

Fallows
Jan 20, 2005

If he waits long enough he can use his accrued interest from his savings to bring his negative checking balance back into the black.


Wonder how much boost this has

veedubfreak
Apr 2, 2005

by Smythe

Fallows posted:



Wonder how much boost this has

Somehow this makes me angrier than a standard donk. I think it's because nvidia fanboys are some of the biggest idiots around.

Weirdoman
Jun 12, 2001

I was walking down the street when I saw a bovus. And then it hit me...I was hit by a bovus.
Wonder how long it'll take that NVidia driver to crash...

1gnoirents
Jun 28, 2014

hello :)
1ms of wait what nvidia drivers are very stable
2ms later :v:

- my brain

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

Weirdoman posted:

Wonder how long it'll take that NVidia driver to crash...

:vince:

Zig-Zag
Aug 29, 2007

Why don't we just start shooting tar heroin instead?
Looking to get a new card and I can't decide between the 970 and the 960 with 4GB of VRAM that just got announced by EVGA. Would it be worth the extra money? I'm still gaming at 1080p and don't plan on going 4K anytime soon, but I would like to get another year or so out of my new card before upgrading again.

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

Zig-Zag posted:

Looking to get a new card and I can't decide between the 970 and the 960 with 4GB of VRAM that just got announced by EVGA. Would it be worth the extra money? I'm still gaming at 1080p and don't plan on going 4K anytime soon, but I would like to get another year or so out of my new card before upgrading again.

Don't waste your money on a GTX 960 with 4GB of VRAM, the price will be close enough to the 970 that just getting the 970 will be a better value.

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

Zig-Zag posted:

Looking to get a new card and I can't decide between the 970 and the 960 with 4GB of VRAM that just got announced by EVGA. Would it be worth the extra money? I'm still gaming at 1080p and don't plan on going 4K anytime soon, but I would like to get another year or so out of my new card before upgrading again.

The 4GB on a 960 is pretty pointless since the memory issues of the 970 only crop up at higher resolutions. I'd say just get a 970; at 1080p the memory issue won't really matter much, and it will be a good bit faster than the 960. I'll also say that the 970 will last you 2-3 years or so, at least if you stick with 1080p and are willing to game at high settings down the road instead of ultra.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Zig-Zag posted:

Looking to get a new card and I can't decide between the 970 and the 960 with 4GB of VRAM that just got announced by EVGA. Would it be worth the extra money? I'm still gaming at 1080p and don't plan on going 4K anytime soon, but I would like to get another year or so out of my new card before upgrading again.

The 960 and 970 are not even close. If you have the money for a 970 then you shouldn't even be looking in the direction of a 960, you should have bought a 970 already.
The Radeon 280X and 290 are the cards you should be looking at if you don't have the money for a 970, but want a card faster than a 960.

HalloKitty fucked around with this message at 18:28 on Mar 4, 2015

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map
oh boy titan x http://www.pcper.com/news/Graphics-Cards/GDC-15-NVIDIA-Shows-TITAN-X-Epic-Games-Keynote

tiny text on the black box: "INSPIRED BY GAMERS. BUILT BY NVIDIA."

Gamers, indeed.

Fajita Fiesta
Dec 15, 2013
I thought the original Titan series was supposed to be some sort of big compute platform for businesses.

EoRaptor
Sep 13, 2003

by Fluffdaddy

Wiggly Wayne DDS posted:

Do you have some sort of blog where I can read more of these... creative interpretations of technology?

I prefer to keep my madness purely in random forum posts.

I also missed the AMD announcement of supported cards. It hasn't appeared anywhere in their branding, whereas nVidia puts it everywhere, which is a bit odd.

I still wonder why AMD went faffing about with Mantle if they knew MS was making DX12. It just seems like, at some point, MS simply didn't tell them.

veedubfreak
Apr 2, 2005

by Smythe

Fajita Fiesta posted:

I thought the original Titan series was supposed to be some sort of big compute platform for businesses.

Titan still did not have all the compute enabled on it. It was basically just the full chip.

Bet it's over 1000 bucks.

SwissArmyDruid
Feb 14, 2014

by sebmojo

Fajita Fiesta posted:

I thought the original Titan series was supposed to be some sort of big compute platform for businesses.

That is partly correct. Titan did not have all of its compute enabled. However, there are economies of scale, and I can only assume that at the point where you're loading up multiple Titans per ATX box, you'd probably be better off loading Teslas into a server instead.

Linus doesn't count, he got those things sponsored as part of the whole-room-watercooling thing.

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map
nVidia sold more Kepler Titans than expected. Those individual buyers also happened to be very vocal about justifying it as a new tier of halo product for the consumer market, and they definitely ended up doing a lot of the marketing for nVidia as well. The term "gaming supercomputer" popped up somewhere, and the world was never the same (read: worse) again.

Zephro
Nov 23, 2000

I suppose I could part with one and still be feared...
On the Freesync/G-Sync thing - would I be right in assuming that Nvidia are going to have to abandon G-Sync at some point? It requires an extra bit of custom hardware per monitor and it pushes the price up. Freesync requires nothing extra to work besides maybe a firmware update on old monitors. I can't see how they can keep it going once Freesync parts start arriving in volume. Right?

And if so, does that mean I can buy a 970 without worrying about getting locked into a dead ecosystem, which is the main thing preventing me from doing so at the moment? I'd like an adaptive-sync monitor at some point in the near future, after all.

dont be mean to me
May 2, 2007

I'm interplanetary, bitch
Let's go to Mars


nVidia explicitly declined to bring the DisplayPort support on their Maxwell I cards up to the spec level that includes Adaptive Sync so they could flog G-Sync some more. An actual GPU wonk could tell you whether it requires extra hardware or whether nVidia could just update the firmware for it, but nVidia would probably refuse to either way.

1gnoirents
Jun 28, 2014

hello :)
They will if it gets popular enough.

Krailor
Nov 2, 2001
I'm only pretending to care
Taco Defender
Eventually NVidia will have to support FreeSync since it's part of the DisplayPort spec. Otherwise they'll be in a position where monitors are being sold with DisplayPort 2.0 ports but, sorry, they can't connect to your GTX 1170 because NVidia's cards still just have DisplayPort 1.2.

Realistically that point is still at least a couple of years away, so in the meantime they might as well try to push as many G-Sync monitors as they can.

Desuwa
Jun 2, 2011

I'm telling my mommy. That pubbie doesn't do video games right!
They're going to drag their heels as long as they can. Sucks for me since I was looking at getting a 4K adaptive sync monitor next year and I'll probably be stuck paying a premium for G-SYNC. G-SYNC feels a bit like NVIDIA's Mantle.

fletcher
Jun 27, 2003

ken park is my favorite movie

Cybernetic Crumb

Krailor posted:

Eventually NVidia will have to support FreeSync since it's part of the DisplayPort spec. Otherwise they'll be in a position where monitors are being sold with DisplayPort 2.0 ports but, sorry, they can't connect to your GTX 1170 because NVidia's cards still just have DisplayPort 1.2.

Realistically that point is still at least a couple of years away, so in the meantime they might as well try to push as many G-Sync monitors as they can.

Buying a kickass 1080p gsync monitor now and a kickass 4k freesync monitor in a few years (when the rest of the components for 4k gaming become more reasonable) doesn't seem like such a bad thing. The XL2420G I have just looks sooooo good @ 144Hz with no tearing.

Gucci Loafers
May 20, 2006

Ask yourself, do you really want to talk to pair of really nice gaudy shoes?


Not sure if this is the right thread (or if someone could recommend another), but why is the focus of nearly all game engines lighting?

Whatever happened to environmental destruction, like in Red Faction?

dont be mean to me
May 2, 2007

I'm interplanetary, bitch
Let's go to Mars


Tab8715 posted:

Not sure if this is the right thread (or if someone could recommend another), but why is the focus of nearly all game engines lighting?

Whatever happened to environmental destruction, like in Red Faction?

Allowing the player to alter the level is Hard and also a poke in the level design team's collective eye.

Lighting is flashy as hell.

Kazinsal
Dec 13, 2011

Tab8715 posted:

Not sure if this is the right thread (or if someone could recommend another), but why is the focus of nearly all game engines lighting?

Whatever happened to environmental destruction, like in Red Faction?

Better and better real-time lighting is one of the core technologies driving engines closer and closer to near-photorealistic real-time rendering with every graphics generation.

Environmental destruction, on the other hand, is a gameplay feature.
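
(As a tiny illustration of why lighting is treated as core engine tech: even the most basic real-time shading term is just a per-pixel dot product between the surface normal and the light direction, and everything fancier, shadows, global illumination, physically based shading, is built on evaluating terms like this millions of times per frame. A minimal Lambert-diffuse sketch in Python, a toy example rather than anything from a real engine:)

    # Minimal Lambertian diffuse term, the simplest building block of real-time lighting.
    import math

    def normalize(v):
        length = math.sqrt(sum(c * c for c in v))
        return tuple(c / length for c in v)

    def lambert(normal, light_dir, light_intensity=1.0):
        n = normalize(normal)
        l = normalize(light_dir)
        n_dot_l = max(0.0, sum(a * b for a, b in zip(n, l)))
        return light_intensity * n_dot_l

    # Surface facing straight up, light coming in from 45 degrees overhead: ~0.707.
    print(lambert((0, 1, 0), (0, 1, 1)))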

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Sir Unimaginative posted:

nVidia explicitly declined to bring the DisplayPort support on their Maxwell I cards up to the spec level that includes Adaptive Sync so they could flog G-Sync some more.

Whoa. Where were they explicit about that?

1gnoirents
Jun 28, 2014

hello :)
Destruction is more on the game developer side of things, isn't it? I mean, I don't really know, but that's my guess. Lighting is indeed important for graphics, though. Realistic lighting will be a huge milestone whenever it arrives.
