Cygni
Nov 12, 2005

raring to post

why are you gaming on a linux

wargames
Mar 16, 2008

official yospos cat censor
counter strike on linux gets more fps than on windows, the difference between 230 fps and 310 fps makes all the difference.

Alzion
Dec 31, 2006
Technically a '06
Are there even sanely priced monitors on the market that will refresh at 300+ Hz?

Kazinsal
Dec 13, 2011
Source is a terrible engine that links input to the graphics thread.

Much above 144 FPS doesn't do a whole lot though because at best the server tick rate is 128 Hz.
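
For anyone who wants the napkin math, here's a quick Python sketch comparing frame times to the 128 Hz tick interval (purely illustrative, using the FPS figures thrown around in this thread):

code:
# Frame times at various FPS vs. a 128 Hz server tick.
# Purely illustrative; the FPS figures are the ones quoted in this thread.

TICK_RATE_HZ = 128
tick_interval_ms = 1000 / TICK_RATE_HZ  # ~7.8 ms between server updates

for fps in (144, 230, 310):
    frame_time_ms = 1000 / fps
    frames_per_tick = fps / TICK_RATE_HZ
    print(f"{fps} fps -> {frame_time_ms:.2f} ms/frame, "
          f"{frames_per_tick:.1f} frames rendered per server tick")

Past ~144 fps you're mostly rendering multiple frames per server update, so the on-screen gain shrinks fast.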

Also why are you gaming on a Linux

Fauxtool
Oct 21, 2008

by Jeffrey of YOSPOS
e-peen nerd cred

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

Fauxtool posted:

linux is bad

Anarchist Mae
Nov 5, 2009

by Reene
Lipstick Apathy

Cygni posted:

why are you gaming on a linux

Cannot be arsed maintaining a Windows installation when all the work I do is best done in Linux and all the games I play are released for Linux?

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Measly Twerp posted:

Cannot be arsed maintaining a Windows installation when all the work I do is best done in Linux and all the games I play are released for Linux?

What's up with answering questions with questions these days

Anarchist Mae
Nov 5, 2009

by Reene
Lipstick Apathy

HalloKitty posted:

What's up with answering questions with questions these days

I don't know, maybe it has something to do with everything on the internet being snarky and sarcastic? Or maybe I'm just writing this in my Australian accent?

New AdoredTV video:

https://www.youtube.com/watch?v=lgweXM5LZGQ

1gnoirents
Jun 28, 2014

hello :)
Well that's just the reality of Linux. Driver support for games is way down the list of priorities for either company. I don't see that changing. If you aren't willing to "put up" with Windows for gaming you'll just have to suffer, I suppose. There is zero guarantee drivers from one company will be better than the other's in six months' time.

Honestly if I was in that boat I'd do whatever I could to just have two computers. Even if there was no space I'd stack two mITX cases and use a KVM or something.

1gnoirents fucked around with this message at 22:13 on Jul 8, 2017

Anarchist Mae
Nov 5, 2009

by Reene
Lipstick Apathy

1gnoirents posted:

Well that's just the reality of Linux. Driver support for games is way down the list of priorities for either company. I don't see that changing. If you aren't willing to "put up" with Windows for gaming you'll just have to suffer, I suppose. There is zero guarantee drivers from one company will be better than the other's in six months' time.

I wasn't complaining about driver support in gaming, that's actually fine. It's everywhere else that sucks.

1gnoirents
Jun 28, 2014

hello :)

Measly Twerp posted:

I wasn't complaining about driver support in gaming, that's actually fine. It's everywhere else that sucks.

Ah well, you'd know better. I only ever used Linux for one specific CUDA-based image mashing thing.

The Warthog
Mar 25, 2013

Did I just do your job for you?

MaxxBot posted:

That's the GV100, which is the datacenter Volta chip just like GP100 was the datacenter Pascal chip. With the Pascal launch the GP100 datacenter chip was released first, followed by the GP104, GP106, and GP102 gaming chips. The GV100 is already out, meaning that consumer Volta could be ready fairly soon; some rumors are saying Q4 2017 and I would be surprised if they didn't release something by Q1 2018. You'd probably be interested in the successor to the 1070, which will hopefully keep the same under-$400 price point.

Thanks, I may just wait a little longer then after all.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

AMD is absolutely going to push for "faster than 1080", that's an important threshold for them to cross with respect to pricing. If it draws lots of power but is faster than a 1080, AMD will be able to price it head-to-head with a straight face (it's still an overall worse card but it'll be at least justifiable to fanboys). If it's using lots of power and still slower than a 1080 then reviewers will (rightly) give them tons of poo poo for it, and it'll need to come in decently cheaper than the 1080 to be viable.

Since the hardware is all baked at this point, it's actually probably better for consumers if the drivers stay terrible for another couple weeks; you can't raise the price after the fact for driver improvements. The truth is that "just matching" hardware that's a year old (and that AMD blasted as a "price gouge" at the time) is not very good, and Vega really should be coming in significantly under the 1080 anyway, but AMD hasn't even been able to offer any real price-to-performance improvements since Hawaii (almost three years), let alone any serious performance wins. And it doesn't change the fact that it will need to go up against Volta very soon, which will be offering price-to-performance gains.

In the short term, bad drivers would force AMD to actually compete aggressively on price and result in a more long-term-viable card, instead of just following NVIDIA's pricing and relying on sales to fanboys like they have with Fiji/Polaris. The reality is that AMD has still been offering inferior hardware (more power-hungry, with inconsistent performance and a DX11 performance deficit) for a long time now, the gap is continuing to widen, and the price has rarely been compelling enough to justify that. It was great when you could get 8 GB RX 480s for $175 and 4 GB for <$150, but AMD made sure to kill those prices by doing a rebrand line so they could be 1% faster than NVIDIA (at twice the power) and crank the price back up to NVIDIA levels.

AMD desperately wants to be pricing head-to-head with NVIDIA, regardless of whether the hardware can actually command that kind of premium pricing. Vega is no exception. And that's a huge reason their marketshare has waned since Hawaii. "Ripping off fanboys" is not a viable long-term market strategy.

Paul MaudDib fucked around with this message at 01:57 on Jul 9, 2017

ItBurns
Jul 24, 2007
I'm going to guess that the difference in opinion on "NVidia is fine on Linux" between people doing HPC and random gamers is only partially attributable to driver quality.

ItBurns fucked around with this message at 02:02 on Jul 9, 2017

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
To put some numbers to that, I think even if they're beating the 1080 by a few percent they really ought to be coming in around $400. The 1080 is a year old at this point, in the AMD fanbase's mythology the 1080 was supposed to be a huge price gouge, and the 1080 is still a more consistent and lower-power card.

Frankly, $400 for 1080-level performance is even pushing it a bit; I'd really like to see Vega priced closer to $350 given the timeframes involved and the upcoming Volta. The 1160 will probably be similar in performance, offer some further reductions in power consumption, and probably come in around $300.

Paul MaudDib fucked around with this message at 02:54 on Jul 9, 2017

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!
Buildzoid did a stream today and talked a fair bit about Vega; he has come to the conclusion that the card is very badly power throttled at 300W, and conservatively it's gonna need like 400-500W on water. Basically you will want an 800W+ PSU, bare minimum.

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️
Non-buttcoin history has shown AMD stands no chance in hell of moving GPUs without undercutting NV by a significant margin at the same performance, let alone while trailing far behind in every metric.

Even as early as 2001 I remember every gamer I knew was buying only NV cards and wouldn't touch ATI even if it was faster at the same price, thanks to ATI's terrible drivers and quirks, whereas NV cards just worked on mobos using chipsets as crappy as the Socket 7 AGP ones.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

AVeryLargeRadish posted:

Buildzoid did a stream today and talked a fair bit about Vega; he has come to the conclusion that the card is very badly power throttled at 300W, and conservatively it's gonna need like 400-500W on water. Basically you will want an 800W+ PSU, bare minimum.

So it has to be shader throttling then. If it needs 400-500W under water then it better sure as gently caress beat a 1080ti, holy poo poo.

Also HOOOOOWWWW. HOOOOOOOOOOOWWW did this happen. AMD was saying 225W; this is double that. This is more power inefficient by a large measure than loving Polaris; you would be better served getting RX 580s in XCF, which even overclocked to the max will beat it in performance and consume less power. What in god's loving name, AMD.

gently caress this poo poo, I have Freesync and most of what I play will either run fine on one RX 580 or actually scale great with XCF, so I'll get two after the mining collapse for ~$300 instead of the expensive furnace that Vega 10 is.

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

FaustianQ posted:

So it has to be shader throttling then. If it needs 400-500W under water then it better sure as gently caress beat a 1080ti, holy poo poo.

Also HOOOOOWWWW. HOOOOOOOOOOOWWW did this happen. AMD was saying 225W; this is double that. This is more power inefficient by a large measure than loving Polaris; you would be better served getting RX 580s in XCF, which even overclocked to the max will beat it in performance and consume less power. What in god's loving name, AMD.

gently caress this poo poo, I have Freesync and most of what I play will either run fine on one RX 580 or actually scale great with XCF, so I'll get two after the mining collapse for ~$300 instead of the expensive furnace that Vega 10 is.

Yeah, he said it's significantly less efficient than Polaris; he was somewhat baffled at how that was possible, but it appears to be true. Good job AMD. :thumbsup:

GRINDCORE MEGGIDO
Feb 28, 1985


So they've done the opposite of nVidia's known good strategy of cutting compute and power use?

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

GRINDCORE MEGGIDO posted:

So they've done the opposite of nVidia's known good strategy of cutting compute and power use?

There was this study in California or something where they started printing a graph of people's power consumption relative to their neighbors on the power bill. Overall it worked well for getting people to reduce their power consumption but there were a handful of users who went "NO gently caress YOU, YOU'RE NOT EVEN MY REAL DAD" and would increase their power consumption.

That's basically what AMD has done with compute. NVIDIA has been building faster and cooler systems optimized for gaming and AMD is just totally convinced that compute-based graphics is the future and is doubling down on super-advanced async schedulers and all this other bullshit. Meanwhile since the NVIDIA cards are so good at actually running programs, the actual compute market has taken those optimized gaming cards and launched a machine-learning revolution that was unimaginable only 5 years ago.

If your hardware is good at a thing then the compute applications will follow. That's how the GPGPU revolution started in the first place. And scheduling and all that crap is just parasitic overhead on the actual program.

Paul MaudDib fucked around with this message at 03:42 on Jul 9, 2017

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

AVeryLargeRadish posted:

Yeah, he said it's significantly less efficient than Polaris; he was somewhat baffled at how that was possible, but it appears to be true. Good job AMD. :thumbsup:

It is 16% slower than Fiji per shader, per clock. It is more inefficient than Polaris by almost 30%. A loving Pro Duo beats it in power/perf.

This is mindbogglingly loving bad, like unimaginably loving bad. No wonder they never breathed a word, and I have no loving idea why they are coming to market with this abomination.

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!
I'm sure the Vega architecture is well-suited to some tasks:

https://www.youtube.com/watch?v=9xNQimoBNBc

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
Also, HBM was just a massive misstep. Vega would be a little less of a trainwreck if you had the option to go "direct to video" and dump it into the midrange market using GDDR5, but AMD put the same memory and packaging NVIDIA uses on a $7000 P100 card onto a 1080 that runs at 375W, so they really have to make this work as a high-end product.

Paul MaudDib fucked around with this message at 04:07 on Jul 9, 2017

Kazinsal
Dec 13, 2011
Linux really shot itself in the foot with regards to GPU drivers. Intel HD drivers on Linux don't suck, but that's because if you know anything about driver stacks you can go grab the hardware docs for free and go to town. AMD drivers mostly don't suck because IIRC the official ATI drivers for Linux back in the day were grown out of some driver developer's pet project, and the unification of the Catalyst stuff with the homegrown reverse-engineered drivers went half-decently overall. I suspect if it weren't for the market share and target market discrepancy between ATI/AMD and Nvidia we wouldn't see actually usable Radeon drivers on Linux.

Nvidia drivers suck on Linux because Nvidia was heavily invested in gaming cards when Linux was starting to get to a point vaguely resembling desktop usability, but there just wasn't the software there on the Linux side of things to make it worth their time to write a whole new driver stack that was anywhere close to on par with their Windows drivers. Torvalds then did his usual autistic screeching and told Nvidia to go gently caress themselves, and well, lol, if I were in Nvidia's position I wouldn't spend a shitload of money to help Torvalds and his enraged fireball of an ego. There are only half-decent drivers now because of the branching out in Nvidia's target markets.

So I think it's less an Nvidia problem and more of a "Linux continues to be led by someone with the social skills of a thermonuclear device" problem. But that's just my opinion, and one that always leads to Linux crusaders calling me a Microsoft shill or something equally obtuse.

AVeryLargeRadish posted:

Yeah, he said it's significantly less efficient than Polaris; he was somewhat baffled at how that was possible, but it appears to be true. Good job AMD. :thumbsup:

It's frankly amazing that ATI's corpse continues to shamble. This whole architectural launch is just embarrassing. :cripes:

Volguus
Mar 3, 2009

Kazinsal posted:

Linux really shot itself in the foot with regards to GPU drivers. Intel HD drivers on Linux don't suck, but that's because if you know anything about driver stacks you can go grab the hardware docs for free and go to town. AMD drivers mostly don't suck because IIRC the official ATI drivers for Linux back in the day were grown out of some driver developer's pet project, and the unification of the Catalyst stuff with the homegrown reverse-engineered drivers went half-decently overall. I suspect if it weren't for the market share and target market discrepancy between ATI/AMD and Nvidia we wouldn't see actually usable Radeon drivers on Linux.

Nvidia drivers suck on Linux because Nvidia was heavily invested in gaming cards when Linux was starting to get to a point vaguely resembling desktop usability, but there just wasn't the software there on the Linux side of things to make it worth their time to write a whole new driver stack that was anywhere close to on par with their Windows drivers. Torvalds then did his usual autistic screeching and told Nvidia to go gently caress themselves, and well, lol, if I were in Nvidia's position I wouldn't spend a shitload of money to help Torvalds and his enraged fireball of an ego. There are only half-decent drivers now because of the branching out in Nvidia's target markets.

So I think it's less an Nvidia problem and more of a "Linux continues to be led by someone with the social skills of a thermonuclear device" problem. But that's just my opinion, and one that always leads to Linux crusaders calling me a Microsoft shill or something equally obtuse.


It's frankly amazing that ATI's corpse continues to shamble. This whole architectural launch is just embarrassing. :cripes:

That's quite a poorly informed opinion. Just like believing that NVidia gives a flying gently caress what Linus says or doesn't say about them or their cards. NVidia has closed-source drivers on Linux and other OSes because Hollywood is making a shitton of movies on these cards and they're running a very wide range of software and OSes in their datacenters. It has nothing to do with games and nothing to do with their love/hate of open source. It only has to do with money. The people who play games on Linux (hell, Linux as a whole) are an extreme minority, and if they all died tomorrow, NVidia's sales wouldn't even register it on their radar.

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️
Linus also did his idiot rant against closed standards on ARM mobile hardware because building a community-backed Linux OS on already-Android devices is obviously such a huge, lucrative and completely viable market.

Palladium fucked around with this message at 04:23 on Jul 9, 2017

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
There is literally no doubt in my mind that Big Polaris was the correct choice now. 3328-3584 shaders, 384-bit bus, probably 225-250W TBP, actually beats the 1080 consistently.

RX 480 Firestrike average is 13380, RX 470 average is 11770. That's a ~13.7% advantage; multiply the RX 470's shader count of 2048 by that and you get 2328, within about 1% of the RX 480's actual 2304 shaders, so performance scales almost linearly with shader count. This is reference to reference, so they'd actually be at similar clocks due to the known RX 480 reference design throttling issues. Using this information we can infer that every properly fed block of 256 shaders nets about a 1610-point difference. The GTX 1080 average score is 21950; in other words, to beat the 1080 based on Polaris scaling, AMD needs ~3667 shaders at ~1200MHz. 3328 shaders do this at 1323MHz, 3584 shaders do it at 1228MHz. This really shouldn't be pulling more than 225W at base clock (obviously more when pushed to ~1500MHz, probably ~270-300W, and beating the 1080 if Polaris clock scaling holds, some 14% slower than a 1080 Ti), and while we would have giggled at AMD for failing to match Nvidia in perf/power, it'd have been an acceptable product that could slot into nearly any system.
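
If anyone wants to sanity-check that, here's a rough Python sketch of the same back-of-envelope math (all figures are the Firestrike averages quoted above, nothing I've measured myself):

code:
# Back-of-envelope Polaris scaling, using the Firestrike averages quoted above.
RX_480 = {"score": 13380, "shaders": 2304, "clock_mhz": 1200}
RX_470 = {"score": 11770, "shaders": 2048}
GTX_1080_SCORE = 21950

ratio = RX_480["score"] / RX_470["score"]                    # ~1.137
implied_shaders = RX_470["shaders"] * ratio                  # ~2328, ~1% off the real 2304
points_per_256_shaders = RX_480["score"] - RX_470["score"]   # 1610 points per extra 256 shaders (2304 - 2048)

# Shaders a hypothetical "Big Polaris" would need at ~1200 MHz to match a GTX 1080
extra_points = GTX_1080_SCORE - RX_470["score"]
needed_shaders = RX_470["shaders"] + extra_points / points_per_256_shaders * 256  # ~3667

# Equivalent clocks for the two shader counts floated above
for shaders in (3328, 3584):
    clock = needed_shaders / shaders * RX_480["clock_mhz"]
    print(f"{shaders} shaders -> ~{clock:.0f} MHz to match a GTX 1080")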

Volguus
Mar 3, 2009

Palladium posted:

Linus also did his idiot rant against closed standards on ARM mobile hardware because building a community-built Linux OS on already-Android devices is obviously such a huge, lucrative and completely viable market.

Linus cares about the desktop, about the user experience. He does not give a poo poo about the "big iron". The problem is, none of the companies who actually contribute massively to the kernel care about anything but big iron (RedHat, Microsoft, IBM, etc.). And NVidia is in the same spot: they are focused on the big datacenters, and if it so happens that from that effort the unwashed masses get a driver they can play WoW on Linux with (all 5 of them)... meh, that's fine.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
The problem is that apart from HPC/datacenter there's just no money in supporting Linux, so NVIDIA basically wanted to write a shim/wrapper that calls into their existing driver stack. The Linux team think this is unacceptable (they take a hardline stance against warping their APIs to suit specific vendors), and NVIDIA is unwilling to rewrite their existing stack to suit Linux, so the open-source option is more or less off the table. NVIDIA went closed-source instead so they could do what they wanted rather than humoring a bunch of :spergin: kernel nerds.

And without NVIDIA's backing the open drivers don't have much of a chance. Nouveau is basically hobby-project quality: it boots and runs a desktop/games/etc., but it's nowhere near the capabilities of the hardware.

The funny thing is that's basically what the Raspberry Pi does - the "open-source GPU driver" Broadcom released is literally just a shim that makes calls into a blob, but this time that's totally fine with the :spergin:'s because :words:

Paul MaudDib fucked around with this message at 04:41 on Jul 9, 2017

Cygni
Nov 12, 2005

raring to post

Maybe SteamOS will finally be the revolution and 2018 will be the year of linux on the d-*laugh track starts early*

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️
Yeah, because the average cash-strapped third worlder/teenager who wants to PC game on the cheap is obviously gonna be an open-source OS whiteknight and not just install a bootleg Win10 ISO.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Palladium posted:

Yeah, because the average cash-strapped third worlder/teenager who wants to PC game on the cheap is obviously gonna be an open-source OS whiteknight and not just install a bootleg Win10 ISO.

Whatever happened to Microsoft's sure-fire plan to sell those markets crippled versions of Windows that could only run like two programs at once?

Fauxtool
Oct 21, 2008

by Jeffrey of YOSPOS

Paul MaudDib posted:

Whatever happened to Microsoft's sure-fire plan to sell those markets crippled versions of Windows that could only run like two programs at once?

It has to be better than pirated to be competitive. I think they have to be able to enforce some kind of penalties before they can offer up an alternative.

Even after the latest big ransomware you just know all the Chinese computers are still running the same cracked Windows XP.

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

AVeryLargeRadish posted:

Buildzoid did a stream today and talked a fair bit about Vega; he has come to the conclusion that the card is very badly power throttled at 300W, and conservatively it's gonna need like 400-500W on water. Basically you will want an 800W+ PSU, bare minimum.

Quoting myself here to add something: that 400-500W is at stock clocks, not OCed; who knows what it would pull if you OCed it. He's seen it use 450W at stock clocks and still want more.

Craptacular!
Jul 9, 2001

Fuck the DH
I think Ubuntu etc. proved that Linux is viable to a certain segment of people, and you can even show its strengths to people to some degree now that "App Stores" have conditioned people to see the strengths in object-oriented design and pre-compiled binary package management. "I just install this and it grabs the prerequisite packages and puts it all together nicely, and I can uninstall whenever I like" has some strengths over the InstallShield/Nullsoft installers that throw poo poo everywhere and never clean up after themselves that we've all become accustomed to. The thing is, people want non-FOSS drivers, non-FOSS file format support in music and videos, etc. They want, at minimum, some sort of *buntu with the proprietary driver installable at the push of a button to get full 3D acceleration, the non-free repo added by default, and the "Click here to install MP3 and DVD support UNLESS YOU ARE AMERICAN because it's against the DMCA but we know you're totally an American and clicking this box" box.

Linus has never been an acolyte of the idea that every element of the operating system needs to be approved by the standards of libre pedants; he just makes the kernel, and he never really cared whether your distro is so built around FOSS that normal people have to go find the hidden repository of useful poo poo (Fedora and RPMFusion) or whether it has tons and tons of closed-source code bundled with it. That, in olden times, was always Stallman's thing, and the Linux desktop "works" for a lot of people if your design philosophy is so opposed to his that you practically have "gently caress RICHARD STALLMAN" commented at the top of every page of code. (And you should probably do this anyway, because he's a terrible person.)

Because people like patented, protected methods of actually accomplishing their poo poo no matter how capable the free software people's alternative is. And consequently, the "most people" who would be fine with a Linux desktop today are also just fine with iPads, so oh well.

I'm fine with consumer grade Linux, but all I'm gonna do in the end is run Chrome and Steam on top of it.

Craptacular! fucked around with this message at 05:45 on Jul 9, 2017

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!
Someone's gonna try to quad-CrossFire this thing and discover that they need to hire an electrician. Super Flower's nice shiny 2kW PSU might not cut it either; luckily there are cases that fit dual PSUs.
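
Quick napkin math on why (using the 400-500W-per-card figure from Buildzoid's stream above, so treat it as a guess, plus some assumed numbers for the rest of the system and PSU efficiency):

code:
# Hypothetical quad-CrossFire Vega build, using the 400-500 W per-card figure
# quoted from Buildzoid's stream above (a guess, not a spec).
cards = 4
rest_of_system_w = 250   # CPU, drives, fans, etc. (rough assumption)
psu_efficiency = 0.90    # assume a ~90% efficient PSU at load

for per_card in (400, 500):
    dc_load = cards * per_card + rest_of_system_w
    wall_draw = dc_load / psu_efficiency
    amps_at_120v = wall_draw / 120
    print(f"{per_card} W/card: {dc_load} W DC load, ~{wall_draw:.0f} W at the wall, "
          f"~{amps_at_120v:.1f} A on a 120 V circuit")

Even the low end works out to ~1850W of DC load and ~17A at the wall, which is already past what a standard 15A household circuit is rated for. Hence the electrician.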

I am legitimately more excited about silly videos related to Vega power consumption than I am about the gaming performance.

MaxxBot fucked around with this message at 05:48 on Jul 9, 2017

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

AVeryLargeRadish posted:

Quoting myself here to add something: that 400-500W is at stock clocks, not OCed; who knows what it would pull if you OCed it. He's seen it use 450W at stock clocks and still want more.

As in he's increasing the power limit in software? Or in hardware somehow? (overvolting will draw more power but you'd only overvolt if you're OC'ing, or your chip isn't stable at factory clocks)
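
For reference, here's the usual first-order reason overvolting balloons power draw; this is just the textbook dynamic-power approximation with made-up numbers, not Vega's actual voltage/frequency curve:

code:
# First-order CMOS dynamic power approximation: P ~ C * V^2 * f.
# Illustrative only; the voltage/clock offsets below are made up, not Vega specs.

def relative_power(v, f, v0=1.0, f0=1.0):
    """Dynamic power relative to a baseline voltage v0 and clock f0."""
    return (v / v0) ** 2 * (f / f0)

baseline = relative_power(1.0, 1.0)
overclocked = relative_power(1.10, 1.10)   # +10% voltage, +10% clock
print(f"+10% V and +10% clock -> ~{overclocked / baseline:.2f}x the dynamic power")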

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

Paul MaudDib posted:

As in he's increasing the power limit in software? Or in hardware somehow? (overvolting will draw more power but you'd only overvolt if you're OC'ing, or your chip isn't stable at factory clocks)

I'm not sure how, since he just talked about it before he got to the point of the stream, which was OCing an i7-7700K on LN2. Hell, it might be that this is second-hand stuff he knows from some other OCer who also got ahold of a Vega card.
