  • Locked thread
Cybernetic Vermin
Apr 18, 2005

either way there seems to be little reason to buy a 2080 over a 1080. raytracing seems to be the differentiator, but buying into that sort of tech first-gen largely never pays off: it's 50/50 whether it is a dead-end implementation (possibly with another raytracing take with slightly incompatible primitives taking over), or whether the first title to make significant use of it (i.e. regularly feature it in gameplay rather than have one room with a shiny statue seated above a pool of water surrounded by a checkerboard floor) requires at least a 3080, or i guess a 4080 or whatever is consistent with 980 -> 1080 -> 2080, to run acceptably

largely determined by the route next-gen consoles take either way, little point buying into any hardware that doesn't just do things the consoles can do, except faster and at higher resolution

Cybernetic Vermin
Apr 18, 2005

Broken Machine posted:

it's the other way 'round, but yeah. it first figures out the frame at a higher res and downsamples it, giving a nice anti-aliasing effect

it might be able to do that too (the 'ss' in the name hopefully refers to 'supersampling' rather than signifying a nazi influence), but i am quite sure the most common use case will be to replace the now-standard engine (temporal) upsampling "AA"

e.g. the demo nvidia used to show it off was some epic thing that rendered at 1440p and used dlss to upsample to 4k

the weird bit is that it'll be a feature mostly for AAA titles, since the full effect is only achieved by giving nvidia a custom build of the game which renders scenes in various resolutions at the same time (and presumably navigates through various parts of the game automatically), which nvidia then grinds out a specific neural network for. as far as i understand the tools to make the nets are not available for anyone else to work with
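
to make the difference concrete, here's a rough sketch (python/numpy, everything toy and made up by me, not anything nvidia actually ships) of classic supersampling AA next to the render-low-then-upscale flow dlss is pitched for:

```python
# toy comparison of supersampling AA vs. dlss-style learned upsampling.
# the renderer and "upscaler" below are placeholders so the sketch runs end to end.
import numpy as np

def supersample_aa(render, target_h, target_w, factor=2):
    """classic SSAA: render at factor x the target resolution, then box-filter down."""
    hi = render(target_h * factor, target_w * factor)    # expensive render
    hi = hi.reshape(target_h, factor, target_w, factor, 3)
    return hi.mean(axis=(1, 3))                          # average each block of pixels

def dlss_style_upsample(render, target_h, target_w, upscaler, factor=2):
    """render below the target resolution, let a trained model fill in the detail."""
    lo = render(target_h // factor, target_w // factor)  # cheap render
    return upscaler(lo, (target_h, target_w))            # learned upscale (placeholder here)

def toy_render(h, w):
    return np.random.rand(h, w, 3).astype(np.float32)

def toy_upscaler(img, size):
    # nearest-neighbour repeat standing in for the neural network
    h, w = size
    return np.repeat(np.repeat(img, h // img.shape[0], axis=0), w // img.shape[1], axis=1)

print(supersample_aa(toy_render, 1080, 1920).shape)                     # (1080, 1920, 3)
print(dlss_style_upsample(toy_render, 2160, 3840, toy_upscaler).shape)  # (2160, 3840, 3)
```

same target resolution both ways, but the second path only pays for a quarter of the pixels at render time, which is the whole draw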

Cybernetic Vermin
Apr 18, 2005

KOTEX GOD OF BLOOD posted:

hackintoshers and egpu people are enough of a combined market that the drivers are worth developing? i guess? i can't imagine more than a couple hundred people are still using cheese grater mac pros except with fancy new graphics cards.

i pretty much assume that there is some big customer that has something hacked up in one way or another and asks nvidia to do the drivers, the public release being largely incidental

i also assume that there'll only be drivers for the latest arch when that customer upgrades

Cybernetic Vermin
Apr 18, 2005

DuckConference posted:

in some ways the stock arm cores are more impressive than the apple CPUs, because they’re close to competitive while having way lower area. like on die shot comparisons they’re sometimes like half the size at iso process

in general one should remember that there is a certain level of getting what one pays for: the apple a12 has 6.9 billion transistors to a snapdragon 845's 5.3 billion (i believe these are both estimates from die shots + averages of the processes, so a grain of salt should be added), and the latter implements the modem within that envelope (where apple uses a separate chip)
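
(back-of-envelope on those two numbers, both of which are die-shot estimates to begin with:)

```python
# rough ratio of the quoted transistor budgets; both figures are estimates,
# so treat the result as ballpark only
a12_transistors = 6.9e9    # apple a12 (no modem on die)
sd845_transistors = 5.3e9  # snapdragon 845 (modem included in this budget)
print(f"a12 budget is ~{a12_transistors / sd845_transistors:.2f}x the 845's")  # ~1.30x
```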

still, it is really apples to apples to compare performance, but it is not like apple is applying some truly unheard-of magic to their stuff

Cybernetic Vermin
Apr 18, 2005

Notorious b.s.d. posted:

i'm very impressed

you chose to compare to a dogshit midrange chip, but apple is delivering like 80% of intel's best single-core performance

in a cellphone chip, no less. which is necessarily compromised to fit into that crappy power and packaging envelope

maybe their desktop arm effort will actually be good and cool

power is one thing, but the packaging situation is pretty good: for one, the a12 has 3x the transistors of that skylake, and for another the memory is on-package (relieving pin-out issues), so outside of power they are not exactly constraining themselves

not to say it isn't impressive, just that some of the ingrained ideas about mobile processors don't really hold anymore

Cybernetic Vermin fucked around with this message at 09:49 on Nov 4, 2018

Cybernetic Vermin
Apr 18, 2005

e: nm

Cybernetic Vermin
Apr 18, 2005

it seems that the 2080 is still too slow to really make raytracing a practical technology (though that may be a misunderstanding of what has so far been reported), and vis-à-vis moore's law we are not where we were in 2001. as such it seems unlikely to make it into the next generation of consoles (and perhaps not even the one after that), which means any broad game support is 10+ years away

Cybernetic Vermin
Apr 18, 2005

Good Sphere posted:

anyone talking about nvidia's stock plunge? i was just checking prices and came across it

https://www.fool.com/investing/2018/11/06/why-nvidia-stock-lost-25-in-october.aspx

kinda baffling considering their recent popularity. i think it's still the lowest it's been since last year

pretty easy to see the issue, in the one-two punch of crypto mining bullshit finally dying down and a lot of companies launching into custom machine learning hardware. nvidia's pricing was high under some weird assumption that they'd have a lasting foothold and revenue stream in those two areas; there is only so large a market for high-end graphics in itself.

Cybernetic Vermin
Apr 18, 2005

ADINSX posted:

You can also get cloud instances with GPU acceleration for tensorflow and stuff. Are those GPUs being made by nvidia, or someone else? The market is more than just graphics if you count all machine learning/AI stuff, but idiots running bitcoin probably dwarfed both those markets.

cuda largely means nvidia has the enterprise gpgpu stuff secured, but custom designs for deep learning look set to encroach on that, and tbqh i expect aws is a thorn in their side in that amazon no doubt pushes a bit on price, and most outfits won't buy huge stacks of GPUs they can't keep busy anyway when getting aws instances as needed costs the same or less
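
(if anyone wants to poke at what such an instance actually hands you, a minimal check from the tensorflow side looks roughly like the below; nothing in it is aws-specific, it just assumes tensorflow is installed on the box)

```python
# minimal check that tensorflow sees a cuda-capable gpu on the instance
import tensorflow as tf

print("built with CUDA:", tf.test.is_built_with_cuda())
for gpu in tf.config.list_physical_devices("GPU"):
    print("accelerator:", gpu.name)

# the driver itself can also be queried outside tensorflow, e.g.:
#   nvidia-smi --query-gpu=name,memory.total --format=csv
```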
