SRQ
Nov 9, 2009

This'll be just like hardware TnL in 2001 and DX10 in 2006. The first gen will be kinda neat, come with a steep premium, and quickly end up obsolete and useless: by the time there's enough adoption to justify developing for it, faster cards will exist at a lower price.

although those 8800 cards lasted for a while because they were just enormous motherfucks that required a small nuclear reactor to run.

also, just like in both those cases, people are dismissing the new technology out of hand when it's not immediately useful, although in this case the lack of demos is odd. TnL and DX10 both had tons of demonstrable content early on.

those were also both industry-wide advancements, not just one vendor's, so i guess it depends on whether AMD follows along or this becomes Hairworks for Lighting

conclusion: rtx is meaningless until the 8800GT 2 is out.

SRQ fucked around with this message at 10:51 on Nov 5, 2018

Powerful Two-Hander
Mar 10, 2004

Mods please change my name to "Tooter Skeleton" TIA.


I remember thinking hardware t&l was the most incredible thing ever when I first enabled it

anyway I bought a 1070ti and it is perfectly able to run things at maximum on 1440p, but then I spend 90% of my time grinding path of exile, which means it's kind of excessive

the ridiculous 3-slot cooler is really quiet though

Cybernetic Vermin
Apr 18, 2005

it seems that the 2080 is still too slow to really make raytracing practical technology (though that may be a misunderstanding of what has so far been reported), and vis-à-vis moore's law we are not where we were in 2001. as such it seems unlikely to make it into the next generation of consoles (and perhaps not even the one after that), which means any broad game support is 10+ years away

Condiv
May 7, 2008

Sorry to undo the effort of paying a domestic abuser $10 to own this poster, but I am going to lose my dang mind if I keep seeing multiple posters who appear to be Baloogan.

With love,
a mod


Cybernetic Vermin posted:

it seems that the 2080 is still too slow to really make raytracing practical technology (though that may be a misunderstanding of what has so far been reported), and vis-à-vis moore's law we are not where we were in 2001. as such it seems unlikely to make it into the next generation of consoles (and perhaps not even the one after that), which means any broad game support is 10+ years away

sooo...

you're saying nvidia tricked people into paying massive markups for hardware that will never be able to be used for what it was designed for and is only slightly better than last gen hardware for current tasks?

Olivil
Jul 15, 2010

Wow I'd like to be as smart as a computer
the RTX cards could be useful for DLSS if it gets implemented

Jenny Agutter
Mar 18, 2009

Sniep posted:

what about RTX

the ray tracing poo poo

has anyone proof of concepted that yet - is there like, a demo, or something? (There's none on nvidia's own demos page)

DirectX Raytracing is included in the Windows 10 October 2018 update, which is currently in limbo

Coffee Jones
Jul 4, 2004

16 bit? Back when we was kids we only got a single bit on Christmas, as a treat
And we had to share it!

Condiv posted:

sooo...

you're saying nvidia tricked people into paying massive markups for hardware that will never be able to be used for what it was designed for and is only slightly better than last gen hardware for current tasks?

that’s one way of looking at it, sure. There are different ways of slicing and dicing the same data:
https://www.tomshardware.com/reviews/gpu-hierarchy,4388.html
https://www.anandtech.com/bench/GPU18/2304
extremely overkill vs extremely overkill and then some


Red Dead on consoles is locked to 30FPS, so an example of a next-gen task would be a PC port with an unlocked frame rate and higher resolutions.


Re: Raytracing -
Any new tech is going to have a chicken-and-egg problem. Nvidia has to introduce the hardware first, and they paid to incorporate RT into a new high-profile game; maybe Unreal Engine will have support in an upcoming version?

Coffee Jones fucked around with this message at 00:06 on Nov 6, 2018

brand engager
Mar 23, 2011

I don't think we'll know whether the rtx stuff is just a gimmick until at least the series after these. The neural-net stuff screams fad-chasing to me.

Sagebrush
Feb 26, 2012

why don't graphics cards support hdmi-cec?

my computer should 100% be able to turn on the tv automatically when it is woken from sleep, goddamnit

SRQ
Nov 9, 2009

https://www.pulse-eight.com/p/104/usb-hdmi-cec-adapter

E: this does not do what you ask and i am confused as to why it's hard to do
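
e2: to be fair, if you script it yourself one of those dongles can at least fire the power-on command. rough, untested python sketch that just shells out to cec-client (the CLI that ships with libCEC; assumes libCEC is installed and the adapter is picked up as the default device):

import subprocess

# rough sketch: pipe a CEC command into cec-client (part of libCEC) talking
# to a pulse-eight USB-CEC adapter. "on 0" powers on the device at logical
# address 0 (the TV); "standby 0" would put it back to sleep.
def cec_command(cmd):
    # -s = single-command mode, -d 1 = only log errors
    subprocess.run(["cec-client", "-s", "-d", "1"],
                   input=cmd.encode(), check=True)

cec_command("on 0")

wire that to a resume hook and you've basically built your own wake-on-hdmi, kind of.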

Notorious b.s.d.
Jan 25, 2003

by Reene

Sagebrush posted:

why don't graphics cards support hdmi-cec?

my computer should 100% be able to turn on the tv automatically when it is woken from sleep, goddamnit

do TVs not typically support DPMS?
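
(that being the old VESA display power management signaling — from a linux desktop you can force the states by hand with xset, though whether a TV over HDMI honors any of them instead of just whining about "no signal" is another question. quick sketch:)

import subprocess

# sketch: force a DPMS power state from an X11 session using xset (ships
# with X11). valid states are "on", "standby", "suspend", and "off"; many
# HDMI TVs may just treat all of them as a lost signal.
def dpms_force(state):
    subprocess.run(["xset", "dpms", "force", state], check=True)

dpms_force("off")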

Sagebrush
Feb 26, 2012

SRQ posted:

https://www.pulse-eight.com/p/104/usb-hdmi-cec-adapter

E: this does not do what you ask and i am confused as to why it's hard to do

right? like, we've had wake-on-lan for like twenty fuckin years or something. why can't we just have wake-on-hdmi?

the tv does already support it, apparently, it's just that the graphics card doesn't have any way of sending the signal.
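
and the wol "magic packet" itself is hilariously simple, which makes the lack of a wake-on-hdmi equivalent even dumber: it's six 0xff bytes followed by the target's MAC repeated 16 times, broadcast over UDP. minimal python sketch (the MAC below is a placeholder):

import socket

# minimal wake-on-lan sketch: the "magic packet" is 6 x 0xFF followed by
# the target's MAC address repeated 16 times, sent as a UDP broadcast
# (port 9 by convention). the MAC below is made up.
def wake_on_lan(mac):
    mac_bytes = bytes.fromhex(mac.replace(":", ""))
    packet = b"\xff" * 6 + mac_bytes * 16
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(packet, ("255.255.255.255", 9))

wake_on_lan("00:11:22:33:44:55")  # placeholder MAC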


Notorious b.s.d. posted:

do TVs not typically support DPMS?

i don't know what that is.

SRQ
Nov 9, 2009

why isn't power-on via usb a thing.

the Power Macintosh G3 actually did that with the keyboard, so why did it stop being a thing.

Powerful Two-Hander
Mar 10, 2004

Mods please change my name to "Tooter Skeleton" TIA.


Sagebrush posted:

why don't graphics cards support hdmi-cec?

my computer should 100% be able to turn on the tv automatically when it is woken from sleep, goddamnit

in my experience cec barely loving works even when both devices support it

fart simpson
Jul 2, 2005

DEATH TO AMERICA
:xickos:

Powerful Two-Hander posted:

in my experience cec barely loving works even when both devices support it

cec seems to be almost entirely optional features for the manufacturers to implement, so nobody ever implements the same stuff

cowboy beepboop
Feb 24, 2001

my chromecast works well with my ancient samsung 32" tv and can turn it on and off. also the remote can control playback, thank you hdmi-cec.

cowboy beepboop
Feb 24, 2001

or "anynet+ magic" as samsung liked to call it back in 2007

heated game moment
Oct 30, 2003

Lipstick Apathy

Powerful Two-Hander posted:

in my experience cec barely loving works even when both devices support it

it works great on my apple tv 4k

Endless Mike
Aug 13, 2003



cec works like 90-95% right for me.

PleasureKevin
Jan 2, 2011

Xaris posted:

so like as an average idiot joe myself,

what's the best current or, as much as is known, upcoming bang-for-the-buck card? i remember poo poo like the visiontek geforce TI 4200 being incredible bang but i don't think deals like that exist anymore?

my 970 i think was pretty good but it's showing its age. i'm not interested in 4k ultra everything, but i'd like to run 1080p 144hz on high settings in most games if possible, and this usually struggles to keep over 90 at medium+ settings.

are amd cards still really bad or are they actually viable again? i remember like 2008-2016ish or something they were pretty awful

no, AMD cards are not really bad. they are video cards that do the exact same poo poo as nVideo. although whether they do “Ray Tracing” as well as $700 RT cards will take a few years to reach a verdict. bear in mind Ray Tracing is NOT on the low or mid range cards, just the very expensive ones.

AMD today announced 7nm GPUs with 1TB/s bandwidth and HBM2 and all kinds of stuff. again i don’t really care if they have the highest-benchmarking card on the block today, i actually care most about future-proof technology that doesn’t leave you with 3.5/4GB of VRAM when the consoles all had 8GB of shared GDDR5 like a couple years ago.

take a look at the history of nVidium cards and you can see whatever you want to see. it depends really on what timeframe you look at. over-heating problems, bad drivers, super behind schedule, way too expensive, etc. if you adjust the dates on this hypothetical comparison, it looks the other way: Geforce is more powerful and arrives just when you’re planning to upgrade, and it’s AMD’s cards that are playing catch-up, awaiting better drivers to showcase their true potential, and so on.

in fact this is where AMD is outdone by nValida, and it’s basically my whole thesis: nVidia is able to portray a very narrow sample of performance highlights and capture mindshare at launch, whereas AMD may lead in technology and even performance, but doesn’t “sell” their strengths, which sometimes don’t show themselves until months or years after the chip debuts.

but just look at the RT launch. there is no software. it’s entirely a matter of nVidia’s deep pockets and marketing ethics that they seem to have the equivalent of great console launches: buzzword-laced products that pull ahead using a mixture of yesterday’s technology, an emphasis on re-branding that tech, and blatant benchmark rigging. this is the elephant in the room, and most companies and careers in tech journalism are fated based on how well you can keep this weird open secret.

flakeloaf
Feb 26, 2003

Still better than android clock

Sagebrush posted:

why don't graphics cards support hdmi-cec?

my computer should 100% be able to turn on the tv automatically when it is woken from sleep, goddamnit

i'd be happy if the tv, which was already on, noticed when the computer woke up from sleep mode without having to be power-cycled

Lightbulb Out
Apr 28, 2006

slack jawed yokel
intel nucs turn on and off the tv via hdmi

why can’t nvidia

pram
Jun 10, 2001
it probably costs like 2 cents to license turning a tv on/off or something

KOTEX GOD OF BLOOD
Jul 7, 2012

why is your nvidia card connected to a tv

My Linux Rig
Mar 27, 2010
Probation
Can't post for 6 years!

incels interlinked posted:

it works great on my apple tv 4k

:same:

although it has to be directly connected. if it goes through anything on its way to the tv then the cec stops working, or at least it does in my experience

Agile Vector
May 21, 2007

scrum bored



My Linux Rig posted:

:same:

although it has to be directly connected. if it goes through anything on its way to the tv then the cec stops working or at least it does in my experience

the atv4 as well, but sometimes it'll slip into sleep mode and forget the tv, which is baffling

KOTEX GOD OF BLOOD posted:

why is your nvidia card connected to a tv

the only real reason: a very nice motion-smoothed excel spreadsheet is best viewed at 60 inches

Notorious b.s.d.
Jan 25, 2003

by Reene

Agile Vector posted:

the atv4 as well, but sometimes it'll slip into sleep mode and forget the tv, which is baffling


the only real reason: a very nice motion-smoothed excel spreadsheet is best viewed at 60 inches

excel is a better video player than vlc

Agile Vector
May 21, 2007

scrum bored



after checking out VLC for the first time in years i wouldn't doubt it. poo poo looks ancient

was it excel that snuck a 3D game engine into one release?

Laslow
Jul 18, 2007

Agile Vector posted:

was it excel that snuck a 3D game engine into one release?

yeah. they had to stop doing that kind of stuff because of stipulations in government contracts disallowing undocumented features in any software they purchase or operate. i understand the reasoning, but it doesn't make the government any less lame.

Condiv
May 7, 2008

Sorry to undo the effort of paying a domestic abuser $10 to own this poster, but I am going to lose my dang mind if I keep seeing multiple posters who appear to be Baloogan.

With love,
a mod


apparently the new nvidia cards are getting BSODs when connected to certain monitors

or attached to two monitors at once

https://www.youtube.com/watch?v=A4g5CZCaWXo

Farmer Crack-Ass
Jan 2, 2001

this is me posting irl
nVidia drivers causing BSODs? well i never

Good Sphere
Jun 16, 2018

anyone talking about nvidia's stock plunge? i was just checking prices and came across it

https://www.fool.com/investing/2018/11/06/why-nvidia-stock-lost-25-in-october.aspx

kinda baffling considering their recent popularity. i think it's still the lowest it's been since last year

akadajet
Sep 14, 2003

you can buy stock in mongodb? lol

quote:

One of the stocks that I personally bought while the market fell over the last few weeks was MongoDB. If you're looking for a potential hot ticket to wealth, MongoDB could be right up your alley, too.

Cybernetic Vermin
Apr 18, 2005

Good Sphere posted:

anyone talking about nvidia's stock plunge? i was just checking prices and came across it

https://www.fool.com/investing/2018/11/06/why-nvidia-stock-lost-25-in-october.aspx

kinda baffling considering their recent popularity. i think it's still the lowest it's been since last year

pretty easy to see the issue, in the one-two punch of crypto mining bullshit finally dying down and a lot of companies launching into custom machine learning hardware. nvidia's pricing was high under some weird assumptions that they'd have a lasting foothold and revenue stream in those two areas; there is only so large a market for high-end graphics in itself.

Good Sphere
Jun 16, 2018

Cybernetic Vermin posted:

pretty easy to see the issue, in the one-two punch of crypto mining bullshit finally dying down and a lot of companies launching into custom machine learning hardware. nvidia's pricing was high under some weird assumptions that they'd have a lasting foothold and revenue stream in those two areas; there is only so large a market for high-end graphics in itself.

i thought the crypto mining thing making it drop happened last year, but possibly it echoed as people finally switched to custom hardware?

it seems like they forecasted this drop too, trying to make other things besides graphics appealing, like supercomputing

Fuzzy Mammal
Aug 15, 2001

Lipstick Apathy
what's the deal with ati's new poo poo today?

ADINSX
Sep 9, 2003

Wanna run with my crew huh? Rule cyberspace and crunch numbers like I do?

Cybernetic Vermin posted:

pretty easy to see the issue, in the one-two punch of crypto mining bullshit finally dying down and a lot of companies launching into custom machine learning hardware. nvidia's pricing was high under some weird assumptions that they'd have a lasting foothold and revenue stream in those two areas; there is only so large a market for high-end graphics in itself.

You can also get cloud instances with GPU acceleration for tensorflow and stuff. Are those GPUs being made by nvidia, or someone else? The market is more than just graphics if you count all machine learning/AI stuff, but idiots running bitcoin probably dwarfed both those markets.
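
i guess you could check from inside one of those instances. something like this should name whatever card it finds (rough sketch, assumes 2018-era tensorflow 1.x with GPU support installed):

# sketch: list the devices a tensorflow 1.x install can see on a cloud GPU
# instance; physical_device_desc names the actual card (e.g. a Tesla V100
# on the AWS p3 instances).
from tensorflow.python.client import device_lib

for dev in device_lib.list_local_devices():
    print(dev.name, dev.device_type, dev.physical_device_desc)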

Nomnom Cookie
Aug 30, 2009



AWS gpu instances are all nvidia

Cybernetic Vermin
Apr 18, 2005

ADINSX posted:

You can also get cloud instances with GPU acceleration for tensorflow and stuff. Are those GPUs being made by nvidia, or someone else? The market is more than just graphics if you count all machine learning/AI stuff, but idiots running bitcoin probably dwarfed both those markets.

cuda largely means nvidia has the enterprise gpgpu stuff secured, but custom designs for deep learning look set to encroach on that. tbqh i expect aws is a thorn in their side too, in that amazon no doubt pushes a bit on price, and most outfits won't buy huge stacks of GPUs they can't keep busy anyway when getting aws instances as needed costs the same or less
