|
This'll be just like hardware T&L in 2001 and DX10 in 2006. The first gen will be kinda neat, come with a steep premium, and quickly be obsolete and useless, because by the time there's enough adoption to justify developing for it, faster cards will exist at a lower price. although those 8800 cards lasted for a while because they were just enormous motherfucks that required a small nuclear reactor to run. also, just like in both those cases, people are dismissing the new technology out of hand because it's not immediately useful, although in this case the lack of demos is odd. T&L and DX10 both had tons of demonstrable content early on. those were also both industry-wide advancements, not just one vendor's, so i guess it depends on whether AMD follows along or this becomes Hairworks for Lighting. conclusion: rtx is meaningless until the 8800GT 2 is out. SRQ fucked around with this message at 10:51 on Nov 5, 2018
# ? Nov 5, 2018 10:46 |
|
|
|
I remember thinking hardware t&l was the most incredible thing ever when I first enabled it. anyway, I bought a 1070ti and it's perfectly able to run things at maximum at 1440p, but I spend 90% of my time grinding path of exile, which makes it kind of excessive. the ridiculous 3-slot cooler is really quiet though
|
# ? Nov 5, 2018 10:55 |
|
it seems that the 2080 is still too slow to really make raytracing a practical technology (though that may be a misunderstanding of what has so far been reported), and we are not where we were in 2001 vis-à-vis moore's law. as such it seems unlikely to make it into the next generation of consoles (and perhaps not even the one after that), which means any broad game support is 10+ years away
|
# ? Nov 5, 2018 10:58 |
|
Cybernetic Vermin posted:it seems that the 2080 is still too slow to really make raytracing a practical technology (though that may be a misunderstanding of what has so far been reported), and we are not where we were in 2001 vis-à-vis moore's law. as such it seems unlikely to make it into the next generation of consoles (and perhaps not even the one after that), which means any broad game support is 10+ years away sooo... you're saying nvidia tricked people into paying massive markups for hardware that will never be usable for what it was designed for, and that is only slightly better than last-gen hardware at current tasks?
|
# ? Nov 5, 2018 11:03 |
|
the RTX cards could be useful for DLSS if it gets implemented
|
# ? Nov 5, 2018 14:19 |
|
Sniep posted:what about RTX DirectX raytracing is included in the windows 10 October 2018 update, which is currently in limbo
|
# ? Nov 5, 2018 22:15 |
Condiv posted:sooo... that's one way of looking at it, sure. There are different ways of slicing and dicing the same data https://www.tomshardware.com/reviews/gpu-hierarchy,4388.html https://www.anandtech.com/bench/GPU18/2304 extremely overkill vs extremely overkill and then some Red Dead on consoles is locked to 30FPS, so an example of a next-gen task would be a PC port with an unlocked frame rate and higher resolutions. Re: Raytracing - any new tech is going to have a chicken-and-egg problem. Nvidia has to introduce the hardware first, and they paid to incorporate RT into a new high-profile game; maybe Unreal Engine will have support in an upcoming version? Coffee Jones fucked around with this message at 00:06 on Nov 6, 2018
|
# ? Nov 5, 2018 23:56 |
|
I don't think we'll know whether the rtx stuff is just a gimmick until at least the series after these. The neural-net stuff screams fad-chasing to me.
|
# ? Nov 6, 2018 00:24 |
|
why don't graphics cards support hdmi-cec? my computer should 100% be able to turn on the tv automatically when it is woken from sleep, goddamnit
|
# ? Nov 6, 2018 03:54 |
|
https://www.pulse-eight.com/p/104/usb-hdmi-cec-adapter E: this does not do what you ask and i am confused as to why it's hard to do
|
# ? Nov 6, 2018 07:13 |
|
Sagebrush posted:why don't graphics cards support hdmi-cec? do TVs not typically support DPMI?
|
# ? Nov 6, 2018 07:22 |
|
SRQ posted:https://www.pulse-eight.com/p/104/usb-hdmi-cec-adapter right? like, we've had wake-on-lan for like thirty fuckin years or something. why can't we just have wake-on-hdmi? the tv apparently does already support it; it's just that the graphics card doesn't have any way of sending the signal. Notorious b.s.d. posted:do TVs not typically support DPMI? i don't know what that is.
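to underline how simple this stuff is: the entire wake-on-lan protocol is just blasting a "magic packet" at the broadcast address. a minimal python sketch (the mac address is made up, and the nic has to have WoL enabled in its firmware for any of this to do anything):

```python
import socket

def magic_packet(mac: str) -> bytes:
    """Build a wake-on-lan magic packet: 6 bytes of 0xFF,
    then the target MAC address repeated 16 times."""
    mac_bytes = bytes.fromhex(mac.replace(":", ""))
    return b"\xff" * 6 + mac_bytes * 16

def wake(mac: str, broadcast: str = "255.255.255.255", port: int = 9) -> None:
    """Fire the packet at the LAN broadcast address (UDP port 9 by convention)."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(magic_packet(mac), (broadcast, port))

# wake("00:11:22:33:44:55")  # hypothetical MAC for illustration
```

the whole payload is 102 bytes and stateless, which is exactly why it's baffling that no equivalent exists over hdmi.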
|
# ? Nov 6, 2018 08:04 |
|
why isn't power-on via usb a thing. the power macintosh G3 actually did that with the keyboard, why did it stop being a thing.
|
# ? Nov 6, 2018 08:56 |
|
Sagebrush posted:why don't graphics cards support hdmi-cec? in my experience cec barely loving works even when both devices support it
|
# ? Nov 6, 2018 13:03 |
|
Powerful Two-Hander posted:in my experience cec barely loving works even when both devices support it cec seems to be almost entirely optional features for manufacturers to implement, so nobody ever implements the same stuff
|
# ? Nov 6, 2018 13:10 |
|
my chromecast works well with my ancient samsung 32" tv and can turn it on and off. also the remote can control playback, thank you hdmi-cec.
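for what it's worth, the cec messages involved are tiny — "power on and show my input" is a two-byte frame. a python sketch of the framing (logical addresses and opcodes as defined in the hdmi-cec spec; actually putting frames on the wire needs hardware support or an adapter like the pulse-eight one):

```python
# A CEC frame is a header byte (initiator nibble, destination nibble)
# followed by an opcode. Logical addresses: 0 = TV, 4 = playback device
# (which is what a chromecast registers as).
TV, PLAYBACK_1 = 0x0, 0x4
IMAGE_VIEW_ON = 0x04   # "power on and switch to my input"
STANDBY = 0x36         # "go to standby"

def cec_frame(initiator: int, destination: int, opcode: int) -> bytes:
    return bytes([(initiator << 4) | destination, opcode])

wake_tv = cec_frame(PLAYBACK_1, TV, IMAGE_VIEW_ON)   # b'\x40\x04'
sleep_tv = cec_frame(PLAYBACK_1, TV, STANDBY)        # b'\x40\x36'
```

two bytes each way, which makes it all the more absurd that gpu vendors never bothered.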
|
# ? Nov 6, 2018 13:28 |
|
or "anynet+ magic" as samsung liked to call it back in 2007
|
# ? Nov 6, 2018 13:28 |
|
Powerful Two-Hander posted:in my experience cec barely loving works even when both devices support it it works great on my apple tv 4k
|
# ? Nov 6, 2018 13:32 |
|
cec works like 90-95% right for me.
|
# ? Nov 6, 2018 14:56 |
|
Xaris posted:so like as an average idiot joe myself, no AMDs are not really bad. they are video cards that do the exact same poo poo as nVideo. although whether they do "Ray Tracing" as well as $700 RT cards will take a few years to reach a verdict. bear in mind Ray Tracing is NOT on the low or mid range cards, just the very expensive ones. AMD today announced 7nm GPUs with 1TB/s bandwidth and HBM2 and all kinds of stuff. again i don't really care if they have the highest-benchmarking card on the block today, i actually care most about future-proof technology that doesn't leave you with 3.5/4GB of VRAM when the consoles all had 8GB shared GDDR5 like a couple years ago. take a look at the history of nVidium cards and you can see whatever you want to see; it really depends on what timeframe you look at. over-heating problems, bad drivers, super behind schedule, way too expensive, etc. if you adjust the dates on this hypothetical comparison, it looks the other way: Geforce is more powerful and arrives just when you're planning to upgrade, and it's AMD's cards that are playing catch-up, awaiting better drivers to showcase their true potential, and so on. in fact this is where AMD is outdone by nValida, and it's basically my whole thesis: nVidia is able to portray a very narrow sample of performance highlights and capture mindshare at launch, whereas AMD may lead in technology and even performance, but doesn't "sell" their strengths, which sometimes don't show themselves until months or years after the chip debuts. just look at the RT launch: there is no software. it's entirely a matter of nVidia's deep pockets and marketing ethics that they seem to have the equivalent of great console launches; buzzword-laced products that pull ahead using a mixture of yesterday's technology and an emphasis on re-branding that tech, plus blatant benchmark rigging.
this is the elephant in the room, and most companies and careers in tech journalism are fated based on how well you can keep this weird open secret.
|
# ? Nov 6, 2018 19:46 |
|
Sagebrush posted:why don't graphics cards support hdmi-cec? i'd be happy if the tv, which was already on, noticed when the computer woke up from sleep mode without having to be power-cycled
|
# ? Nov 6, 2018 20:33 |
|
intel nucs turn on and off the tv via hdmi why can’t nvidia
|
# ? Nov 6, 2018 20:39 |
|
it probably costs like 2 cents to license turning a tv on/off or something
|
# ? Nov 7, 2018 21:04 |
|
why is your nvidia card connected to a tv
|
# ? Nov 7, 2018 23:38 |
|
incels interlinked posted:it works great on my apple tv 4k although it has to be directly connected. if it goes through anything on its way to the tv then the cec stops working or at least it does in my experience
|
# ? Nov 8, 2018 01:38 |
|
My Linux Rig posted:
the atv4 as well, but sometimes it'll slip into sleep mode and forget the tv, which is baffling KOTEX GOD OF BLOOD posted:why is your nvidia card connected to a tv the only real reason: a very nice motion-smoothed excel spreadsheet is best viewed at 60 inches
|
# ? Nov 8, 2018 06:22 |
|
Agile Vector posted:the atv4 as well, but sometimes itll slip into sleep mode and forget the tv, which is baffling excel is a better video player than vlc
|
# ? Nov 8, 2018 06:56 |
|
after checking out VLC for the first time in years i wouldn't doubt it. poo poo looks ancient. was it excel that snuck a 3D game engine into one release?
|
# ? Nov 8, 2018 07:00 |
Agile Vector posted:was it excel that snuck a 3D game engine in one release?
|
|
# ? Nov 8, 2018 07:28 |
|
apparently the new nvidia cards are getting BSODs when connected to certain monitors or attached to two monitors at once https://www.youtube.com/watch?v=A4g5CZCaWXo
|
# ? Nov 8, 2018 16:09 |
|
nVidia drivers causing BSODs? well i never
|
# ? Nov 9, 2018 01:06 |
|
anyone talking about nvidia's stock plunge? i was just checking prices and came across it https://www.fool.com/investing/2018/11/06/why-nvidia-stock-lost-25-in-october.aspx kinda baffling considering their recent popularity. i think it's still the lowest it's been since last year
|
# ? Nov 9, 2018 15:44 |
|
you can buy stock in mongodb? lol quote:One of the stocks that I personally bought while the market fell over the last few weeks was MongoDB. If you're looking for a potential hot ticket to wealth, MongoDB could be right up your alley, too.
|
# ? Nov 9, 2018 15:46 |
|
Good Sphere posted:anyone talking about nvidia's stock plunge? i was just checking prices and came across it pretty easy to see the issue, in the one-two punch of crypto mining bullshit finally dying down and a lot of companies launching into custom machine learning hardware. nvidia's stock was priced high under some weird assumption that they'd have a lasting foothold and revenue stream in those two areas; there is only so large a market for high-end graphics in itself.
|
# ? Nov 9, 2018 16:08 |
|
Cybernetic Vermin posted:pretty easy to see the issue, in the one-two punch of crypto mining bullshit finally dying down and a lot of companies launching into custom machine learning hardware. nvidia's stock was priced high under some weird assumption that they'd have a lasting foothold and revenue stream in those two areas; there is only so large a market for high-end graphics in itself. i thought the crypto mining thing making it drop happened last year, but possibly it echoed with people finally switching to custom hardware? it seems like they forecasted this drop too, trying to make services other than graphics appealing, like supercomputing
|
# ? Nov 9, 2018 16:39 |
|
what's the deal with ati's new poo poo today?
|
# ? Nov 9, 2018 17:07 |
|
Cybernetic Vermin posted:pretty easy to see the issue, in the one-two of crypto mining bullshit finally dying down and a lot of companies launching into custom machine learning hardware. nvidias pricing was high under some weird assumptions that they'd have a lasting foothold and revenue stream in those two areas, there is only so large a market for high-end graphics in itself. You can also get cloud instances with GPU acceleration for tensorflow and stuff. Are those GPUs being made by nvidia, or someone else? The market is more than just graphics if you count all machine learning/AI stuff, but idiots running bitcoin probably dwarfed both those markets.
|
# ? Nov 9, 2018 17:24 |
|
AWS gpu instances are all nvidia
|
# ? Nov 9, 2018 17:51 |
|
|
|
ADINSX posted:You can also get cloud instances with GPU acceleration for tensorflow and stuff. Are those GPUs being made by nvidia, or someone else? The market is more than just graphics if you count all machine learning/AI stuff, but idiots running bitcoin probably dwarfed both those markets. cuda largely means nvidia has the enterprise gpgpu stuff secured, but custom designs for deep learning look set to encroach on that, and tbqh i expect aws is a thorn in their side too: amazon no doubt pushes a bit on price, and most outfits won't buy huge stacks of GPUs they can't keep busy anyway when getting aws instances as needed costs the same or less
|
# ? Nov 9, 2018 18:36 |