ZombieCrew
Apr 1, 2019

cheesetriangles posted:

Is the reason everyone talks about buying some 1000 dollar GPU just to play 25 year old games on it because the only people actually buying them are middle-aged people who just want to experience their youth again?

Yes. And Diablo runs great!

Honestly, the technology is amazing these days and I love to tinker with PCs. Top-end GPUs were something I wasn't able to afford way back when. I'm still using speakers I got when I built my first PC back in 2001.


SSJ_naruto_2003
Oct 12, 2012



Yeah, this will be the first time I buy a GPU over about $200, and most of my playtime now is EU4 and League.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

cheesetriangles posted:

Is the reason everyone talks about buying some 1000 dollar GPU just to play 25 year old games on it because the only people actually buying them are middle-aged people who just want to experience their youth again?

The reason is that this is an internet forum from 20 years ago, and shockingly enough most of the people here are old as gently caress.

shrike82
Jun 11, 2005

lol, at least we're not collecting GPUs from that era

the closest would be buying CRTs and even so there's some practical value to doing so

cheesetriangles
Jan 5, 2011





K8.0 posted:

The reason is that this is an internet forum from 20 years ago, and shockingly enough most of the people here are old as gently caress.

I see it in other places, but everyone there is probably old too, I guess.

Dr. Video Games 0031
Jul 17, 2004

I'll have you know that my 1000 dollar GPU is being put to good use playing the new release "Last Call BBS"

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
Bioshock Remastered technically counts as a game released in the last decade

Lord Stimperor
Jun 13, 2018

I'm a lovable meme.

cheesetriangles posted:

Is the reason everyone talks about buying some 1000 dollar GPU just to play 25 year old games on it because the only people actually buying them are middle-aged people who just want to experience their youth again?

I was going to post in this thread about other stuff but then I saw your post and felt seen


(it's actually 26 years old now)

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
When FEAR first came out I could barely run it. I played it again last year and it was an absolute revelation. You just can't beat the old classics.

FEAR will be 18 next year. Time flies, it's absurd

Mr. Neutron
Sep 15, 2012

~I'M THE BEST~
Indeed. I love being able to run older games maxed at 4k60.

repiv
Aug 13, 2009

Dr. Video Games 0031 posted:

From MLID a couple days ago: [image]
Intel is generally the one company where he isn't completely talking out of his rear end or regurgitating other leaks. That Intel slide either corroborates this, or wccftech faked it based on MLID's leak (it's a toss-up with them). MLID seemed unsure if all of these SKUs would actually materialize (in NA, at least).

intel denies that the A780 exists or has ever been on the cards, so either MLID or Intel is lying (probably MLID)

https://twitter.com/ryanshrout/status/1548430503644057605?t=J8MnFihIm3-4UYZ29_836Q&s=19

Dr. Video Games 0031
Jul 17, 2004

repiv posted:

intel denies that the A780 exists or has ever been on the cards, so either MLID or Intel is lying (probably MLID)

https://twitter.com/ryanshrout/status/1548430503644057605?t=J8MnFihIm3-4UYZ29_836Q&s=19

lmfao

https://twitter.com/mooreslawisdead/status/1548528154020569088

shrike82
Jun 11, 2005

I've been replaying Persona 3 on the Deck. It's great, but it ran great on PSP and then Vita over a decade ago, so :shrug:

repiv
Aug 13, 2009

the A780 rumour never made that much sense; it had the same specs as the A770, so it could only have been differentiated by power limit and binning

that may make sense for a halo part, but that chip isn't going to touch AMD/NV's flagships no matter how much they juice it

Dr. Video Games 0031
Jul 17, 2004

I missed this, but a Linux driver update from a few days ago seems to have confirmed the Navi 31 chiplet count: https://videocardz.com/newz/amd-rdna3-navi-31-gpu-with-six-mcds-to-feature-384-bit-wide-memory-bus

The earlier rumor about there being many memory controller dies is seemingly true. Looks like it'll be one compute die (5nm) surrounded by 6 MCDs (6nm). The MCDs will also contain the Infinity Cache, and there's a possibility 3D stacking could be used to double the amount of cache per MCD, though this seems more speculative. So only one compute die, but there's a lot of die area being split off into those MCDs, which will presumably be quite cheap to manufacture since each one will be small and on 6nm.
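Back-of-the-envelope, the leaked figures work out like this (a quick Python sketch; only the six-MCD count and the 384-bit bus come from the leak, the 16 MB per-MCD cache figure is just a placeholder to show what stacking would do):

code:
# Only MCD_COUNT and TOTAL_BUS_WIDTH come from the leak; the per-MCD cache
# size below is a placeholder, not a leaked spec.
MCD_COUNT = 6
TOTAL_BUS_WIDTH = 384                       # bits
bus_per_mcd = TOTAL_BUS_WIDTH // MCD_COUNT  # 64-bit memory controller per MCD

def total_infinity_cache(cache_per_mcd_mb, stacked=False):
    """Total Infinity Cache if each MCD carries cache_per_mcd_mb MB,
    doubled per MCD if a cache die is 3D-stacked on top."""
    return cache_per_mcd_mb * (2 if stacked else 1) * MCD_COUNT

print(bus_per_mcd)                     # 64
print(total_infinity_cache(16))        # 96  MB (placeholder 16 MB/MCD)
print(total_infinity_cache(16, True))  # 192 MB (if stacking doubles it)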

Wiggly Wayne DDS
Sep 11, 2010



repiv posted:

all intel needs to do is catch up with AMD/NVs decade+ of driver heuristics and per-game hacks for making DX11 and earlier go fast, no biggie
beyond making sure the game runs i'd wager their plan is making them run good enough on pre-dx12 apis and focusing on dx12/vulkan going forward

or they just focus time on optimising how their driver handles glide and work their way up

Phone
Jul 30, 2005

I want oyakodon.
Here's something dumb I've been working on, mostly off, for a few months trying to avoid the siren song of a 3080.

I completely missed the mark when I did the math building this PC back in 2019, because a decade of 1920x1200 gaming had been fairly trivial; it was compounded further when I went for a 3440x1440 monitor instead of the 2560x1440 I had been planning around. Turns out that's a lot of pixels, and going over 60fps is real hard. I replaced a 2070 with a 2080 Super right before COVID hit, and it's been mostly OK; however, it's kind of noisy and I don't like the GPU temp sitting that high and/or the fans spinning up.

I picked up a spare heatsink off of eBay and tried to figure out what to hack away to strap some 120mm fans to it. Things were going OK until I learned about the "proprietary 14-pin connector" for the fans on reference-board 20-series PCBs. Arctic does make an adapter from what I can tell; however, it's only to a single output and they don't ship to the US.

I did think about trying to 3d print a shroud; however, I was still running up against how to mount it to the heatsink itself, but zipties...


Left to right: CRJ "PWM to Graphics card" cable, 14-pin cable off of the spare GPU, stock fan leads


Arctic's "proprietary 14-pin adapter"


After an afternoon of looking at datasheets and hitting various dead ends, I figured I would need to make some Molex Pico Blade to KK 254 cables to make everything work. Working with 30 AWG wire sucks.

Crimpin' ain't easy


I would have assumed that the brown wire was ground; it was not


Color matched to the wires off of the NF A12x25 PWM


I didn't account for the MB tray sitting lower than the rest of the panel in the case, so I had some clearance issues and re-ziptied one of the fans up a bit. Unfortunately I didn't gently caress up and it worked on the first try, so my wallet goes uninspected and a 3080 is not in the mail.


At 60% or so on Precision X1, the fans are maxed out at 2k rpm. I only messed with it a little bit to move the fan curves down to prevent them from sitting at 100% speed, but so far so good. House/computer hasn't burned down and it looks like I'm able to sit in boost comfortably. I do need to spend some time figuring out how to properly undervolt this thing and maybe use Afterburner to set the curves and do some weird import into Precision X1. I need to flatten and reinstall real bad, though.

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?
Neat, it’s the kind of project I’d love to have a go at, but I cannot afford to risk a fuckup.

grack
Jan 10, 2012

COACH TOTORO SAY REFEREE CAN BANISH WHISTLE TO LAND OF WIND AND GHOSTS!

What's up fellow "I strapped a pair of 120mm Noctua fans to my 2080's heatsink so it would run quieter" Goon



Instead of trying to run the fans off of the card, I went the much more expedient route and plugged them into an unused motherboard header. I then used Fan Control to set a custom curve based on GPU temps.
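Just to illustrate what that kind of curve amounts to (this is not Fan Control's actual config format, and the points are made up): it's linear interpolation between a few (GPU temp, fan duty) pairs, clamped at both ends.

code:
# Illustrative temp -> duty curve, the same idea Fan Control applies.
# Curve points are made up; tune to taste.
CURVE = [(40, 20), (60, 40), (70, 65), (80, 100)]  # (GPU temp C, duty %)

def fan_duty(temp_c):
    """Linearly interpolate fan duty % for a given GPU temperature."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]

print(fan_duty(65))  # 52.5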

Cygni
Nov 12, 2005

raring to post


can people stop giving this guy info so I don't have to hear from him any more? god he sucks.

Lord Stimperor
Jun 13, 2018

I'm a lovable meme.

I was thinking about the fact that people were, well, whelmed by the Intel GPU lineup. Going by the specs they are really nothing to write home about, it seems. But I'm wondering: the big leap isn't really in thinking about silicon and PCBs, it's making it to release, isn't it? Any company can think up clever architectures. But getting the wheels of a big corporation to turn in the right direction is the actual achievement imo. You have to build teams, start up development on a bunch of different aspects, build up a network of suppliers and customers. To me, having all that business infrastructure working smoothly is just as big an achievement as the actual hardware itself. Even if it's Intel we're talking about, considering how much inertia big corpos can develop.

repiv
Aug 13, 2009

it's better than larrabee!

FuturePastNow
May 19, 2014


The vast majority of cards sold (to gamers, at least) are the low-to-mid-range models, so focusing on those products, instead of trying to make a 3090-tier halo card, might be the right decision. They just have to get the prices right and do a LOT better with game optimization in their drivers. Oh, and get the cards to market 6 months ago lol

wargames
Mar 16, 2008

official yospos cat censor

Lord Stimperor posted:

I was thinking about the fact that people were, well, whelmed by the Intel GPU lineup. Going by the specs they are really nothing to write home about, it seems. But I'm wondering: the big leap isn't really in thinking about silicon and PCBs, it's making it to release, isn't it? Any company can think up clever architectures. But getting the wheels of a big corporation to turn in the right direction is the actual achievement imo. You have to build teams, start up development on a bunch of different aspects, build up a network of suppliers and customers. To me, having all that business infrastructure working smoothly is just as big an achievement as the actual hardware itself. Even if it's Intel we're talking about, considering how much inertia big corpos can develop.

One of the best things I see the Intel GPU team doing right now is sending out technical people to talk with big techtubers, and them going: yes, this isn't going to be the fastest GPU, and yes, the driver stack is VERY young, but "we/Intel" are trying to be a good third option.

shrike82
Jun 11, 2005

I’m not sure why Intel gets brownie points for releasing a GPU when they have a ton more resources than AMD or Nvidia and it’s not like they’re entering a new product market

Maybe if they had managed to use their own fabs but even they had to go to TSMC

WonkyBob
Jan 1, 2013

Holy shit, you own a skirt?!

cheesetriangles posted:

Is the reason everyone talks about buying some 1000 dollar GPU just to play 25 year old games on it because the only people actually buying them are middle-aged people who just want to experience their youth again?

I bought my 1000 pound gpu because I want those sweet RT reflections.

MarcusSA
Sep 23, 2007

They maybe could have gotten some points if they had managed to release anything during the shortage but lol they managed to gently caress that up so lmao

Indiana_Krom
Jun 18, 2007
Net Slacker

shrike82 posted:

I’m not sure why Intel gets brownie points for releasing a GPU when they have a ton more resources than AMD or Nvidia and it’s not like they’re entering a new product market

Maybe if they had managed to use their own fabs but even they had to go to TSMC

Intel has a ton more resources, but not necessarily as many resources being spent on GPUs as either AMD or Nvidia have. Intel is a massively bigger company but they are also way more spread out and doing a lot of things that are entirely different. Basically it wouldn't be the least bit surprising if Nvidia and AMD both have significantly more and smarter people dedicated to GPU design and the accompanying software than Intel does or can even manage due to their manufacturing overhead.

CoolCab
Apr 17, 2005

glem
I don't really care about the fate or character of these megacorps, I'm just here for the content

lih
May 15, 2013

Just a friendly reminder of what it looks like.

We'll do punctuation later.
it would have been incredibly impressive if intel's first-gen gpus were super competitive and appealing. that they aren't isn't some huge surprise - maybe they'll price them low enough to make up for the obvious flaws (drivers, poor efficiency) but if not oh well.

maybe within a few generations the drivers will mature & they'll end up rather competitive, certainly doesn't hurt that they're trying at least

CoolCab
Apr 17, 2005

glem
the big thing that the GN video highlighted for me is that there kind of is no good time to start making drivers, but if there ever was one, the start of the dx12/vulkan period is not terrible at all. dx11 is always going to suck in the absence of a time machine, but maybe they can get their more modern performance on lock and it becomes less important.

pofcorn
May 30, 2011
Drivers matter much less for dx12/vulkan, right? And for lesser-known dx11 stuff, could they do some kind of DXVK, like on Linux?

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

cheesetriangles posted:

Is the reason everyone talks about buying some 1000 dollar GPU just to play 25 year old games on it because the only people actually buying them are middle-aged people who just want to experience their youth again?

listen man we are all going to die, you want to die with lame rear end pixel virgin eyes that never even saw a RT lmao be my guest

that said they need to remake Chrono Trigger with ray tracing.

Shipon
Nov 7, 2005
it might be nostalgia pandering, but I am completely down with remaking games from the 90s and early 2000s with raytraced lighting/reflections; Quake 2 RTX is a great proof of concept for this sort of thing

Indiana_Krom
Jun 18, 2007
Net Slacker

Shipon posted:

it might be nostalgia pandering, but I am completely down with remaking games from the 90s and early 2000s with raytraced lighting/reflections; Quake 2 RTX is a great proof of concept for this sort of thing

Raytraced Descent

shrike82
Jun 11, 2005

Depends on the type of game; I couldn't get into the original Deus Ex because of how clunky it is

And I played it when it released back in the day

Yaoi Gagarin
Feb 20, 2014

I wonder if Rogue Squadron would benefit from raytracing.

Dr. Fishopolis
Aug 31, 2004

ROBOT
one thing i like about the intel arc launch is that the engineers have been very vocal on twitter about how much it sucks and you shouldn't be too excited about it.

refreshing honesty and coming from intel of all places, it's shocking.

hobbesmaster
Jan 28, 2008

Indiana_Krom posted:

Raytraced Descent

The thing about Quake RTX is that hardware-accelerated id engines had been open sourced while still somewhat relevant, so a lot of people who wanted to learn about game renderers and new features gently caress around with the engines. Just look at this list! https://quakeengines.github.io

Descent 1 and 2 were software rendered only. There are source ports today for Descent 1/2 that use OpenGL, but they're pretty rudimentary. While Quake came out only a year after Descent, 3D rendering was moving fast.

Overload is built off Unity 5 from 2015. Maybe try and get that dev team back together for one more gig ;)

hobbesmaster fucked around with this message at 03:22 on Jul 18, 2022


repiv
Aug 13, 2009

nvidia have painted themselves into a corner with idtech remasters unfortunately; their researchers have come up with some much-improved algorithms since Q2RTX, but that newer stuff got productized into proprietary middleware that they can't legally link with the GPL idtech releases

same reason why there's no DLSS in Q2RTX, nvidia's license and the GPL don't mix

repiv fucked around with this message at 03:38 on Jul 18, 2022
