|
cheesetriangles posted:Is the reason everyone talks about buying some 1000 dollar gpu just to play 25 year old games on it because the only people actually buying them is middle age people that just want to experience their youth again. Yes. And diablo runs great! Honestly, the technology is amazing these days and I love to tinker with PCs. Top end GPUs were something I wasn't able to afford way back when. I'm still using speakers I got when I built my first PC back in 2001.
|
# ? Jul 17, 2022 06:55 |
|
Yeah this will be the first time I buy a gpu over about $200 and most of my playtime now is eu4 and league.
|
|
# ? Jul 17, 2022 07:01 |
|
cheesetriangles posted:Is the reason everyone talks about buying some 1000 dollar gpu just to play 25 year old games on it because the only people actually buying them is middle age people that just want to experience their youth again. The reason is that this is an internet forum from 20 years ago, and shockingly enough most of the people here are old as gently caress.
|
# ? Jul 17, 2022 07:11 |
|
lol, at least we're not collecting GPUs from that era. The closest would be buying CRTs, and even so there's some practical value to doing that
|
# ? Jul 17, 2022 07:12 |
K8.0 posted:The reason is that this is an internet forum from 20 years ago, and shockingly enough most of the people here are old as gently caress. I see it in other places, but everyone there is probably old too, I guess.
|
|
# ? Jul 17, 2022 07:21 |
|
i'll have you know that my 1000 dollar gpu is being put to good use playing the new release "last call bbs"
|
# ? Jul 17, 2022 08:09 |
|
Bioshock Remastered technically counts as a game released in the last decade
|
# ? Jul 17, 2022 08:37 |
|
cheesetriangles posted:Is the reason everyone talks about buying some 1000 dollar gpu just to play 25 year old games on it because the only people actually buying them is middle age people that just want to experience their youth again. I was going to post in this thread about other stuff but then I saw your post and felt seen (it's actually 26 years old now)
|
# ? Jul 17, 2022 09:15 |
|
When FEAR first came out I could barely run it at the time. I played it again last year and it was an absolute revelation. You just can't beat the old classics. FEAR will be 18 next year. Time flies, it's absurd
|
# ? Jul 17, 2022 09:31 |
|
Indeed. I love being able to run older games maxed at 4k60.
|
# ? Jul 17, 2022 10:09 |
|
Dr. Video Games 0031 posted:From MLID a couple days ago: intel denies that A780 exists or has ever been on the cards, so MLID or Intel are lying (probably MLID) https://twitter.com/ryanshrout/status/1548430503644057605?t=J8MnFihIm3-4UYZ29_836Q&s=19
|
# ? Jul 17, 2022 10:59 |
|
repiv posted:intel denies that A780 exists or has ever been on the cards, so MLID or Intel are lying (probably MLID) lmfao https://twitter.com/mooreslawisdead/status/1548528154020569088
|
# ? Jul 17, 2022 11:11 |
|
i've been replaying Persona 3 on the Deck. It's great but it ran great on PSP and then Vita over a decade ago so
|
# ? Jul 17, 2022 11:31 |
|
the A780 rumour never made that much sense. it had the same specs as the A770, so it could only have been differentiated by power limit and binning. that may make sense for a halo part, but that chip isn't going to touch AMD/NV's flagships no matter how much they juice it
|
# ? Jul 17, 2022 11:39 |
|
I missed this, but a linux driver update from a few days ago seems to have confirmed the Navi 31 chiplet count: https://videocardz.com/newz/amd-rdna3-navi-31-gpu-with-six-mcds-to-feature-384-bit-wide-memory-bus The earlier rumor about there being many memory controller dies is seemingly true. Looks like it'll be one compute die (5nm) surrounded by 6 MCDs (6nm). The MCDs will also contain the infinity cache, and there's a possibility 3D stacking could be used to double the cache count per MCD, though this seems more speculative. So only one compute die, but there's a lot of die area being split off into those MCDs that will presumably be quite cheap to manufacture since each one will be small and on 6nm.
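For a rough sense of the arithmetic behind that layout, here's a back-of-the-envelope sketch. The per-MCD bus width follows from the 384-bit total in the leak; the cache-per-MCD figure is an assumption for illustration, not a confirmed spec:

```python
# Back-of-the-envelope math for the rumored Navi 31 chiplet layout.
# Numbers marked "assumed" are illustrative, not confirmed specs.
MCD_COUNT = 6
BUS_WIDTH_PER_MCD = 64    # bits; one 64-bit GDDR6 controller per MCD
CACHE_PER_MCD_MB = 16     # assumed Infinity Cache slice per MCD (speculative)

total_bus = MCD_COUNT * BUS_WIDTH_PER_MCD
base_cache = MCD_COUNT * CACHE_PER_MCD_MB
stacked_cache = base_cache * 2  # if 3D stacking doubled the cache per MCD

print(total_bus)       # 384 (bits), matching the leaked driver update
print(base_cache)      # 96 (MB) under the assumed slice size
print(stacked_cache)   # 192 (MB) with hypothetical stacking
```

The point being that the bus width alone pins down the MCD count: six 64-bit controllers is the only tidy way to get to 384-bit.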
|
# ? Jul 17, 2022 11:46 |
|
repiv posted:all intel needs to do is catch up with AMD/NVs decade+ of driver heuristics and per-game hacks for making DX11 and earlier go fast, no biggie or they just focus time on optimising how their driver handles glide and work their way up
|
# ? Jul 17, 2022 11:54 |
|
Here's something dumb I've been working on, mostly off, for a few months trying to avoid the siren song of a 3080. I completely missed the mark on the math when I built this PC initially in 2019, because a decade of 1920x1200 gaming had been fairly trivial; this was compounded further when I went for a 3440x1440 monitor instead of the 2560x1440 I had been planning around. Turns out that's a lot of pixels, and going over 60fps is real hard. I replaced a 2070 with a 2080 Super right before COVID hit, and it's been mostly OK; however, it's kind of noisy and I don't like the GPU temp sitting that high and/or the fans spinning up.

I picked up a spare heatsink off of eBay and tried to figure out what to hack away to strap some 120mm fans to it. Things were going OK until I learned about the "proprietary 14-pin connector" for the fans on reference-board 20 series PCBs. Arctic does make an adapter from what I can tell; however, it's only to a single output and they don't ship to the US. I did think about trying to 3D print a shroud; however, I was still running up against how to mount it to the heatsink itself, but zipties...

Left to right: CRJ "PWM to Graphics card" cable, 14-pin cable off of the spare GPU, stock fan leads

Arctic's "proprietary 14-pin adapter"

After an afternoon of looking at datasheets and various dead ends, I guessed that I would need to make some Molex Pico Blade to KK 254 cables to make everything work. Working with 30awg wire sucks.

Crimpin' ain't easy

I would have assumed that the brown wire was ground; it was not

Color matched to the wires off of the NF A12x25 PWM

I didn't account for the MB tray being lower than the rest of the panel in the case, so I had some clearance issues and re-ziptied one of the fans up a bit. Unfortunately I didn't gently caress up and it worked on the first try, so my wallet goes uninspected and a 3080 is not in the mail. At 60% or so on Precision X1, the fans are maxed out at 2k rpm.

I only messed with it a little bit to move the fan curves down to prevent them from sitting at 100% speed, but so far so good. House/computer hasn't burned down, and it looks like I'm able to sit in boost comfortably. I do need to spend some time figuring out how to properly undervolt this thing, and maybe use Afterburner to set the curves and do some weird import into Precision X1. I need to flatten and reinstall real bad, though.
|
# ? Jul 17, 2022 18:39 |
|
Neat, it’s the kind of project I’d love to have a go at, but I cannot afford to risk a fuckup.
|
# ? Jul 17, 2022 18:47 |
|
What's up fellow "I strapped a pair of 120mm Noctua fans to my 2080's heatsink so it would run quieter" Goon Instead of trying to run the fans off of the card, I went the much more expedient route and plugged them in to an unused motherboard header. I then used Fan Control to set a custom curve based on GPU temps.
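For anyone curious what a curve like that does under the hood, here's a minimal sketch of the temperature-to-duty interpolation a tool like Fan Control applies. The breakpoints below are made-up example values, not anyone's actual settings:

```python
# Sketch of a custom GPU-temp -> fan-duty curve, linearly interpolated
# between breakpoints. Breakpoint values are illustrative only.
def fan_percent(temp_c, curve=((40, 30), (60, 50), (75, 80), (85, 100))):
    """Map GPU temperature (deg C) to fan duty (%) via linear interpolation."""
    if temp_c <= curve[0][0]:
        return curve[0][1]      # below the curve: idle floor
    if temp_c >= curve[-1][0]:
        return curve[-1][1]     # above the curve: pinned at max
    for (t0, p0), (t1, p1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            return p0 + (p1 - p0) * (temp_c - t0) / (t1 - t0)

print(fan_percent(40))    # 30
print(fan_percent(67.5))  # 65.0
print(fan_percent(90))    # 100
```

Driving case fans off a motherboard header with a curve keyed to GPU temperature, as described above, gets you this behavior without touching the card's own fan connector at all.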
|
# ? Jul 17, 2022 20:10 |
|
can people stop giving this guy info so i dont have to hear from him any more? god he sucks.
|
# ? Jul 17, 2022 20:20 |
|
I was thinking about the fact that people were, well, whelmed by the Intel GPU lineup. Going by the specs they are really nothing to write home about, it seems. But I'm wondering: the big leap isn't really in thinking about silicon and PCBs, it's making it to release, isn't it? Any company can think up clever architectures. But getting the wheels of a big corporation to turn in the right direction is the actual achievement imo. You have to build teams, start up development on a bunch of different aspects, build up a network of suppliers and customers. To me, having all that business infrastructure working smoothly is just as big an achievement as the actual hardware itself. Even if it's Intel we're talking about, considering how much inertia big corpos can develop.
|
# ? Jul 17, 2022 20:31 |
|
it's better than larrabee!
|
# ? Jul 17, 2022 20:38 |
|
The vast majority of cards sold (to gamers, at least) are the low to mid range models so focusing on those products, instead of trying to make a 3090-tier halo card, might be the right decision. They just have to get the prices right and do a LOT better with the driver game optimization. Oh and get the cards to market 6 months ago lol
|
# ? Jul 17, 2022 20:54 |
|
Lord Stimperor posted:I was thinking about the fact that people were, well, whelmed by the Intel GPU lineup. Going by the specs they are really nothing to write home about, it seems. But I'm wondering: the big leap isn't really in thinking about silicon and PCBs, it's making it to release, isn't it? Any company can think up clever architectures. But getting the wheels of a big corporation to turn in the right direction is the actual achievement imo. You have to build teams, start up development on a bunch of different aspects, build up a network of suppliers and customers. To me, having all that business infrastructure working smoothly is just as big an achievement as the actual hardware itself. Even if it's Intel we're talking about, considering how much inertia big corpos can develop. one of the best things i see the intel gpu team doing is sending out technical people to talk with the big techtubers, and them going "yes, this isn't going to be the fastest gpu, and yes, the driver stack is VERY young, but we/intel are trying to be a good third option."
|
# ? Jul 17, 2022 21:48 |
|
I’m not sure why Intel gets brownie points for releasing a GPU when they have a ton more resources than AMD or Nvidia, and it’s not like they’re entering a new product market. Maybe if they had managed to use their own fabs, but even they had to go to TSMC
|
# ? Jul 17, 2022 21:54 |
|
cheesetriangles posted:Is the reason everyone talks about buying some 1000 dollar gpu just to play 25 year old games on it because the only people actually buying them is middle age people that just want to experience their youth again. I bought my 1000 pound gpu because I want those sweet RT reflections.
|
# ? Jul 17, 2022 22:08 |
|
They maybe could have gotten some points if they had managed to release anything during the shortage but lol they managed to gently caress that up so lmao
|
# ? Jul 17, 2022 22:08 |
|
shrike82 posted:I’m not sure why Intel gets brownie points for releasing a GPU when they have a ton more resources than AMD or Nvidia and it’s not like they’re entering a new product market Intel has a ton more resources, but not necessarily as many resources being spent on GPUs as either AMD or Nvidia have. Intel is a massively bigger company but they are also way more spread out and doing a lot of things that are entirely different. Basically it wouldn't be the least bit surprising if Nvidia and AMD both have significantly more and smarter people dedicated to GPU design and the accompanying software than Intel does or can even manage due to their manufacturing overhead.
|
# ? Jul 17, 2022 22:24 |
|
I don't really care about the fate or character of these megacorps, I'm just here for the content
|
# ? Jul 17, 2022 22:25 |
|
it would have been incredibly impressive if intel's first-gen gpus were super competitive and appealing. that they aren't isn't some huge surprise - maybe they'll price them low enough to make up for the obvious flaws (drivers, poor efficiency) but if not oh well. maybe within a few generations the drivers will mature & they'll end up rather competitive, certainly doesn't hurt that they're trying at least
|
# ? Jul 17, 2022 22:33 |
|
the big thing that the gn video highlighted for me is that there kind of is no good time to start making drivers, but if there ever was one, the start of the dx12/vulkan period is not terrible at all. dx11 is always going to suck in the absence of a time machine, but maybe they can get their more modern performance on lock and it becomes less important.
|
# ? Jul 17, 2022 22:35 |
|
Drivers matter much less for dx12/vulkan, right? And for lesser-known dx11 stuff, could they do some kind of dxvk, like on linux?
|
# ? Jul 17, 2022 22:44 |
|
cheesetriangles posted:Is the reason everyone talks about buying some 1000 dollar gpu just to play 25 year old games on it because the only people actually buying them is middle age people that just want to experience their youth again. listen man we are all going to die, you want to die with lame rear end pixel virgin eyes that never even saw a RT lmao be my guest that said they need to remake Chrono Trigger with ray tracing.
|
# ? Jul 18, 2022 00:35 |
|
it might be nostalgia pandering but i am completely down with remaking games from the 90s and early 2000s but with raytraced lighting/reflections, quake 2 rtx is a great proof of concept for this sort of thing
|
# ? Jul 18, 2022 01:06 |
|
Shipon posted:it might be nostalgia pandering but i am completely down with remaking games from the 90s and early 2000s but with raytraced lighting/reflections, quake 2 rtx is a great proof of concept for this sort of thing Raytraced Descent
|
# ? Jul 18, 2022 01:58 |
|
Depends on the type of game. I couldn’t get into the original Deus Ex because of how clunky it is, and I played it when it released back in the day
|
# ? Jul 18, 2022 02:08 |
|
I wonder if Rogue Squadron would benefit from raytracing.
|
# ? Jul 18, 2022 02:09 |
|
one thing i like about the intel arc launch is that the engineers have been very vocal on twitter about how much it sucks and you shouldn't be too excited about it. refreshing honesty and coming from intel of all places, it's shocking.
|
# ? Jul 18, 2022 02:12 |
|
Indiana_Krom posted:Raytraced Descent The thing about Quake RTX is that the hardware-accelerated id engines had been open sourced while still somewhat relevant, so a lot of people who wanted to learn about game renderers and new features gently caress around with those engines. Just look at this list! https://quakeengines.github.io Descent 1 and 2 were software rendered only. There are source ports today for Descent 1/2 that use OpenGL, but it's pretty rudimentary. While Quake came out only a year after Descent, 3D rendering was moving fast. Overload is built off Unity 5 from 2015. Maybe try and get that dev team back together for one more gig hobbesmaster fucked around with this message at 03:22 on Jul 18, 2022 |
# ? Jul 18, 2022 03:18 |
|
|
nvidia have painted themselves into a corner with idtech remasters, unfortunately. their researchers have come up with some much-improved algorithms since Q2RTX, but that newer stuff got productized into proprietary middleware that they can't legally link with the GPL idtech releases. same reason why there's no DLSS in Q2RTX: nvidia's license and the GPL don't mix repiv fucked around with this message at 03:38 on Jul 18, 2022 |
# ? Jul 18, 2022 03:35 |