|
I just wanna max everything out in VR and not have to be all "hmm, well, Beat Saber's fine, but better turn down the resolution for Skyrim, better spend more time looking at framerates than actually playing," and I'm willing to pay a lot for it. I don't care, 4080, just be that
|
# ? Aug 4, 2022 12:21 |
|
If you care about power, the 4000 series makes more sense. You can get the same performance at less power thanks to the smaller process.
|
|
# ? Aug 4, 2022 15:01 |
|
cheesetriangles posted:If you care about power, the 4000 series makes more sense. You can get the same performance at less power thanks to the smaller process.

I hope this is true; I'm generally trying to reduce power draw across my system. I guess undervolted 3000 series cards would probably get me an OK result on power as well.
|
# ? Aug 4, 2022 15:09 |
|
The Asrock version of the Arc A380 apparently costs $192 in China. That is not a compelling price for that hardware. It has a single-slot cooler and four display outputs sprouting directly from the PCB, yet for some reason it has a two-slot PCI slot cover.

I run Linux on the desktop. I rarely play computer games. I use a discrete card to get multiple display outputs. Even with Intel's strong history of open source graphics drivers compared to AMD and Nvidia, $200 is way too much money for the weak performance on offer. Make it $100 and I'll bite.

Is there any reason to believe the two higher cards (the i5 and i7 ones) will be any less reliant on ReBAR?

PBCrunch fucked around with this message at 19:01 on Aug 4, 2022 |
# ? Aug 4, 2022 18:31 |
|
it also doesn't have HDMI 2.1 :/
|
# ? Aug 4, 2022 18:45 |
|
PBCrunch posted:The Asrock version of the Arc A380 apparently costs $192 in China. That is not a compelling price for that hardware. Yeah, that kind of money gets you a GTX 1660 even in this moment's hosed up graphics card economy. I bet we don't get a US launch at all.
|
# ? Aug 4, 2022 19:06 |
|
I would not compare prices in one country to prices in another. They will be different once they hit store shelves in the US. That said...

https://www.techpowerup.com/297490/intel-arc-board-partners-are-reportedly-stopping-production-encountering-quality-issues

I'll just quote the whole article since it's short:

quote:According to sources close to Igor Wallossek from Igor's Lab, Intel's upcoming Arc Alchemist discrete graphics card lineup is in trouble. As the anonymous sources state, certain add-in board (AIB) partners are having difficulty adopting the third GPU manufacturer into their offerings. As we learn, AIBs are sitting on a pile of NVIDIA and AMD GPUs. This pile is decreasing in price daily and losing value, so it needs to be moved quickly. Secondly, Intel is reportedly suggesting AIBs ship cards to OEMs and system integrators to start the market spread of the new Arc dGPUs. This business model is inherently lower margin compared to selling GPUs directly to consumers.

Oof. Not all AIBs will be in this same position (e.g. previously AMD-exclusive AIBs like ASRock), but I have been wondering how Intel would manage to step into the game while most potential board partners are oversupplied with competing products.
|
# ? Aug 4, 2022 21:13 |
|
https://www.youtube.com/watch?v=G2SP9u5ke1k Are people just getting ARC cards now?
|
# ? Aug 5, 2022 00:08 |
|
change my name posted:https://www.youtube.com/watch?v=G2SP9u5ke1k I thought Intel was giving them to everyone, but they could only publish A380 numbers.
|
# ? Aug 5, 2022 00:10 |
|
I don't remember who it was, but I've seen some people accuse some of these accounts posting split-screen performance comparisons of faking the results. Considering this would be the first time we've ever heard of anyone having the A770 or A750, I'm very suspicious of that video. As far as I know, nobody's even managed to test them in China yet.
|
# ? Aug 5, 2022 00:12 |
|
teagone posted:I thought Intel was giving them to everyone, but they could only publish A380 numbers. I don't think this is the case. They had the A750 running on some test benches at LTT and GamersNexus (maybe elsewhere?), but they haven't been supplying anyone with review samples. Those were just demo units that they went on a media tour with, basically.
|
# ? Aug 5, 2022 00:15 |
|
Dr. Video Games 0031 posted:I don't think this is the case. They had the A750 running on some test benches at LTT and GamersNexus (maybe elsewhere?), but they haven't been supplying anyone with review samples. Those were just demo units that they went on a media tour with, basically. Ahh, ok. I only watch GN, and had just assumed Intel was giving them to everyone after I saw LTT had their hands on an A770 or A750, lol.
|
# ? Aug 5, 2022 00:18 |
|
yeah, that's absolutely a fake benchmarker. they're barely even making an effort; if you look at the other videos on their channel, they use exactly the same gameplay footage for every video, just with a different set of AVG and 1% low numbers overlaid on top.

the more competent fake benchmark channels at least go to the effort of downclocking their card and recording actual gameplay to plausibly approximate footage recorded on a weaker card

repiv fucked around with this message at 00:30 on Aug 5, 2022 |
# ? Aug 5, 2022 00:23 |
|
I mean they have a joker as the splash screen at the start of the video. That should tell you everything.
|
# ? Aug 5, 2022 00:31 |
|
we live in a society
|
# ? Aug 5, 2022 00:34 |
|
Is that joker society or costanza society
|
# ? Aug 5, 2022 01:14 |
|
The MSI Mech 6600 XT is down to $300 after rebate: https://www.newegg.com/msi-radeon-rx-6600-xt-rx6600xt-mech2x-8goc/p/N82E16814137682?Item=N82E16814137682

The MSI Mech 6600 non-XT is down to $250 after rebate, but get the Sapphire Pulse for $250 without any mail-in rebate fuckery instead: https://www.newegg.com/sapphire-radeon-rx-6600-11310-04-20g/p/N82E16814202415?quicklink=true

It's nice to see actual midrange pricing for midrange cards.

edit: The Pulse linked above is a version with just one DP port and one HDMI port, so that's something to be aware of. There's a version with two more DP ports that costs $20 more.

Dr. Video Games 0031 fucked around with this message at 04:14 on Aug 5, 2022 |
# ? Aug 5, 2022 02:34 |
|
adding another 150W to my 3090 will maybe make me spend, like, $5-10 more a month in electric bills. big whoop

a lot of people are making a big deal out of this, and i don't understand what they're trying to say. are you trying to say they just shouldn't bother adding more performance? because things have changed; you're not getting the performance gains from node shrinks like you used to. want more performance? better expect more power draw, for the most part
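For rough numbers, the arithmetic behind an estimate like that is simple; here's a sketch where the 150W delta, the hours per day, and the $/kWh rates are all assumptions you'd swap for your own:

```python
# Back-of-envelope monthly cost of extra GPU power draw.
# All inputs are assumptions -- plug in your own numbers.

def monthly_cost(extra_watts, hours_per_day, price_per_kwh, days=30):
    """Extra electricity cost per month for a constant added load."""
    kwh = extra_watts / 1000 * hours_per_day * days
    return kwh * price_per_kwh

# 150W extra at 4h of gaming a day, at an assumed US rate of ~$0.15/kWh:
print(f"${monthly_cost(150, 4, 0.15):.2f}/month")
# Same 150W at 6h/day under an assumed crisis-era European rate of ~0.40/kWh:
print(f"EUR {monthly_cost(150, 6, 0.40):.2f}/month")
```

Light use at US rates lands under $5/month; heavy use at high European rates lands around $10, which is why the answer differs so much by country.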
|
# ? Aug 5, 2022 03:42 |
|
Shipon posted:adding another 150W to my 3090 will maybe make me spend, like, 5-10 more a month in electric bills. big whoop

What they're trying to say is very simple to understand, because they've already said it quite clearly: they want to either reduce or avoid increasing their power consumption. That's all. It's not an unreasonable desire, either, especially if you live in Europe, where electricity prices have spiked.
|
# ? Aug 5, 2022 03:53 |
|
Shipon posted:adding another 150W to my 3090 will maybe make me spend, like, 5-10 more a month in electric bills. big whoop

A noticeably super hot gaming PC is a pain in the rear end for the hot months of the year, regardless of power. It feels like doing a roast in the oven!

Also, you don't have to have the silicon pushed way up the power curve. Apple is shipping gigantic dies that are matching high-end Intel stuff while using 1/3 the wattage, and big GPUs that should be somewhere in the ballpark of a 3060, but nobody's distributing games for them so they only get used for compute or video editing. They're only doing it by having stonking huge dies, 8-channel LPDDR5, and a low-power-targeted process.

I guess what you could say about modern gaming GPUs is that you can just undervolt them yourself if you want to use 25% less power in exchange for giving up 5% performance.
|
# ? Aug 5, 2022 04:16 |
|
Twerk from Home posted:A noticeably super hot gaming PC is a pain in the rear end for the hot months of the year, regardless of power. It feels like doing a roast in the oven! There's a limit to how far undervolting and power limiting goes on modern GPUs, though. As an experiment, I tried to see how much performance I could wring out of the 3080 Ti at 75W, only to find that it's basically impossible to run any modern 3D games on it and have it draw less than 150W, no matter what you're doing to the voltage curve and power limits. At a certain point, it just started ignoring my power limits and drew 150W anyway, and performance was kind of all over the place. Buying a more expensive GPU to bring it down one or two performance tiers feels bad, anyway, so current undervolting practices focus on maintaining or slightly improving stock performance while reducing power draw instead. And something I've discovered recently is that 4K causes the card to draw way more current than 1440p, so undervolting doesn't even help with reducing power draw much at that resolution (though it does improve performance). Dr. Video Games 0031 fucked around with this message at 11:33 on Aug 5, 2022 |
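The floor described in the post above falls out of how dynamic power scales: roughly P ∝ C·V²·f, and once you hit the minimum stable voltage, the only lever left is frequency, which costs performance directly. A toy model makes the shape of it visible; every constant here is invented for illustration and is not a measurement of any real card:

```python
# Toy dynamic-power model: P = k * V^2 * f (illustrative constants only).
# Shows why undervolting pays off at first (voltage enters squared) and
# then hits a floor: at the minimum stable voltage, further power savings
# require frequency cuts, which cost performance roughly one-for-one.

K = 0.1       # arbitrary scaling constant for this toy model
V_MIN = 0.70  # assumed minimum stable voltage (volts)

def power(voltage, freq_ghz, k=K):
    """Dynamic power in 'watts' for this toy model."""
    return k * voltage**2 * freq_ghz * 1000

stock = power(1.05, 1.9)   # stock-ish point: 1.05V @ 1.9GHz
uv = power(0.85, 1.8)      # undervolted: big power cut, small clock cut
floor = power(V_MIN, 1.0)  # at V_MIN, power now only falls with clocks

print(f"stock: {stock:.0f}W, undervolted: {uv:.0f}W, near-floor: {floor:.0f}W")
```

Real cards are messier than this (static leakage, memory and VRM power, firmware-enforced minimums), which is why measured draw can refuse to go below a threshold no matter what the curve editor says.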
# ? Aug 5, 2022 04:27 |
|
I seem to remember the 4070 and the 4070 Ti were so close in specs that it wasn't worth buying one of the models given the price difference, but I can't remember which model was the one to buy. Can anyone confirm? Also, are all Nvidia-brand cards Founders Edition? There's one on the Best Buy app that is Nvidia, but the box picture doesn't say Founders.

edit: this is for rendering, I don't care about game performance

Listerine fucked around with this message at 06:40 on Aug 5, 2022 |
# ? Aug 5, 2022 06:27 |
|
Listerine posted:I seem to remember the 4070 and the 4070Ti were so close in specs so it wasn't worth buying one of the models given the price difference, but I can't remember which model was the one to buy, can anyone confirm?

Assuming you mean the 3070/3070 Ti, the one to buy is the 3070. If they're almost identical in performance, then it stands to reason that you should just get the cheaper one, after all.

If the card itself is manufactured by Nvidia, then it is a Founders Edition. The "Founders Edition" branding isn't displayed prominently on the box or the Best Buy store listings, it seems, but this is an FE, for instance. They all have that same general design, with fans embedded into a large heatsink and no tacky plastic shroud.
|
# ? Aug 5, 2022 06:46 |
|
Dr. Video Games 0031 posted:Assuming you mean the 3070/3070 Ti, the one to buy is the 3070. If they're almost identical in performance, then it stands to reason that you should just get the cheaper one, after all. Doh, that's exactly what I meant (but I'd certainly take a 4070 tomorrow if I could!). Thanks!
|
# ? Aug 5, 2022 07:40 |
|
Dr. Video Games 0031 posted:I would not compare prices in one country to prices in another. They will be different once they hit store shelves in the US. That said... Oh my god. I hope Intel figures things out by the end, because it’s looking like a disaster atm.
|
# ? Aug 5, 2022 07:47 |
|
Shipon posted:adding another 150W to my 3090 will maybe make me spend, like, 5-10 more a month in electric bills. big whoop like i'm going to care about the opinion of someone who bought a 3090 lol
|
# ? Aug 5, 2022 08:24 |
|
We’re almost 2 years into the next-gen consoles, but nobody can buy the things, and we’re still seeing big titles go multi-gen. I’m not trying to do the Bill Gates “640K of RAM is enough for everyone” thing, but unless you need 4K RT at 120Hz with no DLSS, or some crazy high-end VR or rendering use case, when are we going to see games that actually need this much power?
|
# ? Aug 5, 2022 09:57 |
|
bloodysabbath posted:We’re almost 2 years into next gen systems but nobody can buy the things and we’re still seeing big titles go multi gen. I’m not trying to do the Bill Gates “640k Ram is enough for everyone” thing, but unless you need 4K RT 120hz no DLSS or some crazy high VR or rendering use case, when are we going to see games that actually need this much power? I need a 4090 because clicking the "This thing halves your fps for no perceivable gain" checkbox makes my fps go bad
|
# ? Aug 5, 2022 10:33 |
|
if you had a game that was running at a capped FPS [and could hit that cap], and assuming everything else being equal, you would expect that moving from one generation of card to the next would consume less power, right?
|
# ? Aug 5, 2022 11:00 |
|
gradenko_2000 posted:if you had a game that was running at a capped FPS [and could hit that cap], and assuming everything else being equal, you would expect that moving from one generation of card to the next would consume less power, right?

In general, yeah; as mentioned earlier, smaller nodes consume less power. The GTX 1080 actually had a lower TDP than the GTX 780 (180W vs 250W) while obviously performing much better, because it's on a newer, more efficient process. If you capped a game at the same framerate on both GPUs, the 1080 would draw significantly less power than the 780, since it's not working as hard to hit the cap. There's a small number of exceptions to this, I'm sure, but as a rule the next generation of cards needs less power to reach the same framerate as the previous generation.
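One way to see the capped-framerate case: if perf/W improves between generations, the energy spent per frame drops, so at a shared cap the faster card draws less. A rough sketch, with made-up round numbers standing in for real cards and ignoring idle/static power:

```python
# Energy-per-frame view of a framerate cap (illustrative numbers only).
# Uncapped, a card spends (watts / fps) joules per frame; at a shared cap,
# estimated draw = joules_per_frame * capped_fps.

def capped_power(uncapped_fps, uncapped_watts, cap_fps):
    """Estimated draw at a framerate cap, assuming power scales with work done."""
    joules_per_frame = uncapped_watts / uncapped_fps
    return joules_per_frame * cap_fps

# Hypothetical old card: 80 fps at 240W. Hypothetical new card: 140 fps at 240W.
old = capped_power(80, 240, 60)   # 3.0 J/frame * 60 = 180W
new = capped_power(140, 240, 60)  # ~1.71 J/frame * 60 = ~103W
print(f"old: {old:.0f}W, new: {new:.0f}W at a 60fps cap")
```

Power doesn't scale perfectly linearly with work on real hardware (leakage, memory, fans), so treat this as a first-order estimate rather than a prediction.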
|
# ? Aug 5, 2022 11:19 |
|
ijyt posted:like i'm going to care about the opinion of someone who bought a 3090 lol
|
# ? Aug 5, 2022 14:01 |
|
bloodysabbath posted:We’re almost 2 years into next gen systems but nobody can buy the things and we’re still seeing big titles go multi gen. I’m not trying to do the Bill Gates “640k Ram is enough for everyone” thing, but unless you need 4K RT 120hz no DLSS or some crazy high VR or rendering use case, when are we going to see games that actually need this much power? Some people like to play MSFS I guess
|
# ? Aug 5, 2022 14:46 |
|
Twerk from Home posted:A noticeably super hot gaming PC is a pain in the rear end for the hot months of the year, regardless of power. It feels like doing a roast in the oven!

the cost definitely isn't a rational concern for me in the US, but it seems super wasteful to blast out more wattage through my little space heater in the summer only to then spend even more power cooling the apartment back down
|
# ? Aug 5, 2022 15:46 |
|
that heat only really sucks for me maybe two or three months, most of the year it's literally all the heating i pay for otherwise
|
# ? Aug 5, 2022 15:51 |
|
CoolCab posted:that heat only really sucks for me maybe two or three months, most of the year it's literally all the heating i pay for otherwise For real?
|
# ? Aug 5, 2022 15:57 |
|
Rinkles posted:For real? yeah. my name isn't ironic, i used to live in the arctic. where i live now still isn't that cold to me, even twenty years later. i usually just use enough heat so the pipes don't freeze.
|
# ? Aug 5, 2022 16:00 |
|
This is why you don’t buy GPUs that were used for mining. You could just end up with some ash!
|
# ? Aug 5, 2022 16:23 |
|
i was chewing that over in the other thread. those are gamerocks right? i think they're at least 3080s but i could be wrong. aren't those fans blowing air directly into the exhaust vent
|
# ? Aug 5, 2022 16:25 |
|
like, what is the story with those 120s mounted horizontally. fuckin environmental storytelling but in miner pic form

e: rgb 120s! those cost a fortune WHY
|
# ? Aug 5, 2022 16:27 |
|
|
MarcusSA posted:This is why you don’t buy GPUs that were used for mining Maybe it’s arson fraud after the crypto downturn
|
# ? Aug 5, 2022 16:30 |