repiv
Aug 13, 2009

Subjunctive posted:

This long predates Tim’s reign. I once had a visit from Apple engineers to talk about codecs, and they showed up with lawyers in tow (fine) who wouldn’t tell us their names or sign in on anything (wtf?).

E: The original WebGL was co-edited by an Apple engineer, which seemed weird at the time, and I wonder if that’s coming back to bite Khronos. They were an enormous pain about <canvas> too, in the WHATWG days.

opencl was also a khronos+apple deal, originally developed by apple then put up for standardization, until it eventually fizzled out and they pursued metal instead


Yudo
May 15, 2003

wargames posted:

the 7900xt at ~$750 isn't terribad but we should really be expecting cards to max out at $600, this generation's pricing still has the stink of crypto.

There have been $600+ cards for years, easily predating the crypto boom. Given their product stack, the 7900xt should have been an $800 card at launch, settling in at $700-$750. In fact, at $700 it would sell really well, but AMD is not serious about market share in consumer discrete graphics.

wargames
Mar 16, 2008

official yospos cat censor

Yudo posted:

There have been $600+ cards for years, easily predating the crypto boom. Given their product stack, the 7900xt should have been an $800 card at launch, settling in at $700-$750. In fact, at $700 it would sell really well, but AMD is not serious about market share in consumer discrete graphics.

I don't disagree; their real money maker in graphics is APUs and datacenter, discrete is just a test bed for those two markets.

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

wargames posted:

I don't disagree; their real money maker in graphics is APUs and datacenter, discrete is just a test bed for those two markets.

Are AMD data center GPUs that big? From what I've seen, it's CUDA or bust. Nvidia is the only option from most rental vendors, whether cloud or smaller.

Dr. Video Games 0031
Jul 17, 2004

Datacenter uses a different architecture now anyway (CDNA), and I'm not sure how much overlap there is with RDNA. RDNA is primarily developed for consoles though, and desktop graphics cards are just a bonus. I think AMD is also going to try to make a big APU for notebooks, though it's hard to imagine anyone effectively competing with apple in that space.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Dr. Video Games 0031 posted:

Datacenter uses a different architecture now anyway (CDNA), and I'm not sure how much overlap there is with RDNA. RDNA is primarily developed for consoles though, and desktop graphics cards are just a bonus. I think AMD is also going to try to make a big APU for notebooks, though it's hard to imagine anyone effectively competing with apple in that space.

It doesn’t matter if they can effectively compete, as long as they keep buying APUs while they try!

Yudo
May 15, 2003

Twerk from Home posted:

Are AMD data center GPUs that big? From what I've seen, it's CUDA or bust. Nvidia is the only option from most rental vendors, whether cloud or smaller.

From what I know of the AI side of things, they are not. CDNA does power some monster supercomputers though, and MI300 looks pretty good, hardware-wise. The problem is that AMD's software stack is...well, not at Nvidia's level. I think AI companies are going to develop their own tools so that they are not locked into a vendor, but for now, it is a problem for AMD's entry into the market.

wargames
Mar 16, 2008

official yospos cat censor

Twerk from Home posted:

Are AMD data center GPUs that big? From what I've seen, it's CUDA or bust. Nvidia is the only option from most rental vendors, whether cloud or smaller.

In HPC AMD does command a healthy lead over Nvidia, but in the general datacenter they are making up ground. DPUs and CDNA are doing work.

Dr. Video Games 0031
Jul 17, 2004

Subjunctive posted:

It doesn’t matter if they can effectively compete, as long as they keep buying APUs while they try!

The biggest roadblock is going to be the lack of on-package memory. Dell/HP can use soldered LPDDR5X, but it's not going to be anywhere near as fast as what Apple is doing. And I don't recall any of the rumors about AMD's "Strix Halo" or whatever Intel's doing mentioning anything about this. Though maybe Intel's "Adamantine" L4 cache solution can help.

edit: according to mlid "Strix Halo" will have a 256-bit LPDDR5X memory controller which should allow for around 450GB/s bandwidth with 7500 MT/s memory? So that would be pretty good actually.

Dr. Video Games 0031 fucked around with this message at 03:16 on May 23, 2023
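As a sanity check on the rumored figures above: the standard peak-bandwidth formula (bus width in bytes times transfer rate) gives roughly 240GB/s for a 256-bit bus at 7500 MT/s, not ~450GB/s — the higher number would need a wider bus or faster memory. A minimal sketch, using only the figures quoted in the post:

```python
# Back-of-envelope peak memory bandwidth for the rumored "Strix Halo" config
# (256-bit LPDDR5X at 7500 MT/s -- figures from the post above, not confirmed).
def peak_bandwidth_gbps(bus_width_bits: int, transfer_rate_mts: float) -> float:
    """Peak theoretical bandwidth in GB/s: bytes per transfer x transfers per second."""
    bytes_per_transfer = bus_width_bits / 8
    return bytes_per_transfer * transfer_rate_mts / 1000  # MB/s -> GB/s

print(peak_bandwidth_gbps(256, 7500))  # 240.0 GB/s, well short of ~450
```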

repiv
Aug 13, 2009

a few references to frame generation just showed up in AMD's open source driver, i wonder what that means for FSR3 running (or not) on NV/Intel if it has special hooks into AMD's driver

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
Frame generation implemented like RSR would be a compelling proposition from AMD even if it sucks

Josh Lyman
May 24, 2009


sauer kraut posted:

Inno coolers are ok especially for a 200W part, but like the other budget brands (Gigabyte, PowerColor, Palit etc) they skimp on electronic components wherever possible.
Gigabyte is a budget brand?

How is Sapphire on the AMD side? I always thought they were pretty decent, maybe not as good as an Asus, MSI or Gigabyte but better than Powercolor or XFX.

Kibner
Oct 21, 2008

Acguy Supremacy

Josh Lyman posted:

Gigabyte is a budget brand?

How is Sapphire on the AMD side? I always thought they were pretty decent, maybe not as good as an Asus, MSI or Gigabyte but better than Powercolor or XFX.

Sapphire is the AMD EVGA equivalent.

Yudo
May 15, 2003

Josh Lyman posted:

Gigabyte is a budget brand?

How is Sapphire on the AMD side? I always thought they were pretty decent, maybe not as good as an Asus, MSI or Gigabyte but better than Powercolor or XFX.

The AMD partner hierarchy is generally considered: Sapphire as undisputed champ; PowerColor and XFX are second and third, but not always in that order. PowerColor has gotten a very bad reputation for coil whine since RDNA2, though it seems to be an endemic problem in all 300W+ cards, not just PowerColor. For example, on the Nvidia side, there are lots of complaints about Asus cards, but they also sell like hotcakes compared to the competition, so the sample is biased.

I would not consider gigabyte budget. They do suck as a company.

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



Paul MaudDib posted:

I don't really see a way for them to move beyond sponsored showcase ports until they officially support vulkan, and there's no chance of them doing that because they're apple and they almost never admit they've misstepped. Plus their hardware essentially implements metal at a hardware level so they'd almost need something like an official moltenvk.

(how is moltenvk, is it viable for playing actual games or just toy applications? or is the problem the near-total lack of traction on vulkan compared to DX12? I guess maybe what they need is moltendx instead, but, it's their hardware and asahi is the only prospect for anyone in the community doing the work for them...)

Yeah, I doubt Apple will ever officially support Vulkan, especially with their (almost completed) transition to ASi and doubling down on Metal, including MetalFX as their DLSS/FSR equivalent.

UHD
Nov 11, 2006


Apple has spent the last decade and change moving as much of their products' bits and pieces in-house as they can; they are not about to adopt vulkan to make other people's lives easier

Arivia
Mar 17, 2011
Every once in awhile I see mlid and shake my head and go “what kind of lid?”

SCheeseman
Apr 23, 2003

m'lidy

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

SourKraut posted:

Yeah, I doubt Apple will ever officially support Vulkan, especially with their (almost completed) transition to ASi and doubling-down on Metal, including MetalFX as their DLSS/FSR equivalent.

It's really stupid, they could even help steer Vulkan dev, but no, they have serious Not Invented Here syndrome

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
You can't kill the Metal
The Metal will live on
OpenCL tried to kill the Metal
But they failed, as they were smite to the ground
Vulkan tried to kill the Metal
But they failed, as they were stricken down to the ground
Khronos tried to kill the Metal
Ha-ha-ha-ha
They failed, as they were thrown to the ground

Truga
May 4, 2014
Lipstick Apathy
apple's more of a "threw it on the ground" vibe

sauer kraut
Oct 2, 2004

Josh Lyman posted:

Gigabyte is a budget brand?

I haven't seen a teardown of a recent Aorus yet, but their other 3 or 4 tiers below that, definitely.

sauer kraut fucked around with this message at 08:39 on May 23, 2023

orcane
Jun 13, 2012

Fun Shoe

sauer kraut posted:

Aorus is ok, but their other 3 or 4 tiers below that not so much.
Because Asus (lol) and MSI don't have product lines where they're cutting everything down to the basics but still ask for a premium thanks to marketing.
:thunk:

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
TUF is generally considered one of the better entry level models right now afaik

kliras
Mar 27, 2021

Paul MaudDib posted:

TUF is generally considered one of the better entry level models right now afaik
for ampere, definitely. i don't know if it's necessarily the case with ada, but asus has this weird thing where there's probably a reporting bias, perhaps because so many people buy it, or because people expect their stuff to be top shelf

and a qc issue obviously doesn't mean 100% of products will have it

Char
Jan 5, 2013
I'm testing a 4070 on my rig (5600x, 1440p G7 Odyssey, nothing exceptional), and compared to my vanilla 2070 the results are underwhelming.
I only tested CP2077 and Battlefield 2042. I'm impressed CP2077 runs on ultra at decent (45-55) framerates on my 2070 tbh; the 4070 pushes 80-90fps, but I'm not feeling a €600 difference there.
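Char's gut feeling can be put in rough numbers. A quick back-of-envelope sketch, using only the midpoints of the framerate ranges and the ~€600 price gap quoted in the post:

```python
# Rough value check on the 2070 -> 4070 upgrade described above
# (midpoints of the quoted fps ranges; ~600 EUR price difference from the post).
old_fps = (45 + 55) / 2   # vanilla 2070, CP2077 ultra
new_fps = (80 + 90) / 2   # 4070, same settings
uplift = new_fps / old_fps
euros_per_extra_fps = 600 / (new_fps - old_fps)

print(f"{uplift:.2f}x uplift")                       # 1.70x uplift
print(f"~{euros_per_extra_fps:.0f} EUR per extra fps")  # ~17 EUR per extra fps
```

A ~1.7x uplift across two generations is real, but whether it is worth €17 per extra frame is exactly the judgment call the post is making.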

sauer kraut
Oct 2, 2004

Char posted:

I'm testing a 4070 on my rig (5600x, 1440p G7 Odyssey, nothing exceptional), and compared to my vanilla 2070 the results are underwhelming.
I only tested CP2077 and Battlefield 2042. I'm impressed CP2077 runs on ultra at decent (45-55) framerates on my 2070 tbh; the 4070 pushes 80-90fps, but I'm not feeling a €600 difference there.

Hmm maybe that's why US vendors are packing in $100 of Steam bucks on top of Diablo 4, in order to move what should have been a $450 card :confused:

sauer kraut fucked around with this message at 11:27 on May 23, 2023

Char
Jan 5, 2013

sauer kraut posted:

Hmm maybe that's why US vendors are packing in $100 of Steam bucks on top of Diablo 4, in order to move what should have been a $450 card :confused:

I had to witness it with my own two eyes, sadly

Shrimp or Shrimps
Feb 14, 2012


I've still got an Inno3D GTX 1080 going strong in my partner's PC. That said, I tend to steer clear of the less well-known brands these days, just because the big brands are the only ones with a second-hand market in my part of the world (SEA).

SlowBloke
Aug 14, 2017
I have a gigabyte 1070 Founders Edition, purchased open-box a couple of months after launch for €400. Since I use my PC just to check email, shitpost and watch youtube, I'm hardly pressed to replace it. I'm more likely to replace my current frankenmonitor (https://www.displayspecifications.com/en/model/5c6f1324) than my GPU, since nvidia equivalents are priced like motorbikes rather than computer parts.

SlowBloke fucked around with this message at 12:20 on May 23, 2023

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?
Uh, the thumbnails of 4060ti reviews aren’t encouraging

Cao Ni Ma
May 25, 2010



Rinkles posted:

Uh, the thumbnails of 4060ti reviews aren’t encouraging

It's rear end. Steve straight up tells you it sucks at the start of the video because he doesn't want to waste your time, but the rest of the video just goes over it

njsykora
Jan 23, 2012

Robots confuse squirrels.


I'm watching the GN review and the phrase "waste of sand" has been used multiple times. The actual charts show it basically equal to the 3060 Ti, which is really bad. It's only slightly better than the A770 in some games.

Jeff Fatwood
Jun 17, 2013
holy lmao, that's incredible

TorakFade
Oct 3, 2006

I strongly disapprove


the other day I formatted my gaming PC and reinstalled Windows 11, and I'm having some kind of gsync issue. I have activated Gsync compatibility in the Nvidia control panel, set the refresh rate to "prefer highest possible", and disabled vsync in games.

This used to work just fine on my PC before reinstalling, but now I'm getting a lot of tearing that wasn't there before, almost like VRR isn't working for some reason. If I turn on vsync via in-game options, there's no tearing at all.

Is there any step-by-step guide that will lead me to reactivate the correct things in order to have VRR and play at high refresh rates without tearing? Or at least a quick way to see if I have some kind of gsync/VRR problem or I'm just imagining things? I really can't stand tearing, so even a small amount ticks me off immensely; I'd rather have vsync on than get even very mild tearing/stuttering (but ideally I'd just want VRR to work as it used to). The Nvidia control panel looks old, unwieldy and silly, and I'm never sure if I'm setting the options right there... :smith:

(for reference: Nvidia 4080 GPU and Dell G3223Q 32" 144Hz monitor, if that changes things. VRR definitely used to work before; I didn't get any tearing with vsync off...)

TorakFade fucked around with this message at 14:38 on May 23, 2023

power crystals
Jun 6, 2007

Who wants a belly rub??

The highlight of the GN video was "We only ran [GTA V] at 4K because we thought a game from 2013 would be too light to run at 1440p. We were wrong." And then they show it getting 73fps to the 3060 Ti's 77.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy


jesus christ

lih
May 15, 2013

Just a friendly reminder of what it looks like.

We'll do punctuation later.
ok lol at the 4060 ti sometimes even being marginally worse than the 3060 ti

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
That's what I love about these GPUs, man. I get older, they stay the same


Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.
All those comparisons are to a launch 3060 Ti FE, right? If you go buy a 3060 Ti right now it's going to have GDDR6X and 600GB/s of memory bandwidth and perform a little better, yeah?
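For what it's worth, the "600GB/s" figure checks out against the published specs for the GDDR6X refresh of the 3060 Ti (256-bit bus, 19 Gbps per pin) — versus 448GB/s for the launch GDDR6 card at 14 Gbps. A quick sketch of the arithmetic:

```python
# Peak memory bandwidth for the two 3060 Ti variants
# (256-bit bus; 14 Gbps/pin GDDR6 at launch, 19 Gbps/pin for the GDDR6X refresh).
def peak_bandwidth_gbps(bus_width_bits: int, rate_gbps_per_pin: float) -> float:
    """Peak theoretical bandwidth in GB/s: bytes per transfer x per-pin rate."""
    return bus_width_bits / 8 * rate_gbps_per_pin

print(peak_bandwidth_gbps(256, 14))  # 448.0 -- launch GDDR6 model
print(peak_bandwidth_gbps(256, 19))  # 608.0 -- the "600GB/s" GDDR6X refresh
```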
