Dr. Video Games 0031
Jul 17, 2004

That's an Asus Phoenix, and I believe it to be from the GeForce GTX 1600 line. Which one exactly is hard to say, but I think it might be the 1650 GDDR6 version, since the heatsink fin orientation seems to match (the original 1650 and 1660 models had radial fins coming from a central copper slug, like an Intel box cooler).

Kibner
Oct 21, 2008

Acguy Supremacy
E: nvm, not new information

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Olewithmilk posted:

Hi all, bit of a long shot but I'm looking at a PC on an auction website. I don't think the GPU is going to be a 4090 or anything, but can anyone work out what type it is from this admittedly lovely picture?



My guess is it's an ASUS Phoenix GeForce GTX 1650 [OC Edition?] 4GB GDDR6 [V2?]
Here's a video of the card at various angles, everything is a match: https://youtu.be/DPJhf2Et0_E?t=334

One of these:
https://www.asus.com/us/motherboards-components/graphics-cards/phoenix/ph-gtx1650-o4gd6-p-v2/
https://www.asus.com/us/motherboards-components/graphics-cards/phoenix/ph-gtx1650-4gd6-p-v2/
https://www.asus.com/us/motherboards-components/graphics-cards/phoenix/ph-gtx1650-o4gd6/

edit: wasted my time, beaten hard

HalloKitty fucked around with this message at 19:43 on May 8, 2023

hobbesmaster
Jan 28, 2008

Dr. Video Games 0031 posted:

That's an Asus Phoenix, and I believe it to be from the GeForce GTX 1600 line. Which one exactly is hard to say, but I think it might be the 1650 GDDR6 version, since the heatsink fin orientation seems to match (the original 1650 and 1660 models had radial fins coming from a central copper slug, like an Intel box cooler).

It has to be earlier; NVIDIA requires “GEFORCE GTX” or RTX to be printed on the top, and AMD follows similar branding rules. 1050, maybe?

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

hobbesmaster posted:

It has to be earlier; NVIDIA requires “GEFORCE GTX” or RTX to be printed on the top, and AMD follows similar branding rules. 1050, maybe?

It has GEFORCE GTX printed on the side; it's just very much the same colour as the card itself, so it's hard to see depending on the angle. It's absolutely the ASUS Phoenix 1650.

Dr. Video Games 0031
Jul 17, 2004

hobbesmaster posted:

It has to be earlier; NVIDIA requires “GEFORCE GTX” or RTX to be printed on the top, and AMD follows similar branding rules. 1050, maybe?

It says GeForce GTX in dark grey lettering. After looking at some more Asus models, it's a perfect match for the Asus 1650 Phoenix (GDDR6). No other models match it exactly. It uses the shroud design from the 3060 Phoenix, but with "GeForce GTX" printed on it instead of RTX, and it has a cheaper (crappier) heatsink. There are no 1660 models like that, nor are there any older Phoenix models like this.

edit: Doesn't have to be the OC version, it's just one of the 1650 G6 models.

Dr. Video Games 0031 fucked around with this message at 19:45 on May 8, 2023

hobbesmaster
Jan 28, 2008

HalloKitty posted:

It has GEFORCE GTX printed on the side

Wow that is almost invisible on my phone. That’d point to a 1650.

Dr. Video Games 0031
Jul 17, 2004

I reverse image searched that image and I think you should bid on that bad boy.

Olewithmilk
Jun 30, 2006

What?

Thank you all! I was hoping it was a 30xx series card, but a 1650 is worse than my current one. Great detective work!

Dr. Video Games 0031 posted:

I reverse image searched that image and I think you should bid on that bad boy.

If I buy more computer parts for no good reason my partner will kill me.

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

hobbesmaster posted:

It has to be earlier; NVIDIA requires “GEFORCE GTX” or RTX to be printed on the top, and AMD follows similar branding rules. 1050, maybe?

The 1050 didn't need additional power, iirc. I originally thought 1050 Ti, but I think the 1650 4GB suggestion might be correct.

Edit: Whoa, late.

MarcusSA
Sep 23, 2007

Olewithmilk posted:

Thank you all! I was hoping it was a 30xx series card, but a 1650 is worse than my current one. Great detective work!

If I buy more computer parts for no good reason my partner will kill me.

lol how much is it?

Dr. Video Games 0031
Jul 17, 2004

The winning bid was £101, which is really good for a low-end gaming PC with those specs, assuming it actually works. The memory was 8GB of DDR4, and I'm assuming there's an i3-10100 or something in there (though the actual model was unspecified).

Dr. Video Games 0031 fucked around with this message at 21:05 on May 8, 2023

Comfy Fleece Sweater
Apr 2, 2013

You see, but you do not observe.

Olewithmilk posted:

If I buy more computer parts for no good reason my partner will kill me.

Everybody knows computer parts are a great investment that appreciates in value over time :colbert:

MarcusSA
Sep 23, 2007

Dr. Video Games 0031 posted:

The winning bid was £101, which is really good for a low-end gaming PC with those specs, assuming it actually works. The memory was 8GB of DDR4, and I'm assuming there's an i3-10100 or something in there.

Oh yeah, I'd definitely say that's a good deal.

dkj
Feb 18, 2009

power crystals posted:

Have you also tried at 0? What does HWINFO say your temps are for the various runs? And what kind of clock speed is it hitting for those tests?

Are you sure your board's BIOS isn't old enough it can't actually let the CPU boost? The one this board came with was super old which was an "oh, duh" when I realized it had been stuck at 3.4ghz the whole time.

Compared to 0, the max and 16-thread scores are lower, the 8 and 4 are higher, and the single is basically identical. The highest clock I saw was 4.4 GHz in HWiNFO. The highest temp it got to was 69.5°C.

It's an X570 ITX board; I don't know how to check when it was manufactured.

Wonton
Jul 5, 2012

hobbesmaster posted:

Have you tested with PCIE3 mode? Even good PCIE4 risers can be questionable and that’s something that could impact overall stability.

Not yet, so try to use PCIe 3.0 mode with my 4.0 cable?

Funnily enough, I moved another ITX computer from a Raijintek ITX case with a riser cable to another mATX case and had no problems.

yummycheese
Mar 28, 2004

thread moves fast

hobbesmaster
Jan 28, 2008

Wonton posted:

Not yet, so try to use PCIe 3.0 mode with my 4.0 cable?

Funnily enough, I moved another ITX computer from a Raijintek ITX case with a riser cable to another mATX case and had no problems.

It’d be an EMI or EMC issue, so there's little you could convince me wouldn't have a theoretical impact: anything from the bends in the cable, the WiFi channel you're using, the outlet you have it plugged into, whether your AC is on, the phase of the moon…

Honestly I’m shocked PCIE risers ever work at all.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

Olewithmilk posted:

If I buy more computer parts for no good reason my partner will kill me.

:thunkher: : another package came in? what is it this time?

:sweatdrop: : oh, you know... just... some... computer... stuff

power crystals
Jun 6, 2007

Who wants a belly rub??

e: let they who have not had the above conversation...

dkj posted:

Compared to 0, the max and 16-thread scores are lower, the 8 and 4 are higher, and the single is basically identical. The highest clock I saw was 4.4 GHz in HWiNFO. The highest temp it got to was 69.5°C.

It's an X570 ITX board; I don't know how to check when it was manufactured.

Well, you've got me, sorry. The fact that undervolting it reduces the score makes it sound like it's somehow power limited rather than thermally limited, which for a 5800X3D just seems wrong. If you try Prime95 small FFT, how hot does it get? Mine, even with a -25 CO and custom water cooling, still hits like 90-95°C within a few seconds. If yours stays cool, that might mean something else is acting weird, but if it's hitting 4.4 GHz, that doesn't make sense either.

jisforjosh
Jun 6, 2006

"It's J is for...you know what? Fuck it, jizz it is"

Finally got around to trying Jedi Survivor, waited until they "patched" the PC performance, and goddamn. 9700K and a 3080, running the game natively at 1440p ultrawide, and it looks like absolute crap with everything set to high. Everything is pixelated and looks like a sub-720p render that's been stretched.

Turning on FSR makes the game look markedly better but doesn't gain any performance. Conversely, turning on ray tracing barely impacts performance, maybe a 6-8% drop, but my GPU and CPU actually get utilized (30-40% usage on both with rasterization, but 75%+ when ray tracing is enabled).

:shrug:

Sphyre
Jun 14, 2001

How do you make your shovelware Unreal Engine 4 game so much worse than all the other shovelware Unreal Engine 4 games? It's nuts.

Comfy Fleece Sweater
Apr 2, 2013

You see, but you do not observe.

Didn’t EA have their own amazing engine in dev? Frostbite, or whatever Battlefront was made in; I remember an impressive tech demo. Did they give up on that? They famously had tons of trouble with non-FPS games.

Shipon
Nov 7, 2005

jisforjosh posted:

Finally got around to trying Jedi Survivor, waited until they "patched" the PC performance, and goddamn. 9700K and a 3080, running the game natively at 1440p ultrawide, and it looks like absolute crap with everything set to high. Everything is pixelated and looks like a sub-720p render that's been stretched.

Turning on FSR makes the game look markedly better but doesn't gain any performance. Conversely, turning on ray tracing barely impacts performance, maybe a 6-8% drop, but my GPU and CPU actually get utilized (30-40% usage on both with rasterization, but 75%+ when ray tracing is enabled).

:shrug:

Supposedly there's a "bug" where if you disable FSR it just renders at 50% scaling even if you have Native selected...

Kazinsal
Dec 13, 2011

Comfy Fleece Sweater posted:

Didn’t EA have their own amazing engine in dev? Frostbite, or whatever Battlefront was made in; I remember an impressive tech demo. Did they give up on that? They famously had tons of trouble with non-FPS games.

Frostbite was notable early on for being a mess whenever they tried to do something that isn't a Battlefield game in it, but they eventually forked it for the FIFA and Madden games, and those are only mostly janky. The engine first showed up in 2008 with Battlefield: Bad Company and has been a flagship engine for in-house EA games since. It's when they started trying to do non-shooters in it that its technical debt rears its head.

Anthem was the most notable fuckup of the past few years related to Frostbite not being suited to the game being developed. Squadrons was also a bit of a fuckup, but more because they couldn't keep it going for more than about four months. Dragon Age: Inquisition was also a Frostbite game, but it was slow-paced enough to be workable under those conditions.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
Frostbite actually looked really good; NFS Heat in particular was a stand-out game that used it.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Wonton posted:

Not yet, so try to use pcie 3.0 mode with my 4.0 cable?

Funny enough I moved another itx computer from a raijin tek itx case with riser cable to another matx and I had no problems

Definitely try PCIe 3 mode.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
https://twitter.com/VideoCardz/status/1655821359438606336

lol at stringing people along with an 8 GB model before actually getting to the good stuff

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

gradenko_2000 posted:

https://twitter.com/VideoCardz/status/1655821359438606336

lol at stringing people along with an 8 GB model before actually getting to the good stuff

nah, sales are so abysmal / the reaction to 8GB is so unexpectedly negative that they have to send it out for manufacturing. They had no plan to do this, and now they need hundreds of thousands of clamshell or T-topology PCBs designed and manufactured and...

how long does it take to get all of that done in a production run? Well, manufacturing/shipping is at least a month; two months sounds about right for a "you started a half hour ago" crash timeline. I bet that's the timeline/launch schedule that immediately leaked.

(now do the price... doesn't matter if that's stupid)

Paul MaudDib fucked around with this message at 07:52 on May 9, 2023
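
For context on the clamshell point: here's a rough back-of-the-envelope sketch of why doubling VRAM on the same die means a new board rather than a firmware tweak. Capacity is just the bus width divided into 32-bit GDDR6/GDDR6X channels, times chip density, times chips per channel; a clamshell board hangs two chips off each channel, one per side of the PCB. The bus widths and 2GB chip density below are the commonly cited figures, not anything confirmed in the thread.

    # Rough VRAM arithmetic for GDDR6/GDDR6X cards (assumed figures, not from this thread).
    # Each memory channel is 32 bits wide and normally carries one chip;
    # a clamshell layout puts two chips on each channel, one per side of the PCB.

    def vram_gb(bus_width_bits: int, chip_density_gb: int = 2, chips_per_channel: int = 1) -> int:
        """Total VRAM = number of 32-bit channels x chip density x chips per channel."""
        channels = bus_width_bits // 32
        return channels * chip_density_gb * chips_per_channel

    print(vram_gb(128))                       # 128-bit bus, one 2GB chip per channel -> 8 GB
    print(vram_gb(128, chips_per_channel=2))  # same 128-bit bus, clamshell -> 16 GB
    print(vram_gb(192))                       # 192-bit bus -> 12 GB

Which is also why a 192-bit card lands on 12GB, or 24GB in clamshell, but never 16GB without a wider bus or mixed chip densities.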

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

gradenko_2000 posted:

https://twitter.com/VideoCardz/status/1655821359438606336

lol at stringing people along with an 8 GB model before actually getting to the good stuff

nvidia, please fix your product stack so high-end cards have more VRAM than the mid-range, for crying out loud

HalloKitty fucked around with this message at 09:02 on May 9, 2023

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

HalloKitty posted:

nvidia, please fix your product stack so high-end cards have more VRAM than the mid-range, for crying out loud

What would it cost to pay a professional to desolder the 10GB of VRAM on my 3080 and put 20GB on? Or is that locked down by the BIOS or memory bus? I could have sworn it was an option at some point. Once upon a time there was a company that would cheaply desolder the G3 processor from your MacBook and solder in a G4, and it worked.

ConanTheLibrarian
Aug 13, 2004


dis buch is late
Fallen Rib

gradenko_2000 posted:

https://twitter.com/VideoCardz/status/1655821359438606336

lol at stringing people along with an 8 GB model before actually getting to the good stuff

The sooner GPUs start using non-binary RAM sizes, the better.

Former Human
Oct 15, 2001

Since the 4070 Ti already exists and only has 12GB of VRAM, what would a hypothetical 16GB edition be called? 4070TiTi? 4070Ti²?

Or does the design even permit 16GB?

I can't wait for it to cost $950.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
4070 Super, obviously

Cygni
Nov 12, 2005

raring to post

Former Human posted:

Since the 4070 Ti already exists and only has 12GB of VRAM, what would a hypothetical 16GB edition be called? 4070TiTi? 4070Ti²?

Or does the design even permit 16GB?

I can't wait for it to cost $950.

They’ve done multiple ram sizes with the same GPU die and name countless times over the years. It used to be the norm.

Wonton
Jul 5, 2012

HalloKitty posted:

Definitely try PCIe 3 mode.

No luck, still have driver clashes :( Maybe I should junk my case, transfer it to an NR200P, and swap in air cooling? Air cooling is quieter and more reliable long-term than an AIO, imho.

lih
May 15, 2013

Just a friendly reminder of what it looks like.

We'll do punctuation later.
congrats to nvidia for doing something goofier than the 3060 having 12GB

kliras
Mar 27, 2021
nvidia also doing a diablo iv bundle now ... but the deal only lasts a month

https://twitter.com/VideoCardz/status/1655875382627663872
https://twitter.com/VideoCardz/status/1655880419395477504

article link until they fix the tweet typo: https://videocardz.com/newz/nvidia-bundles-diablo-vi-with-geforce-rtx-40-series

kliras fucked around with this message at 11:31 on May 9, 2023

wargames
Mar 16, 2008

official yospos cat censor

ConanTheLibrarian posted:

The sooner GPUs start using non-binary RAM sizes, the better.

look at AMD with 20 and 24GB

Josh Lyman
May 24, 2009


kliras posted:

nvidia also doing a diablo iv bundle now ... but the deal only lasts a month

https://twitter.com/VideoCardz/status/1655875382627663872

article link until they fix the tweet typo: https://videocardz.com/newz/nvidia-bundles-diablo-vi-with-geforce-rtx-40-series

AMD needs to announce the 7800 XT, bc unlike the $100 Steam GC, I’m definitely gonna buy D4, so this is a $70 discount. Looks like a 4070 might be in my future?

I was gonna say that I should just get whatever 3-fan card is available at MSRP, since length is a non-issue and better cooling would provide more boosting headroom, but this 3-fan Gigabyte is actually slightly shorter than the Asus dual, which makes me think the Asus might actually cool better if it has more fan blade surface area?

https://www.newegg.com/asus-geforce-rtx-4070-dual-rtx4070-o12g/p/N82E16814126638

https://www.newegg.com/gigabyte-geforce-rtx-4070-gv-n4070wf3oc-12gd/p/N82E16814932611

Edit: somehow I missed this MSI. It’s longer but also thinner so I’m not sure about HSF capacity: https://www.newegg.com/msi-geforce-rtx-4070-rtx-4070-ventus-3x-12g-oc/p/N82E16814137789

Josh Lyman fucked around with this message at 11:41 on May 9, 2023
