SpartanIvy
May 18, 2007
Hair Elf
I won a Newegg Shuffle but it's for a Gigabyte Aorus 6800 and it's $1,400, which is MSRP??? How can anyone justify that for a 6800? :psyduck:

SpartanIvy
May 18, 2007
Hair Elf

Bofast posted:

Does it not also include some other overpriced/trash hardware to up the price? The Aorus 6800 XT was supposed to be $900 or so when Gigabyte announced launch prices, so $1400 for a non-XT seems bizarre even in this environment.

I realized after posting it does include a Mobo so it's actually only $1,250.

So still absurd.

SpartanIvy
May 18, 2007
Hair Elf
Don't forget everything being configured with hardware jumpers. Master/Slave drives, various BIOS settings. Fun times.

SpartanIvy
May 18, 2007
Hair Elf
I'm waiting on the 3080 GIF

SpartanIvy
May 18, 2007
Hair Elf

latinotwink1997 posted:

Gonna have to study up on my metallurgy to figure out which card is the best

Some say 3080 TI FE is optimal but I think a 60/40 mix results in a stronger and lighter card.

SpartanIvy
May 18, 2007
Hair Elf
I got a 3060 for MSRP and sold my 1070 TI on eBay and it paid for itself :yeshaha:

SpartanIvy
May 18, 2007
Hair Elf
They're already being resold on eBay for around $2,400. :wtc:

SpartanIvy
May 18, 2007
Hair Elf

Vintersorg posted:

So people are using their sick days to buy this poo poo and wait in line?

If you make $1,000 scalping the card after waiting in line for 24 hours you're still making over 5x minimum wage.

Not a bad use of their time for a large segment of the population.
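The wage math above checks out as a quick sketch (taking the post's $1,000 profit figure at face value and using the US federal minimum wage of $7.25/hr):

```python
# Back-of-the-envelope check on the scalping claim: $1,000 profit for a
# 24-hour wait in line, versus the US federal minimum wage.
# The $1,000 figure is the post's own estimate, not a measured number.
profit = 1000.0          # dollars made flipping one card
hours_waited = 24.0      # time spent in line
federal_min_wage = 7.25  # USD per hour

effective_rate = profit / hours_waited        # dollars per hour
multiple = effective_rate / federal_min_wage  # how many minimum wages

print(f"${effective_rate:.2f}/hr, {multiple:.1f}x minimum wage")
```

That works out to roughly $41.67/hr, a bit under 6x minimum wage, consistent with the "over 5x" claim.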

SpartanIvy
May 18, 2007
Hair Elf
I imagine after demand collapses and stock is freely available, you'll see MSRPs slowly trickle back down to normal over a couple generations of cards, as everyone who can't justify upgrading from their $2,000 card after only 2 years sits out. I imagine the 40XX cards will sell like the 20XX did, and prices will drop.

The long term impact may be that there's another huge demand spike in the future as the aging 30XX card owners all start to look to upgrade at the same generation. So we could have another supply shortage with the 50XX or 60XX cards.

SpartanIvy
May 18, 2007
Hair Elf

Cygni posted:

That’s not really how this works. The market segment for $1k GPUs is firmly established at this point. If anything, the market will continue to stretch up, not down.

The market factors that established a $1K GPU segment are not necessarily permanent. You have lots of people with money built up over a pandemic, including stimulus checks, as well as crypto being a driving factor. Plus there's a huge FOMO factor in play at the moment.

If those factors subside there will be considerably less of a market for $1,000+ GPUs

SpartanIvy
May 18, 2007
Hair Elf

Duck and Cover posted:

Before these companies are willing to lower their prices, people will have like four year old cards and will pay whatever.

You could be right, but if NVIDIA wants to play chicken with consumer demand that seems like a great opportunity for AMD or Intel to gain a poo poo load of market share by releasing a card that undercuts them.

You could argue that every manufacturer will just hold firm at the $1K price point, but with AMD and Intel wanting to make inroads on Nvidia's market share, and neither really being able to compete on performance 1:1, price is all that's left for them to be competitive on.

SpartanIvy
May 18, 2007
Hair Elf

njsykora posted:

Yeah the 2060 was $150 more RRP than the 1060, this was meant to be the generation where RRP came down again (3060 RRP down $20 but lol) but they're going to keep pushing prices up as long as people will pay them. I also don't see AMD or Intel putting any effort into stopping that, AMD only undercut Nvidia very slightly this gen and Intel will never want to be seen as the cheap option ever.

There's a big difference between the price going up $100 for the next gen and the price staying at double what it should be. As more performance and features get added, prices go up. It's happening with CPUs and everything else.

Prices will go up but I doubt the 4060 will be $1,000.

SpartanIvy
May 18, 2007
Hair Elf
Is SLI pronounced Sly? :thunk:

SpartanIvy
May 18, 2007
Hair Elf
SLI let you squeeze out like an additional 20% of performance from your graphics card for double the price of it. Which hilariously enough is also what the 3080 Ti does.
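The "20% more performance at double the price" claim can be put in cost-per-frame terms. The prices and frame rates below are hypothetical round numbers for illustration, not benchmarks:

```python
# Toy cost-per-frame comparison for the "20% more performance at 2x the
# price" characterization of SLI. All numbers are hypothetical.
base_price, base_fps = 700.0, 100.0            # single-card setup
sli_price, sli_fps = 2 * base_price, 1.2 * base_fps  # 2x price, +20% fps

cost_per_frame_single = base_price / base_fps  # dollars per fps
cost_per_frame_sli = sli_price / sli_fps       # dollars per fps

print(f"single: ${cost_per_frame_single:.2f}/fps")
print(f"SLI:    ${cost_per_frame_sli:.2f}/fps")
```

Under those assumptions each frame costs about two-thirds more with the second card, which is the joke.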

SpartanIvy
May 18, 2007
Hair Elf

BIG HEADLINE posted:

Puget Systems already did a system with four now-extinct Gigabyte blower 3090s and didn't need NVLink.

The system also sounded like a C-130 with all four turning on the tarmac.

Now I want to make a Convair B-36 Peacemaker themed PC with 6 fans and 4 GPUs.

Six turning, four burning

SpartanIvy
May 18, 2007
Hair Elf
The more cards they can muddy the waters with the easier it is to fleece confused buyers

SpartanIvy
May 18, 2007
Hair Elf
They're also way overpriced, even more so than their better performing Nvidia counterparts.

I'd absolutely take a 6800XT but not for 3090 prices.

But they're still selling out so I guess I'm the fool

SpartanIvy
May 18, 2007
Hair Elf

Twibbit posted:

The dark Chobit future is upon us

I hate you for reminding me of Chobits

SpartanIvy
May 18, 2007
Hair Elf

repiv posted:

What are people using those for?

Looks like it turns one PCIE GPU slot into several. So probably crypto mining.

SpartanIvy
May 18, 2007
Hair Elf

Comfy Fleece Sweater posted:

Wasn't it the game that ran with VOXELS ?

Yeah, the entire Delta Force series (and I think everything from Novalogic?) used voxels which gave them features way ahead of their time, like destructible terrain. DF2 and Landwarrior were soooo good and a huge part of what got me into PC gaming. I wish they worked still.

SpartanIvy
May 18, 2007
Hair Elf

Comfy Fleece Sweater posted:

At this rate why not sign up for the 3090, you might get it around the time the 4090 is released

The nice thing about the 4000 series is that the pricing will be so straightforward. The model number and the MSRP will be the same.

SpartanIvy
May 18, 2007
Hair Elf
RGB is out

Beige is in

SpartanIvy
May 18, 2007
Hair Elf
I'm concerned that if things continue on much longer, we're going to see a sharp decline in PC gaming in a few years, because the barrier to entry will have become impossibly steep for young people. And once a lot of people are established with their social groups on consoles, why switch to PC?

SpartanIvy
May 18, 2007
Hair Elf
My case is old and small so there's no back panel to hide cables but I love it so I cut and solder/crimp my cables to length :getin:

SpartanIvy
May 18, 2007
Hair Elf

Back again but I get an error after waiting for validation

E: oos

SpartanIvy
May 18, 2007
Hair Elf
2001: Beige cases filled with colorful cards and bright lights

2021: Colorful cases with bright lights filled with beige cards

SpartanIvy
May 18, 2007
Hair Elf
I bought the Samsung 49" ultrawide monitor, and if I hook up my work laptop through its dock using HDMI it works, but the max resolution it offers is 3840x2160, and it doesn't have 3840x1080, which is the correct 32:9 aspect ratio.



If I hook it up via DisplayPort it seems to have 3840x1080 as a resolution but the signal looks like this and doesn't even fit the signal on the panel.



I've tried a few display port cables, including the one the monitor came with to no avail.

The monitor is natively 5120x1440 but it looks like the Intel UHD 620 chip of my laptop doesn't support a resolution that large.

None of this is an issue on my personal desktop. It just works fine.

Any idea what could be causing the display port issue?
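For anyone following along, the aspect-ratio mismatch in the post above can be checked by reducing each mode to lowest terms:

```python
from math import gcd

def aspect(w: int, h: int) -> str:
    """Reduce a WxH video mode to its simplest aspect ratio."""
    g = gcd(w, h)
    return f"{w // g}:{h // g}"

# The modes mentioned in the post:
print(aspect(5120, 1440))  # 32:9 - the panel's native mode
print(aspect(3840, 1080))  # 32:9 - the half-width mode missing over HDMI
print(aspect(3840, 2160))  # 16:9 - what the HDMI connection offered instead
```

So the laptop is offering a standard 16:9 mode over HDMI rather than any 32:9 mode the panel can display without distortion.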

SpartanIvy fucked around with this message at 22:00 on Oct 7, 2021

SpartanIvy
May 18, 2007
Hair Elf

Shifty Pony posted:

Apparently if you get a USB-C to DP adapter/cable it might work:

https://www.reddit.com/r/ultrawidemasterrace/comments/llo55o/trying_to_get_5120x1440_on_an_intel_uhd_620_is/

Assuming you have an open thunderbolt port of course.

Thanks, I just ordered one. Maybe it'll be a solution.

I still don't understand why normal displayport resolutions won't work. Even 640x480 has that weird garbling.

SpartanIvy
May 18, 2007
Hair Elf

Paul MaudDib posted:

this is more of a monitor thread question (I know it's mostly the same set of people though) but basically if this is the Samsung G9 Neo it's known to be broken as gently caress and Samsung is supposedly working on a firmware update, HUB tested a beta release and said it fixed a lot but not all of the issues.

even if it's not that one, I'd be perfectly willing to believe the same issues might affect other Samsung monitors too, I'm sure there's some degree of hardware similarity and code sharing/etc.

changing which input port it's using might help, as others are suggesting. Unplugging and replugging might too, maybe? And dropping the refresh rate might help.

My old X34p ran 3440x1440 on my skylake laptop's HDMI port without problems, but the newer X34GS (gsync compatible, vs native gsync on the x34p) wouldn't run at the same resolution (it saw it as 2560x1080 max) and since it's a work laptop I didn't feel like getting into EDID spoofing and stuff. I solved this by getting work to get me a dock, it does run properly off the dock's displayport outputs, I assume it would also run properly off a thunderbolt usb-c to displayport cable, or a usb-c to usb-c cable (if your monitor has a type-c input) as those can carry up to DP1.4 (for a thunderbolt 3 port). Basically the displayport is just a better output on old skylake laptops, and a dock may help since it often gives you "active adapters" built in. Personally I found the dell thunderbolt dock has glitches on its hdmi 2.0b controller though, it would occasionally glitch out just like you're seeing (although I don't think it's related here) after switching monitor inputs to the laptop, and you'd have to unplug and replug, which is why I started exploring displayport and type-c displayport solutions.

As far as cables, I would buy something that's bidirectional (usb-c cables are "active" in a sense and do have directionality unless they explicitly specify bidirectional) it just makes things a lot simpler from your perspective. These were the cables I came up with at the time.

https://www.amazon.com/dp/B081VK7Q94

https://www.amazon.com/dp/B07D7PM1TX

It's the SAMSUNG LC49RG90SSNXZA 49-Inch CRG9 (rolls right off the tongue), but as you said I bet they all share the same firmware more or less.

My girlfriend has a thunderbolt dock so I'm going to be trying my laptop with her dock after she's done working. I used to have an LG 3440x1440 monitor and it had no issues with anything until it died. I'll find the monitor thread and post there.

It feels more like a lovely Intel GPU issue than a monitor issue to me, but it probably just boils down to Samsung not expecting someone to plug a crappy work laptop into their fancy gaming monitor and vice versa, and not designing around that possibility.

E: my firmware is 1000.1 which is quite out of date so maybe that'll fix it
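A rough bandwidth sanity check supports the idea that the link itself isn't the bottleneck for the dock's DisplayPort output. The blanking overhead below is approximated at 20% (real CVT-R2 timings differ), and the payload figures are nominal after-line-coding rates, so treat this as a sketch:

```python
# Rough link-bandwidth check for 5120x1440 @ 60 Hz, 8-bit RGB.
# Blanking overhead is approximated; payload capacities are nominal
# figures after 8b/10b line coding.
width, height, hz, bpp = 5120, 1440, 60, 24

pixel_rate = width * height * hz                 # active pixels per second
needed_gbps = pixel_rate * bpp * 1.20 / 1e9      # + ~20% blanking overhead

# Nominal usable payload of common links, in Gbit/s:
links = {"HDMI 1.4": 8.16, "HDMI 2.0": 14.4, "DP 1.2 (HBR2)": 17.28}

for name, capacity in sorted(links.items(), key=lambda kv: kv[1]):
    verdict = "fits" if needed_gbps <= capacity else "too slow"
    print(f"{name}: {verdict} ({needed_gbps:.1f} vs {capacity} Gbit/s)")
```

Under these assumptions the mode needs roughly 12.7 Gbit/s, which rules out HDMI 1.4 but fits comfortably in DP 1.2, so a dock or adapter that actually negotiates HBR2 should carry it.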

SpartanIvy fucked around with this message at 23:24 on Oct 7, 2021

SpartanIvy
May 18, 2007
Hair Elf

Leave it to nerds to :regd08:

SpartanIvy
May 18, 2007
Hair Elf
Just to provide closure on my monitor/GPU issue: the firmware I had was actually the most up to date, so that wasn't the problem. It looks like the culprit is the HP laptop dock. When I use the Dell Thunderbolt dock my girlfriend has, everything works perfectly at 5120x1440 at 59 Hz.

So off to Amazon I go!

SpartanIvy
May 18, 2007
Hair Elf
Best Buy drop is happening/happened.

I got a reference 3080 for MSRP somehow.

It's finally over :shepface:


Never thought I would get a racing heartbeat while updating credit card info.

SpartanIvy
May 18, 2007
Hair Elf

CoolCab posted:

nice! you mean an FE one? a reference card is just the generic model that OEMs build based on i think

Yeah, I meant FE, which I think in the past was the same thing? I dunno, it's all confusing and I don't have to care for another few years!

SpartanIvy
May 18, 2007
Hair Elf

Lord Stimperor posted:

Congratulations! Please look over your shoulder repeatedly while you're walking your purchase back to the car.


Or if it comes by mail, sit by your front door with a baseball bat.

Finally, a reason to exercise my Texas constitutional carry right!

SpartanIvy
May 18, 2007
Hair Elf

OhFunny posted:



Store nearest to me is getting real tired of people coming in and reading out SKUs at the help desk.

One weird trick to getting a GPU, Best Buy employees HATE it! (Because it doesn't work)

SpartanIvy
May 18, 2007
Hair Elf

They're not releasing serials, models, or the volume of cards stolen because they're trying to squash the secondary scalper market by adding perceived risk to the purchases of "new" cards from scalpers.

The more unknowns, the higher the perceived risk of buying a stolen card that can't be registered, and the less likely someone is willing to purchase one for the prices they bring.

SpartanIvy
May 18, 2007
Hair Elf

Elvis_Maximus posted:

:same:

But seriously, what?

Nvidia having a stronk, call the RTXulance

SpartanIvy
May 18, 2007
Hair Elf
Finally watercooled my PC, including my 3080 FE. Nothing like combining a $650 card and a $300 water block and hoping you don't end up with a $1,000 paperweight...

SpartanIvy
May 18, 2007
Hair Elf

Kerbtree posted:

I was wondering about that. How’s it behaving, especially the memory junction temp? Which bits are you using?

I'm using the EKWB block.
https://www.ekwb.com/shop/ek-quantum-vector-fe-rtx-3080-d-rgb-black-special-edition

The temps are way better compared to the stock cooler, especially the VRAM temps.

I did a ton of benchmarking and gaming yesterday and couldn't get the GPU temp above 55 degrees after hours and hours. The stock cooler would be in the 80s easily after just a few minutes.

To test the VRAM temps I used the best test I know of: crypto mining. With a stock cooler the VRAM got near 100 degrees within a minute or two and I shut it down. With the water block I let it sit for half an hour or so and the VRAM temp only reached 74!

e: Here's a picture of it, unfortunately the card is a little buried behind the PSU, but you can make out the silver top.

:pcgaming::pcgaming::pcgaming:

SpartanIvy fucked around with this message at 14:50 on May 17, 2022

SpartanIvy
May 18, 2007
Hair Elf
Coop VR, finally!
