GRINDCORE MEGGIDO
Feb 28, 1985


jonathan posted:

The curtains don't match the rug.


Truga
May 4, 2014
Lipstick Apathy

jonathan posted:

The curtains don't match the rug.

:vince:

magimix
Dec 31, 2003

MY FAT WAIFU!!! :love:
She's fetish efficient :3:

Nap Ghost
One of the fans in my PC is starting to get a bit noisy under load (bearing wearing out, presumably), but I've not quite managed to track down *which* fan. I've ruled out the front and rear case fans, as well as the PSU. This leaves the GPU and CPU fans. What (Windows 10 compatible) programs would people recommend for placing meaningful, separately targeted CPU-only and GPU-only load? I tried "HeavyLoad" (it was the first Google result), but however it simulates load, it wasn't enough to induce the fan noise.

That aside, assuming it were the fan on the GPU (ASUS GTX 980 Ti), presumably I'd be poo poo out of luck in terms of servicing that?

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

magimix posted:

One of the fans in my PC is starting to get a bit noisy under load (bearing wearing out, presumably), but I've not quite managed to track down *which* fan. I've ruled out the front and rear case fans, as well as the PSU. This leaves the GPU and CPU fans. What (Windows 10 compatible) programs would people recommend for placing meaningful, separately targeted CPU-only and GPU-only load? I tried "HeavyLoad" (it was the first Google result), but however it simulates load, it wasn't enough to induce the fan noise.

That aside, assuming it were the fan on the GPU (ASUS GTX 980 Ti), presumably I'd be poo poo out of luck in terms of servicing that?

When you hear the rattling, lightly press your finger on the center of each fan rotor to stop it from spinning and see which one it is.

Edit: Replacement fans are cheap on eBay if you don't mind waiting a few weeks: https://www.ebay.com/itm/New-87mm-T129215SU-4Pin-Fan-F-ASUS-Strix-RX470-RX460-GTX980TI-R9-390X-Video-Card/183162014786

canyoneer
Sep 13, 2005


I only have canyoneyes for you

1gnoirents posted:

Is this news to anybody here? I could have sworn the Intel GPU was, like, 100% confirmed not for gaming, and I'd probably stopped thinking about it, but

https://www.forbes.com/sites/jasonevangelho/2018/04/11/intel-is-developing-a-desktop-gaming-gpu-to-fight-nvidia-amd/#32d2078c3578

Also this is just plain weird still



For years, Intel's line on why they weren't interested in dGPUs for gaming was:
1. The market isn't big enough to support 3 platforms. Coke and Pepsi are already there, and they have no desire to be Dr Pepper with 8% market share.
2. Margins are low (especially in game consoles).
3. There aren't a lot of strategic adjacencies to other current lines of business.

#1 has shifted: if Nvidia used to be Coke and AMD was Pepsi, it's now more like Nvidia is both Coke and Pepsi, and AMD is Dr Pepper. Nvidia is just slaying it in the GPU market, for both gaming and data center. The market has also grown like crazy because of the stupid crypto bubble.
#2 isn't true for PC gaming anymore; thanks to the crypto stuff, they're able to command higher MSRPs and hold them for longer. Game console margins probably still suck.
#3 is definitely not true anymore, because GPUs are a big part of the data center these days, and for certain applications (like AI) they're the most important part. So there are meaningful adjacencies to their Xeon Phi business and their integrated graphics.

TheJeffers
Jan 31, 2007

1gnoirents posted:

Is this news to anybody here? I could have sworn the Intel GPU was, like, 100% confirmed not for gaming, and I'd probably stopped thinking about it, but

https://www.forbes.com/sites/jasonevangelho/2018/04/11/intel-is-developing-a-desktop-gaming-gpu-to-fight-nvidia-amd/#32d2078c3578

Also this is just plain weird still



What's more bizarre to me is that people keep presenting the idea that Intel is making a GPU for gamers as some kind of grand newsworthy revelation today. If you read the announcement they made when they hired Koduri, it's pretty clear that this was in the cards from the get-go. From the horse's mouth: "In this position, Koduri will expand Intel’s leading position in integrated graphics for the PC market with high-end discrete graphics solutions for a broad range of computing segments." You can parse "broad range of computing segments" several ways, but "gaming chip" absolutely fits.

To get something out the door "quickly", Koduri may be repurposing IP that was already in development for other use cases, but that's a question of how, not whether, Intel is going to try to break into the market.

GRINDCORE MEGGIDO
Feb 28, 1985


I didn't think they'd make a discrete card, only betterer integrated graphics.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

GRINDCORE MEGGIDO posted:

I didn't think they'd make a discrete card, only betterer integrated graphics.

If they could have made betterer integrated graphics I'd have thought they'd do so by now, but alas it's all sucked.

1gnoirents
Jun 28, 2014

hello :)
Yeah, I mean, I agree that I shouldn't be surprised; it's just that I'm not used to the idea. They kinda just stopped caring about iGPUs suddenly, and that was basically my whole exposure to Intel and GPUs.

I will find it extremely funny if their first GPUs are actually good or great... under Raja, after he was poached.

GRINDCORE MEGGIDO
Feb 28, 1985


It's a win all around. If they suck utterly and AMD's next gen is good, it's maybe even funnier.

I'm holding out for an exquisite shroud design.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy
Don't they at least have a legit better die process vs. Nvidia?

Cygni
Nov 12, 2005

raring to post

1070 for $430, in store only. Best price on a 1070 I've seen in a while if you really need a graphics card now.

https://www.bestbuy.com/site/nvidia-founders-edition-geforce-gtx-1070-8gb-gddr5-pci-express-3-0-graphics-card-black/5330700.p

OhFunny
Jun 26, 2013

EXTREMELY PISSED AT THE DNC

Cygni posted:

1070 for $430, in store only. Best price on a 1070 I've seen in a while if you really need a graphics card now.

https://www.bestbuy.com/site/nvidia-founders-edition-geforce-gtx-1070-8gb-gddr5-pci-express-3-0-graphics-card-black/5330700.p

I see the 1080 is down to $589.99 at Best Buy too, and it's available for purchase online if, like me, you have no nearby store with 1070s in stock.


https://www.bestbuy.com/site/nvidia-founders-edition-geforce-gtx-1080-8gb-gddr5x-pci-express-3-0-graphics-card-black/5330600.p?skuId=5330600

BOOTY-ADE
Aug 30, 2006

BIG KOOL TELLIN' Y'ALL TO KEEP IT TIGHT

1gnoirents posted:

Is this news to anybody here? I could have sworn the Intel GPU was, like, 100% confirmed not for gaming, and I'd probably stopped thinking about it, but

https://www.forbes.com/sites/jasonevangelho/2018/04/11/intel-is-developing-a-desktop-gaming-gpu-to-fight-nvidia-amd/#32d2078c3578

Also this is just plain weird still



Looks more like a Just For Men ad

RAJA, YOUR STACHE IS TRASH

craig588
Nov 19, 2005

by Nyc_Tattoo
Furmark will test the GPU; Prime95 will test the CPU. Neither is great as a stability test because each exercises a limited set of functions, but they'll stress the cooling and VRMs the hardest.

The best way is still using your finger to tap the center of the rotor, as already stated, though.
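
If you'd rather script the CPU-only side than hunt for another utility, and you happen to have Python installed, a busy loop per core does the job. This is just a minimal sketch (the 60-second duration is an arbitrary choice), not a proper stress tester like Prime95:

code:

import multiprocessing
import time

def spin(seconds):
    # Busy-loop doing float math so one core stays pegged at 100%.
    end = time.time() + seconds
    x = 1.0001
    while time.time() < end:
        x = (x * x) % 1e9

if __name__ == "__main__":
    # One worker per logical core; run them all for 60 seconds.
    workers = [multiprocessing.Process(target=spin, args=(60,))
               for _ in range(multiprocessing.cpu_count())]
    for w in workers:
        w.start()
    for w in workers:
        w.join()

It won't heat the CPU as hard as Prime95's AVX loads, but it's enough to spin the CPU fan up while leaving the GPU completely idle, which is all the fan-hunting needs.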

Craptacular!
Jul 9, 2001

Fuck the DH
We had to buy a new plain-rear end monitor yesterday: a 27” Freesync model that isn’t used for gaming and will mostly be hooked up to a MacBook. But finally being able to sit comfortably at a 27” monitor and take in its size makes me want to buy a G-Sync monitor and extend the viable lifetime of my 1070, since I probably won’t be able to afford any next-gen cards that outperform it.

But my heart is torn, because the PS4 Pro doesn’t do 1440p, which doesn’t bode well for future consoles, and I don’t want to have two displays for two different gaming “experiences” ever again. That’s how I wound up using a well-made 40” TV as a PC display in the first place.

Shaocaholica
Oct 29, 2002

Fig. 5E
Is the 1050 Ti still the best low profile gaming card there is? Is it discontinued tho?

Stanley Pain
Jun 16, 2001

by Fluffdaddy

canyoneer posted:

The market has also grown like crazy because of the stupid crypto bubble.

I think you meant to say AI ;)

Cygni
Nov 12, 2005

raring to post

Shaocaholica posted:

Is the 1050 Ti still the best low profile gaming card there is? Is it discontinued tho?

Naw, it's not discontinued. Low profile 1050 Tis are still on Newegg, although they're $220. You can get a regular 1050 or RX 560 low profile for $160, though.

I thought someone had shown a low profile 1060 3GB, but I've never seen it on the market.

Shaocaholica
Oct 29, 2002

Fig. 5E

Cygni posted:

Naw, it's not discontinued. Low profile 1050 Tis are still on Newegg, although they're $220. You can get a regular 1050 or RX 560 low profile for $160, though.

I thought someone had shown a low profile 1060 3GB, but I've never seen it on the market.

Thanks. What's with all these high-end low profile cards on eBay that have been used for mining? Why mine with low profile cards :confused:

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

BOOTY-ADE posted:


RAJA, YOUR STACHE IS TRASH

Should he rebrand it, or just wear a shroud?

SwissArmyDruid
Feb 14, 2014

by sebmojo

Zero VGS posted:

Don't they at least have a legit better die process vs. Nvidia?

It's probably safe to say that lead is down to its last embers.

I mean, Intel's process lead has dramatically shrunk over the years. By the time 10nm Cannon Lake officially drops later this year, they will have been continually "refining" the 14nm node for almost four years. And they will continue to be on 10nm through the rest of their tock-tick-hitch cycle.

In that time, TSMC and GloFo have moved AMD and Nvidia (the 300-series and Maxwell at the time) from 28nm to the 14nm-class nodes they're on now, and TSMC, GloFo, and Samsung are all targeting 7nm in 2019/2020 as their next step, bypassing the 12nm/10nm nodes entirely.

SwissArmyDruid fucked around with this message at 22:46 on Apr 12, 2018

Shaocaholica
Oct 29, 2002

Fig. 5E
Why is DVI still around???

craig588
Nov 19, 2005

by Nyc_Tattoo
It will hang around longer than HDMI. More bandwidth and the same protocol. HDMI was created to solve a problem they thought people had: being scared of hooking up multiple wires.

HDMI is limited to 1920x1200 at 60Hz, while DVI does 2560x1440@120Hz.

Shaocaholica
Oct 29, 2002

Fig. 5E

craig588 posted:

HDMI is limited to 1920x1200 at 60Hz

Lol wut

quote:

Version 2.1
HDMI 2.1 was officially announced by the HDMI Forum on January 4, 2017, and was released on November 28, 2017. It adds support for higher resolutions and higher refresh rates, including 4K 120 Hz and 8K 120 Hz. HDMI 2.1 also introduces a new HDMI cable category called 48G, which certifies cables at the new higher speeds that these formats require. 48G HDMI cables are backwards compatible with older HDMI devices, and older cables are compatible with new HDMI 2.1 devices, though the full 48 Gbit/s bandwidth is not possible without the new cables.

Additional details for HDMI 2.1:

Maximum supported resolution is 10K at 120 Hz
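
The back-of-the-envelope math on why those formats need the new 48G cables checks out, too. A rough sketch (the ~10% blanking overhead is my assumption; the effective rates use the published 8b/10b and 16b/18b encoding factors):

code:

# Uncompressed video bandwidth, 8 bits per color (24 bpp RGB).
def raw_gbps(h, v, hz, bpp=24, blanking=1.10):
    # blanking: ~10% overhead for reduced-blanking timings (rough estimate)
    return h * v * hz * bpp * blanking / 1e9

print(raw_gbps(3840, 2160, 60))   # ~13.1 -> fits HDMI 2.0 (18 * 8/10 = 14.4 effective)
print(raw_gbps(3840, 2160, 120))  # ~26.3 -> needs HDMI 2.1 (48 * 16/18 = ~42.7 effective)
print(raw_gbps(7680, 4320, 120))  # ~105  -> even 48G needs DSC compression for 8K 120Hz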

craig588
Nov 19, 2005

by Nyc_Tattoo
Oh, not bad. For a LONG time it was a more limited version of DVI, which made no sense.

Edit: The main reason there aren't new video cards with the new HDMI standard is probably that there haven't been any new video cards since the standard was ratified.

craig588 fucked around with this message at 23:09 on Apr 12, 2018

canyoneer
Sep 13, 2005


I only have canyoneyes for you

SwissArmyDruid posted:

It's probably safe to say that lead is down to its last embers.

I mean, Intel's process lead has dramatically shrunk over the years. By the time 10nm Cannon Lake officially drops later this year, they will have been continually "refining" the 14nm node for almost four years. And they will continue to be on 10nm through the rest of their tock-tick-hitch cycle.

In that time, TSMC and GloFo have moved AMD and Nvidia (the 300-series and Maxwell at the time) from 28nm to the 14nm-class nodes they're on now, and TSMC, GloFo, and Samsung are all targeting 7nm in 2019/2020 as their next step, bypassing the 12nm/10nm nodes entirely.

As always, big asterisks on the "x nm" claims, because those are essentially marketing terms now and are pretty far divorced from actual measurements.

I wonder how each company's process yields will shake out, especially since Intel and TSMC have very different strategic approaches to yield and process health during the product lifecycle.

repiv
Aug 13, 2009

craig588 posted:

DVI does 2560x1440@120Hz.

Technically this isn't right either; the DL-DVI spec maxes out at 2560x1600@60Hz. Some DVI implementations can push the pixel clock much higher than the spec requires, but it's never guaranteed.
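
The ceiling falls straight out of the pixel clock arithmetic: dual-link DVI is two 165 MHz TMDS links, so 330 Mpx/s total. A quick sketch (the reduced-blanking figures are rough approximations):

code:

# Dual-link DVI: two TMDS links at 165 MHz each -> 330 Mpx/s per the spec.
DL_DVI_MAX_MHZ = 2 * 165

def pixel_clock_mhz(h, v, hz, h_blank=160, v_blank=45):
    # CVT-RB-style reduced blanking, approximate figures
    return (h + h_blank) * (v + v_blank) * hz / 1e6

print(pixel_clock_mhz(2560, 1600, 60))   # ~269 MHz, within spec
print(pixel_clock_mhz(2560, 1440, 120))  # ~485 MHz, far beyond spec

So driving 1440p at 120Hz over DL-DVI always meant overdriving the pixel clock well past what the spec guarantees.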

Shaocaholica
Oct 29, 2002

Fig. 5E
I can't think of any modern use case that would still require a DVI connection, other than maybe some really obscure scientific or industrial display.

craig588
Nov 19, 2005

by Nyc_Tattoo
I'm on all DisplayPort now, but that was my old monitor.

spasticColon
Sep 22, 2004

In loving memory of Donald Pleasance
Will Nvidia's next generation of gaming cards (Volta? Ampere?) have HDMI 2.1 and variable refresh support? I could see their new gaming cards not supporting it, or at least not supporting variable refresh, just so Nvidia can keep trying to push G-Sync.

Shaocaholica
Oct 29, 2002

Fig. 5E
Whoa, I didn't even know variable refresh was an HDMI thing. Is it like FreeSync?

repiv
Aug 13, 2009

spasticColon posted:

Will Nvidia's next generation of gaming cards (Volta? Ampere?) have HDMI 2.1 and variable refresh support? I could see their new gaming cards not supporting it, or at least not supporting variable refresh, just so Nvidia can keep trying to push G-Sync.

The one Volta chip they've released (Titan V, etc.) still only does HDMI 2.0. We know literally nothing about the upcoming chips.

Shaocaholica posted:

Whoa, I didn't even know variable refresh was an HDMI thing. Is it like FreeSync?

It's part of HDMI 2.1, which nothing supports yet, but yeah, it's equivalent to Freesync.

Shaocaholica
Oct 29, 2002

Fig. 5E
So is G-Sync dead or dying?

repiv
Aug 13, 2009

G-Sync isn't going anywhere until AMD gets their poo poo together or Intel releases competitive dGPUs.

Nvidia knows they're the only game in town for enthusiast GPUs so they're under no pressure to support Freesync or HDMI 2.1 VRR.

buglord
Jul 31, 2010

Cheating at a raffle? I sentence you to 1 year in jail! No! Two years! Three! Four! Five years! Ah! Ah! Ah! Ah!

Buglord
It seems like Vega GPUs are priced significantly higher than comparable Nvidia offerings. Are they better at mining or something?

craig588
Nov 19, 2005

by Nyc_Tattoo
They're significantly worse at mining and gaming, but a lot fewer of them were made, so the prices are higher.

Shaocaholica
Oct 29, 2002

Fig. 5E

repiv posted:

G-Sync isn't going anywhere until AMD gets their poo poo together or Intel releases competitive dGPUs.

Nvidia knows they're the only game in town for enthusiast GPUs so they're under no pressure to support Freesync or HDMI 2.1 VRR.

Ah, that makes sense. But if they don't support VRR, doesn't that mean they don't get the '2.1' seal of approval from the HDMI consortium? Or does no one give a poo poo about that?

Enos Cabell
Nov 3, 2004


Shaocaholica posted:

Ah, that makes sense. But if they don't support VRR, doesn't that mean they don't get the '2.1' seal of approval from the HDMI consortium? Or does no one give a poo poo about that?

I could certainly be mistaken, but last time this came up it seemed like variable refresh was an optional part of the spec.


Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

Shaocaholica posted:

Ah, that makes sense. But if they don't support VRR, doesn't that mean they don't get the '2.1' seal of approval from the HDMI consortium? Or does no one give a poo poo about that?

No one gives a poo poo. People have this wacky fantasy that DisplayPort or HDMI can "force" Nvidia to support FreeSync, and it's never gonna happen unless Nvidia has a reason to concede.
