|
jonathan posted:The curtains don't match the rug.
|
# ? Apr 12, 2018 17:31 |
|
One of the fans in my PC is starting to get a bit noisy under load (bearing wearing out, presumably), but I've not quite managed to track down *which* fan. I've ruled out the front and rear case fans, as well as the PSU. This leaves the GPU and CPU fans. What (Windows 10 compatible) program might people recommend to place meaningful, discretely-targeted load that is CPU-only, and GPU only? I tried "HeavyLoad" (it was the first Google result), but was not able to induce fan-noise with however it simulates load. That aside, assuming it were the fan on the GPU (ASUS GTX 980 TI), presumably I'd be poo poo out of luck, in terms of servicing that?
|
# ? Apr 12, 2018 19:20 |
|
magimix posted:One of the fans in my PC is starting to get a bit noisy under load (bearing wearing out, presumably), but I've not quite managed to track down *which* fan. I've ruled out the front and rear case fans, as well as the PSU. This leaves the GPU and CPU fans. What (Windows 10 compatible) program might people recommend to place meaningful, discretely-targeted load that is CPU-only, and GPU only? I tried "HeavyLoad" (it was the first Google result), but was not able to induce fan-noise with however it simulates load. When you hear the rattling, lightly press your finger on the center of each fan rotor to stop them from spinning and see which one it is. Edit: Replacement fans are cheap on eBay if you don't mind waiting a few weeks: https://www.ebay.com/itm/New-87mm-T129215SU-4Pin-Fan-F-ASUS-Strix-RX470-RX460-GTX980TI-R9-390X-Video-Card/183162014786
|
# ? Apr 12, 2018 19:28 |
|
1gnoirents posted:Is this news to anybody here? I could have sworn the Intel GPU was like 100% confirmed not for gaming and probably stopped thinking about it but

For years the Intel line on why they aren't interested in dGPUs for gaming was:

1. The market isn't big enough to support 3 platforms. Coke and Pepsi are already there, and they have no desire to be Dr Pepper with 8% market share.
2. Margins are low (especially in game consoles).
3. There aren't a lot of strategic adjacencies to other current lines of business.

#1 has shifted: if Nvidia used to be Coke and AMD was Pepsi, it's now more like Nvidia is Coke and Pepsi and AMD is Dr Pepper. Nvidia is just slaying it in the GPU market, for both gaming and data center, and the market has also grown like crazy because of the stupid crypto bubble.

#2 isn't true for PC gaming anymore; thanks to the crypto stuff they're able to command higher MSRPs and hold them for longer. Game consoles probably still suck.

#3 is definitely not true anymore, because GPUs are a big part of the data center these days, and for certain applications (like AI) they are the most important part. So there are meaningful adjacencies to their Xeon Phi business and their integrated graphics.
|
# ? Apr 12, 2018 19:47 |
|
1gnoirents posted:Is this news to anybody here? I could have sworn the Intel GPU was like 100% confirmed not for gaming and probably stopped thinking about it but What's more bizarre to me is that people keep presenting the idea that Intel is making a GPU for gamers as some kind of grand newsworthy revelation today. If you read the announcement they made when they hired Koduri it is pretty clear that doing so was in the cards from the get-go. From the horse's mouth: "In this position, Koduri will expand Intel’s leading position in integrated graphics for the PC market with high-end discrete graphics solutions for a broad range of computing segments." You can parse "broad range of computing segments" several ways but "gaming chip" absolutely fits. To get something out the door "quickly" Koduri may be repurposing IP that was already in development for other use cases but that's a question of how, not whether, Intel is going to try and break into the market.
|
# ? Apr 12, 2018 19:52 |
|
I didn't think they'd make a discrete card, only betterer integrated graphics.
|
# ? Apr 12, 2018 20:39 |
|
GRINDCORE MEGGIDO posted:I didn't think they'd make a discrete card, only betterer integrated graphics. If they could have made betterer integrated graphics I'd have thought they'd do so by now, but alas it's all sucked.
|
# ? Apr 12, 2018 20:42 |
|
Yeah I mean I agree with the idea that I shouldn't be surprised, but it's just that I'm not used to the idea. They kinda just stopped caring about iGPUs suddenly and that was basically my whole exposure to Intel and GPUs. I will find it extremely funny if their first GPUs are actually good or great... under Raja after being poached.
|
# ? Apr 12, 2018 20:44 |
|
It's win all around. If they suck utterly and AMD's next gen is good, it's maybe even funnier. I'm holding out for an exquisite shroud design.
|
# ? Apr 12, 2018 20:49 |
|
Don't they at least have a legit better die process vs. Nvidia?
|
# ? Apr 12, 2018 20:52 |
|
1070 for $430, in store only. Best price on a 1070 I've seen in a while if you really need a graphics card now. https://www.bestbuy.com/site/nvidia-founders-edition-geforce-gtx-1070-8gb-gddr5-pci-express-3-0-graphics-card-black/5330700.p
|
# ? Apr 12, 2018 20:56 |
|
Cygni posted:1070 for $430, in store only. Best price on a 1070 I've seen in a while if you really need a graphics card now. I see the 1080 is down to $589.99 at BestBuy too and available for purchase online if, like me, there's no nearby store that has 1070s available. https://www.bestbuy.com/site/nvidia-founders-edition-geforce-gtx-1080-8gb-gddr5x-pci-express-3-0-graphics-card-black/5330600.p?skuId=5330600
|
# ? Apr 12, 2018 21:06 |
|
1gnoirents posted:Is this news to anybody here? I could have sworn the Intel GPU was like 100% confirmed not for gaming and probably stopped thinking about it but Looks more like a Just For Men ad RAJA, YOUR STACHE IS TRASH
|
# ? Apr 12, 2018 21:15 |
|
Furmark will test the GPU, Prime95 will test the CPU. Neither are great as stability tests because they test a limited set of functions, but they'll test the cooling and VRMs the hardest. The best way is using your finger and tapping on the center of the rotor as already stated though.
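If you just want enough CPU-only load to spin the fan up without installing anything, a few lines of Python will do it. A rough sketch, not a real stress test — it won't hammer the AVX units or VRMs the way Prime95 does, but it will peg every core:

```python
# Rough CPU-only load generator: busy-spins every logical core for a
# fixed number of seconds. Enough to ramp the CPU fan; it is NOT a
# substitute for Prime95's torture tests.
import multiprocessing
import time

def burn(seconds):
    """Busy-spin for roughly `seconds` of wall-clock time."""
    deadline = time.monotonic() + seconds
    x = 0
    while time.monotonic() < deadline:
        x += 1  # pointless work to keep the core pegged
    return x

def load_all_cores(seconds=60.0):
    """Run one spinning worker per logical core, then wait for them all."""
    workers = [multiprocessing.Process(target=burn, args=(seconds,))
               for _ in range(multiprocessing.cpu_count())]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
```

Call `load_all_cores(60)` from a Python prompt and listen for which fan ramps; the GPU stays idle the whole time, so any change in noise points at the CPU cooler.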
|
# ? Apr 12, 2018 21:17 |
|
We had to buy a new plain-rear end monitor yesterday, got a 27” Freesync monitor that isn’t used for gaming and will mostly be hooked up to a MacBook, but finally being able to sit comfortably at a 27” monitor and consider its size makes me want to buy a Gsync monitor and extend the viable lifetime of my 1070, since I probably won’t be able to afford any next-gen cards that outperform it. But my heart is torn because the PS4 Pro doesn’t do 1440, which doesn’t bode well for future consoles, and I don’t want to have to have two displays for two different gaming “experiences” ever again. It’s how I wound up using a well-made 40” TV as a PC display to start with.
|
# ? Apr 12, 2018 21:24 |
|
Is the 1050 Ti still the best low profile gaming card there is? Is it discontinued tho?
|
# ? Apr 12, 2018 21:27 |
|
canyoneer posted:The market has also grown like crazy because of the stupid crypto bubble. I think you meant to say AI
|
# ? Apr 12, 2018 21:37 |
|
Shaocaholica posted:Is the 1050 Ti still the best low profile gaming card there is? Is it discontinued tho? Naw, it's not discontinued. Low profile 1050 Ti's are still on Newegg, although they are $220. You can get a regular 1050 or RX560 low profile for $160 though. I thought someone had shown a low profile 1060 3GB, but I've never seen it on the market.
|
# ? Apr 12, 2018 21:46 |
|
Cygni posted:Naw, its not discontinued. Low profile 1050 Ti's are still on newegg, although they are $220. You can get a regular 1050 or RX560 low profile for $160 though. Thanks. What's with all these high end low profile cards on ebay that have been used for mining? Why mine with low profile cards?
|
# ? Apr 12, 2018 21:52 |
|
BOOTY-ADE posted: Should he rebrand it, or just wear a shroud?
|
# ? Apr 12, 2018 22:43 |
|
Zero VGS posted:Don't they at least have a legit better die process vs. Nvidia? It's probably safe to say that that lead will be down to its last embers soon. Intel's process lead has dramatically shrunk over the years. Until 10nm Cannon Lake officially drops later this year, they will have been on the 14nm node, continually "refining", for almost four years. And they will continue to be on 10nm through the rest of their tock-tick-hitch cycle. In that time, TSMC and GloFo have moved AMD and Nvidia (300-series and Maxwell at the time) from 28nm to the 14nm they're on now, and TSMC, GloFo, and Samsung are all targeting 7nm in 2019/2020 as their next step, bypassing the 12nm/10nm nodes entirely. SwissArmyDruid fucked around with this message at 22:46 on Apr 12, 2018 |
# ? Apr 12, 2018 22:44 |
|
Why is DVI still around???
|
# ? Apr 12, 2018 22:55 |
|
It will hang around longer than HDMI. More bandwidth and the same protocol. HDMI was made to solve the "problem" that they thought people were scared of hooking up multiple wires. HDMI is limited to 1920x1200 at 60Hz, while DVI does 2560x1440@120Hz.
|
# ? Apr 12, 2018 22:58 |
|
craig588 posted:HDMI is limited to 1920x1200 at 60Hz Lol wut quote:Version 2.1
|
# ? Apr 12, 2018 23:02 |
|
Oh, not bad. For a LONG time it was a more limited version of DVI that made no sense. Edit: Now the main reason they don't have new videocards with new HDMI standards is probably because there haven't been any new videocards since the standard was ratified. craig588 fucked around with this message at 23:09 on Apr 12, 2018 |
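To put numbers on why 2.1 is such a jump: the specs' headline link rates are 10.2 Gbps for HDMI 1.3/1.4, 18 Gbps for 2.0, and 48 Gbps for 2.1, and after encoding overhead you can sanity-check which modes fit. A back-of-envelope sketch — the ~12% blanking figure and 8-bit RGB (24 bpp) are my rough assumptions, not spec timings:

```python
# Back-of-envelope check of video modes against HDMI link budgets.
# Headline link rates come from the HDMI specs; the blanking overhead
# (~12%) and 24 bpp (8-bit RGB) figures are rough assumptions.
HDMI_MAX_GBPS = {"1.4": 10.2, "2.0": 18.0, "2.1": 48.0}
# Usable payload fraction: 8b/10b TMDS coding for 1.4/2.0, 16b/18b FRL for 2.1
EFFICIENCY = {"1.4": 0.8, "2.0": 0.8, "2.1": 16 / 18}

def mode_gbps(width, height, refresh_hz, bpp=24, blanking=1.12):
    """Approximate data rate a video mode needs, in Gbps."""
    return width * height * refresh_hz * bpp * blanking / 1e9

def fits(version, width, height, refresh_hz):
    """True if the mode fits within that HDMI version's payload budget."""
    budget = HDMI_MAX_GBPS[version] * EFFICIENCY[version]
    return mode_gbps(width, height, refresh_hz) <= budget
```

By this rough math, 2560x1440@120 doesn't fit in 1.4's budget but does in 2.0's, and 4K@120 only fits under 2.1 — which squares with HDMI having been the more limited connector for so long.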
# ? Apr 12, 2018 23:05 |
|
SwissArmyDruid posted:It's probably safe to say that that lead is about to down to its last embers soon. As always, big asterisks on the "x nm" claims, because those are essentially marketing terms now and pretty far divorced from actual measurements. I wonder how each company's process yields will shake out, especially since Intel and TSMC have very different strategic approaches to yield and process health during the product lifecycle.
|
# ? Apr 12, 2018 23:08 |
|
craig588 posted:DVI does 2560x1440@120Hz. Technically this isn't right either, the DL-DVI spec maxes out at 2560x1600@60hz. Some DVI implementations can push the pixel clock much higher than the spec requires but it's never guaranteed.
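The arithmetic behind that, for anyone curious: dual-link DVI gives a 330 MHz maximum pixel clock (2 TMDS links x 165 MHz), and a mode needs roughly width x height x refresh, plus blanking. A quick sketch — the flat ~12% blanking overhead is my approximation of reduced-blanking timings, not the actual CVT-RB formula:

```python
# Sanity-check video modes against the dual-link DVI spec limit.
# 2 TMDS links x 165 MHz = 330 MHz max pixel clock per the DVI spec;
# the ~12% blanking overhead is a rough stand-in for CVT-RB timings.
DL_DVI_MAX_HZ = 330e6
BLANKING = 1.12  # assumed overhead, not the real CVT-RB formula

def pixel_clock_hz(width, height, refresh_hz):
    """Approximate pixel clock a mode needs, including blanking."""
    return width * height * refresh_hz * BLANKING

def in_dl_dvi_spec(width, height, refresh_hz):
    """True if the mode stays within the dual-link DVI pixel clock limit."""
    return pixel_clock_hz(width, height, refresh_hz) <= DL_DVI_MAX_HZ
```

By this estimate 2560x1600@60 lands around 275 MHz and is in spec, while 2560x1440@120 needs roughly 495 MHz — well past 330 MHz, so running it over DVI depends on the card and monitor tolerating an out-of-spec pixel clock, exactly as described above.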
|
# ? Apr 12, 2018 23:11 |
|
I can't think of anything still modern that would require a DVI connection other than maybe some really obscure scientific or industrial display.
|
# ? Apr 12, 2018 23:14 |
|
I'm on all Display Port now, but that was my old monitor.
|
# ? Apr 12, 2018 23:25 |
|
Will Nvidia's next generation of gaming cards (Volta? Ampere?) have HDMI 2.1 and variable refresh support? I could see their new gaming cards not supporting it, or at least not the variable refresh part, just so Nvidia can keep trying to push G-Sync.
|
# ? Apr 12, 2018 23:25 |
|
Whoa I didn't even know variable refresh was an HDMI thing. Is it like free sync?
|
# ? Apr 12, 2018 23:28 |
|
spasticColon posted:Will Nvidia's next generation of gaming cards (Volta? Ampere?) have HDMI 2.1 and variable refresh support? I could see their new gaming cards not support it or at least not support variable refresh just so Nvidia can keep trying to push G-Sync. The one Volta chip they've released (Titan V, etc) still only does HDMI 2.0. We know literally nothing about the upcoming chips. Shaocaholica posted:Whoa I didn't even know variable refresh was an HDMI thing. Is it like free sync? It's part of HDMI 2.1 which nothing supports yet, but yeah it's equivalent to Freesync.
|
# ? Apr 12, 2018 23:30 |
|
So is GSync dead or dying?
|
# ? Apr 12, 2018 23:31 |
|
G-Sync isn't going anywhere until AMD gets their poo poo together or Intel releases competitive dGPUs. Nvidia knows they're the only game in town for enthusiast GPUs so they're under no pressure to support Freesync or HDMI 2.1 VRR.
|
# ? Apr 12, 2018 23:33 |
|
It seems like Vega GPUs are priced significantly higher than comparable Nvidia offerings. Are they better at mining or something?
|
# ? Apr 12, 2018 23:39 |
|
They're significantly worse at mining and gaming, but they made a lot less of them so the prices are higher.
|
# ? Apr 12, 2018 23:42 |
|
repiv posted:G-Sync isn't going anywhere until AMD gets their poo poo together or Intel releases competitive dGPUs. Ah that makes sense. But if they don't support VFR doesn't that mean they don't get the '2.1' seal of approval from HDMI consortium or does no one give a poo poo about that?
|
# ? Apr 12, 2018 23:44 |
|
Shaocaholica posted:Ah that makes sense. But if they don't support VFR doesn't that mean they don't get the '2.1' seal of approval from HDMI consortium or does no one give a poo poo about that? I could certainly be mistaken, but last time this came up it seems like variable refresh was an optional part of the spec.
|
# ? Apr 12, 2018 23:48 |
|
Shaocaholica posted:Ah that makes sense. But if they don't support VFR doesn't that mean they don't get the '2.1' seal of approval from HDMI consortium or does no one give a poo poo about that? No one gives a poo poo, people have this wacky fantasy like DisplayPort or HDMI can "force" Nvidia to support FreeSync and it's never gonna happen unless Nvidia has a reason to concede.
|
# ? Apr 12, 2018 23:49 |