ZobarStyl
Oct 24, 2005

This isn't a war, it's a moider.

EdEddnEddy posted:

I have been wanting to do something similar. One of these days I need to give it a go and see if I can get it working in any good way.
I feel like this is a perfect usage scenario for a Threadripper setup, with each VM getting 4 cores/8 threads and a GPU. I know it's not really any more feasible than just building four cheaper systems, but there's something elegant about it that draws me in.

EdEddnEddy
Apr 5, 2012



ZobarStyl posted:

I feel like this is a perfect usage scenario for a Threadripper setup, with each VM getting 4 cores/8 threads and a GPU. I know it's not really any more feasible than just building four cheaper systems, but there's something elegant about it that draws me in.

I want to do exactly this for a simple drone-piloting training sim that we currently run on vastly underpowered AIOs. The part I worry about, after the GPU passthrough itself, is the little remote-controller check that the app uses as DRM. I suppose as long as the VM passes the USB device through without issue it shouldn't be a problem.
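If it's QEMU/KVM with libvirt, the dongle check usually just needs a USB hostdev entry so the guest talks to the controller directly; a minimal sketch, with placeholder vendor/product IDs (the real ones come from `lsusb`):

```xml
<!-- In the guest's libvirt domain XML, inside <devices>. -->
<!-- The IDs below are placeholders; substitute the values lsusb reports. -->
<hostdev mode='subsystem' type='usb' managed='yes'>
  <source>
    <vendor id='0x1234'/>
    <product id='0x5678'/>
  </source>
</hostdev>
```

Passing the device by vendor/product ID like this should also survive replugging into a different port better than pinning a specific bus/device address.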

Lord Ludikrous
Jun 7, 2008

Enjoy your tea...

Actually, another question I meant to ask: if prices were to return to some form of sanity within the coming months, would I suffer from bottlenecking if I were to chuck a 6GB GTX 1060 (or whatever new equivalent is coming out) into my machine?

Current specs:

Intel i5-4670K 3.5GHz (overclocked to 4.0GHz)
16GB DDR3-1600
Asus Z87-C motherboard
Gigabyte Windforce OC GTX 760 2GB

Back when I was much more enthusiastic about PC hardware, a CPU almost 5 years old was basically put out to pasture come upgrade time, but things seem to have slowed fairly drastically in recent years. If I could get away with spending £250ish to keep me going for a few years, I'd prefer that over a full upgrade.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

Paul MaudDib posted:

Not a scam, that's just what they go for. Maybe a bit under the market for a quick sale, but ballpark correct.

A bit under market is a hell of a lot better than a bit over. Being slightly more expensive than everyone else, or even the same price, means your card could take anywhere from weeks to forever to sell as people keep undercutting you.

A couple of my cards sat for a day or two at market price, so I lowered them $50 and that did the trick. Glad I did, because they're all at least that much cheaper now, a week later.

Lord Ludikrous posted:

Intel i5-4670K 3.5GHz (overclocked to 4.0GHz)

No card is going to be bottlenecked by that CPU in gaming; it's probably still in the top 10% of what game devs are targeting for clocks and cores, and IPC hasn't gone up much since then.

Zero VGS fucked around with this message at 23:51 on Feb 12, 2018

Rexxed
May 1, 2010

Dis is amazing!
I gotta try dis!

Lord Ludikrous posted:

Actually, another question I meant to ask: if prices were to return to some form of sanity within the coming months, would I suffer from bottlenecking if I were to chuck a 6GB GTX 1060 (or whatever new equivalent is coming out) into my machine?

Current specs:

Intel i5-4670K 3.5GHz (overclocked to 4.0GHz)
16GB DDR3-1600
Asus Z87-C motherboard
Gigabyte Windforce OC GTX 760 2GB

Back when I was much more enthusiastic about PC hardware, a CPU almost 5 years old was basically put out to pasture come upgrade time, but things seem to have slowed fairly drastically in recent years. If I could get away with spending £250ish to keep me going for a few years, I'd prefer that over a full upgrade.

I've got a 4670K and I just put a GTX 1080 in it. It's helped a lot driving my 1440 display. I don't think you'll have any issues unless you find something very heavily multi-threaded to run.

Yaoi Gagarin
Feb 20, 2014

Zero VGS posted:

A bit under market is a hell of a lot better than a bit over. Being slightly more expensive than everyone else, or even the same price, means your card could take anywhere from weeks to forever to sell as people keep undercutting you.

A couple of my cards sat for a day or two at market price, so I lowered them $50 and that did the trick. Glad I did, because they're all at least that much cheaper now, a week later.


No card is going to be bottlenecked by that CPU in gaming; it's probably still in the top 10% of what game devs are targeting for clocks and cores, and IPC hasn't gone up much since then.

I don't think this is entirely true. In Battlefield 1 I'm pretty sure my CPU is bottlenecking my 1080 Ti at 1440p, and I have a 4790K.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

VostokProgram posted:

I don't think this is entirely true. In Battlefield 1 I'm pretty sure my CPU is bottlenecking my 1080 Ti at 1440p, and I have a 4790K.

There are some games so terribly programmed that there is just nothing you can do to wring acceptable performance out of them. That's one of them.

Yaoi Gagarin
Feb 20, 2014

DrDork posted:

There are some games so terribly programmed that there is just nothing you can do to wring acceptable performance out of them. That's one of them.

I mean, I'm getting well above 60 almost all the time, so it's not like it's bad; just pointing out that it can still happen.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

VostokProgram posted:

I mean, I'm getting well above 60 almost all the time, so it's not like it's bad; just pointing out that it can still happen.

Sure, but even an 8700K isn't going to make that big an improvement in these cases, is what I'm saying. I have a SiliconLottery.com-binned 5.2GHz chip and it still can't make, say, Arma run smoothly.

Truga
May 4, 2014
Lipstick Apathy
The biggest bottleneck in Arma is RAM bandwidth, though.
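Easy enough to get a feel for what your own memory subsystem does; a toy sketch in Python/NumPy (nothing like a proper STREAM benchmark, and the number depends heavily on channel count and DRAM clocks):

```python
import time
import numpy as np

def copy_bandwidth_gbps(n_bytes=256 * 1024 * 1024, repeats=5):
    """Time large array copies and report effective GB/s (read + write)."""
    src = np.ones(n_bytes // 8, dtype=np.float64)  # n_bytes of data
    dst = np.empty_like(src)
    best = float("inf")
    for _ in range(repeats):
        t0 = time.perf_counter()
        np.copyto(dst, src)
        best = min(best, time.perf_counter() - t0)
    # each copy reads n_bytes and writes n_bytes
    return (2 * n_bytes) / best / 1e9

if __name__ == "__main__":
    print(f"~{copy_bandwidth_gbps():.1f} GB/s effective copy bandwidth")
```

Quad-channel DDR4 rigs should report a figure several times higher than dual-channel DDR3, which is roughly the gap being debated here.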

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

Truga posted:

The biggest bottleneck in Arma is RAM bandwidth, though.

Well whatever, replace with anything Ubisoft.

fletcher
Jun 27, 2003

ken park is my favorite movie

Cybernetic Crumb

VostokProgram posted:

I don't think this is entirely true. In Battlefield 1 I'm pretty sure my CPU is bottlenecking my 1080 Ti at 1440p, and I have a 4790K.

I upgraded my 4690k to a 7700k after getting a 1080, because it was so bottlenecked in Battlefield 1 at 1080p. It made such a huge difference upgrading the processor.

I think it may have been specific to Battlefield 1, though; I'm not exactly sure, since it was the only game I was playing at the time. For some reason performance was trash on the 4690K after the retail release (it was fine in the beta). There were tons of threads on Reddit about it; everybody thought it would be fixed in a patch at some point. Guess not.

buglord
Jul 31, 2010

Cheating at a raffle? I sentence you to 1 year in jail! No! Two years! Three! Four! Five years! Ah! Ah! Ah! Ah!

Buglord
Battlefield 1 performance issues are new to me. For all of EA's faults, their big budget games tend to perform well in that category.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
I wonder if BF1 really likes RAM bandwidth. I run my 5820K with both hands tied behind its back (3.75 GHz, mining in the background with 6 threads of xmr-stak-cpu consuming 50% of the CPU and 12/15 MB of my L3$ with AVX2) and I still crank a very playable framerate (80-90fps) at 3440x1440 on my 1080. That's Haswell, so it wouldn't be that, the other difference that comes to mind is quad-channel DDR4-3000 vs dual-channel DDR3. Certainly wouldn't be L3$ or clocks making the difference, and I have less MT performance available than a 4790K.

The game logic is trash but it's always performed fine for me. I'd consider it to be pretty close to Doom performance-wise - I can run that trick on Doom as well, or Source Engine titles (of course).

Paul MaudDib fucked around with this message at 07:36 on Feb 13, 2018

MagusDraco
Nov 11, 2011

even speedwagon was trolled
Meanwhile, back before I quit (shortly after Giant's Shadow), I would sometimes average 30fps depending on the map. I never consistently hit 60, that's for sure.

i5-3550 and 16GB of DDR3 at 2200MHz, running the game at 2560x1440 on a 1070 with a stock overclock.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
That doesn't necessarily say anything, were you GPU-bound or CPU-bound?

I'd really think that even a locked Ivy i5 would be able to do close to 60 fps if it was CPU-bound.

Dude Sweet
Jul 26, 2010
BF1 has a console-command graph that shows performance (not sure what metric it is; maybe time spent waiting for data?) for CPU and GPU as yellow and green lines respectively.
My GPU will generally hover around a certain level, but the CPU line moves up and down a lot more, spiking way above the top of the graph at least once or twice every ten seconds, and it just runs higher on average.

Depending on the map and weather effects I can see stable FPS in the 70-90 range, or a stuttering mess below 40 where my aim itself seems to stutter as I look around. Changing between Ultra and low/medium doesn't make a difference when it's slow.
It's not always the same map/weather combinations producing poor performance either; I've had the same conditions produce good and bad performance on different days.
Patches to BF1 and different GPU drivers do seem to have some impact as well; one day I was putting up with 40-50fps and stuttering aim, and the next day, after a patch, it was a rock-solid 90fps.
I've got a 4670 non-K with an RX 480 8GB Nitro+ and 16GB of DDR3 at 1333MHz, running at 1440p.
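For anyone else wanting that overlay: if I remember right it's Frostbite's PerfOverlay, toggled from the in-game console (tilde key) or a user.cfg in the game folder, and I believe the graph lines are frame time per subsystem, so whichever line sits higher is your bottleneck. The commands below are from memory, so treat them as unverified:

```
PerfOverlay.DrawFps 1
PerfOverlay.DrawGraph 1
```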

MagusDraco
Nov 11, 2011

even speedwagon was trolled

Paul MaudDib posted:

That doesn't necessarily say anything, were you GPU-bound or CPU-bound?

I'd really think that even a locked Ivy i5 would be able to do close to 60 fps if it was CPU-bound.

DX11 would have CPU utilization at 100% the entire time.

DX12 would not, but it had its own problems with stuttering.

Beyond that I didn't check (and didn't know how to check) whether I was CPU- or GPU-bound. Can't really check now, since the game isn't installed and I don't have the SSD space for it.

MagusDraco fucked around with this message at 14:35 on Feb 13, 2018
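For next time, a rough way to answer the CPU-vs-GPU-bound question without extra tools is to sample overall CPU busy time from /proc/stat while the game runs, next to GPU load from nvidia-smi. A Linux-only sketch; it assumes `nvidia-smi` is on PATH (on Windows you'd lean on Task Manager or Afterburner instead):

```python
import subprocess

def cpu_busy_fraction(before, after):
    """Fraction of time the CPU was busy between two (busy, total) jiffy samples."""
    d_busy = after[0] - before[0]
    d_total = after[1] - before[1]
    return d_busy / d_total if d_total else 0.0

def read_cpu_jiffies(path="/proc/stat"):
    """Return (busy, total) jiffies from the aggregate 'cpu' line (Linux only)."""
    with open(path) as f:
        fields = [int(x) for x in f.readline().split()[1:]]
    idle = fields[3] + fields[4]  # idle + iowait
    total = sum(fields)
    return total - idle, total

def gpu_utilization_percent():
    """GPU load via nvidia-smi (assumes an NVIDIA GPU and nvidia-smi on PATH)."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    return float(out.strip().splitlines()[0])
```

Call `read_cpu_jiffies()` twice, a second apart, and feed both samples to `cpu_busy_fraction()`; CPU pegged near 100% while `gpu_utilization_percent()` sits well below it suggests CPU-bound, and vice versa.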

repiv
Aug 13, 2009

Reuters: The new [Nvidia] gaming chip, code named Turing, is expected to be unveiled next month.

:thunk:

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
Might be legit; an ARM processor startup run by some ex-Intel employees, called Ampere Computing, exited stealth mode a few weeks ago, and I've been wondering how the trademark thing would play out.

repiv
Aug 13, 2009

That's true, although Turing is an awkward choice since they already did Tesla and named a bunch of chips using the GTxxx pattern.

Enos Cabell
Nov 3, 2004


Smirk posted:

Try this: in the settings menu of the monitor itself, in the Other section, turn Power Saving off.

Holy crap, thanks for this! I can now turn both displays off, let them sleep, whatever and it will always come back up with the windows in the correct position.

axeil
Feb 14, 2006
Hey, question for you folks: I've been reading around and it seems like it's possible to flash an RX 480 to an RX 580.

Is this a crazy idea, or is it something people have done safely? I remember the days when you could turn a Radeon 9600 into a Radeon 9800 fairly easily.

This is the exact model of card I have. It already has a stock overclock to 1328MHz.

https://www.techpowerup.com/gpudb/b3684/xfx-rx-480-black-edition

Someone on techpowerup reported doing it successfully:

https://www.techpowerup.com/forums/threads/xfx-rx480-8gb-flashed-to-saphire-rx580.232489/post-3643576

and this guy on overclock.net walks through hex editing the bios:

http://www.overclock.net/forum/67-amd-ati/1634872-bios-mod-rx480-rx580-conversions-how-rx470-rx480-rx570-rx580.html

quote:

I want to share my results, too. :) I flashed my XFX RX 480 "Custom Backplate Edition 1328 MHz" (RX-480M8BBA6) with this PowerColor RX 580 Red Devil BIOS. All Ports are working fine. With the Sapphire BIOS, one DisplayPort was dead. With the PowerColor it works like a charm. Superposition runs through but Mass Effect 4 dies right after the loading screen. I'll look into it, and I think I'll figure out some stable settings. :)

All in all it's a success I would say.

edit: It seems like a crazy idea just to get 5-10% more performance; I mostly wanted folks here to let me know whether my initial hunch is correct.

axeil fucked around with this message at 16:50 on Feb 13, 2018
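One detail from those hex-editing guides worth automating: after any byte edit you have to fix the ATOM VBIOS checksum or the flashing tools will reject the image. A hedged sketch, assuming the conventional layout (byte 0x02 holds the image size in 512-byte blocks, byte 0x21 is the checksum, and all bytes of the image must sum to 0 mod 256); double-check those offsets against your own dump before trusting it:

```python
def fix_atom_checksum(vbios: bytearray) -> bytearray:
    """Recompute the ATOM VBIOS checksum byte after a hex edit.

    Assumes the conventional layout: byte 0x02 holds the image length in
    512-byte blocks, and byte 0x21 is chosen so all image bytes sum to 0 mod 256.
    """
    size = vbios[0x02] * 512
    image = vbios[:size]
    # sum of every byte in the image except the checksum byte itself
    partial = (sum(image) - image[0x21]) & 0xFF
    vbios[0x21] = (-partial) & 0xFF
    return vbios
```

Even with a correct checksum, cross-flashing a mismatched VBIOS can still brick the card, so keep a dump of the original and a second GPU (or iGPU) handy for recovery.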

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
You need a VBIOS for a card with the exact same hardware configuration, including the VRM setup, so only a few cards can do it. As such it's functionally identical to overclocking it manually and/or playing with the memory timings. The only actual new thing in the 500 series was adding support for an intermediate P-state for media playback between idle and gaming clocks, so you can save 10-15W while watching a movie.

mcbexx
Jul 4, 2004

British dentistry is
not on trial here!




quote:

“We do think that cryptocurrency has been a very significant factor in both revenue and pricing, creating a shortage environment that is boosting pricing,” Morgan Stanley analyst Joseph Moore said in a note after the results.

This man's a genius. Give him a raise. Wow. Such insight. So smart.

axeil
Feb 14, 2006

Paul MaudDib posted:

You need a VBIOS for a card with the exact same hardware configuration, including the VRM setup, so only a few cards can do it. As such it's functionally identical to overclocking it manually and/or playing with the memory timings. The only actual new thing in the 500 series was adding support for an intermediate P-state for media playback between idle and gaming clocks, so you can save 10-15W while watching a movie.

Ah, thanks for this. Sounds like I can get equivalent performance, without risking bricking my card, by just playing around with WattMan?

I had posted about a month earlier about trying some settings, but they turned out to be unstable and the card crashed and reverted to defaults, and I haven't had time to try again.

Star Man
Jun 1, 2008

There's a star maaaaaan
Over the rainbow
I am willing to pay $400 for an Asus Strix 6GB GTX 1060.

Is that good enough for a 2K display and 60fps on medium settings, or should I sell my sister's kidneys for a 1080 instead?

Overminty
Mar 16, 2010

You may wonder what I am doing while reading your posts..

Star Man posted:

I am willing to pay $400 for an Asus Strix 6GB GTX 1060.

Is that good enough for a 2K display and 60fps on medium settings, or should I sell my sister's kidneys for a 1080 instead?

I run an RX 480 on a 2K high-refresh monitor, and I'd say yes, you're probably fine. High settings will probably be fine for a lot of games too.

Star Man
Jun 1, 2008

There's a star maaaaaan
Over the rainbow
I don't have a high-refresh monitor, nor does it have G-Sync.

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

Star Man posted:

I am willing to pay $400 for an Asus Strix 6GB GTX 1060.

Is that good enough for a 2K display and 60fps on medium settings, or should I sell my sister's kidneys for a 1080 instead?

You'll probably be fine if 60 is your target, though a 1070 would be a better choice. My brother has basically your exact setup, though at 1080p, and usually gets into the 80s on high settings.

Funkysock
Aug 8, 2011


Clapping Larry

That would explain Steve Burke's comments in one of the GamersNexus videos a few days ago. He seemed to have information about upcoming Nvidia products not named Ampere that he couldn't share yet.

https://youtu.be/9QG5hvy24RU
This one

Funkysock fucked around with this message at 19:34 on Feb 13, 2018

Kramjacks
Jul 5, 2007

Funkysock posted:

That would explain Steve Burke's comments on one of the Gamersnexus videos a few days ago. He seemed to have information about upcoming Nvidia products not named Ampere that he couldn't share yet.

https://youtu.be/9QG5hvy24RU
This one

Yeah, he's said a few times, in his typical kinda-sarcastic way, that the next Nvidia consumer cards might not be named what people thought they'd be.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

Finally, a graphics card as gay as me.

Mister Facetious
Apr 21, 2007

I think I died and woke up in L.A.,
I don't know how I wound up in this place...

:canada:

repiv posted:

The new [Nvidia] gaming chip, code named Turing, is expected to be unveiled next month.

This is it; the year AI becomes a thing and kills us all. :tinfoil:

PerrineClostermann
Dec 15, 2012

by FactsAreUseless
Joke's on us, Turing is an architecture optimized for deep learning and computation, not video games.

AMD was ahead of the game!

redeyes
Sep 14, 2002

by Fluffdaddy
They named the compute card Ampere and the gaming card Turing. Makes perfect sense.

redeyes fucked around with this message at 20:35 on Feb 13, 2018

Anime Schoolgirl
Nov 28, 2002

Mister Facetious posted:

This is it; the year AI becomes a thing and kills us all. :tinfoil:
about time

Rastor
Jun 2, 2001

Here's the D&D thread about AI becoming a thing and killing us all, for those who want to discuss:

https://forums.somethingawful.com/showthread.php?threadid=3800017

PerrineClostermann
Dec 15, 2012

by FactsAreUseless

Rastor posted:

Here's the D&D thread about AI becoming a thing and killing us all, for those who want to discuss:

https://forums.somethingawful.com/showthread.php?threadid=3800017

That would require going into D&D.

Anime Schoolgirl
Nov 28, 2002

lol if you read D&D and not C-SPAM
