Malcolm XML
Aug 8, 2009

I always knew it would end like this.

The_Franz posted:

OpenGL has actually had things like multi-draw-indirect for a few years now, which lets you concatenate thousands or even tens of thousands of draw calls into one. People are only just noticing because Valve has started dragging the entire industry away from its Windows-centric viewpoint.

What will be really nice is when all of the vendors finally support ARB_BINDLESS and concepts like texture units and texture array limitations become a distant bad memory.
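For reference, the whole point of multi-draw-indirect is that all the per-draw parameters live in a GPU buffer and a single call submits the lot. A minimal GL 4.3 sketch, assuming a loader like glad provides the entry points and a VAO plus index buffer are already bound:

code:
#include <glad/glad.h>  /* or any loader exposing GL 4.3 entry points */

/* Command layout defined by the GL spec (DrawElementsIndirectCommand). */
typedef struct {
    GLuint count;          /* indices per draw         */
    GLuint instanceCount;  /* instances per draw       */
    GLuint firstIndex;     /* offset into index buffer */
    GLuint baseVertex;     /* added to each index      */
    GLuint baseInstance;   /* first instance ID        */
} DrawElementsIndirectCommand;

/* One API call issues every draw described in indirect_buf. */
static void submit_batch(GLuint indirect_buf, GLsizei draw_count)
{
    glBindBuffer(GL_DRAW_INDIRECT_BUFFER, indirect_buf);
    glMultiDrawElementsIndirect(GL_TRIANGLES, GL_UNSIGNED_INT,
                                (const void *)0, /* offset into the buffer */
                                draw_count,
                                0);              /* 0 = tightly packed     */
}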

It's a lot less Valve and a lot more iPhone that's been the drive toward OpenGL.

The issue is that ensuring extensions are supported on all of your client platforms is a complete pain. DirectX at least lets you know that if you program to DX11, you get all of DX11's features.

I suspect we'll see implementations where one is optimized for XB1/DX and one for whatever PS4 runs (Mantle?) before we see people explicitly optimizing for OpenGL on the desktop.
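The extension-availability check being complained about here is boilerplate every GL app ends up carrying; a sketch of the core-profile (GL 3.0+) version, again assuming loaded entry points:

code:
#include <glad/glad.h>
#include <string.h>

/* Returns 1 if the driver advertises 'name',
   e.g. "GL_ARB_multi_draw_indirect". */
static int has_extension(const char *name)
{
    GLint n = 0;
    glGetIntegerv(GL_NUM_EXTENSIONS, &n);
    for (GLint i = 0; i < n; i++) {
        const char *ext = (const char *)glGetStringi(GL_EXTENSIONS, (GLuint)i);
        if (ext && strcmp(ext, name) == 0)
            return 1;
    }
    return 0; /* not supported here: fall back or bail */
}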


Nephilm
Jun 11, 2009

by Lowtax

Malcolm XML posted:

It's a lot less Valve and a lot more iPhone that's been the drive toward OpenGL.

The issue is that ensuring extensions are supported on all of your client platforms is a complete pain. DirectX at least lets you know that if you program to DX11, you get all of DX11's features.

I suspect we'll see implementations where one is optimized for XB1/DX and one for whatever PS4 runs (Mantle?) before we see people explicitly optimizing for OpenGL on the desktop.

PS4 doesn't use Mantle; I think it uses a variant of OpenGL, actually.

The_Franz
Aug 8, 2003

Malcolm XML posted:

It's a lot less Valve and a lot more iPhone that's been the drive toward OpenGL.

The issue is that ensuring extensions are supported on all of your client platforms is a complete pain. DirectX at least lets you know that if you program to DX11, you get all of DX11's features.

I suspect we'll see implementations where one is optimized for XB1/DX and one for whatever PS4 runs (Mantle?) before we see people explicitly optimizing for OpenGL on the desktop.

OpenGL and OpenGL ES on mobile devices are two different beasts. Most of the techniques and optimizations being discussed at recent developer conferences aren't applicable to ES. Fragmentation isn't really a problem as long as you pick a core spec and stick to it; it only becomes an issue if you really want to use some of the bleeding-edge or vendor-specific features.
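"Pick a core spec and stick to it" is literally a context-creation request; a minimal sketch using GLFW (any windowing library has an equivalent hint mechanism):

code:
#include <GLFW/glfw3.h>

/* Ask for exactly the core profile you coded against; the driver must
   return a conforming 4.3+ core context or fail the window creation. */
static GLFWwindow *make_gl43_core_window(void)
{
    if (!glfwInit())
        return NULL;
    glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 4);
    glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 3);
    glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
    return glfwCreateWindow(1280, 720, "GL 4.3 core", NULL, NULL);
}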

None of the consoles support Mantle and probably never will. The XB1 strictly uses D3D, and the PS4 has its own proprietary low-level API like the PS3 did, as well as an OpenGL implementation (and, I think, a D3D emulator on top of that) for more easily porting games that aren't bleeding-edge AAA titles. On the desktop, more and more companies are already starting to target modern OpenGL. Some slides during a presentation at Valve's dev conference gave away that Epic has been working with Nvidia to optimize the OpenGL backend in Unreal Engine 4, and both AMD and Nvidia have been improving OpenGL support in their profiling tools as well as porting them to Linux, so there is definitely some developer demand for it.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
4K seems to be on the horizon at this point. The ASUS PB287Q is going to do 4K @ 60Hz for $800 sometime in Q2. What kind of GPU are you going to need to drive something like that at reasonable (non-Ultra) settings?

ShaneB
Oct 22, 2002


Paul MaudDib posted:

4K seems to be on the horizon at this point. The ASUS PB287Q is going to do 4K @ 60Hz for $800 sometime in Q2. What kind of GPU are you going to need to drive something like that at reasonable (non-Ultra) settings?

YOU MOVED YOUR POST! :)

A very, very powerful one. It's literally 4x the pixels of 1080p (3840x2160 = 8,294,400 vs. 1920x1080 = 2,073,600). Stuff like a 290X doesn't even really make high-end games at 4K very playable: http://www.tomshardware.com/reviews/radeon-r9-290x-hawaii-review,3650-17.html

fletcher
Jun 27, 2003

ken park is my favorite movie

Cybernetic Crumb

Paul MaudDib posted:

4K seems to be on the horizon at this point. The ASUS PB287Q is going to do 4K @ 60Hz for $800 sometime in Q2. What kind of GPU are you going to need to drive something like that at reasonable (non-Ultra) settings?

Any chance we'll see G-Sync in these 4K displays? Would a 4GB 770 be able to handle gaming @ 4K?

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

fletcher posted:

Any chance we'll see G-Sync in these 4K displays? Would a 4GB 770 be able to handle gaming @ 4K?

As to the former, who knows, but the 30fps floor makes 4K an unlikely use case for any single card to handle well. G-Sync is ideal for framerates between 30fps and 144fps, but it ceases to function at 30fps and below (it reverts to traditional VSync). Given that these are 60Hz displays, a single card would spend a lot of time at very low framerates, without the smoothness that G-Sync delivers in the mid-40s FPS range in the demos we've seen so far.
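The 30fps floor is plain frame-time arithmetic: once frames arrive more than ~33ms apart, the panel has to refresh itself anyway, so the module drops back to fixed-rate behaviour. A quick check (just the math, not Nvidia's spec sheet):

code:
#include <stdio.h>

int main(void)
{
    double fps[] = { 30.0, 45.0, 60.0, 144.0 };
    for (int i = 0; i < 4; i++)
        printf("%6.0f fps -> %5.2f ms between refreshes\n",
               fps[i], 1000.0 / fps[i]);
    /* Past 33.3 ms (i.e. below 30 fps) the panel can't hold the image
       any longer, hence the fallback to traditional VSync. */
    return 0;
}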

I'd think two high-end cards would be the minimum for acceptable gaming quality at 4K. But we're also at a point where two high-end cards can actually make 4K pretty nice - check Anand's 2x780Ti vs. 2x290X GPU bench comparison, which helpfully includes some 4K tests; many games hit nicely playable average framerates. A resolution that high will bring out any AMD advantage very substantially because of the higher pixel throughput - anything strictly fill-rate bound will benefit from the substantial ROP advantage the 290Xs enjoy: 128 combined ROPs vs. less than 100 from a pair of nVidia's GK110s. But then, that's "last gen" hardware from nVidia still keeping up quite well. I'm not even going to try to gently caress with the value question, since prices are all over the place for AMD hardware. Ideally you'd see the R9 290 at its original street price - $400 stock, more for vendor-specific cooling - leading the price:performance charge, delivering nearly the 290X's performance in that comparison at about $880 for a pair with nice custom coolers. However, individual 290 and 290X cards are going for nearly a grand now, so that's all shot to poo poo.

Maybe a pair of 780s instead; they're not that far behind a 780Ti. Significantly less overclocking headroom, but overclocking is trickier when you're running multiple cards anyway, and it's probably even less of a good idea there than it is for general usage.

veedubfreak
Apr 2, 2005

by Smythe
FYI, my Eyefinity setup is a higher resolution than 4K and BF4 plays pretty drat nice at Ultra, but that requires 3 290X cards. So I'm guessing we're still 2-3 generations out from a single card maxing out a single 4K monitor at 60fps.

By which time, maybe 4K monitors will be affordable.

Ghostpilot
Jun 22, 2007

"As a rule, I never touch anything more sophisticated and delicate than myself."
Hmm, a 290x goes for that much? :raise: May have to put my unlocked, custom-cooled 290 on the market.

ShaneB
Oct 22, 2002


Ghostpilot posted:

Hmm, a 290x goes for that much? :raise: May have to put my unlocked, custom-cooled 290 on the market.

If you aren't mining, sell that thing and get an Nvidia equivalent. No one will really care about the custom cooler, though.

veedubfreak
Apr 2, 2005

by Smythe
They aren't really selling that well anymore, what with the collapse of MtGox.

ShaneB
Oct 22, 2002


veedubfreak posted:

They aren't really selling that well anymore, what with the collapse of MtGox.

BTC has rebounded by over $100 since that collapse. People haven't stopped mining.

Setzer Gabbiani
Oct 13, 2004

Ghostpilot posted:

Hmm, a 290x goes for that much? :raise: May have to put my unlocked, custom-cooled 290 on the market.

I tried this with my 7950, sporting new paste, VRM/RAM heatsinks, and fans, since they're still going for $400-ish on eBay, and no one went for it without a hilarious lowball in the process. A safari through the Reddit Scrypt boards revealed that since it's mostly Lite/Dogecoin retards buying these, no one wants ANYTHING custom that would invalidate the warranty, because they're all in an endless cycle of frying their VRMs, RMA'ing the dead card, frying the replacement's VRMs, repeat. The Radeon stock problem could probably be helped to an extent if manufacturers didn't cover self-destructive users who stress the gently caress out of their cards with inhuman amounts of load 24/7.

Straker
Nov 10, 2005
For like the fifth time: nobody important was mining bitcoins with GPUs anyway, so MtGox doesn't really have anything to do with anything.

The cards just aren't as inflated as they used to be, at least at the high end. It might still be worth flipping crap cards if you want a cheap upgrade and your card's still under warranty. I've been checking eBay occasionally out of curiosity... I certainly don't regret my pair of 290s; they're probably the best price:performance you can get for any kind of reasonably high-end setup (aside from 3 or more 290s :v:), but I'd still happily flip them and switch to 780Tis if it made sense. 290s only go for like $450 after fees, though, so it makes zero sense. 7950s are ~$350 before fees and 7970s a few bucks more, so it seems just barely worth the trouble for those, at least.

Straker fucked around with this message at 23:30 on Feb 26, 2014

SCheeseman
Apr 23, 2003

Straker posted:

For like the fifth time: nobody important was mining bitcoins with GPUs anyway, so MtGox doesn't really have anything to do with anything.

From what I understand, a lot of people who mined litecoins traded them for bitcoins and then traded those through MtGox and the other bitcoin exchanges.

ShaneB
Oct 22, 2002


SCheeseman posted:

From what I understand, a lot of people who mined litecoins traded them for bitcoins and then traded those through MtGox and the other bitcoin exchanges.

The entire cryptocoin market is tied to the exchanges. If anything happens to anything related to the exchanges, the entire cryptocoin world is shaken up and value falls.

Straker
Nov 10, 2005
People do that since you can't directly cash out many altcoins (though I think you can cash out litecoins), but you haven't been able to get money or bitcoins out of MtGox for months now. So legit miners who cash out frequently shouldn't have gotten hosed by Gox; only people speculating on or buying toxic Goxcoins rightfully got the gently caress burned out of them just now.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

veedubfreak posted:

By which time, maybe 4K monitors will be affordable.

That ASUS 28.5" 60hz 4K monitor should be hitting the streets within a couple months for $800. It's a lot of money, yeah, but people spend that much on a nice TV or a SLI graphics setup no problem.

And for non-gaming use, the prices are already there. You can pick up a 39" Seiki 4K TV (aka a large monitor) for $500 any day of the week; they dipped down to $400 a couple of weeks ago. The downside, of course, is that it's HDMI 1.4 and therefore limited to 30Hz.

I'm thinking about one of those 39" screens if they go on sale again at some point. That's a lot of working area, and 113ppi isn't awful. It seems like a no-brainer for computer folk who spend all day staring at a screen.
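The 30Hz ceiling is just pixel-clock math; a rough check using the standard 4K timings (HDMI 1.4's TMDS clock tops out at 340MHz):

code:
#include <stdio.h>

int main(void)
{
    const double hdmi14_cap = 340.0; /* MHz, max TMDS pixel clock        */
    const double clk_4k30   = 297.0; /* 3840x2160 @ 30 Hz incl. blanking */
    const double clk_4k60   = 594.0; /* 3840x2160 @ 60 Hz incl. blanking */

    printf("4K30: %.0f MHz -> %s\n", clk_4k30,
           clk_4k30 <= hdmi14_cap ? "fits HDMI 1.4" : "too fast");
    printf("4K60: %.0f MHz -> %s\n", clk_4k60,
           clk_4k60 <= hdmi14_cap ? "fits HDMI 1.4"
                                  : "needs HDMI 2.0 or DisplayPort");
    return 0;
}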

Paul MaudDib fucked around with this message at 23:39 on Feb 26, 2014

Straker
Nov 10, 2005
I feel like $800+, plus another $800+ worth of GPU, just to run one 4K monitor acceptably is kind of a step backwards for a lot of games. If I were going to do something crazy like that (well, crazy as it stands right now), I would just as soon consider 3x1440p Eyefinity, or like 5 debezeled 1080p monitors in portrait, or something fun like that...

taking a break from posting now :)

Ignoarints
Nov 26, 2010
Man, I used to set up seamless display walls a lot for work (46" 1080p panels at 5 grand a pop) and I can't believe I never thought to hook one up to a sweet computer. The average setup was probably 4 high by 5 across. Would have been pretty cool.

Right before I left, I set up a hoist for a 20 ft x 60 ft projection screen, and I was really dying to play a game on that. It needed 3x $100,000 projectors to look nice, but it looked amazing.

Nephilm
Jun 11, 2009

by Lowtax

Straker posted:

For like the fifth time: nobody important was mining bitcoins with GPUs anyway, so MtGox doesn't really have anything to do with anything.

MtGox goes down; the bitcoin price fluctuates downward as the failure causes uncertainty; people who mine doge/litecoins and then trade them for bitcoins (because the price of all altcoins is tied to bitcoin) get caught in the panic.

It's not hard to understand.

Straker
Nov 10, 2005
I understand perfectly well; the point is merely that Gox dying yesterday didn't kill off demand for AMD cards. Prices on other exchanges went down, but not as drastically; a lot of these people are desperate to make any amount of money without working, and even if they were smart enough to walk away, it's only been a day or two, so there's that :)

If anything, the hysteria kinda started fading away as early as December, when litecoins lost about half their value...

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Nephilm posted:

Wouldn't an R9 270X 2GB crossfire be memory constrained for 1440p?

Yeah, it seems insane to me to invest in a setup for 1440p gaming with only 2GB of VRAM. While current games largely do fine, I don't see how you can expect that to remain true at 1440p through this summer, or at 1080p for much longer than that. Maybe I'm Chicken Little, but it feels like people set their expectations for VRAM demand growth during a time when it was historically low due to constraints from old consoles; now that that's not an issue, we're going back to "normal", and you can't get away with underbuying VRAM the way you used to. I focus so much on avoiding VRAM paging because I can't think of anything that ruins your gaming experience like running low on VRAM does, especially in multiplayer. Having stuck with a 512MB video card longer than I really should have, it's drat important to me to avoid living with insufficient VRAM.
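For scale, the render targets themselves are cheap; it's texture sets authored for the new consoles that blow past 2GB. A back-of-envelope sketch (RGBA8 color, 32-bit depth/stencil; real engines pile G-buffers, shadow maps, and streaming pools on top):

code:
#include <stdio.h>

static double mb(long bytes) { return bytes / (1024.0 * 1024.0); }

int main(void)
{
    long w = 2560, h = 1440;
    long color = w * h * 4; /* RGBA8 back buffer      */
    long depth = w * h * 4; /* 24-bit depth + stencil */
    printf("1440p color+depth: %6.1f MB\n", mb(color + depth));
    printf("  with 4x MSAA:    %6.1f MB\n", mb((color + depth) * 4));
    return 0;
}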

SlayVus
Jul 10, 2009
Grimey Drawer

Straker posted:

I understand perfectly well; the point is merely that Gox dying yesterday didn't kill off demand for AMD cards. Prices on other exchanges went down, but not as drastically; a lot of these people are desperate to make any amount of money without working, and even if they were smart enough to walk away, it's only been a day or two, so there's that :)

If anything, the hysteria kinda started fading away as early as December, when litecoins lost about half their value...

How could you actually expect an online exchange called the Magic: The Gathering Online eXchange to be reliable? /sarcasm

On the subject of VRAM on GPUs: Nvidia has got to step up the amounts on their enthusiast lineup. 2 or 3GB on a high-end enthusiast card is not enough for maxing new games in Surround when you're talking 6.2 million to 12.2 million pixels.

A single 4K display has from 7 million to 16.38 million pixels. With three 4K WHXGA (5120x3200) displays in surround, you're talking FORTY NINE MILLION pixels.
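Those figures check out, for what it's worth; the Surround range assumes triple 1080p at the low end and triple 1600p at the high end:

code:
#include <stdio.h>

int main(void)
{
    printf("3x 1080p Surround: %ld\n", 3L * 1920 * 1080); /*  6,220,800 */
    printf("3x 1600p Surround: %ld\n", 3L * 2560 * 1600); /* 12,288,000 */
    printf("UHD 4K:            %ld\n", 3840L * 2160);     /*  8,294,400 */
    printf("WHXGA:             %ld\n", 5120L * 3200);     /* 16,384,000 */
    printf("3x WHXGA:          %ld\n", 3L * 5120 * 3200); /* 49,152,000 */
    return 0;
}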

SlayVus fucked around with this message at 06:20 on Feb 27, 2014

forbidden dialectics
Jul 26, 2005





SlayVus posted:

How could you actually expect an online exchange called the Magic: The Gathering Online eXchange to be reliable? /sarcasm

On the subject of VRAM on GPUs: Nvidia has got to step up the amounts on their enthusiast lineup. 2 or 3GB on a high-end enthusiast card is not enough for maxing new games in Surround when you're talking 6.2 million to 12.2 million pixels.

A single 4K display has from 7 million to 16.38 million pixels. With three 4K WHXGA (5120x3200) displays in surround, you're talking FORTY NINE MILLION pixels.

Well, how are they going to sell you the $1000 upgrade with 6GB three months later, then?


Scarecow
May 20, 2008

3200mhz RAM is literally the Devil. Literally.
Lipstick Apathy
So I got myself a 780 Ti Classified and this thing is insane: 1150MHz boost clock without touching a thing, and 1250MHz when I give it +25mV. My only question now is how do I get more out of this? While the EVGA control panel thinks it's giving the card 1.2V, GPU-Z is showing less than that, even with the slider set to max.

Is there a BIOS available that lets more volts into this?

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE

Zero Gravitas posted:

Welp, I found out that the reason my PC keeps shutting off while gaming or being left to run CFD simulations overnight is that the cooling fan motor on my graphics card is hosed, and at an idle temperature of about 60 degrees it's probably not going to last too long.

I'm now looking at replacements, but I'm pretty hard up for money, with maybe a budget of £55 for a new card. My preferred site in the UK has a bunch of cards around and below this price point, and I'm currently split between various cards with an R7 240 chip and a bunch of cards with a GT 620 or 630.

I haven't really kept up with the latest in GPU developments - which is likely to give me more bang for the £? :britain:

There's not going to be any banging at that price point. Do you have an Intel CPU with integrated graphics to fall back on? If you do, you should subsist on that until you can afford a real graphics card - something like a GTX 750 (around £90-100) for bare-minimum 1080p medium-to-low settings, or a 750 Ti (£100-130) for acceptable medium-to-high settings performance.

Gwaihir
Dec 8, 2009
Hair Elf

Paul MaudDib posted:

That ASUS 28.5" 60hz 4K monitor should be hitting the streets within a couple months for $800. It's a lot of money, yeah, but people spend that much on a nice TV or a SLI graphics setup no problem.

And for non-gaming use, the prices are already there. You can pick up a 39" Seiki 4K TV (aka a large monitor) for $500 any day of the week; they dipped down to $400 a couple of weeks ago. The downside, of course, is that it's HDMI 1.4 and therefore limited to 30Hz.

I'm thinking about one of those 39" screens if they go on sale again at some point. That's a lot of working area, and 113ppi isn't awful. It seems like a no-brainer for computer folk who spend all day staring at a screen.

30Hz screens, even for web and text, are like rubbing sandpaper on your eyes. They're loving awful.

(Cue Coredump to come along and say it's totally fine now.)

Incessant Excess
Aug 15, 2005

Cause of glitch:
Pretentiousness
Crosspost from the short questions thread:

quote:

How come when I play Assassin's Creed 4 I get a lot of tearing at 45fps, but when I play Need for Speed Rivals I get no tearing at all at the same fps? Why does it seem like some games suffer more from tearing than others?

veedubfreak
Apr 2, 2005

by Smythe
So I installed the beta 14.2 drivers last night and my computer did not explode. In fact MW:O actually seemed to run better.

Rastor
Jun 2, 2001

Alereon posted:

Yeah, it seems insane to me to invest in a setup for 1440p gaming with only 2GB of VRAM. While current games largely do fine, I don't see how you can expect that to remain true at 1440p through this summer, or at 1080p for much longer than that. Maybe I'm Chicken Little, but it feels like people set their expectations for VRAM demand growth during a time when it was historically low due to constraints from old consoles; now that that's not an issue, we're going back to "normal", and you can't get away with underbuying VRAM the way you used to. I focus so much on avoiding VRAM paging because I can't think of anything that ruins your gaming experience like running low on VRAM does, especially in multiplayer. Having stuck with a 512MB video card longer than I really should have, it's drat important to me to avoid living with insufficient VRAM.

SlayVus posted:

On the subject of VRAM on GPUs: Nvidia has got to step up the amounts on their enthusiast lineup. 2 or 3GB on a high-end enthusiast card is not enough for maxing new games in Surround when you're talking 6.2 million to 12.2 million pixels.

A single 4K display has from 7 million to 16.38 million pixels. With three 4K WHXGA (5120x3200) displays in surround, you're talking FORTY NINE MILLION pixels.


Did somebody say VRAM?

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map

veedubfreak posted:

So I installed the beta 14.2 drivers last night and my computer did not explode. In fact MW:O actually seemed to run better.

WHAT impossible

On my single 1080p screen I'm still mad that it's CPU-limited and that DX11 support hasn't been finished yet, but who am I kidding expecting anything from those devs anymore.

Also, not having used UI 2.0 before, it took me 10-15 e: minutes to sell my free Centurion. I also shamefully do not have a Stalker.

Sidesaddle Cavalry fucked around with this message at 16:42 on Feb 27, 2014

Ignoarints
Nov 26, 2010
I have an ASUS 660 Ti that's BIOS modded, repasted, and overclocked, but it runs super cool and puts up great numbers. I'm buying two used MSI 660 Tis that were never overclocked and are about just as old.

Should I SLI the two MSIs, or should I revert the ASUS back to its stock BIOS and add one MSI? Would there be any difference in ease of installation or compatibility at all? I'd just go ahead and use both MSIs, but I feel as if the ASUS has been devalued for resale in general despite being really awesome. I'm going to be selling one of them, I just have to figure out which one.

Wistful of Dollars
Aug 25, 2009

Scarecow posted:

So I got myself a 780 Ti Classified and this thing is insane: 1150MHz boost clock without touching a thing, and 1250MHz when I give it +25mV. My only question now is how do I get more out of this? While the EVGA control panel thinks it's giving the card 1.2V, GPU-Z is showing less than that, even with the slider set to max.

Is there a BIOS available that lets more volts into this?

Go to Overclock.net's Nvidia forum; there's a Classified thread there with BIOSes to try. Also, there's a program called the Classified Voltage Tool that lets you shove up to 1.5V in there. From my cursory reading it seems a lot of people don't see much gain beyond 1.3V, but your mileage may vary.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy
Having fun with my 750 Ti: I modded the BIOS without blowing it up... but adding 10 watts to the max TDP seems to have done absolutely nothing to help overclocking. It probably has to do with the voltage cap.

Anyhoo, this is pretty funny - a 750 Ti passive-cooling mod: http://www.tomshardware.com/reviews/geforce-gtx-750-ti-passive-cooling,3757.html

INTJ Mastermind
Dec 30, 2004

It's a radial!
Everyone seems to be focused on over-volting for higher clock speeds. But assuming I'm limited by TDP, would undervolting theoretically allow me to push further?

Ignoarints
Nov 26, 2010

INTJ Mastermind posted:

Everyone seems to be focused on over-volting for higher clock speeds. But assuming I'm limited by TDP, would undervolting theoretically allow me to push further?

How do you get higher clock speeds by undervolting if you're already using all your TDP? I'm not trying to say you're wrong, because I want you to be right lol

forbidden dialectics
Jul 26, 2005





Ignoarints posted:

How do you get higher clock speeds by undervolting if you're already using all your TDP? I'm not trying to say you're wrong, because I want you to be right lol

Undervolting will use less power, which in turn gives you more headroom before you hit the TDP cap. But it would be really difficult to find your max clocks this way, since undervolting can reduce stability and you're changing at least two variables at the same time.
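The arithmetic behind that: dynamic power scales roughly with f*V^2, so shaving voltage at a fixed clock frees budget under the TDP cap. A rule-of-thumb sketch (the 1.20V and 1.10V figures are made up for illustration, not card specs):

code:
#include <stdio.h>

int main(void)
{
    double v0 = 1.20, v1 = 1.10;        /* stock vs. undervolted volts   */
    double rel = (v1 * v1) / (v0 * v0); /* power ratio at the same clock */
    printf("Power at %.2f V: ~%.0f%% of stock\n", v1, rel * 100.0);
    printf("Equal-power clock headroom: ~%.0f%%\n",
           (1.0 / rel - 1.0) * 100.0);
    /* ~16% less power, ~19% more clock on paper - if the chip is still
       stable at the lower voltage, which is the hard part. */
    return 0;
}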


Scarecow
May 20, 2008

3200mhz RAM is literally the Devil. Literally.
Lipstick Apathy

INTJ Mastermind posted:

Everyone seems to be focused on over-volting for higher clock speeds. But assuming I'm limited by TDP, would undervolting theoretically allow me to push further?

Undervolting will not allow you to get higher clock speeds unless the stock voltage is higher than it really needs to be.
