EoRaptor
Sep 13, 2003

by Fluffdaddy
So it looks like DirectX 12 will be a thing.

I'm not sure if Microsoft is really going to be able to get this off the ground. They seem to want to have it for mobile and desktop platforms, which risks making a fat API that is difficult to learn and starved for features.

Also, I'd put odds on them making it Windows 8+ only, even though this would effectively make it stillborn. They probably aren't going to learn from the last times they've tried this.

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

EoRaptor posted:

So it looks like DirectX 12 will be a thing.

I'm not sure if Microsoft is really going to be able to get this off the ground. They seem to want to have it for mobile and desktop platforms, which risks making a fat API that is difficult to learn and starved for features.

Also, I'd put odds on them making it Windows 8+ only, even though this would effectively make it stillborn. They probably aren't going to learn from the last times they've tried this.

Since 11.2 is 8.1-only, there is no reason to think 12 won't be 8.1- or 9-only.

deimos
Nov 30, 2006

Forget it man this bat is whack, it's got poobrain!

Don Lapre posted:

Since 11.2 is 8.1-only, there is no reason to think 12 won't be 8.1- or 9-only.

Especially since they are pushing for 9 to replace XP. I don't mind the shorter dev cycles if it means more cutting-edge features, as long as they don't gouge for updates; hopefully MS makes 9 a sub-$60 upgrade.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Man, I'm going to have to split my work into two computers for this poo poo. Driver stability and compatibility (hardware and software) in the audio industry is just now at a solid level for Windows 7 64-bit, but I like my shiny poo poo in vidyagames, so no more all-in-one workstation for me. That blows. I guess I'm fortunate in that it just means I'll need to get another case and a couple of HDDs; I could be much worse off for an "I want my computer to do work and play!" situation. I'll save my bitching, but I won't stop hoping that we get a more robust OpenGL - it's been the playground for forward-looking features for about a year and a half now anyway, maybe it'll save the day :sigh:

Agreed fucked around with this message at 20:53 on Mar 6, 2014

Ignoarints
Nov 26, 2010

Agreed posted:

Man, I'm going to have to split my work into two computers for this poo poo. Driver stability and compatibility (hardware and software) in the audio industry is just now at a solid level for Windows 7 64-bit, but I like my shiny poo poo in vidyagames, so no more all-in-one workstation for me. That blows. I guess I'm fortunate in that it just means I'll need to get another case and a couple of HDDs; I could be much worse off for an "I want my computer to do work and play!" situation. I'll save my bitching, but I won't stop hoping that we get a more robust OpenGL - it's been the playground for forward-looking features for about a year and a half now anyway, maybe it'll save the day :sigh:

dual boot?

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

norg posted:

Ok I'm actually gonna do this I think. I have a vented side panel so the stock fan should work. Is that fan ok or is it better to replace it anyway? And is the VRAM cooled adequately with this setup?

VRAM really doesn't run that hot. I checked my card with an IR thermometer and I didn't have any temperature problems even before I nailed down proper airflow.

I can't speak to the G10 fan, really. NZXT makes competent if slightly noisy fans in general. I have a discontinued high-airflow Enermax fan that I run at 50%. Moves air noticeably and keeps things cool.

Agreed posted:

Man, I'm going to have to split my work into two computers for this poo poo. Driver stability and compatibility (hardware and software) in the audio industry is just now at a solid level for Windows 7 64-bit, but I like my shiny poo poo in vidyagames, so no more all-in-one workstation for me. That blows. I guess I'm fortunate in that it just means I'll need to get another case and a couple of HDDs; I could be much worse off for an "I want my computer to do work and play!" situation. I'll save my bitching, but I won't stop hoping that we get a more robust OpenGL - it's been the playground for forward-looking features for about a year and a half now anyway, maybe it'll save the day :sigh:

Which Z87 board do you have again? Sabertooth or Gryphon? If it's the Gryphon you could play with an Obsidian 350D and crack decade-old mini-me jokes.

EoRaptor
Sep 13, 2003

by Fluffdaddy

deimos posted:

Especially since they are pushing for 9 to replace XP. I don't mind the shorter dev cycles if it means more cutting-edge features, as long as they don't gouge for updates; hopefully MS makes 9 a sub-$60 upgrade.

Yeah, they really need to get off the price horse, especially for upgrades. Most new Windows sales are via OEM, so lowering the upgrade price and turning over older versions is probably a net win for them.

Yes, it's more complex than a 60-dollar game, but you aren't going to be able to ignore the change in people's perceptions of what software should cost in the face of Apple. Sell subscription services a la OneDrive backup and take the hit on the core OS.

Jan
Feb 27, 2008

The disruptive powers of excessive national fecundity may have played a greater part in bursting the bonds of convention than either the power of ideas or the errors of autocracy.

EoRaptor posted:

They probably aren't going to learn from the last times they've tried this.

But... what exactly is there to "learn"? It's not like there is a team of executives in Redmond rolling around in a pool of money while cackling maniacally about taking away features from users.

Direct3D 10 and 11 rely heavily on the WDDM, introduced in Vista. There is no way in hell they could have gotten D3D 10-11 working without it, especially not since they had committed to phasing out XP eventually. I don't miss the days of having to handle device loss manually in every single possible situation. The good old days, when alt-tabbing out of a game was a gamble to see whether it would crash or not!
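
For reference, the manual handling being described looks roughly like this in Direct3D 9 terms. This is a minimal sketch rather than any particular engine's code; g_device and g_presentParams just stand in for whatever you created at init time.

code:

// Minimal sketch of pre-WDDM device-loss handling in Direct3D 9.
// Assumes g_device and g_presentParams were set up during initialization.
#include <d3d9.h>

void RenderFrame(IDirect3DDevice9* g_device, D3DPRESENT_PARAMETERS& g_presentParams)
{
    HRESULT hr = g_device->TestCooperativeLevel();
    if (hr == D3DERR_DEVICELOST) {
        // Device gone (alt-tab out of fullscreen, driver reset, ...); skip the frame.
        return;
    }
    if (hr == D3DERR_DEVICENOTRESET) {
        // Every D3DPOOL_DEFAULT resource must already be released here and
        // recreated afterwards -- that is the manual bookkeeping in question.
        g_device->Reset(&g_presentParams);
        return;
    }

    g_device->BeginScene();
    // ... draw calls ...
    g_device->EndScene();
    g_device->Present(nullptr, nullptr, nullptr, nullptr);
}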

This part of the argument might be more tenuous now that the switch to WDDM has already been made, but consider that driver writers still have to deal with the Direct3D Device Driver Interface, which is currently pretty much crippling any chance at having efficient draw calls and multithreading. Most likely, reducing the overhead will require a significant platform revision that would be simply infeasible to retrofit to earlier Windows versions. In essence, they have to deal with conflicting engineering objectives: reducing overhead while maintaining the robustness allowed by WDDM.

And it's not like this is frilly stuff that developers and users won't care about. Mantle is the proof that there is a demand for more efficient drivers and APIs. The current console generation's focus on multiple cores forces developers to multithread everything, including rendering. This can be done efficiently with the PS4's command lists and, to an extent, the Xbox One's deferred contexts. But the same deferred contexts are a net loss by default on PC because every single draw call carries so much drat overhead. The only way around this currently is to have AMD or nVidia hand code optimizations for every game that decides to multithread its rendering.

You chose to latch onto the multi-platform part of the article as justification that the API will somehow be bloated and starved for features. The operative words here are not "PC, tablet, phone and console", but "closer to the metal". If they manage to reduce the DDI overhead and allow multithreading to become a reasonable approach on PC, then the workload involved for console developers to port to PC will be considerably reduced, which is good for everyone involved.

Endymion FRS MK1
Oct 29, 2011

I don't know what this thing is, and I don't care. I'm just tired of seeing your stupid newbie av from 2011.
Dumb question about nVidia drivers. Which drivers should I be going with, beta or the normal ones? I'm used to AMD where the rule was always beta because WHQL was a hilariously old build compared to the advancement made in the betas. Is it the same with nVidia?

Ignoarints
Nov 26, 2010

Endymion FRS MK1 posted:

Dumb question about nVidia drivers. Which drivers should I be going with, beta or the normal ones? I'm used to AMD where the rule was always beta because WHQL was a hilariously old build compared to the advancement made in the betas. Is it the same with nVidia?

I've never used beta drivers for nVidia, but only because they've been very clear about the differences between the beta and stable drivers. Usually it's just added support for certain games, and more often than not there aren't even beta drivers to be had (for me). It's not like AMD. Not to say I wouldn't use them, but I haven't had any reason to yet.

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

Endymion FRS MK1 posted:

Dumb question about nVidia drivers. Which drivers should I be going with, beta or the normal ones? I'm used to AMD where the rule was always beta because WHQL was a hilariously old build compared to the advancement made in the betas. Is it the same with nVidia?

Just go with whatever is newest.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

nVidia has, for the most part, done a really stellar job of making sure that their beta drivers are very stable by the time the end user gets them, and you're usually just getting better performance in new games sooner than WHQL driver users. It used to be a lot less stable, and there was the period starting with the 320 branch that just loving went bonkers too, but generally speaking there's very little harm and quite a bit of good to come from using the beta drivers since they're basically "these are stable but we are still working on more poo poo before we go through the hassle of a WHQL release."

God knows what insane bugs they have to stab to death before they release the betas, but it's a nice way to get stuff early, like the recent ~15%, +/-3% performance improvement on most higher end 600/700 series cards in a few particularly demanding games (and some they were just showing off in, it's a mixed bag).

Did you guys know that was a thing? They did a big driver optimization with the last WHQL, and the beta branch it came off of had a bunch of really big performance improvements for some reason. I'd kind of thought we'd be past the point where drivers could do jack poo poo for a GTX 770, especially since it's basically a 680 with faster VRAM, and based on prior experience I also thought we had already seen the perf improvements for the GK110 chips from drivers. Nope, they squeezed some nice extra performance out of them again. Could be the third time for GK104; I believe it's the second all-around improvement for the GK110 series. I'm not counting releases that have performance gains in one or two specific games because the previous driver was essentially broken for that game, just "here's a bunch of games that will see performance improvements, have fun with that" general optimizations.

I can confirm that they work well in play, too, at least as far as the 780Ti goes. I got more performance for free and didn't change my overclock at all. nVidia driver team did well this go around.


AMD owners, how are they doing lately? Do they have a single driver out yet or are they still running multiple branches? That's always lovely for the end user, and confusing - I remember back with nVidia's 320s when their official position was "go back to 314 if you're not using a GTX 700 series card." Laaaaaame. AMD back to one driver to rule them all yet, or still a driver hydra?

The_Franz
Aug 8, 2003

deimos posted:

Especially since they are pushing for 9 to replace XP. I don't mind the shorter dev cycles if it means more cutting-edge features, as long as they don't gouge for updates; hopefully MS makes 9 a sub-$60 upgrade.

It still won't help with the China problem, where a whole lot of people are using modern hardware but running XP.

Jan posted:

And it's not like this is frilly stuff that developers and users won't care about. Mantle is the proof that there is a demand for more efficient drivers and APIs. The current console generation's focus on multiple cores forces developers to multithread everything, including rendering. This can be done efficiently with the PS4's command lists and, to an extent, the Xbox One's deferred contexts. But the same deferred contexts are a net loss by default on PC because every single draw call carries so much drat overhead. The only way around this currently is to have AMD or nVidia hand code optimizations for every game that decides to multithread its rendering.

Nvidia's been championing multi-draw-indirect + uniform buffer objects in OpenGL for a while now. Rather than multithreading things in the driver to handle applications making lots of individual draw calls, you just build a big list of geometry in the application and then send it off to the video card with one call.
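
The shape of it, roughly: a sketch against GL 4.3's glMultiDrawElementsIndirect, with VAO and buffer setup omitted and the function/buffer names made up for illustration.

code:

// Fill an array of indirect commands, upload it, and replace N separate
// glDrawElements* calls with a single submission.
// Assumes a GL 4.3+ context and a VAO with vertex/index buffers already bound.
#include <GL/glew.h>
#include <vector>

struct DrawElementsIndirectCommand {
    GLuint count;          // indices for this draw
    GLuint instanceCount;
    GLuint firstIndex;
    GLint  baseVertex;
    GLuint baseInstance;   // handy for indexing per-draw data in a UBO/SSBO
};

void SubmitBatch(GLuint indirectBuffer,
                 const std::vector<DrawElementsIndirectCommand>& cmds)
{
    glBindBuffer(GL_DRAW_INDIRECT_BUFFER, indirectBuffer);
    glBufferData(GL_DRAW_INDIRECT_BUFFER,
                 cmds.size() * sizeof(DrawElementsIndirectCommand),
                 cmds.data(), GL_STREAM_DRAW);

    // One call instead of cmds.size() individual draw calls.
    glMultiDrawElementsIndirect(GL_TRIANGLES, GL_UNSIGNED_INT,
                                nullptr, (GLsizei)cmds.size(), 0);
}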

beejay
Apr 7, 2002

Agreed posted:

AMD owners, how are they doing lately? Do they have a single driver out yet or are they still running multiple branches? That's always lovely for the end user, and confusing - I remember back with nVidia's 320s when their official position was "go back to 314 if you're not using a GTX 700 series card." Laaaaaame. AMD back to one driver to rule them all yet, or still a driver hydra?

In terms of desktop cards (R9 200, R7 200, HD 7000, HD 6000 and HD 5000 Series) - one beta, one WHQL. Then there's a mobility driver, and of course the workstation cards.

deimos
Nov 30, 2006

Forget it man this bat is whack, it's got poobrain!

beejay posted:

In terms of desktop cards (R9 200, R7 200, HD 7000, HD 6000 and HD 5000 Series) - one beta, one WHQL. Then there's a mobility driver, and of course the workstation cards.

Yeah, the only big thing was that the microstutter fixes didn't make it into a WHQL that came out after a beta that had them, but I think that has more to do with the certification process.

Jan
Feb 27, 2008

The disruptive powers of excessive national fecundity may have played a greater part in bursting the bonds of convention than either the power of ideas or the errors of autocracy.

The_Franz posted:

Nvidia's been championing multi-draw-indirect + uniform buffer objects in OpenGL for a while now. Rather than multithreading things in the driver to handle applications making lots of individual draw calls, you just build a big list of geometry in the application and then send it off to the video card with one call.

I suppose that could be handy to squeeze out some quick and easy performance on a PC port, yeah. I'll admit that, like many game developers, I have a bit of a blind spot when it comes to OpenGL -- the lack of standardisation and the proliferation of extensions make it a really impractical API to work with. Whereas the Xbox One uses the same Direct3D API as Windows, and since the PS4 API tries to model itself very closely on the Direct3D API, using that as common ground for cross-platform rendering is standard practice.

But I was not referring to multithreading at the driver level, but rather at the application level. If I try to use the aforementioned Direct3D-like API to multithread rendering, I'll find myself building command lists in parallel on different cores, then playing them back in order on the render thread to render the actual frame. The caveat is that on Windows, there is so much overhead between Direct3D and the driver itself that every single draw call ends up with a large cost, both when it is first issued to the deferred context and when it is actually played back on the immediate context.
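
In code, the pattern looks something like the sketch below. It's a bare-bones illustration rather than anyone's actual renderer; IssueDrawCalls is a made-up stand-in for the real per-thread submission, and device/context creation is assumed to happen elsewhere.

code:

// Record D3D11 command lists on worker threads via deferred contexts, then
// play them back in order on the immediate context. On Windows, every
// recorded call still pays the runtime/DDI cost twice, as described above.
#include <d3d11.h>
#include <wrl/client.h>
#include <thread>
#include <vector>
using Microsoft::WRL::ComPtr;

void RenderFrameMT(ID3D11Device* device,
                   ID3D11DeviceContext* immediateContext,
                   int workerCount)
{
    std::vector<ComPtr<ID3D11CommandList>> commandLists(workerCount);
    std::vector<std::thread> workers;

    for (int i = 0; i < workerCount; ++i) {
        workers.emplace_back([&, i] {
            ComPtr<ID3D11DeviceContext> deferred;
            device->CreateDeferredContext(0, &deferred);
            // IssueDrawCalls(deferred.Get(), i);  // hypothetical per-thread draw submission
            deferred->FinishCommandList(FALSE, &commandLists[i]);
        });
    }
    for (auto& t : workers) t.join();

    // Playback must happen in order on the render thread.
    for (auto& cl : commandLists)
        immediateContext->ExecuteCommandList(cl.Get(), FALSE);
}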

This puts a huge damper on any PC porting efforts, and I'm sorry to say that we probably won't see much in the way of quality PC tech as long as this lasts. Apart from exceptions like Frostbite or CryEngine, few developers bother targeting their engine at the PC because it is a tiny fraction of the market.

On the bright side, with the amount of effort Valve has been putting into making Steam into a fully fledged console platform, we might see a resurgence of OpenGL and other non-"mainstream" tech. But I wouldn't bet on that, as SteamOS consoles might remain a marginal thing.

On the other hand, if Microsoft manages the impossible with Direct3D 12 by providing an API that avoids all the pitfalls mentioned, I'd be much more optimistic about seeing more good console to PC ports. And my initial point was, if the cost of this means forcing a part of the already fractional PC gaming market to upgrade to newer versions of Windows, then I suppose it is a small price to pay, both as a developer and consumer.

vvvvvvvvvvvvvv

spasticColon posted:

My question is how long before we see DirectX 12 hardware from AMD or Nvidia?

There might not even be DirectX 12 hardware; it could very well be a software-only change. Though they'd have a hard time justifying the version increase then, I suppose.

Remember that the most important change in DirectX 10 and 11 was doing away with the fixed-function pipeline. With that done, hardware-specific architectural changes suddenly became unnecessary. Even DirectX 11 only introduced the tessellation pipeline, which hardly sees any serious use these days because the functionality it provides is entirely optional, and it also introduces a considerable performance cost just from being active, even if your domain and hull shaders are plain pass-throughs.

Jan fucked around with this message at 04:34 on Mar 7, 2014

spasticColon
Sep 22, 2004

In loving memory of Donald Pleasance
My question is how long before we see DirectX 12 hardware from AMD or Nvidia?

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

deimos posted:

Especially since they are pushing for 9 to replace XP. I don't mind the shorter dev cycles if it means more cutting-edge features, as long as they don't gouge for updates; hopefully MS makes 9 a sub-$60 upgrade.

With the rumoured "Bing Edition", $15 OEM licenses for sub-$250 PCs, Nadella's focus on services and the realities of the market, I think $100-$200 Windows licenses will be gone in short order, certainly for consumers. There's no way a $100 OS license will be tenable as a product that any significant number of consumers would purchase in the future, with so many computing options at their disposal now.

SlayVus
Jul 10, 2009
Grimey Drawer

Happy_Misanthrope posted:

With the rumoured "Bing Edition", $15 OEM licenses for sub-$250 PCs, Nadella's focus on services and the realities of the market, I think $100-$200 Windows licenses will be gone in short order, certainly for consumers. There's no way a $100 OS license will be tenable as a product that any significant number of consumers would purchase in the future, with so many computing options at their disposal now.

If Microsoft is going to do such a short development cycle on the OS front, I feel they need to drop their price to like $50-$60. 7 to 8 felt too fast to justify the $100 price tag when you look at XP to Vista/7.

Ignoarints
Nov 26, 2010
I haven't even bought Windows 8 and probably never will at this rate :x I've been using the 8.1 Enterprise evaluation and resetting my timer. I'll have to reinstall once, but... call me weird, but I like to reinstall Windows at least twice a year anyway. It's so easy and relatively fast now that it's hardly even a hassle.

And to Agreed above: every time I look at the check-drivers tab, I notice that kind of random list of performance improvements in the last driver. Pretty cool that they're still able to do that. Unfortunately it's not for anything I play, but I do remember that was the last beta.

(USER WAS PUT ON PROBATION FOR THIS POST)

PC LOAD LETTER
May 23, 2005
WTF?!

Agreed posted:

AMD owners, how are they doing lately? Do they have a single driver out yet or are they still running multiple branches?
The rule of thumb with AMD is: if you're in an office/Real Serious Computing environment = latest (IOW old as hell) WHQL.
If you're a gamer = latest beta.

Some 290 owners had trouble with the 14.1's but generally they've been working fine otherwise.

deimos
Nov 30, 2006

Forget it man this bat is whack, it's got poobrain!

SlayVus posted:

If Microsoft is going to do such a short development cycle on the OS front, I feel they need to drop their price to like $50-$60. 7 to 8 felt too fast to justify the $100 price tag when you look at XP to Vista/7.

Whoa, keep in mind the Win8 upgrade was $40 for a long rear end time.

ShaneB
Oct 22, 2002


Factory Factory posted:

VRAM really doesn't run that hot. I checked my card with an IR thermometer and I didn't have any temperature problems even before I nailed down proper airflow.

I can't speak to the G10 fan, really. NZXT makes competent if slightly noisy fans in general. I have a discontinued high-airflow Enermax fan that I run at 50%. Moves air noticeably and keeps things cool.

I don't hear the G10's fan. I had to mount a second silent Enermax fan on mine, as the 270x has the power delivery management stuff on the left side of the card, but those brackets have it set up for the right side.



It gets the job done and doesn't look TOO hacky. I don't have a side mesh area on my windowed case (but I wish I did). Temps are in the mid-60s C at the core and VRMs now, even at full load.

Ignoarints
Nov 26, 2010
I don't know if this is common knowledge or not, but I was scrolling through my motherboard manual and there is an "ATX4P" power connector for the PCIe slots that I never plugged in, or knew about, or had ever even heard of. It eliminated all my occasional frame drops :yayclod:

Also, smooth vsync never drops to 30 now, and since that's the only mode with bearable input lag, I can finally use vsync!

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Ignoarints posted:

I don't know if this is common knowledge or not, but I was scrolling through my motherboard manual and there is an "ATX4P" power connector for the PCIe slots that I never plugged in, or knew about, or had ever even heard of. It eliminated all my occasional frame drops :yayclod:

Also, smooth vsync never drops to 30 now, and since that's the only mode with bearable input lag, I can finally use vsync!
What motherboard and cards? That's probably the dedicated +12V rail for your CPU. And yes, generally plugging in all the power cables is required for things to work correctly ;)

Ignoarints
Nov 26, 2010

Alereon posted:

What motherboard and cards? That's probably the dedicated +12V rail for your CPU. And yes, generally plugging in all the power cables is required for things to work correctly ;)

Gigabyte Z87X-UD4H and two 660 Tis. I've never plugged a SATA power cable into my motherboard before; it's a brand new thing to me. A quick Google search told me people run without it even with better cards, but... I don't know why. It definitely made an improvement for me. It's a dedicated power connector for the PCIe slots, is what it says.

With that combined with frame limiting, my cards never get over 50 degrees now at max everything I can set. Couldn't be happier so far. Definitely more of a hassle to set up than one card, but it's all good now. I have a feeling my SLI bridge cable is a little loose; I had to wiggle it to get a picture that wasn't flashing blue at first. I wonder if I should get a new cable. Should it be tight?

Ignoarints fucked around with this message at 18:17 on Mar 7, 2014

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Ignoarints posted:

Gigabyte Z87X-UD4H and two 660 Tis. I've never plugged a SATA power cable into my motherboard before; it's a brand new thing to me. A quick Google search told me people run without it even with better cards, but... I don't know why. It definitely made an improvement for me. It's a dedicated power connector for the PCIe slots, is what it says.
That's a bizarre place for a power connector, no wonder you missed it. Two weird things: first, the SATA connector and cable don't carry enough power to supplement videocards in the PCI-Express slots; it provides 1.5A each at 3.3V, 5V, and 12V. That's 18W of 12V power. Second, your videocards use 150W each, but they also have two 6-pin connections to the power supply totaling 150W each, so there should be very minimal power draw from the motherboard. It might make more sense with GTX 660s, where the 140W power draw is only fed with 75W from a 6-pin from the power supply with the rest coming from the motherboard, but even then I don't see how another 1.5A is going to make up that ~11A deficit.

My guess is Gigabyte cheaped out on power delivery (as they tend to do) and this was needed to maintain stability. I really don't get it though, as that would seem to imply that if you add any additional PCI-Express card at all, like a soundcard, your poo poo will break again.

Ignoarints
Nov 26, 2010

Alereon posted:

That's a bizarre place for a power connector, no wonder you missed it. Two weird things: first, the SATA connector and cable don't carry enough power to supplement videocards in the PCI-Express slots; it provides 1.5A each at 3.3V, 5V, and 12V. That's 18W of 12V power. Second, your videocards use 150W each, but they also have two 6-pin connections to the power supply totaling 150W each, so there should be very minimal power draw from the motherboard. It might make more sense with GTX 660s, where the 140W power draw is only fed with 75W from a 6-pin from the power supply with the rest coming from the motherboard, but even then I don't see how another 1.5A is going to make up that ~11A deficit.

My guess is Gigabyte cheaped out on power delivery (as they tend to do) and this was needed to maintain stability. I really don't get it though, as that would seem to imply that if you add any additional PCI-Express card at all, like a soundcard, your poo poo will break again.

The only information on it is

"ATX4P (PCIe Power Connector)
The power connector provides auxiliary power to the onboard PCI Express x16 slots. When two or more
graphics cards are installed, we recommend that you connect the SATA power cable from the power supply
to the ATX4P connector to ensure system stability. For 4-way CrossFireX™/SLI, you must connect the ATX4P
connector."

It seems weird to me that 18 watts would determine whether four-way SLI works or not. Now that you mention it, it's pretty weird that it even makes a difference for me. It seems like 18 watts would be trivial if I used even a tiny bit more voltage for overclocking (which I am doing, slowly increasing). Maybe there is something else happening :confused:

Edit: Well, I just read that SATA has three 12-volt power pins at 1.5 amps each, so perhaps it is actually 54 watts of 12V. Still? Dunno. There are also six other power pins at the other voltages.

quote:

That connector, referred to in the manual as ATX4P, provides extra power to the graphics card (PCI-Express x16) slots. Not all motherboards have this connector (or equivalent), but if it is present and you are using graphics card(s) then it must be connected, even if the card has its own auxiliary power connector (which must also be connected). This is usually due to the design of the board, specifically the current carrying capabilities of the printed tracks, and not the limitations of the main ATX connector. By having this extra connector, it is possible to spread the power load over more tracks than would otherwise be possible without it.

quote:

But did you connect a SATA power connector from the power supply unit to the motherboard's ATX4P power connector?

The graphics cards also draw some of their power from the +3.3V and +12V power lines in the PCIe x16 slot. The ATX4P connector on the motherboard is supposed to supply the extra power to the PCIe bus as required when you are using more than two graphics cards.

If this can be believed, perhaps it uses the 3.3V pins as well, but more interestingly it might be providing power in a different place that otherwise wouldn't be available. As far as I can tell, though, there is no official detailed information of any kind anywhere.

Ignoarints fucked around with this message at 19:39 on Mar 7, 2014

Straker
Nov 10, 2005

Ignoarints posted:

Edit: Well, I just read that SATA has three 12-volt power pins at 1.5 amps each, so perhaps it is actually 54 watts of 12V. Still? Dunno. There are also six other power pins at the other voltages.
Pretty sure spinning drives have always used up to ~25W to spin up, so that 18W figure doesn't really make sense. Never seen a motherboard with a SATA power connector, but plenty used to have Molex connectors for supplemental PCIe power, so I guess it's not that weird.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Ignoarints posted:

Edit: Well, I just read that SATA has three 12-volt power pins at 1.5 amps each, so perhaps it is actually 54 watts of 12V. Still? Dunno. There are also six other power pins at the other voltages.
Good catch, there are three pins for a total of 4.5A, or 54W. That quote above confirms Gigabyte cheaped out on power delivery; I guess they're back to their old tricks.

Straker posted:

Pretty sure spinning drives have always used up to ~25W to spin up, so that 18W figure doesn't really make sense. Never seen a motherboard with a SATA power connector, but plenty used to have Molex connectors for supplemental PCIe power, so I guess it's not that weird.
It's weird because the Molex connector provides more than double the power of the SATA power connector (even using the correct 3-pin values), and that connector was really only present on ancient boards. Modern boards SHOULD be able to power the slots, or for extreme high-end setups they use dedicated PCI-Express Graphics power connectors on the board for additional +12V power.
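
For the record, the back-of-the-envelope math on the SATA connector's 12V budget, using the 1.5A-per-pin figure assumed above; both the earlier 18W number and the corrected 54W number fall out of it.

code:

// Rough 12V budget of a SATA power connector, using the 1.5 A per-pin
// rating quoted in the thread (one pin vs. the actual three +12V pins).
#include <cstdio>

int main()
{
    const double ampsPerPin = 1.5;   // assumed per-pin rating from the thread
    const double volts12    = 12.0;

    std::printf("One 12V pin:    %.0f W\n", 1 * ampsPerPin * volts12);  // the earlier 18 W figure
    std::printf("Three 12V pins: %.0f W\n", 3 * ampsPerPin * volts12);  // the corrected 54 W figure
    return 0;
}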

KakerMix
Apr 8, 2004

8.2 M.P.G.
:byetankie:
I've got a media center PC connected to my HDTV out in my living room right now. We are shopping for a new TV and have been looking at some 4K (or 'Ultra HD') options. Since HDMI 2.0 has been finalized, we now have HDMI that can support above-1080p resolutions at more than 30 Hz on 4K TVs, including the one we are looking at.
My problem is that it seems GPUs do not have HDMI 2.0 support, which means I can't run my (plenty capable) media center PC at the native resolution of a 4K TV and will instead be 'stuck' at 1080p, even if I were willing to upgrade the GPU for this. Do I have this right, and if I do, is there a way around it? Can I really not run a media center PC at the native resolution of a 4K ultra-def TV?

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

KakerMix posted:

I've got a media center PC connected to my HDTV out in my living room right now. We are shopping for a new TV and have been looking at some 4K (or 'Ultra HD') options. Since HDMI 2.0 has been finalized, we now have HDMI that can support above-1080p resolutions at more than 30 Hz on 4K TVs, including the one we are looking at.
My problem is that it seems GPUs do not have HDMI 2.0 support, which means I can't run my (plenty capable) media center PC at the native resolution of a 4K TV and will instead be 'stuck' at 1080p, even if I were willing to upgrade the GPU for this. Do I have this right, and if I do, is there a way around it? Can I really not run a media center PC at the native resolution of a 4K ultra-def TV?

You should be able to use DisplayPort to get what you're looking for, as long as the TV you get supports it as an input.

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



Alereon posted:

Good catch, there are three pins for a total of 4.5A or 54W. That quote above confirms Gigabyte cheaped out on power delivery, I guess they're back to their old tricks.
To be fair, ASRock uses Molex connectors on their mid-level multi-GPU boards too, while MSI uses a 6-pin PCIe connector for additional power on some of their boards. I believe in all cases it's for "stability" when using multi-GPU setups.

Gigabyte still has plenty of issues though (such as DPC latency problems).

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

SourKraut posted:

To be fair, ASRock uses Molex connectors on their mid-level multi-GPU boards too, while MSI uses a 6-pin PCIe connector for additional power on some of their boards. I believe in all cases it's for "stability" when using multi-GPU setups.

Gigabyte still has plenty of issues though (such as DPC latency problems).
It makes sense to use a Molex or PEG connector for high-power-draw systems; it's mostly problematic on the Gigabyte board because it needs supplemental power in situations where it shouldn't and other boards don't. If you have two cards powered by +12V from your power supply via PEG connectors, with negligible draw from the board, it shouldn't need a SATA power connector to work right.

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



Alereon posted:

It makes sense to use a Molex or PEG connector for high-power-draw systems; it's mostly problematic on the Gigabyte board because it needs supplemental power in situations where it shouldn't and other boards don't. If you have two cards powered by +12V from your power supply via PEG connectors, with negligible draw from the board, it shouldn't need a SATA power connector to work right.

Oh OK, that makes more sense (sorry for the ignorance!). I didn't realize that using SATA instead of Molex or PCIe was such a drawback. I guess my question then would be: is it really saving them that much money to go that route vs. Molex or PEG connectors?

Guni
Mar 11, 2010

SourKraut posted:

Oh OK, that makes more sense (sorry for the ignorance!). I didn't realize that using SATA instead of Molex or PCIe was such a drawback. I guess my question then would be: is it really saving them that much money to go that route vs. Molex or PEG connectors?

I think that's the whole point. Hence the lack of recommendations for Gigabyte in the parts picking thread.

BOOTY-ADE
Aug 30, 2006

BIG KOOL TELLIN' Y'ALL TO KEEP IT TIGHT

deimos posted:

Especially since they are pushing for 9 to replace XP. I don't mind the shorter dev cycles if it means more cutting-edge features, as long as they don't gouge for updates; hopefully MS makes 9 a sub-$60 upgrade.

I could see them doing a cheap upgrade like they did when 8 came out. I remember getting a voucher to upgrade to 8 for like $40 when it released; sure beats shelling out $130 or more for a full OEM copy.

kode54
Nov 26, 2007

aka kuroshi
Fun Shoe

Guni posted:

I think that's the whole point. Hence the lack of recommendations for Gigabyte in the parts picking thread.

Kind of makes me wonder why so many people recommend Gigabyte for Hackintosh builds, but that's for another topic.

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



kode54 posted:

Kind of makes me wonder why so many people recommend Gigabyte for Hackintosh builds, but that's for another topic.

It's because most Gigabyte boards do not require a DSDT, since OS X has drivers that cover most of the components. It typically ends up requiring only an audio and a wired LAN kext, with most other components working natively. Most other motherboard manufacturers' products require a decent amount of DSDT and kext editing to get working with OS X.

kode54
Nov 26, 2007

aka kuroshi
Fun Shoe

SourKraut posted:

It's because most Gigabyte boards do not require a DSDT, since OS X has drivers that cover most of the components. It typically ends up requiring only an audio and a wired LAN kext, with most other components working natively. Most other motherboard manufacturers' products require a decent amount of DSDT and kext editing to get working with OS X.

Ah, then I guess I just got lucky with my choice of MSI board. Only needed audio and IntelE1000e.
