Rigged Death Trap
Feb 13, 2012

BEEP BEEP BEEP BEEP

chippy posted:

I didn't realise Sniper Elite v2 was such a demanding game. Is it really nice looking?

It's demanding because it needs to render your bullets going through Nazi testicles.

chippy
Aug 16, 2006

OK I DON'T GET IT
Nice.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Since I guess this is where HPC coprocessors get talked about, AnandTech has details on the Intel Xeon Phi. 62 modified Pentium cores per die (at least one disabled on shipping cards, two disabled on the 225W SKU). The card is a system-on-a-board, presenting an IP address to the local system for work to be pushed to it. Otherwise, it's a full x86 system running Linux (with a modified kernel to handle its non-standard vector formats).

Fun fact: each of the cores is hyperthreaded for four threads at once. Why? Because the cores are so darn bad at hiding memory latency, and hyperthreading helps with that. But because the cores are also large and internally parallel, the best case is that two full threads can be fully executed per core (one vector and one scalar, or two low-ILP (instruction-level parallelism) scalar threads).

Much like Big Kepler, the topped-off cards are built for memory-bound workloads, with a fully-populated 512-bit GDDR5 memory interface. Max theoretical DP FLOPs is about 1 TFLOPS on the nose. Later next year, compute-bound SKUs will be released with 384-bit memory but clock speeds well above the current batch's ~1 GHz. In comparison, the Tesla K20X has a theoretical max of ~1.3 DP TFLOPs, and the K20 about 1.17.
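(Rough sanity check on that number, mine and not from the article: ~60 active cores x ~1 GHz x 16 double-precision FLOPs per clock (8-wide 512-bit vectors with fused multiply-add) works out to right around 1 TFLOPS, so the headline figure is just core count times clock times vector throughput.)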

Pricing comes in at, well, enterprise-grade, but cheaper per card than Teslas by a good $500 or so.

Phis are currently being loaded into supercomputers, including Stampede at the Texas Advanced Computing Center. This machine already uses 6400 server nodes with 2 petaFLOPs worth of octocore Sandy Bridge Xeons. The Xeon Phi cards are expected to kick things to 10 petaFLOPs. Visualization is handled by a small batch (128) of Tesla K20 GPUs.

Relatedly, the Green500 list of most efficient supercomputers has been updated (direct link). A Xeon Phi machine at the University of Tennessee has the top spot, pairing 225W Phis with SNB Xeon E5 octocore CPUs in blades for 2,499.44 MFLOPS per watt. It's a very small system, though, running only ~45 kW total. In comparison, the Titan supercomputer that debuted the Tesla K20 (along with 16-core Bulldozer Opterons) and is the current #1 on Top500 manages the #3 spot with 2,142.77 MFLOPS/W on 8,209 kW.

Not to be left out, AMD's FirePro S10000 dual-GPU Tahiti card helps a relatively small Saudi Arabian supercomputer (also 8-core Xeon E5s) hit #2 on the Green500 list (#52 on Top500).

Because Phi runs unmodified x86 code, the time investment for writing programs for a Xeon Phi machine is lower than a Tesla or FirePro system. Porting an OpenMP app to CUDA or OpenCL and debugging it can be an enormous pain, and the app has to be EXTREMELY parallel to really take advantage of a GPU's architecture. A Xeon Phi can just run the same drat code you use on the SNB or Westmere or whatever Xeons making up the system's core. And those execution units are built into only 60-ish cores, rather than having to pound through 2600 CUDA cores or 2000 GCN cores, so if you can't pull a lot of ILP out of your workload, it's not the end of the world for performance.
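To make that porting point concrete, here's a minimal sketch (mine, not from the article) of the kind of bog-standard OpenMP code that could run as-is on the host Xeons and simply be rebuilt for the Phi, where a GPU would instead need a CUDA/OpenCL rewrite:

code:
/* Hypothetical illustration: a plain OpenMP dot product. The claim above is
 * that this same C source runs on the host Xeons or on a Phi (recompiled for
 * the card), with no kernel rewrite. Not taken from the article. */
#include <stdio.h>
#include <stdlib.h>
#include <omp.h>

int main(void)
{
    const long n = 1L << 24;
    double *a = malloc(n * sizeof *a);
    double *b = malloc(n * sizeof *b);
    if (!a || !b) return 1;

    for (long i = 0; i < n; i++) { a[i] = 1.0; b[i] = 2.0; }

    double sum = 0.0;
    /* The entire "port" is this pragma; threads map onto however many
     * cores/hardware threads the target happens to have. */
    #pragma omp parallel for reduction(+:sum)
    for (long i = 0; i < n; i++)
        sum += a[i] * b[i];

    printf("dot = %.1f using up to %d threads\n", sum, omp_get_max_threads());
    free(a);
    free(b);
    return 0;
}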

The Illusive Man
Mar 27, 2008

~savior of yoomanity~
AMD hasn't given an ETA on Catalyst 12.11, have they? I'm loving the performance boost but the frequent blue screens in Win 8 aren't quite worth it.

Mr Chips
Jun 27, 2007
Whose arse do I have to blow smoke up to get rid of this baby?

Factory Factory posted:

Since I guess this is where HPC coprocessors get talked about, AnandTech has details on the Intel Xeon Phi.
I'm hanging out to see how the workstation cards end up, mainly for my guys running x86 MPI code on dual-socket workstations.

Mr Chips fucked around with this message at 05:00 on Nov 17, 2012

Star War Sex Parrot
Oct 2, 2003

lkz posted:

Real World Tech is also a pretty good site for some of those nitty-gritty details on CPU/GPU architectures. This is getting a little off topic but here's a pretty recent article from them on Haswell architecture.
The most interesting thing I've ever learned at a CES is that David Kanter is a loving weirdo.

lkz
May 1, 2009
Soiled Meat

Star War Sex Parrot posted:

The most interesting thing I've ever learned at a CES is that David Kanter is a loving weirdo.

Ok, so now I have to know more.

Paradox Personified
Mar 15, 2010

:sun: SoroScrew :sun:
What would cause a random black square on my Radeon HD 5770 to make a clicking sound, force one of the attached metal pins to blow itself off the card, and shoot an at-least-2-inch jet of flame off of that specific square on the card that took at least half a minute to subdue?
I've messed with computers for years now and never saw anything like that... The CPU temp, as I remember, wasn't over 130F while running a game; dunno the GPU temp.
Some drunk dumbass tripped over the cord from the monitor to the case, so I had to fix it with a VGA cable... It had been in place for weeks before the volcano incident... Could it have been the DVI-to-VGA adapter that had to be attached to the end of the new VGA cable? Some kind of short event?

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

Paradox Personified posted:

What would cause a random black square on my Radeon HD 5770 to make a clicking sound, force one of the attached metal pins to blow itself off the card, and shoot an at-least-2-inch jet of flame off of that specific square on the card that took at least half a minute to subdue?
I've messed with computers for years now and never saw anything like that... The CPU temp, as I remember, wasn't over 130F while running a game; dunno the GPU temp.
Some drunk dumbass tripped over the cord from the monitor to the case, so I had to fix it with a VGA cable... It had been in place for weeks before the volcano incident... Could it have been the DVI-to-VGA adapter that had to be attached to the end of the new VGA cable? Some kind of short event?

Overcurrent on a MOSFET, causing it to blow. If you were overclocking, it could've been too aggressive for too long. Otherwise, you're looking at a bad PSU or a plain old bad card. Any way it goes, the card is dead.

E: Looked about like this, except not in slow motion, right?

https://www.youtube.com/watch?v=NOc_alkgXYQ

Factory Factory fucked around with this message at 01:23 on Nov 20, 2012

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
Anandtech has a quick WiiU teardown and hardware analysis. There's a POWER7 CPU (I guess that puts paid to the idea that POWER7 isn't power-efficient enough for consoles), Radeon HD 4850 GPU, some eDRAM, and 2GB of single-channel DDR3-1600. Right now these are all discrete 40nm-class dies, though I'd expect to see them share a die when 28nm fab capacity is available. It's interesting how precisely it matches expectations, making me think that the Xbox 720 will also be the expected POWER7 + AMD GPU combination. It's also interesting how little memory bandwidth the WiiU has; the eDRAM will make up for that somewhat, but I think this reflects a design choice not to support pushing around a lot of texture data.

I think we'll see the gaming capabilities of the WiiU surpassed in very short order by tablets and smartphones. Current generation tablets have as much memory bandwidth (not counting the eDRAM), and the iPad 4 has an astounding number of GPU shader cores for a mobile device (and a much higher resolution display than the WiiU will ever drive). While the raw performance still lags significantly behind the 360 and PS3, mobile devices will only get more flexible and efficient. And they do it at 1% of the power.

It's also funny to point out just how far behind smartphones the WiiU is in browsing performance. While that's largely due to the older WebKit code, it shows that efficient use of limited horsepower will always beat throwing hardware inefficiently at a problem.

Alereon fucked around with this message at 01:41 on Nov 20, 2012

Paradox Personified
Mar 15, 2010

:sun: SoroScrew :sun:

Factory Factory posted:

Overcurrent on a MOSFET, causing it to blow. If you were overclocking, it could've been too aggressive for too long. Otherwise, you're looking at a bad PSU or a plain old bad card. Any way it goes, the card is dead.

E: Looked about like this, except not in slow motion, right?

https://www.youtube.com/watch?v=NOc_alkgXYQ

Yes! Ah, god, I'll have nightmares 'till the day I die. Thanks. I never overclock anymore, not since high school 10 years ago.
I really don't want to risk anything ever again. Thanks for that. I'll go with an old card; the PSU seems to be good, as far as I can tell. The rig will POST and the HD spins up, but the monitor gets no signal.
Thanks again. I hate having unanswered questions.

future ghost
Dec 5, 2005

:byetankie:
Gun Saliva

Paradox Personified posted:

Yes! Ah, god, I'll have nightmares 'till the day I die. Thanks. I never overclock anymore, not since high school 10 years ago.
I really don't want to risk anything ever again. Thanks for that. I'll go with an old card; the PSU seems to be good, as far as I can tell. The rig will POST and the HD spins up, but the monitor gets no signal.
Thanks again. I hate having unanswered questions.
I shorted a resistor on a GPU once while trying to make a custom VRM cooler. Shot flames out of the bottom of the card but didn't damage anything else. Looked neat, though I was out a spare 4870.

What power supply model are you using?

Chuu
Sep 11, 2004

Grimey Drawer

Alereon posted:

Anandtech has a quick WiiU teardown and hardware analysis. There's a POWER7 CPU (I guess that puts paid to the idea that POWER7 isn't power-efficient enough for consoles)

Do people care about the power efficiency of their consoles? Other than the size of the brick I'm not sure most people are conscious of it.

syzygy86
Feb 1, 2008

Chuu posted:

Do people care about the power efficiency of their consoles? Other than the size of the brick I'm not sure most people are conscious of it.

It's more of an issue with heat dissipation than consumer concerns with power consumption. People want small, quiet consoles, which makes removing heat difficult.

Endymion FRS MK1
Oct 29, 2011

I don't know what this thing is, and I don't care. I'm just tired of seeing your stupid newbie av from 2011.
What is the difference between SSAO and HBAO? Which one is more expensive and which one gives better-looking ambient occlusion?

Shaocaholica
Oct 29, 2002

Fig. 5E
Do the latest Nvidia drivers with 6XX GPUs support nearest neighbor resolution scaling?

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

Endymion FRS MK1 posted:

What is the difference between SSAO and HBAO? Which one is more expensive and which one gives better-looking ambient occlusion?

HBAO is a more accurate simulation of real light behavior, but it's more computationally intensive than SSAO. The difference is algorithmic, and you're welcome to try to make sense of the SIGGRAPH HBAO presentation, because I can't translate it from math to English.
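For anyone who wants the gist without the SIGGRAPH math, here's a toy CPU-side sketch of the basic SSAO idea (my illustration, not how any particular game or driver does it). HBAO differs by marching along screen-space directions and integrating a horizon angle per pixel, which is where the extra cost comes from:

code:
/* Toy sketch of plain SSAO for one pixel: sample the depth buffer at a few
 * nearby offsets and count how many samples sit in front of this pixel.
 * More "blockers" means more occlusion, i.e. a darker ambient term.
 * Real implementations run in a pixel shader and add normals, range checks,
 * random rotation and a blur pass. Hypothetical illustration only. */
#include <stdlib.h>

float ssao_at(const float *depth, int w, int h,
              int x, int y, int radius, int samples)
{
    float d0 = depth[y * w + x];
    int blockers = 0;

    for (int i = 0; i < samples; i++) {
        /* crude random offset inside a square kernel around the pixel */
        int sx = x + (rand() % (2 * radius + 1)) - radius;
        int sy = y + (rand() % (2 * radius + 1)) - radius;
        if (sx < 0 || sx >= w || sy < 0 || sy >= h)
            continue;
        if (depth[sy * w + sx] < d0 - 0.001f) /* neighbor is closer to camera */
            blockers++;
    }

    /* 1.0 = fully open, 0.0 = fully occluded */
    return 1.0f - (float)blockers / (float)samples;
}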

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Shaocaholica posted:

Do the latest Nvidia drivers with 6XX GPUs support nearest neighbor resolution scaling?

They support very high quality scaling, and you can choose whether to scale while maintaining aspect ratio, or stretch-to-fit, with other options available as well. It is dramatically superior in quality to the mediocre nearest-neighbor quickie scaling done by typical monitors when operating at non-native resolutions, but I forget exactly which scaling method is employed. Both companies have had on-board high quality scaling that's basically free (in terms of performance and latency) for a long, long time now, and the 600-series with the latest drivers are no exception. People with high resolution screens have remarked that they prefer to set demanding games' internal resolution lower and let the graphics card scale it up, so they can play with visual goodies enabled that would otherwise kill the steady performance they enjoy.

Any reason in particular you're looking for nearest neighbor...?

Agreed fucked around with this message at 07:42 on Nov 25, 2012

Shaocaholica
Oct 29, 2002

Fig. 5E

Agreed posted:

Any reason in particular you're looking for nearest neighbor...?

Running half res on a FHD 13" laptop LCD for superior sharpness. I haven't been all that impressed with bicubic or whatever scaling that's been around for ages. It just can't beat nearest neighbor in terms of sharpness, and half res on a FHD 13" LCD should still have plenty good DPI and sharpness.
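For reference, nearest-neighbor is about as simple as scaling gets: each output pixel copies the single closest source pixel with no blending, so at an exact integer ratio (960x540 doubled to 1920x1080) every source pixel just becomes a crisp 2x2 block. A toy sketch of the idea (mine, not what any driver actually implements):

code:
/* Toy nearest-neighbor upscale: every destination pixel copies the single
 * closest source pixel, with no filtering. At an exact 2x ratio (e.g. 960x540
 * up to 1920x1080) each source pixel becomes a sharp 2x2 block, which is why
 * it looks crisper than bicubic at half of native res. Illustration only. */
#include <stdint.h>

void scale_nearest(const uint32_t *src, int src_w, int src_h,
                   uint32_t *dst, int dst_w, int dst_h)
{
    for (int y = 0; y < dst_h; y++) {
        int sy = y * src_h / dst_h;           /* nearest source row    */
        for (int x = 0; x < dst_w; x++) {
            int sx = x * src_w / dst_w;       /* nearest source column */
            dst[y * dst_w + x] = src[sy * src_w + sx];
        }
    }
}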

Joink
Jan 8, 2004

What if I told you cod is no longer a fish :coolfish:

The WiiU really is just a new gaming console and nothing more. It definitely does have limitations where it shouldn't, such as memory bandwidth, but games made for it do run at 1080p and 60fps (NSMBU does, anyway; it's the only game I own). I think once developers start putting out games that utilize the GPU over the CPU we'll have a better understanding of the graphical limits of the system. Current consoles rely on a fast CPU and are held back by terrible GPU hardware.

What I find surprising is that it's sold at a small loss given its hardware and price. Based on that, I really don't see how MS/Sony could produce a system that outperforms the WiiU on such a scale that we're looking at Wii vs. 360 performance all over again. Unless MS/Sony want to sell $600 consoles or take much larger losses on hardware sold, the WiiU's hardware should hold up fine vs. new consoles. Given Sony's current finances, it's hard to say what kind of console hardware they will produce. But for the WiiU right now it's a matter of figuring out the hardware for developers. A few of the launch titles that are ports are terrible while others like COD/AC3 seem like solid ports.

PC LOAD LETTER
May 23, 2005
WTF?!

Alereon posted:

There's a POWER7 CPU (I guess that puts paid to the idea that POWER7 isn't power-efficient enough for consoles)
Searching for "POWER7" turns up nothing in that article though? He does mention the CPU is PowerPC-based (so is the Wii's CPU, based on the 750CL apparently), but the two aren't the same thing even though they're associated. There is also some new info that has come to light. If what some of the people in the B3D thread are saying is true, then the WiiU's CPU is just an updated version of the Wii's CPU, and not a very good one at that. It's apparently quite a bit slower than the X360's CPU, so the WiiU probably won't be able to run some ports from current consoles.

Anyway, with a CPU die of only 33mm2 on a 45nm process and power usage so low that the whole system only uses around 33W when in use, it's safe to say that even if the CPU isn't just a mildly updated Broadway it still isn't POWER7-based. For reference, on the same process POWER7 chewed up 567mm2 of die space and 100W at its slowest speed. It's not an apples-to-apples comparison, of course: the WiiU's CPU is supposedly a tri-core chip while POWER7 was quad-core at a minimum, there are lots of differences in cache and the bus too, and the WiiU's CPU supposedly has a clock of 1.29-1.6GHz while POWER7's slowest available clock was 2.4GHz.

Joink posted:

What I find surprising is that it's sold at a small loss given its hardware and price.
The WiiU tablet controller is probably what is causing that. Apparently if you buy one game Nintendo more than makes up the loss, so it'll probably still be profitable from day 1 for them. Given the rumors we've been hearing about what the PS4 (glorified Trinity APU of sorts) and X720 (AMD 67xx-class GPU + maybe an AMD CPU or a multi-core ARM variant, or a bit of both + 4-8GB of RAM) will be, I don't think any console manufacturer is going to push the limits financially as much as they did back when the PS3/X360 launched.

Jan
Feb 27, 2008

The disruptive powers of excessive national fecundity may have played a greater part in bursting the bonds of convention than either the power of ideas or the errors of autocracy.

Agreed posted:

Both companies have had on-board high quality scaling that's basically free

Caveat emptor: AMD/ATI's GPU scaling option has a completely idiotic usability issue where you can't change the scaling type (i.e.: Maintain aspect ratio, Stretch to full size or Center image) unless it currently is active, and the default is to stretch. And, of course, if you alt-tab away from a game to change the GPU scaling option, you revert to your native resolution so it's not active anymore. So you have to lower your resolution before starting whatever fullscreen game that will use that lower resolution, set it to Maintain aspect ratio (or other choice), then start the game, and restore your native resolution whenever you're done.

:psyduck:

Dominoes
Sep 20, 2007

Repost: Do you use a multi-monitor gaming setup? If so, two questions:

1: Do you use ATI or Nvidia?
2: Does vsync work in fullscreen?


Myself: ATI, no. I found someone on a different forum who said it works with Nvidia, but I'm looking for confirmation. (Or if someone did get it working on ATI, which I'm starting to think won't happen.)

incoherent
Apr 24, 2004

01010100011010000111001
00110100101101100011011
000110010101110010

Agreed posted:

They support very high quality scaling, and you can choose whether to scale while maintaining aspect ratio, or stretch-to-fit, with other options available as well. It is dramatically superior in quality to the mediocre nearest-neighbor quickie scaling done by typical monitors when operating at non-native resolutions, but I forget exactly which scaling method is employed. Both companies have had on-board high quality scaling that's basically free (in terms of performance and latency) for a long, long time now, and the 600-series with the latest drivers are no exception. People with high resolution screens have remarked that they prefer to set demanding games' internal resolution lower and let the graphics card scale it up, so they can play with visual goodies enabled that would otherwise kill the steady performance they enjoy.

This (or a variation of it) should go in the OP.

Also, should my digital color format be anything other than RGB? My monitor can do the YCbCr settings, but I lose my mouse pointer (a bug, perhaps). The colors look very vibrant... slightly oversaturated, but nothing an adjustment couldn't fix.

incoherent fucked around with this message at 19:49 on Nov 25, 2012

Killer robot
Sep 6, 2010

I was having the most wonderful dream. I think you were in it!
Pillbug

PC LOAD LETTER posted:

The WiiU tablet controller is probably what is causing that. Apparently if you buy one game Nintendo more than makes up the loss, so it'll probably still be profitable from day 1 for them. Given the rumors we've been hearing about what the PS4 (glorified Trinity APU of sorts) and X720 (AMD 67xx-class GPU + maybe an AMD CPU or a multi-core ARM variant, or a bit of both + 4-8GB of RAM) will be, I don't think any console manufacturer is going to push the limits financially as much as they did back when the PS3/X360 launched.

I hadn't thought of the tablet controller, but that would explain some of it. I agree on the rest: no one's going to be trying to challenge high end gaming PCs this generation, and there's nothing like the HD-DVD/Blu-Ray standards war driving hardware specs from outside the gaming industry either. So no one's going to be releasing fantastically expensive consoles still sold at a steep loss. The WiiU is going to be the weakest of its generation, but it should be closer to the GameCube vs. the PS2/Xbox than the Wii vs. PS3/360. Its RAM concerns me more than anything. At 2GB it's far larger than the Wii in any case, but only 1GB available to games means it's not that huge a leap over the PS3/360, and it will be well behind their successors. Memory seems to be a huge and often overlooked factor in game design and porting, especially as consoles age, so I expect it might be one of the big concerns with the system a few years from now.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
2GB is a healthy enough amount, but how on earth did they require 1GB for the OS? Isn't the Xbox 360 OS limited to 16MB?

hobbesmaster
Jan 28, 2008

HalloKitty posted:

2GB is a healthy enough amount, but how on earth did they require 1GB for the OS? Isn't the Xbox 360 OS limited to 16MB?

I imagine that is OS managed memory and the other 1GB is what you can fill from your own malloc calls.

Unfortunately Nintendo has some pretty restrictive NDAs so that kind of stuff isn't usually public for their consoles.

PC LOAD LETTER
May 23, 2005
WTF?!

Killer robot posted:

The WiiU is going to be the weakest of its generation, but it should be closer to the GameCube vs. the PS2/Xbox than the Wii vs. PS3/360.
I dunno, the guys in the B3D thread and public comments by developers working on the WiiU seem to suggest that it may actually be a fairly gimped console by next-gen standards. Lots of hints have been dropped that not only is the CPU fairly slow for a new console and the RAM amount limited, but there is also a dearth of internal bandwidth to play with, and apparently high latency at that too.

I'm sure it'll still be able to put out games that look significantly better than the current consoles, but it might still end up looking significantly worse than what the PS4/X720 will be able to do. If their new controller approach doesn't take off in a big way a la Wii, then the WiiU might not have much market staying power.

They'll probably still make money and definitely break even though, so no need for anyone to freak out. I'm not suggesting a "they're doomed" scenario. Even if they did a mediocre/crappy job on the hardware, Nintendo has obviously thought things through financially here.

Shaocaholica
Oct 29, 2002

Fig. 5E

Killer robot posted:

Memory seems to be a huge and often overlooked factor in game design and porting, especially as consoles age, so I expect it might be one of the big concerns with the system a few years from now.

It seems that with 1GB+ available for the game, you could easily port from PS4/X720 to WiiU with just texture scaling. Maybe ditch certain indirect images too. Basically you don't have to do much to port the game logic, but you'll just get a shittier-looking game. It's the cheapest way, and by virtue of that it's what will happen.

Now where's my 8GB PS4/X720?

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

Shaocaholica posted:

It seems that with 1GB+ available for the game, you could easily port from PS4/X720 to WiiU with just texture scaling. Maybe ditch certain indirect images too. Basically you don't have to do much to port the game logic, but you'll just get a shittier-looking game. It's the cheapest way, and by virtue of that it's what will happen.
Depends on the CPU they use in the PS4/720. If the WiiU's CPU is as lovely as the rumours suggest (slower than the Xenon in the 360, and apparently by quite a bit) - and the PS4/720 use an x86 CPU as has been 'rumoured' - then that may be a significant barrier.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Jan posted:

Caveat emptor: AMD/ATI's GPU scaling option has a completely idiotic usability issue where you can't change the scaling type (i.e.: Maintain aspect ratio, Stretch to full size or Center image) unless it currently is active, and the default is to stretch. And, of course, if you alt-tab away from a game to change the GPU scaling option, you revert to your native resolution so it's not active anymore. So you have to lower your resolution before starting whatever fullscreen game that will use that lower resolution, set it to Maintain aspect ratio (or other choice), then start the game, and restore your native resolution whenever you're done.

:psyduck:

Well, that's an awful lot of trouble that I have not had to go to, ever. I believe team green had an issue with the 8800 not scaling properly, but that was drivers and hardware not talking to each other as they were supposed to; ever since my GTX 280, through the GTX 580, and now with a GTX 680, I get extremely high quality scaling for anything that isn't at the native resolution of my monitor (and it worked well on the old card at the native resolution of my previous 19" 5:4 LCD monitor, and works well on my newer cards at 1080p). Certainly none of that, which frankly sounds like a huge pain in the rear end and way more than you should have to dick with to get something as simple as scaling working.

So I guess that's a point for nVidia. Image quality is, as stated, extremely high, though I really doubt it's nearest-neighbor just because it'd look like rear end if it were. Hard to find many details on it, though I believe it's related to Shannon-Nyquist shock filtering? Sort of a less time-consuming implementation of anisotropic filtering, but I'm no expert in the specifics.

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE

Agreed posted:

Well, that's an awful lot of trouble that I have not had to go to, ever. I believe team green had an issue with the 8800 not scaling properly, but that was drivers and hardware not talking to each other as they were supposed to; ever since my GTX 280, through the GTX 580, and now with a GTX 680, I get extremely high quality scaling for anything that isn't at the native resolution of my monitor (and it worked well on the old card at the native resolution of my previous 19" 5:4 LCD monitor, and works well on my newer cards at 1080p). Certainly none of that, which frankly sounds like a huge pain in the rear end and way more than you should have to dick with to get something as simple as scaling working.

So I guess that's a point for nVidia. Image quality is, as stated, extremely high, though I really doubt it's nearest-neighbor just because it'd look like rear end if it were. Hard to find many details on it, though I believe it's related to Shannon-Nyquist shock filtering? Sort of a less time-consuming implementation of anisotropic filtering, but I'm no expert in the specifics.

My old 9800GX2 wouldn't scale if you were using HDMI. Glad to see that it works better now.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

PC LOAD LETTER posted:

Searching for "POWER7" turns up nothing in that article though? He does mention the CPU is PowerPC-based (so is the Wii's CPU, based on the 750CL apparently), but the two aren't the same thing even though they're associated. There is also some new info that has come to light. If what some of the people in the B3D thread are saying is true, then the WiiU's CPU is just an updated version of the Wii's CPU, and not a very good one at that. It's apparently quite a bit slower than the X360's CPU, so the WiiU probably won't be able to run some ports from current consoles.
My bad, I didn't notice that IBM retracted their confirmation that it was POWER7 (they had previously confirmed it used the same POWER7 architecture as the CPUs in the Power 750 Express server). It'll be interesting to see what it actually is; while merely doubling the Wii CPU's clockspeed would be a uniquely Nintendo solution, that seems monumentally idiotic even for them. I thought even Nintendo was on board with the idea that having such weak hardware limited the utility of the console and thus revenue over the life of the product.

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️

chippy posted:

I didn't realise Sniper Elite v2 was such a demanding game. Is it really nice looking?

Even from that NV chart there is no good reason why Skyrim or DA2 should run worse on a 680 than BF3, when neither is graphically anywhere close to BF3.

Speaking of which, I hope reviewers actually touch upon the graphical quality of a game relative to its GPU demands instead of merely posting FPS numbers for a card in a game, because there is a worrying trend of graphical quality slipping (relative to Crysis 1 in particular) while the requirements keep rising.

Athropos
May 4, 2004

"Skeletons are Number One! Flesh just slows you down."
I just bought a second GTX 680 for some hot SLI action and for now I'm running into issues. I haven't had time to test further before being forced to go to work, but after installing my second card my monitor would not receive any input anymore, from any DVI slot. The computer would apparently boot, the peripherals would power up, both cards' fans are running (at an alarming 100% speed, probably) and I do believe I heard Windows boot up (will need to check again). But my screen does not show anything; it keeps doing what it was doing (blinking blue LED).

I tried to go back to one card, tried either card, and still nothing. Does changing video cards gently caress up monitors somehow? Like, does it need to be reset somehow? Would pressing F8 help? It's a Korean IPS 27" display, if that matters. Fairly barebones monitor, worked perfectly fine before today.

Herewaard
Jun 20, 2003

Lipstick Apathy

Athropos posted:

I just bought a second GTX 680 for some hot SLI action and for now I'm running into issues. I haven't had time to test further before being forced to go to work, but after installing my second card my monitor would not receive any input anymore, from any DVI slot. The computer would apparently boot, the peripherals would power up, both cards' fans are running (at an alarming 100% speed, probably) and I do believe I heard Windows boot up (will need to check again). But my screen does not show anything; it keeps doing what it was doing (blinking blue LED).

I tried to go back to one card, tried either card, and still nothing. Does changing video cards gently caress up monitors somehow? Like, does it need to be reset somehow? Would pressing F8 help? It's a Korean IPS 27" display, if that matters. Fairly barebones monitor, worked perfectly fine before today.

Do you normally see a POST screen with that monitor? With my 2560x1440 I won't get a post screen and won't get anything until the OS loads.

And then I know some newer motherboards (like my ASUS motherboards) require you to enter the BIOS when there is a hardware change. Whenever I change hardware I have to plug in a smaller monitor and follow the stupid instructions to enter the BIOS just to get the system to boot that first time after the change.

Animal
Apr 8, 2003

Herewaard posted:

Do you normally see a POST screen with that monitor? With my 2560x1440 I won't get a post screen and won't get anything until the OS loads.

And then I know some newer motherboards (like my ASUS motherboards) require you to enter the BIOS when there is a hardware change. Whenever I change hardware I have to plug in a smaller monitor and follow the stupid instructions to enter the BIOS just to get the system to boot that first time after the change.

I don't think not getting a POST has much to do with the screen. More likely it's which video card is set as primary in the BIOS.

craig588
Nov 19, 2005

by Nyc_Tattoo
I'd try replacing the DVI cable before you replace any other hardware if it comes to that. I had a VGA cable fail once and it was the most frustrating thing because I completely ignored it as a possibility until I ended up trying 3 different monitors and 2 different sources.

Squibbles
Aug 24, 2000

Mwaha ha HA ha!

Animal posted:

I don't think not getting a POST has much to do with the screen. More likely it's which video card is set as primary in the BIOS.

The cheap Korean IPSes don't do image scaling, so if the video card isn't doing the scaling to native resolution then the monitor won't show anything. My Nvidia 570 does do scaling but, as the poster above said, not all cards do. Perhaps it's a problem with the 680 not scaling?

Athropos
May 4, 2004

"Skeletons are Number One! Flesh just slows you down."
Thinking of it, I'm not sure if I usually get a POST screen with that monitor. These days hardware is so fast that you barely have time to see anything before Windows starts loading.

I brought my system to a technician today and we tried many things.

1. We tested with another monitor and still there was no signal on DVI, but we had a signal on VGA and the computer would boot normally (went on the desktop and whatnot). So that rules my monitor out as being the culprit.

2. We then tried each individual card in another machine, and they would have the same behavior (no signal, 100% fan speed constantly).

3. We then tested a random PCI-E card in my system and that card would output DVI just fine. Which means I didn't fry my PCI-E slots or my motherboard somehow.

So that leaves me with two dead GTX 680s, one of which is brand new. What the hell. I still can't imagine they're fried or had a critical hardware failure. Would it be possible it might still be software-related? Something with the BIOS they put on these cards loving up when I tried SLI? I've been using the first 680 since June and it worked perfectly fine before trying SLI; now both are dead.

Athropos fucked around with this message at 23:09 on Dec 6, 2012
