Coredump
Dec 1, 2002

pixaal posted:

I see your logic: I'm going to run Windows 3.1, and if I need DirectX I'll boot into Windows ME.

I would be afraid to see Windows ME boot on modern hardware. I think a black hole would open...


Wedesdo
Jun 15, 2001
I FUCKING WASTED 10 HOURS AND $40 TODAY. FUCK YOU FATE AND/OR FORTUNE AND/OR PROBABILITY AND/OR HEISENBERG UNCERTAINTY PRINCIPLE.

Install Gentoo posted:

Reminds me of this kid who in 2011 was running XP Home (32-bit, of course, since no 64-bit version of it exists) on a high-end six-core AMD CPU with 16 GB of RAM and two video cards. The cards needed supplemental power leads plugged in, he hadn't plugged them in, so they were struggling by on PCI-Express slot power alone, and he was bragging about the system.

And in a separate forum on that same site, he was asking for help with why his machine wasn't performing as well as it should, while insisting that running a ten-year-old 32-bit OS made it faster.

The funniest part: unless he was running Win XP x64 (which he probably wasn't), only ~3.2GB of that 16GB was actually usable.
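
Back-of-the-envelope, for anyone wondering where the ~3.2GB figure comes from. A sketch, not gospel: the MMIO reservation below is an assumed number, and it varies with the motherboard and how much address space the video cards claim.
code:
# A 32-bit OS without PAE remapping can only address 2**32 bytes = 4 GB,
# and PCI/PCIe devices (GPUs especially) get mapped below that 4 GB line.
ADDRESS_SPACE_GB = 4.0
MMIO_RESERVED_GB = 0.8  # assumed; board- and GPU-dependent

def usable_ram_gb(installed_gb):
    """RAM the OS can actually use, in GB."""
    return min(installed_gb, ADDRESS_SPACE_GB - MMIO_RESERVED_GB)

print(usable_ram_gb(16))  # -> 3.2: the other 12.8 GB just sits there
print(usable_ram_gb(2))   # -> 2.0: under the limit, all of it usable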

Sprat Sandwich
Mar 20, 2009

Coredump posted:

I would be afraid to see Windows ME boot on modern hardware. I think a black hole would open...

ME? That slow piece of poo poo? Windows 98 SE or bust.

Anyhoo, are the 7850s going to be comparable to 6950s? I'm thinking of picking one up when they finally drop.

Nintendo Kid
Aug 4, 2011

by Smythe

Wedesdo posted:

The funniest part: unless he was running Win XP x64 (which he probably wasn't), only ~3.2GB of that 16GB was actually usable.

I did explicitly say he was running XP Home and that it's 32-bit only, yes. Also, he'd already told the forum that he'd recently added more RAM to speed things up and that it was "totally working", even though of course it couldn't be.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

Install Gentoo posted:

I did explicitly say he was running XP Home and that it's 32-bit only, yes. Also, he'd already told the forum that he'd recently added more RAM to speed things up and that it was "totally working", even though of course it couldn't be.

Maybe he was using totally legitimate copies of Adobe Creative Suite software, which could manage its own memory and thus use chunks of RAM beyond 4 GB that the OS itself couldn't touch. For his Important Projects That Justify Such a Machine. This kid, doing that.

Sounds right to me.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Install Gentoo posted:

I did explicitly say he was running XP Home and that it's 32-bit only, yes. Also, he'd already told the forum that he'd recently added more RAM to speed things up and that it was "totally working", even though of course it couldn't be.

I'm not amazed. I knew a guy who bought an Alienware desktop with 32-bit XP and 8GB of RAM, years ago, back when the GeForce 6800 was hot poo poo, if memory serves.

JawnV6
Jul 4, 2004

So hot ...

necrobobsledder posted:

I totally remember that post on Hardforums. The innards of the chip were ripped out and you could see its different metal layers. It looked like an x-ray version of the VLSI diagrams put out in press kits, but in vertical cross-section form.

Anyone have this picture?

future ghost
Dec 5, 2005

:byetankie:
Gun Saliva

JawnV6 posted:

Anyone have this picture?

Not sure if it's the same thing, but here's a pic from someone on XS who destroyed a 965 i7 trying to remove the IHS [can't host the pic right now as imgur's doing their anti-SOPA blackout]:
http://tinyurl.com/7d94rf9



Also, this piece of genius:
http://www.legitreviews.com/article/402/2/
code:
"The system would power on, but sadly it wouldn't post. 

Deathreaper
Mar 27, 2010
On a more positive AMD-related note: I'm considering buying two 7970s, and I was wondering whether crossfired 7970s would be significantly bottlenecked by the two PCI-E 2.0 x8 slots on my P67 motherboard?

pixaal
Jan 8, 2004

All ice cream is now for all beings, no matter how many legs.


grumperfish posted:

code:
"The system would power on, but sadly it wouldn't post. 

So what happened? The fans turned on? Congratulations, you verified your PSU is functioning.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Deathreaper posted:

On a more positive AMD-related note: I'm considering buying two 7970s, and I was wondering whether crossfired 7970s would be significantly bottlenecked by the two PCI-E 2.0 x8 slots on my P67 motherboard?
It should work well enough, but I'd recommend STRONGLY considering whether you really want to go Crossfire, especially given how powerful the 7970 already is. Also check out this TechReport article about micro-stuttering.
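
For a sense of scale, the slot arithmetic looks like this (a sketch; whether 4 GB/s actually starves a 7970 in a given game is something only benchmarks can settle):
code:
# PCIe 2.0 runs 5 GT/s per lane with 8b/10b encoding, which nets out to
# roughly 500 MB/s per lane in each direction.
PCIE2_MBS_PER_LANE = 500

def slot_bandwidth_gbs(lanes):
    """One-direction bandwidth of a PCIe 2.0 slot, in GB/s."""
    return lanes * PCIE2_MBS_PER_LANE / 1000

print(slot_bandwidth_gbs(8))   # -> 4.0 GB/s per card in P67 x8/x8 Crossfire
print(slot_bandwidth_gbs(16))  # -> 8.0 GB/s in a full x16 slot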

Uberbob102000
Sep 15, 2009

Alereon posted:

It should work well enough, but I'd recommend STRONGLY considering whether you really want to go Crossfire, especially given how powerful the 7970 already is. Also check out this TechReport article about micro-stuttering.

I'm agreeing with you on that; I'm looking to move to a single 7970 instead of a 6970 Crossfire setup because of Crossfire's wonkiness.

pixaal
Jan 8, 2004

All ice cream is now for all beings, no matter how many legs.


Also, nothing is stopping you from buying a second 7970 in six months if you need it for some reason. Now, if you want/need to run 4+ monitors, I think you can actually do that with two cards without enabling Crossfire; you just can't game on the ones attached to the second card, but it should work fine.

Though I think anything from the 7000 series would work in this mode. At least, that's how it sounded when I tried to see whether a 4850 and a 5200 (or so) would work at the same time; Catalyst said I needed two 4000-series or two 5000-series cards, and I could only enable one at a time. I really have no proof other than this, though, so if you need this messed-up setup, do a bit of research first.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

pixaal posted:

Also, nothing is stopping you from buying a second 7970 in six months if you need it for some reason. Now, if you want/need to run 4+ monitors, I think you can actually do that with two cards without enabling Crossfire; you just can't game on the ones attached to the second card, but it should work fine.

Though I think anything from the 7000 series would work in this mode. At least, that's how it sounded when I tried to see whether a 4850 and a 5200 (or so) would work at the same time; Catalyst said I needed two 4000-series or two 5000-series cards, and I could only enable one at a time. I really have no proof other than this, though, so if you need this messed-up setup, do a bit of research first.
Things have really changed since the 4000-series. On the 4000-series you can do two monitors per card OR two monitors total with Crossfire enabled. On current-gen cards you can do up to six monitors per card depending on the model, though I think six stays the limit if you do Crossfire; I'm not sure. The idea is that you should be able to use MST hubs to connect multiple monitors to each DisplayPort port, but they're not actually for sale yet. On nVidia you can do up to two monitors per card OR two monitors total in SLI. In either case you can add additional cards to hang more monitors off of.
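
The MST hub math, roughly, looks like this (round numbers and no packet overhead, so read the result as a ceiling rather than a promise):
code:
# How many 1080p60 streams fit down one DisplayPort 1.2 connector?
LANES = 4
HBR2_GBPS_PER_LANE = 5.4   # raw line rate per lane, DP 1.2
EFFICIENCY = 0.8           # 8b/10b encoding overhead

link_gbps = LANES * HBR2_GBPS_PER_LANE * EFFICIENCY  # ~17.3 Gbps usable

pixel_clock_mhz = 148.5    # standard 1080p60 timing
bits_per_pixel = 24
stream_gbps = pixel_clock_mhz * bits_per_pixel / 1000  # ~3.56 Gbps

print(int(link_gbps // stream_gbps))  # -> 4 monitors per connector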

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
Good news: AMD says the upcoming Trinity APU will be fast, faster than they had initially told us to expect. They say to expect CPU performance gains of 25% over Llano, with graphics performance gains of 50%. The CPU side is two Piledriver modules (improved Bulldozer, optimized for desktops rather than servers) making four "cores", paired with an unannounced Radeon HD 7000-series VLIW4 GPU.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

Alereon posted:

On nVidia you can do up to two monitors per card OR two monitors total in SLI.

Small correction: with higher-end cards, you can do 3 monitors total in SLI, with very limited choices of resolution (nVidia's 3D Vision Surround system requirements: http://www.nvidia.com/object/3d-vision-surround-system-requirements.html).

30 TO 50 FERAL HOG
Mar 2, 2005



Why? I really don't understand how that is a limitation in the year 2012. Why can a card have three outputs and not be capable of driving all of them?

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
It has to do with how the signal is generated. DVI and HDMI use a TMDS generator to create a clock signal, with matching circuitry on the receiving end to recover it. You need one generator per unique display. These generators cost money and take up card space, so it's not super cost-effective to install more when most people use one or two monitors. Same deal with VGA connections and RAMDACs, roughly. And the connector pins are fixed-function, with distinct pins for the clock signal; you gotta work with that.
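
To put numbers on that (standard timings; a sketch, not a spec quote):
code:
# Each monitor's timing sets its own pixel clock, and a TMDS link just
# serializes 10 bits per color channel per pixel at that clock, so every
# display needs its own clock generator.
def tmds_channel_gbps(pixel_clock_mhz):
    """Bit rate of each of the three TMDS data channels."""
    return pixel_clock_mhz * 10 / 1000

print(tmds_channel_gbps(148.5))  # 1080p60 -> 1.485 Gbps per channel
print(tmds_channel_gbps(154.0))  # 1920x1200@60 (reduced blanking) -> 1.54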

DisplayPort, OTOH, works on packet data with self-clocking. And since each connector and signal generator comes with four distinct "lanes," you can handle them with some flexibility. Either run the lanes separately and drive a 1080p60 on each one, or send a quarter of a single large display on each and aggregate them for, say, 1600p120.

Of course, you need proper driver support and hardware for this lane tomfoolery, and nVidia gives next to no shits about DisplayPort. Their driver updates regularly disable the DP output entirely until they accidentally turn it back on a month or two down the line. Never mind triple monitor on one card. nVidia gives no shits.

pixaal
Jan 8, 2004

All ice cream is now for all beings, no matter how many legs.


Factory Factory posted:

It has to do with how the signal is generated. DVI and HDMI use a TMDS generator to create a clock signal, with matching circuitry on the receiving end to recover it. You need one generator per unique display. These generators cost money and take up card space, so it's not super cost-effective to install more when most people use one or two monitors. Same deal with VGA connections and RAMDACs, roughly. And the connector pins are fixed-function, with distinct pins for the clock signal; you gotta work with that.

DisplayPort, OTOH, works on packet data with self-clocking. And since each connector and signal generator comes with four distinct "lanes," you can handle them with some flexibility. Either run the lanes separately and drive a 1080p60 on each one, or send a quarter of a single large display on each and aggregate them for, say, 1600p120.

Of course, you need proper driver support and hardware for this lane tomfoolery, and nVidia gives next to no shits about DisplayPort. Their driver updates regularly disable the DP output entirely until they accidentally turn it back on a month or two down the line. Never mind triple monitor on one card. nVidia gives no shits.

Tri-monitor works pretty well with AMD cards. You might want an active DisplayPort-to-DVI converter, since unless you've bought your monitors recently they probably don't have a DisplayPort input.

The key thing if you're going to convert is to make sure it's active, not passive. I guess it should be no surprise that AMD supports this with Eyefinity, but I didn't want people who want to run six monitors to feel they had no way of doing it (pretty sure this is doable with some AMD cards).

tijag
Aug 6, 2002
Not sure what to make of this.

http://semiaccurate.com/2012/01/19/nvidia-kepler-vs-amd-gcn-has-a-clear-winner/

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down


Well, there were basically two reasons for nVidia to be quiet. One is that they have a poor answer, one that's merely competitive, not a win. The other is that they're not ready for market yet but do have a strong answer, and it benefits them to hold their cards close to the vest so the reveal has more impact.

Still, grain of salt until more is forthcoming. I really hope nVidia can slam-dunk something, because I need CUDA more than I need vidyagames, but I'm not going to take a vague "they win" as even remotely definitive, hope against hope notwithstanding.

Hopefully the timing of the release will at least allow ATI to get lots and lots of 7970s and 7950s out into the user base and improve yields, so that if they do lose the performance crown again (continuing the usual pattern for some time now), they can afford to lower prices. Although if nVidia's hardware is good enough, I could see them having the balls to just price above the 7970. I'd be disappointed if that were the case, though.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

I may be working from a small sample size, but SemiAccurate seems like a site full of hysterical horseshit to me.

When Intel used a video to demo Ivy Bridge's DX11 graphics, SemiAccurate's response was 1000 words of ":byodood: JESUS gently caress THIS IS AN ABOMINATION! THIS IS A CRIMINAL ACT OF DECEPTION AND FRAUD BEYOND ALL KEN, JUST LIKE THAT TIME NVIDIA USED WOOD SCREWS AMIRITE?"

AnandTech's coverage? "We saw the VLC interface during the DX11 demo. Whoops. Turns out the demo was slapped together last-minute. Now, I've already seen Ivy Bridge, but for posterity I asked Intel to re-run the demo on an IVB laptop, and they did. Here's the YouTube."

I'm not inclined to believe anything that site says until I hear it somewhere else as well.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Meaningless speculation mixed in with some traffic-boosting bullshit is the way I see it.

Even if Kepler is amazing, this isn't a way to predict it.

HalloKitty fucked around with this message at 23:18 on Jan 19, 2012

tijag
Aug 6, 2002

Factory Factory posted:

I may be working from a small sample size, but SemiAccurate seems like a site full of hysterical horseshit to me.

When Intel used a video to demo Ivy Bridge's DX11 graphics, SemiAccurate's response was 1000 words of ":byodood: JESUS gently caress THIS IS AN ABOMINATION! THIS IS A CRIMINAL ACT OF DECEPTION AND FRAUD BEYOND ALL KEN, JUST LIKE THAT TIME NVIDIA USED WOOD SCREWS AMIRITE?"

AnandTech's coverage? "We saw the VLC interface during the DX11 demo. Whoops. Turns out the demo was slapped together last-minute. Now, I've already seen Ivy Bridge, but for posterity I asked Intel to re-run the demo on an IVB laptop, and they did. Here's the YouTube."

I'm not inclined to believe anything that site says until I hear it somewhere else as well.

This is mostly true. I take everything he says with a huge container of salt. He does know people at the foundry and engineering levels, though, and information that comes out pretty early from him often ends up being correct later.

Not always, but often.

Mostly, he's the most anti-Nvidia person in the world, so if this is a serious post, it's quite something for him to say 'they win handily', considering he usually finds ways to knock Nvidia products even when they're pretty good.

dad on the rag
Apr 25, 2010

Factory Factory posted:

I may be working from a small sample size, but SemiAccurate seems like a site full of hysterical horseshit to me.

When Intel used a video to demo Ivy Bridge's DX11 graphics, SemiAccurate's response was 1000 words of ":byodood: JESUS gently caress THIS IS AN ABOMINATION! THIS IS A CRIMINAL ACT OF DECEPTION AND FRAUD BEYOND ALL KEN, JUST LIKE THAT TIME NVIDIA USED WOOD SCREWS AMIRITE?"

AnandTech's coverage? "We saw the VLC interface during the DX11 demo. Whoops. Turns out the demo was slapped together last-minute. Now, I've already seen Ivy Bridge, but for posterity I asked Intel to re-run the demo on an IVB laptop, and they did. Here's the YouTube."

I'm not inclined to believe anything that site says until I hear it somewhere else as well.

Charlie Demerjian usually likes to bash Intel and Nvidia, so it's pretty surprising that he came out with something positive to say about the new Nvidia cards.

Longinus00
Dec 29, 2005
Ur-Quan
When's the last time ATI/AMD was "on top" anyway? The only thing that would be surprising is if Nvidia was able to price-match ATI/AMD's top card.

Star War Sex Parrot
Oct 2, 2003

Longinus00 posted:

When's the last time ATI/AMD was "on top" anyway? The only thing that would be surprising is if Nvidia was able to price-match ATI/AMD's top card.
Define "on top". Do you mean having the single fastest GPU, the best overall architecture, or the best bang/buck?

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Longinus00 posted:

When's the last time ATI/AMD was "on top" anyway? The only thing that would be surprising is if Nvidia was able to price-match ATI/AMD's top card.

Back when nVidia bought 3DFX and made some terrible guesses that led to them putting out some crappy cards in the FX 5000 series. That was the time for ATI's 9000 series to kick rear end. nVidia unfucked themselves with their 6800 cards, and ATI's cards have since been suitably competitive but not extraordinary, aiming for price over performance. They committed to a smart strategy with the (post-X1000, or whatever the nomenclature) Radeon HD 3800/4800 cards, aiming for individually well-performing but not showstopper cards designed for scalable operation. nVidia at the same time (GTX 280/285) stuck with the "fastest single GPU" approach, with a higher price to match; and though they do keep SLI going, it lags behind ATI substantially past two cards, and their multi-monitor support is crap.

ATI firing the first shot in this generation, with the first REALLY NEW cards in a while, at $600 for some SKUs and $550 base, could be an interesting opportunity for nVidia, since they've conventionally done their best to ensure their single-GPU powerhouses are priced around $550 or so... They had the GTX 280 at $600+ when it launched, as I recall, but ATI punished them with the 4870X2, for far less money and with superior performance in games where Crossfire was well supported (more problematic then, in terms of game support, but still excellent in games that did support it well), and nVidia ended up dramatically dropping the GTX 280's price to compensate.

Who knows how this will play out; one guy with a hit-and-miss reputation, even if the claim would be unusual coming from him, is not enough info to base an assessment on... We'll see how it goes.

Zhentar
Sep 28, 2003

Brilliant Master Genius

tijag posted:

This is mostly true. I take everything he says with a huge container of salt.

Ivy Bridge stuff needs some extra salt as well. Charlie is a major Linux fanboy and doesn't take well to it not being treated as a serious desktop OS contender. Sandy Bridge graphics drivers (and previous Intel stuff as well) treated Linux as a second-class citizen, so Charlie is predisposed to seething rage when it comes to Intel graphics.

Deathreaper
Mar 27, 2010

Alereon posted:

It should work well enough, but I'd recommend STRONGLY considering whether you really want to go Crossfire, especially given how powerful the 7970 already is. Also check out this TechReport article about micro-stuttering.

Thanks for the info. I currently run crossfired 6970s, but they're not quite fast enough to play BF3 at 2560x1600 on the highest settings. I know one 7970 would be a bit slower, so that's why I was considering two of them. 2560x1600 can be a curse sometimes...
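
The curse in numbers (a rough sketch; actual scaling varies by game and settings, but GPU load tracks pixels drawn pretty closely):
code:
def megapixels(width, height):
    return width * height / 1e6

print(megapixels(1920, 1080))   # ~2.07 MP at plain 1080p
print(megapixels(2560, 1600))   # ~4.10 MP at 2560x1600
print(megapixels(2560, 1600) / megapixels(1920, 1080))  # ~1.98x the work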

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Longinus00 posted:

When's the last time ATI/AMD was "on top" anyway? The only thing that would be surprising is if Nvidia was able to price-match ATI/AMD's top card.

You'd have to define that more carefully.

Technically, ATI has been on top for single cards for quite a while.

The fastest card last generation was the Radeon 6990. (It's still the fastest until dual Tahiti hits, but I'd get a single 7970 rather than a 6990 at this point.)
Before that, it was the Radeon 5970.
I don't know how you want to deal with the other gaps in time, such as the Radeon 4870X2 vs the GeForce 9800 series: the Radeon obviously won initially, but later it had to stand against the GeForce GTX 280 and 295, until the 5xxx series hit.

Basically, AMD has been aggressive about getting pretty fast stuff out early, then doubling up to give you the fastest single card. NVIDIA has tended to come later with a monolithic design and has had trouble sticking two of them on a card.

This is nowhere more obvious than with the GeForce 590 vs the 6990. The GeForce 590 should be faster; there's no doubt GeForce 580 SLI is faster than a 6990. But it had to be clocked down significantly, evaporating a lot of the performance.

HalloKitty fucked around with this message at 11:17 on Jan 20, 2012

Longinus00
Dec 29, 2005
Ur-Quan
I was talking about a single-chip solution. Crossfire/SLI brings its own problems (driver support required for it to work in games, misc. issues in games, micro-stutter in 2x mode, etc.).

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
Viewed that way, NVIDIA has a rosier track record. This was intentional, though.




Agreed on the issues with Crossfire/SLI; more trouble than it's worth, usually.

HalloKitty fucked around with this message at 18:26 on Jan 20, 2012

movax
Aug 30, 2008

Too many posts with nvidia and AMD in this thread, not enough 3dfx and Voodoo :smug:

Star War Sex Parrot
Oct 2, 2003

What's baffling to me is that there were basically no custom 7970s at CES. AMD must have really rushed this launch if partners are so far behind. I really want to see custom cards after seeing how far my reference design goes with overclocking.

Star War Sex Parrot fucked around with this message at 18:48 on Jan 20, 2012

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

movax posted:

Too many posts with nvidia and AMD in this thread, not enough 3dfx and Voodoo :smug:

Excuse me, I'd like to call your attention to exhibit A

Agreed posted:

Back when nVidia bought 3DFX

unpronounceable
Apr 4, 2010

You mean we still have another game to go through?!
Fallen Rib
Someone posted this in the system building megathread:

Maniaman posted:

If anybody's looking for some comedy related to AMD vs Intel:

http://www.facebook.com/Newegg/posts/10150552307784168

:sigh: posted:

AMD! Runs cooler, over-clock-able, expandable, compatible, affordable, genuine cores.

The fanboys really do exist.

movax
Aug 30, 2008

Agreed posted:

Excuse me, I'd like to call your attention to exhibit A

:eng99:

Touché.

VorpalFish
Mar 22, 2007
reasonably awesometm

Longinus00 posted:

I was talking about a single-chip solution. Crossfire/SLI brings its own problems (driver support required for it to work in games, misc. issues in games, micro-stutter in 2x mode, etc.).

If you define "on top" as having the fastest single-GPU card, then the last time was... right now, with the 7970. Before that, I believe it would have been right after the 5000-series launch, when both the 5850 and the 5870 were faster than the 285.

But if you look at it in terms of who's been making cards that people should actually buy, AMD is on top and has been for a long, long time. They basically won the 4800-vs-200-series generation outright, and even into the launch of Fermi, nvidia didn't make a single card worth buying until the GTX 460 launched. If you look at the market right now, nvidia has one card worth buying - the 560 Ti - and it's essentially deadlocked with the 6950. Meanwhile, AMD dominates the low end and midrange with the 5670, 6770, 6850, and 6870, and also pretty much dominates the ultra high end with either 2x 6950s or the 7970.

AMD has been kicking rear end for a long, long time now, at least with regard to graphics.


Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

VorpalFish posted:

AMD has been kicking rear end for a long, long time now, at least with regard to graphics.

Too bad everybody knows their drivers suck! :haw:

:suicide:
