Berk Berkly
Apr 9, 2009

by zen death robot
Hardcore engines like Unreal 4 are a driving factor behind hardware like Kepler and its future iterations. Just look at it.

http://www.wired.com/gamelife/2012/05/ff_unreal4/all/1?pid=2552

Berk Berkly
Apr 9, 2009

by zen death robot

HalloKitty posted:

What the gently caress is even the point of releasing this mess? To rip people off when they could be using Intel HD graphics for free?

Yes?

This is a marketing play. Marketing isn't about telling consumers what their most prudent or efficient choice is. It's about offering as many temptations as possible to fish money from them, if not outright trying to persuade them in the absence of, or in the face of, more economically sound wisdom.

Berk Berkly
Apr 9, 2009

by zen death robot

Star War Sex Parrot posted:

Unreal Engine 4 stuff from E3. That lighting model is insane. The tools are drastically improved too. I'm glad Epic is trying to address the "more detail = more development costs" with better tools.

I remembered our little aside about Unreal 4 being the software that is going to leverage our GPU hardware of the future and was coming here to post that video since we only had a few preview images to go by last time.

The tools they updated it with are amazing:

Tool Highlights posted:


Make updates directly in game without ever pausing gameplay with Hot Reload. This tool allows you to quickly find and edit C++ code and see those changes reflected immediately in game.

After an update is made, Instant Game Preview gives you the power to spawn a player and play anywhere in game without needing to wait for files to save.

The all-new Code View saves you time by allowing you to browse C++ functions directly on game characters then jump straight to source code lines in Visual Studio to make changes.

Live Kismet Debugging enables you to interactively visualize the flow of gameplay code while testing your game.

Now you can quickly debug and update gameplay behaviors when they happen through the new Simulate Mode. This tool lets you run game logic in the editor viewport and inspect AI as the game characters perform actions.

View your game in full-screen within the editing environment with the Immersive View tool. This allows programmers to complete iterations on gameplay changes without added UI clutter or distractions.

Possess/Eject features allow you, at any time while playing in the editor, to easily “eject” from the player and take control of the camera to inspect specific in-game objects that may not be behaving properly.

Berk Berkly
Apr 9, 2009

by zen death robot

Factory Factory posted:

We need a 600 pixel wide :circlefap: for that. I love that indirect lighting.

Well, you could cram that many pixels into a 2.3-inch LCD thanks to the Japanese:

http://www.tomshardware.com/news/Japan-Display-Inc-Retina-Display-651ppi-pixels-per-inch,15913.html


Speaking of which, would it be too much to ask for 300ppi+ monitors instead of the ever-widening displays we have? I mean, it's eventually going to be a complete waste of hardware to run fewer than two monitors if pixel density doesn't go up, even for entry-level GPUs.
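Just to put numbers on that, here's a quick back-of-the-envelope sketch of what a 300ppi desktop panel would actually cost in pixels (the 27-inch 16:9 size is my own assumption, not any real product):

```python
import math

# Hypothetical 300 ppi desktop monitor, assuming a 27-inch 16:9 panel.
diagonal_in = 27.0
aspect_w, aspect_h = 16, 9
ppi = 300

# Physical width/height derived from the diagonal and aspect ratio.
diag_units = math.hypot(aspect_w, aspect_h)
width_in = diagonal_in * aspect_w / diag_units
height_in = diagonal_in * aspect_h / diag_units

width_px = round(width_in * ppi)    # ~7060
height_px = round(height_in * ppi)  # ~3970
print(f"{width_px} x {height_px} = {width_px * height_px / 1e6:.1f} MP")
# vs. 1920 x 1080 = ~2.1 MP, so roughly 13x the pixels to push
```

That's the scale of the jump a GPU would have to absorb just to keep one such monitor fed.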

Berk Berkly
Apr 9, 2009

by zen death robot

Aquila posted:

Same deal here. Though I'm thinking of what kind of graphics setup I'd need to drive 3x 2560x1440 monitors.

I'm sure a pair of GTX 690s in SLI could pump out that kind of pixel volume:

https://www.youtube.com/watch?v=XQAECBuDICg
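For a rough sense of scale (my own arithmetic, not from the video, and assuming a 60 Hz target):

```python
# Back-of-the-envelope pixel volume for a 3x 2560x1440 surround setup,
# assuming a 60 Hz refresh target.
surround_px = 3 * 2560 * 1440      # ~11.1 million pixels
single_1080p = 1920 * 1080         # ~2.1 million pixels

print(surround_px / single_1080p)                # ~5.3x one 1080p screen
print(surround_px * 60 / 1e6, "Mpx per second")  # ~664 Mpx/s at 60 Hz
```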



Also, just a terribly niggly thing that has been bugging me about the OP: the tessellation subsection header is misspelled with one 'l'.

Berk Berkly
Apr 9, 2009

by zen death robot

Glen Goobersmooches posted:

I'm getting a brand spankin' new 670 tomorrow, and I've been reading some scary things about Nvidia's last few rounds of driver releases. Is there a general consensus on the least wretched beta version at the moment?

It's okay as far as beta releases go:

http://www.anandtech.com/show/6069/nvidia-posts-geforce-30479-beta-drivers
http://techreport.com/discussions.x/23210

Berk Berkly
Apr 9, 2009

by zen death robot

Boten Anna posted:

Even though I'm a huge nvidia fan, it sucks there's really only two competitors and I'd hate to see AMD get gutted or fold. As such I also hope this generation isn't going to sink them.

Is it really that bad? I know AMD pretty much dropped the ball hard on the CPU business side with Bulldozer, but I thought they were keeping up pretty well with Nvidia, at least as far as performance/price was concerned. Are they in peril of simply dropping the ball again with the next gen as Nvidia advances Kepler and Kepler++?

Berk Berkly
Apr 9, 2009

by zen death robot

dog nougat posted:

Ok, that's kinda what I suspected. The allure of shiny new things can be a bit much at times. I can't reasonably justify a new PSU and another video card. The 670, while alluring, is still pretty pricey for such a marginal upgrade. Guess I'll just wait until the next generation of cards comes around, which I'm guessing is still quite a ways off. What was the length of time between the 5-series release and the 6 series?

Just as an option, you could potentially eBay your 570 and put the proceeds toward a 660 Ti if the noise and wattage are serious issues. Depending on what you can get for your specific model, the difference you make up is returned in quieter, cooler, more power-efficient, current-gen tech.

As for the timeline gap: the 500 series popped onto the market around the beginning of November 2010, and the 600 series in March 2012, though cards were hard to find at first.

Berk Berkly fucked around with this message at 22:37 on Aug 26, 2012

Berk Berkly
Apr 9, 2009

by zen death robot
It would be really nice if there were more quality competition around the $100 and $150 price points. Right now you have choices from AMD in the 7750, 7770, and 6850 cards, but all you really have from Nvidia are some of the lower-end GTX 560 cards at best.

Is it too much to ask Nvidia to chip down the 660 Ti into a 650 Ti and compete at the $200 mark to try and drive the Fermi-gen prices down? Who wouldn't like to see some of the higher-quality 560 Tis down below the $150 mark to help squeeze into those budget gaming boxes?

Berk Berkly fucked around with this message at 23:22 on Aug 26, 2012

Berk Berkly
Apr 9, 2009

by zen death robot

Factory Factory posted:

The GK104 in the 660 Ti is the same chip as in the GeForce 680, though. It'd have to be cut down a LOT to make a meaningful 650 Ti-type card, further than even the OEM 660. I'm not sure TSMC's 28nm process is quite so bad as to actually have a supply of chips that wrecked, which means providing such a card would require cutting down a chip that might have worked just fine in a more expensive SKU (which are selling just fine, thanks). In the Fermi generation, the 580 and 570 were one chip (later joined by the 560 Ti-448), the 560 Ti, 560, and 560 SE (plus a refreshed 460) were another chip, and the 550 Ti was a third chip entirely. There is indeed a gap in the Kepler generation.

Argh, so I guess it IS too much to ask, and the 660 OEMs are the bottom of the bin for the GK104s.

Do we actually believe the GK107s are going to be competitive with/trade blows with the GTX 560? I could see one popping up at $150 and getting niched.

quote:

Retailers and partners that sell directly?

Well, when I rhetorically asked "Who?" it was a 'royal who' in that I meant us the consumers/end-users. :v: Of course the people making bank on us don't like it when their gravy is watered down.

Berk Berkly
Apr 9, 2009

by zen death robot
GeForce 306.02 BETA Drivers are out:

http://www.geforce.com/drivers

Edit: Fixed so it wouldn't point you directly to the XP drivers. Thanks for the heads-up.

Berk Berkly fucked around with this message at 07:47 on Aug 28, 2012

Berk Berkly
Apr 9, 2009

by zen death robot

The Lord Bude posted:

Your pricing is reasonable if I were interested in buying midrange, value-for-money stuff the way everyone on SA is obsessed with. I'm not. Value for money is irrelevant to me. I simply don't mind spending twice as much for a 10% performance improvement. When it comes to buying things, I have a much simpler philosophy: I identify the best-performing products, then I buy them. If I can't afford to do that, I hold off my purchase until I can.

This is an especially terrible philosophy for buying tech. If you can't afford things now, it's at least partially because you've already spent twice as much for that 10% performance improvement multiple times before.

If you had nigh-unlimited funds your purchasing strategy might make sense.

There is a good reason why goons put such an emphasis on price/performance, even as high-end gamers. We know our poo poo is eventually going to be outdated, outperformed, and obsolete, and paying a premium now barely slows that inevitability down.

quote:

Speaking of which, any word on when AMD is going to return fire with the 8xxx series? I remember reading that they wanted to get them out by the end of the year.

Optimistically in time for Christmas shopping, but more likely in the first quarter of 2013, to take advantage of the window before Nvidia can bring a 700-series Kepler refresh online.

Berk Berkly fucked around with this message at 08:24 on Aug 28, 2012

Berk Berkly
Apr 9, 2009

by zen death robot

LittleBob posted:

Speaking of Borderlands 2 - am I likely to be able to squeeze PhysX out of a 680 running a 1440p display?

Being one of those annoying 'must play on ultra' freaks, Sleeping Dogs has me tempted to go 680 SLI, although I'm not sure if that's simply a driver issue, haven't had time to check yet.

It's likely a driver disconnect as far as Sleeping Dogs and Nvidia card performance goes right now. From what I recall, they haven't had a chance to optimize for it yet, just add SLI support.

Berk Berkly
Apr 9, 2009

by zen death robot

Fanelien posted:

Continuing the PhysX discussion in regards to Borderlands 2: I have a pair of 570s in SLI at the moment. Is this likely to handle high settings with GPU PhysX at 1080p? Or would I be looking at another card to slot in for PhysX? If so, what should I be looking at, price/performance-wise, as a dedicated PhysX processor? I was considering a used 460 or similar.

Barring issues with SLI in general, that is still a rather beefy setup, and you likely won't see any painfully noticeable improvements from any single-card solution barring the 690. A pair of 670s in SLI would do it too, of course.

Berk Berkly
Apr 9, 2009

by zen death robot

The Lord Bude posted:

For those who are interested, the first reviews are up for the 7990. As far as I know, AMD hasn't made a reference design for the 7990, but rather given free rein to the various partners to slap together a couple of 7970s however they like and call it a 7990. PowerColor is the first to release something:

http://www.hardwareheaven.com/reviews/1561/pg1/powercolor-devil-13-hd-7990-graphics-card-review-introduction.html

http://www.techpowerup.com/reviews/Powercolor/HD_7990_Devil_13/

Eeh. Not looking too swell out of the gate:

Techpowerup posted:

All data in this review was obtained after I repaired a major design error of the card.
...
The card was unusable out of the box.

So for an Epeen card, their flaccid opener is disappointing.

Berk Berkly fucked around with this message at 20:51 on Sep 1, 2012

Berk Berkly
Apr 9, 2009

by zen death robot
Part of that is thanks to console considerations holding back a lot of game developers, as opposed to going hog wild like the original Crysis did. The last big step-up in general was the DX9-to-DX11.1 transition, but plenty of games have stayed anchored to DX9 compatibility. Bundled features like PhysX are seen more as neat eye candy than as "got to play it for that feature" tech.

Berk Berkly fucked around with this message at 21:19 on Sep 1, 2012

Berk Berkly
Apr 9, 2009

by zen death robot

Fanelien posted:

Did some testing with SLI/Physx gpu/cpu with Batman Arkham City:

I just realised I have an 8800 Ultra 768 MB sitting in a box around here somewhere. As I understand it, the 8-series cards are the first generation capable of PhysX. Might try it with that when I next clean the dust out of my rig.

From what I know, the 8800 Ultra is going to bottleneck your other two cards badly, even just as a dedicated PhysX slave. They will be waiting on its slow rear end to finish the calculations before they can keep on with their own jobs.

Berk Berkly
Apr 9, 2009

by zen death robot

Rigged Death Trap posted:

:catstare:
Give me one NOW.

Avatar/Post combo win.

Haswell is due out around June/July next year? That almost feels too good to be true. At that point I'm curious whether we will even need cards like the 7750 or even the 7770 when you can get very decent-quality graphics without a discrete card at native 1080p.

quote:

Me and another guy have very similar setups: P8Z77 mobos, I5 3570k's and 8 gigs of ram. The main difference is he is running an MSI 660ti and I'm running the EVGA 660ti. We both ran Heaven 3.0 to get a comparison.

Whatever the difference is, I like the results of your setup better. A minimum of 30 FPS means you should hold up much better during the harshest/most demanding points of gameplay. Anything over 60 FPS starts to have greatly diminishing returns in terms of visual experience, so a difference of tens of frames up in the hundreds is trivial.
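To put a number on the diminishing returns, here's a trivial frame-time sketch (plain arithmetic, not taken from the Heaven runs themselves):

```python
# Frame time shrinks hyperbolically with FPS, so each step up past
# 60 FPS buys fewer milliseconds per frame.
for fps in (30, 60, 120, 240):
    print(f"{fps:>3} FPS -> {1000 / fps:5.1f} ms per frame")
# 30 -> 33.3 ms, 60 -> 16.7 ms, 120 -> 8.3 ms, 240 -> 4.2 ms
```

Going from 30 to 60 FPS cuts 16.7 ms off every frame; going from 120 to 240 only saves about 4 ms.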

Berk Berkly fucked around with this message at 14:31 on Sep 12, 2012

Berk Berkly
Apr 9, 2009

by zen death robot

Rosoboronexport posted:

So, the non-OEM GTX 660 has been benched by TweakTown. It's a new chip, GK106, instead of a binned GK104, so there is a notable difference from the OEM version. The OEM has ~10% more shaders but a lower clock speed. The memory configuration in 2 GB models is the same as the 660 Ti, which means lower bandwidth to the final 512 MB of RAM.
Performance-wise it's 20% slower than the 660 Ti and trades blows with the 7870 and GTX 580. I wonder if they're going to offer it in a 1.5 GB configuration and whether the card will get by with one PCIe power connector. Depending on the price, that's the card I'm eyeing.

I kind of doubt it. Card makers always want to make bigger margins by splashing on extra memory and charging more.

That is why we have plenty of 3GB GTX660TIs but no svelte 1.5GB models.
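For anyone wondering why that last 512 MB is slower: as I understand it, the 660 Ti pairs a 192-bit bus (three 64-bit memory controllers) with 2 GB, which can't be striped evenly across three channels. A rough sketch of the split (my own illustration, not vendor documentation):

```python
# Rough sketch of the GTX 660 Ti 2 GB asymmetric memory layout.
# A 192-bit bus is three 64-bit controllers; 2 GB doesn't divide
# evenly across three channels.
total_bw_gb_s = 144.2               # spec bandwidth at 6 Gbps effective
per_controller = total_bw_gb_s / 3  # ~48 GB/s per controller

interleaved_mb = 1536  # striped across all three controllers -> full bandwidth
tail_mb = 512          # hangs off a single controller

print(f"first {interleaved_mb} MB: ~{per_controller * 3:.0f} GB/s")
print(f"last  {tail_mb} MB: ~{per_controller:.0f} GB/s (one controller)")
```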

Berk Berkly fucked around with this message at 11:45 on Sep 13, 2012

Berk Berkly
Apr 9, 2009

by zen death robot

ScienceAndMusic posted:

So if I wanted to get a new nVidia card, is there a recommended card that everyone agrees is the best price to performance sweet spot?

As of right now? No higher than a GTX 670 and no lower than the GTX 660, with the only card in between being the 660 Ti. The 680 is far too close to the 670 in performance to justify the extra $100 it costs, and the 690 is just a doubled-up 680. It's a matter of how much you want to spend.

Just don't pay a premium for GTX 660/660 Tis with more than 2 GB of VRAM.

Berk Berkly fucked around with this message at 20:21 on Sep 18, 2012
