|
I think it's really going to be "Jaguar + <foo>", where foo might be instruction sets etc. that are more tailored to the logic we see in games, which would be a lot easier than "3570k + <foo>." Just a hunch.
|
# ? Jan 23, 2013 14:54 |
|
wipeout posted:I wonder what priorities made them choose the Jaguar approach. AMD falling all over themselves to do it cheaper - and having APUs available for prototype/dev kits immediately? Just wild speculation on my part.
|
# ? Jan 23, 2013 15:35 |
|
roadhead posted:AMD falling all over themselves to do it cheaper - and having APUs available for prototype/dev kits immediately? Just wild speculation on my part. I guess AMD would be a one stop shop for your CPU & GPU needs as well.
|
# ? Jan 23, 2013 16:00 |
|
At the very worst this is good for AMD and thus everyone else.
|
# ? Jan 23, 2013 16:37 |
|
Space Gopher posted:Well, for one, the current generation of consoles has a hard time running at native 1080p. The standard trick is to render the 3D scene at 720p (or sometimes even less!), scale it to 1080p, and then put 2D UI stuff over the top of that. There's plenty of room to play with GPGPU processing; neither current console supports it. And, in the rest of the system, more RAM means nicer textures, larger levels, and so forth. A faster CPU (and that GPGPU integration) means more sophisticated procedural animation and more complex gameplay. There's plenty of room for improvement. I don't doubt that there is room for improvement. I'm just saying, when was the last time visuals were the selling point of a game? They are going to have to do more to the next gen console line than stuff more HP under the hood. If the increased processing power doesn't translate to anything other than traditional gameplay with more spit and polish, I don't think the uptake is going to be all that swift.
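The fill-rate savings behind that sub-native rendering trick are easy to put numbers on. A quick back-of-the-envelope (a rough model; it ignores the cost of the upscale pass and the 2D UI composited on top):

```python
# Pixels the GPU has to shade per 3D frame at each resolution.
pixels_720p = 1280 * 720
pixels_1080p = 1920 * 1080

# Rendering at 720p and upscaling shades ~44% of the pixels of native
# 1080p; put the other way, 1080p costs 2.25x the fill rate of 720p.
ratio = pixels_1080p / pixels_720p
print(f"{pixels_720p} vs {pixels_1080p} pixels, ratio {ratio:.2f}x")
```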
|
# ? Jan 23, 2013 17:31 |
|
bull3964 posted:I don't doubt that there is room for improvement. I'm just saying, when was the last time visuals were the selling point of a game? Pretty much all the loving time on PC, and probably even consoles. Skyrim comes to mind.
|
# ? Jan 23, 2013 17:34 |
|
But when was the last time visuals really improved a game? Even with high-quality texture packs and a kick rear end PC, Skyrim still leaves a lot to be desired. The biggest thing I can see Skyrim using in particular is proper fluid dynamics, so we could have actually flowing rivers and falling waterfalls. That would be truly revolutionary. Everything we have now is about more polygons, more anti-aliasing tricks, reflections, HDR and crap like that, and with all AAA titles tied to 7-year-old consoles' requirements (not to mention standard PC hardware specs instead of top of the line), we don't see the ultra-high-poly models and lighting that we want. For God's sake, we can still barely get games where you leave footprints or get WET when you fall in water. Plants are still sprites. RAGE had an interesting idea with its megatexturing thing, but in practice it didn't work out.
|
# ? Jan 23, 2013 18:00 |
|
SocketSeven posted:But when was the last time visuals have really improved on a game? When they weren't forced to downscale the totality of their engine to work on archaic consoles with 350MB of RAM to work with between CPU and GPU? You guys are arguing against needing new consoles because we don't have any better graphics, when it's the other way around.
|
# ? Jan 23, 2013 18:04 |
|
How is Nvidia Optimus supposed to work for apps that detect GPUs? Is there a chance an app won't find the discrete GPU because it's off? I recently downloaded the trial of PhotoZoom Pro 5, which is supposed to be GPU accelerated. I set it to use the discrete GPU in the Nvidia settings as well. When I run the app, it does an optimization step, which I'm assuming is compiling OpenCL for the target GPU. But when I run the app on an image, the GPU load on the Nvidia GPU (620M) is zero. I'm wondering if it can't find the 620M for whatever reason.
|
# ? Jan 23, 2013 19:19 |
|
Skyrim is a pretty awesome example of just what fans can do graphically for a game if not bottlenecked by console hardware, let alone game developers.
|
# ? Jan 23, 2013 19:33 |
|
Jan posted:Pretty much all the loving time on PC, and probably even consoles. Skyrim comes to mind. Most commercials for games involve elaborate pre-rendered scenes of scripted events. Real gameplay and graphics hardly EVER make it into any sort of marketing materials. About the only game in 2012 where I remember graphics being any part of the marketing was Far Cry 3 touting its lush environments. quote:You guys are arguing against needing new consoles because we don't have any better graphics, when it's the other way around. It's not that so much as "Are the vast majority of mainstream gamers going to drop $500+ for a system and accessories for improved graphics?" I get that PC gamers are frustrated at the bottleneck that consoles tend to place on games and that SO much more could be done with modern hardware. I just don't know that it's a hugely marketable feature, nor something that's going to greatly increase profit for the game studios. bull3964 fucked around with this message at 19:43 on Jan 23, 2013 |
# ? Jan 23, 2013 19:36 |
|
Jan posted:When they weren't forced to downscale the totality of their engine to work on archaic consoles with 350MB of RAM to work with between CPU and GPU? Yeah, that's pretty much it. High-end game development is so expensive now that the only people who have reason to sink that kind of cash into single-platform development, rather than maximize their target audience, are console manufacturers looking to grow market share through exclusives. And multiplatform development means an end result defined by the weakest link. The upshot is a market where technological advance has been hampered by a console generation lasting so long, and by developers having to trim things back to fit - sometimes unsuccessfully, as seen in the Skyrim DLC.
|
# ? Jan 23, 2013 19:38 |
|
Shaocaholica posted:How is nvidia optimus supposed to work for apps that detect GPUs? Is there a chance an app won't find the discrete GPU because it's off? OK, so maybe this is Optimus trying to be too smart. I have my app set up to use Optimus, but when I start the app the Optimus systray notifier says there are no apps using the discrete GPU. Do apps need to register themselves somehow in order for Optimus to assign the discrete GPU to them? Maybe the app isn't coded for Optimus compatibility?
|
# ? Jan 23, 2013 20:54 |
|
I'm terribad at Optimus because I have literally never used laptop switchable graphics except once, for a week, with an Asus laptop I returned because they weren't working... BUT... I believe that Optimus keeps track of which GPU to use via the driver itself. If you right-click on an icon, it should present the option to run on a particular GPU. Absent that, the Nvidia control panel will let you add the executable to the list of "always run with dGPU" programs.
|
# ? Jan 23, 2013 21:26 |
|
Factory Factory posted:I'm terribad at Optimus because I have literally never used laptop switching graphics except once for a week with an Asus laptop I returned because they weren't working... BUT... It sounds like he has done that, but I think the issue is a flaw with Optimus. I know switching does not yet trigger on OpenGL and OpenCL calls, and since the program does not do any rendering, I don't think forcing it will work; Optimus is trying to be smart and save battery life. I think the Intel HD 4000's OpenCL performance is close enough to, or better than, the 620M's that it's better to run it on the Intel HD 4000.
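For what it's worth, the cleanest fix for this class of problem is a manual device picker: the app enumerates every OpenCL device itself and lets the user choose, instead of trusting the driver to route the work. A minimal sketch of the selection logic; the `pick_discrete` helper and the device-name strings are hypothetical (a real app would get the list from clGetDeviceIDs or pyopencl rather than hard-coding it):

```python
def pick_discrete(device_names):
    """Prefer a discrete NVIDIA/AMD device over an integrated Intel one.

    device_names: list of strings as an OpenCL enumeration might report
    them. Returns the preferred name, or None if the list is empty.
    """
    discrete_markers = ("NVIDIA", "GeForce", "AMD", "Radeon")
    for name in device_names:
        if any(marker in name for marker in discrete_markers):
            return name
    # No discrete device found: fall back to whatever is first.
    return device_names[0] if device_names else None

# Hypothetical names; an Optimus laptop may report both devices even
# while the discrete GPU is powered down.
devices = ["Intel(R) HD Graphics 4000", "GeForce GT 620M"]
print(pick_discrete(devices))  # -> GeForce GT 620M
```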
|
# ? Jan 23, 2013 22:00 |
|
OpenCL on a 620M would be similar to the HD 4000? What about the dedicated memory of the 620M (1 GB)? Anyway, the company rep says they will have a new build of their app soon with a manual picker for which GPU to use and compile OpenCL against. Edit: Seems like the 620M should be considerably faster than the HD 4000 (3517U) http://clbenchmark.com/compare.jsp?config_0=13283286&config_1=13671906 Shaocaholica fucked around with this message at 00:26 on Jan 24, 2013 |
# ? Jan 24, 2013 00:19 |
|
Shaocaholica posted:OpenCL on a 620M would be similar to HD4000? What about the dedicated memory of the 620M(1G)? That's a comparison against just the CPU, but it being an ultra-low-voltage version could put the 620M ahead. Here is the 620M vs the HD 4000; I'm not sure if it's the ULV version or not. http://clbenchmark.com/compare.jsp?config_0=13283286&config_1=11977159 Right now dedicated DDR3 memory does not make as much of a difference anymore, with the large L3 cache the HD 4000 has access to, and the main memory being fairly fast DDR3 as well.
|
# ? Jan 24, 2013 00:39 |
|
Oh interesting, thanks. Still, I think the ULV part may be a big factor like you said. I'll test both manually and report. Edit: Well, I ran my own tests using clbenchmark and the results are mixed. The 620M is way better in one test while the ULV HD 4000 is way better in another. The rest are mixed but very close. Very interesting, considering that the 620M will smoke the HD 4000 in anything that's a game. Shaocaholica fucked around with this message at 02:33 on Jan 24, 2013 |
# ? Jan 24, 2013 00:42 |
|
Walp guys, the "leaked" info was fake: http://x-surface.tumblr.com/post/41282771026/x-surface-dont-believe-everything-you-read
|
# ? Jan 24, 2013 14:24 |
|
Wozbo posted:Walp guys, the "leaked" info was fake: That reminds me that I haven't been watching the news on next gen consoles for a while now. Are there any leaks that are confirmed or any actual statements? Are we going to finally get 8GB+ of main memory and force all game devs to go 64bit?
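The 64-bit part isn't really optional once a console crosses 4GB: a 32-bit pointer can address at most 2^32 bytes, so an 8GB machine can't even be fully mapped by a 32-bit game binary. The arithmetic:

```python
# A 32-bit pointer covers 2^32 bytes = 4 GiB of address space.
addressable_32bit = 2 ** 32
print(addressable_32bit // 1024 ** 3)  # 4 (GiB)

# 8 GB of unified console memory is double what a 32-bit process can
# map, so games on such a console effectively have to be 64-bit.
console_ram = 8 * 1024 ** 3
print(console_ram > addressable_32bit)  # True
```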
|
# ? Jan 24, 2013 19:16 |
|
Shaocaholica posted:That reminds me that I haven't been watching the news on next gen consoles for a while now. Are there any leaks that are confirmed or any actual statements? Are we going to finally get 8GB+ of main memory and force all game devs to go 64bit? The closest thing to 'factual' information we've had was the leaked design document from 2010 back in June, and the Durango dev kit that got onto eBay a couple months later that had specs "similar to a high-end PC". That said, it seems we're at the point that next-gen specs can be safely speculated on given currently available PC hardware specs and what's reasonable to expect in a ~$400 console, unless Sony pulls another Sony and has some overengineered Cell processor-type technology waiting in the wings. One thing to note about Wozbo's post - while I'm in no way endorsing the journalistic chops of most gaming sites, as far as the supposed specs of the next gen Xbox go, the faked rumor substantiated the majority of the specs, only embellishing the CPU's clock speed. The software and X-Surface parts were obviously complete bullshit on the hoaxer's part, but you can't completely discredit the (mostly reasonable) specs just because some guy made up a rumor about an Xbox tablet. Though, again, given the state of the majority of games 'journalism', it's pretty advisable to take all of this with a grain of salt.
|
# ? Jan 25, 2013 02:43 |
|
Finally picked up a 7870 Myst (Tahiti LE), coming from a GTX 560.
Upsides:
+ The performance is a straight upgrade over the 7870, I'd say about 10% at stock, and it can meet the 7950 Boost with a little overclocking.
+ Got it for $235, the same price as the 7870s, so an excellent value. Plus it came with FC3 and Sleeping Dogs.
+ Good variety of video outputs: DVI, HDMI and 2 DisplayPort. Also came with a DisplayPort->HDMI adapter.
+ It can be Crossfired with 7950s. I don't plan on doing that because of the power usage, but I guess it does leave an option open for the future.
Downsides:
- Card is HUGE. Probably the biggest card I've ever had. Had to re-arrange my drives/PSU in my case to get it to fit.
- Uses more power. I don't really care about that since my PSU can handle it.
- The big one: loving fan is loud. It does its job well - I don't even break 70C running Furmark. Making a custom fan profile in Afterburner fixes this issue somewhat, though.
- Voltage locked, apparently, but I've also heard it's just an issue of software not recognizing the card, so who knows.
Overall I'm happy with it. The regular 7870s make more sense, though, if you want something quieter/smaller and fairly close in performance. Alpha Mayo fucked around with this message at 09:07 on Jan 27, 2013 |
# ? Jan 27, 2013 09:05 |
Where are the GTX780s? Where are they?
|
|
# ? Jan 27, 2013 09:27 |
|
teh z0rg posted:Where are the GTX780s? Where are they?
|
# ? Jan 27, 2013 15:14 |
|
I am playing around with Adaptive VSync on my 670 (1440p). How do you guys go about it? I am thinking of forcing Adaptive in the Nvidia CP and choosing VSync off in the games, so as not to cause any type of conflict.
|
# ? Jan 28, 2013 03:27 |
|
Speaking of Vsync, is there any way to enable it without also getting the horrendous mouse acceleration?
|
# ? Jan 28, 2013 04:23 |
|
Endymion FRS MK1 posted:Speaking of Vsync, is there any way to enable it without also getting the horrendous mouse acceleration? That depends on the way a game engine handles its updates. If they do it intelligently, then VSync won't affect mouse input. If they do it badly, you're SOL and nothing you choose will do anything about it.
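A sketch of what "handling updates intelligently" can mean in practice: the simulation (and input polling) runs on its own fixed timestep, decoupled from how fast frames are presented, so VSync capping the render rate doesn't change how often the mouse is read. This is a toy loop illustrating that idea, not any particular engine's code:

```python
def run(sim_hz, render_hz, seconds):
    """Fixed-timestep loop: input/simulation ticks at sim_hz no matter
    how fast frames are presented at render_hz."""
    dt = 1.0 / sim_hz
    frame_time = 1.0 / render_hz
    clock = accumulator = 0.0
    input_samples = frames = 0
    while clock < seconds:
        clock += frame_time           # one presented frame elapses
        accumulator += frame_time
        while accumulator >= dt:      # catch the simulation up
            input_samples += 1        # poll the mouse here, once per tick
            accumulator -= dt
        frames += 1
    return input_samples, frames

# Whether VSync holds the renderer at 60 fps or 30 fps, input is still
# sampled ~120 times per simulated second.
print(run(120, 60, 1.0))
print(run(120, 30, 1.0))
```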
|
# ? Jan 28, 2013 04:38 |
|
Endymion FRS MK1 posted:Speaking of Vsync, is there any way to enable it without also getting the horrendous mouse acceleration? Yes, change the Maximum pre-rendered frames setting in the nvidia control panel from 3 to 1 or 0. I just have mine on 'Use the 3D application setting' with Adaptive vSync and I've not noticed an issue. I used to run D3DOverrider to force triple buffering and always had that setting on 1 to stop mouse lag.
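The reason that setting helps: each frame sitting in the pre-render queue adds roughly one frame time of delay between reading your mouse and displaying the result. A rough model (real pipelines add other latency on top of this):

```python
def queue_latency_ms(prerendered_frames, refresh_hz):
    """Extra input-to-display latency contributed by the pre-render queue."""
    frame_time_ms = 1000.0 / refresh_hz
    return prerendered_frames * frame_time_ms

# Driver default of 3 queued frames vs. the recommended 1, at 60 Hz:
print(round(queue_latency_ms(3, 60), 1))  # ~50 ms extra
print(round(queue_latency_ms(1, 60), 1))  # ~16.7 ms extra
```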
|
# ? Jan 28, 2013 11:31 |
|
bull3964 posted:I don't doubt that there is room for improvement. I'm just saying, when was the last time visuals were the selling point of a game? They are going to have to do more to the next gen console line than stuff more HP under the hood. If the increased processing power doesn't translate to anything other than traditional gameplay with more spit and polish, I don't think the uptake is going to be all that swift. When I've played games over the past few years, I don't ask myself "could the graphics be better in this game?" but rather things like "Geez, how goddamn tedious SWTOR was", "does D3 have to run like poo poo on a 2500K + HD5850 despite how mediocre the visuals are?" or "why the gently caress are there so many stupid intro screens?" Palladium fucked around with this message at 14:58 on Jan 28, 2013 |
# ? Jan 28, 2013 14:56 |
|
What's with the delay of the next gen ATI/Nvidia cards? Shouldn't they be out by now or should we have some kind of update?
|
# ? Jan 28, 2013 15:58 |
|
Tab8715 posted:What's with the delay of the next gen ATI/Nvidia cards? Shouldn't they be out by now or should we have some kind of update? The 680 didn't come out till March 2012
|
# ? Jan 28, 2013 16:14 |
|
Don Lapre posted:The 680 didn't come out till March 2012 I want to replace my 680, but not with a $900 card that is basically a 690, if the "titan" stuff is true. I thought we moved past the ludicrous "ultra" line, Nvidia?
|
# ? Jan 28, 2013 16:24 |
|
uhhhhahhhhohahhh posted:Yes, change the Maximum pre-rendered frames setting in the nvidia control panel from 3 to 1 or 0. I just have mine on 'Use the 3D application setting' with Adaptive vSync and I've not noticed an issue. Forgot to mention, I have a 7950.
|
# ? Jan 28, 2013 16:55 |
|
zer0spunk posted:I want to replace my 680, but not with a $900 card that is basically a 690 if the "titan" stuff is true. I thought we moved past the ludicrous "ultra" line Nvidia? They aren't going to replace the architecture every year.
|
# ? Jan 28, 2013 17:01 |
|
Endymion FRS MK1 posted:Forgot to mention, I have a 7950. RadeonPro has some special VSync settings, namely an equivalent to Adaptive VSync. Give it a try.
|
# ? Jan 28, 2013 17:09 |
|
zer0spunk posted:I want to replace my 680, but not with a $900 card that is basically a 690 if the "titan" stuff is true. I thought we moved past the ludicrous "ultra" line Nvidia? May I ask why? I have a 680 and a 2560x1600 monitor and I don't have any problems.
|
# ? Jan 28, 2013 18:32 |
|
Don Lapre posted:They arnt going to replace the architecture every year. GeForce GTX 480: March 26, 2010 (GF100). GeForce GTX 580: November 9, 2010 (GF110). GeForce GTX 680: March 22, 2012 (GK104). The Titan thing falls in line with the 480/580 release schedule. I'm hoping whatever the real 8xxx line is from AMD gets them to fast-track, but who knows. mayodreams posted:May I ask why? I have a 680 and a 2560x1600 monitor and I don't have any problems. For that reason: I'm on a 2560x1600 panel as well and it gets taxed in certain situations, but nothing worth going to SLI for. Hence I'm hoping the next-gen single-card solution is the last card I'll need to buy for a few years. Plus, smaller fab, less power consumption, and everything that goes with it. Edit: and hopefully less gimped CUDA-wise, because the Tesla stuff is out of my range. zer0spunk fucked around with this message at 18:46 on Jan 28, 2013 |
# ? Jan 28, 2013 18:34 |
|
mayodreams posted:May I ask why? I have a 680 and a 2560x1600 monitor and I don't have any problems. The current video cards make my room way too goddamn warm.
|
# ? Jan 28, 2013 18:37 |
|
zer0spunk posted:
Well, from what I've read, Big Kepler isn't that great for gaming. Yes, it has more CUDA cores (at least in Quadro/Tesla dressing), but that does not equate to gaming performance. I doubt the clocks will be comparable with the GK104's, and it certainly won't be cooler if it has more transistors, which I'm sure it will. GK104 (GTX 680): clocks in MHz -> (1006 base, 1058 avg boost, 1110 max), 195W TDP. GK110 (K20): clock is 745MHz, 225W TDP. Edit: This is also the first time Nvidia has not started a new arch with the big monolithic die, so you can't compare the other releases to this one, because the 680 was the reduction. mayodreams fucked around with this message at 19:10 on Jan 28, 2013 |
# ? Jan 28, 2013 18:52 |
|
Tab8715 posted:The current video cards make my room way too goddamn warm. My two 260s used to keep my room warm in the winter (unbearable in the summer). Sometimes I wonder if my 670 is even on.
|
# ? Jan 29, 2013 09:50 |