|
Sedisp posted:Oh he's absolutely not going to kill it anytime soon, but rehabilitation is also extreme. I'd bet he's going to wait and see, and if things don't turn around without plunging mountains of cash into it, start winding it down. I think they're going to rehab. It needs a facelift if it's going to appeal to a wider audience because, like you said, relying on technically inclined DINKs who happen to be in a place where it's time to make an expensive upgrade to the entertainment center is not a reasonable strategy.
|
# ? May 5, 2014 20:29 |
|
|
CAPT. Rainbowbeard posted:Isn't the ESRAM thing basically just the Cell thing from last gen? As in, a hardware quirk that PC devs took a while to figure out if they even did at all? To my understanding, it is something people can figure out and get better at. But the real reason they won't be considered similar in potential is that the ESRAM came at a large cost. The choice between a standard processor and a Cell was two ways to accomplish similar goals. It was odd and alien and they didn't have nice tools for it, so it was a liability for a while. But it didn't come at the cost of graphical power or anything, it was just different. They were also rigid about their memory, which was kind of an additional hurdle. But the ESRAM isn't just a solution to memory speeds and bandwidth. If it was just a chip off the side that helped boost DDR3 to achieve greater things, it would perhaps be a similar "different solution, equivalent results". But because of the bandwidth it needs to hit, it takes up a lot of space on the die, so it cost them graphical power purely for real estate reasons. So really it's a very expensive way to make DDR3 competitive with the PS4, and when it turned out they didn't even benefit from an advantage in total amount of RAM they were doubly screwed. Same amount of memory, only slower, and less graphical power in the tradeoff. And more expensive as well. If the devs totally get on the ball with ESRAM and find tricks to optimize it for 1080p, they're still dealing with a substantially smaller GPU. Edit: Xbone PS4 Ape Agitator fucked around with this message at 21:07 on May 5, 2014 |
# ? May 5, 2014 21:05 |
|
CAPT. Rainbowbeard posted:Isn't the ESRAM thing basically just the Cell thing from last gen? As in, a hardware quirk that PC devs took a while to figure out if they even did at all? You're basically saying "This 4.5 litre V8 is basically just the same thing as this 50cc moped engine; eventually my scooter will be as fast as a Ferrari, the tuning shop just needs to work out the kinks!" Regardless of anyone's personal feelings about the Xbone, it objectively has no feature that's going to bring it to parity with the PS4 in terms of raw power.
|
# ? May 5, 2014 21:39 |
|
Sedisp posted:You really overestimate how much Microsoft likes spending money to try and maintain a division that has been hemorrhaging money. During the 360's reign they had people talking about axing the division because of how much money they had to sink into it to make it profitable. Reminder: Either I'm misinterpreting this graph, or it shows that the 360 was not remotely "hemorrhaging money." Not that $3bn of profit over 5 years is a ton of money to MS either, since Office probably makes that in a week, but on the other hand the total loss of $3bn over 10 years is almost as inconsequential.
|
# ? May 5, 2014 22:42 |
|
Saladman posted:Reminder: If I recall correctly, during the 360's time Microsoft had a fuckton of profit from licensing to Android that they reported under the same division as the 360 (an overall devices division?), which is commonly accepted to have been done to mask huge Xbox losses. I am not a financial analyst though, so I could be mistaken. Edit: this guy is http://www.gamesindustry.biz/articles/2013-11-07-huge-xbox-losses-hidden-by-patent-royalties-says-analyst Basically the Xbox division loses 2 billion a year and it's offset by 2 billion in Android royalties. History Comes Inside! fucked around with this message at 22:58 on May 5, 2014 |
# ? May 5, 2014 22:47 |
|
Doesn't this guy run the Xbox part? Why can't he say, yeah, we are gonna work on this? https://twitter.com/XboxP3/status/463137600455192576
|
# ? May 5, 2014 23:27 |
|
Don Lapre posted:Doesn't this guy run the Xbox part? Why can't he say, yeah, we are gonna work on this? Difficult to renegotiate at this point, I'm guessing? Plus it lets him take his sweet time to do it rather than making it a hard deadline to get moving. It's corporate-speak for "we think this is a good idea but we're still in feasibility testing".
|
# ? May 5, 2014 23:28 |
|
Barudak posted:Difficult to renegotiate at this point I'm guessing? Plus it lets him take his sweet time to do it rather than making it a hard deadline to get moving. Seems it would be better to just not reply. It makes him sound like he doesn't actually have any power.
|
# ? May 5, 2014 23:30 |
|
Don Lapre posted:Seems it would be better to just not reply. It makes him sound like he doesn't actually have any power. Nah, it's better for him to say what he said so people can say "they're looking into it" but he doesn't actually step on partners' toes if they're opposed. Plus it's the usual testing-the-waters response where they can see if there is strong consumer interest.
|
# ? May 5, 2014 23:33 |
|
Bobtista posted:You're basically saying "This 4.5 litre V8 is basically just the same thing as this 50cc moped engine, eventually my scooter will be as fast as a ferrari the tuning shop just needs to work out the kinks!" Is it really as bad as that, though? Is the PS4 that much faster? I don't see how what are effectively similar hardware configurations makes for a Ferrari-versus-moped analogy. Another question: is raw computing power an excuse for less efficient coding? Consoles have always needed to work within certain limitations. I'm looking at reasons why things may work out, not why they won't. I can see those reasons.
|
# ? May 5, 2014 23:41 |
|
CAPT. Rainbowbeard posted:Is it really as bad as that, though? Is the PS4 that much faster? I don't see how what are effectively similar hardware configurations makes for a Ferrari versus a moped analogy. The PS4 is unambiguously more powerful. It's not Ferrari-to-moped, but it's distinct, and given their price difference, if raw power is the only reason you buy consoles you would be a moron not to snag the PS4. Note that Microsoft's strategy changes and differentiation have made zero overtures toward graphics; instead they're pushing other features on a device that, while I love it, is at least 200 dollars overpriced for the consumer target they should be hitting.
|
# ? May 5, 2014 23:45 |
|
CAPT. Rainbowbeard posted:Is it really as bad as that, though? Is the PS4 that much faster? I don't see how what are effectively similar hardware configurations makes for a Ferrari versus a moped analogy. Fundamentally, the problem is ports. Specifically, multiformat developers will almost certainly make the game for one format and then port to other formats from there. The big problem (for Microsoft) is that the PS4 is looking set to be the target format for pretty much all multiformat games for basically all of this new gen, because it's easier to work with and there are more units sold in pretty much every country where it's available versus the Xbone. Hence, they're going to have to port down to fit on the weaker system, which results in sub-1080p resolutions, effects downgrades, FPS downgrades, and various other junk we've already seen in multiformat games. Imagine you were given the choice of a bog-standard VW Golf and a VW Golf GTi for a track day. Oh, and the GTi was 20% cheaper to rent. You would pick the GTi any day of the week. Sure, the regular Golf would get you around the track, but the GTi would accelerate faster, corner better, and be cheaper overall. It is objectively the better pick. The problem the Xbone has is that in this analogy, it is the bog-standard Golf, and the PS4 is the GTi. Of course, no analogy is perfect, and there are going to be complications (e.g. exclusive games), but ultimately, for the vast majority of bread-and-butter multiformat games, the PS4 version will be better. Delusibeta fucked around with this message at 00:01 on May 6, 2014 |
# ? May 5, 2014 23:57 |
|
CAPT. Rainbowbeard posted:Is it really as bad as that, though? Is the PS4 that much faster? I don't see how what are effectively similar hardware configurations makes for a Ferrari versus a moped analogy. I was using hyperbole to make the point more obvious, but it still boils down to the fact that there's literally nothing that can be done with the current Xbone hardware to match the PS4 on performance, assuming developers make the same effort on both platforms. They are so similar you can virtually look at a PS4 as being Xbone+1. No matter what can be done with the Xbone hardware, the PS4 will edge it out by virtue of just being objectively better hardware. This isn't a case like the Cell, where it was a decent chip that just took almost an entire console life cycle to figure out due to being stupidly esoteric.
|
# ? May 6, 2014 00:06 |
|
The current performance gap between the two should get smaller though, as developers learn to use the ESRAM effectively, among other things like DX12, OS improvements, etc. I'd expect AAA Title X released in 2015 to run at 1080p on PS4 and 900p on Xbone instead of the current 720p.
|
# ? May 6, 2014 00:22 |
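For anyone keeping score on what those resolution bumps actually mean, the raw pixel arithmetic for the standard 16:9 sizes is quick to sketch (just arithmetic, nothing console-specific assumed):

```python
# Pixel counts behind the 720p/900p/1080p arguments (standard 16:9 sizes).
resolutions = {"720p": (1280, 720), "900p": (1600, 900), "1080p": (1920, 1080)}
base = 1280 * 720  # 720p as the reference point

for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {px:,} pixels ({px / base:.2f}x 720p)")
# 720p:  921,600 pixels (1.00x)
# 900p:  1,440,000 pixels (1.56x)
# 1080p: 2,073,600 pixels (2.25x)
```

So 720p to 900p is a bit over half again the pixels per frame, and 1080p is 2.25x 720p, which is why the gap reads as a full tier rather than a rounding error.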
|
hanyolo posted:learn to use the ESRAM effectively You mean learn to deal with the crippling bottleneck in the rendering pipeline?
|
# ? May 6, 2014 00:54 |
|
hanyolo posted:The current performance gap between the two should get smaller though, as developers learn to use the ESRAM effectively among other things like DX12, OS Improvements, etc. I'd expect AAA Title X released in 2015 to run at 1080p PS4, 900p xbone instead of the current 720p. Oh yeah, no doubt the gap will get smaller. People just have to accept they aren't going to see parity for multiplats (and most likely equivalent first party titles too since an in-house dev is generally going to push the hardware to the limit) unless Sony poo poo the bed big time and the xbone becomes the preferred development platform so PS4 games are gimped by devs not wanting to put in extra work.
|
# ? May 6, 2014 00:56 |
|
Bobtista posted:Oh yeah, no doubt the gap will get smaller. People just have to accept they aren't going to see parity for multiplats (and most likely equivalent first party titles too since an in-house dev is generally going to push the hardware to the limit) unless Sony poo poo the bed big time and the xbone becomes the preferred development platform so PS4 games are gimped by devs not wanting to put in extra work. People will learn how to use the Xbone's hardware more efficiently, but they're doing the same with the PS4. The gap may not get smaller; it might just start from a higher base.
|
# ? May 6, 2014 00:58 |
|
hanyolo posted:The current performance gap between the two should get smaller though, as developers learn to use the ESRAM effectively among other things like DX12, OS Improvements, etc. I'd expect AAA Title X released in 2015 to run at 1080p PS4, 900p xbone instead of the current 720p. That does suggest flat performance for the PS4. But just as the PS3 got better over the lifetime of the system, the 360 also saw dramatic improvement in the quality and complexity of its games. Later in life, Xbox games will look more impressive, but so will the PS4's. CoD at the start of the 360 compared to CoD at the end of the 360 is dramatically improved. Edit: CAPT. Rainbowbeard posted:I'm looking at reasons why things may work out, not why they won't. I can see those reasons. At its core, I think the only proper solution is exclusives. If there's nothing to compare against, the performance difference won't matter. It's an expensive proposition, but buying up all the promising games and keeping them away from not only Sony but also the 360 and PC is the path to success. It is essentially the WiiU strategy, but at least the gulf in performance isn't so vast that I think multiformat developers would ever stop making ports to it. If it was the only place to play Metal Gear and Call of Duty, I think people would end up buying in because those are the huge draws. But it also requires continuing to fling more money at it simply to increase the player base, not to make actual profit. Ape Agitator fucked around with this message at 01:07 on May 6, 2014 |
# ? May 6, 2014 01:01 |
|
GENDERWEIRD GREEDO posted:People will learn how to use the Xbone's hardware more efficiently but they're doing the same with the PS4. The gap may not get smaller, it might just get a higher base. I'm thinking in an ideal world for Xbone. It's within the realm of possibility that, since it seems the PS4 hardware is much more straightforward to work with, it could be argued there's less headroom for working out new tricks because it's the same x86 platform people have been working on for years. It'll still beat the Xbone hands down, but they could do something neat with the ESRAM to close a little bit of the gap, maybe. Of course it could just turn out they can't do poo poo with it and the gap gets even wider as PS4 development gets easier and easier and the ESRAM bottleneck leaves the Xbone crippled. Time will tell. Ape Agitator posted:That does suggest flat performance for the PS4. But just as the PS3 got better over the lifetime of the system, the 360 also saw dramatic improvement in the quality and complexity of its games. Later in life, Xbox games will look more impressive but so will the PS4's. CoD at the start of the 360 compared to CoD at the end of the 360 is dramatically improved. The PS3 got better because devs worked out how to get the most out of the CPU. The only thing devs have to struggle with on Xbone is really the ESRAM (and the slower RAM in general, but let's not complicate things), and no matter what they do with the ESRAM it's not going to compensate fully for having slower *every other component here*. Both consoles will see graphical improvement, but the Xbone will never reach parity. In an ideal world the gap will get smaller, and that's the best Microsoft can hope for short of Sony loving something up so badly that people abandon the PS4 as a platform.
|
# ? May 6, 2014 01:09 |
|
Bobtista posted:Oh yeah, no doubt the gap will get smaller. People just have to accept they aren't going to see parity for multiplats (and most likely equivalent first party titles too since an in-house dev is generally going to push the hardware to the limit) unless Sony poo poo the bed big time and the xbone becomes the preferred development platform so PS4 games are gimped by devs not wanting to put in extra work. You guys need to be prepared for the idea that PS4 multiplats could just as easily end up looking even better relative to Xbox One games in the future as not. Things will happen that will make Xbox One games look better than now, but Sony will also be working on making PS4 games look better. If the various secret sauces people are proposing for Xbox don't cancel Sony's new developments out, expect the gap to widen as developers concentrate on the more popular platform. Edit: I should have refreshed. Fergus Mac Roich fucked around with this message at 01:17 on May 6, 2014 |
# ? May 6, 2014 01:15 |
|
Back to the install thing, Tomb Raider was ready to go from digital download at about 15% or something, and I never got to a point where I needed to wait for it to download more so I could keep playing. Hopefully as time goes on and developers streamline processes (and the OS improves), insert disc->play game will be a definite Thing again.
|
# ? May 6, 2014 01:26 |
|
CAPT. Rainbowbeard posted:Another question: is raw computing power an excuse for less efficient coding? Consoles have always needed to work within certain limitations. I think it was the Doom guy who's all about the Oculus VR thing who said the longass 360/PS3 gen was just what developers needed, because it made them stop being lazy.
|
# ? May 6, 2014 06:11 |
|
Bobtista posted:Of course it could just turn out they can't do poo poo with it and the gap gets even wider as PS4 development gets easier and easier and the ESRAM bottleneck leaves the xbone crippled. Time will tell This. The ESRAM may be speedy, but it is too small for 1080p 60fps. The PS4 can already hit 1080p reliably and do a solid 30fps. Further optimizations for the PS4 will mean even better-looking games. Further optimizations for the XB1 will mean that it might reliably hit 1080/60. If they truly believe that people can't tell the difference between 1080p and 720p, it would be better if they just accepted the performance limitation and made games look good at 720p instead of some weird not-720p resolution.
|
# ? May 6, 2014 06:54 |
|
Goon posted:Further optimizations for the xb1 will mean that it might reliably hit 1080/60. As far as I've read, 32MB of ESRAM is not enough to buffer 1080p frames at 60 per second. It's not like the Cell situation someone asked about earlier; that was a case of a more powerful system that took time for devs to learn how to use. This is a literally weaker system that devs are going to have to find a way around. You can see why they put the effort into developing for the Cell.
|
# ? May 6, 2014 07:40 |
|
A shrubbery! posted:As far as I've read, 32MB ESRAM is not enough to buffer 1080p frames at 60 per second. It is, but not with the post-processing effects like AA/AF etc. that modern games need to stop them looking like terrible swimmy jagged messes.
|
# ? May 6, 2014 08:19 |
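To put numbers on the 32MB back-and-forth, here's the rough render-target arithmetic. This is a sketch with common-but-assumed formats (32-bit color, 32-bit depth/stencil, a four-target G-buffer); real engines mix and match:

```python
def mib(nbytes):
    """Convert a byte count to binary megabytes (MiB)."""
    return nbytes / (1024 * 1024)

W, H = 1920, 1080          # 1080p
color = W * H * 4          # one 32-bit RGBA color target
depth = W * H * 4          # a 32-bit depth/stencil buffer

print(f"color target:     {mib(color):.1f} MiB")          # ~7.9 MiB
print(f"color + depth:    {mib(color + depth):.1f} MiB")  # ~15.8 MiB

# A deferred-rendering G-buffer (several full-size targets, common in
# modern engines) blows well past a 32 MiB budget on its own:
gbuffer = 4 * color        # e.g. albedo, normals, material, emissive
print(f"G-buffer + depth: {mib(gbuffer + depth):.1f} MiB")  # ~39.6 MiB
```

A bare 1080p color-plus-depth pair fits in 32MB with room to spare; add MSAA or a deferred G-buffer and it doesn't, which is roughly the "it fits, but not with the effects" point being argued here.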
|
On the other hand, art direction is so much more important than processing power at this point that it's hard to believe anyone but pixel counters can tell. Sure, the Golf GTi is a nicer car than the standard Golf, but if I drive to work with the music up, am I ever going to loving notice the difference in handling and acceleration? I drive a Prius but my parents have nice Mercedes, and I generally don't notice the difference in engine power at all; it's not like I'm at a racetrack all the time. I agree it's lame the bone is more expensive and less powerful, but christ, art direction is so much more important than processing power. For instance, The Last of Us looks amazing, while Dark Souls 2 looks almost the same as late-era PS2 games like Shadow of the Colossus (minus the 5 fps issues Shadow had). I've been going between BF3 (360) and BF4 (bone) and honestly do not "actively" notice the difference in graphics at all. It's such a minor jump from 360 to Bone (I do see it if I'm looking but normally don't care) that when I eventually get a PS4, I'm 99% sure I won't be able to succeed in a blind test for which system I'm on if I'm sitting on my couch (±10 feet from a 55''). E: Oh but the $100 price difference is a pisser. If it was the same price I'd think that everyone who really cared or chose their system based on a ~20% processing power difference was probably brain damaged. Saladman fucked around with this message at 09:54 on May 6, 2014 |
# ? May 6, 2014 09:48 |
|
Art direction isn't going to make up for memory limitations and other factors affected by system specs that aren't just graphical settings.
|
# ? May 6, 2014 10:04 |
|
Saladman posted:Oh but the $100 price difference is a pisser. If it was the same price I'd think that everyone who really cared or chose their system based on a ~20% processing power difference was probably brain damaged. Because wanting the more powerful (and thus more capable) product is stupid. Got it.
|
# ? May 6, 2014 13:27 |
|
Saladman posted:On the other hand, art direction is so much vastly more important than processing power at this point The nice thing is, on other platforms, one doesn't exclude the other - it's great
|
# ? May 6, 2014 13:44 |
|
Saladman posted:On the other hand, art direction is so much vastly more important than processing power at this point, it's hard to believe anyone but pixel counters can tell. Sure the Golf GTi is a nicer car than the Golf standard, but if I drive to work with the music up am I ever going to loving notice the difference in handling and acceleration? I drive a Prius but my parents have nice Mercedes, and I generally don't notice the difference in engine power at all, it's not like I'm at a racetrack all the time. You bought a lovely product, please stop trying to justify it on internet forums to strangers.
|
# ? May 6, 2014 13:51 |
|
We are kinda at the point where slightly higher-res textures and resolutions and framerates only hold so much sway for people. I was playing Wind Waker HD on the Wii U the other day and was as impressed by the graphics in that game as any I have ever seen. I appreciate Second Son being completely absurd on a technical level, and I realize that game sets the standard that other AAA titles will be chasing. It's a fantastic-looking game. But the art direction in Wind Waker HD is such that, despite pixel counts and running on hardware significantly less powerful than the bone, it's incredibly pretty and impressive. I'm not saying that I wouldn't prefer the bone to be as powerful as possible, but the bone is significantly more powerful than the WiiU and my PC is significantly more powerful than a PS4. If I want the best-looking games, I'll get them for my PC. It's just not the most important thing to me most of the time. WinnebagoWarrior fucked around with this message at 14:06 on May 6, 2014 |
# ? May 6, 2014 14:02 |
|
BARONS GAMES WHINER posted:It is, but not with post processing effects like AA/AF etc that modern games have to stop them looking like terrible swimmy jagged messes. And if you are using it for a 1080p frame buffer, you can't then go and store all your textures in there like people want them to.
|
# ? May 6, 2014 14:53 |
|
WinnebagoWarrior posted:We are kinda at the point where slightly higher res textures and resolutions and framerates only hold so much sway for people. I was playing Wind Waker HD on the Wii U the other day and was as impressed by the graphics in that game as any I have ever seen. I appreciate Second Son being completely absurd on a technical level and I realize that game sets the standard that other AAA titles will be chasing. Its a fantastic looking game. Sounds like you could have saved some money with the SD version of that game though.
|
# ? May 6, 2014 15:44 |
|
Fergus Mac Roich posted:Sounds like you could have saved some money with the SD version of that game though. Wind Waker HD isn't just Wind Waker (an excellent-looking game) at a higher resolution, it's a full graphical facelift. It looks much better than the original for reasons other than pixel counts. I know what you are getting at with this statement, but there isn't a lot that a more powerful system could have done with the art direction of that game. 1080p60 and everything, I know, it's been said a thousand times in this thread. But when I play that game I am not thinking "this is ok but I wish it had more pixels", I am thinking "why the gently caress didn't PS3 and 360 games look this great", since crossplatform games look better on the 360 than they do on the WiiU. You can look at Digital Foundry comparisons for a game like Call of Duty and see that the 360 version of the game looks better than the WiiU version. There is absolutely no question. The 360 is a more powerful system in many ways. But a lot of WiiU games look much better than 360 and PS3 games, not because of pixels and framerates but because of their incredible art direction. \/\/\/\/ I actually do agree with this type of thing. Framerates and pixel density get brought up way more than anything else, but I personally find fancy particle effects and physics and lighting to be waaaay more impressive. Of course, those things need more powerful hardware too. But I am always going to get the best version of that on my PC; I just can't play Forza or Mario 3D World on my PC. WinnebagoWarrior fucked around with this message at 16:38 on May 6, 2014 |
# ? May 6, 2014 16:14 |
|
It's not just about pixels; honestly, the biggest upgrade this gen is lighting-engine stuff, which is what made Infamous look like more than just another decent-looking game.
|
# ? May 6, 2014 16:20 |
|
Does anyone know what's supposed to be so great about DX12 that it doubles the efficiency of the One's graphics output? I remember back when the original GeForce was starting to show its age and Nvidia released drivers that made the card relevant again, so much so that, in some cases, it turned titles that ran like slideshows on it into playable games. But that was five or ten, maybe fifteen if you were lucky, extra frames. So while I know extracting worthwhile gains via software refinement isn't out of the realm of possibility, their claim seems kinda like a... mild... exaggeration.
|
# ? May 6, 2014 16:53 |
|
WinnebagoWarrior posted:I know what you are getting at with this statement but there isn't a lot that a more powerful system could have done with the art direction of that game. You keep saying that, but having a less powerful machine doesn't mean you'll get a treasure trove of style over "pixels". Consoles not made by Nintendo haven't historically been a great place for style over graphics. That may change because of id@Xbox, but as a lot of indie devs have indicated, Microsoft is being really lovely about the whole thing. Microsoft atm doesn't exactly have a library of exclusives that value art style. Sedisp fucked around with this message at 17:30 on May 6, 2014 |
# ? May 6, 2014 17:23 |
|
mysterious frankie posted:Does anyone know what's supposed to be so great about dx12 that it doubles the efficiency of the One's graphics output? I remember back when the original GeForce was starting to show its age & Nvidia released drivers that made the card relevant again- so much so that, in some cases, it turned titles that ran like slideshows on it into playable games- but that was five or ten, maybe fifteen if you were lucky, extra frames, so while I know expressing worthwhile gains via software refinement isn't out of the realm of possibility, their claim seems kinda like a... mild... exaggeration. Best part is people constantly saying consoles can perform better since you can program "down to the metal" and then stating that an abstraction layer is going to be the savior.
|
# ? May 6, 2014 17:27 |
|
mysterious frankie posted:Does anyone know what's supposed to be so great about dx12 that it doubles the efficiency of the One's graphics output? I remember back when the original GeForce was starting to show its age & Nvidia released drivers that made the card relevant again- so much so that, in some cases, it turned titles that ran like slideshows on it into playable games- but that was five or ten, maybe fifteen if you were lucky, extra frames, so while I know expressing worthwhile gains via software refinement isn't out of the realm of possibility, their claim seems kinda like a... mild... exaggeration. Nothing. It won't. People just need something to cling to.
|
# ? May 6, 2014 17:28 |
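For what it's worth, DX12's pitched win is lower CPU-side submission overhead, and a toy model with purely made-up frame timings shows why that can't "double" a GPU-bound console's output:

```python
def fps(cpu_ms, gpu_ms):
    # A frame ships no faster than the slower of the two pipelines.
    return 1000.0 / max(cpu_ms, gpu_ms)

# Hypothetical frame: 10 ms of GPU work, 6 ms of CPU work,
# some of it API/driver overhead a leaner API could shrink.
print(fps(cpu_ms=6.0, gpu_ms=10.0))   # GPU-bound: 100.0 fps
print(fps(cpu_ms=2.5, gpu_ms=10.0))   # overhead slashed: still 100.0 fps

# Only a CPU-bound game sees a big win from a thinner API:
print(fps(cpu_ms=16.0, gpu_ms=10.0))  # 62.5 fps
print(fps(cpu_ms=9.0, gpu_ms=10.0))   # 100.0 fps once submission gets cheap
```

If the Xbone's bottleneck is on the GPU/ESRAM side, a thinner API helps at the margins but doesn't conjure shader throughput, which is why the "doubles efficiency" line deserves the skepticism it's getting here.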
|
|
Aphrodite posted:Nothing. It won't. The only way I can see it doing everything they promise, performance-wise, is if they did a monumentally lovely job optimizing DX11 for the hardware and 12 is going to be a new ground-up optimized API.
|
# ? May 6, 2014 17:39 |