|
BangersInMyKnickers posted:We are so far away from the point where IE8 won't be supported and therefore need to get off XP to update it that it isn't even worth considering. Other factors will drive it before that. It really depends on how quickly GPU acceleration of the browser becomes a requirement. The Amazon shelf demo is horribly clunky without GPU acceleration (around 6fps vs 60fps with acceleration). If these types of rich web apps become common quickly, the push will be there.
|
# ? Sep 16, 2010 20:06 |
|
bull3964 posted:It really depends on how quickly GPU acceleration of the browser becomes a requirement. The Amazon shelf demo is horribly clunky without GPU acceleration (around 6fps vs 60fps with acceleration.) If these types of rich web apps become common quickly, the push will be there. Why the hell is a business going to care if Amazon.com is running slowly?
|
# ? Sep 16, 2010 20:18 |
|
BangersInMyKnickers posted:I understand that. It just doesn't have enough oomph to do Aero composition well which is why we've been going with the 256MB addin to handle it. Ahh. Well this is true, it is quite gutless.
|
# ? Sep 16, 2010 20:50 |
|
PC LOAD LETTER posted:Remember when some people once thought that only 640K of memory was needed? Alereon posted:It was actually clocked at ~400fps, and that includes decoding, scaling, and encoding simultaneously. (I can get >100FPS decoding, scaling and encoding an HD video to 480p)
|
# ? Sep 16, 2010 23:15 |
|
BangersInMyKnickers posted:Why the hell is a business going to care if Amazon.com is running slowly? Domino effect. Today Amazon does it, tomorrow Myspace, next week Google. Like lots of things, it'll take time to establish but eventually you will see it on drat near every big site on the Internet. Then some big CEO demands that they use the Next Big Thing (tm) and there you go. Is it going to happen tomorrow? No, but it will happen. E: and I could see it being used for menus and drop down boxes for internal sites. You don't see it today, but if you use your imagination you could see how nicer, graphically driven menus could be easier for business use. KKKLIP ART fucked around with this message at 00:29 on Sep 17, 2010 |
# ? Sep 17, 2010 00:27 |
|
BangersInMyKnickers posted:Why the hell is a business going to care if Amazon.com is running slowly? That's just a single thing. The idea is the GPU can do heavy UI lifting and web UIs are getting more complex all the time. With Intel making it possible for every PC to have a decent GPU in it and Microsoft wanting to promote the latest and greatest IE, I think it's very likely that GPU browser acceleration is going to play a large part in future MS web technologies. I wouldn't be at all surprised to see Sharepoint eventually take advantage of GPU acceleration in the browser, as well as the web client for Exchange. Then you have things like Reporting and Analysis Services for SQL Server for creating graphics- and data-heavy reports that could benefit from enhanced UI acceleration. And that's just MS's ecosystem, to say nothing of the ton of other web applications provided to businesses that could benefit from an enhanced UI. All I'm saying is we may see businesses start ditching XP in droves as progress accelerates, since it won't be able to keep up with web technologies.
|
# ? Sep 17, 2010 03:31 |
|
~Coxy posted:I'm a bit disappointed by the lack of PCI-E lanes and USB 3 support. Forget the USB. Embrace the Light Peak. However, I'd agree the PCI-E lane issue is a minor sticking point. The home user / business won't notice.
|
# ? Sep 17, 2010 04:56 |
|
I'm still skeptical of the value proposition for Light Peak and its chances for adoption. Costs are going to be astronomically higher than any existing solution, since you have to install an optical transceiver for each lane, rather than just connecting a wire to a pin on a chip. USB3.0 maintains backwards compatibility with an incredible base of installed USB2.0 devices, and is cheap enough to implement everywhere. It's at 5Gbps now, has the proven ability to scale to 10Gbps, and can probably scale further if desired. A lot of noise is made about having a single connector type for every application, but it's not like we have a huge array of competing connectors now. You have one cable going to your display that can carry video, audio, Ethernet, control data, and soon USB. You have Ethernet for network connectivity (assuming you're not using Wireless), and USB for everything else. Ethernet isn't going away for obvious reasons, and USB3.0 is already fast enough to carry a 1080p60 uncompressed video stream. It kind of sucks that we have competing DisplayPort/HDMI standards, but you can convert DisplayPort to HDMI/DVI using a cheap passive adapter as long as the system has a TMDS transmitter connected to the port. Apple wants Light Peak because it will let them instantly obsolete all of your peripherals, forcing you to buy new (probably Apple-branded) hardware or expensive adapters. Intel is happy to oblige them because they get a sweet royalty check every time someone makes a Light Peak device.
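That uncompressed-1080p60 claim is easy to sanity-check. A quick back-of-envelope sketch (the 8b/10b line-coding overhead figure is my assumption, not from the post):

```python
# Can USB 3.0's 5 Gbps signaling rate carry uncompressed 1080p60?
def uncompressed_bitrate_gbps(width, height, fps, bits_per_pixel=24):
    """Raw video bitrate in Gbps (1 Gb = 1e9 bits)."""
    return width * height * fps * bits_per_pixel / 1e9

rate = uncompressed_bitrate_gbps(1920, 1080, 60)
print(f"1080p60 raw: {rate:.2f} Gbps")  # ~2.99 Gbps
# Even after 8b/10b line coding eats 20% (leaving ~4 Gbps usable), it fits.
assert rate < 5 * 0.8
```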
|
# ? Sep 17, 2010 06:12 |
|
Aleksei Vasiliev posted:I don't remember this because it never happened
|
# ? Sep 17, 2010 06:46 |
|
incoherent posted:Forget the USB I read about this. Ridiculous.
|
# ? Sep 17, 2010 08:20 |
|
Nomenklatura posted:Gaben must be dancing over this (before getting tired and eating another pizza...) Hah, that Gabe Newell is one heavy-set dude, that's for sure. This is probably what he really does when he reads news like this. LOL :o)
|
# ? Sep 17, 2010 18:22 |
|
Ryokurin posted:I'm interested in Sandy Bridge, but I'm more interested in Zacate as it's benching faster than a i5m chip. It's giving me hope that Llano will be nice hardware. I'm extremely interested in Zacate as well. I think Zacate is going to take over the mobile and HTPC departments and may become the standard for cheap, low power, low profile desktops as well. Nothing can compete with it at 18W.
|
# ? Sep 17, 2010 21:54 |
|
Factory Factory posted:Will we never again feel the thrill of a desktop or laptop upgrade with a significant boost in capability? Are we doomed forever to only get our "new toy" excitement when Apple releases their version of a previously unpopular product and revitalizes that market? I think like other people have been saying it's simply a matter of time before software catches up to hardware. Or other bits of hardware catch up to the hardware. In the meantime have you seen how fast cellphone technology is moving? And keep in mind that poo poo like memristors are probably going to hit within a decade. What the gently caress are people going to be able to do with nonvolatile memory an order of magnitude denser, an order of magnitude lower-power, and two orders of magnitude faster than flash that works better as it gets smaller, can store non-binary states, can be used as programmable analog circuits, can implement neural networks in hardware, is likely stackable in 3D, and which can do computation without a CPU? I don't know man. Nobody knows. But I do know that it's going to loving. Rule.
|
# ? Sep 17, 2010 23:51 |
|
Spime Wrangler posted:And keep in mind that poo poo like memristors are probably going to hit within a decade. What the gently caress are people going to be able to do with nonvolatile memory an order of magnitude denser, an order of magnitude lower-power, and two orders of magnitude faster than flash that works better as it gets smaller, can store non-binary states, can be used as programmable analog circuits, can implement neural networks in hardware, is likely stackable in 3D, and which can do computation without a CPU?
|
# ? Sep 18, 2010 00:14 |
|
Alereon posted:Apple wants Light Peak because it will let them instantly obsolete all of your peripherals, forcing you to buy new (probably Apple-branded) hardware or expensive adapters. Intel is happy to oblige them because they get a sweet royalty check every time someone makes a Light Peak device. I think if (maybe when) Light Peak falls flat on its face, the biggest benefit will be the industry experience gained in the R&D and manufacturing of mass-market consumer optical devices (kind of like how Toslink took off super fast and is in drat near every piece of A/V equipment now). Then when the next-next generation of consumer insanity hits, we shall be ready!
|
# ? Sep 18, 2010 21:44 |
|
movax posted:I think if (maybe when) Light Peak falls flat on its face, the biggest benefit will be the industry experience gained in the R&D and manufacturing of mass-market consumer optical devices (kind of like how Toslink took off super fast and is in drat near every piece of A/V equipment now). The thing with the optical transceivers is that if you're going to make them in batches of 100,000 for mass consumer level stuff, they get really stupidly cheap. The only really lovely part will be the fact that you need a fusion splicer to make cables.
|
# ? Sep 19, 2010 00:22 |
|
Methylethylaldehyde posted:The thing with the optical transceivers is that if you're going to make them in batches of 100,000 for mass consumer level stuff, they get really stupidly cheap. The only really lovely part will be the fact that you need a fusion splicer to make cables. Sounds like a place where Monster can actually have a legitimate market rather than just selling to people who don't know better.
|
# ? Sep 19, 2010 00:28 |
|
So while Intel is making a one-chip solution, since normal people don't use most of their graphics adapter, Nvidia is betting on technologies like CUDA, hoping people will now use fast graphics adapters for more than just games. Someone is going to be really wrong about this.
|
# ? Sep 19, 2010 01:45 |
|
You realize that people just using Intel's onboard graphics has been the norm for nearly a decade now, right? The Sandy Bridge stuff seems to be more about snagging people who do some gaming as well.
|
# ? Sep 19, 2010 01:52 |
|
fishmech posted:You realize that people just using Intel's onboard graphics has been norm for nearly a decade now right? The Sandy Bridge stuff seems to be more about snagging people who some gaming as well. It also lays the groundwork for people who are casual gamers, so Pop Cap can make even prettier games that will work awesomely on the new integrated video chips.
|
# ? Sep 19, 2010 02:15 |
|
I thought Intel said earlier this year or late last year that they were getting out of the GPU market? If so, why the sudden change in direction?
|
# ? Sep 19, 2010 02:48 |
|
CommieGIR posted:I thought Intel said earlier this year or late last year that they were getting out of the GPU market? If so, why the sudden change in direction? They said discrete GPU market, IIRC. Basically they killed Larrabee and that was it.
|
# ? Sep 19, 2010 02:49 |
|
Disgustipated posted:They said discrete GPU market, IIRC. Basically they killed Larrabee and that was it. Ah makes more sense.
|
# ? Sep 19, 2010 02:56 |
|
the ohmebaglod flag posted:So while Intel is making a one-chip solution, since normal people don't use most of their graphics adapter, Nvidia is betting on technologies like CUDA, hoping people will now use fast graphics adapters for more than just games. It's not really a question of being wrong; they're actively trying to force things in different directions, not guessing which way things will go.
|
# ? Sep 19, 2010 05:58 |
|
Plorkyeran posted:It's not really a question of being wrong; they're actively trying to force things in different directions, not guessing which way things will go. It is if they can't get a strong market to form around their architecture, on which they went all-in. Personally, I think the AMD approach is actually much more revolutionary. They made a real good move acquiring ATI and getting in-house experience in high-end graphics processor design. Too bad Nvidia is working them over on the GPGPU software side of things and Intel on the CPU performance side. Still, I wouldn't be surprised to see ATI as the dominant player in a few years once they get the bugs worked out in Fusion and someone figures out what the gently caress to do with widespread GPGPU abilities. Anyone want to take bets on Intel avoiding programmable graphics hardware because they know they would just get their rear end handed to them if they helped the market take off too soon?
|
# ? Sep 19, 2010 07:12 |
|
Alereon posted:It was actually clocked at ~400fps, and that includes decoding, scaling, and encoding simultaneously. Keep in mind also that since this is being done in dedicated fixed-function hardware, there won't be much of a performance impact on the system. You could have a video transcoding in the background while playing a game without the performance of either being impacted. Besides, the current generation of GPU-accelerated encoders look like crap, and most people don't really care if they're just putting videos on their mobile devices or uploading to Youtube.
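For a sense of scale on those throughput numbers, here's the arithmetic (the fps figures are the ones quoted in the thread; the 2-hour/24fps clip is my own example):

```python
# What ~400 fps fixed-function transcode throughput means in wall-clock terms.
def transcode_minutes(clip_frames, throughput_fps):
    return clip_frames / throughput_fps / 60

movie_frames = 2 * 60 * 60 * 24  # a 2-hour movie at 24 fps = 172,800 frames
print(transcode_minutes(movie_frames, 400))  # fixed-function: 7.2 minutes
print(transcode_minutes(movie_frames, 100))  # software path: 28.8 minutes
```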
|
# ? Sep 19, 2010 08:03 |
|
the ohmebaglod flag posted:So while Intel is making a one-chip solution, since normal people don't use most of their graphics adapter, Nvidia is betting on technologies like CUDA, hoping people will now use fast graphics adapters for more than just games. Or they are two separate market segments?
|
# ? Sep 19, 2010 08:15 |
|
Doc Block posted:Unfortunately there are a lot of businesses with no plans whatsoever to upgrade beyond XP in the immediate future. Where I work, we recently picked up a few new business customers after our main competitor dropped support for Windows 2000. Some of them are currently planning their migrations TO Windows XP. The only reason any of my customers even have anything as fast as a Core2 is because they ran out of spare motherboards for their GX280s. The vast majority are still perfectly happy plodding along on Pentium 4s in the low 2GHz range. As a result of this, a 2GHz P4 running XP is still the baseline that we have to develop for, including new software. As for Sandy Bridge, another loving new socket? It seems like only 5 minutes since Intel brought out 2 new sockets for i3/i5/i7 and now they're obsolete already. I'm glad I decided to sit this round out and just stick a 3GHz C2Quad into my X48 board.
|
# ? Sep 19, 2010 09:49 |
|
~Coxy posted:I'm a bit disappointed by the lack of PCI-E lanes and USB 3 support. MrBond posted:I thought Handbrake and x264 shun 2-pass encodes in favor of the "quality based" ones now? If you need to hit a certain file size, there's no way around two passes. Combat Pretzel fucked around with this message at 12:03 on Sep 19, 2010 |
# ? Sep 19, 2010 11:57 |
|
Combat Pretzel posted:If you need to hit a certain file size, there's no way around two passes. I don't need either of those. And I'm pretty sure encoding to a constant quality and limiting the max bitrate means you don't need 2-pass for device-compatible encoding, too.
|
# ? Sep 19, 2010 13:24 |
|
The day this starts supporting GPGPU is the day I'll blow my load. Right now, I'm wanting to re-jig some computational fluid dynamics simulations that I have running on CPU to run on a GPU. It would be so stupidly faster it's unbelievable, but I have the major trouble of bandwidth to/from storage, because GPUs lack the RAM I need. If this supports GPGPU and I can use a fast bus to a 12-24GB RAM box, then my problems vanish to a large extent. This will be a wonderful day indeed.
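The bottleneck described above is easy to quantify. A rough sketch, where the specific numbers (working-set size, sustained PCIe throughput) are my illustrative assumptions rather than anything from the post:

```python
# Time to shuttle a CFD working set that exceeds GPU memory over the host bus.
def transfer_seconds(gib, link_gb_per_s):
    """Seconds to move `gib` GiB over a link sustaining link_gb_per_s GB/s."""
    return gib * 2**30 / (link_gb_per_s * 1e9)

grid_gib = 18      # a working set inside the 12-24 GB range mentioned above
pcie2_x16 = 8.0    # roughly what a PCIe 2.0 x16 link sustains in practice
print(f"{transfer_seconds(grid_gib, pcie2_x16):.2f} s per full sweep")  # ~2.42 s
```

If the solver needs that sweep every iteration, the bus sets the pace, not the GPU.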
|
# ? Sep 19, 2010 14:55 |
|
Spime Wrangler posted:It is if they can't get a strong market to form around their architecture on which they went all-in. What choice did they have? AMD couldn't afford them, Intel would be a legal nightmare, and VIA's x86 license was questionable up until recently. Better to go all in now while you still have major influence than wait until the dust settles.
|
# ? Sep 19, 2010 15:23 |
|
Nam Taf posted:The day this starts supporting GPGPU is the day I'll blow my load. I thought GPUs could access system ram as-needed since the AGP days?
|
# ? Sep 19, 2010 15:23 |
|
BangersInMyKnickers posted:I thought GPUs could access system ram as-needed since the AGP days?
|
# ? Sep 19, 2010 15:43 |
|
And only at the maximum amount set by the AGP aperture size. It was basically meant as a last-resort move if the video card's memory was full.
|
# ? Sep 19, 2010 15:47 |
|
Aleksei Vasiliev posted:I don't need either of those. And I'm pretty sure encoding to a constant quality and limiting the max bitrate means you don't need 2-pass for device-compatible encoding, too. If CBR would churn out decent results, people wouldn't have invented two-pass modes. And in regards to the Intel Media Engine, I'm pretty sure it won't hold its own against software implementations in terms of encoding quality and complexity.
|
# ? Sep 19, 2010 16:39 |
|
Intel is bringing a feature from mainframes to the desktop: software-upgradeable processors. For $50, you can buy a scratch-off code that can be used with the Intel Upgrade Application to enable HyperThreading and an additional 1MB of L3 cache on a Pentium G6951. This seems like a pretty lovely deal, since it basically takes a $100 processor and still doesn't make it as good as a $115 Core i3 processor. If it turned the CPU into a real Core i5 with Turbo and the regular GPU clock speeds, in addition to HT and the rest of the cache, that might be a worthwhile upgrade.
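The value math, spelled out with the prices quoted above:

```python
# Price comparison for the upgrade-code scheme (prices as quoted in the post).
g6951, upgrade_code, core_i3 = 100, 50, 115

print(g6951 + upgrade_code)             # -> 150 for a still-cut-down chip
print(core_i3)                          # -> 115 for a full Core i3 outright
print(g6951 + upgrade_code - core_i3)   # -> 35 premium for the worse part
```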
|
# ? Sep 19, 2010 19:44 |
|
Alereon posted:Intel is bringing a feature from mainframes to the desktop: software-upgradeable processors. For $50, you can buy a scratch-off code that can be used with the Intel Upgrade Application to enable HyperThreading and an additional 1MB of L3 cache on a Pentium G6951. This seems like a pretty lovely deal, since it basically takes a $100 processor and still doesn't make it as good as a $115 Core i3 processor. If it turned the CPU into a real Core i5 with Turbo and the regular GPU clock speeds, in addition to HT and the rest of the cache, that might be a worthwhile upgrade. ...and this is why I'm going to stick with AMD, they haven't tried to pull poo poo like this yet
|
# ? Sep 19, 2010 19:45 |
|
Combat Pretzel posted:If CBR would churn out decent results, people wouldn't have invented two-pass modes. Also CBR looks great with enough bitrate, it's just ridiculously wasteful and should only be used in broadcast/streaming.
|
# ? Sep 19, 2010 19:55 |
|
CommieGIR posted:...and this is why I'm going to stick with AMD, they haven't tried to pull poo poo like this yet Yeah, AMD tells you you should pay full price for a complete replacement CPU from the same batch that was binned differently.
|
# ? Sep 19, 2010 20:07 |