bull3964
Nov 18, 2000

DO YOU HEAR THAT? THAT'S THE SOUND OF ME PATTING MYSELF ON THE BACK.


BangersInMyKnickers posted:

We are so far away from the point where IE8 won't be supported, and we'd therefore need to get off XP to update it, that it isn't even worth considering. Other factors will drive it before that.

It really depends on how quickly GPU acceleration of the browser becomes a requirement. The Amazon shelf demo is horribly clunky without GPU acceleration (around 6fps vs 60fps with acceleration). If these types of rich web apps become common quickly, the push will be there.

BangersInMyKnickers
Nov 3, 2004

I have a thing for courageous dongles

bull3964 posted:

It really depends on how quickly GPU acceleration of the browser becomes a requirement. The Amazon shelf demo is horribly clunky without GPU acceleration (around 6fps vs 60fps with acceleration). If these types of rich web apps become common quickly, the push will be there.

Why the hell is a business going to care if Amazon.com is running slowly?

Nebulis01
Dec 30, 2003
Technical Support Ninny

BangersInMyKnickers posted:

I understand that. It just doesn't have enough oomph to do Aero composition well, which is why we've been going with the 256MB add-in to handle it.

Ahh. Well this is true, it is quite gutless.

Malloc Voidstar
May 7, 2007

Fuck the cowboys. Unf. Fuck em hard.

PC LOAD LETTER posted:

Remember when some people once thought that only 640K of memory was needed?
I don't remember this because it never happened

Alereon posted:

It was actually clocked at ~400fps, and that includes decoding, scaling, and encoding simultaneously.
I don't care if it's four times faster than x264 if it looks like utter poo poo.
(I can get >100FPS decoding, scaling and encoding an HD video to 480p)

KKKLIP ART
Sep 3, 2004

BangersInMyKnickers posted:

Why the hell is a business going to care if Amazon.com is running slowly?

Domino effect. Today Amazon does it, tomorrow Myspace, next week Google. Like lots of things, it'll take time to establish, but eventually you will see it on drat near every big site on the Internet. Then some big CEO demands that they use the Next Big Thing (tm) and there you go. Is it going to happen tomorrow? No, but it will happen.

E: And I could see it being used for menus and drop-down boxes on internal sites. You don't see it today, but if you use your imagination you can see how nicer, graphically driven menus could be easier for business use.

KKKLIP ART fucked around with this message at 00:29 on Sep 17, 2010

bull3964
Nov 18, 2000

DO YOU HEAR THAT? THAT'S THE SOUND OF ME PATTING MYSELF ON THE BACK.


BangersInMyKnickers posted:

Why the hell is a business going to care if Amazon.com is running slowly?

That's just a single thing. The idea is that the GPU can do heavy UI lifting, and web UIs are getting more complex all the time. With Intel making it possible for every PC to have a decent GPU and Microsoft wanting to promote the latest and greatest IE, I think it's very likely that GPU browser acceleration is going to play a large part in future MS web technologies.

I wouldn't be at all surprised to see Sharepoint eventually take advantage of GPU acceleration in the browser, as well as the web client for Exchange. Then you have things like Reporting and Analysis Services for SQL Server, for creating graphics- and data-heavy reports that could benefit from enhanced UI acceleration.

And that's just MS's ecosystem; that's saying nothing about the ton of other web applications provided to businesses that could benefit from an enhanced UI.

All I'm saying is we may see businesses start ditching XP in droves as progress accelerates, since it won't be able to keep up with web technologies.

incoherent
Apr 24, 2004

01010100011010000111001
00110100101101100011011
000110010101110010

~Coxy posted:

I'm a bit disappointed by the lack of PCI-E lanes and USB 3 support.
Do we have a rough date for the 1366 replacement?

Forget the USB

Embrace the light peak

However, I'd agree the PCIe lane issue is a minor sticking point. The home user/business won't notice.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
I'm still skeptical of the value proposition for Light Peak and its chances for adoption. Costs are going to be astronomically higher than for any existing solution, since you have to install an optical transceiver for each lane rather than just connecting a wire to a pin on a chip. USB 3.0 maintains backwards compatibility with an incredible base of installed USB 2.0 devices, and is cheap enough to implement everywhere. It's at 5Gbps now, has the proven ability to scale to 10Gbps, and can probably scale further if desired.

A lot of noise is made about having a single connector type for every application, but it's not like we have a huge array of competing connectors now. You have one cable going to your display that can carry video, audio, Ethernet, control data, and soon USB. You have Ethernet for network connectivity (assuming you're not using wireless), and USB for everything else. Ethernet isn't going away for obvious reasons, and USB 3.0 is already fast enough to carry a 1080p60 uncompressed video stream. It kind of sucks that we have competing DisplayPort/HDMI standards, but you can convert DisplayPort to HDMI/DVI using a cheap passive adapter as long as the system has a TMDS transmitter connected to the port.

Apple wants Light Peak because it will let them instantly obsolete all of your peripherals, forcing you to buy new (probably Apple-branded) hardware or expensive adapters. Intel is happy to oblige them because they get a sweet royalty check every time someone makes a Light Peak device.
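
To put a number on the 1080p60 claim above, here's a minimal back-of-the-envelope sketch in Python; the 24 bits per pixel and the 8b/10b line-coding overhead are my assumptions, not figures from the post.

```python
# Rough check: can USB 3.0 (5 Gbps line rate) carry uncompressed 1080p60?
# Assumes 24 bits per pixel (8 each for R, G, B) and 8b/10b encoding,
# which leaves 80% of the line rate as usable payload.

width, height, fps, bits_per_pixel = 1920, 1080, 60, 24

video_gbps = width * height * fps * bits_per_pixel / 1e9
usable_gbps = 5.0 * 0.8  # 5 Gbps line rate minus 8b/10b overhead

print(f"1080p60 uncompressed: {video_gbps:.2f} Gbps")   # ~2.99 Gbps
print(f"USB 3.0 usable rate:  {usable_gbps:.2f} Gbps")  # 4.00 Gbps
```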

PC LOAD LETTER
May 23, 2005
WTF?!

Aleksei Vasiliev posted:

I don't remember this because it never happened
OK, my bad. Still, I think it's short-sighted to disregard GPGPU because it doesn't make sense for business apps.

4 Day Weekend
Jan 16, 2009

incoherent posted:

Forget the USB

Embrace the light peak

I read about this. Ridiculous.

brap
Aug 23, 2004

Grimey Drawer

Nomenklatura posted:

Gaben must be dancing over this (before getting tired and eating another pizza...)

Hah, that Gabe Newell is one heavy-set dude, that's for sure. This is probably what he really does when he reads news like this. LOL :o)

Medikit
Dec 31, 2002

que lástima

Ryokurin posted:

I'm interested in Sandy Bridge, but I'm more interested in Zacate, as it's benching faster than an i5m chip. It's giving me hope that Llano will be nice hardware.

http://www.anandtech.com/show/3920/amd-benchmarks-zacate-apu-2x-faster-gpu-performance-than-core-i5

http://www.anandtech.com/show/3933/amds-zacate-apu-performance-update

I still think that, at least for next year, most of us are still going to want discrete cards, so what I'm shopping for is HTPCs, which Sandy Bridge may be a little bit overkill for. If Zacate can handle 1080i without the need for a discrete card I will be overjoyed. I have to ignore all the Atoms and most onboard solutions nowadays because no one cares about 1080i.

I'm extremely interested in Zacate as well. I think Zacate is going to take over the mobile and HTPC departments and may become the standard for cheap, low-power, low-profile desktops as well. Nothing can compete with it at 18W.

Spime Wrangler
Feb 23, 2003

Because we can.

Factory Factory posted:

Will we never again feel the thrill of a desktop or laptop upgrade with a significant boost in capability? Are we doomed forever to only get our "new toy" excitement when Apple releases their version of a previously unpopular product and revitalizes that market?

:smith:

I think, like other people have been saying, it's simply a matter of time before software catches up to the hardware. Or other bits of hardware catch up to the hardware. In the meantime, have you seen how fast cellphone technology is moving?



And keep in mind that poo poo like memristors are probably going to hit within a decade. What the gently caress are people going to be able to do with nonvolatile memory that's an order of magnitude denser, an order of magnitude lower-power, and two orders of magnitude faster than flash, that works better as it gets smaller, can store non-binary states, can be used as programmable analog circuits, can implement neural networks in hardware, is likely stackable in 3D, and can do computation without a CPU?

I don't know man. Nobody knows. But I do know that it's going to loving. Rule.

BlackMK4
Aug 23, 2006

wat.
Megamarm

Spime Wrangler posted:

And keep in mind that poo poo like memristors are probably going to hit within a decade. What the gently caress are people going to be able to do with nonvolatile memory that's an order of magnitude denser, an order of magnitude lower-power, and two orders of magnitude faster than flash, that works better as it gets smaller, can store non-binary states, can be used as programmable analog circuits, can implement neural networks in hardware, is likely stackable in 3D, and can do computation without a CPU?
This sounded like a spergy movie quote. It was loving awesome. :coal:

movax
Aug 30, 2008

Alereon posted:

Apple wants Light Peak because it will let them instantly obsolete all of your peripherals, forcing you to buy new (probably Apple-branded) hardware or expensive adapters. Intel is happy to oblige them because they get a sweet royalty check every time someone makes a Light Peak device.

I think if (maybe when) Light Peak falls flat on its face, the biggest benefit will be the industry experience gained in the R&D and manufacturing of mass-market consumer optical devices (kind of like how Toslink took off super fast and is in drat near every piece of A/V equipment now).

Then when the next-next generation of consumer insanity hits, we shall be ready!

Methylethylaldehyde
Oct 23, 2004

BAKA BAKA

movax posted:

I think if (maybe when) Light Peak falls flat on its face, the biggest benefit will be the industry experience gained in the R&D and manufacturing of mass-market consumer optical devices (kind of like how Toslink took off super fast and is in drat near every piece of A/V equipment now).

Then when the next-next generation of consumer insanity hits, we shall be ready!

The thing with the optical transceivers is that if you're going to make them in batches of 100,000 for mass consumer level stuff, they get really stupidly cheap. The only really lovely part will be the fact that you need a fusion splicer to make cables.

Jabor
Jul 16, 2010

#1 Loser at SpaceChem

Methylethylaldehyde posted:

The thing with the optical transceivers is that if you're going to make them in batches of 100,000 for mass consumer level stuff, they get really stupidly cheap. The only really lovely part will be the fact that you need a fusion splicer to make cables.

Sounds like a place where Monster can actually have a legitimate market rather than just selling to people who don't know better.

the ohmebaglod flag
Aug 7, 2009

by T. Fine
So while Intel is making a one-chip solution, since normal people don't use most of their graphics adapter, Nvidia is betting on technologies like CUDA, hoping people will now use fast graphics adapters for more than just games.

Someone is going to be really wrong about this.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong
You realize that people just using Intel's onboard graphics has been the norm for nearly a decade now, right? The Sandy Bridge stuff seems to be more about snagging people who do some gaming as well.

Methylethylaldehyde
Oct 23, 2004

BAKA BAKA

fishmech posted:

You realize that people just using Intel's onboard graphics has been the norm for nearly a decade now, right? The Sandy Bridge stuff seems to be more about snagging people who do some gaming as well.

It also lays the groundwork for people who are casual gamers, so Pop Cap can make even prettier games that will work awesomely on the new integrated video chips.

CommieGIR
Aug 22, 2006

The blue glow is a feature, not a bug


Pillbug
I thought Intel said earlier this year or late last year that they were getting out of the GPU market? If so, why the sudden change in direction?

Disgustipated
Jul 28, 2003

Black metal ist krieg

CommieGIR posted:

I thought Intel said earlier this year or late last year that they were getting out of the GPU market? If so, why the sudden change in direction?
They said discrete GPU market, IIRC. Basically they killed Larrabee and that was it.

CommieGIR
Aug 22, 2006

The blue glow is a feature, not a bug


Pillbug

Disgustipated posted:

They said discrete GPU market, IIRC. Basically they killed Larrabee and that was it.

Ah, that makes more sense.

Plorkyeran
Mar 22, 2007

To Escape The Shackles Of The Old Forums, We Must Reject The Tribal Negativity He Endorsed

the ohmebaglod flag posted:

So while Intel is making a one-chip solution, since normal people don't use most of their graphics adapter, Nvidia is betting on technologies like CUDA, hoping people will now use fast graphics adapters for more than just games.

Someone is going to be really wrong about this.
It's not really a question of being wrong; they're actively trying to force things in different directions, not guessing which way things will go.

Spime Wrangler
Feb 23, 2003

Because we can.

Plorkyeran posted:

It's not really a question of being wrong; they're actively trying to force things in different directions, not guessing which way things will go.

It is if they can't get a strong market to form around the architecture they went all-in on.

Personally, I think the AMD approach is actually much more revolutionary. They made a really good move acquiring ATI and getting in-house experience in high-end graphics processor design.

Too bad Nvidia is working them over on the GPGPU software side of things and Intel on the CPU performance side.

Still, I wouldn't be surprised to see ATI become the dominant player in a few years, once they get the bugs worked out of Fusion and someone figures out what the gently caress to do with widespread GPGPU abilities.

Anyone want to take bets on Intel avoiding programmable graphics hardware because they know they would just get their rear end handed to them if they helped the market take off too soon?

ilkhan
Oct 7, 2004

I LOVE Musk and his pro-first-amendment ways. X is the future.

Alereon posted:

It was actually clocked at ~400fps, and that includes decoding, scaling, and encoding simultaneously. Keep in mind also that since this is being done in dedicated fixed-function hardware, there won't be much of a performance impact on the system. You could have a video transcoding in the background while playing a game without the performance of either being impacted. Besides, the current generation of GPU-accelerated encoders look like crap, and most people don't really care if they're just putting videos on their mobile devices or uploading to Youtube.
Oh god, I hope FRAPS decides to take advantage of this. Realtime full-res encoding of FRAPS output to h264 with no performance impact would be awesome; I hadn't even thought of that before. I'm more concerned with no performance impact than with h264; transcoding is fine as long as the quality and speed are there to begin with.
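
For a sense of scale on those encode rates, a quick sketch; the two-hour 30fps source clip is a made-up example, not anything benchmarked here.

```python
# How long a transcode takes at a given encode rate.
# Hypothetical source: a 2-hour clip at 30fps.

duration_s = 2 * 60 * 60
source_fps = 30
total_frames = duration_s * source_fps  # 216,000 frames

for encode_fps in (30, 100, 400):
    minutes = total_frames / encode_fps / 60
    print(f"{encode_fps:>3} fps encode: {minutes:6.1f} minutes")
# 30 fps (realtime): 120.0 min; 100 fps: 36.0 min; 400 fps: 9.0 min
```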

greasyhands
Oct 28, 2006

Best quality posts,
freshly delivered

the ohmebaglod flag posted:

So while Intel is making a one-chip solution, since normal people don't use most of their graphics adapter, Nvidia is betting on technologies like CUDA, hoping people will now use fast graphics adapters for more than just games.

Someone is going to be really wrong about this.

Or maybe they're just two separate market segments?

Lum
Aug 13, 2003

Doc Block posted:

Unfortunately there are a lot of businesses with no plans whatsoever to upgrade beyond XP in the immediate future.

Where I work, we recently picked up a few new business customers after our main competitor dropped support for Windows 2000. Some of them are currently planning their migrations TO Windows XP.

The only reason any of my customers even have anything as fast as a Core2 is because they ran out of spare motherboards for their GX280s. The vast majority are still perfectly happy plodding along on Pentium 4s in the low 2GHz range.

As a result of this, a 2GHz P4 running XP is still the baseline that we have to develop for, including for new software.


As for Sandy Bridge: another loving new socket? It seems like only 5 minutes since Intel brought out 2 new sockets for i3/i5/i7, and now they're obsolete already. I'm glad I decided to sit this round out and just stick a 3GHz C2Quad into my X48 board.

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!

~Coxy posted:

I'm a bit disappointed by the lack of PCI-E lanes and USB 3 support.
Won't the LGA2011 version have plenty of lanes? --edit: A quick google says 40 PCIe 3.0 lanes.

MrBond posted:

I thought Handbrake and x264 shun 2-pass encodes in favor of the "quality based" ones now?
If you need to hit a certain file size, there's no way around two passes.

Combat Pretzel fucked around with this message at 12:03 on Sep 19, 2010
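
The reason a hard file-size target forces two passes: the size budget fixes your average bitrate up front, and only a first pass tells the encoder where to spend that bitrate across easy and hard scenes. A minimal sketch of the budget arithmetic; the 700MB disc, 90-minute runtime, and 128kbps audio are purely illustrative numbers.

```python
# Turn a target file size into the average video bitrate to feed
# both passes of a two-pass encode.

target_mb = 700        # e.g. fitting a CD
duration_s = 90 * 60   # 90-minute video
audio_kbps = 128       # bitrate reserved for the audio track

total_kbits = target_mb * 8 * 1000  # MB -> megabits -> kilobits
video_kbps = total_kbits / duration_s - audio_kbps

print(f"video bitrate for pass 1 and pass 2: {video_kbps:.0f} kbps")  # ~909
```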

Malloc Voidstar
May 7, 2007

Fuck the cowboys. Unf. Fuck em hard.

Combat Pretzel posted:

If you need to hit a certain file size, there's no way around two passes.
Who needs that? The only reasons I know of are 1) Encoding a file onto media with limited space, like a CD or DVD, and 2) Scene rules.

I don't need either of those. And I'm pretty sure encoding to a constant quality and limiting the max bitrate means you don't need 2-pass for device-compatible encoding, too.
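
For reference, the constant-quality-plus-cap approach described above, sketched as a Python wrapper around ffmpeg with libx264 (assumes ffmpeg is installed; the filenames and the exact CRF/VBV values are placeholder choices, not recommendations).

```python
import subprocess

# Single-pass constant-quality encode with a bitrate ceiling:
# -crf sets the quality target, while -maxrate/-bufsize cap the
# peak bitrate so the result stays playable on constrained devices.
cmd = [
    "ffmpeg", "-i", "input.mkv",
    "-c:v", "libx264",
    "-crf", "20",        # constant quality target
    "-maxrate", "2M",    # peak bitrate cap
    "-bufsize", "4M",    # VBV buffer the cap is enforced over
    "-c:a", "copy",      # pass the audio through untouched
    "output.mp4",
]
subprocess.run(cmd, check=True)
```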

Nam Taf
Jun 25, 2005

I am Fat Man, hear me roar!

The day this starts supporting GPGPU is the day I'll blow my load.

Right now, I'm wanting to re-jig some computational fluid dynamics simulations that I have running on the CPU to run on a GPU instead. It would be so stupidly fast it's unbelievable, but I have the major problem of bandwidth to/from storage, because GPUs lack the RAM I need.

If this supports GPGPU and I can use a fast bus to a 12-24GB RAM box, then my problems vanish to a large extent. This will be a wonderful day indeed.

Ryokurin
Jul 14, 2001

Wanna Die?

Spime Wrangler posted:

It is if they can't get a strong market to form around the architecture they went all-in on.


What choice did they have? AMD couldn't afford them, Intel would be a legal nightmare, and VIA's x86 license was questionable up until recently. Better to go all in now, while you still have major influence, than wait until the dust settles.

BangersInMyKnickers
Nov 3, 2004

I have a thing for courageous dongles

Nam Taf posted:

The day this starts supporting GPGPU is the day I'll blow my load.

Right now, I'm wanting to re-jig some computational fluid dynamics simulations that I have running on the CPU to run on a GPU instead. It would be so stupidly fast it's unbelievable, but I have the major problem of bandwidth to/from storage, because GPUs lack the RAM I need.

If this supports GPGPU and I can use a fast bus to a 12-24GB RAM box, then my problems vanish to a large extent. This will be a wonderful day indeed.

I thought GPUs could access system RAM as needed since the AGP days?

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

BangersInMyKnickers posted:

I thought GPUs could access system RAM as needed since the AGP days?
At less than 8GB/sec, since that's the interface bandwidth of PCI-E 2.0 x16.

Ryokurin
Jul 14, 2001

Wanna Die?
and only at the maximum amount set by the AGP aperture size. It was basically meant as a last resort move if the video card's memory was full.
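
That 8GB/sec figure falls straight out of the lane math; here's a small sketch, with PCIe 3.0 (what the LGA2011 parts mentioned earlier carry) included for comparison.

```python
# PCIe bandwidth = lanes x transfer rate x encoding efficiency / 8 bits.
# PCIe 2.0 signals at 5 GT/s with 8b/10b encoding (80% efficient);
# PCIe 3.0 signals at 8 GT/s with the leaner 128b/130b encoding.

def pcie_gb_per_s(lanes, gt_per_s, efficiency):
    return lanes * gt_per_s * efficiency / 8  # gigabits -> gigabytes

print(f"PCIe 2.0 x16: {pcie_gb_per_s(16, 5.0, 8/10):.1f} GB/s")     # 8.0
print(f"PCIe 3.0 x16: {pcie_gb_per_s(16, 8.0, 128/130):.1f} GB/s")  # 15.8

# So sweeping a CFD working set of ~24GB over PCIe 2.0 x16 would take
# roughly 24 / 8.0 = 3 seconds per pass, versus on-card memory
# bandwidth in the 100+ GB/s range on high-end GPUs of the day.
```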

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!

Aleksei Vasiliev posted:

I don't need either of those. And I'm pretty sure encoding to a constant quality and limiting the max bitrate means you don't need 2-pass for device-compatible encoding, too.
If CBR churned out decent results, people wouldn't have invented two-pass modes.

And in regards to the Intel Media Engine, I'm pretty sure it won't hold water against software implementations in terms of encoding quality and complexity.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
Intel is bringing a feature from mainframes to the desktop: software-upgradeable processors. For $50, you can buy a scratch-off code that can be used with the Intel Upgrade Application to enable HyperThreading and an additional 1MB of L3 cache on a Pentium G6951. This seems like a pretty lovely deal, since it basically takes a $100 processor and still doesn't make it as good as a $115 Core i3 processor. If it turned the CPU into a real Core i5 with Turbo and the regular GPU clock speeds, in addition to HT and the rest of the cache, that might be a worthwhile upgrade.

CommieGIR
Aug 22, 2006

The blue glow is a feature, not a bug


Pillbug

Alereon posted:

Intel is bringing a feature from mainframes to the desktop: software-upgradeable processors. For $50, you can buy a scratch-off code that can be used with the Intel Upgrade Application to enable HyperThreading and an additional 1MB of L3 cache on a Pentium G6951. This seems like a pretty lovely deal, since it basically takes a $100 processor and still doesn't make it as good as a $115 Core i3 processor. If it turned the CPU into a real Core i5 with Turbo and the regular GPU clock speeds, in addition to HT and the rest of the cache, that might be a worthwhile upgrade.

...and this is why I'm going to stick with AMD, they haven't tried to pull poo poo like this yet

Malloc Voidstar
May 7, 2007

Fuck the cowboys. Unf. Fuck em hard.

Combat Pretzel posted:

If CBR churned out decent results, people wouldn't have invented two-pass modes.

And in regards to the Intel Media Engine, I'm pretty sure it won't hold water against software implementations in terms of encoding quality and complexity.
CRF + limited max bitrate isn't even close to CBR.

Also CBR looks great with enough bitrate, it's just ridiculously wasteful and should only be used in broadcast/streaming.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

CommieGIR posted:

...and this is why I'm going to stick with AMD, they haven't tried to pull poo poo like this yet

Yeah, AMD just tells you to pay full price for a complete replacement CPU from the same batch that was binned differently.
