  • Locked thread
MOAR
Mar 6, 2012

Death! Put your jacket on or you'll get frostbite!

intheflesh posted:

TL;DR
Is there a very lightweight program that will flip a webcam's video in real-time?

My first reply was pooh, then! The only other one I can think of is "WebcamMax", but I haven't used it.


Airbone Operation
Dec 22, 2007
Tosser
I have been thinking about this for the past few days and I really am curious about it. I had crossfire 5870s and my computer started acting stupid so a lot of hassles occurred and eventually I solved the issue by removing the "secondary" 5870. I installed it solo and it wouldn't work as well. I swapped it into my wife's computer and it works fine. Her computer uses hdmi and mine is dvi though.

I put the 5850 from hers into mine and have that crossfired just fine but it has been bugging me that the card works, it just won't work in my computer crossfired or alone. I did not try crossfire hdmi however, so I do not know if that would work.

negativeneil
Jul 8, 2000

"Personally, I think he's done a great job of being down to earth so far."
I'm thinking about building a new gaming rig now that I've been desperately trying to stretch the capability of my MacBook Pro over the last couple years while I've been travelling. I also happen to finally have an apartment and a proper HDTV.

So I want to have the PC at my desk running on a monitor and also run an HDMI cable across the living room to the TV as a secondary monitor for gaming. I've been looking into getting a videocard that has an HDMI and DVI output and it occurred to me: if I run an HDMI cable from my videocard to my receiver... that's not going to carry audio, is it? My motherboard will have optical, but is there some sort of adapter that combines the optical out and HDMI out on one end and then passes that through to another HDMI cable on the other?

Will this in any way affect my ability to do 3D games on my TV?

Ceros_X
Aug 6, 2006

U.S. Marine

negativeneil posted:

I'm thinking about building a new gaming rig now that I've been desperately trying to stretch the capability of my MacBook Pro over the last couple years while I've been travelling. I also happen to finally have an apartment and a proper HDTV.

So I want to have the PC at my desk running on a monitor and also run an HDMI cable across the living room to the TV as a secondary monitor for gaming. I've been looking into getting a videocard that has an HDMI and DVI output and it occurred to me: if I run an HDMI cable from my videocard to my receiver... that's not going to carry audio, is it? My motherboard will have optical, but is there some sort of adapter that combines the optical out and HDMI out on one end and then passes that through to another HDMI cable on the other?

Will this in any way affect my ability to do 3D games on my TV?

HDMI carries audio and video. If your monitor has an HDMI port then you can get a video card with two HDMI ports and be set. If it only has DVI you can still get one with two HDMI ports, as it's easy to go from HDMI to DVI with a cable.

Space Gopher
Jul 31, 2006

BLITHERING IDIOT AND HARDCORE DURIAN APOLOGIST. LET ME TELL YOU WHY THIS SHIT DON'T STINK EVEN THOUGH WE ALL KNOW IT DOES BECAUSE I'M SUPER CULTURED.

negativeneil posted:

if I run HDMI cable from my videocard to my receiver... that's not going to carry audio, is it?

With any video card made in the past few years, it will. The GPU presents an audio device to the host system, and any sounds played through that audio device go down the HDMI cable with the video signal.

negativeneil
Jul 8, 2000

"Personally, I think he's done a great job of being down to earth so far."
awesome. thanks for the fast answers!

Ra-amun
Feb 25, 2011
I built a computer a few weeks ago, so it's still pretty new, but I've come across an annoying issue since I added another hard drive a couple days ago. I got it from somebody in SA-Mart and it works great aside from the weird problem.

Anyways, whenever I hit the switch to turn on the computer, it powers on, spins its fans, and the phase LEDs light up; as soon as it does all this, however, it shuts off once before powering up again to finally load Windows. It also starts the computer up again if I shut down using Windows, and I have to hit the switch on the PSU to actually keep it off.

I may have jostled some cords around and that could be the issue but I just wanted to know what's up and if it's either a general problem or if the hard drive is giving me the trouble.

RocketSurgeon
Mar 2, 2008
I had sort of the same problem when I overclocked my 2500k. I tried updating the bios, double checking every setting but it still happened.

After about a month or two it resolved itself somehow and now my computer boots just fine.

I suspected it might have been the PSU, but a 620W Seasonic should be more than enough to power the bunch (1 HDD, Radeon 6850, 3 fans).

Rexxed
May 1, 2010

Dis is amazing!
I gotta try dis!

Ra-amun posted:

I built a computer a few weeks ago, so it's still pretty new, but I've come across an annoying issue since I added another hard drive a couple days ago. I got it from somebody in SA-Mart and it works great aside from the weird problem.

Anyways, whenever I hit the switch to turn on the computer, it powers on, spins its fans, and the phase LEDs light up; as soon as it does all this, however, it shuts off once before powering up again to finally load Windows. It also starts the computer up again if I shut down using Windows, and I have to hit the switch on the PSU to actually keep it off.

I may have jostled some cords around and that could be the issue but I just wanted to know what's up and if it's either a general problem or if the hard drive is giving me the trouble.

This is probably not it (as the rebooting to check overclock settings on boot is a likely cause), but double check that the power and reset switches aren't getting pressured somehow. I had an old case where the reset switch would randomly press itself, so the PC would randomly reboot while I was using it. That one was tough to figure out.

Boten Anna
Feb 22, 2010

What are the good midrange video cards right now, like ~$200? Is there anything that would be worth upgrading to from a GTX260? I'm really only interested in nVidia cards.

MOAR
Mar 6, 2012

Death! Put your jacket on or you'll get frostbite!

Boten Anna posted:

What are the good midrange video cards right now, like ~$200? Is there anything that would be worth upgrading to from a GTX260? I'm really only interested in nVidia cards.

Take a look at GTX560 - should be just under 200 - step up from 260 - check your power supply can handle it.

Otherwise GTX460 is the one I would go for, I think that's a really good card for the cash.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

MOAR posted:

Take a look at GTX560 - should be just under 200 - step up from 260 - check your power supply can handle it.

Otherwise GTX460 is the one I would go for, I think that's a really good card for the cash.

A GeForce 460 is a very tricky card to get exactly right - you need the 1GB RAM, 336-shader, 256-bit memory bus version. If all three of those factors aren't in place, the Radeon 6850 is the superior performer, and it doesn't have any sneaky variations.

For $200, though, you could get a full MSI or EVGA 560 Ti, after the $30 rebate.
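That three-factor check on the 460 can be written down as a quick filter. This is purely an illustration; the listing entries below are simplified stand-ins for real store pages, not actual SKU data:

```python
def is_full_gtx460(vram_gb, shaders, bus_bits):
    """The 'good' GTX 460 variant from the post: 1GB RAM, 336 shaders, 256-bit bus."""
    return vram_gb == 1.0 and shaders == 336 and bus_bits == 256

# Simplified stand-ins for store listings (illustrative, not real product data).
listings = [
    {"name": "GTX 460 SE",    "vram_gb": 1.0,  "shaders": 288, "bus_bits": 256},
    {"name": "GTX 460 768MB", "vram_gb": 0.75, "shaders": 336, "bus_bits": 192},
    {"name": "GTX 460 1GB",   "vram_gb": 1.0,  "shaders": 336, "bus_bits": 256},
]

good = [card["name"] for card in listings
        if is_full_gtx460(card["vram_gb"], card["shaders"], card["bus_bits"])]
print(good)  # ['GTX 460 1GB']
```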

Boten Anna
Feb 22, 2010

The 560 Tis look pretty nice (200 isn't really an upper limit, it's just that $2xx is what I'm considering). What are the main things to consider nowadays for a good card? Number of shaders? Shader clock? Are those better to look at than the core clock?

Srebrenica Surprise
Aug 23, 2008

"L-O-V-E's just another word I never learned to pronounce."
You can't compare specifications across entire architectures. Just look at the cards. The only times those factors come into play is when nVidia is being nVidia and repeatedly branding different cards with the same name, in which case they're the only way to figure out which is which.

Space Gopher
Jul 31, 2006

BLITHERING IDIOT AND HARDCORE DURIAN APOLOGIST. LET ME TELL YOU WHY THIS SHIT DON'T STINK EVEN THOUGH WE ALL KNOW IT DOES BECAUSE I'M SUPER CULTURED.

Boten Anna posted:

The 560 Tis look pretty nice (200 isn't really an upper limit, it's just that $2xx is what I'm considering). What are the main things to consider nowadays for a good card? Number of shaders? Shader clock? Are those better to look at than the core clock?

Comparing details like the number of shaders or clock speeds across different architectures is close to meaningless. Applications interact with graphics hardware at a very high level through APIs like DirectX and OpenGL; the low-level implementation which translates geometry and bitmaps to a pretty picture on your screen actually varies dramatically between manufacturers and generations. About the only useful comparison you can make based on numbers like clockspeed and shader count is that, within the same architecture, more shaders and higher clocks are better - but even then, the effect on performance isn't necessarily easy to predict.

Marketing departments like to throw big numbers around, but the only thing that really matters to you as an end user is performance in the games or applications you want to run. Look up benchmarks on respected sites like Anandtech and Tech Report, and make a decision based on that.
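To make the "look at benchmarks, not spec sheets" advice concrete: reviewers usually boil per-game results down to one number with a geometric mean of per-game ratios against a baseline card. A small Python sketch with made-up FPS figures (the card names and numbers are illustrative, not real benchmark data):

```python
from math import prod

# Hypothetical average-FPS results across a few games (illustrative numbers only).
benchmarks = {
    "Game A": {"card_x": 60.0, "card_y": 75.0},
    "Game B": {"card_x": 45.0, "card_y": 50.0},
    "Game C": {"card_x": 90.0, "card_y": 99.0},
}

def relative_performance(results, baseline, contender):
    """Geometric mean of per-game FPS ratios: contender vs. baseline."""
    ratios = [game[contender] / game[baseline] for game in results.values()]
    return prod(ratios) ** (1 / len(ratios))

speedup = relative_performance(benchmarks, "card_x", "card_y")
print(f"card_y is {speedup:.2f}x card_x on average")
```

The geometric mean is the standard choice here because it keeps one outlier game from dominating the average the way an arithmetic mean of ratios would.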

Agrikk
Oct 17, 2003

Take care with that! We have not fully ascertained its function, and the ticking is accelerating.
I have recently come into a set of 5 300GB 2.5" SAS drives (15mm tall) and I'm wondering if there exists some kind of bay that I can mount them all in for a speedy little array in a desktop.

I'm thinking something like this but for 2.5" drives and that handles hot-swap SAS drives.

My google skills are failing me at the moment and I can't find anything other than this one that holds six laptop drives and won't fit the 15mm SAS drive. Anyone have a suggestion?

Agrikk fucked around with this message at 00:16 on Mar 11, 2012

spasticColon
Sep 22, 2004

In loving memory of Donald Pleasance
Well, my main system has so far survived the meltdown of my GTX460. The problem is that the 8800GT I put in there is semi-broken: it only flakes out when I try to run a game on it. I submitted an RMA request to MSI, so is MSI's customer service decent, or am I going to be waiting a while for a replacement card?

Triikan
Feb 23, 2007
Most Loved
How similar do machines have to be to share a network boot image? I have a couple computers I'll be using as HTPCs once I upgrade my current gaming computer, and I'd like to run them diskless. Both are Core 2 Duos, one a 6600, one an 8400. Motherboards are different brands, but video cards will be identical.
These will be run off a Dell PowerEdge with an i5-based quad-core Xeon (first generation i5) with 4GB of RAM, with a RocketRAID card and 4x1.5TB drives.

I was planning on running everything on Windows, but it seems there's more documentation on running machines diskless with Linux.
This guy has a great setup, but I was hoping for similar results, just with more of a hodgepodge setup.
http://kentonsprojects.blogspot.com/2011/12/lan-party-house-technical-design-and.html

Triikan fucked around with this message at 04:08 on Mar 12, 2012

flyboi
Oct 13, 2005

agg stop posting
College Slice
How long does it take for new models of Dell to reach the outlet section? I really want one of those Alienware X51s for my HTPC but I don't want to drop $700 on it. Yes, I'm aware it's Alienware but it's one of two HTPCs I can find that will work for my setup the other being the Asrock Vision 3D which is $1000. Also it's Asrock so no.

And before someone says it: Mac Mini will not work because I need a NVidia GPU for what I want.

However, if you know of a different platform that has an NVidia GPU and preferably a Sandy Bridge CPU I'm all ears.

Triikan
Feb 23, 2007
Most Loved

flyboi posted:

How long does it take for new models of Dell to reach the outlet section?

If it's reliable and there are no manufacturing defects in the first shipments, a while. If it's slightly less reliable or there are manufacturing hiccups, very quickly.

Autarch Kade
May 6, 2008
I recently installed a 7970 graphics card. I'm running an Eyefinity setup of three 30" monitors. Two I connect using a mini displayport to displayport cable, and the third is a DVI connection. The problem I have is that when I turn on the PC with the two mini displayport -> displayport cables plugged into the graphics card, nothing is displayed and the computer will not boot. Without those two cables in, and just the DVI connected, it boots fine. Everything works fine once it begins to boot, which leaves me unplugging and replugging in the two cables every time I power on the machine.

Is this a known issue, and is there a fix for it?

real_scud
Sep 5, 2002

One of these days these elbows are gonna walk all over you

Autarch Kade posted:

I recently installed a 7970 graphics card. I'm running an Eyefinity setup of three 30" monitors. Two I connect using a mini displayport to displayport cable, and the third is a DVI connection. The problem I have is that when I turn on the PC with the two mini displayport -> displayport cables plugged into the graphics card, nothing is displayed and the computer will not boot. Without those two cables in, and just the DVI connected, it boots fine. Everything works fine once it begins to boot, which leaves me unplugging and replugging in the two cables every time I power on the machine.

Is this a known issue, and is there a fix for it?
I think something weird is going on with your system, because while I don't have 3 monitors hooked up to mine, I definitely have one hooked up via DP and the other using DVI, and my computer boots fine.

Try not having one of the monitors hooked up and see if that fixes it, but you may want to try getting support from the AMD/ATI side of things.

Rukus
Mar 13, 2007

Hmph.

flyboi posted:

How long does it take for new models of Dell to reach the outlet section? I really want one of those Alienware X51s for my HTPC but I don't want to drop $700 on it. Yes, I'm aware it's Alienware but it's one of two HTPCs I can find that will work for my setup the other being the Asrock Vision 3D which is $1000. Also it's Asrock so no.

And before someone says it: Mac Mini will not work because I need a NVidia GPU for what I want.

However, if you know of a different platform that has an NVidia GPU and preferably a Sandy Bridge CPU I'm all ears.

What exactly is your setup where it wouldn't be more practical (and probably cheaper) to just build one yourself?

flyboi
Oct 13, 2005

agg stop posting
College Slice

Rukus posted:

What exactly is your setup where it wouldn't be more practical (and probably cheaper) to just build one yourself?

It's going on my TV and it needs to work with WMC. From what I've priced out, the cost is $500-600 for parts to "roll my own."

The hard part is I get my cable through WMC, and there's this bug with how some content providers suck at encoding their content: it flaps between progressive and interlaced. Without the right GPU you lose frames on playback and it looks like poo poo. The only GPUs known to not have this issue are NVidia GPUs.

Pretty much all the "HTPC" cases won't work because they either don't have an IR port that is WMC compatible or a 3.5" bay to accommodate one. I need the IR port so my Harmony remote can control it.

The ones that do have a 3.5" bay are ridiculously large and would look silly on my TV stand. I found the Antec Minuet 350 case which would work and not be overly hideous, but with the cost of parts being just under the price of a discounted Alienware X51 it seems stupid to build something that's an eyesore. Also when you get into these tiny cases such as the Antec, you're dealing with proprietary power supplies and my history with Antec case supplies has been unfavorable.

Rukus
Mar 13, 2007

Hmph.
Since you're using a 3.5" IR receiver, you could just use a 5.25" to 3.5" panel converter. That'll give you a few more options for finding a nicer-looking case.

For example, pair that up with a case like this: http://www.newegg.com/Product/Product.aspx?Item=N82E16811163166, which has enough room for an 11" video card as well as a standard ATX PSU. It meets all the criteria you need, and it's only two inches taller than that Antec one.

Thwack!
Aug 14, 2010

Ability: Shadow Tag
Ever since I cleaned the dust off my PC, I've been getting this awful low buzzing sound coming from my audio output. At first I thought it was my speakers acting up, but after I hooked my headphones up to the audio output ports (back and front) on the PC, I realized that this buzzing sound is coming from my onboard sound card. Is there any way to fix it? Thanks in advance.

movax
Aug 30, 2008

Thwack! posted:

Ever since I cleaned the dust off my PC, I've been getting this awful low buzzing sound coming from my audio output. At first I thought it was my speakers acting up, but after I hooked my headphones up to the audio output ports (back and front) on the PC, I realized that this buzzing sound is coming from my onboard sound card. Is there any way to fix it? Thanks in advance.

Could perhaps be a ground loop, or just poor EMI shielding/design on the motherboard. Google for 'PC ground loop' or similar to get an idea of what that problem could entail.

BusinessWallet
Sep 13, 2005
Today has been the most perfect day I have ever seen
Last week, before all the iPad hardware specs were released, it was speculated that the new iPad would have 512 MB of RAM, the same as the iPad 2. Some folks around my office were discussing why Apple would release the iPhone 4S and not increase the RAM from the previous model, and why they would release the new iPad without updating the memory as well. They came to the conclusion that adding memory would decrease battery life in the devices, because RAM uses battery power. That makes no sense to me, so I argued that added RAM would increase battery life because it takes load off the CPU and other resources, causing the device to run more efficiently, but they would not relent.

I want to know if anyone has insight about this, because I am genuinely curious. I know that in a standard laptop, not a tablet, if you add RAM it is generally accepted that this will increase battery life because it takes load off the CPU and hard drive, so I don't understand how it would be different for mobile devices like iPhone or iPad.

Eletriarnation
Apr 6, 2005

People don't appreciate the substance of things...
objects in space.


Oven Wrangler

BusinessWallet posted:

Last week, before all the iPad hardware specs were released, it was speculated that the new iPad would have 512 MB of RAM, the same as the iPad 2. Some folks around my office were discussing why Apple would release the iPhone 4S and not increase the RAM from the previous model, and why they would release the new iPad without updating the memory as well. They came to the conclusion that adding memory would decrease battery life in the devices, because RAM uses battery power. That makes no sense to me, so I argued that added RAM would increase battery life because it takes load off the CPU and other resources, causing the device to run more efficiently, but they would not relent.

I want to know if anyone has insight about this, because I am genuinely curious. I know that in a standard laptop, not a tablet, if you add RAM it is generally accepted that this will increase battery life because it takes load off the CPU and hard drive, so I don't understand how it would be different for mobile devices like iPhone or iPad.

If adding memory caused the device to swap into virtual memory less, maybe. I don't think iPads swap, though, so the power delta going from 512MB to 1GB would depend purely on the number of chips involved and how much power each uses. That is, they may just be using twice as many chips, but they may instead be using newer ones with higher density and/or lower voltage. I don't feel like it's going to be a consequential difference in any case.

Eletriarnation fucked around with this message at 16:33 on Mar 13, 2012

Space Gopher
Jul 31, 2006

BLITHERING IDIOT AND HARDCORE DURIAN APOLOGIST. LET ME TELL YOU WHY THIS SHIT DON'T STINK EVEN THOUGH WE ALL KNOW IT DOES BECAUSE I'M SUPER CULTURED.
RAM (especially DRAM) does use power of its own. For a typical laptop, the draw is close to negligible. For the iPad, though, you're dealing with a device that's supposed to squeeze 9-10 hours of runtime out of a battery that's small by laptop standards, and that big bright IPS panel is already pulling a lot of juice. Electron-pinching budgets are important in hardware like the iPad.

And, well, RAM isn't a magical CPU-workload-reducing potion. Used intelligently, it can decrease CPU workload, because it lets you cache the result of work and avoid doing it twice. But, if an iPad app needs a big static cache, it's got another place to shove things. Every iPad comes with at least 16 gigs of Flash built right in, and every app understands that it needs to store persistent information. Suspended apps can be and frequently are purged from RAM if something else needs the space, and it's the app's responsibility to resume back to where it was. Memory management in iOS is not like memory management in Windows; everything in iOS is set up to work in an environment with limited memory and power.

Despite all that, though, I doubt power has been the single overriding concern that's kept memory on iDevices low compared to a lot of competitors. Packaging concerns are another big one; there are a lot of advantages to the PoP design, but you can't exactly just throw another chip on the PCB if you want to increase the RAM. And, of course, there's simple cost reduction. Meanwhile, the only developers who are clamoring for increased memory are the games and a few specialist-app people; Yelp isn't ever going to use more than 512 megs. So, the only benefit to increased memory for most users is that suspended apps come back a little faster. It's not hard to understand why they might make the decisions they have.
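The suspend-and-purge behavior described above can be sketched as a toy model. To be clear, this is an illustration of the policy, not Apple's actual implementation; the MemoryManager class and the app sizes are invented:

```python
from collections import OrderedDict

class MemoryManager:
    """Toy model: suspended apps get purged least-recently-used first when
    RAM runs low, and a purged app must restore its own state on relaunch."""

    def __init__(self, ram_mb):
        self.ram_mb = ram_mb
        self.resident = OrderedDict()  # app name -> MB used, in LRU order

    def used(self):
        return sum(self.resident.values())

    def launch(self, name, mb):
        # Purge the least-recently-used suspended apps until the new one fits.
        while self.resident and self.used() + mb > self.ram_mb:
            purged, _ = self.resident.popitem(last=False)
            print(f"purged {purged}")
        self.resident[name] = mb

mm = MemoryManager(ram_mb=512)
mm.launch("Mail", 200)
mm.launch("Safari", 250)
mm.launch("Game", 300)  # doesn't fit alongside the others, so both get purged
```

The point of the model is that the OS, not the amount of RAM, guarantees forward progress: a big foreground app simply evicts suspended ones, and the cost of less RAM is slower app switching rather than failure.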

BusinessWallet
Sep 13, 2005
Today has been the most perfect day I have ever seen
I guess I don't understand what you mean. My co-workers said that they didn't increase the RAM in the iPhone because it would have decreased battery life. I argued that was not true. Who was right?

Nintendo Kid
Aug 4, 2011

by Smythe
There's really not much reason to believe that upgrading the RAM would have had a noticeable decrease in battery life. I'd bet anything the actual reason it wasn't upgraded was so Apple could maintain a slightly higher profit margin.

Space Gopher
Jul 31, 2006

BLITHERING IDIOT AND HARDCORE DURIAN APOLOGIST. LET ME TELL YOU WHY THIS SHIT DON'T STINK EVEN THOUGH WE ALL KNOW IT DOES BECAUSE I'M SUPER CULTURED.

BusinessWallet posted:

I guess I don't understand what you mean. My co-workers said that they didn't increase the RAM in the iPhone because it would have decreased battery life. I argued that was not true. Who was right?

It's impossible to know for sure. Apple engineers sat down and weighed a bunch of pros and cons: power consumption, design complexity, cost, user experience, and so on. Power definitely would have been a factor, but unless you happen to know senior people at Apple with uncharacteristically loose lips or have bugs in Cupertino, you can't really say whether it was the deciding factor.

BusinessWallet
Sep 13, 2005
Today has been the most perfect day I have ever seen
Haha, I'm not asking about Apple's reasoning at all, I'm just asking for clarification. My co-workers argued that RAM decreases battery life; I argued that it does not.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

BusinessWallet posted:

Haha, I'm not asking about Apple's reasoning at all, I'm just asking for clarification. My co-workers argued that RAM decreases battery life; I argued that it does not.

More RAM chips consume more power. Larger RAM chips do not.

movax
Aug 30, 2008

BusinessWallet posted:

Haha, I'm not asking about Apple's reasoning at all, I'm just asking for clarification. My co-workers argued that RAM decreases battery life; I argued that it does not.

DRAM draws more power because it must be continually refreshed; each bit can be thought of as an individual capacitor (on a huge scale, of course). This is cheap and easy to make, but it draws much more power than something like SRAM.

The more of those cells you have, the more power it will likely draw, but that's a gross oversimplification; the operating voltage, clock frequency, and density can all affect this as well.
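A back-of-the-envelope sketch of that point: if background (refresh plus standby) power scales with the number of chips, then doubling capacity by adding chips costs more power than doubling it with denser parts. The milliwatt figures below are made-up placeholders, not datasheet numbers:

```python
def dram_background_power_mw(num_chips, per_chip_mw):
    """Toy model: background (refresh + standby) power scales with chip count,
    since each chip refreshes its own cells independently."""
    return num_chips * per_chip_mw

# Doubling capacity by doubling the chip count doubles background power...
old_config = dram_background_power_mw(num_chips=2, per_chip_mw=15.0)
doubled_chips = dram_background_power_mw(num_chips=4, per_chip_mw=15.0)

# ...while doubling capacity with denser, newer chips costs far less
# (assuming the denser part draws only modestly more per chip).
denser_chips = dram_background_power_mw(num_chips=2, per_chip_mw=18.0)

print(old_config, doubled_chips, denser_chips)
```

Under these made-up numbers the chip-count route doubles the budget while the density route adds only 20%, which is the shape of the trade-off even though the real figures depend entirely on the parts chosen.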

Mechanical Fiend
Sep 5, 2011

How'd that asshole get a white robe?!

-edit- misread thread title, sorry!

Mechanical Fiend fucked around with this message at 05:05 on Mar 14, 2012

Hadlock
Nov 9, 2004

I saw notable decreases in battery life when I doubled the RAM in my last three laptops. My PowerBook G4 went from 5 hrs down to 4:20, my Dell laptop went from 2 to 1:40, and my netbook went from 4:30 down to 3:55. Ish.

Endymion FRS MK1
Oct 29, 2011

I don't know what this thing is, and I don't care. I'm just tired of seeing your stupid newbie av from 2011.
I have a probably dumb question. I have an old PNY class 2 micro SD card and adapter. I am planning on buying a 16gb Pretect class 10 Micro SD card (no adapter):

http://www.newegg.com/Product/Product.aspx?Item=9SIA0KV05A0668

My question is, does the adapter affect how fast the card writes? Like, would my current adapter limit the class 10 speed of the one I want? Or are all adapters the same and the speed is purely in the micro SD card?


Rexxed
May 1, 2010

Dis is amazing!
I gotta try dis!

Endymion FRS MK1 posted:

I have a probably dumb question. I have an old PNY class 2 micro SD card and adapter. I am planning on buying a 16gb Pretect class 10 Micro SD card (no adapter):

http://www.newegg.com/Product/Product.aspx?Item=9SIA0KV05A0668

My question is, does the adapter affect how fast the card writes? Like, would my current adapter limit the class 10 speed of the one I want? Or are all adapters the same and the speed is purely in the micro SD card?

From this website: http://www.camerahacker.com/Digital/Inside_miniSD_Adapter.shtml it appears that the adapters are just pass-throughs with no circuitry. Therefore, the adapters should work passively with all Micro SD cards, without affecting speed.
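For reference, the SD speed class is a guaranteed minimum sequential write speed in MB/s (Class 2 means at least 2 MB/s, Class 10 at least 10 MB/s), and a passive adapter shouldn't change that. A quick sketch of what the difference means in practice:

```python
def min_write_mb_s(speed_class):
    """An SD speed class N guarantees at least N MB/s sequential write."""
    return float(speed_class)

def seconds_to_fill(card_gb, speed_class):
    """Worst-case time to write the whole card at the guaranteed minimum
    rate (treating 1 GB as 1000 MB for simplicity)."""
    return card_gb * 1000 / min_write_mb_s(speed_class)

# Filling a 16 GB card at the guaranteed minimum rate:
class2_seconds = seconds_to_fill(16, 2)    # 8000 s, roughly 2.2 hours
class10_seconds = seconds_to_fill(16, 10)  # 1600 s, under half an hour
print(class2_seconds, class10_seconds)
```

Note these are floor guarantees for sequential writes only; real cards can be faster, and random-write performance is a separate question the class rating doesn't cover.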
