|
88h88 posted:How you feel about this will depend on your thoughts about what happens after you die. I'll say one thing though, the thought of someone uploading a bunch of poo poo I hated just to be a oval office would make me haunt their rear end from beyond the grave.

Holy poo poo. That's like the perfect combination to take stupid people's money. Rich people already pay way too much for designer coffins. Mixing them and audiophiles is a match made in heaven.
|
# ? Aug 18, 2013 20:44 |
|
|
|
How many
|
# ? Aug 18, 2013 21:57 |
|
Dexo posted:Holy poo poo. That's like the perfect combination to take stupid people's money.

What's even better is that it quite proudly states that they use a $30 Tripath amp module.
|
# ? Aug 19, 2013 03:21 |
|
And what the hell is a 1600MHz HDD? Do they mean RAM?
|
# ? Aug 20, 2013 00:43 |
|
I stumbled upon a new line of audiophile claims this week. Apparently if you use DirectSound in Windows, the kernel mixer introduces imperfections (i.e. it's not 'bit perfect' due to on-the-fly processing and resampling). They are all using ASIO or WATAPI drivers in exclusive mode instead. They claim the difference is immense, all not control tested of course. Seems kinda impractical, considering if you use ASIO or WATAPI in exclusive mode no other application will be able to play any audio simultaneously! Is this as much bullshit as I suspect?

Edit: It's kinda funny that the ASIO foobar plugin they use explicitly states on the homepage that it will not improve the sound

Iamthegibbons fucked around with this message at 16:29 on Aug 26, 2013
# ? Aug 26, 2013 16:26 |
|
What's funnier is that plenty of sound cards aren't really designed for ASIO, so when you force them to use ASIO drivers, there could actually be more bugs and errors in the audio processing. Not sure if WATAPI is the same or not.
|
# ? Aug 26, 2013 16:49 |
|
Iamthegibbons posted:Is this as much bullshit as I suspect?

On top of that you have WASAPI (WinMM in old rear end versions of Windows), which is a mixer through which all applications channel their audio via the different interfaces like DirectSound, WinMM, XAudio1/2, and so on, which sit on top of it. The WASAPI mixer operates at a set format (default is 44.1kHz, stereo, 16bit, but the user can change it). Anything being played in an application that doesn't match its sampling rate is going to be resampled, and the number of channels is going to be up-/downmixed. Application specific volume levels are applied, tho. WASAPI uses 32bit IEEE floats internally, which are then going to be dithered and converted to whatever bitdepth your output format is set to.

WASAPI has an exclusive mode, which is essentially a pass-through mode to KS, which lets you select a format and play your music bit-perfect. It doesn't do any resampling or channel mixing, nor apply volume changes to the data. It blocks any other audio. If you're playing audio in the same format WASAPI is set to, the volume slider for your application is maxed and nothing else is playing, the audio should be bit-perfect, too, since WASAPI doesn't need to resample or mix channels. I doubt that going to IEEE floats and back to integers will involve any rounding errors.

--edit: That said, I'm not sure about the resampling. I know for sure that it doesn't do that when recording audio. Playback, I don't know. Heard yes, back when WASAPI was introduced. Internet however says no.

Combat Pretzel fucked around with this message at 20:01 on Aug 26, 2013
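Combat Pretzel's point about the float round trip is easy to check yourself. A minimal sketch in plain Python (using `struct` to force 32-bit float precision, mimicking the mixer's internal format): every 16-bit sample survives the int → float32 → int conversion exactly, because any 16-bit integer fits comfortably in float32's 24-bit mantissa.

```python
import struct

def int16_to_f32(sample):
    # Scale the 16-bit sample to [-1.0, 1.0) and force it through a 32-bit
    # float, mimicking the mixer's internal IEEE float format.
    return struct.unpack('<f', struct.pack('<f', sample / 32768.0))[0]

def f32_to_int16(value):
    return int(round(value * 32768.0))

# Every possible 16-bit sample value survives the round trip exactly.
assert all(f32_to_int16(int16_to_f32(s)) == s for s in range(-32768, 32768))
```

So as long as no resampling, mixing or volume scaling happens in between, the float conversion itself introduces no rounding error.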
# ? Aug 26, 2013 19:54 |
|
I use WASAPI in XBMC for movie watching and music listening. It's the recommended option if it works. Directsound with bluray rips sometimes gives me lipsync issues.

Is it IMMENSELY better than Directsound? Well, directsound introduces timing errors and performance issues, which, while watching a movie with a bunch of guests, is completely unacceptable, so I'd say WASAPI is immensely better. Not because of sound quality, but because of performance. I also bitstream everything to the receiver, so I'm usually using high bitrate lossless "HD" audio formats.

Edit: From http://wiki.xbmc.org/index.php?title=Windows_audio_APIs

The best line from below is:

quote:I myself have a dislike of Window's cutesy system sounds happening at 110db

quote:Since Vista SP1, Windows has two primary audio interfaces, DirectSound and Wasapi (Windows Audio Session Application Programming Interface). The latter was a replacement for XP's Kernal Streaming mode.

jonathan fucked around with this message at 20:19 on Aug 26, 2013
# ? Aug 26, 2013 20:14 |
|
Iamthegibbons posted:I stumbled upon a new line of audiophile claims this week. Apparently if you use DirectSound in Windows, the kernel mixer introduces imperfections (i.e. it's not 'bit perfect' due to on-the-fly processing and resampling). They are all using ASIO or WATAPI drivers in exclusive mode instead. They claim the difference is immense, all not control tested of course. Seems kinda impractical, considering if you use ASIO or WATAPI in exclusive mode no other application will be able to play any audio simultaneously! Is this as much bullshit as I suspect?

Other forums have tested bit perfect playback methods by sampling the output from one machine directly into a second machine. It's how they test the audio to see if the output is actually bit perfect or not from the various applications.

The amusing thing is people buying expensive plugins/players like Audirvana, Pure Music, etc. and claiming the sound is better than X bitperfect plugin/player, when they are actually 100% the same, because bit-perfect output is identical no matter what platform/plugin/software you play it back with. That's kind of the point. So on Windows you can grab something like Foobar2000 for free and get the best, or JRiver MC on OSX for $50 instead of $200 or whatever those other ones cost.
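That kind of comparison boils down to hashing the captured output: if two players are both bit-perfect, their raw PCM captures hash identically. A sketch (the file names are hypothetical, and it assumes you've already captured raw PCM dumps of each player's output on a second machine):

```python
import hashlib

def pcm_digest(path, chunk_size=1 << 16):
    """SHA-256 of a raw PCM capture; equal digests mean bit-identical audio."""
    digest = hashlib.sha256()
    with open(path, 'rb') as f:
        while block := f.read(chunk_size):
            digest.update(block)
    return digest.hexdigest()

# Hypothetical captures of the same track from two "audiophile" players:
# pcm_digest('foobar2000_capture.pcm') == pcm_digest('audirvana_capture.pcm')
```

Either the digests match and the $200 player bought you nothing, or they don't and one of the players isn't actually bit-perfect.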
|
# ? Aug 26, 2013 22:45 |
|
Some very informative answers, thanks! So there is something to be said about bit perfect it seems, at least for time-critical applications. As for sound quality, I guess it's difficult to differentiate bit perfect and bit imperfect playback in ABX testing? Usually just playing audio files isn't really time-sensitive. I don't think I ever have timing issues, even on the cheapest sound cards.

Shame that WASAPI exclusive mode playback is still pretty impractical for me. I often have two sources playing audio at once on my PC (black metal just isn't the same without some Tito Puente mixed in). Makes sense for a dedicated media player though!

Iamthegibbons fucked around with this message at 16:45 on Aug 27, 2013
# ? Aug 27, 2013 16:30 |
|
You can certainly see more detail in the recordings comparing different versions of the songs. It's how HDTracks got caught selling an upsampled Metallica album and passing it off as HD. They pulled it saying this is what got sent to them, and gave full refunds to everyone who bought it. Whether or not you can notice it is where this thread comes into play.
|
# ? Aug 27, 2013 17:01 |
|
http://support.microsoft.com/kb/2653312

There can be cases where the Windows sound system does a very poor job of sample rate conversion and adds distortion to the sound, which is a legitimate concern (and a step back, since XP apparently does this just fine). More details at https://www2.iis.fraunhofer.de/AAC/ie9.html

However, I can't say I heard any errors on my stock Windows 7 + IE 9 install, so it may have been rolled into a patch or something. So for high quality playback, changing the sample rate of the sound card or using a high quality converter certainly makes sense, but I really like having a working volume control, so no bit-perfect playback for me.
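For a feel for why on-the-fly sample rate conversion can never be bit-perfect, and why a sloppy converter adds distortion, here's a deliberately naive linear-interpolation resampler (a sketch for illustration only; this is not what Windows uses, and real converters use proper polyphase filters precisely because linear interpolation distorts):

```python
def resample_linear(samples, src_rate, dst_rate):
    """Naive linear-interpolation resampler. Output samples land between
    input samples, so the result is no longer bit-identical to the source."""
    ratio = src_rate / dst_rate
    n_out = int(len(samples) * dst_rate / src_rate)
    out = []
    for i in range(n_out):
        pos = i * ratio
        left = int(pos)
        frac = pos - left
        right = min(left + 1, len(samples) - 1)
        out.append(samples[left] * (1 - frac) + samples[right] * frac)
    return out

# Doubling the rate of a short ramp invents new in-between values:
resample_linear([0.0, 1.0, 0.0, -1.0], 2, 4)
# -> [0.0, 0.5, 1.0, 0.5, 0.0, -0.5, -1.0, -1.0]
```

None of those interpolated values existed in the source, which is the whole "not bit perfect" complaint in a nutshell; whether any of it is audible is a separate question.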
|
# ? Aug 27, 2013 22:10 |
|
Opensourcepirate posted:Getting back to actual science and using the correct hardware in the correct place. Not all RCA outputs are created equally. Speaking of expensive cables and actual science, I'll be spending tomorrow replacing my old RG59 cable TV wire terminated with screw-on F connectors with a remnant of Belden 1694A I got for free and some decent compression fittings. Crossing my fingers that I can get the MER up into the acceptable range for the box in my room to start working again.
|
# ? Aug 29, 2013 22:36 |
|
GWBBQ posted:I really expected a post starting with "Getting back to actual science and using the correct hardware" to at least involve an oscilloscope or multimeter to back up claims about waveforms and voltage levels. In my experience, consumer level equipment almost always outputs unbalanced audio from RCA jacks at -10dBV, and tolerances are looser than professional equipment. Professional equipment almost always outputs balanced audio from TRS, XLR, or captive screw connections at +4dBu, some units have mic/line level switches, and a lot will offer an unbalanced output on RCA jacks at -10dBV.

Other than very short runs, say from a wall to a box, I never understand why cable companies will often wire a house with RG-59. I usually wire up friends' houses with RG-6 or better. RG-59 is only good for wall to box / TV.
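Those two reference levels trip people up because dBu and dBV use different 0 dB references. A quick sketch of the arithmetic (0 dBV = 1 V RMS; 0 dBu = 0.7746 V RMS, i.e. 1 mW into 600 Ω):

```python
import math

def dbu_to_vrms(dbu):
    # 0 dBu is 0.7746 V RMS (the voltage of 1 mW into 600 ohms)
    return 0.7746 * 10 ** (dbu / 20)

def dbv_to_vrms(dbv):
    # 0 dBV is 1 V RMS
    return 10 ** (dbv / 20)

pro = dbu_to_vrms(+4)        # ~1.23 V RMS
consumer = dbv_to_vrms(-10)  # ~0.316 V RMS
gap_db = 20 * math.log10(pro / consumer)  # ~11.8 dB
```

The gap between +4 dBu pro level and -10 dBV consumer level works out to about 11.8 dB, not the 14 dB you'd guess from just subtracting the numbers, because the two scales have different references.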
|
# ? Aug 30, 2013 18:17 |
|
The RG59 was 15+ years old. With new cable and connectors, SNR improved by 20dB (I have to imagine the twist-on F connectors had a lot to do with that) and all the channels that didn't come in are back. Aside from macroblocking and losing frames due to poor signal, can bad SNR and MER degrade picture quality noticeably or is it just placebo effect that everything looks better?
|
# ? Sep 3, 2013 00:13 |
|
Digital streams are either fine, or broken (blocking, wrong vectors warping the image, etc.). There's nothing in-between.
|
# ? Sep 3, 2013 01:09 |
|
He's talking about SNR, which is very much an IF/RF thing. If your SNRs are good, you're going to get better accuracy with the decoding equipment. Bit errors are a real thing, guys. Just because it's "binary" doesn't mean it's just "on/off"; you can get most of the data, some of it, or all of it. That being said, if it's video, it's definitely susceptible to things caused by interference, blocking and lost frames aside; but I'm not an expert at TV and video, I just know that's the case with things like satcom modems.
|
# ? Sep 3, 2013 01:47 |
|
I count bit errors as broken, because it doesn't let the data decompress into what it is supposed to be. lovely SNR on analog means noise. lovely bits mean corrupted macroblocks and wrong motion vectors, both creating broken frames. Which is an issue, considering that several successive frames will be reconstructed from motion vectors, smearing the broken image sections all over the place. So technically, a stream is either correct or not.

I guess the main point is, however, that a higher SNR, beyond one sufficient to read a reliable stream, doesn't make more vivid, crisp or contrasty video. Signal fuckery may achieve that effect on an analog TV signal, but your digital decoder would just go ape poo poo and show a blank image.

That said, digital TV, at least in Europe, is overcompressed to boot. Ringing everywhere.
|
# ? Sep 3, 2013 02:29 |
|
Wasabi the J posted:Bit Errors are a real thing, guys. Just because it's "binary" doesn't mean it's just "on/off"

SNR issues will almost never (in extreme unlikelihood) cause a bit to read flipped, and if one does, the byte it was part of is nonsensical, and the frame the byte helped make up will either block or, depending on cache technology, simply be re-read and delivered normally.* The idea that bit errors can cause subtle errors (or a 'different' image... less black blacks!) is pure audio/videophile nonsense, so don't subscribe to it. I'm not having a go at you, but there are a lot of people (usually trying to sell you something) who come out with what you're saying, and the truth of it is there really are only two states - failed or working. Everything else amounts to "you could be getting a better picture and just don't know it!" which is sales pitch.

* Supported by USB & SCSI, making "this USB cable is better" arguments particularly laughable.
|
# ? Sep 3, 2013 14:59 |
|
Problem is, it may not be obvious whether a bitstream is in fact coming through properly. There may be errors that only show up occasionally, and a marginal digital TV signal may be clear most of the time but break up on scene changes, or be more susceptible to noise causing somewhat random errors to creep in.

I've also had more subtle errors in DVI cables where only pixels of certain dark values would be turned blue. That's an error where it's definitely not clear from looking at a normal screen that there's anything wrong. Even though it's digital, there's still an analog component in there, and if error correction is being used silently, it's possible to have a degraded link but not notice anything most of the time.
|
# ? Sep 3, 2013 15:31 |
|
longview posted:I've also had more subtle errors in DVI cables where only pixels of certain dark values would be turned blue, that's an error where it's definitely not clear from looking at a normal screen that there's anything wrong, even though it's digital there's still an analog component in there and if error correction is being used silently it's possible to have a degraded link but not notice anything most of the time. I had that as well, turned out to be a marginal DVI cable at first, cheap and probably not fully up to DVI spec. Then I upgraded to a 1920x1200 monitor and the error returned. Technically, single-link DVI can just barely deliver 1920x1200 at like 56hz or something, if you have a card that's 100% to spec with no cost-cutting. Most of the time it's OK, but it's one of those times where you're riding right on the edge of what your hardware can do. Upgrading to a dual-link-capable graphics card solved the issue for good. Apart from the blue specks, the picture was perfect, though.
|
# ? Sep 3, 2013 16:02 |
|
Yeah, in my case it went away when I tightened the connector properly, but my point was mostly that HDMI is the same way. No matter what, all electrical and optical signals are transmitted as analog waveforms, so it's always worth striving for the best SNR even though it "looks fine". Looking for the highest possible SNR without overloading the receiver is always what you want for reliable transmission; noise is not constant, and the stronger the signal, the more resistant the link will be to pulses or other EMI problems.

Just because it's digital doesn't mean cables don't matter. That's silly, because cables definitely matter: you can make a USB cable out of whatever and it'll work somewhat, but it's not going to be just as good as a real shielded cable. Some cables have better shielding, like RG-6, which has braid and foil shielding, meaning it's better at rejecting RF intrusion, especially at higher frequencies. This can make a big difference if you live near a transmitter site, since otherwise RF can couple into the cable system and overload the receivers, as well as cause additional intermod in amplifiers. It also has significantly lower loss than RG-59, especially at UHF frequencies, and this can matter if you use a cable modem too, where better SNR translates directly to higher speeds.

But I guess we're just supposed to ignore all that since there's a digital TV signal, it'll either get there reliably every single time or it'll completely not work! And TV systems all use QAM AFAIK, so they're not even binary when transmitted.
|
# ? Sep 3, 2013 18:16 |
|
This is a little ridiculous. You're pointing to DVI signal issues as an example of a digital signal that needs a cleaner signal? Firstly, it's real likely the issues described above were a DVI cable running in analog, VGA mode. Secondly, I'm not sure that DVI has any sort of error correction. When you're talking about issues with one color channel being "off" then that sounds pretty analog to me. https://en.wikipedia.org/wiki/Digital_Visual_Interface#Specifications Digital signals are very sensitive to flipped or missing bits. In the best case scenario error correction fixes it fine, or re-transmits, which someone mentioned. In almost all other scenarios you are going to have significant, obvious artifacts or signal dropouts. You're not going to get a fuzzy picture or loss of a color channel from a bad digital signal. Also, wtf are you talking about "even though it is digital there is an analog component in there"?
|
# ? Sep 3, 2013 19:05 |
|
None of what you were experiencing related to issues of SNR. It's simply orders of magnitude away from being the case. It's somewhat interesting that you reference the DSL scenario in your post, yet still don't appreciate how far off the mark talking about SNR issues in short-run high-grade digital cables is.

Back in ~2003 ADSL hit my student house, of Victorian origin with 50 year old phone cables installed. Despite getting very poor SNR based on:

- 3500 metres of copper wire in dodgy condition between my house and the exchange.
- A couple dozen meters of old copper wire in the house; in one part we found the main line had been chewed down by mice to a few strands of copper.
- Said wire was literally wrapped around the main power line coming into the house.

it worked, and some years later the same cables synced at 6.5Mbps. The connection was bit-perfect nearly all the time. I know this because, wary of the issues, I ran MD5 checks on any large download. I can count on one hand the number of times I had a failure - 3.

DSL connections are incredibly complex, because they split the available frequency range up into multiple bands to deliver enough bandwidth, circumventing frequency limitation issues. Decent routers run a stable bit-perfect connection over a line that will offer 10-20dB of SNR. Or hey, let's talk about my digital TV signal that goes ~56,000 meters through hills and a busy city with all that entails, and arrives at my house all but bit-perfect (my receiver counts errors - it's a single digit number to date) with an SNR that is single digit.

So let's look at USB, whose typical SNR is anywhere between 100 and 120dB. To put it another way, the signal voltage is 100,000+ times that of the noise going through the link, or 10,000 times better than other technologies that can deliver bit-perfect data. It's miles and miles away from being an issue with home cable lengths.
quote:this can make a big difference if you live near a transmitter site since otherwise RF can couple into the cable system and overload the receivers, as well as cause additional intermod in amplifiers If you're getting RF radiation into your home at a level to blow out electronics then you probably have other concerns, like why is my skin peeling and oh god my brain is liquid what help.
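Those orders-of-magnitude comparisons are why digital links fail as a cliff rather than a slope. Textbook digital-communications math makes it vivid: for BPSK in additive white Gaussian noise the bit error rate is 0.5·erfc(√(Eb/N0)). This is a sketch using that standard formula; real TV and USB links use other modulations and coding, but the curve has the same shape:

```python
import math

def ber_bpsk(snr_db):
    """Textbook bit error rate for BPSK in AWGN: 0.5 * erfc(sqrt(Eb/N0))."""
    snr_linear = 10 ** (snr_db / 10)
    return 0.5 * math.erfc(math.sqrt(snr_linear))

for snr_db in (5, 10, 15, 20):
    print(f"{snr_db:>3} dB SNR -> BER ~ {ber_bpsk(snr_db):.2e}")
```

Going from 5 dB to 20 dB of SNR takes the raw error rate from roughly one bit in a couple hundred down to about one in 10^45, which is why extra SNR beyond a working margin buys nothing you could ever see.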
|
# ? Sep 3, 2013 19:22 |
|
Jago posted:This is a little ridiculous. You're pointing to DVI signal issues as an example of a digital signal that needs a cleaner signal? Firstly, it's real likely the issues described above were a DVI cable running in analog, VGA mode. Secondly, I'm not sure that DVI has any sort of error correction. When you're talking about issues with one color channel being "off" then that sounds pretty analog to me.

Nope, DVI-D cable. No friggin way anything analog was coming through that cable. I know how noise on a VGA or analog DVI connection looks on a monitor, and this didn't look like that at all. My best guess is that my video card simply couldn't properly run the DVI output at the frequency needed, although it's not like 1280x1024 is the most demanding resolution out there. Part of the signal got too noisy, and for some reason that made certain very dark colors default to 100% blue. There's no error correction in DVI, though, so I don't know why that would happen; it would have to be something vaguely defective in the graphics card or in the monitor. When I changed over to a DVI-I dual link cable, the problem went away until I upgraded to a 1920x1200 monitor, and then it was back. Changing to a newer, higher-spec dual-link-capable graphics card (GeForce 5900XT to 7600GT) cured the problem.
|
# ? Sep 3, 2013 21:13 |
|
Khablam posted:Yes, it does. In fact, that is the literal definition of it. Sorry, what I was getting at was more along the lines of clarifying that broken data doesn't mean the signal just doesn't work; the implication that I was addressing was the commonly misunderstood "data either works or it doesn't" phrase -- which is true, but rarely to the point the entire data stream is lost. I absolutely agree with everything you said though; I was again, arguing more that you CAN have a broken/hosed up data stream to your device and still get a picture. Jago posted:Also, wtf are you talking about "even though it is digital there is an analog component in there"? "Digital" information through most mediums, including copper, are still analog impulses; those impulses are modulated and demodulated by the equipment, interpreting the analog electrical impulses and determining what the bits should have been from the transmitting source.
|
# ? Sep 3, 2013 22:02 |
|
Khablam posted:The connection was bit-perfect nearly all the time. I know this, because wary of the issues I ran MD5 checks on any large download.
|
# ? Sep 3, 2013 23:52 |
|
I'm on Amazon browsing for a snazzy-looking USB cable for my new keyboard (just for aesthetics), and I came across this review for an $89 flat USB cable.

Steve "Disappointed" Dai posted:I just bought a Meridian Explorer USB DAC, and it is wonderful. It has the meridian house sound which is detailed, musical, and non-fatigue. However the stock USB cable comes with it seems to be average grade, so I figure I can get better sound with a more expensive USB cable. I did some search and seem Wireworld Starlight is a good choice.
|
# ? Sep 4, 2013 00:46 |
|
I was on Head-Fi reading the IEM discovery thread because there's been some interesting new headphones released from KEF and Onkyo and then they started talking about cables and then someone said he was waiting for his cable to properly burn in and then Just when you think there might be a useful, interesting thread boom, right back to the stupidity.
|
# ? Sep 4, 2013 08:25 |
|
grack posted:I was on Head-Fi reading the IEM discovery thread because there's been some interesting new headphones released from KEF and Onkyo and then they started talking about cables and then someone said he was waiting for his cable to properly burn in and then

Head-Fi was the place that turned me onto buying dumb headphones, because I thought they knew what they were talking about and seemed well moderated. I should have known something was up before I got probated for asking whether someone had double blind tested two rigs I was looking at. And when someone was doing a legitimate review of like $1300 worth of equipment (described by some as an entry-level system) with J-Pop.

Wasabi the J fucked around with this message at 08:35 on Sep 4, 2013
# ? Sep 4, 2013 08:32 |
|
Overloading an RF receiver doesn't mean destroying it; it usually means a strong signal which causes interference, for example desensing of the receiver, which will seem to make it "deaf". This can be a concern with poorly shielded cables if part of the run passes near a transmitter; keying a 5W UHF transmitter near receivers can often cause problems in my experience.

As mentioned, TCP data transfers are checksummed; you need to look at how many packets are retried with a network analyzer to say anything. RF induction is a real problem. TV signals tend to be around 60 dBuV, around a mV, and EMI and RFI from fairly normal devices can cause surprisingly large fields to develop. Using unshielded cables for data transmission is virtually unheard of in military applications, in part because of requirements that interference be controlled. And yes, every single digital transmission exists as an analog signal in a transmission line, an optical signal or an RF field in the air.

The fact that you can get a good TV signal from far away means that you bought a good antenna and pointed it the right way, good job. There is a point where it's good enough, but I hate this notion that coat hangers are just as good as coax for transmitting a signal. They're not; in some cases it's fine, but try stringing a 10 meter run of S/PDIF over coat hangers and coax over a fluorescent lamp, and I'd bet one would perform better.
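That 60 dBµV figure is just the standard voltage-decibel relationship; dBµV is 20·log10 of the voltage referenced to 1 µV. A quick sanity check of the arithmetic:

```python
def dbuv_to_volts(dbuv):
    """Convert a level in dBuV (dB relative to 1 microvolt) to volts."""
    return 10 ** (dbuv / 20) * 1e-6

print(dbuv_to_volts(60))   # 60 dBuV is exactly 1 mV
print(dbuv_to_volts(0))    # 0 dBuV is the 1 uV reference itself
```

Every 20 dB is another factor of ten in voltage, so a millivolt-level TV signal doesn't need much induced EMI before the margin starts to matter.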
|
# ? Sep 4, 2013 10:31 |
|
longview posted:There is a point where it's good enough, but I hate this notion that coat hangers are just as good as coax for transmitting a signal, they're not, in some cases it's fine but try stringing a 10 meter run of S/PDIF over coat hangers and coax over a fluorescent lamp, I'd bet one would perform better.

If I transmit 010011011010 and 010011011010 is read at the other end, then one is as good as the other, no matter how many theories you come up with as to how interference can be happening. Like I said, the SNR ratios on short-run cable lengths are so good that you need to cause massive physical damage before it starts impacting the signal. 100dB is a signal-to-noise voltage ratio of 100,000:1. You can find something that increases the noise by 20,000 times the norm and still be absolutely nowhere near causing an error. Your connector can be so dirty that it only conducts 1/1000th of the voltage it's meant to, and voila, you still have a perfect transfer.

RF induction is, again, orders of magnitude below causing an issue with HDMI / USB or any other digital cables. Can it cause interference at the aerial? Yes, of course, because there we're dealing with uV and SNR ratios are strained as it is. In the context of a home, digital cables need to be either faulty or unsupported to cause issues. There's not really two ways about it.
|
# ? Sep 4, 2013 12:28 |
|
Wasabi the J posted:And when someone was doing a legitimate review for like $1300 worth of equipment (described by some as an entry-level system) with J-Pop.

I'd like to know what 'entry level' is defined as. I read all the equipment mags. I love seeing all the cool looking stuff I'd buy if I won the lotto. Most everything in the Euro mags is priced around $10,000 and described as entry level. Nothing in these magazines usually costs less than $5,000 when converted from the English pound. They often compare them to the same manufacturer's other offerings, which are usually four times the price, but this is a steal because it does 90% of what the other thing does at a quarter of the price! Speaker cable advertisements have financing plans available. Seriously.

I would kill for an audio rag that kept a $1,000 ceiling on any and all equipment reviewed.

Philthy fucked around with this message at 15:26 on Sep 4, 2013
# ? Sep 4, 2013 15:24 |
|
It's been suggested that I work for Blue Jeans Cable because all I do is bring them up in this thread. I keep bringing them up because I think their articles are super relevant to this kind of discussion, even if you would never buy their products - either that, or I've just been brainwashed by them.

I know that a lot of people here have never seen an HDMI/DVI signal at the edge of failure, but you start to lose individual pixels or groups of pixels. It is true of course that either the picture is 100% digitally right or it's not, but it's not always clear at first that you're having issues with HDMI when you have weird intermittent problems with a few pixels.

Blue Jeans Cable posted:We tend to assume, when thinking about wire, that when we apply a signal to one end of a wire, it arrives instantaneously at the other end of that wire, unaltered. If you've ever spent any time studying basic DC circuit theory, that's exactly the assumption you're accustomed to making. That assumption works pretty well if we're talking about low-frequency signals and modest distances, but wire and electricity behave in strange and counterintuitive ways over distance, and at high frequencies. Nothing in this universe--not even light--travels instantaneously from point to point, and when we apply a voltage to a wire, we start a wave of energy propagating down that wire which takes time to get where it's going, and which arrives in a different condition from that in which it left. This isn't important if you're turning on a reading lamp, but it's very important in high-speed digital signaling. There are a few considerations that start to cause real trouble:
|
# ? Sep 4, 2013 19:55 |
|
KozmoNaut posted:Nope, DVI-D cable. No friggin way anything analog was coming through that cable. I know how noise on a VGA or analog DVI connection looks on a monitor, this didn't look like that at all. I can't understand how a cable noise issue could cause a specific pattern of video problem, such as dark blue turning black. As I understand it, if you had so much noise that the ADC couldn't reliably tell 1 from 0 anymore, you'd expect to see random flipped bits, which would in turn cause decoder errors, and thus create pixellation and splotches. It would be awfully weird if the noise was only affecting the handful of low-order 1 bits that made the difference between blue and black, but not the high order bits, or the low order blue and red bits, or whatever. Is there something in the wire protocol for DVI that would do that?
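The intuition above - that noise should flip random bits rather than consistently nudge dark values to blue - can be made concrete. How visible a flipped bit is depends entirely on which bit flips (the channel value here is a hypothetical dark level, just for illustration):

```python
def flip_bit(value, bit):
    """Flip a single bit of an 8-bit color channel value."""
    return value ^ (1 << bit)

dark_level = 0x12              # a hypothetical dark channel value
print(flip_bit(dark_level, 0))  # LSB flip: 0x13 (19), visually invisible
print(flip_bit(dark_level, 7))  # MSB flip: 0x92 (146), a dramatic jump
```

Random noise would hit high and low bits alike, so a fault that only ever turns certain dark values full blue points at something systematic in the transmitter or receiver rather than ordinary cable noise.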
|
# ? Sep 4, 2013 21:55 |
|
Blue Jeans Cable posted:We're often asked why that's so bad. After all, CAT 5 cable can run high-speed data from point to point very reliably--why can't one count on twisted-pair cable to do a good job with digital video signals as well? And what makes coax so great for that type of application?

The color channels are all carried on separate twisted pairs. It's totally possible that the blue channel would have issues while the other channels would be fine.

Edit: DVI and HDMI carry video data the same way, if that wasn't clear.
|
# ? Sep 4, 2013 22:01 |
|
Opensourcepirate posted:The color channels are all carried on separate twisted pairs. It's totally possible that the blue channel would have issues while the other channels would be fine. One place I'm more than willing to spend a few extra bucks on cables is Neutrik connectors for cables that see frequent use. The strain reliefs they use beat any other I've seen hands down, and they're typically rated for twice as many insertion/removal cycles as the cheap ones. Part of my job is designing and building high-tech classroom consoles where the cables will typically be plugged in and unplugged several times a day in addition to taking a lot of abuse from being moved around, tripped over, etc. Neutrik rebrands their connectors as Rean and sells them for half the price with nickel plating instead of gold and powder coat, so it's not even that much more.
|
# ? Sep 4, 2013 23:27 |
|
Zorak of Michigan posted:I can't understand how a cable noise issue could cause a specific pattern of video problem, such as dark blue turning black. As I understand it, if you had so much noise that the ADC couldn't reliably tell 1 from 0 anymore, you'd expect to see random flipped bits, which would in turn cause decoder errors, and thus create pixellation and splotches. It would be awfully weird if the noise was only affecting the handful of low-order 1 bits that made the difference between blue and black, but not the high order bits, or the low order blue and red bits, or whatever. Is there something in the wire protocol for DVI that would do that?

So if a block is too degraded to extract the data, then it gets switched to black or blue, or to a lower bandwidth parallel signal if such is supported.
|
# ? Sep 4, 2013 23:37 |
|
Opensourcepirate posted:It's been suggested that I work for Blue Jeans Cable because all I do is bring them up in this thread. I keep bringing them up because I think their articles are super relevant to this kind of discussion, even if you would never buy their products - either that or I've just been brainwashed by them.

Well, you're lambasted for it because you link stuff like this. Now whilst there is some technical merit to the points they're making, they are nowhere near the "problem" that is being talked up in that piece. It's propaganda, designed to make designing good cables seem very difficult, so people are more likely to buy theirs, as they look like they know more than other people, and oh god isn't this complicated, I better trust the experts.

HDMI complicates things much more than it needs to, in large part because it's a huge money-making racket selling people $100 cables when $10 cables will suit 99% of all applications. Elsewhere, the same USB extension cable I bought to carry 1mbit USB 1 speeds kept working at 12mbit USB 2.0 speeds, despite also being 3m longer than the recommended maximum length (that is to say, until I moved house and didn't need it anymore).

Nothing beats Cat5e network cable for an example of how easily one can transmit digital data, though. Originally designed to operate at 10BASE-T speeds, the same cables will operate at 100BASE-TX and 1000BASE-T. Yep, that's 100 times the data on the same cable. HDBaseT is a method of carrying video (10Gbit/s) over Cat5e cables, at up to 100m lengths. Not bad, considering HDMI 1.4 is only 10.2Gbit/s for 15m - both therefore able to carry 4K images at 30fps, for a theoretical future where we actually have 3840x2160p content. FWIW, Cat5e cables operate at 100MHz, some 33% higher than the "severe" 75MHz they talk about poor little HDMI having to deal with (though HDMI can indeed exceed this, it's more a jab at their ridiculous language).

It should tell you something when cable designed in the 80s doesn't need any physical changes to work at many, many times the speed it was designed for.

HDMI cables exist for two reasons:

- To make money off people
- HDCP

The ridiculous need for point-to-point encryption is why we have HDMI, and why we don't see better (cheaper) alternatives like HDBaseT or USB hooking up digital devices.

Khablam fucked around with this message at 10:33 on Sep 5, 2013
# ? Sep 5, 2013 10:29 |
|
|
|
GWBBQ posted:One place I'm more than willing to spend a few extra bucks on cables is Neutrik connectors for cables that see frequent use. The strain reliefs they use beat any other I've seen hands down, and they're typically rated for twice as many insertion/removal cycles as the cheap ones. Part of my job is designing and building high-tech classroom consoles where the cables will typically be plugged in and unplugged several times a day in addition to taking a lot of abuse from being moved around, tripped over, etc. Neutrik rebrands their connectors as Rean and sells them for half the price with nickel plating instead of gold and powder coat, so it's not even that much more. I love Neutrik stuff, their entire line is great, I never knew Rean were a rebrand of theirs. I'm also a fan of Amphenol kit, particularly for stuff like instrument cable plugs as they're just a joy to solder.
|
# ? Sep 5, 2013 14:04 |