|
Post by ragan on Dec 22, 2016 11:35:15 GMT -6
Interesting. What kind of distances would be needed to manifest that loss in a balanced cable? OK, let's look at a few things here. I took the specs for Belden Brilliance 8451 mic cable and simulated them: 14.1 ohms per 1000ft = 0.0141 ohms per foot, 170nH per foot, 67pF per foot worst case, 45R characteristic impedance (although for this I did NOT model the transmission line, nor a balanced/differential setup, just a simple lumped LCR sim). OK, so let's get real-world here and assume a 200R source like a mic would have, and a 1200R load like a preamp might have. That makes the -3dB point more like 15.5MHz. So now let's start assuming longer runs of cable. I'm going to just multiply the 1ft values by the length for the sake of this discussion, even though that's not really how it works in real life and those parasitics would not scale like that:
10ft: -3dB at 1.5MHz
20ft: -3dB at 750KHz
60ft: -3dB at 240KHz, -1dB at 125KHz
100ft: -3dB at 150KHz, -1dB at 77KHz
150ft: -3dB at 100KHz, -1dB at 50KHz
200ft: -3dB at 70KHz, -1dB at 36KHz
250ft: -3dB at 58KHz, -1dB at 30KHz
300ft: -3dB at 47KHz, -1dB at 25KHz
Starting to see the trend? It's semi-log, since we're dealing with unmatched filters. This will change with every source and receiver! If these were matched source/receiver, you'd extend your frequency response by almost double if the system was matched at 200R. However, you'd almost HALVE it by matching at 600R. In layman's terms, the higher the impedance match, the more "effective" the parasitic filter is. This is important to know. So you see, it's NOT just as simple as using a "good cable". The impedance match matters greatly as well, but since there is NO standard for mic/line impedance matching, you're pretty much just guessing. So for a 200R mic, like most dynamics are, and a 1200R input like some of the classic Neves have, you can get out over 300ft without too much trouble using a median-spec'd cable. HOWEVER..
As I mentioned above, if you're using line-level equipment with 2K-50K impedance, your distances are going to be greatly reduced. As with mic terminations, there is NO standard for line-level equipment either. Most manufacturers use somewhere between 5K and 50K, but that can greatly change the characteristic of your cable. So I've said it before, and I'll say it again: the cable is one of the least important parts of the interconnection of the system. The MOST important part is the knowledge of what you're doing, and why you're doing it. You could use the cheapest cable in the world if you knew that you'd have 500KHz of bandwidth going 10ft at 200R. You could use the best cable in the world but still ruin your signal by leaving only 10KHz of bandwidth going 30ft between line-level I/O at 50K impedance. So use this info, buy what you want, but don't be fooled into thinking that the cable is the only thing that matters. What you're doing is creating a system, and the only thing that matters is that the system works well together, regardless of brand, specs or cost. I think you'll find that normal 10-100ft runs of cable will be fine with just about any commercial mic cable. The interconnects matter more, mainly for reliability. If I had to spend money on cable, I'd worry more about the line-level cables: I'd keep them as short as possible first, then buy better cables second. However, if you trawl eBay, you can find used lengths of "name brand" cables/snakes for pennies on the dollar, if you don't mind soldering.

Wow. Thanks for typing all that up. I think I get the general point, but most of that is frankly way over my head! I wish it weren't. I need to bone up on this stuff.
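[Editor's note: a minimal Python sketch of the trend in svart's table, reducing the cable to a single first-order RC: the source and load in parallel driving the run's total shunt capacitance. It ignores the series R and L his sim included, so it lands near, but not exactly on, the quoted values.]

```python
import math

# Belden 8451 worst-case capacitance, farads per foot (from the post above)
C_PER_FOOT = 67e-12

def f_3db(source_ohms, load_ohms, feet):
    """Approximate -3 dB corner of a first-order RC: source and load
    resistances in parallel, driving the run's total shunt capacitance."""
    r_eff = (source_ohms * load_ohms) / (source_ohms + load_ohms)
    return 1.0 / (2 * math.pi * r_eff * C_PER_FOOT * feet)

# 200R mic into a 1200R preamp input, at a few of the lengths above:
for feet in (10, 100, 300):
    print(f"{feet:>4} ft: {f_3db(200, 1200, feet) / 1e3:,.0f} kHz")
# roughly 1.4 MHz, 139 kHz, and 46 kHz -- the same semi-log trend
```

Doubling the effective resistance (e.g. a higher-impedance match) halves each corner frequency, which is the "higher match makes the parasitic filter more effective" point in prose form.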
|
|
|
Post by svart on Dec 22, 2016 12:07:21 GMT -6
Pretty cool, Svart. I remember more than a decade back, I ordered a digital cable from The Cable Company (Acoustic Zen). They allow people to try a few cables and return them, putting the small cost of them sending the demo cables toward the purchase. I vaguely recall the main guy there mentioning that impedance was much more important than most people are aware of. I believe the cable I chose was 110 ohm, but it definitely sounded better than the 75 ohm cable. I know very little about the electrical engineering, so forgive me if my terminology is wrong, but I did hear a difference, and Svart is probably on to something.

It's not quite the same thing, but it is related. Cable itself doesn't have an impedance; it requires a source and receiver impedance to assume its own impedance... that system thing is important. However, what it does have is a characteristic impedance, which in simple terms is the system impedance at which the cable works best. If the cable has a higher characteristic impedance, then it might work slightly better in a mismatched environment, if that characteristic impedance is closer to the source impedance. There is also the fact that cables built for higher impedance will also have less lossy parasitic elements, but those don't necessarily mean anything on their own. So let's sim a foot of cable using the same source and receiver impedances, but with a transmission-line characteristic impedance of 45R, with the same parasitic specs as the Belden 8451. I get -3dB at 10.5MHz. Now, let's move that to 110R impedance: that -3dB point becomes 12.5MHz, roughly a 17% difference. A "large" difference, yes, but still almost 4 orders of magnitude higher than absolutely necessary.
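[Editor's note: the characteristic impedance svart keeps referring to falls out of the per-foot parasitics directly. A sketch using the standard lossless-line approximation Z0 ≈ sqrt(L/C) with the Belden 8451 figures quoted earlier; the worst-case capacitance spec pulls the estimate a little above the 45R used in the sims.]

```python
import math

# Belden 8451 per-foot parasitics from the thread above
L_PER_FOOT = 170e-9   # henries per foot
C_PER_FOOT = 67e-12   # farads per foot, worst case

# Lossless-line approximation of characteristic impedance
z0 = math.sqrt(L_PER_FOOT / C_PER_FOOT)
print(f"Z0 ~= {z0:.1f} ohms")   # about 50 ohms
```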
|
|
|
Post by Ward on Dec 22, 2016 12:12:10 GMT -6
Johnkenn... svart is doing math again. Please make him stop. My mind is still hurting from all the room geometry I had to do this morning, putting in a new ceiling in the studio. UGGHHH
|
|
|
Post by EmRR on Dec 22, 2016 12:39:36 GMT -6
Yes, cheap connectors are generally a much bigger deal than the wire itself.
One missing part is capacitance between conductors which translates to an AC impedance, and directly related is AC impedance with respect to frequency. Mics can be like speakers, the impedance curve can be all over the place, as can the transformer input preamp. Depending on the preamp, you may get a different EQ curve out of the mic; this is highly dependent on the mic type and the preamp type. An immediate example illustrating this is how different some ribbon mics sound in transformer coupled preamps versus transformerless preamps, frequently having much better treble presence in the transformer preamp, due to that impedance curve equalization factor. This is with both preamp types bench testing substantially and equivalently flat within the 'audible' range (a flawed test in itself, another story for another day).
If the AC impedance of the wire with respect to frequency is low enough to get on the same graph, you have an unexpected interaction changing the sound of the mic yet again. Higher frequencies are always a lower Z load when I checked a bunch of mic cables, and some unbranded cheap mic cable gave an impedance reading of 2800Hz(!) at 120Hz, which is in itself already down into typical preamp input impedance range. Canare standard mic cable and 110 ohm AES cable read more in the 100kHz-200kHz range.
Unless I'm missing something in the data. This is an ongoing experiment prone to error and peer review......
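[Editor's note: a rough illustration of the "higher frequencies are always a lower Z load" observation above. The capacitance between conductors presents a reactance that falls as 1/f. The 20 ft length is an arbitrary assumption; the 67 pF/ft figure is the Belden 8451 worst case quoted earlier in the thread.]

```python
import math

def cap_reactance(farads, hz):
    """Magnitude of a capacitor's reactance: |Xc| = 1 / (2*pi*f*C)."""
    return 1.0 / (2 * math.pi * hz * farads)

# 20 ft of 67 pF/ft cable: about 1.34 nF of shunt capacitance
c_total = 67e-12 * 20
for hz in (120, 1_000, 20_000):
    print(f"{hz:>6} Hz: {cap_reactance(c_total, hz) / 1e3:,.0f} kohms")
# nearly a megohm at 120 Hz, but only a few kohms by 20 kHz
```

Once that falling curve gets down near the preamp's input impedance, the cable itself starts shaping the load the mic sees, which is the interaction being described.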
|
|
|
Post by svart on Dec 22, 2016 12:44:29 GMT -6
Johnkenn ... svart is doing math again. Please make him stop. My mind is still hurting from all the room geometry I had to do this morning, putting in a new ceiling in the studio. UGGHHH

Well, I'm not doing the calculus, my sim program is. It'd take me an hour to work through just a piece of this stuff longhand. And my trig/calc/algebra/diff-eq teachers used to say that nobody would have a calculator handy to do this stuff in the real world... L-O-L. I haven't met an engineer in the last decade who has had to do full, longhand calculus/trig/diff-eq. It's ALL sim now. I still use algebra almost every day, and I DO use the principles behind higher maths like diff-eq and calc every day though.
|
|
|
Post by svart on Dec 22, 2016 13:08:40 GMT -6
Yes, cheap connectors are generally a much bigger deal than the wire itself. One missing part is capacitance between conductors which translates to an AC impedance, and directly related is AC impedance with respect to frequency. Mics can be like speakers, the impedance curve can be all over the place, as can the transformer input preamp. Depending on the preamp, you may get a different EQ curve out of the mic; this is highly dependent on the mic type and the preamp type. An immediate example illustrating this is how different some ribbon mics sound in transformer coupled preamps versus transformerless preamps, frequently having much better treble presence in the transformer preamp, due to that impedance curve equalization factor. This is with both preamp types bench testing substantially and equivalently flat within the 'audible' range (a flawed test in itself, another story for another day). If the AC impedance of the wire with respect to frequency is low enough to get on the same graph, you have an unexpected interaction changing the sound of the mic yet again. Higher frequencies are always a lower Z load when I checked a bunch of mic cables, and some unbranded cheap mic cable gave an impedance reading of 2800kHz(!) at 120Hz, which is in itself already down into typical preamp input impedance range. Canare standard mic cable and 110 ohm AES cable read more in the 100kHz-200kHz range. Unless I'm missing something in the data. This is an ongoing experiment prone to error and peer review...... The term "impedance" inherently means that it's an AC transmission line. For DC, it's resistance. (Sometimes it's mixed where you'll use low impedance drivers with high impedance receivers and use resistors to broadband match the transmission line.. And you'll end up with both a DC resistance and an AC impedance that are correlated to each other) I'm not sure what specs you're referring to by saying "2800KHz at 120hz". 
Impedance is always specified in ohms, and can be spec'd at a specific singular frequency, or on a curve. One thing about cables too: the higher the characteristic impedance, the thicker the dielectric between each of the conductors and the shield, unless you increase the cross section of the conductors first. It's a tricky balancing act for sure!

As for your ribbon mic example, there are some other considerations that need to take place. Most of my sims have been extremely simplistic and only covering the linear sweep aspect; adding transformation and other complex elements really takes things to another level of complexity. First, a transformer also does not have an "impedance" per se; it has a characteristic impedance and needs to have external impedances applied to it in order to conform and perform optimally. Secondly, you have complex interactions between the primary, the secondary and the source. A transformer's performance can be greatly affected by the power of the source driving it, in addition to its own characteristics. A ribbon mic motor is such an incredibly small power source that even when using extremely high ratio windings, the mic would have trouble providing the power necessary to overcome cable and receiver parasitics. Essentially it's like trying to push a car uphill.

But yes, you do also have changes in impedance over frequency due to a lot of attributes of the system. Some of these are from the components of the system, some are from the mismatch in impedances, and some are from complex effects like return loss (which are part of the mismatch). Some can be investigated as singular issues, but most are moving targets due to the relationships between the components in the system: if you change one thing, the results of the other issues change as well. This is what makes transmission line theory one of the harder aspects of engineering for folks.
In this case, a higher ratio transformer would ultimately drive the system to better linearity in both frequency and impedance, but at the expense of a drastically larger transformer and greatly reduced output level. Not really the cable's fault; a less lossy cable could improve some of the high end, but that's more of an electrical work-around than anything. The more optimal thing to do would be to find a preamp with a higher nominal input impedance. That would have a much greater effect than a cable exchange. It kind of reminds me of an engineering joke. Q: When is a cable not a cable? A: It's not. At least to us transmission line folks it's a joke, because you'd think a single conductor should be so simple, but it's ridiculously complex at the same time.
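[Editor's note: a hedged sketch of the ideal-transformer impedance relations underlying svart's point. A 1:n step-up multiplies voltage by n, reflects the mic-side source impedance to the cable side times n squared, and reflects the preamp load back to the ribbon divided by n squared. The 0.2 ohm motor figure and the turns ratios below are illustrative assumptions, not measured values.]

```python
def to_secondary(z_primary, n):
    """Source impedance the cable/preamp sees through a 1:n step-up."""
    return z_primary * n**2

def to_primary(z_secondary, n):
    """Load impedance the ribbon motor sees, reflected back through 1:n."""
    return z_secondary / n**2

ribbon_motor = 0.2    # ohms, hypothetical ribbon motor impedance
preamp_input = 1200.0 # ohms, the preamp load used earlier in the thread

for n in (20, 37):
    print(f"1:{n}: output Z ~{to_secondary(ribbon_motor, n):.0f} ohms, "
          f"ribbon sees ~{to_primary(preamp_input, n):.2f} ohms")
```

The tension is visible in the numbers: a bigger ratio raises the output impedance driving the cable (making cable and load parasitics bite harder) while the tiny motor sees an ever smaller reflected load, which is the "pushing a car uphill" problem.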
|
|
|
Post by ragan on Dec 22, 2016 14:17:54 GMT -6
Great stuff, Svart. Super informative.
|
|
|
Post by EmRR on Dec 22, 2016 14:58:03 GMT -6
I'm not sure what specs you're referring to by saying "2800 KHz at 120hz". Impedance is always specified in ohms, and can be spec'd at a specific singular frequency, or on a curve.

That typo blew the whole meaning, and it was vague shorthand on my part: a 2800 ohm load Z with a 120Hz test signal. Switch to a 1kHz test signal and it goes down into the hundreds of ohms. Seriously increasing loading along a curve. You would say "there's something wrong with that cable, try another", and I'd tell you another measured the same. Frequently, older transformer-coupled preamps have a much higher reflected impedance at high frequencies than the average linear modern transformerless unit. When you switch a typical bridging pad inline with those type of preamps, you frequently load the mic more heavily, and in a more linear fashion. This is the source of the whole trend you read about everywhere. The change in preamp response is measurable too. Point being, if the curve of any piece of wire interacts with these other curves, you have unintended (usually negative) consequences. If you use higher quality wire with lower capacitance and inductance between conductors, you lessen the chances of interaction, and a cable change can be icing on the cake. It's kinda funny that for the specific instance of a ribbon mic, legacy preamps may do the least damage if subbed into the equation of mic/wire/preamp impedance curves, compared to the average modern design. If I recall, the AEA/etc. ribbon preamps tend to be more like 17K ohm input Z, as opposed to the typical 1K5-5K input Z. I have seen legacy preamps with input Z over 25K above 12kHz.
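[Editor's note: to put rough numbers on the loading effect described above, the mic and its load form a simple voltage divider. The 300 ohm ribbon source impedance here is an illustrative assumption; the load values echo the 1K5-5K, ~2800 ohm, and 17K figures from the post.]

```python
import math

def loading_loss_db(z_source, z_load):
    """Level loss from the source/load voltage divider, in dB."""
    return 20 * math.log10(z_load / (z_source + z_load))

for z_load in (1_500, 2_800, 17_000):
    print(f"{z_load:>6} ohm load: {loading_loss_db(300, z_load):.2f} dB")
# the 17K input barely loads the mic; the lower loads cost real level
```

If the load impedance dips with frequency, as with the cable measurements above, this loss becomes frequency-dependent, i.e. an unintended EQ curve.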
|
|
|
Post by Martin John Butler on Dec 22, 2016 15:13:38 GMT -6
Svart said, "because you'd think a single conductor should be so simple, but it's ridiculously complex at the same time".
Uhh.. exactly. I've been saying that in a completely different way when trying to explain my hearing differences between interconnects and power cords. Jim Williams would have a technical answer, but I'm just a singer/songwriter using my ears to guide me.
|
|
|
Post by ericn on Dec 22, 2016 18:05:18 GMT -6
Damn you Chris ! Stop it ! My Brain already hurts ! I spent my day with 3 boys 6-12 amongst a couple of hundred other kids ! I don't even have a drink in my hand yet ! Oh the horror ! Oh the holidays! 😜
|
|
|
Post by EmRR on Dec 23, 2016 12:29:34 GMT -6
The term "impedance" inherently means that it's an AC transmission line. For DC, it's resistance. (Sometimes {snip} you'll end up with both a DC resistance and an AC impedance that are correlated to each other)

Thanks for bringing that so clearly to folks' attention. In many cases the DC resistance gets quoted as if it were the whole story, and it's not; in other cases the DC resistance does hold true across the entire audio bandwidth.
|
|
|
Post by svart on Dec 28, 2016 11:13:13 GMT -6
Svart said, "because you'd think a single conductor should be so simple, but it's ridiculously complex at the same time". Uhh.. exactly. I've been saying that in a completely different way when trying to explain my hearing differences between interconnects and power cords. Jim Williams would have a technical answer, but I'm just a singer/songwriter using my ears to guide me.

That's not quite what I meant. The physics of cables are complex but very well understood in the engineering world, just not so much outside of it. There are NO special cases, nor are there "magic" conductors, just conductors that perform up to expectations and those that don't. If they don't, there are reasons for it, but very few have the desire to figure out why. It seems that users have a progression of perception in cases like this:
1. The user sees such a physically simple object, and that physical simplicity (in addition to electrical ignorance) implies that it must also be electrically simple.
2. Because it's believed to be electrically simple, the only attributes that could matter are the materials and build quality.
3. Because materials and build quality are the only attributes the end user has the free will to change, they get revisited until the user believes they are the only characteristics that matter.
To me, that's doing nothing more than cementing an opinion through repetition. For cable attributes, you simply have a sliding window of cable characteristics: very few of them affect lower frequencies, while they become increasingly parasitic as you slide up in frequency. It's the application of proper transmission line theory that doesn't happen in the audio design world, or really just the complete lack of transmission line application.
The more I think about it, the more I realize that most audio hardware designers are actually banking on the lack of audio interface standardization, because creating a standardized interconnection system would negate most of the marketing fluff, in that it would reduce unit-to-unit variability. Story time. I work in the CATV world, and my world exists at 50R and 75R over single-conductor RG59 and RG6 cables from 5MHz to 3GHz. I have to manage everything from frequency response of the entire spectrum to phase delay to reflection issues, as well as power transmission over coax, noise floor management, etc. Dozens to hundreds of things could pop up, but they are understood well enough that only a handful of characteristics of the cable are most important to analyze and design around. I have to ensure that all these things work down to a fraction of a dB with every single piece of CATV equipment they might be plugged into, around the world. The only way I'm able to do that is by having very precise interconnection standards. From those standards you are able to tell exactly that RG59 cable will have a roll-off that equates to losing 6dB at 1GHz over 50ft, while RG6 will allow you to go up to nearly 2GHz. They also tell me that the connectors designed around these cables and these transmission standards will not interfere with my signal. But what does that mean? It means that the engineers who designed the standard know the cable's characteristics and what it's capable of, and have designed the impedance standard around it. That means the end user will always be able to predict the result from a few points of measurement, because the impedance/transmission standard is designed around nullifying the problems that come from mismatched loads and sources, AKA, the things that end up hurting audio transmission performance.
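[Editor's note: the RG59 figure quoted above fits the usual rule of thumb that skin-effect loss in coax grows roughly with the square root of frequency. A sketch anchored to that 6dB / 50ft / 1GHz point; this is a first-order approximation only, as real datasheet curves also include dielectric loss.]

```python
import math

def coax_loss_db(freq_hz, feet, ref_db=6.0, ref_hz=1e9, ref_feet=50.0):
    """Estimate coax attenuation by scaling a known reference point:
    loss grows ~sqrt(f) (skin effect) and linearly with length."""
    return ref_db * math.sqrt(freq_hz / ref_hz) * (feet / ref_feet)

for mhz in (100, 500, 1000, 2000):
    print(f"{mhz:>5} MHz: {coax_loss_db(mhz * 1e6, 50):.1f} dB per 50 ft")
```

The same anchor-and-scale prediction is exactly what a tight interconnection standard buys you: a few measured points characterize the whole run.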
So I guess after that long-winded diatribe, the moral of the story is that if we had audio interconnection standards, the perception of "expensive" cables having magic properties would disappear, because the inherent mismatching between devices would not occur, and all device interconnects would perform as optimally as electrically possible. However, all of this would not come without cost. Defining specs for the audio world would also start to limit the bottom end of the cable market. You'd be forced to buy at least a certain level of cable to attain the performance. Analog cables would likely start having woven shields with higher-priced insulators and slightly better connectors, etc. The lower-end stuff would need to be certified to be in compliance before being sold, much like USB cables are now. You'd likely end up with cables in the Mogami/Belden and Switchcraft/Neutrik price range as your typical cable, which is what most serious folks and studios end up with anyway, so it's not a stretch for most folks; but that certification stuff isn't free, and the costs would mostly be passed on to the consumer. Unfortunately, that's all unlikely to happen. Even though pro audio is a professional business, it doesn't really affect infrastructure very much, so it's seen as more of a hobby-caliber profession by most engineers, and thus ignored as far as standards associations are concerned.
|
|
|
Post by Bob Olhsson on Dec 28, 2016 16:25:25 GMT -6
Broadcast supply houses such as Markertek and Redco sell no-nonsense high quality cable to customers having staff engineers who can't afford failure.
The problem is music stores who view cables as a high margin add-on purchase to accompany gear they must sell at very competitive prices.
|
|
|
Post by Martin John Butler on Dec 28, 2016 17:25:29 GMT -6
I asked Mike at Blackspade Acoustics what he recommended for mic cables, and he said Redco, and Blackspade wouldn't let an inferior cable interfere with all the hard work they put in to designing something high end. I like the Mogami cable pretty much for most audio hardware connections.
|
|
|
Post by indiehouse on Dec 28, 2016 18:56:42 GMT -6
I thought I read somewhere that Mogami actually makes the Redco branded cable. That could be total BS, but people tell me things...
|
|
|
Post by Martin John Butler on Dec 30, 2016 9:06:50 GMT -6
Interesting indiehouse. I've never heard about this, but who knows, there may be a connection. Perhaps a supplying of parts thing?
|
|
|
Post by Bob Olhsson on Dec 30, 2016 12:36:23 GMT -6
Belden manufactures a huge percentage of different cable brands. My personal preferences are Gotham, Klotz and Asterope.
|
|
|
Post by drbill on Dec 30, 2016 12:43:38 GMT -6
I barely made it thru algebra in HS. Haven't touched or used it since. Does this mean if I can hear a difference in cables I should go back to HS?
Cause I can..... Hear a difference that is.......
|
|