Why is 75 Ω cable commonly used for cable TV instead of 50 Ω?

  • Thread starter: Bernie G
  • Tags: Cable
In summary: For TV receive, power handling isn't a requirement, so a nice round figure of 75 Ω is used to take advantage of the lower loss. These figures change a little with the more common Teflon, foamed, etc. dielectrics. Also interesting to note: 300 Ω twin lead, very uncommon nowadays, is exactly 4 times 75 Ω, which works out well for the common 4:1 baluns.
  • #1
Bernie G
The standard characteristic cable impedance for the microwave industry is 50 Ω. Why is 75 Ω the standard for cable TV? Making a 75 Ω low loss cable with a 3 GHz rating pushes the limits of technology. Typically a 50 Ω cable would be less than half the diameter, lighter, more flexible and easier to install, cheaper, and have better electrical characteristics.
 
  • #2
Bernie G said:
The standard characteristic cable impedance for the microwave industry is 50 Ω. Why is 75 Ω the standard for cable TV? Making a 75 Ω low loss cable with a 3 GHz rating pushes the limits of technology. Typically a 50 Ω cable would be less than half the diameter, lighter, more flexible and easier to install, cheaper, and have better electrical characteristics.

What is the input impedance of a dipole antenna? Why would that have an input on your question? :smile:
 
  • #3
berkeman said:
What is the input impedance of a dipole antenna? Why would that have an input on your question? :smile:

I should have been clearer. I'm referring to the 75 Ω coaxial cable that's used a lot in USA residential construction. The 75 Ω cable looks quite bulky compared to 50 Ω coaxial cable.
 
  • #4
Bernie G said:
I should have been clearer. I'm referring to the 75 Ω coaxial cable that's used a lot in USA residential construction. The 75 Ω cable looks quite bulky compared to 50 Ω coaxial cable.

You're starting from a false assumption.

depends on the model of cable ... 75 Ω and 50 Ω come in all sorts of diameters

RG59 (75 Ω) and RG58 (50 Ω) are both 1/4 inch dia
RG6 (75 Ω) and RG8 (50 Ω) are similar size and lower loss
and so on

As to your actual post question:
Why is 75 Ω cable commonly used for cable TV instead of 50 Ω?

Berkeman's answer gives half the answer ...
berkeman said:
What is the input impedance of a dipole antenna? Why would that have an input on your question?

the other part is ...

you have to go a little further ... A cable, with an air dielectric, has a maximum peak power handling capability at a given impedance.
It also has a lowest loss figure for a given impedance.
Peak power handling comes at 30 Ω impedance but lowest loss is at 77 Ω
50 Ω coax is chosen particularly for cable systems where transmission and reception is involved as it is a compromise/trade-off between power handling and lowest loss. But for TV receive, power handling isn't a requirement so a nice round figure of 75 Ω is used to take advantage of the lower loss.

These figures change a little with a more common Teflon, foamed etc dielectric.

Dave
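Dave's two optimum impedances can be sketched numerically. A minimal Python sketch, assuming the standard fixed-outer-diameter scalings for an air-dielectric coax: skin-effect conductor loss goes as (1 + D/d)/ln(D/d), and breakdown-limited peak power goes as ln(D/d)/(D/d)², where d and D are the inner and outer conductor diameters:

```python
import math

def relative_loss(x):
    # Conductor (skin-effect) loss for fixed outer diameter D,
    # as a function of the ratio x = D/d (up to a constant factor).
    return (1 + x) / math.log(x)

def relative_peak_power(x):
    # Breakdown-limited peak power for fixed outer diameter D,
    # as a function of x = D/d (up to a constant factor).
    return math.log(x) / x**2

# Scan a range of D/d ratios for the two optima.
xs = [1.1 + 0.001 * i for i in range(9000)]
x_loss = min(xs, key=relative_loss)        # lowest-loss geometry
x_power = max(xs, key=relative_peak_power) # max peak-power geometry

# Air dielectric: Z0 = 60 * ln(D/d)
print(f"lowest loss at Z0 ~ {60 * math.log(x_loss):.1f} ohms")   # ~76.7
print(f"peak power at Z0 ~ {60 * math.log(x_power):.1f} ohms")   # ~30.0
```

The minima land at roughly 77 Ω and 30 Ω, matching the figures quoted above; 50 Ω sits between them as the transmit/receive compromise.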
 
  • Like
Likes berkeman
  • #5
Also interesting to note that 300 ohm twin lead which is very uncommon nowadays is exactly 4 times the impedance of 75 ohms working out well for the common 4:1 baluns.
 
  • Like
Likes davenn
  • #6
Averagesupernova said:
Also interesting to note that 300 ohm twin lead which is very uncommon nowadays is exactly 4 times the impedance of 75 ohms working out well for the common 4:1 baluns.

yes, stemming from back in the days when just a folded dipole was a common TV antenna, and it has an impedance of 300 Ω.
The very first B&W TV my mom and dad got way back in the late '60s even had a 300 Ω input (no coax input).
Even when they (TV antenna manufacturers) went to folded-dipole Yagis, they kept the 300 Ω thought, even though the addition of a reflector and directors meant that the feedpoint was no longer 300 Ω.
It dropped much closer to 200 Ω, where a 4:1 balun to 50 Ω coax would be more feasible.
But, TV receive being what it is, no one really seemed to worry too much about a bit of a mismatch :wink: :rolleyes:
When transmit is also part of the process, matching becomes more important, and my 10-element, folded-dipole-driven Yagi on the 2 m (144 MHz) ham band uses a 4:1 balun for good matching to 50 Ω coax.

Dave
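The balun arithmetic above is just a division by the impedance ratio. A quick sketch (the helper name is illustrative, not from any library):

```python
def transformed_impedance(load_ohms, impedance_ratio):
    # An n:1 impedance-transforming balun presents load/n at its output.
    return load_ohms / impedance_ratio

# 300-ohm folded dipole through a 4:1 balun -> 75-ohm coax
print(transformed_impedance(300, 4))  # 75.0

# ~200-ohm folded-dipole Yagi feedpoint through a 4:1 balun -> ~50 ohms
print(transformed_impedance(200, 4))  # 50.0
```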
 
  • #7
davenn said:
You're starting from a false assumption.

depends on the model of cable ... 75 Ω and 50 Ω come in all sorts of diameters

RG59 (75 Ω) and RG58 (50 Ω) are both 1/4 inch dia
RG6 (75 Ω) and RG8 (50 Ω) are similar size and lower loss
and so on

As to your actual post question, Berkeman's answer gives half the answer ... the other part is ...

you have to go a little further ... A cable, with an air dielectric, has a maximum peak power handling capability at a given impedance.
It also has a lowest loss figure for a given impedance.
Peak power handling comes at 30 Ω impedance but lowest loss is at 77 Ω
50 Ω coax is chosen particularly for cable systems where transmission and reception is involved as it is a compromise/trade-off between power handling and lowest loss. But for TV receive, power handling isn't a requirement so a nice round figure of 75 Ω is used to take advantage of the lower loss.

These figures change a little with a more common Teflon, foamed etc dielectric.

Dave

"RG59 (75 Ω) and RG58 (50 Ω) are both 1/4 inch dia
RG6 (75 Ω) and RG8 (50 Ω) are similar size and lower loss
RG59 (75 Ω) and RG58 (50 Ω) are both 1/4 inch dia
RG6 (75 Ω) and RG8 (50 Ω) are similar size and lower loss"

Thanks but 75 ohm cable installed in modern construction has about a 3 GHz rating and all the 75 ohm cables I've seen are quite large in diameter to handle this high frequency. RG59 and RG6 have frequency ratings a lot lower than 3 GHz.
 
  • #8
Bernie G said:
"RG59 (75 Ω) and RG58 (50 Ω) are both 1/4 inch dia
RG6 (75 Ω) and RG8 (50 Ω) are similar size and lower loss
RG59 (75 Ω) and RG58 (50 Ω) are both 1/4 inch dia
RG6 (75 Ω) and RG8 (50 Ω) are similar size and lower loss"

Thanks but 75 ohm cable installed in modern construction has about a 3 GHz rating and all the 75 ohm cables I've seen are quite large in diameter to handle this high frequency. RG59 and RG6 have frequency ratings a lot lower than 3 GHz.

you didn't really read all of what I said ... did you :wink:

davenn said:
depends on the model of cable ... 75 Ω and 50 Ω come in all sorts of diameters

Note that comment I made

Now if you want to be specific, then state the model/type of cable that you have regularly seen.
I only named a few to give you an idea, there are many, many types

Thanks but 75 ohm cable installed in modern construction has about a 3 GHz rating and all the 75 ohm cables I've seen are quite large in diameter to handle this high frequency.

that is also a totally incorrect assumption ... increasing physical size is for 2 reasons:
1) power handling and
2) lower loss

Power handling for TV receive is pretty irrelevant ... lower loss is VERY relevant.
In fact, as cable diameter increases, its maximum usable frequency decreases. This is because a
coax transmission line is basically a waveguide, and if the waveguide is too large a diameter,
then multimoding occurs, and that leads to additional losses.

RG58 and 59 are lower freq rated only because they are extremely lossy above around 500 MHz or so.

Dave
 
  • #9
I don't have access to a sample cable right now, but all the 3 GHz low loss (low power) 75 ohm cable I've seen has been large diameter. Isn't it much easier to make a high frequency 50 ohm cable than a 75 ohm cable?
 
  • #10
Bernie G said:
I don't have access to a sample cable right now, but all the 3 GHz low loss (low power) 75 ohm cable I've seen has been large diameter

all cables will operate to and well beyond 3 GHz (not sure why you are fixated on the 3 GHz figure?)
The only difference is that some cables will have greater loss at 1 to 5 GHz. So the design of the cable determines the attenuation (loss) the cable has
per 100 ft (30 metres). The loss is measured in dB. The specs for a given cable will show a table of the losses for a range of frequencies per 30 metres.
Let's choose my fav cable for 144 MHz to 1300 MHz (1.3 GHz): the LDF4-50A

[photo of LDF4-50A cable]
this is 1/2 inch diameter ( maybe a tad more)
solid copper or sometimes copper clad aluminium, centre conductor, foam dielectric and a copper outer conductor

Freq (MHz) -- dB/100 m -- dB/100 ft
150 ---------- 2.673 ------- 0.815
450 ---------- 4.749 ------- 1.447
1000 --------- 7.284 ------- 2.22
1500 --------- 9.093 ------- 2.771

You can see how the attenuation increases rapidly as the frequency rises.
Every 3 dB of loss halves your power (that applies to transmit or receive signals).
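The dB figures translate to power fractions via 10^(-dB/10); a small sketch using the LDF4-50A table values above:

```python
def power_fraction(loss_db):
    # Fraction of input power remaining after loss_db of attenuation.
    return 10 ** (-loss_db / 10)

# LDF4-50A losses per 100 ft, from the table above
for freq_mhz, loss_db in [(150, 0.815), (450, 1.447), (1000, 2.22), (1500, 2.771)]:
    pct = power_fraction(loss_db) * 100
    print(f"{freq_mhz:>4} MHz: {pct:.0f}% of the power survives 100 ft")

# Sanity check the 3 dB rule of thumb:
print(power_fraction(3.0))  # ~0.5, i.e. half the power
```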

I have also often used its big brother on commercial installations, LDF 5-50, 7/8 inch diameter ( close enough to 1 inch dia)

the lowest attenuation is achieved by having an all air dielectric, but this is impossible in a coaxial cable as there needs to be something to support the inner conductor in the centre of the cable. One way around this is to use a spiral of Teflon. This keeps the 2 conductors uniformly separated but still achieves a dielectric that is >80% air. See the centre coax in the pic below

[photo of assorted coax cables; the centre one uses a spiral-Teflon, mostly-air dielectric]

for frequencies 5GHz and greater and for lengths more than a metre or so, waveguide is used. This has very low loss compared to any coax cable
Bernie G said:
Isn't it much easier to make a high frequency 50 ohm cable than a 75 ohm cable?

no, the manufacturing process is just the same

do you understand what determines the impedance of a coaxial transmission line ? ...

Coax impedance determination
The impedance of the RF coax cable is chiefly governed by the diameters of the inner and outer conductors. On top of this the dielectric constant of the material between the conductors of the RF coax cable has a bearing. The relationship needed to calculate the impedance is given simply by the formula:

Zo = (138 / √εr) · log10 (D / d)
Where:
Zo = Characteristic impedance in Ω
εr = Relative permittivity of the dielectric
D = Inner diameter of the outer conductor
d = Diameter of the inner conductor

Note: The units of the inner and outer diameters can be anything provided they are the same, because the equation uses a ratio.

Not going to name the page it came from, as I don't particularly like some of their other garbage.
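Plugging numbers into that formula is straightforward; in the sketch below, the RG-6 geometry figures are assumed, illustrative values (roughly a 1.0 mm centre conductor, 4.7 mm dielectric OD, foam PE with εr ≈ 1.5), not taken from any particular datasheet:

```python
import math

def coax_impedance(D, d, er):
    # Zo = 138 / sqrt(er) * log10(D/d); D and d in the same units.
    return 138 / math.sqrt(er) * math.log10(D / d)

# Assumed, illustrative RG-6-like geometry:
#   d  = 1.0 mm centre conductor
#   D  = 4.7 mm inner diameter of the outer conductor
#   er = 1.5 (foamed polyethylene)
print(f"{coax_impedance(4.7, 1.0, 1.5):.1f} ohms")  # close to 75
```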

Wiki gives a derivation of this formula ...

Derived electrical parameters
  • Characteristic impedance in ohms (Ω). Neglecting resistance per unit length for most coaxial cables, the characteristic impedance is determined from the capacitance per unit length (C) and the inductance per unit length (L). The simplified expression is Z0 = √(L/C). Those parameters are determined from the ratio of the inner (d) and outer (D) diameters and the dielectric constant (εr). The characteristic impedance is given by[7]

Z0 ≈ (59.96 Ω / √εr) · ln(D/d)

Assuming the dielectric properties of the material inside the cable do not vary appreciably over the operating range of the cable, this impedance is frequency independent above about five times the shield cutoff frequency. For typical coaxial cables, the shield cutoff frequency is 600 MHz (RG-6A) to 2,000 MHz (RG-58C).[8]
cheers
Dave
 
  • Like
Likes berkeman
  • #11
Here is some 4.5 GHz RG6 commercial cable: http://www.broadbandutopia.com/belden1694a.html
I didn't know they could readily make 75 ohm cable that high but still think 50 ohm cable would probably be more practical. Don't 50 ohm cables generally have better performance than 75 ohm cables? Would it make sense to standardize most 75 ohm coaxial cable transmission equipment to 50 ohms?
 
  • #12
Bernie G said:
I didn't know they could readily make 75 ohm cable that high but still think 50 ohm cable would probably be more practical.

Yes they can; and no, it would not be more practical.

Bernie G said:
Don't 50 ohm cables generally have better performance than 75 ohm cables?

again, no

Bernie G said:
Would it make sense to standardize most 75 ohm coaxial cable transmission equipment to 50 ohms?

again, no

Why ? ... read again what I wrote way back in post #4 ...

davenn said:
the other part is ...

you have to go a little further ... A cable, with an air dielectric, has a maximum peak power handling capability at a given impedance.
It also has a lowest loss figure for a given impedance.
Peak power handling comes at 30 Ω impedance but lowest loss is at 77 Ω
50 Ω coax is chosen particularly for cable systems where transmission and reception is involved as it is a compromise/trade-off between power handling and lowest loss. But for TV receive, power handling isn't a requirement so a nice round figure of 75 Ω is used to take advantage of the lower loss.

These figures change a little with a more common Teflon, foamed etc dielectric.

that is why 75 Ω is much better for this type of service.

Dave
 
  • #13
Thank you for that good concise answer.
 
  • Like
Likes davenn
  • #14
any more questions ... ask away :smile:

we all do our best to help

Dave
 

Related to Why is 75 Ω cable commonly used for cable TV instead of 50 Ω?

1. Why is 75 Ω cable commonly used for cable TV instead of 50 Ω?

The main reason for using 75 Ω cable for cable TV is to reduce signal loss and ensure better signal quality. This is because 75 Ω cable has lower attenuation (signal loss) compared to 50 Ω cable, making it more suitable for long distance transmission of high frequency signals.

2. What is the difference between 75 Ω and 50 Ω cable?

The main difference between 75 Ω and 50 Ω cable is the characteristic impedance, a property set by the cable's geometry and dielectric rather than a measure of loss by itself. For an air-dielectric coax of a given outer diameter, attenuation is lowest near 77 Ω, which is why 75 Ω cable is well suited to carrying high-frequency signals with low loss.

3. Can 50 Ω cable be used for cable TV?

Technically, 50 Ω cable can be used for cable TV. However, it is not the optimal choice: 50 Ω is a compromise between power handling and loss, chosen for systems that also transmit, such as radio communication. For cable TV, which is receive-only, 75 Ω cable is the standard and recommended choice for better signal quality.

4. Are there any other advantages of using 75 Ω cable for cable TV?

In addition to lower signal loss, 75 Ω cable also has better impedance matching with the equipment used in cable TV systems. This means that the signal can be transmitted more efficiently and with less interference, resulting in better overall picture and sound quality for viewers.

5. Is there a reason why 50 Ω cable is not used for cable TV?

As mentioned before, 50 Ω cable is not the optimal choice for cable TV due to its higher signal loss and the impedance mismatch with 75 Ω TV equipment and antennas. Additionally, 75 Ω cable has been standardized and widely used in the cable TV industry, making it more accessible and cost-effective than 50 Ω cable.
