Transmission Line Simulation Questions

  • #1
roTTer
I am simulating an unshielded twisted wire pair for a Controller Area Network (CAN) communication bus.

Bus speeds under consideration are 250/500/1000/2000 kbps.

Based on the physical dimensions of the wire and the insulation's dielectric constant, I have computed the R, L, C, G parameters. Since the wire length will be shorter than 300 meters (i.e., considering a 1 MHz frequency), I am using lumped parameters for each section of the wire for simulation purposes.
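For anyone wanting to reproduce the parameter step, the per-metre L and C of a twisted pair can be estimated from its geometry. This is only a sketch with assumed example dimensions (22 AWG conductor, 1.5 mm separation, polyethylene-like dielectric), not the actual cable from the attachments:

```python
import math

# Estimate twisted-pair L and C per metre from geometry (assumed example
# dimensions, not the poster's actual cable).
MU0 = 4e-7 * math.pi    # permeability of free space, H/m
EPS0 = 8.854e-12        # permittivity of free space, F/m

d = 0.644e-3   # conductor diameter, m (22 AWG, assumed)
D = 1.5e-3     # centre-to-centre wire separation, m (assumed)
er = 2.3       # insulation relative permittivity (assumed)

k = math.acosh(D / d)
L = MU0 / math.pi * k          # external inductance per metre, H/m
C = math.pi * er * EPS0 / k    # capacitance per metre, F/m
Z0 = math.sqrt(L / C)          # lossless characteristic impedance, ohm
v = 1 / math.sqrt(L * C)       # propagation velocity, m/s

print(f"L = {L*1e9:.0f} nH/m, C = {C*1e12:.1f} pF/m")
print(f"Z0 = {Z0:.0f} ohm, delay = {1e9/v:.2f} ns/m")
```

With these assumed dimensions the result lands near 120 Ω and about 5 ns/m, which is consistent with the cable discussed later in the thread.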

The general topology of the circuit is a 3-node network, with 120 Ω line terminations at the two ends of the bus. For simulation purposes, one node of the network will be transmitting and the others will be receiving. Each lumped section is defined by the generic RLCG model.

There are a few things I cannot get my head around and need your help with:

1. The orientation of the model: generally, the model has R and L at the node where the current enters the circuit. Is this a rule?

2. Suppose I make the model more distributed: for example, instead of one R, L, C, G model for a 1 m section, I split it into ten models of 0.1 m each. Doing this resulted in a very distorted waveform whenever the node is not transmitting (it should sit at 2.5 V, every time, everywhere). When I use just one lumped model per section, the waveform is even and almost follows the measured values. Why?

3. As I move farther from the transmitting node, the initial spikes (transients) keep increasing. This is not the case in the measured voltage waveforms. Why does this happen?

I have attached files: 'Capture' is the simulated circuit, '3 Node at Sender' shows waveforms at the transmitting end, and '3 Node at Farthest end' is, well, self-explanatory. '004' is the measured waveform from the lab.

Thank you for your time.
 

Attachments

  • Capture.JPG (21.6 KB)
  • 004.jpg (55.1 KB)
  • 3 Node at Sender.PNG (12.7 KB)
  • 3 Node farthest end.PNG (17 KB)
  • #2
I don't have all the answers for you; I still need to digest exactly what you are doing.
First off: relating 300 m to 1 MHz is erroneous. Electrical wavefronts travel at roughly 0.5 to 0.75 ns per foot (the velocity of propagation in the cable). You will see reflections whenever the distance traveled is longer than the distance the signal propagates during a rise/fall time; the bit period is not the issue. If you want an accurate simulation, your model should have enough sections to resolve the rise/fall time.
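The sizing rule above can be sketched numerically. The 5 ns/m delay, the 100 ns rise time, and the one-tenth-of-rise-time section budget are assumptions for illustration, not measured values:

```python
import math

# Rule-of-thumb lumped-section sizing: keep each section's delay a small
# fraction (here 1/10) of the driver's rise/fall time (assumed values).
t_rise = 100e-9   # driver rise/fall time, s (assumed)
t_pd = 5e-9       # one-way propagation delay per metre, s (assumed)
frac = 10         # sections per rise time (common rule of thumb)

max_section_len = t_rise / (frac * t_pd)   # metres per lumped section
line_len = 40.0                            # example total line length, m
n_sections = math.ceil(line_len / max_section_len)

print(f"max section length = {max_section_len:.1f} m")
print(f"need {n_sections} sections for a {line_len:.0f} m line")
```

The tighter the fraction, the better the lumped ladder approximates the distributed line during the edge.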

Here are some model examples:
http://www.ece.uci.edu/docs/hspice/hspice_2001_2-269.html

I didn't find any numbers for commercial cables, but they must be out there.

Note that multi-drop transmission systems will always have edge distortion in the middle taps.
 
  • #3
meBigGuy said:
You will see reflections whenever the distance traveled is larger than the propagation distance during a rise/fall time...If you want accurate simulation your model should have enough sections to deal with the rise/fall time.

By this you mean:

"Say the propagation delay of the wire is 5 ns/m (1.52 ns/ft). Because a reflection has to propagate back along the wire, the round-trip delay is 5 × 2 = 10 ns/m.

The transceivers' transition (rise/fall) times are in the range of 60 ns to 160 ns.

Assume a total round-trip delay of 10 ns/m and a transceiver rise/fall time of 100 ns. Does that make 100 ns ÷ 10 ns/m = 10 meters the critical length? Meaning, if I have a section longer than 10 m, I have a problem."
Is my understanding right?

If that is the case, I have accounted for it; most of my sections are around 0.2 to 2 meters.
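A quick check of the arithmetic above (using the assumed 100 ns rise time and the doubled 10 ns/m round-trip figure):

```python
# Critical length = rise time divided by round-trip delay per metre.
t_rise = 100e-9   # assumed transceiver rise/fall time, s
t_rt = 10e-9      # round-trip delay per metre (5 ns/m out, 5 ns/m back), s

critical_len = t_rise / t_rt   # metres
print(critical_len)            # ~10 m: sections longer than this need care
```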

meBigGuy said:
Note that multi-drop transmission systems will always have edge distortion in the middle taps.

By middle taps, do you mean the connection points between two sections? Is there a way to mitigate it?
Does it happen due to "charging" of the following section?

Thank you for your reply!
 
Last edited:
  • #4
Maybe I erred. The middle taps would only see the distortion I was thinking of if you were driving a series-terminated line (open ends).

If you are driving with low-impedance drivers, then you need terminators to avoid reflections (I see what look like terminators). In that case, if the taps change the impedance, they will have effects.

One problem with lumped models is that you tend to think in terms of "charging". In a distributed model the transmission line always presents its characteristic impedance, since the capacitance and inductance (and the losses) are distributed evenly along the cable.
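The point about the distributed line can be seen from the standard lossy-line formula Z0 = sqrt((R + jωL)/(G + jωC)). A sketch with assumed per-metre values (not the thread's actual cable data):

```python
import cmath
import math

# Characteristic impedance of a distributed line from per-metre RLCG
# parameters (assumed illustrative values). A distributed line presents
# this impedance everywhere along its length, unlike a coarse lumped ladder.
R = 0.08      # series resistance, ohm/m (assumed)
L = 600e-9    # series inductance, H/m (assumed)
G = 1e-10     # shunt conductance, S/m (assumed)
C = 42e-12    # shunt capacitance, F/m (assumed)

f = 1e6                  # evaluate near the bit-rate band, Hz
w = 2 * math.pi * f
Z0 = cmath.sqrt((R + 1j * w * L) / (G + 1j * w * C))

print(abs(Z0))   # close to sqrt(L/C) ~ 120 ohm when losses are small
```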

As for the glitches you are seeing, I think I understand. It looks like an imbalance in the propagation paths (delay) and crosstalk between the lines. The purple path is earlier than the green path.
 
  • #5
The transceiver works like this: during the recessive state, both pins sit at 2.5 V. In the dominant state, one is pulled to 3.5 V while the other is pulled to 1.5 V.

You mention the glitches are caused by crosstalk and propagation delay. I thought about it and googled too, but I can't find an explanation (can you elaborate a little on this phenomenon?) or anything on what could be done to avoid it.

The cable has been designed to have a delay of 5 ns/m with a characteristic impedance of 120 Ω, and it is terminated at both ends with 120 Ω resistors.

From what I understand, the current in the inductance is causing the spikes at those points. Is it common for that to happen in the real world, or is it a limitation of the lumped models or of the design?
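The effect of the terminations and taps mentioned above can be quantified with the standard reflection-coefficient formula Γ = (ZL − Z0)/(ZL + Z0); the 60 Ω case below is a hypothetical example of a tap presenting two 120 Ω branches in parallel:

```python
# Reflection coefficient at a discontinuity on a 120-ohm line.
def gamma(ZL, Z0=120.0):
    """Fraction of an incident edge reflected by a load impedance ZL."""
    return (ZL - Z0) / (ZL + Z0)

print(gamma(120.0))   # matched 120-ohm terminator: no reflection
print(gamma(1e9))     # near-open end: reflects almost the whole edge
print(gamma(60.0))    # hypothetical tap (two 120-ohm branches in parallel)
```

A matched end termination gives Γ = 0, which is why the 120 Ω end resistors absorb the edges, while any tap that changes the local impedance reflects part of each transition.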
 
  • #6
EDITED: blah blah blah

The transceiver needs to be open circuit during idle. If you have long cables to each transceiver, that will be an issue.
 
  • #8
Thanks!

A quick look at the circuit shows that the new R and L values will be half of the original values, arranged as in the image you linked.

It has solved the issue! Thank you. I feel foolish for not thinking of it earlier.

I have one more doubt about the connector impedance.

As I am using pretty low frequencies for the simulation (250/500/1000/2000 kbps), and the connector length will be no more than 0.5 mm/1 cm, does the connector impedance really affect the signal substantially?

From what I have read, it shouldn't, as the wavelength of the signal is much larger than the length of the connector. But what exactly is the reasoning behind that?
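The electrical-length argument can be made concrete. This sketch assumes a velocity of 2×10⁸ m/s (i.e. the 5 ns/m delay mentioned earlier) and the highest bit rate in the thread; note that in practice the edges' harmonics, not the bit rate, set the real limit:

```python
# Electrical length of a short connector vs. the signal wavelength.
v = 2e8            # propagation velocity, m/s (assumed: 5 ns/m line)
f = 2e6            # 2000 kbps fundamental, Hz (harmonics run higher)

wavelength = v / f            # metres
conn_len = 0.01               # 1 cm connector, m (from the post above)
ratio = conn_len / wavelength

print(f"wavelength = {wavelength:.0f} m")
print(f"connector / wavelength = {ratio:.1e}")
```

The connector is roughly four orders of magnitude shorter than the wavelength, far below the usual ~λ/10 threshold where a discontinuity starts to matter, which is the reasoning behind "the wavelength is much larger than the connector".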
 
  • #9
I agree that the connector will probably not affect the results. You need pretty high speed signals to detect a change like that. You should probably try to simulate something like that just to get a feel for it.
 

