Homework Statement
I have read that the phase difference between two sinusoidal signals is calculated as follows:
$$\theta = \omega \Delta T = \frac{2 \pi}{T} \Delta T$$
Where ##\Delta T## is the time difference between the two waves. This formula confuses me, as it was presented without any derivation.
I am asked to compute the phase difference between the two waveforms shown in the attached figure.
Also, by how much does wave 1 lead wave 2?
Homework Equations
$$f = \frac{1}{T}$$
$$\omega = 2 \pi f$$
The Attempt at a Solution
I am confused with the question itself. I know I am merely looking for the phase difference ##|\theta_1 - \theta_2|## between the waves.
Wave 1 appears to be an ordinary sine wave (##v_1 = \sin(\omega t)##). Wave 2 lags behind wave 1, so I would write it as ##v_2 = 2\sin(\omega t - \phi)## with ##\phi > 0##.
Is that the right approach? Or do I read the period and the time shift of each wave off the graph?
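To check whether I'm applying the formula correctly, here is a minimal Python sketch. The values of ##T## and ##\Delta T## are made up for illustration, not read from the actual figure:

```python
import math

# Hypothetical values read off a graph (NOT from the actual figure):
T = 0.02          # period of both waves, in seconds
delta_t = 0.0025  # time shift between corresponding zero crossings, in seconds

# Phase difference: theta = omega * delta_t = (2*pi / T) * delta_t
theta = (2 * math.pi / T) * delta_t

print(math.degrees(theta))  # 45.0 degrees for these made-up numbers
```

If this is right, the whole problem reduces to reading ##T## and ##\Delta T## off the plot.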