Bararontok:
Digital signaling uses an almost instantaneous transition between a set power level and 0 W to represent the '1's and '0's of binary data. Given that, how does a computer determine when a file has finished being copied to its storage device? Even after transmission stops, the line would drop to and remain at 0 W, so what prevents the computer from mistaking that idle 0 W level for a continuous stream of '0' bits and saving '0' bits to the storage device endlessly?
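One common way this ambiguity is resolved is framing: the data is wrapped with explicit boundary information (such as a length header) so the receiver knows exactly how many bits belong to the message and treats everything after that as an idle line. The sketch below is a simplified illustration of length-prefixed framing in Python, not the actual protocol any particular bus or storage device uses; the `send`/`receive` names and the 4-byte header are assumptions for the example.

```python
import struct
import io

def send(payload: bytes) -> bytes:
    # Prefix the payload with a 4-byte big-endian length field so the
    # receiver knows exactly how many bytes belong to the message.
    return struct.pack(">I", len(payload)) + payload

def receive(stream: io.BytesIO) -> bytes:
    # Read the 4-byte length header first...
    (length,) = struct.unpack(">I", stream.read(4))
    # ...then read exactly that many bytes and stop, even though the
    # line afterwards idles at a constant level (endless zero bytes).
    return stream.read(length)

# A payload that itself ends in zero bytes, followed by an "idle line"
# of more zeros: without the header the two would be indistinguishable.
payload = b"file data\x00\x00"
wire = send(payload) + b"\x00" * 64  # trailing zeros model the idle 0 W line
assert receive(io.BytesIO(wire)) == payload
```

Real links solve the related clock problem differently (line codes such as Manchester encoding guarantee transitions even during long runs of zeros), but the "when does the data end" question is answered by framing and length metadata, as sketched here.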