The electric field around a charge, or a system of charges, is defined as the force experienced per unit test charge. But won't the original field be disturbed by the test charge's own field? How, then, can the original field be measured accurately?
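For reference, the resolution usually given in textbooks is that the field is defined in the limit of a vanishingly small test charge, so the disturbance it causes goes to zero along with it:

```latex
\vec{E} = \lim_{q \to 0} \frac{\vec{F}}{q}
```

In practice one uses a test charge small enough that its back-reaction on the source charges is negligible compared to the precision of the measurement.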
Does that mean some forms of matter are continuous? Matter, as we all know, is made up of atoms, so it is discrete and discontinuous. I therefore believe that applying Calculus in areas like Gravitation, Electromagnetism and Nuclear Physics is inappropriate.
In the mathematics of Calculus, a basic requirement is that the system or function be continuous. Before it was discovered that matter is discontinuous, applying Calculus in Physics was reasonable. But why is it still applied almost everywhere in physics? Won't such applications produce...
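A rough, illustrative scale estimate (using standard textbook constants; the cubic-micrometre volume is just a hypothetical example) of why the continuum approximation is usually harmless: even a volume far smaller than anything a macroscopic field equation resolves contains tens of billions of molecules, so the "graininess" of matter is invisible at the scales where calculus is applied.

```python
# Scale check: how many molecules are in one cubic micrometre of liquid water?
AVOGADRO = 6.022e23          # molecules per mole
MOLAR_MASS_WATER = 18.0      # g/mol
DENSITY_WATER = 1.0          # g/cm^3

molecules_per_cm3 = DENSITY_WATER / MOLAR_MASS_WATER * AVOGADRO
um3_per_cm3 = (1e4) ** 3     # 1 cm = 1e4 um, so 1 cm^3 = 1e12 um^3
molecules_per_um3 = molecules_per_cm3 / um3_per_cm3

print(f"{molecules_per_um3:.2e} molecules per cubic micrometre")
```

With ~3 x 10^10 molecules in a cubic micrometre, any quantity averaged over even microscopic volumes varies smoothly for all practical purposes, which is what the continuum (calculus-based) description actually relies on.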
No, there is no other heat source. The system contains only the bulb and an object that is hotter than the bulb.
Then what about a system with two identical bodies at temperatures T and T/2?
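A sketch of the entropy bookkeeping for that case, assuming a small amount of heat Q flows from the hotter body (at T) to the colder one (at T/2):

```latex
\Delta S = \frac{Q}{T/2} - \frac{Q}{T} = \frac{Q}{T} > 0
```

Spontaneous flow from hot to cold increases the total entropy, consistent with the second law; the reverse direction would give \(\Delta S < 0\) and is therefore forbidden for a spontaneous process.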
The second law of thermodynamics states that heat does not flow spontaneously from a cold body to a hot one. But a cool fluorescent bulb is perfectly capable of heating something that started out warmer than the bulb itself. Isn't this a contradiction of the law? :confused: