It really comes down to your definition of "real time" with respect to your expectations of the timely functioning of the entire computing system (hardware plus software).
Let's say you're coding a word processing (WP) program for Windows Vista (essentially an OS based on Windows NT), as I am at this moment. The time interval one would expect the system processor (the CPU) to need to react to user input (typing) would be on the order of tens or hundreds of milliseconds (the speed of the typist hitting keys).
So long as there is no significant delay between hitting a key and the character appearing on the screen (no perceptible latency), the system could be considered "real time": you hit a key, and a letter appears on the screen "instantly".
Now imagine you're coding a program that must monitor the activities of a military jet aircraft (an enormously complex system). Consider the hundreds or even thousands of sensor inputs (engine, airfoils, radar, weapon systems, pilot, environmental, etc.) and processor outputs that the OS must cope with to assure the timely functioning of ALL systems and keep that aircraft at peak performance. That's a lot of "mouths at the watering trough" clamoring for priority. Here, "real time" is realistically on the order of microseconds, and conceivably even nanoseconds. A completely different definition of "instantly".