Real-Time Computing (RTC)
Definition - What does Real-Time Computing (RTC) mean?
Real-time computing (RTC) is a term for computing practices that operate under specific time constraints. Real-time computing has to be done in a time frame that is effectively imperceptible to the user. By contrast, other types of computing can be done on a delayed basis, for instance, where information is aggregated and stored for later processing.
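The idea of a time constraint can be sketched in code. The following is a minimal, hypothetical example (the 50 ms budget is an illustrative assumption, not a standard value): a task counts as meeting its real-time constraint only if it finishes within the deadline.

```python
import time

# Illustrative deadline: roughly the threshold below which a delay
# is imperceptible to a user. This value is an assumption.
DEADLINE_SECONDS = 0.050  # 50 ms

def run_with_deadline(task, deadline=DEADLINE_SECONDS):
    """Run task() and report whether it met its time constraint."""
    start = time.monotonic()
    result = task()
    elapsed = time.monotonic() - start
    return result, elapsed, elapsed <= deadline

# Usage: a trivially fast computation easily meets the deadline.
result, elapsed, met = run_with_deadline(lambda: sum(range(1000)))
```

A batch-oriented system, by contrast, would simply enqueue the task and process it whenever convenient, with no deadline check at all.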
Techopedia explains Real-Time Computing (RTC)
One of the best ways to explain real-time computing is with an example such as the “form load” command. Something like this is almost always done in real time: when a user clicks a command to open the program, the form opens immediately. Under optimal conditions, with adequate bandwidth for Web-delivered systems, sufficient memory and a fast CPU, the form appears in a split second. In other cases there may be delays, but this still counts as real-time computing — it is computing that, when triggered on command, is designed to complete almost immediately.
Real-time computing is a design consideration that developers and engineers must weigh when they decide how a program will work. Which parts of the program will use real-time computing? In other words, which user-driven events will happen immediately on command? Another good example is a high-level computing task, such as a command-driven program that looks for discrepancies in text or numbers, or performs complex calculations. Given the power of modern computer hardware, many of these programs can be built for real-time computing, where results come back almost as soon as the user issues the command. The same is true for rendering complex graphics, sorting data or doing other high-level computation.
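The design decision described above can be sketched as a simple dispatcher. This is a hypothetical illustration (the event names and handlers are assumptions, not part of any real API): some user-driven events are handled immediately, while others are only queued for later batch processing.

```python
# Hypothetical sketch: route each user-driven event either to an
# immediate (real-time) handler or to a batch queue for later work.

pending_batch = []

def handle_open_form(payload):
    # Real-time path: computed and returned immediately on command.
    return f"form '{payload}' opened"

def queue_for_report(payload):
    # Delayed path: just aggregate; processing happens in a later pass.
    pending_batch.append(payload)
    return "queued"

REAL_TIME_EVENTS = {"open_form": handle_open_form}
BATCH_EVENTS = {"log_usage": queue_for_report}

def dispatch(event, payload):
    if event in REAL_TIME_EVENTS:
        return REAL_TIME_EVENTS[event](payload)  # happens immediately
    return BATCH_EVENTS[event](payload)          # deferred for later
```

Which events go in which table is exactly the choice the developer makes when deciding what parts of a program must be real-time.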