Definition - What does Computer Simulation mean?
A computer simulation is the use of a computer to imitate a real-world process or system: the dynamic responses of one system are represented by the behavior of another system that is modeled on it. A simulation requires a model, a mathematical description of the real system, expressed as a computer program that captures the key characteristics or behaviors of the selected system. The model represents the system itself, while the simulation depicts the operation of that system over time.
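The idea above can be made concrete with a minimal sketch. Here the "model" is a hypothetical illustrative one, Newton's law of cooling (dT/dt = -k · (T - T_ambient)), and the "simulation" simply steps that model forward in time using the Euler method; the function name and parameters are invented for this example.

```python
def simulate_cooling(t_initial, t_ambient, k, dt, steps):
    """Return the temperature at each time step (simple Euler method).

    The model: dT/dt = -k * (T - T_ambient). The simulation applies it
    repeatedly over small time increments dt to depict the system in time.
    """
    temps = [t_initial]
    temp = t_initial
    for _ in range(steps):
        temp += -k * (temp - t_ambient) * dt  # advance the model one step
        temps.append(temp)
    return temps

# Simulate a hot drink (90 degrees) cooling in a 20-degree room.
history = simulate_cooling(t_initial=90.0, t_ambient=20.0, k=0.1, dt=1.0, steps=60)
print(round(history[-1], 2))  # temperature after 60 simulated time units
```

The list `history` is the simulated behavior: the temperature falls toward the ambient value, just as the real drink would, without any physical experiment.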
Techopedia explains Computer Simulation
Computer simulations are used to study dynamic behavior in environments that would be difficult or dangerous to reproduce in real life. For example, a nuclear blast may be represented with a mathematical model that accounts for variables such as velocity, heat and radioactive emissions. The equations can then be modified by changing other variables, such as the amount of fissionable material used in the blast.
Simulations are especially helpful for determining how a system behaves when individual components are altered. They are also used in engineering to predict potential effects, such as the impact of constructing a dam on a river system.
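The "what if" experimentation described above can be sketched as re-running the same model while altering a single variable and comparing outcomes. The model here, exponential decay of some material over time, is purely illustrative (not a real blast or river model), and the function and parameter names are assumptions made for this example.

```python
import math

def remaining(initial_amount, decay_rate, t):
    """Toy model: amount of material remaining after time t,
    following exponential decay N(t) = N0 * exp(-rate * t)."""
    return initial_amount * math.exp(-decay_rate * t)

# Re-run the simulation, altering only the decay rate each time,
# to see how that one component changes the system's behavior.
for rate in (0.1, 0.2, 0.4):
    print(rate, round(remaining(100.0, rate, 10.0), 2))
```

Each run is cheap to repeat, which is exactly why simulations are preferred for exploring alternatives before committing to a physical design.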