Definition - What does Timestamp mean?
A timestamp is temporal information regarding an event that is recorded by the computer and then stored as a log or metadata. Any event or activity could have a timestamp recorded, depending on the needs of the user or the capabilities of the process creating the timestamp.
Techopedia explains Timestamp
Timestamps are an essential feature for most computer-related processes, especially for synchronization purposes. For example, timestamps on files that require backups let the backup mechanism compare the backed-up copy with the current file and determine, via the date-modified timestamp, whether the file has changed since the last backup.
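The backup comparison described above can be sketched in Python using the file system's modification timestamp. This is a minimal illustration, not a production backup tool; the function name and file paths are assumptions for the example.

```python
import os
import shutil

def backup_if_newer(src: str, dst: str) -> bool:
    """Refresh the backup copy only when the source file's
    date-modified timestamp is newer than the backup's."""
    # If no backup exists yet, or the source was modified more
    # recently than the backup, copy it over.
    if not os.path.exists(dst) or os.path.getmtime(src) > os.path.getmtime(dst):
        shutil.copy2(src, dst)  # copy2 also preserves the timestamps
        return True
    return False
```

Because `copy2` preserves timestamps, a second call with an unchanged source does nothing, which is exactly how an incremental backup avoids redundant work.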
Events whose timestamps are recorded automatically by the operating system include file creation and file modification; these can be checked by looking at the properties of the file. In debug logs created by servers, or while debugging a program, each event is logged with a timestamp so that the administrator or debugger can immediately see what happened and when.
Timestamps are also essential for synchronizing a number of processes, such as IP telephony, where each packet sent must contain a timestamp so that the receiving end knows how to order the data before reassembling it. The same applies to some media streaming protocols.
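The packet-ordering idea above can be sketched as follows: packets may arrive out of order over an IP network, and the receiver sorts them by timestamp before playback. The `Packet` class and payload values are assumptions for illustration, loosely modeled on how RTP uses media timestamps.

```python
from dataclasses import dataclass

@dataclass
class Packet:
    timestamp: int   # e.g. an RTP-style media timestamp
    payload: str

def reassemble(packets: list) -> list:
    """Restore presentation order by sorting on each packet's timestamp."""
    return [p.payload for p in sorted(packets, key=lambda p: p.timestamp)]

# Packets often arrive out of order over an IP network.
arrived = [Packet(3, "C"), Packet(1, "A"), Packet(2, "B")]
```

Real streaming stacks also use the timestamps to schedule playback and to detect missing packets, but sorting by timestamp is the core reordering step.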