Timestamping (computing)

In computing, timestamping refers to the use of an electronic timestamp to provide a temporal order among a set of events.
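
For example, attaching a clock reading to each event as it occurs allows the events to be ordered later purely by comparing their timestamps. The following minimal Python sketch is illustrative only; the event names are hypothetical and not taken from the cited sources:

```python
import time

# Illustrative sketch: tag each event with a timestamp as it occurs.
# (Event names are hypothetical; time.time() resolution may be too coarse
# to distinguish events that occur very close together.)
events = []
for name in ("request received", "cache miss", "database query", "response sent"):
    events.append({"event": name, "timestamp": time.time()})

# The timestamps alone are enough to reconstruct the temporal order.
for record in sorted(events, key=lambda r: r["timestamp"]):
    print(f"{record['timestamp']:.6f}  {record['event']}")
```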

Timestamping techniques are used in a variety of computing fields, from network management and computer security to concurrency control.[1][2] For instance, a heartbeat network uses timestamping to monitor the nodes on a high-availability computer cluster.[3]
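
As a sketch of the heartbeat case (not taken from the cited sources; the node names and timeout value are assumptions), a monitor can record the timestamp of each node's most recent heartbeat and flag any node whose timestamp has become too old:

```python
import time

HEARTBEAT_TIMEOUT = 5.0  # seconds; an assumed threshold for this sketch

# Map each node to the timestamp of its most recently received heartbeat.
last_heartbeat = {"node-a": time.time(), "node-b": time.time() - 12.0}

def record_heartbeat(node):
    """Refresh the node's timestamp whenever a heartbeat message arrives."""
    last_heartbeat[node] = time.time()

def suspected_failures(now=None):
    """Return nodes whose last heartbeat is older than the timeout."""
    now = time.time() if now is None else now
    return [node for node, ts in last_heartbeat.items()
            if now - ts > HEARTBEAT_TIMEOUT]

print(suspected_failures())  # ['node-b'] with the example data above
```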

Timestamping computer files (updating the modification time stored in each file's metadata whenever the file is changed) makes efficient build automation possible: a tool such as make rebuilds a target only when at least one of its source files carries a newer timestamp than the target.
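
A minimal sketch of that rule follows; the file names and the needs_rebuild helper are hypothetical and not part of any particular build tool:

```python
import os
import tempfile
import time

def needs_rebuild(target, sources):
    """Rebuild the target if it is missing or older than any of its sources,
    judged purely by filesystem modification timestamps (the rule make uses)."""
    if not os.path.exists(target):
        return True
    target_mtime = os.path.getmtime(target)
    return any(os.path.getmtime(src) > target_mtime for src in sources)

# Self-contained demonstration with temporary files (names are illustrative).
with tempfile.TemporaryDirectory() as workdir:
    source = os.path.join(workdir, "main.c")
    target = os.path.join(workdir, "app")
    open(source, "w").close()                 # source file created first
    time.sleep(0.01)
    open(target, "w").close()                 # target built afterwards
    print(needs_rebuild(target, [source]))    # False: target is up to date
    time.sleep(0.01)
    os.utime(source)                          # editing the source bumps its timestamp
    print(needs_rebuild(target, [source]))    # True: source is now newer than target
```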

References

  1. ^ Kim, Tai-hoon; Adeli, Hojjat (2010). Advances in Computer Science and Information Technology. ISBN 3642135765. p. 183.
  2. ^ Berry, Gérard; Comon, Hubert; Finkel, A. (2001). Computer Aided Verification: 13th International Conference. ISBN 3540423451. p. 423.
  3. ^ Nikoletseas, Sotiris; Rolim, José D. P. (2011). Theoretical Aspects of Distributed Computing in Sensor Networks. ISBN 3642148484. p. 304.