A timestamp is a sequence of characters or encoded information identifying when a certain event occurred, usually giving date and time of day, sometimes accurate to a small fraction of a second. Timestamps do not have to be based on an absolute notion of time, however: they can use any epoch, measuring time relative to an arbitrary reference point, such as the power-on time of a system or some fixed moment in the past.
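To illustrate the difference, here is a minimal Python sketch contrasting an absolute timestamp (seconds since the Unix epoch) with a relative one taken from a monotonic clock, whose epoch is an arbitrary, platform-dependent reference point (often close to system boot):

```python
import time

# Absolute timestamp: seconds since the Unix epoch (1970-01-01 00:00:00 UTC).
absolute = time.time()

# Relative timestamp: seconds since an arbitrary epoch; only differences
# between two readings of the same clock are meaningful.
start = time.monotonic()
time.sleep(0.1)
elapsed = time.monotonic() - start

print(f"absolute: {absolute:.0f} s since the Unix epoch")
print(f"relative: {elapsed:.3f} s elapsed")
```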
A distinction is sometimes made between the terms datestamp, timestamp and date-timestamp: a datestamp records only the calendar date, a timestamp in the narrow sense records only the time of day, and a date-timestamp records both. In everyday usage, however, "timestamp" is applied to all three.
The term "timestamp" derives from rubber stamps used in offices to stamp the current date, and sometimes time, in ink on paper documents, to record when the document was received. Common examples of this type of timestamp are a postmark on a letter or the "in" and "out" times on a time card.
With the advent of digital data systems, the term has expanded to refer to digital date and time information attached to digital data. For example, computer files contain timestamps that tell when the file was last modified, and digital cameras add timestamps to the pictures they take, recording the date and time the picture was taken.
This data is usually presented in a consistent format, allowing for easy comparison of two different records and tracking progress over time; the practice of recording timestamps in a consistent manner along with the actual data is called timestamping.[1]
Timestamps are typically used for logging events or in a sequence of events (SOE), in which case each event in the log or SOE is marked with a timestamp.
Practically all computer file systems store one or more timestamps in the per-file metadata. In particular, most modern operating systems support the POSIX stat() system call, so each file has three timestamps associated with it: time of last access (atime, shown by ls -lu), time of last modification (mtime, shown by ls -l), and time of last status change (ctime, shown by ls -lc).
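As a sketch, the three POSIX timestamps can be read in Python with os.stat; the script inspects its own source file so it runs as-is (note that on Windows, st_ctime historically reports creation time rather than status-change time):

```python
import os
from datetime import datetime, timezone

st = os.stat(__file__)  # stat this script itself so the example is self-contained

# The three POSIX timestamps, as seconds since the Unix epoch.
for label, ts in (("atime (last access)", st.st_atime),
                  ("mtime (last modification)", st.st_mtime),
                  ("ctime (last status change)", st.st_ctime)):
    print(label, datetime.fromtimestamp(ts, tz=timezone.utc).isoformat())
```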
Some file archivers and some version control software, when copying a file from a remote computer to the local computer, adjust the timestamps of the local file to show the date/time when that file was created or modified on the remote computer, rather than the date/time when the file was copied to the local computer.
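A minimal sketch of that behavior in Python, using hypothetical file names ("remote_copy.txt" standing in for the fetched file), copies the contents and then carries the source's access and modification times over to the copy:

```python
import os
import shutil

# "remote_copy.txt" and "local_copy.txt" are hypothetical paths.
shutil.copy("remote_copy.txt", "local_copy.txt")

# Carry the source's access and modification times over to the copy,
# as an archiver or version control client might.
src = os.stat("remote_copy.txt")
os.utime("local_copy.txt", (src.st_atime, src.st_mtime))
```

In practice, shutil.copy2 performs both steps in a single call.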
In practice, recorded timestamps are often inaccurate ("dirty"). Without cleaning up such timestamps, time-related applications such as provenance analysis or pattern queries are not reliable. To evaluate the correctness of timestamps, temporal constraints can be applied, declaring distance limits between timestamps.[2]
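As an illustrative sketch (the predicate and its limits are invented for this example, not taken from the cited work), a temporal constraint of this kind can be expressed as a minimum and maximum allowed distance between two timestamps:

```python
from datetime import datetime, timedelta

def violates_constraint(earlier: datetime, later: datetime,
                        min_gap: timedelta, max_gap: timedelta) -> bool:
    """Flag a pair of timestamps whose distance falls outside
    the declared [min_gap, max_gap] limits."""
    gap = later - earlier
    return not (min_gap <= gap <= max_gap)

# Constraint: a shipment is delivered between 1 hour and 30 days after ordering.
ordered   = datetime(2024, 10, 1, 9, 0)
delivered = datetime(2024, 10, 1, 9, 5)   # suspiciously fast: likely dirty data
print(violates_constraint(ordered, delivered,
                          timedelta(hours=1), timedelta(days=30)))  # True
```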
ISO 8601 standardizes the representation of dates and times.[3] These standard representations are often used to construct timestamp values.
Examples of date-timestamps: 2024-10-05T14:30:00Z (ISO 8601), or Sat Oct 5 14:30:00 2024 (Unix asctime format).
Examples of datestamps: 2024-10-05 (ISO 8601), or 5 October 2024.
Examples of timestamps: 14:30:00Z (ISO 8601), or 2:30 p.m. UTC.
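Values in each of these three forms can be produced directly from Python's datetime module; a small sketch:

```python
from datetime import datetime, timezone

now = datetime.now(timezone.utc)

print(now.isoformat(timespec="seconds"))           # date-timestamp, e.g. 2024-10-05T14:30:00+00:00
print(now.date().isoformat())                      # datestamp,      e.g. 2024-10-05
print(now.timetz().isoformat(timespec="seconds"))  # timestamp,      e.g. 14:30:00+00:00
```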
A calendar date is a reference to a particular day represented within a calendar system. The calendar date allows the specific day to be identified. The number of days between two dates may be calculated. For example, "25 November 2024" is ten days after "15 November 2024". The date of a particular event depends on the observed time zone. For example, the air attack on Pearl Harbor that began at 7:48 a.m. Hawaiian time on 7 December 1941 took place at 3:18 a.m. Japan Standard Time, 8 December in Japan.
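Both points, day arithmetic and the time-zone dependence of the calendar date, can be checked with Python's datetime module (fixed-offset zones are used here for simplicity; Hawaii observed UTC-10:30 in 1941):

```python
from datetime import date, datetime, timedelta, timezone

# "25 November 2024" is ten days after "15 November 2024".
print(date(2024, 11, 25) - date(2024, 11, 15))  # 10 days, 0:00:00

# The same instant falls on different calendar dates in different zones:
# 07:48 on 7 December 1941 in Hawaii (then UTC-10:30) ...
hawaii = timezone(timedelta(hours=-10, minutes=-30))
attack = datetime(1941, 12, 7, 7, 48, tzinfo=hawaii)

# ... is 03:18 on 8 December in Japan (UTC+9).
jst = timezone(timedelta(hours=9))
print(attack.astimezone(jst))  # 1941-12-08 03:18:00+09:00
```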
ISO 8601 is an international standard covering the worldwide exchange and communication of date and time-related data. It is maintained by the International Organization for Standardization (ISO) and was first published in 1988, with updates in 1991, 2000, 2004, and 2019, and an amendment in 2022. The standard provides a well-defined, unambiguous method of representing calendar dates and times in worldwide communications, especially to avoid misinterpreting numeric dates and times when such data is transferred between countries with different conventions for writing numeric dates and times.
ISO 9660 is a file system for optical disc media. The file system is an international standard available from the International Organization for Standardization (ISO). Since the specification is available for anybody to purchase, implementations have been written for many operating systems.
A time zone is an area which observes a uniform standard time for legal, commercial and social purposes. Time zones tend to follow the boundaries between countries and their subdivisions instead of strictly following longitude, because it is convenient for areas in frequent communication to keep the same time.
In communications messages, a date-time group (DTG) is a set of characters, usually in a prescribed format, used to express the year, the month, the day of the month, the hour of the day, the minute of the hour, and the time zone, if different from Coordinated Universal Time (UTC). The order in which these elements are presented may vary. The DTG is usually placed in the header of the message. One example is "23:24 Sep 26, 2024 (UTC)"; while another example is "23:24 26 Sep 2024".
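For illustration, both DTG layouts quoted above can be reproduced with strftime (the format strings are assumptions chosen to match the examples, not a prescribed standard):

```python
from datetime import datetime, timezone

msg_time = datetime(2024, 9, 26, 23, 24, tzinfo=timezone.utc)

# Two of the date-time group layouts mentioned above:
print(msg_time.strftime("%H:%M %b %d, %Y (%Z)"))  # 23:24 Sep 26, 2024 (UTC)
print(msg_time.strftime("%H:%M %d %b %Y"))        # 23:24 26 Sep 2024
```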
In computing, endianness is the order in which the bytes within a word of digital data are transmitted over a data communication medium or addressed in computer memory; it relates the significance of each byte to its position. Endianness is primarily expressed as big-endian (BE), most significant byte first, or little-endian (LE), least significant byte first; the terms were introduced into computer science by Danny Cohen in an Internet Experiment Note published in 1980. The adjective endian has its origin in the writings of the 18th-century Anglo-Irish writer Jonathan Swift: in the 1726 novel Gulliver's Travels, he portrays the conflict between sects of Lilliputians divided into those breaking the shell of a boiled egg from the big end or from the little end. By analogy, a CPU may read a digital word big end first, or little end first.
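Endianness matters whenever a binary timestamp is exchanged between systems; a short sketch serializes the same 32-bit Unix timestamp in both byte orders:

```python
import struct

# The same 32-bit Unix timestamp serialized with each byte order.
ts = 1262304000  # 2010-01-01 00:00:00 UTC

print(struct.pack(">I", ts).hex())  # big-endian:    4b3d3b00
print(struct.pack("<I", ts).hex())  # little-endian: 003b3d4b
```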
The byte-order mark (BOM) is a particular usage of the special Unicode character code, U+FEFF ZERO WIDTH NO-BREAK SPACE, whose appearance as a magic number at the start of a text stream can signal several things to a program reading the text: the byte order of the stream, the fact that the stream's encoding is Unicode (to a high level of confidence), and which Unicode encoding is used.
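A minimal sketch of BOM sniffing in Python; the UTF-32 marks must be tested before the UTF-16 marks because the UTF-32 little-endian BOM begins with the same two bytes as the UTF-16 one:

```python
import codecs

BOMS = [
    (codecs.BOM_UTF8, "UTF-8"),
    (codecs.BOM_UTF32_LE, "UTF-32 little-endian"),  # check before UTF-16 LE
    (codecs.BOM_UTF32_BE, "UTF-32 big-endian"),
    (codecs.BOM_UTF16_LE, "UTF-16 little-endian"),
    (codecs.BOM_UTF16_BE, "UTF-16 big-endian"),
]

def sniff_bom(data: bytes) -> str:
    """Guess the Unicode encoding from a leading byte-order mark, if any."""
    for bom, name in BOMS:
        if data.startswith(bom):
            return name
    return "no BOM found"

print(sniff_bom(b"\xef\xbb\xbfhello"))   # UTF-8
print(sniff_bom(b"\xff\xfeh\x00i\x00"))  # UTF-16 little-endian
```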
A Universally Unique Identifier (UUID) is a 128-bit label used to uniquely identify objects in computer systems. The term Globally Unique Identifier (GUID) is also used, mostly in Microsoft systems.
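UUIDs connect back to timestamps: a version-1 UUID embeds a 60-bit timestamp counting 100-nanosecond intervals since the start of the Gregorian calendar, 15 October 1582. A small Python sketch recovers it:

```python
import uuid
from datetime import datetime, timedelta, timezone

u = uuid.uuid1()  # version-1 UUIDs embed a 60-bit timestamp

# The timestamp counts 100 ns intervals since 1582-10-15 00:00:00 UTC;
# integer division keeps the arithmetic exact at microsecond resolution.
gregorian_epoch = datetime(1582, 10, 15, tzinfo=timezone.utc)
created = gregorian_epoch + timedelta(microseconds=u.time // 10)
print(created)  # approximately the current UTC time
```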
A text file is a kind of computer file that is structured as a sequence of lines of electronic text. A text file exists stored as data within a computer file system.
The modern 24-hour clock is the convention of timekeeping in which the day runs from midnight to midnight and is divided into 24 hours. This is indicated by the hours passed since midnight, from 00(:00) to 23(:59), with 24(:00) as an option to indicate the end of the day. This system, as opposed to the 12-hour clock, is the most commonly used time notation in the world today, and is used by the international standard ISO 8601.
The year 2038 problem is a time computing problem that leaves some computer systems unable to represent times after 03:14:07 UTC on 19 January 2038.
Unix time is a date and time representation widely used in computing. It measures time by the number of non-leap seconds that have elapsed since 00:00:00 UTC on 1 January 1970, the Unix epoch. For example, at midnight UTC on 1 January 2010, Unix time was 1262304000.
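Both figures can be verified in a few lines of Python: the 2010 value from the definition of Unix time, and the 2038 limit from the largest value a signed 32-bit counter can hold:

```python
from datetime import datetime, timezone

epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)

# 1 January 2010 is 14,610 days (40 years plus 10 leap days) after the epoch.
print(int((datetime(2010, 1, 1, tzinfo=timezone.utc) - epoch).total_seconds()))
# 1262304000

# A signed 32-bit counter of seconds overflows one second after 2**31 - 1:
print(datetime.fromtimestamp(2**31 - 1, tz=timezone.utc))
# 2038-01-19 03:14:07+00:00
```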
In computer science and computer programming, system time represents a computer system's notion of the passage of time. In this sense, time also includes the passing of days on the calendar.
A digital calendar is a collaborative or personal time management software with a calendar that can be used to keep track of planned events. The calendar can also contain an appointment book, address book or contact list. Common features of digital calendars are that users can create, edit and delete events, set reminders for upcoming events, and share calendars with other users.
In computer science, data type limitations and software bugs can cause errors in time and date calculation or display. These are most commonly manifestations of arithmetic overflow, but can also be the result of other issues. The most well-known consequence of this type is the Y2K problem, but many other milestone dates or times exist that have caused or will cause problems depending on various programming deficiencies.
In computing, an epoch is a fixed date and time used as a reference from which a computer measures system time. Most computer systems represent time as a number of units elapsed since a particular arbitrary date and time. For instance, Unix and POSIX measure time as the number of seconds that have passed since Thursday 1 January 1970 00:00:00 UT, a point in time known as the Unix epoch. The C# programming language and Windows NT systems up to and including Windows 11 and Windows Server 2022 measure time as the number of 100-nanosecond intervals that have passed since 00:00:00 UTC on 1 January in the years AD 1 and AD 1601, respectively, making those points in time the epochs for those systems. Computing epochs are almost always specified as midnight Universal Time on some particular date.
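The relationship between these epochs can be made concrete with a small sketch that counts 100-nanosecond intervals from each reference point (integer arithmetic avoids floating-point rounding at this scale):

```python
from datetime import datetime, timezone

unix_epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)
dotnet_epoch = datetime(1, 1, 1, tzinfo=timezone.utc)       # C# DateTime ticks
filetime_epoch = datetime(1601, 1, 1, tzinfo=timezone.utc)  # Windows FILETIME

def to_ticks(t: datetime, epoch: datetime) -> int:
    """Number of 100 ns intervals between an epoch and a given instant."""
    delta = t - epoch
    # Build the count from integer fields to keep the arithmetic exact.
    return (delta.days * 86_400 + delta.seconds) * 10_000_000 + delta.microseconds * 10

t = datetime(2024, 10, 5, tzinfo=timezone.utc)
print(to_ticks(t, unix_epoch) // 10_000_000)  # Unix time, in whole seconds
print(to_ticks(t, dotnet_epoch))              # .NET-style ticks
print(to_ticks(t, filetime_epoch))            # FILETIME-style intervals
```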
Date and time notation in the United States differs from that used in nearly all other countries. It is inherited from one historical branch of conventions from the United Kingdom. American styles of notation have also influenced customs of date notation in Canada, creating confusion in international commerce.
The European Committee for Standardization (CEN) and the European Committee for Electrotechnical Standardization (CENELEC) adopted ISO 8601 as EN 28601, now EN ISO 8601. As a European Norm, CEN and CENELEC member states are obligated to adopt the standard as a national standard without alterations.
Date and time notation in the Philippines varies across the country, with several customary formats in use. Some government agencies have adopted a time and date representation standard based on ISO 8601, notably on the Philippine driver's license and the Unified Multi-Purpose ID.
Date and time notation in New Zealand most commonly records the date using the day-month-year format, while the ISO 8601 format (2024-11-16) is increasingly used for all-numeric dates, such as date of birth. The time can be written using either the 12-hour clock or the 24-hour clock (22:16).
3.5 Expansion … By mutual agreement of the partners in information interchange, it is permitted to expand the component identifying the calendar year, which is otherwise limited to four digits. This enables reference to dates and times in calendar years outside the range supported by complete representations, i.e. before the start of the year [0000] or after the end of the year [9999].