In computing, an epoch is a fixed date and time used as a reference from which a computer measures system time. Most computer systems determine time as a number representing the seconds elapsed since a particular arbitrary date and time. For instance, Unix and POSIX measure time as the number of seconds that have passed since Thursday 1 January 1970 00:00:00 UT, a point in time known as the Unix epoch. The C# programming language and Windows NT systems up to and including Windows 11 and Windows Server 2022 measure time as the number of 100-nanosecond intervals that have passed since 00:00:00 UTC on 1 January in the years AD 1 and AD 1601, respectively, making those points in time the epochs for those systems. [1] [2] Computing epochs are almost always specified as midnight Universal Time on some particular date.
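Because each of these representations counts fixed-size units from a fixed reference point, converting between them is a matter of applying a constant offset and a unit change. The following Python sketch is illustrative only (the function name and constants are not any system's official API); it converts a Unix timestamp into the 100-nanosecond units that Windows counts from 1601:

```python
from datetime import datetime, timezone

UNIX_EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)
NT_EPOCH = datetime(1601, 1, 1, tzinfo=timezone.utc)   # Windows FILETIME epoch

def unix_to_filetime(unix_seconds: float) -> int:
    """Convert seconds since the Unix epoch into Windows FILETIME units
    (100-nanosecond intervals since 1 January 1601). Illustrative sketch only."""
    offset = (UNIX_EPOCH - NT_EPOCH).total_seconds()   # 11,644,473,600 seconds
    return int((unix_seconds + offset) * 10_000_000)

print(unix_to_filetime(0))  # 116444736000000000 -> the Unix epoch in FILETIME units
```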
Software timekeeping systems vary widely in the resolution of time measurement; some systems may use time units as large as a day, while others may use nanoseconds. For example, for an epoch date of midnight UTC (00:00) on 1 January 1900, and a time unit of a second, the time of the midnight (24:00) between 1 January 1900 and 2 January 1900 is represented by the number 86400, the number of seconds in one day. When times prior to the epoch need to be represented, it is common to use the same system, but with negative numbers.
Such representation of time is mainly for internal use. On systems where date and time are important in the human sense, software will almost always convert this internal number into a date and time representing a human calendar.
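As a minimal sketch of this arithmetic, the following Python snippet assumes the epoch used in the example above (midnight UTC on 1 January 1900) and converts internal second counts, including a negative one, back into calendar dates:

```python
from datetime import datetime, timedelta, timezone

epoch = datetime(1900, 1, 1, tzinfo=timezone.utc)   # the example epoch from the text

# 86400 seconds after the epoch is the midnight that begins 2 January 1900.
print(epoch + timedelta(seconds=86_400))    # 1900-01-02 00:00:00+00:00

# Times before the epoch are represented with negative counts.
print(epoch + timedelta(seconds=-86_400))   # 1899-12-31 00:00:00+00:00
```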
Computers do not generally store arbitrarily large numbers. Instead, each number stored by a computer is allotted a fixed amount of space. Therefore, when the number of time units that have elapsed since a system's epoch exceeds the largest number that can fit in the space allotted to the time representation, the time representation overflows, and problems can occur. While a system's behavior after overflow occurs is not necessarily predictable, in most systems the number representing the time will reset to zero, and the computer system will think that the current time is the epoch time again.
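As an illustration of the wrap-around behaviour described above (a sketch, not a model of any particular system), the following Python snippet simulates an unsigned 32-bit second counter rolling over to zero:

```python
BITS = 32
MODULUS = 2**BITS   # an unsigned 32-bit counter can hold 2**32 distinct values

def stored_value(seconds: int) -> int:
    """Simulate storing a second count in a fixed-width unsigned register."""
    return seconds % MODULUS

print(stored_value(2**32 - 1))  # 4294967295 -> the last representable second
print(stored_value(2**32))      # 0 -> the counter has wrapped back to the epoch
```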
Most famously, older systems that counted time as the number of years elapsed since the epoch of 1 January 1900, and which allotted only enough space to store the numbers 0 through 99, experienced the Year 2000 problem. These systems (if not corrected beforehand) would interpret the date 1 January 2000 as 1 January 1900, leading to unpredictable errors at the beginning of the year 2000.
Even systems that allocate more storage to the time representation are not immune from this kind of error. Many Unix-like operating systems, which keep time as seconds elapsed since the epoch of 1 January 1970 and allot timekeeping only enough storage to hold numbers as large as 2,147,483,647 (the maximum value of a signed 32-bit integer), will experience an overflow problem on 19 January 2038. This is known as the Year 2038 problem.
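The cutoff moment follows directly from the epoch and the integer width; a short, illustrative Python check reproduces the 2038 date:

```python
from datetime import datetime, timedelta, timezone

unix_epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)
max_int32 = 2**31 - 1   # 2147483647, the largest signed 32-bit value

print(unix_epoch + timedelta(seconds=max_int32))
# 2038-01-19 03:14:07+00:00 -> the last moment a signed 32-bit second count can represent
```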
Other, more subtle timekeeping problems exist in computing, such as accounting for leap seconds, which are not observed with any predictability or regularity. Additionally, applications that need to represent historical dates and times (for example, a date prior to the switch from the Julian calendar to the Gregorian calendar) must use specialized timekeeping libraries.
Finally, some software must maintain compatibility with older software that does not keep time in strict accordance with traditional timekeeping systems. For example, Microsoft Excel observes the fictional date of 29 February 1900 in order to maintain bug compatibility with older versions of Lotus 1-2-3. [3] Lotus 1-2-3 observed the date due to an error; by the time the error was discovered, it was too late to fix it—"a change now would disrupt formulas which were written to accommodate this anomaly". [4]
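The practical consequence is that serial day numbers in the 1900 date system do not map cleanly onto real calendar dates around the fictional leap day. The following Python sketch of that mapping is illustrative only; the function name and cutoffs are not Excel's own API:

```python
from datetime import date, timedelta

def excel_serial_to_date(serial: int) -> date:
    """Convert an Excel 1900-system serial number to a real calendar date.
    Serial 60, the fictional 29 February 1900, has no real equivalent."""
    if serial < 60:
        return date(1899, 12, 31) + timedelta(days=serial)   # serial 1 -> 1 January 1900
    if serial == 60:
        raise ValueError("serial 60 is the fictional 29 February 1900")
    # From 1 March 1900 onward the off-by-one leap-year error cancels out.
    return date(1899, 12, 30) + timedelta(days=serial)

print(excel_serial_to_date(61))  # 1900-03-01
```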
There are at least six satellite navigation systems, all of which function by transmitting time signals. Of the only two satellite systems with global coverage, GPS calculates its time signal from an epoch, whereas GLONASS calculates time as an offset from UTC, with the UTC input adjusted for leap seconds. Of the only two other systems aiming for global coverage, Galileo calculates from an epoch and BeiDou calculates from UTC without adjustment for leap seconds. [5] GPS also transmits the offset between UTC time and GPS time and must update this offset every time there is a leap second, requiring GPS receiving devices to handle the update correctly. In contrast, leap seconds are transparent to GLONASS users. The complexities of calculating UTC from an epoch are explained by the European Space Agency in Galileo documentation under "Equations to correct system timescale to reference timescale". [6]
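A receiver converting GPS time to UTC therefore combines epoch arithmetic with the broadcast leap-second offset. The sketch below is illustrative only; `leap_offset` stands in for the value a real receiver reads from the navigation message (18 seconds since the leap second at the end of 2016):

```python
from datetime import datetime, timedelta, timezone

GPS_EPOCH = datetime(1980, 1, 6, tzinfo=timezone.utc)   # the first GPS week begins here

def gps_to_utc(week: int, seconds_of_week: float, leap_offset: int = 18) -> datetime:
    """Convert a GPS week number and seconds-of-week into UTC.
    leap_offset is the GPS-UTC offset broadcast by the satellites; sketch only."""
    return GPS_EPOCH + timedelta(weeks=week, seconds=seconds_of_week - leap_offset)

print(gps_to_utc(0, 0, leap_offset=0))  # 1980-01-06 00:00:00+00:00, the GPS epoch itself
```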
The following table lists epoch dates used by popular software and other computer-related systems. The time in these systems is stored as the quantity of a particular time unit (days, seconds, nanoseconds, etc.) that has elapsed since a stated time (usually midnight UTC at the beginning of the given date).
Epoch date | Notable uses | Rationale for selection
---|---|---
0 January 1 BC [nb 1] | MATLAB [7] | "Year 0" in ISO 8601 |
1 January AD 1 [nb 1] | .NET, [2] [8] Go, [9] REXX, [10] Rata Die [11] | Common Era, ISO 2014, [12] RFC 3339 [13] |
14 October 1582 | SPSS, [14] IBM z/OS Language Environment, [15] IBM AIX COBOL [16] | Same rationale as 15 October 1582 (below), but with one-based indexing |
15 October 1582 | UUID version 1 | The date of the Gregorian reform to the Christian calendar. [17] |
1 January 1601 | NTFS, COBOL, [18] Win32/Win64 (NT time epoch) [19] [20] | 1601 was the first year of the 400-year Gregorian calendar cycle at the time Windows NT was made. [19] |
31 December 1840 | MUMPS programming language | 1841 was a non-leap year several years before the birth year of the oldest living US citizen when the language was designed. [21] |
17 November 1858 | VMS, United States Naval Observatory, DVB SI 16-bit day stamps, other astronomy-related computations [22] | 17 November 1858, 00:00:00 UT is the zero of the Modified Julian Day (MJD) equivalent to Julian day 2400000.5 [23] |
10 May 1869 | PHAMIS date, used in Lastword/Carecast/Centricity Enterprise EHR running on Tandem/NonStop servers | The date of the transcontinental railroad Golden Spike, when the Central Pacific and Union Pacific railroads finally met at Promontory Summit, UT in May 1869. |
30 December 1899 | Microsoft COM DATE, Object Pascal, LibreOffice Calc, Google Sheets [24] | Technical internal value used by Microsoft Excel; for compatibility with Lotus 1-2-3. [3] |
31 December 1899 | Dyalog APL, [25] Microsoft C/C++ 7.0 [26] | Chosen so that (date mod 7) would produce 0=Sunday, 1=Monday, 2=Tuesday, 3=Wednesday, 4=Thursday, 5=Friday, and 6=Saturday. Microsoft's last version of non-Visual C/C++ used this epoch, but it was subsequently reverted. |
0 January 1900 | Microsoft Excel, [3] Lotus 1-2-3 [27] | While logically 0 January 1900 is equivalent to 31 December 1899, these systems do not allow users to specify the latter date. Since 1900 is incorrectly treated as a leap year in these systems, 0 January 1900 actually corresponds to the historical date of 30 December 1899. |
1 January 1900 | Network Time Protocol, Time Protocol, IBM CICS, Mathematica, RISC OS, VME, Common Lisp, Michigan Terminal System | |
1 January 1901 | Ada [28] | In the first version of the language, dates were limited to the range 1901 to 2099 to avoid years affected by the 400-year leap-year rule. When the upper bound was expanded in later versions, the lower bound was left the same to maintain backwards compatibility with systems that used it as a datum. [28] |
1 January 1904 | LabVIEW, Apple Inc.'s classic Mac OS, JMP Scripting Language, Palm OS, MP4, Microsoft Excel (optionally), [29] IGOR Pro | 1904 is the first leap year of the 20th century. [30] |
1 January 1960 | SAS System [31] | |
31 December 1967 | Pick OS and variants (jBASE, Universe, Unidata, Revelation, Reality) | Chosen so that (date mod 7) would produce 0=Sunday, 1=Monday, 2=Tuesday, 3=Wednesday, 4=Thursday, 5=Friday, and 6=Saturday. [32] |
1 January 1970 | Unix Epoch used in POSIX time, used by Unix and Unix-like systems (Linux, macOS, Android), and programming languages: most C/C++ implementations, [33] Java, JavaScript, Perl, PHP, Python, Ruby, Tcl, ActionScript. Also used by Precision Time Protocol. | |
1 January 1978 | AmigaOS. [34] [nb 2] The Commodore Amiga hardware systems were introduced between 1985 and 1994. Latest OS version 4.1 (December 2016). AROS, MorphOS. | |
1 January 1980 | IBM BIOS INT 1Ah, DOS, OS/2, FAT12, FAT16, FAT32, exFAT filesystems, ZIP format and derivatives | The IBM PC with its BIOS as well as 86-DOS, MS-DOS and PC DOS with their FAT12 file system were developed and introduced between 1980 and 1981. |
6 January 1980 | Qualcomm BREW, GPS, ATSC 32-bit time stamps | GPS counts weeks (a week is defined to start on Sunday) and 6 January is the first Sunday of 1980. [35] [36] Weeks are stored in a 10-bit integer, and the first GPS week number rollover occurred in August 1999. |
31 December 1989 | Garmin FIT Epoch. [37] Standard originated by Garmin as part of their FIT Protocol, which has been adopted by many as a de facto standard in the fitness devices industry. [38] | |
1 January 2000 | AppleSingle, AppleDouble, [39] PostgreSQL, [40] [nb 3] Zigbee UTCTime, [41] Ingenuity helicopter [42] | This Y2K epoch [43] is sometimes used to push the 2038 problem back to the year 2068 on processors that do not support 64-bit integers. |
1 January 2001 | NSDate in Apple's Cocoa framework, NeXTSTEP | First day of the third millennium A.D. |
International Atomic Time (TAI) is a high-precision atomic coordinate time standard based on the notional passage of proper time on Earth's geoid. TAI is a weighted average of the time kept by over 450 atomic clocks in over 80 national laboratories worldwide. It is a continuous scale of time, without leap seconds, and it is the principal realisation of Terrestrial Time. It is the basis for Coordinated Universal Time (UTC), which is used for civil timekeeping all over the Earth's surface and which has leap seconds.
A calendar date is a reference to a particular day represented within a calendar system. The calendar date allows the specific day to be identified. The number of days between two dates may be calculated. For example, "25 December 2024" is ten days after "15 December 2024". The date of a particular event depends on the observed time zone. For example, the air attack on Pearl Harbor that began at 7:48 a.m. Hawaiian time on 7 December 1941 took place at 3:18 a.m. Japan Standard Time, 8 December in Japan.
ISO 8601 is an international standard covering the worldwide exchange and communication of date and time-related data. It is maintained by the International Organization for Standardization (ISO) and was first published in 1988, with updates in 1991, 2000, 2004, and 2019, and an amendment in 2022. The standard provides a well-defined, unambiguous method of representing calendar dates and times in worldwide communications, especially to avoid misinterpreting numeric dates and times when such data is transferred between countries with different conventions for writing numeric dates and times.
A leap second is a one-second adjustment that is occasionally applied to Coordinated Universal Time (UTC), to accommodate the difference between precise time and imprecise observed solar time (UT1), which varies due to irregularities and long-term slowdown in the Earth's rotation. The UTC time standard, widely used for international timekeeping and as the reference for civil time in most countries, is based on TAI and consequently would run ahead of observed solar time unless it is reset to UT1 as needed. The leap second facility exists to provide this adjustment. The leap second was introduced in 1972. Since then, 27 leap seconds have been added to UTC, with the most recent occurring on 31 December 2016. All have so far been positive leap seconds, adding a second to a UTC day; while it is possible for a negative leap second to be needed, one has not happened yet.
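The cumulative effect is small but steadily growing. Assuming the widely published 10-second offset between TAI and UTC when leap seconds were introduced in 1972, the arithmetic is simply:

```python
initial_offset = 10      # seconds by which UTC already trailed TAI at the start of 1972
leap_seconds_added = 27  # positive leap seconds inserted from 1972 through 31 December 2016

print(initial_offset + leap_seconds_added)  # 37 -> TAI has been 37 s ahead of UTC since 2017
```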
A time zone is an area which observes a uniform standard time for legal, commercial and social purposes. Time zones tend to follow the boundaries between countries and their subdivisions instead of strictly following longitude, because it is convenient for areas in frequent communication to keep the same time.
The Julian day is a continuous count of days from the beginning of the Julian period; it is used primarily by astronomers, and in software for easily calculating elapsed days between two events.
In chronology and periodization, an epoch or reference epoch is an instant in time chosen as the origin of a particular calendar era. The "epoch" serves as a reference point from which time is measured.
A time standard is a specification for measuring time: either the rate at which time passes or points in time or both. In modern times, several time specifications have been officially recognized as standards, where formerly they were matters of custom and practice. An example of a kind of time standard can be a time scale, specifying a method for measuring divisions of time. A standard for civil time can specify both time intervals and time-of-day.
The Network Time Protocol (NTP) is a networking protocol for clock synchronization between computer systems over packet-switched, variable-latency data networks. In operation since before 1985, NTP is one of the oldest Internet protocols in current use. NTP was designed by David L. Mills of the University of Delaware.
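NTP timestamps count from the 1 January 1900 epoch listed in the table above, so converting a Unix timestamp to NTP seconds is a fixed-offset addition. A minimal sketch follows; the helper name is illustrative and not part of any NTP library:

```python
from datetime import datetime, timezone

NTP_EPOCH = datetime(1900, 1, 1, tzinfo=timezone.utc)
UNIX_EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)

# Seconds between the two epochs: 2208988800.
NTP_UNIX_OFFSET = int((UNIX_EPOCH - NTP_EPOCH).total_seconds())

def unix_to_ntp_seconds(unix_seconds: float) -> float:
    """Shift a Unix timestamp onto the NTP timescale (illustrative sketch)."""
    return unix_seconds + NTP_UNIX_OFFSET

print(unix_to_ntp_seconds(0))  # 2208988800.0 -> the Unix epoch expressed in NTP seconds
```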
The year 2038 problem is a time computing problem that prevents some computer systems from representing times after 03:14:07 UTC on 19 January 2038.
The Common Object File Format (COFF) is a format for executable, object code, and shared library computer files used on Unix systems. It was introduced in Unix System V, replaced the previously used a.out format, and formed the basis for extended specifications such as XCOFF and ECOFF, before being largely replaced by ELF, introduced with SVR4. COFF and its variants continue to be used on some Unix-like systems, on Microsoft Windows, in UEFI environments and in some embedded development systems.
A timestamp is a sequence of characters or encoded information identifying when a certain event occurred, usually giving date and time of day, sometimes accurate to a small fraction of a second. Timestamps do not have to be based on an absolute notion of time, however: they can use any epoch and be relative to an arbitrary reference point, such as the power-on time of a system or some other moment in the past.
Unix time is a date and time representation widely used in computing. It measures time by the number of non-leap seconds that have elapsed since 00:00:00 UTC on 1 January 1970, the Unix epoch. For example, at 00:00:00 UTC on 1 January 2010, Unix time was 1262304000.
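That figure can be checked directly; a one-line Python verification (illustrative only):

```python
from datetime import datetime, timezone

# Seconds from the Unix epoch to 00:00:00 UTC on 1 January 2010.
print(int(datetime(2010, 1, 1, tzinfo=timezone.utc).timestamp()))  # 1262304000
```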
Though no standard exists, numerous calendars and other timekeeping approaches have been proposed for the planet Mars. The most commonly seen in the scientific literature denotes the time of year as the number of degrees on its orbit from the northward equinox, and Martian years are increasingly numbered from the equinox that occurred on 11 April 1955.
In computer science and computer programming, system time represents a computer system's notion of the passage of time. In this sense, time also includes the passing of days on the calendar.
A Lilian date is the number of days since the beginning of the Gregorian calendar on 15 October 1582, regarded as Lilian date 1. It was invented by Bruce G. Ohms of IBM in 1986 and is named for Aloysius Lilius, who devised the Gregorian calendar. Lilian dates can be used to calculate the number of days between any two dates occurring since the beginning of the Gregorian calendar. It is currently used by date conversion routines that are part of IBM Language Environment (LE) software and in IBM AIX COBOL.
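Because Python's `date` type uses the proleptic Gregorian calendar, the day count reduces to a simple subtraction. A minimal sketch (the function name is illustrative, not IBM's routine):

```python
from datetime import date

LILIAN_DAY_ONE = date(1582, 10, 15)   # the first day of the Gregorian calendar

def lilian_date(d: date) -> int:
    """Days since the start of the Gregorian calendar, with 15 October 1582 counted as 1."""
    return (d - LILIAN_DAY_ONE).days + 1

print(lilian_date(date(1582, 10, 15)))  # 1
print(lilian_date(date(2000, 1, 1)))    # 152385
```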
In computer science, data type limitations and software bugs can cause errors in time and date calculation or display. These are most commonly manifestations of arithmetic overflow, but can also be the result of other issues. The most well-known consequence of this type is the Y2K problem, but many other milestone dates or times exist that have caused or will cause problems depending on various programming deficiencies.
Coordinated Universal Time (UTC) is the primary time standard globally used to regulate clocks and time. It establishes a reference for the current time, forming the basis for civil time and time zones. UTC facilitates international communication, navigation, scientific research, and commerce.
The term year 2000 problem, or simply Y2K, refers to potential computer errors related to the formatting and storage of calendar data for dates in and after the year 2000. Many programs represented four-digit years with only the final two digits, making the year 2000 indistinguishable from 1900. Computer systems' inability to distinguish dates correctly had the potential to bring down worldwide infrastructures for computer-reliant industries.
Note: When timestamp values are stored as eight-byte integers (currently the default), microsecond precision is available over the full range of values. […] timestamp values are stored as seconds before or after midnight 2000-01-01.