Year 2038 problem

From Wikipedia, the free encyclopedia

[Image: Example showing how the date would reset at 03:14:08 UTC on 19 January 2038.]

The year 2038 problem (also known as the "Unix Millennium Bug" or "Y2K38", by analogy with the Y2K problem) may cause some computer software to fail before or in the year 2038. The problem affects all software and systems that store system time as a signed 32-bit integer and interpret this number as the number of seconds since 00:00:00 UTC on 1 January 1970 (the Unix epoch).[1] The latest time that can be represented this way is 03:14:07 UTC on Tuesday, 19 January 2038. Times beyond this moment will "wrap around" and be stored internally as a negative number, which these systems will interpret as a date in December 1901 rather than January 2038. This is likely to cause problems for users of these systems due to erroneous date calculations.
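
A minimal sketch of that arithmetic (an illustration, not code from any affected system), assuming a host whose own time_t is 64-bit so that gmtime() can still format both results; the 32-bit counter is simulated with int32_t and the wrap relies on the usual two's-complement conversion behaviour:

#include <stdio.h>
#include <stdint.h>
#include <time.h>

/* Print a seconds-since-epoch value as a UTC calendar date. */
static void print_utc(int64_t seconds, const char *label)
{
    time_t t = (time_t)seconds;
    struct tm *tm = gmtime(&t);
    char buf[32];
    if (tm != NULL && strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S UTC", tm))
        printf("%-11s %s\n", label, buf);
}

int main(void)
{
    int32_t counter = INT32_MAX;              /* last second a signed 32-bit time can hold */
    print_utc(counter, "INT32_MAX:");         /* 2038-01-19 03:14:07 UTC */

    /* One second later the 32-bit value wraps around to INT32_MIN. */
    counter = (int32_t)((uint32_t)counter + 1u);
    print_utc(counter, "one later:");         /* 1901-12-13 20:45:52 UTC */
    return 0;
}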

Most 32-bit Unix-like systems store and manipulate time in this format, so the problem is often referred to as the "Unix Millennium Bug". However, non-Unix operating systems and software that store and manipulate time in the same way are just as affected.

Known problems

In May 2006, reports surfaced of an early manifestation of the Y2038 problem in the AOLserver software. The software specified that a database request should "never" time out by setting a timeout date one billion seconds in the future. One billion seconds (just over 31 years) after 21:27:28 on 12 May 2006 is beyond the 2038 cutoff date, so the timeout calculation overflowed and produced a timeout date that was actually in the past, causing the software to crash.[2][3]
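
A hedged sketch of that failure mode (the function name and the specific timestamp below are illustrative, not taken from AOLserver): a "never expires" deadline computed as the current time plus one billion seconds, held in a signed 32-bit value, silently wraps once the current time is late enough.

#include <stdio.h>
#include <stdint.h>

/* Simulated 32-bit time arithmetic; wraps the way two's-complement hardware does. */
static int32_t add_seconds(int32_t now, uint32_t delta)
{
    return (int32_t)((uint32_t)now + delta);
}

int main(void)
{
    /* Roughly 13 May 2006 01:27:28 UTC: from here on, adding one billion
       seconds no longer fits in a signed 32-bit integer. */
    int32_t now = 1147483648;
    int32_t deadline = add_seconds(now, 1000000000u);

    if (deadline < now)
        printf("overflow: the \"never\" deadline %d lies in the past\n", (int)deadline);
    else
        printf("deadline is %d seconds in the future\n", (int)(deadline - now));
    return 0;
}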

Example

The following transcript demonstrates the problem using OpenSSL to create X.509 certificates. A certificate requested to expire just before 19 January 2038 receives the expected expiry date, but on a system with a 32-bit time_t a certificate requested to expire just after that date wraps around to December 1901; a 64-bit system handles a far longer validity period correctly.

$ date
Sun Jul  6 00:32:27 CEST 2008
$ openssl req -x509 -in server.csr -key server.key -out server.crt -days 10789 && openssl x509 -in server.crt -text | grep After
            Not After : Jan 18 22:32:32 2038 GMT
$ openssl req -x509 -in server.csr -key server.key -out server.crt -days 10790 && openssl x509 -in server.crt -text | grep After
            Not After : Dec 14 16:04:21 1901 GMT   (32-bit system)
$ openssl req -x509 -in server.csr -key server.key -out server.crt -days 2918831 && openssl x509 -in server.crt -text | grep After
            Not After : Dec 31 22:41:18 9999 GMT   (64-bit system)

Solutions

There is no easy fix for this problem for existing CPU/OS/file system combinations. Changing the definition of time_t to a 64-bit type would break binary compatibility for software, data storage, and generally anything dealing with the binary representation of time. Changing time_t to an unsigned 32-bit integer, which would extend the range to the year 2106, would affect many programs that compute time differences (since differences can be negative or involve dates before 1970), and would thus also break binary compatibility in many cases.
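
A small sketch of the two 32-bit limits discussed above, again assuming a host whose own time_t is 64-bit so that gmtime() can format both endpoints:

#include <stdio.h>
#include <stdint.h>
#include <time.h>

static void print_limit(uint64_t seconds, const char *label)
{
    time_t t = (time_t)seconds;
    char buf[32];
    if (strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S UTC", gmtime(&t)))
        printf("%-21s %s\n", label, buf);
}

int main(void)
{
    print_limit(INT32_MAX,  "signed 32-bit max:");    /* 2038-01-19 03:14:07 UTC */
    print_limit(UINT32_MAX, "unsigned 32-bit max:");  /* 2106-02-07 06:28:15 UTC */
    return 0;
}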

Most operating systems designed for 64-bit architectures already use 64-bit integers for time_t, and these operating systems are becoming more common, particularly in desktop and server environments. Using a signed 64-bit value pushes the wraparound date roughly 290 billion years into the future, to Sunday, 4 December 292,277,026,596. As of 2007, however, hundreds of millions of 32-bit systems are deployed, many of them embedded systems, and it is far from certain that they will all be replaced by 2038. Moreover, programs that project dates into the future will run into problems long before then; for example, a program that projects a trend ten years ahead will encounter the bug by 2028.
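
As a rough check (a sketch, not a definitive test) of whether a given platform falls into the affected category, one can inspect the width of time_t and try to represent a moment just past the cutoff:

#include <stdio.h>
#include <time.h>

int main(void)
{
    /* One second past the signed 32-bit limit: 2038-01-19 03:14:08 UTC. */
    time_t after_cutoff = (time_t)2147483648LL;

    printf("time_t is %zu bits wide\n", sizeof(time_t) * 8);
    if (after_cutoff > (time_t)2147483647LL)
        printf("dates beyond 19 January 2038 are representable\n");
    else
        printf("time_t wrapped: this system is affected\n");
    return 0;
}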

Despite the 18- to 24-month generational turnover in computer systems technology, embedded computers may operate unchanged for the life of the system they control. The 32-bit time_t has also been encoded into some file formats, which means the problem can live on well beyond the life of the machines involved.

A variety of alternative proposals have been made, some of which are already in use, including storing either milliseconds or microseconds since an epoch (typically either 1 January 1970 or 1 January 2000) in a signed 64-bit integer, providing a minimum range of 300,000 years.[4][5] Other proposals for new time representations provide different precisions, ranges, and sizes (almost always wider than 32 bits), and also address related problems such as the handling of leap seconds.
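
A back-of-envelope sketch of the ranges such representations give, using a signed 64-bit count of milliseconds or microseconds:

#include <stdio.h>
#include <stdint.h>

int main(void)
{
    const double seconds_per_year = 365.25 * 86400.0;

    /* Years representable on each side of the chosen epoch. */
    double ms_years = (double)INT64_MAX / 1e3 / seconds_per_year;
    double us_years = (double)INT64_MAX / 1e6 / seconds_per_year;

    printf("64-bit milliseconds: about %.0f million years each way\n", ms_years / 1e6);
    printf("64-bit microseconds: about %.0f thousand years each way\n", us_years / 1e3);
    return 0;
}

Even at microsecond precision this comfortably exceeds the 300,000-year minimum quoted above; Java, for example, stores time as a signed 64-bit count of milliseconds since 1 January 1970.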
