== Limitations ==
Unix time was designed to encode calendar dates and times in a compact manner intended for use by computers internally. It is not intended to be easily read by humans or to store timezone-dependent values. It is also limited by default to representing time in seconds, making it unsuited for use when a more precise measurement of time is needed, such as when measuring the execution time of programs.<ref name="adv-unix">{{cite book |last1=Rochkind |first1=Marc |title=Advanced UNIX Programming |date=2004 |publisher=Addison-Wesley |isbn=978-0-13-141154-8 |pages=56–63 |edition=2nd}}</ref>

=== Range of representable times ===
{{see also|Year 2038 problem}}
[[File:Year 2038 problem.gif|thumb|upright=1.8|An animated visual of the [[Year 2038 problem|32-bit Unix time overflow which will occur in 2038]]]]
Unix time by design does not require a specific storage size, but most common implementations of Unix time use a [[signed integer]] with the same size as the [[Word (computer architecture)|word size]] of the underlying hardware. As the majority of modern computers are [[32-bit]] or [[64-bit]], and a large number of programs are still written in 32-bit compatibility mode, many programs store Unix time in signed 32-bit integer fields. The maximum value of a signed 32-bit integer is {{nowrap|2{{sup|31}} − 1}}, and the minimum value is {{nowrap|−2{{sup|31}}}}, making it impossible to represent dates before 13 December 1901 (at 20:45:52 UTC) or after 19 January 2038 (at 03:14:07 UTC). The early cutoff can affect databases that store historical information; where 32-bit Unix time is used for timestamps, dates before 1901 may need to be stored in a different kind of field, such as a string. The late cutoff is known as the [[Year 2038 problem]] and has the potential to cause issues as the date approaches, because dates beyond the 2038 cutoff would wrap back around to the start of the representable range in 1901.{{r|adv-unix|p=60}}

Date range cutoffs are not an issue with 64-bit representations of Unix time, as the effective range of dates representable with Unix time stored in a signed 64-bit integer is over 584 billion years, or 292 billion years in either direction of the 1970 epoch.{{r|adv-unix|p=60–61}}<ref>{{cite web |last1=Saxena |first1=Ashutosh |last2=Rawat |first2=Sanjay |title=IDRBT Working Paper No. 9 |url=http://www.idrbt.ac.in/publications/workingpapers/Working%20Paper%20No.%209.pdf |archive-url=https://web.archive.org/web/20120513133703/http://www.idrbt.ac.in/publications/workingpapers/Working%20Paper%20No.%209.pdf |archive-date=13 May 2012 |url-status=dead}}</ref>
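The boundary dates above follow directly from the limits of a signed 32-bit integer. The short C program below is an illustrative sketch rather than code from any cited source: it formats {{code|INT32_MIN}} and {{code|INT32_MAX}} seconds as UTC dates, then shows how a value one second past the maximum, stored back into a 32-bit field, lands at the start of the representable range. It assumes a platform whose native {{code|time_t}} is 64 bits wide.

<syntaxhighlight lang="c">
#include <stdint.h>
#include <stdio.h>
#include <time.h>

/* Illustrative only: prints the earliest and latest dates that a
 * signed 32-bit Unix timestamp can represent, assuming a 64-bit
 * native time_t so the conversion to broken-down UTC succeeds. */
int main(void)
{
    int32_t limits[] = { INT32_MIN, INT32_MAX };

    for (int i = 0; i < 2; i++) {
        time_t t = (time_t)limits[i];      /* widen to the native time_t */
        struct tm *utc = gmtime(&t);       /* seconds since epoch -> UTC fields */
        if (utc == NULL)
            return 1;
        char buf[64];
        strftime(buf, sizeof buf, "%d %B %Y %H:%M:%S UTC", utc);
        printf("%11d -> %s\n", (int)limits[i], buf);
    }

    /* One second past the 32-bit maximum, squeezed back into a
     * 32-bit field; the truncation is implementation-defined, but on
     * common two's-complement systems it wraps to INT32_MIN. */
    int64_t next = (int64_t)INT32_MAX + 1;
    int32_t wrapped = (int32_t)next;
    printf("INT32_MAX + 1 stored in 32 bits: %d\n", wrapped);

    return 0;
}
</syntaxhighlight>

On a typical 64-bit system this prints the 13 December 1901 and 19 January 2038 boundary dates quoted above, and a wrapped value of −2147483648, i.e. the 1901 end of the range, illustrating the wrap-around behind the Year 2038 problem.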