What is a Unix timestamp?
A Unix timestamp (also called epoch time, POSIX time, or Unix time) is the number of seconds that have elapsed since 1 January 1970 at 00:00:00 UTC — a moment known as the Unix epoch. It's a single integer that represents a specific instant in time, and it's used universally across programming languages, databases, APIs, and log files.
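In Python, for example, the current Unix timestamp can be read directly from the standard library. A minimal sketch (time.time() returns a float, which we truncate to whole seconds):

```python
import time

# Current Unix timestamp: whole seconds elapsed since 1970-01-01 00:00:00 UTC.
now = int(time.time())
print(now)  # a 10-digit integer, e.g. 1713278400
```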
The beauty of Unix timestamps is that they are timezone-independent. The timestamp 1713278400 refers to the exact same moment in time regardless of whether you're in London, Tokyo, or New York. The conversion to a local date and time happens at the display layer, not in the data itself. This makes timestamps ideal for storing and comparing times in distributed systems where servers and users span multiple timezones.
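To see this in practice, here is a sketch using Python's zoneinfo module (Python 3.9+, and it relies on the system's IANA timezone database being available): the same integer renders as different local wall-clock times, but the instant is identical.

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

ts = 1713278400  # one instant, no timezone attached

utc = datetime.fromtimestamp(ts, tz=timezone.utc)
london = utc.astimezone(ZoneInfo("Europe/London"))
tokyo = utc.astimezone(ZoneInfo("Asia/Tokyo"))

# Different wall-clock displays...
print(london.isoformat())  # 2024-04-16T15:40:00+01:00 (BST)
print(tokyo.isoformat())   # 2024-04-16T23:40:00+09:00

# ...but the same underlying instant:
assert london.timestamp() == tokyo.timestamp() == ts
```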
Most programming languages provide built-in functions to get the current timestamp and convert between timestamps and human-readable dates. JavaScript uses milliseconds (Date.now() returns 13 digits), while most other languages use seconds (10 digits).
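The seconds/milliseconds split is a common source of off-by-1000 bugs when passing timestamps between systems. A sketch of converting between the two conventions in Python:

```python
import time

seconds = int(time.time())  # 10 digits: the convention most languages use
millis = seconds * 1000     # 13 digits: the convention JavaScript's Date.now() uses

# Going the other way: integer-divide to drop the sub-second part.
assert millis // 1000 == seconds
print(len(str(seconds)), len(str(millis)))  # 10 13
```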
Common timestamp formats
Seconds (10 digits, e.g. 1713278400) is the standard Unix timestamp format used by most systems including C, Python, PHP, Ruby, and SQL databases. This is the most portable and widely understood format.
Milliseconds (13 digits, e.g. 1713278400000) is used by JavaScript (Date.now()), Java (System.currentTimeMillis()), and many modern APIs. It provides sub-second precision without introducing floating-point issues.
Microseconds (16 digits) and nanoseconds (19 digits) are used in high-precision contexts like Python's time.time_ns(), Go's time.UnixNano(), and database columns that need sub-millisecond accuracy.
ISO 8601 strings (e.g. 2026-04-16T14:00:00Z) are the standard human-readable format for dates in APIs and data interchange. The trailing Z denotes UTC; a timezone offset like +05:30 can be used instead.
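Converting between ISO 8601 strings and timestamps is a one-liner in most languages. A Python sketch (note that datetime.fromisoformat only accepts the trailing Z on Python 3.11+, so the replace() below keeps it portable to older versions):

```python
from datetime import datetime, timezone

iso = "2026-04-16T14:00:00Z"

# Parse: swap the Z suffix for an explicit offset so pre-3.11 Pythons accept it.
dt = datetime.fromisoformat(iso.replace("Z", "+00:00"))
ts = int(dt.timestamp())
print(ts)  # 1776348000

# Format: back to an ISO 8601 UTC string.
print(datetime.fromtimestamp(ts, tz=timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ"))
# 2026-04-16T14:00:00Z
```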
RFC 2822 (e.g. Wed, 16 Apr 2026 14:00:00 +0000) is the email-standard date format; a near-identical profile (with a GMT suffix instead of a numeric offset) is used in HTTP Date headers, and some older APIs still emit it. For present-day dates, you can tell the numeric formats apart by digit count: 10 = seconds, 13 = milliseconds, 16 = microseconds, 19 = nanoseconds.
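The digit-count heuristic can be sketched as a small classifier (classify_timestamp is a hypothetical helper name, and the mapping is only reliable for contemporary dates, since very old or far-future timestamps have different digit counts):

```python
def classify_timestamp(value: int) -> str:
    """Guess the unit of a numeric Unix timestamp from its digit count.

    Only reliable for contemporary dates (roughly 2001-2286 for seconds).
    """
    digits = len(str(abs(value)))
    return {
        10: "seconds",
        13: "milliseconds",
        16: "microseconds",
        19: "nanoseconds",
    }.get(digits, "unknown")

print(classify_timestamp(1713278400))        # seconds
print(classify_timestamp(1713278400000))     # milliseconds
print(classify_timestamp(1713278400000000))  # microseconds
```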
The Year 2038 problem
The Year 2038 problem affects systems that store Unix timestamps as 32-bit signed integers. A signed 32-bit integer has a maximum value of 2,147,483,647, which corresponds to 03:14:07 UTC on 19 January 2038. One second later, the integer overflows and wraps to a negative number, which represents a date in December 1901.
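The wraparound can be simulated by reinterpreting the bits of an unsigned 32-bit value as a signed one. A sketch using Python's struct module (note that fromtimestamp with a negative argument can fail on some platforms, such as Windows):

```python
import struct
from datetime import datetime, timezone

INT32_MAX = 2_147_483_647  # corresponds to 2038-01-19 03:14:07 UTC

# One second past the maximum, reinterpreted as a signed 32-bit integer:
wrapped = struct.unpack("<i", struct.pack("<I", INT32_MAX + 1))[0]
print(wrapped)  # -2147483648

# A negative timestamp is a date before the epoch:
print(datetime.fromtimestamp(wrapped, tz=timezone.utc))
# 1901-12-13 20:45:52+00:00
```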
Most modern systems have already migrated to 64-bit timestamps, which won't overflow for approximately 292 billion years. However, legacy embedded systems (ATMs, IoT devices, older database schemas) and some file formats (ext3 filesystems, older MySQL TIMESTAMP columns) still use 32-bit storage. If you maintain systems with 32-bit timestamp fields, plan your migration well before 2038.