Unix timestamp converter

Convert Unix timestamps to human-readable dates and back. Supports seconds and milliseconds, any timezone.

Current Unix timestamp: 1776298046

Timestamp → Date

Detected: seconds
UTC: Thursday, 16 April 2026 at 00:07:26 UTC
Local: Thursday, 16 April 2026 at 00:07:26 (Coordinated Universal Time)
ISO 8601: 2026-04-16T00:07:26.000Z
Relative: 0 seconds ago

Date → Timestamp

Seconds: 1776298046
Milliseconds: 1776298046000

What is a Unix timestamp?

A Unix timestamp (also called epoch time, POSIX time, or Unix time) is the number of seconds that have elapsed since 1 January 1970 at 00:00:00 UTC — a moment known as the Unix epoch. It's a single integer that represents a specific instant in time, and it's used universally across programming languages, databases, APIs, and log files.

The beauty of Unix timestamps is their timezone-agnosticism. The timestamp 1713278400 refers to the exact same moment in time regardless of whether you're in London, Tokyo, or New York. The conversion to a local date and time happens at the display layer, not in the data itself. This makes timestamps ideal for storing and comparing times in distributed systems where servers and users span multiple timezones.
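A minimal sketch of this display-layer conversion in Python, using fixed UTC offsets rather than a timezone database (so the Tokyo and New York offsets are hard-coded assumptions for illustration):

```python
from datetime import datetime, timezone, timedelta

ts = 1713278400  # one instant, stored once

# The same integer renders differently only at display time.
utc = datetime.fromtimestamp(ts, tz=timezone.utc)
tokyo = datetime.fromtimestamp(ts, tz=timezone(timedelta(hours=9)))      # UTC+9
new_york = datetime.fromtimestamp(ts, tz=timezone(timedelta(hours=-4)))  # EDT, UTC-4

print(utc.isoformat())       # 2024-04-16T14:40:00+00:00
print(tokyo.isoformat())     # 2024-04-16T23:40:00+09:00
print(new_york.isoformat())  # 2024-04-16T10:40:00-04:00
```

All three datetime objects compare equal, because they describe the same instant; only the rendering differs.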

Most programming languages provide built-in functions to get the current timestamp and convert between timestamps and human-readable dates. JavaScript uses milliseconds (Date.now() returns 13 digits), while most other languages use seconds (10 digits).
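In Python, for instance, the current timestamp can be read in both units (a sketch; time.time_ns() is used for the millisecond value to avoid float rounding):

```python
import time

seconds = int(time.time())            # 10 digits, e.g. 1776298046
millis = time.time_ns() // 1_000_000  # 13 digits, JavaScript-style milliseconds

# The two units differ only by a factor of 1000.
print(seconds, millis)
```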

Common timestamp formats

Seconds (10 digits, e.g. 1713278400) is the standard Unix timestamp format used by most systems including C, Python, PHP, Ruby, and SQL databases. This is the most portable and widely understood format.

Milliseconds (13 digits, e.g. 1713278400000) is used by JavaScript (Date.now()), Java (System.currentTimeMillis()), and many modern APIs. It provides sub-second precision without introducing floating-point issues.

Microseconds (16 digits) and nanoseconds (19 digits) are used in high-precision contexts like Python's time.time_ns(), Go's time.UnixNano(), and database columns that need sub-millisecond accuracy.

ISO 8601 strings (e.g. 2026-04-16T14:00:00Z) are the standard human-readable format for dates in APIs and data interchange. The trailing Z denotes UTC; a timezone offset like +05:30 can be used instead.

RFC 2822 (e.g. Thu, 16 Apr 2026 14:00:00 +0000) is the email-standard date format; HTTP Date headers use the closely related RFC 1123 profile (with GMT in place of a numeric offset), and some older APIs still emit it. You can tell the numeric formats apart by digit count: 10 = seconds, 13 = milliseconds, 16 = microseconds, 19 = nanoseconds.
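The digit-count heuristic can be sketched as a small Python helper (detect_unit is an illustrative name, not a standard API):

```python
def detect_unit(ts: int) -> str:
    """Guess the unit of a Unix timestamp from its digit count."""
    digits = len(str(abs(ts)))
    if digits <= 10:
        return "seconds"
    if digits <= 13:
        return "milliseconds"
    if digits <= 16:
        return "microseconds"
    return "nanoseconds"

print(detect_unit(1713278400))           # seconds
print(detect_unit(1713278400000))        # milliseconds
print(detect_unit(1713278400000000000))  # nanoseconds
```

The heuristic only breaks down for dates before 2001 or after 2286, where second-precision timestamps have fewer than 10 or more than 10 digits.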

The Year 2038 problem

The Year 2038 problem affects systems that store Unix timestamps as 32-bit signed integers. A signed 32-bit integer has a maximum value of 2,147,483,647, which corresponds to 03:14:07 UTC on 19 January 2038. One second later, the integer overflows and wraps to a negative number, which represents a date in December 1901.

Most modern systems have already migrated to 64-bit timestamps, which won't overflow for approximately 292 billion years. However, legacy embedded systems (ATMs, IoT devices, older database schemas) and some file formats (ext3 filesystems, older MySQL TIMESTAMP columns) still use 32-bit storage. If you maintain systems with 32-bit timestamp fields, plan your migration well before 2038.
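The overflow itself is easy to demonstrate by forcing a value into a signed 32-bit integer (a Python sketch using ctypes to emulate the wrap-around):

```python
import ctypes
from datetime import datetime, timezone

INT32_MAX = 2_147_483_647

# The last second representable by a signed 32-bit time_t:
print(datetime.fromtimestamp(INT32_MAX, tz=timezone.utc))  # 2038-01-19 03:14:07+00:00

# One second later the value wraps to the most negative int32...
wrapped = ctypes.c_int32(INT32_MAX + 1).value
print(wrapped)  # -2147483648

# ...which a naive 32-bit system interprets as December 1901.
print(datetime.fromtimestamp(wrapped, tz=timezone.utc))  # 1901-12-13 20:45:52+00:00
```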

Frequently asked questions

Why does Unix time start on 1 January 1970?
It was a pragmatic choice by the original Unix developers at Bell Labs. Unix was being created in the late 1960s and early 1970s, and they needed a reference point that was recent enough to be useful but far enough back to cover relevant dates. 1 January 1970 was a clean, round number close to when the system was being built. There's no deep technical reason — it was simply a convenient epoch that stuck.
What's the difference between a Unix timestamp and ISO 8601?
A Unix timestamp is a single number representing seconds (or milliseconds) since the epoch — it's compact, easy to compare mathematically, and inherently timezone-free. ISO 8601 (e.g. 2026-04-16T14:30:00Z) is a human-readable string that can include timezone information. Timestamps are better for storage and computation; ISO 8601 is better for display, logs, and APIs where readability matters. Most systems can convert freely between the two.
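A round trip between the two might look like this in Python (a sketch; note that datetime.fromisoformat() only accepts a trailing Z on Python 3.11+, hence the normalization to an explicit offset):

```python
from datetime import datetime, timezone

ts = 1713278400

# Timestamp -> ISO 8601 string (rendered in UTC here)
iso = datetime.fromtimestamp(ts, tz=timezone.utc).isoformat().replace("+00:00", "Z")
print(iso)  # 2024-04-16T14:40:00Z

# ISO 8601 string -> timestamp
back = int(datetime.fromisoformat(iso.replace("Z", "+00:00")).timestamp())
print(back)  # 1713278400
```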
How do I get the current timestamp in my programming language?
JavaScript: Date.now() (milliseconds) or Math.floor(Date.now() / 1000) (seconds)
Python: import time; time.time()
PHP: time()
Ruby: Time.now.to_i
Java: System.currentTimeMillis() / 1000
SQL: UNIX_TIMESTAMP() (MySQL) or EXTRACT(EPOCH FROM NOW()) (PostgreSQL)
Shell: date +%s