Timestamp Converter
Paste a Unix timestamp and see the human-readable date instantly, or pick a date to get its timestamp. Supports seconds and milliseconds. Free, 100% client-side.
— Reference
Unix Timestamps
A Unix timestamp (also called Unix time or POSIX time) is the number of seconds that have elapsed since January 1, 1970 at 00:00:00 UTC — a moment known as the Unix epoch. It is a compact, timezone-independent way to represent any point in time as a single integer. Most programming languages and databases support it natively.
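As a quick illustration of the idea, here is a minimal JavaScript sketch converting a timestamp to a date and back (JavaScript Dates count milliseconds, so a factor of 1000 appears in both directions):

```javascript
// A Unix timestamp in seconds (this one is midnight UTC, January 1, 2000).
const ts = 946684800;

// JavaScript Dates use milliseconds, so multiply by 1000 to build a Date.
const date = new Date(ts * 1000);
console.log(date.toISOString()); // "2000-01-01T00:00:00.000Z"

// Going the other way: milliseconds back to whole seconds.
const back = Math.floor(date.getTime() / 1000);
console.log(back); // 946684800
```

The same integer decodes to the same instant everywhere; only the rendered local time differs by timezone.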
The Unix Epoch
The Unix epoch (1970-01-01T00:00:00Z) was chosen by early Unix developers as an arbitrary but convenient reference point. Timestamps can be negative (before 1970) or positive (after 1970). On systems that store time as a signed 32-bit integer, the maximum representable timestamp is 2147483647, reached at 03:14:07 UTC on January 19, 2038; one second later the value overflows, which is known as the Year 2038 problem. Systems that store time as a 64-bit integer are not affected.
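The overflow can be simulated in JavaScript with an Int32Array, which truncates values to signed 32-bit integers the same way a 32-bit time field would:

```javascript
// Maximum signed 32-bit timestamp: 03:14:07 UTC on January 19, 2038.
const max32 = 2147483647;

// Adding one second wraps around to the most negative 32-bit value.
const wrapped = new Int32Array([max32 + 1])[0];
console.log(wrapped); // -2147483648

// Interpreted as a timestamp, the wrapped value lands long before the epoch.
console.log(new Date(wrapped * 1000).toISOString()); // "1901-12-13T20:45:52.000Z"
```

This is why an overflowing 32-bit clock does not stop in 2038: it jumps back to December 1901.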
Notable timestamps
0 — the Unix epoch (January 1, 1970 00:00:00 UTC).
946684800 — Y2K (January 1, 2000).
1000000000 — September 9, 2001, the "billion-second" milestone.
2147483647 — the maximum 32-bit signed integer, triggering the Year 2038 problem on January 19, 2038.
If a timestamp has 13 digits (e.g. 1700000000000), it is in milliseconds; 10 digits means seconds.
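The digit-count rule is easy to apply in code. A small sketch, using a hypothetical helper name, that normalizes either unit to seconds:

```javascript
// Hypothetical helper: treat values with 13 or more digits as milliseconds,
// shorter values as seconds, and normalize everything to whole seconds.
function toSeconds(timestamp) {
  const digits = String(Math.abs(timestamp)).length;
  return digits >= 13 ? Math.floor(timestamp / 1000) : timestamp;
}

console.log(toSeconds(1700000000000)); // 1700000000 (milliseconds input)
console.log(toSeconds(1700000000));    // 1700000000 (already seconds)
```

The heuristic holds for contemporary dates; 10-digit second counts cover roughly 2001 through 2286.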
Getting timestamps in code
JavaScript: Date.now() returns milliseconds; divide by 1000 for seconds.
Python: import time; time.time() returns a float in seconds.
PHP: time() returns seconds as an integer.
Java: System.currentTimeMillis() returns milliseconds.
Bash: date +%s prints the current timestamp in seconds.
SQL: EXTRACT(EPOCH FROM NOW()) in PostgreSQL, UNIX_TIMESTAMP() in MySQL.
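Expanding the JavaScript entry above into a runnable snippet:

```javascript
// Current timestamp in milliseconds since the epoch.
const millis = Date.now();

// Divide by 1000 and truncate for whole seconds, the classic Unix timestamp.
const seconds = Math.floor(millis / 1000);

console.log(seconds);
```

The other languages follow the same pattern: fetch the native value, then multiply or divide by 1000 depending on whether you need seconds or milliseconds.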
Privacy
All conversions run 100% in your browser. No data is sent to a server.