Convert between Unix timestamps and human-readable dates across timezones
A Unix timestamp (or Epoch time) is the number of seconds that have elapsed since January 1, 1970, 00:00:00 UTC. It's a universal way to represent dates independent of timezone and locale.

Get the current Unix timestamp in the language of your choice:

JavaScript:
    Math.floor(Date.now() / 1000)    // seconds
    Date.now()                       // milliseconds
    new Date().getTime() / 1000      // older form; fractional seconds unless floored

Python:
    import time
    int(time.time())

    from datetime import datetime
    int(datetime.now().timestamp())

PHP:
    time()

Java:
    System.currentTimeMillis() / 1000L

Go:
    time.Now().Unix()

Ruby:
    Time.now.to_i

Rust:
    use std::time::{SystemTime, UNIX_EPOCH};
    SystemTime::now().duration_since(UNIX_EPOCH).unwrap().as_secs()

Swift:
    Int(Date().timeIntervalSince1970)

Kotlin:
    System.currentTimeMillis() / 1000

C#:
    DateTimeOffset.UtcNow.ToUnixTimeSeconds()

C++:
    #include <chrono>
    std::chrono::duration_cast<std::chrono::seconds>(std::chrono::system_clock::now().time_since_epoch()).count()

Shell:
    date +%s

PowerShell:
    [int][double]::Parse((Get-Date -UFormat %s))

MySQL:
    SELECT UNIX_TIMESTAMP();

PostgreSQL:
    SELECT EXTRACT(EPOCH FROM NOW());
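To make the timezone independence concrete, here is a minimal Python sketch using only the standard library; the timestamp value and zone name are arbitrary examples, not part of the tool:

    from datetime import datetime, timezone
    from zoneinfo import ZoneInfo  # standard library since Python 3.9

    ts = 1700000000  # arbitrary example timestamp (seconds)

    # The integer itself carries no timezone; a zone is only needed for display.
    print(datetime.fromtimestamp(ts, tz=timezone.utc))            # 2023-11-14 22:13:20+00:00
    print(datetime.fromtimestamp(ts, tz=ZoneInfo("Asia/Tokyo")))  # 2023-11-15 07:13:20+09:00

    # Converting back yields the same timestamp regardless of the display zone.
    assert int(datetime.fromtimestamp(ts, tz=ZoneInfo("Asia/Tokyo")).timestamp()) == ts

The two printed strings look different, but both name the same instant; only the rendering changes with the zone.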
This tool auto-detects seconds (10-digit), milliseconds (13-digit), microseconds (16-digit), nanoseconds (19-digit), ISO 8601 strings, and most human-readable date formats.
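For numeric inputs, that kind of detection can be done by digit count alone. The sketch below shows one plausible heuristic; detect_unit is a hypothetical helper, not the tool's actual implementation:

    def detect_unit(value: int) -> str:
        # Map digit count to the most likely unit: a 10-digit value covers
        # seconds for dates between 2001 and 2286; 13 digits suggests
        # milliseconds, 16 microseconds, 19 nanoseconds.
        digits = len(str(abs(value)))
        if digits <= 10:
            return "seconds"
        if digits <= 13:
            return "milliseconds"
        if digits <= 16:
            return "microseconds"
        return "nanoseconds"

    print(detect_unit(1700000000))           # seconds
    print(detect_unit(1700000000000))        # milliseconds
    print(detect_unit(1700000000000000))     # microseconds
    print(detect_unit(1700000000000000000))  # nanoseconds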
32-bit systems store timestamps as signed integers, which will overflow at 03:14:07 UTC on January 19, 2038 (the "Year 2038 problem"). Modern systems use 64-bit integers, extending the usable range by roughly 292 billion years.
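The boundary is easy to verify yourself: the largest signed 32-bit value is 2^31 - 1 = 2147483647 seconds, and decoding it lands exactly on the overflow moment. A quick Python check:

    from datetime import datetime, timezone

    INT32_MAX = 2**31 - 1  # 2147483647, the last second a signed 32-bit time_t can hold

    print(datetime.fromtimestamp(INT32_MAX, tz=timezone.utc))
    # 2038-01-19 03:14:07+00:00 -- one second later, a 32-bit counter wraps negative

    # On a 64-bit system the next second decodes without issue.
    print(datetime.fromtimestamp(INT32_MAX + 1, tz=timezone.utc))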