Unix timestamp: seconds vs milliseconds

The most common bug with Unix timestamps is mixing seconds and milliseconds. If your date looks like it’s in 1970 or 51382, this is probably why.

Quick rule

10 digits is usually seconds. 13 digits is usually milliseconds.

Examples
  • Seconds: 1700000000 (≈ Nov 2023)
  • Milliseconds: 1700000000000 (same moment, just ×1000)

Why it happens
  • JavaScript Date.now() returns milliseconds.
  • Most Unix tooling and many APIs historically use seconds.
  • Some systems accept both, but won’t tell you which one it assumed.
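The mismatch above is easy to reproduce. The snippet below (values chosen for illustration) shows `Date.now()` returning milliseconds, and what happens when a seconds value is passed straight to the `Date` constructor, which expects milliseconds:

```javascript
// Date.now() returns milliseconds since the Unix epoch (13 digits today).
const nowMs = Date.now();
const nowSec = Math.floor(nowMs / 1000); // what most Unix tooling expects

// Feeding seconds where milliseconds are expected lands near 1970:
new Date(1700000000).toISOString();    // "1970-01-20T16:13:20.000Z"
new Date(1700000000000).toISOString(); // "2023-11-14T22:13:20.000Z"
```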
How to detect it programmatically

Use magnitude (or digit count). A safe heuristic is:

// ts: epoch timestamp of unknown unit
// Anything with magnitude >= 1e12 (13+ digits) is assumed to be
// milliseconds already; smaller values are treated as seconds.
const ms = Math.abs(ts) >= 1e12 ? ts : ts * 1000;

That’s exactly what the converter does.

If your date is wildly wrong…
  • Seeing 1970? You probably treated seconds as milliseconds.
  • Seeing a far-future year? You probably treated milliseconds as seconds.