

Unix Timestamp in Seconds vs Milliseconds Explained

Understand the difference between second and millisecond timestamps so date conversions stop breaking.

Many timestamp bugs come from mixing seconds and milliseconds. Once you know how to spot the scale, conversion becomes quick and reliable.

The key difference in one line

Seconds-based Unix timestamps are usually 10 digits, which covers dates from roughly September 2001 through the year 2286.

Millisecond-based timestamps are usually 13 digits: the same instant, multiplied by 1,000.
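The difference is easy to see side by side. A minimal sketch (the timestamp value is illustrative):

```javascript
// The same instant at both scales.
const seconds = 1700000000;    // 10 digits → seconds
const millis = 1700000000000;  // 13 digits → milliseconds

// Both decode to the same UTC date once each is expressed in milliseconds.
console.log(new Date(seconds * 1000).toISOString()); // 2023-11-14T22:13:20.000Z
console.log(new Date(millis).toISOString());         // 2023-11-14T22:13:20.000Z
```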

Why the mismatch happens

Backend systems and databases often store seconds, while frontend tools such as JavaScript's Date work in milliseconds.

Copying values between systems without checking the unit produces dates that land near 1970 or far in the future.

Quick detection workflow

Start by checking digit length, then convert using the correct unit.

  • 10 digits: treat as seconds.
  • 13 digits: treat as milliseconds.
  • If unsure, convert under both assumptions and check which yields a plausible year and timezone.
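The workflow above can be sketched as a small helper that normalizes everything to milliseconds. This is a heuristic, not a standard API; the function name `toMillis` and its thresholds are our own choices:

```javascript
// Normalize a Unix timestamp to milliseconds using the digit-length heuristic.
// Valid for dates roughly between 2001 and 2286, where seconds are
// 10 digits and milliseconds are 13.
function toMillis(timestamp) {
  const digits = String(Math.trunc(Math.abs(timestamp))).length;
  if (digits <= 10) return timestamp * 1000; // seconds → milliseconds
  if (digits === 13) return timestamp;       // already milliseconds
  throw new RangeError(`Ambiguous timestamp scale: ${timestamp}`);
}

console.log(new Date(toMillis(1700000000)).toISOString());    // seconds input
console.log(new Date(toMillis(1700000000000)).toISOString()); // milliseconds input
```

Throwing on 11- and 12-digit values is deliberate: those fall between the two common scales, so guessing is more dangerous than failing loudly.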

Common JavaScript pitfall

JavaScript's Date constructor and Date.now() work in milliseconds since the Unix epoch.

Passing a seconds value directly is therefore read as milliseconds, which produces a date in January 1970.
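A minimal demonstration of the pitfall (the timestamp value is illustrative):

```javascript
const ts = 1700000000; // a seconds timestamp, e.g. from an API response

// Wrong: Date treats the number as milliseconds since the epoch,
// so a seconds value lands about 20 days after January 1, 1970.
console.log(new Date(ts).toISOString()); // 1970-01-20T16:13:20.000Z

// Right: scale up to milliseconds first.
console.log(new Date(ts * 1000).toISOString()); // 2023-11-14T22:13:20.000Z
```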

Team best practice

Document timestamp units in API schemas and internal docs.

Clear unit labels prevent repeated debugging across teams.
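One low-tech way to document the unit is to encode it in the field name itself. A small sketch; the field names and suffix convention here are illustrative, not a standard:

```javascript
// Unit suffixes make the payload self-documenting: readers never have
// to count digits to know the scale.
const event = {
  createdAtSec: 1700000000,     // seconds since the Unix epoch
  deliveredAtMs: 1700000000000, // milliseconds since the Unix epoch
};

// Consumers convert once, at the boundary, with no guessing.
const created = new Date(event.createdAtSec * 1000);
const delivered = new Date(event.deliveredAtMs);
console.log(created.toISOString(), delivered.toISOString());
```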

Useful when

  • Debugging API date fields.
  • Fixing JavaScript date parsing.
  • Reading logs from mixed systems.
  • Validating webhook payloads.

Always check timestamp scale first

Before converting any timestamp, identify whether it is 10 digits (seconds) or 13 digits (milliseconds).

Related tools

Timestamp Converter

Convert Unix seconds, Unix milliseconds, and readable date text both ways.

