Guide
Unix timestamps in seconds vs milliseconds
Learn to tell the units apart to avoid incorrect dates when converting timestamps.
Many date bugs come from mixing seconds and milliseconds. Detecting the scale solves almost all of them.
The key difference in one line
Seconds-based Unix timestamps are usually 10 digits.
Milliseconds-based timestamps are usually 13 digits.
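The digit-length rule can be sketched as a small helper. `guessTimestampUnit` is a hypothetical name, and the 10/13-digit heuristic assumes present-day timestamps (it breaks for dates before 2001 or far in the future):

```javascript
// Guess the unit of a Unix timestamp from its digit count.
// Heuristic only: valid for roughly 2001–2286 in seconds.
function guessTimestampUnit(ts) {
  const digits = String(Math.trunc(Math.abs(ts))).length;
  if (digits === 10) return "seconds";
  if (digits === 13) return "milliseconds";
  return "unknown";
}

console.log(guessTimestampUnit(1700000000));    // 10 digits -> "seconds"
console.log(guessTimestampUnit(1700000000000)); // 13 digits -> "milliseconds"
```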
Why the mismatch happens
Backend systems may store seconds while frontend tools expect milliseconds.
Copying values between systems without checking units creates wrong dates.
Quick detection workflow
Start by checking digit length, then convert using the correct unit.
- 10 digits: treat as seconds.
- 13 digits: treat as milliseconds.
- If unsure, test both and validate expected year/timezone.
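The workflow above can be condensed into a normalization step, a sketch assuming modern timestamps. The 1e12 threshold is an assumption: 10-digit seconds values fall below it, 13-digit milliseconds values above it:

```javascript
// Normalize any Unix timestamp to milliseconds.
// Assumption: values below 1e12 are seconds, otherwise milliseconds.
function toMilliseconds(ts) {
  return Math.abs(ts) < 1e12 ? ts * 1000 : ts;
}

// Validate by checking that the resulting year is plausible.
const ms = toMilliseconds(1700000000);      // seconds input
console.log(new Date(ms).getUTCFullYear()); // 2023 -> plausible
```

Normalizing at the system boundary, rather than converting ad hoc at each call site, keeps the rest of the code working in a single unit.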
Common JavaScript pitfall
JavaScript Date APIs usually expect milliseconds.
Passing a seconds value directly produces a date near January 1970, because the value is interpreted as milliseconds since the epoch.
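A minimal demonstration of the pitfall, using an arbitrary example timestamp:

```javascript
const seconds = 1700000000; // a seconds-based timestamp (Nov 2023)

// Date() expects milliseconds, so a raw seconds value lands near the epoch.
const wrong = new Date(seconds);        // interpreted as ms -> Jan 1970
const right = new Date(seconds * 1000); // scale to ms first

console.log(wrong.getUTCFullYear()); // 1970
console.log(right.getUTCFullYear()); // 2023
```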
Team best practice
Document timestamp units in API schemas and internal docs.
Clear unit labels prevent repeated debugging across teams.
Useful when
- Debugging API date fields.
- Fixing JavaScript date parsing.
- Reading logs from mixed systems.
- Validating webhook payloads.
Always check timestamp scale first
Before converting any timestamp, identify whether it is 10 digits (seconds) or 13 digits (milliseconds).