Why can JavaScript handle timestamps beyond 2038?

All dates created with the JavaScript Date constructor are measured in milliseconds from 01 January 1970 00:00:00 Universal Time (UTC), with a day containing 86,400,000 milliseconds. This implies that JS uses a UNIX-style timestamp. I set my system clock to a date beyond 2038 (say 14 Nov 2039) and ran this script:

    <script>
      var d = new Date();
      alert(d.getFullYear()+" "+d.getMonth()+" "+d.getDate());
    </script>

It alerts 2039 10 14 successfully (getMonth() is zero-based, so November shows as 10), unlike PHP, which prints “9 Oct, 1903 07:45:59”.
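
(As an aside, the same check can be reproduced without touching the system clock, by building the target date directly from its components; a minimal sketch:)

    <script>
      // Build 14 Nov 2039 directly; months are zero-based, so November is 10
      var d = new Date(2039, 10, 14);
      // getTime() exposes the underlying millisecond timestamp,
      // far past the 32-bit limit of 2,147,483,647 seconds
      alert(d.getTime() + " ms -> " + d.getFullYear() + " " + d.getMonth() + " " + d.getDate());
    </script>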

How does JS handle this? An explanation would be appreciated, as I am confused!

Answer

32-bit PHP uses 32-bit signed integers, whose maximum value (2,147,483,647) puts the last UNIX timestamp they can express in January 2038. That’s widely known as the Y2K38 problem, and it affects virtually all 32-bit software using UNIX timestamps. Moving to 64 bits, or to libraries that work with other timestamp representations (in the case of PHP, the DateTime class), solves the problem.
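
A quick way to see exactly where that limit falls, sketched in JavaScript (which can hold the value comfortably):

    <script>
      // Largest value a signed 32-bit integer can hold: 2^31 - 1 seconds
      var maxInt32 = Math.pow(2, 31) - 1; // 2147483647
      // UNIX timestamps count seconds; the Date constructor wants milliseconds
      var lastDate = new Date(maxInt32 * 1000);
      alert(lastDate.toUTCString()); // "Tue, 19 Jan 2038 03:14:07 GMT"
    </script>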

JavaScript doesn’t have a separate integer type, only 64-bit IEEE 754 floats, which have a vastly larger range (but in return represent integers exactly only up to 2^53). Date values are therefore not constrained by the 32-bit limit; ECMAScript caps them at ±8,640,000,000,000,000 milliseconds from the epoch, which reaches roughly the year 275760.
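
A small sketch illustrating both of those limits:

    <script>
      // Integers are exact in a double only up to 2^53
      alert(Math.pow(2, 53));     // 9007199254740992
      alert(Math.pow(2, 53) + 1); // also 9007199254740992 -- precision lost
      // ECMAScript caps Date at +/-8,640,000,000,000,000 ms from the epoch
      alert(new Date(8640000000000000).toUTCString()); // "Sat, 13 Sep 275760 00:00:00 GMT"
      alert(new Date(8640000000000001).getTime());     // NaN -- out of range
    </script>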
