JavaScript works with the number of milliseconds since the epoch, whereas most other languages work with seconds. You could work with milliseconds, but as soon as you pass a value to, say, PHP, the PHP native functions will probably fail. So to be sure, I always use seconds, not milliseconds.

This will give you a Unix timestamp (in seconds):

var unix = Math.round(+new Date()/1000);

This will give you the milliseconds since the epoch (not Unix timestamp):

var milliseconds = new Date().getTime();
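
If you would rather not construct a Date object, Date.now() returns the same millisecond count directly, and Math.floor truncates rather than rounding half a second forward. A minimal equivalent of the first snippet:

var unix = Math.floor(Date.now() / 1000);

Going the other way, multiply a Unix timestamp (seconds) by 1000 before handing it to the Date constructor:

var date = new Date(unix * 1000);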