
Recently I was asked to refactor some code that leverages JavaScript's array.reduce() method because other developers felt the code was hard to read. While doing this, I decided to play around with some JavaScript array iteration performance benchmarks to help me validate various approaches. I wanted to know the fastest way to reduce an array, but I don't know whether I can trust the initial results:
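For context, here is a minimal sketch of the kind of reduce() call in question (the actual refactored code is not shown in this post; the names below are illustrative):

```javascript
// Sum an array with reduce() — the style the refactor replaced.
const values = [1, 2, 3, 4];
const sum = values.reduce((acc, n) => acc + n, 0);

// Equivalent explicit loop, which some developers find easier to read.
let sum2 = 0;
for (const n of values) {
  sum2 += n;
}
```

Both produce the same result; the disagreement was purely about readability.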

http://jsbench.github.io/#8803598b7401b38d8d09eb7f13f0709a (screenshots of the benchmark results)

I added the test case for "while loop array.pop() assignments" to the benchmark linked above (mostly for the fun of it), but I think there must be something wrong with the tests. The variation in ops/sec seems too large to be accurate. I fear that something is wrong with my test case, as I don't understand why this method would be so much faster.
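A fairer way to compare a destructive pop()-based loop against a non-destructive one (a sketch; the exact per-iteration setup hooks of jsbench may differ) is to copy the array inside the destructive test, so every timed run works on a full array:

```javascript
// Build a 1000-element source array once.
var source = [];
for (var i = 0; i < 1000; i++) {
  source[i] = Math.random();
}

// Destructive version: copy first, so the source survives between runs.
function sumByPop(input) {
  var copy = input.slice(); // fresh copy per run
  var total = 0;
  while (copy.length) {
    total += copy.pop();
  }
  return total;
}

// Non-destructive reverse for loop over the same data.
function sumByReverseFor(input) {
  var total = 0;
  for (var i = input.length; i--; ) {
    total += input[i];
  }
  return total;
}
```

Both functions add the elements in the same (reverse) order, so they return identical sums, and the source array keeps its 1000 elements across repeated runs.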

I have researched this quite a bit and have found a lot of conflicting information from over the years. I want to better understand what, specifically, is causing the high variance measured in the benchmark linked above. Which leads to this post: given the benchmark example (linked above and shown below), why would the while-loop test case measure over 5000x faster than its for-loop counterpart?

// Benchmark Setup
var arr = [];
var sum = 0; // reset to 0 before each test
for (var i = 0; i < 1000; i++) {
  arr[i] = Math.random();
}

// Test Case #1
// While loop, implicit comparison, inline pop code
var i = arr.length;
while (i--) {
  sum += arr.pop();
}

// Test Case #2
// Reverse for loop, implicit comparison, inline code
for (var i = arr.length; i--; ) {
  sum += arr[i];
}

*Edited: In response to the downvotes, I want this post to be useful. I added images to provide context for the links, removed unnecessary details, and refined the content to focus on the questions I am seeking answers to. I also removed a previous example that was confusing.
