800

To duplicate an array in JavaScript, which of the following is faster to use?

Slice method

var dup_array = original_array.slice();

For loop

var dup_array = [];
for (var i = 0, len = original_array.length; i < len; ++i)
   dup_array[i] = original_array[i];

I know both ways do only a shallow copy: if original_array contains references to objects, objects won't be cloned, but only the references will be copied, and therefore both arrays will have references to the same objects. But this is not the point of this question.
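For illustration, a quick sketch of that shared-reference behaviour (the obj variable is just an example):

var obj = { n: 1 };
var original_array = [obj];
var dup_array = original_array.slice();    // shallow copy

dup_array[0].n = 42;                       // mutate the shared object
console.log(original_array[0].n);          // 42, both arrays point at the same object
console.log(dup_array === original_array); // false, the array itself is new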

I'm asking only about speed.

3 Comments

  • jsben.ch/#/wQ9RU <= a benchmark for the most common ways to clone an array Commented Oct 24, 2016 at 18:40
  • See also javascript - Copy array by value - Stack Overflow -- (some answer in that question does performance comparison) Commented Dec 6, 2021 at 10:14
  • Has anyone tried benchmarking using a specific function that returns the desired array? E.g. const getInitialArray = () => {return [[1, 2], [3, 4]} Commented Jun 7, 2023 at 20:33

27 Answers

924

There are at least 6 (!) ways to clone an array (a combined sketch follows this list):

  • loop
  • slice
  • Array.from()
  • concat
  • spread syntax (FASTEST)
  • map A.map(function(e){return e;});
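For reference, here is a minimal sketch of all six on a throwaway array A (variable names are just for the example):

var A = [1, 2, 3];

// loop
var B1 = []; for (var i = 0; i < A.length; i++) B1[i] = A[i];
// slice
var B2 = A.slice();
// Array.from()
var B3 = Array.from(A);
// concat
var B4 = [].concat(A);
// spread syntax
var B5 = [...A];
// map
var B6 = A.map(function (e) { return e; });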

There has been a huuuge BENCHMARKS thread, providing the following information:

  • for Blink browsers, slice() is the fastest method, concat() is a bit slower, and the while loop is 2.4x slower.

  • for other browsers, the while loop is the fastest method, since those browsers don't have internal optimizations for slice and concat.

This remained true as of July 2016.

Below are simple scripts that you can copy-paste into your browser's console and run several times to see the picture. They output milliseconds, lower is better.

while loop

n = 1000*1000;
start = + new Date();
a = Array(n); 
b = Array(n); 
i = a.length;
while(i--) b[i] = a[i];
console.log(new Date() - start);

slice

n = 1000*1000;
start = + new Date();
a = Array(n); 
b = a.slice();
console.log(new Date() - start);

Please note that these methods clone the Array object itself; the array contents, however, are copied by reference and are not deep cloned.

origAr == clonedArr //returns false
origAr[0] == clonedArr[0] //returns true

19 Comments

@cept0 no emotions, just benchmarks jsperf.com/new-array-vs-splice-vs-slice/31
@Dan So what? Your test case results: Firefox 30 nightly is still ~230% faster than Chrome. Check the source code of V8 for splice and you'll be surprised (while...)
Sadly for short arrays the answer is vastly different. For example cloning an array of listeners before calling each of them. Those arrays are often small, usually 1 element.
You missed this method: A.map(function(e){return e;});
You're writing about blink browsers. Isn't blink just a layout engine, mainly affecting HTML rendering, and thus unimportant? I thought we'd rather talk about V8, Spidermonkey and friends here. Just a thing that confused me. Enlighten me, if I'm wrong.
343

Technically slice is the fastest way. However, it is even faster if you add the 0 begin index.

myArray.slice(0);

is faster than

myArray.slice();

https://jsben.ch/F0SZ3
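Since benchmark links tend to rot, here is a rough console sketch you can paste in yourself; the numbers are machine-dependent and only indicative:

var myArray = Array.from({ length: 1000000 }, Math.random);

console.time('slice()');
for (var i = 0; i < 100; i++) myArray.slice();
console.timeEnd('slice()');

console.time('slice(0)');
for (var j = 0; j < 100; j++) myArray.slice(0);
console.timeEnd('slice(0)');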

6 Comments

And is myArray.slice(0,myArray.length-1); faster than myArray.slice(0); ?
@jave.web you've just dropped the last element of the array. A full copy is array.slice(0) or array.slice(0, array.length)
This is incorrect, at least on my machine and according to your own benchmarks.
The link is dead.
jsben.ch/56xWo - sometimes, slice() is faster, sometimes slice(0), both only marginally so (in Firefox 56 and latest Vivaldi, Chrome-based). But slice(0, length) is always noticeably slower (except that it's the fastest in Firefox 87).
197

What about the ES6 way?

arr2 = [...arr1];

8 Comments

if converted with babel: [].concat(_slice.call(arguments))
Not sure where arguments is coming from... I think your babel output is conflating a few different features. It's more likely to be arr2 = [].concat(arr1).
@SterlingArcher arr2 = [].concat(arr1) is different from arr2 = [...arr1]. The [...arr1] syntax will convert holes to undefined. For example, arr1 = Array(1); arr2 = [...arr1]; arr3 = [].concat(arr1); 0 in arr2 !== 0 in arr3.
I tested this in my browser (Chrome 59.0.3071.115) against Dan's answer above. It was more than 10 times slower than .slice(). n = 1000*1000; start = + new Date(); a = Array(n); b = [...a]; console.log(new Date() - start); // 168
Still will not clone something like this: [{a: 'a', b: {c: 'c'}}]. If c's value is changed in the "duplicated" array, it will change in the original array, since it's just a referential copy, not a clone.
52

Easiest way to deep clone Array or Object:

var dup_array = JSON.parse(JSON.stringify(original_array))
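Note that this inherits JSON's limitations (see the comments below); a quick sketch of what gets lost:

var original_array = [1, undefined, function () {}, new Date()];
var dup_array = JSON.parse(JSON.stringify(original_array));

console.log(dup_array);
// [1, null, null, "<ISO date string>"]: undefined and the function become null,
// and the Date is flattened to a string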

7 Comments

Important note for beginners: because this depends upon JSON, this also inherits its limitations. Among other things, that means your array cannot contain undefined or any functions. Both of those will be converted to null for you during the JSON.stringify process. Other strategies, such as (['cool','array']).slice() will not change them but also do not deep clone objects within the array. So there is a tradeoff.
Very bad perf and don't work with special objects like DOM, date, regexp, function ... or prototyped objects. Don't support cyclic references. You should never use JSON for deep clone.
worst possible way! Only use if for some issue all other doesn't work. It's slow, it's resources intense and it has all JSON limitations already mentioned in comments. Can't imagine how it got 25 up-votes.
It deep copies arrays with primitives, and where properties are arrays with further primitives/arrays. For that it is ok.
I tested this in my browser (Chrome 59.0.3071.115) against Dan's answer above. It was nearly 20 times slower than .slice(). n = 1000*1000; start = + new Date(); a = Array(n); var b = JSON.parse(JSON.stringify(a)) console.log(new Date() - start); // 221
50

🏁 Fastest Way to Clone an Array

I made this very plain utility function to test the time it takes to clone an array. It is not 100% reliable; however, it can give you a ballpark idea of how long each approach takes to clone an existing array:

function clone(fn) {
  const arr = [...Array(1000000)];
  console.time('timer');
  fn(arr);
  console.timeEnd('timer');
}

And tested different approaches:

1)   5.79ms -> clone(arr => Object.values(arr));
2)   7.23ms -> clone(arr => [].concat(arr));
3)   9.13ms -> clone(arr => arr.slice());
4)  24.04ms -> clone(arr => { const a = []; for (let val of arr) { a.push(val); } return a; });
5)  30.02ms -> clone(arr => [...arr]);
6)  39.72ms -> clone(arr => JSON.parse(JSON.stringify(arr)));
7)  99.80ms -> clone(arr => arr.map(i => i));
8) 259.29ms -> clone(arr => Object.assign([], arr));
9) Maximum call stack size exceeded -> clone(arr => Array.of(...arr));

UPDATES

  1. Tests were made back in 2018, so most likely, you'll get different results with current browsers.
  2. Out of these methods, the best way to deep clone an array is by using JSON.parse(JSON.stringify(arr))
  3. Do not use the above if your array might include functions, as those will be converted to null.
    Thank you @GilEpshtain for this update.

9 Comments

I tried benchmarking your answer and I got very different results: jsben.ch/o5nLG
@mesqueeb, the tests might change, depending on your machine of course. However, feel free to update the answer with your test result. Nice work!
I like your answer a lot; however, I tried your test and got that arr => arr.slice() is the fastest.
@LiorElrom, your update isn't correct, due to the fact that methods aren't serializable. For example: JSON.parse(JSON.stringify([function(){}])) will output [null]
Nice benchmark. I've tested this on my Mac in 2 browsers: Chrome Version 81.0.4044.113 and Safari Version 13.1 (15609.1.20.111.8) and fastest is spread operation: [...arr] with 4.653076171875ms in Chrome and 8.565ms in Safari. Second fast in Chrome is slice function arr.slice() with 6.162109375ms and in Safari second is [].concat(arr) with 13.018ms.
37
var cloned_array = [].concat(target_array);
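Since the comments below ask for context, a short illustration of what this one-liner does: concat returns a new array made of the receiver's elements followed by the argument's elements, so concatenating onto an empty array yields a shallow copy:

var target_array = [1, 2, 3];
var cloned_array = [].concat(target_array);

cloned_array.push(4);
console.log(target_array);  // [1, 2, 3] (unchanged)
console.log(cloned_array);  // [1, 2, 3, 4]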

7 Comments

Please explain what this does.
While this code snippet may answer the question, it doesn't provide any context to explain how or why. Consider adding a sentence or two to explain your answer.
I hate this kind of comments. It's obvious what it does!
A simple answer for a simple question, no big story to read. I like this kind of answer. +1
"I'm asking only about speed" - This answer gives no indication on speed. That is the main question being asked. brandonscript has a good point. More information is needed to consider this an answer. But if it were a simpler question, this would be an excellent answer.
26

I put together a quick demo: http://jsbin.com/agugo3/edit

My results on Internet Explorer 8 are 156, 782, and 750, which would indicate slice is much faster in this case.

1 Comment

Don't forget the additional cost of the garbage collector if you have to do this very fast a lot. I was copying each neighbour array for each cell in my cellular automata using slice and it was much slower than reusing a previous array and copying the values. Chrome indicated about 40% of the total time was spent garbage collecting.
21

a.map(e => e) is another alternative for this job. As of today .map() is very fast (almost as fast as .slice(0)) in Firefox, but not in Chrome.

On the other hand, if an array is multi-dimensional, then since arrays are objects and objects are reference types, neither slice nor concat will be a cure... So one proper way of cloning such an array is to define an Array.prototype.clone() as follows.

Array.prototype.clone = function(){
  return this.map(e => Array.isArray(e) ? e.clone() : e);
};

var arr = [ 1, 2, 3, 4, [ 1, 2, [ 1, 2, 3 ], 4 , 5], 6 ],
    brr = arr.clone();
brr[4][2][1] = "two";
console.log(JSON.stringify(arr));
console.log(JSON.stringify(brr));

1 Comment

Not bad, but unfortunately this doesn't work if you have Object in your array :\ JSON.parse(JSON.stringify(myArray)) works better in this case.
21

The fastest way to clone an array of objects is to use the spread operator:

var clonedArray = [...originalArray]
or
var clonedArray = originalArray.slice(0); // with the 0 index it's a little bit faster than plain slice()

But the objects inside that cloned array will still point to the old memory locations; hence a change to the clonedArray's objects will also change the originalArray. So

var clonedArray = originalArray.map(({...ele}) => {return ele})

This will not only create a new array, but the objects will be cloned too.

Disclaimer: if you are working with nested objects, the spread operator will only perform a SHALLOW CLONE. In that case it's better to use

var clonedArray = JSON.parse(JSON.stringify(originalArray));

3 Comments

You are the only one who noticed the memory location. Extra points for that!!
Thank you so much for emphasizing the memory location. Actually, that's probably the main reason why you need a 'decoupled' array clone.
JSON.parse(JSON.stringify(x)) is always a bad idea. It's lossy, and there's no point whatsoever in making a round-trip through text if you're trying to clone structured information.
9

Take a look at: link. It's not about speed, but comfort. Besides, as you can see, you can only use slice(0) on primitive types.

To make an independent copy of an array rather than a copy of the reference to it, you can use the array slice method.

Example:

var oldArray = ["mip", "map", "mop"];
var newArray = oldArray.slice();

To copy or clone an object :

function cloneObject(source) {
    for (var i in source) {
        if (typeof source[i] == 'object') {
            this[i] = new cloneObject(source[i]);
        } else {
            this[i] = source[i];
        }
    }
}

var obj1= {bla:'blabla',foo:'foofoo',etc:'etc'};
var obj2= new cloneObject(obj1);

Source: link

2 Comments

The primitive types comment applies to the for loop in the question as well.
if I were copying an array of objects, I would expect the new array to reference the same objects rather than cloning the objects.
8

ECMAScript 2015 way with the Spread operator:

Basic examples:

var copyOfOldArray = [...oldArray]
var twoArraysBecomeOne = [...firstArray, ...secondArray]

Try in the browser console:

var oldArray = [1, 2, 3]
var copyOfOldArray = [...oldArray]
console.log(oldArray)
console.log(copyOfOldArray)

var firstArray = [5, 6, 7]
var secondArray = ["a", "b", "c"]
var twoArraysBecomeOne = [...firstArray, ...secondArray]
console.log(twoArraysBecomeOne);


2 Comments

Probably the only thing that is fast with the spread is to type it. It is waaay less performant than other ways of doing it.
Please provide some links about your argument.
8

Benchmark time!

function log(data) {
  document.getElementById("log").textContent += data + "\n";
}

benchmark = (() => {
  time_function = function(ms, f, num) {
    var z = 0;
    var t = new Date().getTime();
    for (z = 0;
      ((new Date().getTime() - t) < ms); z++)
      f(num);
    return (z)
  }

  function clone1(arr) {
    return arr.slice(0);
  }

  function clone2(arr) {
    return [...arr]
  }

  function clone3(arr) {
    return [].concat(arr);
  }

  Array.prototype.clone = function() {
    return this.map(e => Array.isArray(e) ? e.clone() : e);
  };

  function clone4(arr) {
    return arr.clone();
  }


  function benchmark() {
    function compare(a, b) {
      if (a[1] > b[1]) {
        return -1;
      }
      if (a[1] < b[1]) {
        return 1;
      }
      return 0;
    }

    funcs = [clone1, clone2, clone3, clone4];
    results = [];
    funcs.forEach((ff) => {
      console.log("Benchmarking: " + ff.name);
      var s = time_function(2500, ff, Array(1024));
      results.push([ff, s]);
      console.log("Score: " + s);

    })
    return results.sort(compare);
  }
  return benchmark;
})()
log("Starting benchmark...\n");
res = benchmark();

console.log("Winner: " + res[0][0].name + " !!!");
count = 1;
res.forEach((r) => {
  log((count++) + ". " + r[0].name + " score: " + Math.floor(10000 * r[1] / res[0][1]) / 100 + ((count == 2) ? "% *winner*" : "% speed of winner.") + " (" + Math.round(r[1] * 100) / 100 + ")");
});
log("\nWinner code:\n");
log(res[0][0].toString());
<textarea rows="50" cols="80" style="font-size: 16; resize:none; border: none;" id="log"></textarea>

The benchmark will run for about 10 seconds after you click the Run button.

My results:

Chrome (V8 engine):

1. clone1 score: 100% *winner* (4110764)
2. clone3 score: 74.32% speed of winner. (3055225)
3. clone2 score: 30.75% speed of winner. (1264182)
4. clone4 score: 21.96% speed of winner. (902929)

Firefox (SpiderMonkey Engine):

1. clone1 score: 100% *winner* (8448353)
2. clone3 score: 16.44% speed of winner. (1389241)
3. clone4 score: 5.69% speed of winner. (481162)
4. clone2 score: 2.27% speed of winner. (192433)

Winner code:

function clone1(arr) {
    return arr.slice(0);
}

Winner engine:

SpiderMonkey (Mozilla/Firefox)

Comments

7

As @Dan said, "This answer becomes outdated fast. Use benchmarks to check the actual situation." There is one specific contender from the jsperf thread that has not had an answer of its own here: the while loop:

var i = a.length;
while(i--) { b[i] = a[i]; }

had 960,589 ops/sec, with the runner-up a.concat() at 578,129 ops/sec, which is about 60% of that.

This was on the latest Firefox (40), 64-bit.


@aleclarson created a new, more reliable benchmark.

4 Comments

You should really link the jsperf. The one you are thinking of is broken, because a new array is created in every test case, except the 'while loop' test.
I made a new jsperf that is more accurate: jsperf.com/clone-array-3
60% what? 60% faster?
@PeterMortensen: 587192 is ~60% (61.1...) of 960589.
6

It depends on the browser. If you look in the blog post Array.prototype.slice vs manual array creation, there is a rough guide to performance of each:

[Screenshot of the benchmark code from the post]

Results:

[Screenshot of the benchmark results]
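Since the screenshots are not reproduced here, the following is only a rough reconstruction of the kind of comparison the post makes (per the comments below, the post runs slice against the arguments collection; the exact code and numbers in the post may differ):

function viaSlice() {
  // force Array.prototype.slice to run on the array-like arguments object
  return Array.prototype.slice.call(arguments);
}

function viaManualLoop() {
  var arr = new Array(arguments.length);
  for (var i = 0; i < arguments.length; i++) arr[i] = arguments[i];
  return arr;
}

console.time('slice');
for (var n = 0; n < 100000; n++) viaSlice(1, 2, 3, 4, 5);
console.timeEnd('slice');

console.time('manual');
for (var m = 0; m < 100000; m++) viaManualLoop(1, 2, 3, 4, 5);
console.timeEnd('manual');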

3 Comments

arguments is not a proper array and he's using call to force slice to run on the collection. results may be misleading.
Yeah, I meant to mention in my post that these stats would probably change now with the browsers improving, but it gives a general idea.
@diugalde I think the only situation where posting code as a picture is acceptable is when the code is potentially dangerous and should not be copy-pasted. In this case though, it's quite ridiculous.
6

There is a much cleaner solution:

var srcArray = [1, 2, 3];
var clonedArray = srcArray.length === 1 ? [srcArray[0]] : Array.apply(this, srcArray);

The length check is required, because the Array constructor behaves differently when it is called with exactly one argument.
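A short illustration of why that special case exists (example values are mine):

Array.apply(null, [7, 8]);  // [7, 8]: two arguments become two elements
Array.apply(null, [7]);     // a sparse array of length 7, NOT [7]

var single = [7];
var clonedSingle = single.length === 1 ? [single[0]] : Array.apply(null, single);
console.log(clonedSingle);  // [7]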

4 Comments

But is it the fastest?
More semantic than splice(), perhaps. But really, apply and this is all but intuitive.
shows the slowest performance on chrome- jsperf.com/new-array-vs-splice-vs-slice/113
You can use Array.of and ignore the length: Array.of.apply(Array, array)
6

Remember that .slice() won't deep-copy two-dimensional arrays (arrays of arrays); only the outer array is copied. You'll need a function like this:

function copy(array) {
  return array.map(function(arr) {
    return arr.slice();
  });
}

1 Comment

In Javascript there aren't two-dimensional arrays. There are just arrays containing arrays. What you are trying to do is a deep copy which is not required in the question.
5

It depends on the length of the array. If the array length is <= 1,000,000, the slice and concat methods take approximately the same time. But for larger arrays, the concat method wins.

For example, try this code:

var original_array = [];
for(var i = 0; i < 10000000; i ++) {
    original_array.push( Math.floor(Math.random() * 1000000 + 1));
}

function a1() {
    var dup = [];
    var start = Date.now();
    dup = original_array.slice();
    var end = Date.now();
    console.log('slice method takes ' + (end - start) + ' ms');
}

function a2() {
    var dup = [];
    var start = Date.now();
    dup = original_array.concat([]);
    var end = Date.now();
    console.log('concat method takes ' + (end - start) + ' ms');
}

function a3() {
    var dup = [];
    var start = Date.now();
    for(var i = 0; i < original_array.length; i ++) {
        dup.push(original_array[i]);
    }
    var end = Date.now();
    console.log('for loop with push method takes ' + (end - start) + ' ms');
}

function a4() {
    var dup = [];
    var start = Date.now();
    for(var i = 0; i < original_array.length; i ++) {
        dup[i] = original_array[i];
    }
    var end = Date.now();
    console.log('for loop with = method takes ' + (end - start) + ' ms');
}

function a5() {
    var dup = new Array(original_array.length)
    var start = Date.now();
    for(var i = 0; i < original_array.length; i ++) {
        dup.push(original_array[i]);
    }
    var end = Date.now();
    console.log('for loop with = method and array constructor takes ' + (end - start) + ' ms');
}

a1();
a2();
a3();
a4();
a5();

If you set the length of original_array to 1,000,000, the slice method and concat method are taking approximately the same time (3-4 ms, depending on the random numbers).

If you set the length of original_array to 10,000,000, then the slice method takes over 60 ms and the concat method takes over 20 ms.

1 Comment

dup.push is wrong in a5, instead dup[i] = should be used
4

In ES6, you can simply utilize the Spread syntax.

Example:

let arr = ['a', 'b', 'c'];
let arr2 = [...arr];

Please note that the spread operator generates a completely new array, so modifying one won't affect the other.

Example:

arr2.push('d') // becomes ['a', 'b', 'c', 'd']
console.log(arr) // while arr retains its values ['a', 'b', 'c']

Comments

4

I was redirected here from another question about shallow cloning Arrays, and after finding out that most of the links are either dead, outdated, or broken, figured I'd post a solution that you can just run in your own environment.

The following code should semi-accurately measure how much time it takes to clone an array using specific approaches. You can run it directly in your browser using the developer console, or in Node.js. The latest version of it can always be found here.

function benchTime(cycles, timeLimit, fnSetup, ...fnProcess) {
    function measureCycle(timeLimit, fn, args) {
        let tmp = null;
        let end, start = performance.now();
        let iterations = 0;

        // Run until we exceed the time limit for one cycle.
        do {
            tmp = fn.apply(null, args);
            end = performance.now();
            ++iterations;
        } while ((end - start) <= timeLimit);
        tmp = undefined;

        // Build a result object and return it.
        return {
            "iterations": iterations,
            "start": start,
            "end": end,
            "duration": end - start,
            "opsPerSec": (iterations / (end - start)) * 1000.0,
        };
    }

    console.log(`Measuring ${fnProcess.length} functions...`);
    let params = fnSetup();
    //console.log("Setup function returned:", params);

    // Perform this for every function passed.
    for (let fn of fnProcess) {
        let results = [];
        console.groupCollapsed(`${fn.name}: Running for ${cycles} cycles...`);
        // Perform this N times.
        for (let cycle = 0; cycle < cycles; cycle++) {
            let result = {
                "iterations": Number.NaN,
                "start": Number.NaN,
                "end": Number.NaN,
                "duration": Number.NaN,
                "opsPerSec": Number.NaN,
            };

            try {
                result = measureCycle(timeLimit, fn, params);
                results.push(result);
            } catch (ex) {
                console.error(`${fn.name}:`, ex);
                break;
            }

            console.log(`Cycle ${cycle}/${cycles}: ${result.iterations}, ${result.end - result.start}, ${result.opsPerSec} ops/s`);
        }

        // If we have more than 3 repeats, drop slowest and fastest as outliers.
        if (results.length > 3) {
            console.log("Dropping slowest and fastest result.");
            results = results.sort((a, b) => {
                return (a.end - a.start) - (b.end - b.start);
            }).slice(1);
            results = results.sort((a, b) => {
                return (b.end - b.start) - (a.end - a.start);
            }).slice(1);
        }
        console.groupEnd();

        // Merge all results for the final average.
        let iterations = 0;
        let totalTime = 0;
        let opsPerSecMin = +Infinity;
        let opsPerSecMax = -Infinity;
        let opsPerSec = 0;
        for (let result of results) {
            iterations += result.iterations;
            totalTime += result.duration;
            opsPerSec += result.opsPerSec;
            if (opsPerSecMin > result.opsPerSec) {
                opsPerSecMin = result.opsPerSec;
            }
            if (opsPerSecMax < result.opsPerSec) {
                opsPerSecMax = result.opsPerSec;
            }
        }
        let operations = opsPerSec / results.length; //iterations / totalTime;
        let operationVariance = opsPerSecMax - opsPerSecMin;
        console.log(`${fn.name}: ${(operations).toFixed(2)}±${(operationVariance).toFixed(2)} ops/s, ${iterations} iterations over ${totalTime} ms.`);
    }
    console.log("Done.");
}

function spread(arr) { return [...arr]; }
function spreadNew(arr) { return new Array(...arr); }
function arraySlice(arr) { return arr.slice(); }
function arraySlice0(arr) { return arr.slice(0); }
function arrayConcat(arr) { return [].concat(arr); }
function arrayMap(arr) { return arr.map(i => i); }
function objectValues(arr) { return Object.values(arr); }
function objectAssign(arr) { return Object.assign([], arr); }
function json(arr) { return JSON.parse(JSON.stringify(arr)); }
function loop(arr) { const a = []; for (let val of arr) { a.push(val); } return a; }

benchTime(
    10, 1000,
    () => {
        let arr = new Array(16384);
        for (let a = 0; a < arr.length; a++) { arr[a] = Math.random(); };
        return [arr];
    },
    spread,
    spreadNew,
    arraySlice,
    arraySlice0,
    arrayConcat,
    arrayMap,
    objectValues,
    objectAssign,
    json,
    loop
);

I've run this for a few sizes, but here's a copy of the 16384 element data:

| Test | NodeJS v20.12.2 | Firefox v127.0b2 | Edge 124.0.2478.97 |
|---|---|---|---|
| spread | 13.180±1.022 ops/ms | 7.436±1.110 ops/ms | 3.321±0.239 ops/ms |
| spreadNew | 4.727±0.532 ops/ms | 1.010±0.022 ops/ms | 21.045±2.982 ops/ms |
| arraySlice | 12.912±2.127 ops/ms | 11046.737±237.575 ops/ms | 494.359±32.726 ops/ms |
| arraySlice0 | 13.192±0.477 ops/ms | 10665.299±500.553 ops/ms | 492.209±66.837 ops/ms |
| arrayConcat | 16.590±0.656 ops/ms | 7923.657±224.637 ops/ms | 476.975±112.053 ops/ms |
| arrayMap | 6.542±0.301 ops/ms | 32.960±3.743 ops/ms | 52.127±3.472 ops/ms |
| objectValues | 4.339±0.111 ops/ms | 10840.392±619.567 ops/ms | 115.369±3.217 ops/ms |
| objectAssign | 0.270±0.013 ops/ms | 10471.860±202.291 ops/ms | 83.135±3.439 ops/ms |
| json | 0.205±0.039 ops/ms | 4.014±1.679 ops/ms | 7.730±0.319 ops/ms |
| loop | 6.138±0.287 ops/ms | 6.727±1.296 ops/ms | 27.691±1.217 ops/ms |

Overall, it appears as if the following is true:

  • (arraySlice)array.slice(), (arraySlice0)array.slice(0) and (arrayConcat)[].concat(array) are equivalently fast. arraySlice appears to be minimally faster.
  • In Chromium-based Browsers only, (objectAssign)Object.assign([], array) is faster with massive arrays, but brings the downside of eliminating any entries that are undefined.
  • The worst ways overall to shallow-clone arrays are (spreadNew) new Array(...array), (arrayMap) array.map(i => i), (objectValues) Object.values(array), (objectAssign) Object.assign([], array), (json) JSON.parse(JSON.stringify(array)), and (loop) const a = []; for (let val of array) { a.push(val); } return a;
  • Firefox appears to be optimizing in a way that outright looks like cheating. I've failed to identify a way to make Firefox look normal.

Disclaimer: These results are from my personal system, and I reserve the right to have made an error in my code.

Comments

3
const arr = ['1', '2', '3'];

// Old way
const cloneArr = arr.slice();

// ES6 way
const cloneArrES6 = [...arr];

// But the problem with the spread approach is that if you are using a
// multi-dimensional array, then only the first level is copied

const nums = [
  [1, 2],
  [10],
];

const cloneNums = [...nums];

// Let's change the first item in the first nested item in our cloned array.

cloneNums[0][0] = '8';

console.log(cloneNums);
// [ [ '8', 2 ], [ 10 ] ]

// NOOooo, the original is also affected
console.log(nums);
// [ [ '8', 2 ], [ 10 ] ]

So, to avoid these scenarios, use

const arr = ['1', '2', '3'];

const cloneArr = Array.from(arr);

1 Comment

It's a valid thing to point out about how changing cloneNums[0][0] in your example propagated the change to nums[0][0] - but that's because the nums[0][0] is effectively an object whose reference is copied into cloneNums by the spread operator. All that is to say, this behaviour won't affect code where we are copying by value (int, string etc literals).
3

There are several ways to clone an array. Basically, cloning falls into two categories:

  1. Shallow copy
  2. Deep copy

Shallow copies only cover the first level of the array; the rest is referenced. If you want a true copy of nested elements in the arrays, you'll need a deep clone.

Example :

const arr1 = [1,2,3,4,5,6,7]           
// Normal Array (shallow copy is enough)     
const arr2 = [1,2,3,[4],[[5]],6,7]          
// Nested Array  (Deep copy required) 


Approach 1: Using the (...) spread operator (shallow copy is enough)
const newArray = [...arr1]  // [1,2,3,4,5,6,7]

Approach 2: Using the built-in Array slice method (shallow copy)
const newArray = arr1.slice()  // [1,2,3,4,5,6,7]

Approach 3: Using the built-in Array concat method (shallow copy)
const newArray = [].concat(arr1)  // [1,2,3,4,5,6,7]

Approach 4: Using JSON.stringify/parse (deep copy)
const newArray = JSON.parse(JSON.stringify(arr2))  // [1,2,3,[4],[[5]],6,7]

Approach 5: Using your own recursive function (a sketch follows) or Lodash's _.cloneDeep method (deep copy)

Comments

2

A simple solution:

original = [1,2,3]
cloned = original.map(x=>x)

Comments

1

Fast ways to duplicate an array in JavaScript, in order:

#1: array1copy = [...array1];

#2: array1copy = array1.slice(0);

#3: array1copy = array1.slice();

If you need a deep copy and your array's objects do not contain JSON-non-serializable content (functions, Number.POSITIVE_INFINITY, etc.), you can use

array1copy = JSON.parse(JSON.stringify(array1))

Comments

1

You can use the following code. It clones the array in an immutable way:


const array = [1, 2, 3, 4]

const newArray = [...array]
newArray.push(6)
console.log(array)
console.log(newArray)

Comments

1

If you want a REAL cloned object/array in JS, with all attributes and sub-objects cloned as well:

export function clone(arr) {
    return JSON.parse(JSON.stringify(arr))
}

ALL other operations do not create full clones, because they only create a new root array; the objects inside are still shared, unless you traverse recursively through the object tree.

For a simple copy, those operations are OK. For operations where storage addresses matter, I suggest (and in most other cases, because this is fast!) converting to a string and back into a completely new object.

Comments

1

May 2024

Chrome

[Screenshot of Chrome benchmark results]

Firefox

never finishes this benchmark, certainly an issue with the benchmarking tool.

https://jsben.ch/lO6C5

Also, structuredClone is now available which is very slow (3%)
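For completeness, a small structuredClone example; unlike the shallow methods above it produces a deep copy:

const original = [{ a: 1 }, [2, 3]];
const copy = structuredClone(original);

copy[0].a = 99;
console.log(original[0].a); // 1, nested objects were cloned too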

Comments

0

If you are talking about slice, it is used to copy elements from an array and create a clone with the same number of elements or fewer.

var arr = [1, 2, 3 , 4, 5];

function slc(array) {
  var sliced = array.slice(0, 5);
  // array.slice(index to start copying from, index to stop before (exclusive))
  console.log(sliced);
}
slc(arr);

Comments
