You can use torch.stack:

torch.stack(li, dim=0)

Calling it after the for loop gives you a torch.Tensor of that size.
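For instance (a minimal sketch, where `li` stands in for the list built up in your loop):

```python
import torch

# Hypothetical: a list of three 768-dim tensors, as a for loop might produce.
li = [torch.randn(768) for _ in range(3)]

# Stacking along dim=0 combines them into a single (3, 768) tensor.
stacked = torch.stack(li, dim=0)
print(stacked.shape)  # torch.Size([3, 768])
```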

Note that if you know in advance the size of the final tensor, you can allocate an empty tensor beforehand and fill it in the for loop:

x = torch.empty(size=(len(items), 768))
for i in range(len(items)):
    x[i] = calc_result  # the tensor computed for the i-th item

This is usually faster than doing the stack.
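Putting the two approaches side by side (a sketch; `calc` is a hypothetical stand-in for whatever each iteration computes):

```python
import torch

# Hypothetical per-item computation; replace with your real one.
def calc(item):
    return torch.full((768,), float(item))

items = [0, 1, 2]

# Preallocate once, then fill row by row -- no intermediate Python list.
x = torch.empty(size=(len(items), 768))
for i in range(len(items)):
    x[i] = calc(items[i])

# Equivalent result via collecting into a list and stacking afterwards.
stacked = torch.stack([calc(it) for it in items], dim=0)
print(torch.equal(x, stacked))  # True
```

Both produce the same `(len(items), 768)` tensor; the preallocated version avoids holding every intermediate tensor in a list before the final copy.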

Source Link
iacolippo