I have a huge 3D NumPy tensor stored as a binary .npy file on disk (saved with np.save, and normally read with np.load). Loading it with np.load quickly uses up most of my memory.
Luckily, on each run of the program I only need a single slice of the huge tensor. The slice has a fixed size, and its coordinates are provided by an external module.
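Roughly, this is what I'm doing now (shapes are shrunk and the slice coordinates hardcoded just for illustration; the real tensor doesn't fit comfortably in memory):

```python
import numpy as np

# Build and save the tensor once. These shapes are made up for
# illustration; the real tensor is far larger.
big = np.random.rand(50, 100, 100)
np.save("big_tensor.npy", big)

# Current approach: np.load reads the ENTIRE array into memory...
tensor = np.load("big_tensor.npy")

# ...even though each run only needs one fixed-size slice, whose
# coordinates come from an external module (hardcoded here).
i, j = 5, 20
chunk = tensor[i, j:j + 10, :]  # the only part I actually use
```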
What's the best way to do this? The only approach I could come up with is storing the tensor in a MySQL database, but I'm sure there are better / easier ways. I'd also be happy to build the 3D tensor file differently if that would help.
Does the answer change if my tensor is sparse?