
I'm trying to allocate a vector<vector<Class>>, where each Class itself contains a vector<AnotherClass>, but I get an allocation error. So my question is: does the max_size() reported for one variable apply to all the vectors of my program?

Can I change this limit by changing my compiler?

Here is the code I used to check that:

#include <iostream>
#include <vector>

using namespace std;

class Couches
{
public:
    Couches() : m_value(-1) {}
    ~Couches() {}

    void initialize(const int& value) {
        m_value = value;
    }
private:
    int m_value;
};

class Case
{
public:
    Case() {}
    ~Case() {}

    void initialize(const int& hauteur) {
        m_couches.resize(hauteur);
        for (int i(0); i<hauteur;i++)
            m_couches[i].initialize(i);
    }
private:
    vector<Couches> m_couches;
};


void bug1()
{
    vector<vector<Case>> m_cases;
    m_cases.resize(5000, vector<Case>(5000));

    cout << m_cases.max_size() << " " << 5000*5000*20 << endl;

    for (int i(0); i<m_cases.size(); i++)
    {
        for (int j(0); j<m_cases[i].size(); j++)
        {
            m_cases[i][j].initialize(20);
        }
    }
}

I get a max_size of 357M, which is less than the 500M elements I was expecting to create.

EDIT: Sorry guys, I said "error" but it's actually an error reported by the debugger:

#1 0x405b36 operator new(unsigned int) () (??:??)

#2 0x490f58 typeinfo for std::time_put<wchar_t, std::ostreambuf_iterator<wchar_t, std::char_traits<wchar_t> > > () (??:??)

#3 0x4761ac std::allocator_traits<std::allocator<Couches> >::allocate(__a=..., __n=0) (D:/CodeBlocks/MinGW/lib/gcc/mingw32/4.9.2/include/c++/bits/alloc_traits.h:357)

terminate called after throwing an instance of 'std::bad_alloc'
  what():  std::bad_alloc

I use an initialize function because this is an MCVE; in my original code I need this function.

5 Comments

  • max_size() is implemented by the library implementor; it depends on the implementation whether or not you can change this.
  • 1) "but I obtain an allocation error" Which is..? Please copy-paste that error message. 2) Why are you using an initialize method for your class while leaving the constructor empty? Initializing a class is the constructor's job.
  • Still, what you have written is not a full error message. Copy-paste the full error message, verbatim.
  • "This value typically reflects the theoretical limit on the size of the container. At runtime, the size of the container may be limited to a value smaller than max_size() by the amount of RAM available." en.cppreference.com/w/cpp/container/vector/max_size
  • Posting the root node of a stack trace is next to useless :/ If you're not certain what readers need to see, then don't post what you think they need to see; post the whole thing.

3 Answers


vector::max_size() relates to how big a single vector can get, in the absence of any other memory usage. None of your vectors approaches that size. The biggest is 5000 elements, with an individual allocation of 5000 * max(sizeof(vector<Case>), sizeof(Case)), which for me is 80,000 bytes.

The error you are seeing is that the total allocation of all the 25,000,000 Cases and 500,000,000 Couches exceeds the address space available to your program.
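
For a rough sense of scale, here is a quick tally of what the finished structure needs (just a sketch; the 12-byte vector header and 4-byte int are assumptions for a typical 32-bit build, so check sizeof on your own system):

#include <cstddef>
#include <iostream>

int main()
{
    // Assumed sizes for a typical 32-bit build; verify with sizeof on your system.
    const std::size_t vecHeader = 12; // assumed sizeof(std::vector<T>), i.e. one empty Case
    const std::size_t intSize   = 4;  // assumed sizeof(int), i.e. one Couches

    const std::size_t rows = 5000, cols = 5000, couchesPerCase = 20;

    const std::size_t outerHeaders = rows * vecHeader;                        // 5000 inner vector<Case> headers
    const std::size_t caseHeaders  = rows * cols * vecHeader;                 // 25,000,000 Case objects
    const std::size_t couchesData  = rows * cols * couchesPerCase * intSize;  // 500,000,000 Couches

    std::cout << (outerHeaders + caseHeaders + couchesData) / (1024.0 * 1024.0 * 1024.0)
              << " GiB before any per-allocation overhead\n"; // roughly 2.1 GiB
}

That alone is more than the ~2 GB of address space a 32-bit process typically gets, and it ignores the bookkeeping overhead of the 25,000,000 separate vector<Couches> allocations.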


3 Comments

OK, that's clear. I compiled in 64-bit and of course that worked! Then can I prevent my memory from overflowing in the first place?
@PierreChéneau: allocate only the memory you actually use. Do you need 5000*5000 Cases? Do you need 20 Couches in each? Do you need an int for m_value? Maybe you can use a smaller data type for it, like short/char.
It was an example to find the limit of my program, so yes, I will have fewer Cases and Couches. I didn't think about using short; I will use it. It's more about finding a way to detect the computer slowing down and to stop the program.
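
One way to stop cleanly when an allocation actually fails, related to the last comment (a minimal sketch, not part of the original answer, with int standing in for Case; it reacts to a failed allocation, not to swapping or general slowdown): wrap the large resize in a try/catch for std::bad_alloc and fall back to a smaller size.

#include <iostream>
#include <new>     // std::bad_alloc
#include <vector>

int main()
{
    std::vector<std::vector<int>> cases;
    try
    {
        // The big request that may exhaust the address space.
        cases.resize(5000, std::vector<int>(5000));
    }
    catch (const std::bad_alloc&)
    {
        std::cerr << "Not enough memory for a 5000x5000 grid, stopping here\n";
        cases.clear();  // release whatever was partially built
        // ...retry with a smaller grid or exit cleanly...
    }
}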

max_size() is the maximum number of items that can be placed in a single vector. This number is limited by the bitness of your system: if you are on a 32-bit system, that is 2^32 char values. You are reaching the system or library implementation limits; that's why you get 357M.

You should use std::vector<T>::size_type for your array index.
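
For example, the loops from the question could look like this (a minimal sketch, with int standing in for Case):

#include <vector>

void initialize_all(std::vector<std::vector<int>>& cases)
{
    // size_type is an unsigned type guaranteed to be able to index any element of the vector.
    for (std::vector<std::vector<int>>::size_type i = 0; i < cases.size(); ++i)
        for (std::vector<int>::size_type j = 0; j < cases[i].size(); ++j)
            cases[i][j] = 20;
}

In practice size_type is std::size_t for the default allocator, so std::size_t works as well.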

4 Comments

What will using a std::vector<T>::size_type for my array index change? Could you give me an example?
There is a nice explanation here: paercebal's answer.
@PierreChéneau The size_type of a container is what you should use for referring to its size or an index within it, period. An int almost certainly can't represent the max size or index the container might have: the int is going to have a lower max than a std::size_t because (A) it may have a lower width in bits and (B) even if it has the same width in bits, it is signed, so the max is halved accordingly.
So in my for loop, if I write for (auto i(0); i < m_cases.size(); i++), will the compiler understand it's a size_type, or because of the (0) will it be considered an unsigned int?

I suppose that you compiled your code in 32-bit.

Here's a guess as to why you get 341M as max_size. A typical implementation will give you SIZE_MAX / sizeof(element). SIZE_MAX is 4GB-1, and you check the max_size() value for a vector which itself contains another vector. A typical sizeof(vector) is 12, so the resulting value is 4GB/12 = 341M. Note: the implementation can provide any max_size value it wants.
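
You can check that guess directly (a small sketch; whether max_size() really follows this exact formula depends on your library version, and newer implementations may cap it lower):

#include <cstdint>   // SIZE_MAX
#include <iostream>
#include <vector>

int main()
{
    std::vector<std::vector<int>> v;  // element type is itself a vector, as in the question

    std::cout << "sizeof(element)            = " << sizeof(std::vector<int>) << '\n'  // 12 on a typical 32-bit build
              << "SIZE_MAX / sizeof(element) = " << SIZE_MAX / sizeof(std::vector<int>) << '\n'
              << "v.max_size()               = " << v.max_size() << '\n';
}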

Compile your code in 64-bit, and if you have the necessary memory, your code will run.

Note: in your code, you don't have to worry about max_size(), as your vectors don't contain that many elements. I mean, there is no single vector instance which has that many elements. The problem is that your total memory consumption is large and doesn't fit into ~2GB, which is a typical maximum allowed process size in 32-bit. So I think your program simply runs out of memory; there is no problem with the vector size.

Note2: I've used M=2^20, G=2^30

7 Comments

Why do you think the growth rate affects the maximal size that can be stored? It doesn't. It only affects how the available capacity grows when reallocation occurs. I don't know why this was marked as accepted. What am I missing?
I would say not to post something as an answer if it relies upon an idea that is peppered with "I think" or "this [...] is just a guess".
@underscore_d This is related directly to the implementation of the vector: when a vector's capacity is increased, elements need to be copied over, and as a result extra space is used.
@Swift: yes, yes. :) That's why I said "typical". Order of magnitude. So the OP can understand what's going on. We could talk about the overhead of a single allocation, memory/address space fragmentation, etc. That's not the point here, I think.
@underscore_d: actually, that was the exact reason I mentioned growth rates. But I thought about it a little, and that cannot be the answer for 1.5, so I started to check the implementation. It's because, if the current size is X, then during growth X + rate*X memory is needed. If the max is 2GB, then dividing by 1.5 is meaningless, as we would need division by (1+rate), which is at least 2. If the max is 4GB, it would mean division by ~2.93 (1GB/0.341GB), for which the rate would be 1.93, which is a strange number.
