I have a function that constructs a std::string from a const char* with two numbers, passed as parameters, appended to the end of it.
std::string makeName(const char* name, uint16_t num1, uint16_t num2) {
    std::string new_name(name);
    new_name.reserve(new_name.length() + 5);
    new_name += ":";
    new_name += boost::lexical_cast<std::string>(num1);
    new_name += ":";
    new_name += boost::lexical_cast<std::string>(num2);
    return new_name;
}
This function gets called thousands of times to create unique names for small objects allocated on the heap.
Object* object1 = new Object(makeName("Object", i, j)); // i and j are simply loop indices
I have discovered, using valgrind's massif tool, that the calls to makeName allocate a lot of memory, since the function is called so many times.
87.96% (1,628,746,377B) (heap allocation functions) malloc/new/new[], --alloc-fns, etc.
->29.61% (548,226,178B) 0xAE383B7: std::string::_Rep::_S_create(unsigned long, unsigned long, std::allocator<char> const&) (in /usr/lib/x86_64-linux-gnu/libstdc++.so.6.0.19)
| ->26.39% (488,635,166B) 0xAE38F79: std::string::_Rep::_M_clone(std::allocator<char> const&, unsigned long) (in /usr/lib/x86_64-linux-gnu/libstdc++.so.6.0.19)
| | ->26.39% (488,633,246B) 0xAE39012: std::string::reserve(unsigned long) (in /usr/lib/x86_64-linux-gnu/libstdc++.so.6.0.19)
| | | ->15.51% (287,292,096B) 0x119A80FD: makeName(char const*, unsigned short, unsigned short) (Object.cpp:110)
| | | | ->15.51% (287,292,096B) in 42 places, all below massif's threshold (01.00%)
My question is: how can I minimize these allocations to reduce the overall amount of memory my program uses?
EDIT: I also want to note that, as a program requirement, I cannot use C++11 features.
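One direction that stays within C++03: each call above pays for the result string plus two temporary std::strings from boost::lexical_cast, and the reserve of length+5 is too small once both numbers are appended (":65535:65535" alone is 13 characters), forcing a reallocation. A sketch that formats the numbers into a stack buffer first, so the only heap allocation left is the returned string itself (makeNameFast is a hypothetical name, not from the question):

```cpp
#include <cstdio>
#include <cstring>
#include <string>
#include <stdint.h>

// Sketch: format both numbers on the stack with snprintf, then build the
// result with a single, exactly sized allocation (C++03-compatible).
std::string makeNameFast(const char* name, uint16_t num1, uint16_t num2) {
    // Worst case: ":" + 5 digits + ":" + 5 digits + NUL = 14 bytes
    char suffix[16];
    snprintf(suffix, sizeof(suffix), ":%u:%u",
             static_cast<unsigned>(num1), static_cast<unsigned>(num2));

    std::string result;
    result.reserve(strlen(name) + strlen(suffix)); // one allocation, no regrowth
    result += name;
    result += suffix;
    return result;
}
```

This removes the lexical_cast temporaries entirely; whether it helps overall still depends on how long the resulting strings live, since the returned string is itself a heap allocation.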
Comments:

- What about std::stringstream ss; ss << name << " : " << num1 << " : " << num2; return ss.str(); instead of building the std::string by hand?
- If your number of objects is within numeric limits, how about assigning a unique int to it instead of a string name?
- stringstream isn't the answer. "Object:22:979" will take up 14 bytes of memory, plus 12-24 bytes for the pointers to track it, and another 4-16 bytes of allocation overhead by the heap. If that is large compared to your object, and you have many objects... that is overhead. And if the objects are otherwise small, the high percentage might be because that is what you asked for?
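The unique-int idea from the comments can be sketched as follows (makeKey and keyToName are hypothetical names): since both loop indices fit in a uint16_t, they pack losslessly into one uint32_t, so each object can store a 4-byte key instead of a heap-allocated name, and the textual name can be rebuilt only when actually needed (e.g. for logging):

```cpp
#include <cstdio>
#include <string>
#include <stdint.h>

// Pack the two 16-bit indices into one 32-bit key: no heap allocation at all.
inline uint32_t makeKey(uint16_t num1, uint16_t num2) {
    return (static_cast<uint32_t>(num1) << 16) | num2;
}

// Rebuild the textual name on demand; only this path allocates.
// Assumes the prefix is short enough for the buffer (snprintf truncates otherwise).
std::string keyToName(const char* prefix, uint32_t key) {
    char buf[32];
    snprintf(buf, sizeof(buf), "%s:%u:%u", prefix,
             static_cast<unsigned>(key >> 16),
             static_cast<unsigned>(key & 0xFFFF));
    return std::string(buf);
}
```

With this layout, the thousands of per-object allocations disappear; the cost moves to the (presumably rare) moments when a human-readable name is required.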