I have a big database consisting of 170k items, each item having 20-150 features. When I run the method that loads this data into a map from item_id to a list of features, I get Exception in thread "main" java.lang.OutOfMemoryError: GC overhead limit exceeded. The method runs for about 30 minutes and then throws the exception. Here is the method:
public Map<Integer, List<ItemFeatureMapping>> getItemFeatures() {
    List<Item> allItems = getAllItems();
    Map<Integer, List<ItemFeatureMapping>> result = new HashMap<>();
    for (Item i : allItems) {
        List<ItemFeatureMapping> itemFeatures = new ArrayList<>();
        for (ItemFeatureMapping feature : i.getItemFeatures()) {
            itemFeatures.add(feature);
        }
        result.put(i.getId(), itemFeatures);
    }
    return result;
}
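For context, here is a hedged sketch of a less allocation-heavy variant of the method: it pre-sizes the HashMap for the known item count and reuses the list returned by getItemFeatures() instead of copying every feature into a fresh ArrayList. The Item and ItemFeatureMapping classes below are minimal stand-ins (their real fields are not shown in the question), and the sketch assumes getItemFeatures() returns a List; if the entities come from an ORM with lazy collections, sharing the list this way may not be appropriate.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class ItemFeatureSketch {
    // Minimal stand-in for the real entity (fields assumed, not from the question).
    static class ItemFeatureMapping { }

    static class Item {
        private final int id;
        private final List<ItemFeatureMapping> features;
        Item(int id, List<ItemFeatureMapping> features) {
            this.id = id;
            this.features = features;
        }
        int getId() { return id; }
        List<ItemFeatureMapping> getItemFeatures() { return features; }
    }

    static Map<Integer, List<ItemFeatureMapping>> getItemFeatures(List<Item> allItems) {
        // Pre-size for ~170k entries so the map never rehashes
        // (capacity = size / default-load-factor, rounded up).
        Map<Integer, List<ItemFeatureMapping>> result =
                new HashMap<>(allItems.size() * 4 / 3 + 1);
        for (Item i : allItems) {
            // Reuse the entity's own list instead of copying it element by element.
            result.put(i.getId(), i.getItemFeatures());
        }
        return result;
    }

    public static void main(String[] args) {
        List<Item> items = new ArrayList<>();
        items.add(new Item(1, List.of(new ItemFeatureMapping(), new ItemFeatureMapping())));
        items.add(new Item(2, List.of(new ItemFeatureMapping())));

        Map<Integer, List<ItemFeatureMapping>> map = getItemFeatures(items);
        System.out.println(map.size());        // prints 2
        System.out.println(map.get(1).size()); // prints 2
    }
}
```

This only halves the temporary garbage; whether the full result fits in the heap at all is a separate question.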
I looked in the manual:
The parallel collector will throw an OutOfMemoryError if too much time is being spent in garbage collection: if more than 98% of the total time is spent in garbage collection and less than 2% of the heap is recovered, an OutOfMemoryError will be thrown. This feature is designed to prevent applications from running for an extended period of time while making little or no progress because the heap is too small. If necessary, this feature can be disabled by adding the option -XX:-UseGCOverheadLimit to the command line.
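Note that -XX:-UseGCOverheadLimit is a JVM option passed to the java launcher when starting the program, not a standalone shell command (which would explain a "Command not found" error). A sketch of the invocation, where the JAR name app.jar and the 4 GB heap size are assumptions:

```shell
# Disable the GC overhead limit and (optionally) raise the maximum heap:
java -Xmx4g -XX:-UseGCOverheadLimit -jar app.jar
```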
How can I sort this out? Or is it possible to optimize the method?
P.S.: I tried to do this via the command line, but I get a "Command not found" error. In any case, I don't like this solution, since the method may then simply keep running for a very long time.