DataStax Help Center

sstableloader throws out of memory error

Summary

Users can sometimes encounter an out-of-memory error when running sstableloader.

Symptoms

The error message typically appears on stdout as follows:

Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
at org.apache.cassandra.io.compress.CompressionMetadata.getChunksForSections(CompressionMetadata.java:212)
at org.apache.cassandra.streaming.messages.OutgoingFileMessage.<init>(OutgoingFileMessage.java:76)
at org.apache.cassandra.streaming.StreamTransferTask.addTransferFile(StreamTransferTask.java:51)
at org.apache.cassandra.streaming.StreamSession.addTransferFiles(StreamSession.java:297)
at org.apache.cassandra.streaming.StreamPlan.transferFiles(StreamPlan.java:121)
at org.apache.cassandra.io.sstable.SSTableLoader.stream(SSTableLoader.java:176)
at org.apache.cassandra.tools.BulkLoader.main(BulkLoader.java:85)

Cause

The sstableloader launcher script sets a fixed 256 MB heap (-Xmx256M), which can be too small when streaming a large set of SSTables. This behaviour is tracked in the following bug: https://issues.apache.org/jira/browse/CASSANDRA-7385

Note that the bug was closed as not a problem.

Workaround

Running the following command edits the launcher script in place, allocating more memory to the process:

sed -i -e 's/-Xmx256M/-Xmx8G/g' /usr/bin/sstableloader
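Before editing the real launcher, it can help to rehearse and verify the substitution. The sketch below runs the same sed edit against a temporary copy containing a simulated launcher line (the java invocation shown is illustrative, not the script's exact contents), then prints the resulting heap flag to confirm the change took effect:

```shell
# Work on a temporary copy so the real /usr/bin/sstableloader is untouched.
TMP=$(mktemp)

# Simulated launcher line carrying the hardcoded 256 MB heap (illustrative only).
printf 'exec "$JAVA" -Xmx256M org.apache.cassandra.tools.BulkLoader "$@"\n' > "$TMP"

# The actual workaround: raise the hardcoded 256 MB heap to 8 GB.
sed -i -e 's/-Xmx256M/-Xmx8G/g' "$TMP"

# Confirm the substitution: the printed line should now contain -Xmx8G.
grep -- '-Xmx' "$TMP"
```

Adjust the 8G value to suit the amount of RAM available on the host running the load; the key point is simply that the default 256 MB is too small for large transfers.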

Solution

The original bug was closed as not requiring a fix, so users must run sstableloader with a larger heap setting whenever the default is insufficient.
