Hello. I am trying to use Bambu v3.8.0 to perform isoform discovery and quantification on 8 BAM files ranging from 30 GB to 52 GB in size. The server I am running Bambu on has 256 GB of memory. However, the Bambu process ran out of memory even in its low-memory mode. The log messages I got before it crashed are as follows:
--- Start generating read class files ---
< warning messages not included here >
--- Start extending annotations ---
Using a novel discovery rate (NDR) of: 0.47
--- Start isoform quantification ---
Running journalctl -rx, I see an Out of memory: Killed process... log message.
I am calling bambu() once with all 8 BAMs passed in, and with quant = TRUE, discovery = TRUE, and lowMemory = TRUE. The BAMs contain only primary alignments; there are no secondary or supplementary alignments. What can I do to optimize Bambu's memory usage?
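For reference, my call looks roughly like the following sketch (file paths are placeholders, not my actual inputs). The commented-out rcOutDir line shows one alternative I have been considering based on the package documentation: writing read class files to disk per sample first, then running discovery and quantification from the stored read classes rather than holding all 8 samples in memory at once.

```r
library(bambu)

# Placeholder inputs -- my actual paths and file names differ
bams <- c("sample1.bam", "sample2.bam")   # ...8 BAMs total, 30-52 GB each
annotations <- prepareAnnotations("annotations.gtf")
genome <- "genome.fa"

se <- bambu(
  reads = bams,
  annotations = annotations,
  genome = genome,
  quant = TRUE,
  discovery = TRUE,
  lowMemory = TRUE
  # rcOutDir = "./bambu_rc/"  # persist read classes to disk instead of RAM?
)
```

Would splitting this into one bambu() run per BAM with rcOutDir set, followed by a single combined run on the saved read class files, be the recommended way to bring peak memory down?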