
Memory exhausted - too many files #57

@alistairhockey

Description


Hi there,

I am trying to run `intervene upset` on 73 BED files that have ~40,000 intervals each:

intervene upset -i /data/alistairh/projects/SV_calling/data/peaks/DiscRegions/*{1,2,3}.bed --output SV_calling/data/peaks/DiscRegions/results_RT --save-overlaps

However, Intervene uses up all the available memory (62G) and is then killed by the server. Is there a setting or a fix to limit Intervene's memory use so the process survives? This hasn't been a problem before when I have used Intervene on 15-20 BED files.
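For context, a possible explanation for the jump from 15-20 files to 73 (an assumption about how upset plots scale in general, not confirmed from Intervene's source): an upset plot is built from the non-empty intersections of the input sets, and the number of possible intersections doubles with every added file. A minimal sketch of that growth:

```python
def n_intersections(n_sets: int) -> int:
    """Number of possible non-empty set intersections among n_sets inputs.

    Each input is either in or out of a given combination, minus the
    empty combination: 2**n - 1.
    """
    return 2 ** n_sets - 1

# 20 BED files: about a million combinations to consider.
print(n_intersections(20))  # 1048575
# 73 BED files: roughly 9.4e21 combinations.
print(n_intersections(73))
```

If the memory cost tracks the number of candidate combinations even loosely, 73 inputs would be far beyond what any amount of RAM can hold, which would make this a scaling limit of the upset approach rather than something a memory cap could fix.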
