
Python script error: Too many open files in system #13361

Closed · Locked · Answered by Joibel
Nataliia5722 asked this question in Q&A

This is not argo-workflows' fault.

You are running Python inside your workflow pods, which are just normal Kubernetes pods.
Some thoughts:

  • you are now processing more files than you used to, so you are running out of file handles
  • your nodes are busier than they used to be, so the nodes are running out of file handles
  • your code has been updated and now doesn't close a file handle that it should
  • your Kubernetes cluster has been updated and the nodes are more constrained than they used to be

The argo-workflows community can't help fix this for you.
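Two quick checks that can narrow down the causes above: inspect the descriptor limit your process actually runs under, and make sure every file is opened through a context manager so the handle is released promptly. This is a minimal sketch, not from the discussion; the temp-file path is only for illustration.

```python
import os
import resource
import tempfile

# Inspect this process's open-file limit (RLIMIT_NOFILE).
# If the soft limit is low, the pod/node configuration is the
# likely culprit; if it's generous, suspect leaked handles.
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print(f"soft limit: {soft}, hard limit: {hard}")

# Safe pattern: the context manager closes the handle when the
# block exits, instead of waiting for garbage collection.
path = os.path.join(tempfile.mkdtemp(), "example.txt")
with open(path, "w") as f:
    f.write("hello")
assert f.closed  # the file descriptor has been released
```

If the soft limit is lower than the hard limit, the process may raise it with `resource.setrlimit(resource.RLIMIT_NOFILE, (hard, hard))`; raising it beyond the hard limit requires changing the node or container runtime configuration.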

Replies: 1 comment 1 reply

1 reply
@Nataliia5722

Answer selected by Nataliia5722
Category: Q&A
Labels: type/support (User support issue - likely not a bug), solution/invalid (This is incorrect. Also can be used for spam)
2 participants