Replies: 1 comment
-
Update: I found the bug. The 32-bit integer range is exceeded when indexing the covariance matrix. The covariance matrix has the following type (dakota_data_types.hpp):
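(Paraphrasing the typedefs from dakota_data_types.hpp from memory; please check your source tree for the exact lines. The `int` ordinal type is the important part:)

```cpp
// Paraphrased from dakota_data_types.hpp -- the int ordinal type is the key detail:
typedef double Real;
typedef Teuchos::SerialSymDenseMatrix<int, Real> RealSymMatrix;
```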
The Teuchos matrix uses a size_t during allocation, so that part works fine. But element access uses ints, which overflow. As a result, Dakota cannot compute a full covariance matrix for more than sqrt(2**32 / 2) ≈ 46340 responses. Setting the covariance matrix to diagonal-only avoids the issue. I would recommend fixing this bug with one of the following options:
Note that most users probably don't want the full covariance matrix with that many responses, but it is turned on by default, causing a segfault on large problems.
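To make the overflow concrete, here is a small standalone sketch (my own illustration, not Dakota or Teuchos code) of what a 32-bit column-major index computation does for 90,000 responses:

```cpp
// Standalone illustration of the 32-bit index overflow (a sketch, not
// Dakota/Teuchos code). Teuchos element access is conceptually
// values_[col * stride + row], computed in the matrix's ordinal type
// (int for RealSymMatrix), so the product overflows once the number of
// responses exceeds ~46340.
#include <cstdint>
#include <iostream>

int main() {
  const int n = 90000;                 // responses, well above the ~46340 limit
  const int row = n - 1, col = n - 1;  // last element of the matrix

  // Intended offset, computed in 64 bits:
  const std::int64_t good = static_cast<std::int64_t>(col) * n + row;

  // The same offset truncated to 32 bits (what an int-based computation
  // effectively produces; truncation used here to avoid signed-overflow UB):
  const std::int32_t bad = static_cast<std::int32_t>(good);

  std::cout << "64-bit index: " << good << '\n'   // 8099999999
            << "32-bit index: " << bad  << '\n';  // negative, out of bounds
  return 0;
}
```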
-
Hello. I recently tried to run a PCE for a problem with a large number of responses (two field variables with 90,000 elements each). Dakota segfaults when processing the results. It works fine when I reduce the size of the field variables to 900 elements. **Are there any compile-time limitations on problem sizes that can be run with Dakota?**
I am not running out of memory and my ulimit is set to unlimited. I am running Dakota 6.20.0 on RHEL8.
Below is a minimal example that reproduces this error for me. The driver script 'driver.sh' just fills the results.dat file with random numbers.
Any thoughts or suggestions would be much appreciated!
Output:
bug_report.tar.gz