Memory leak when saving and indexing (clouseau) a big document (10MiB) #3689
Unanswered
rmartinez-dasnano asked this question in Q&A
Replies: 2 comments
-
@rmartinez-dasnano We aren't responsible for the ibmcom image and don't directly support Dreyfus indexing, sorry - that code has not been adopted by Apache CouchDB. As such I'm moving this to our Discussions area. You'll need to work with IBM on this one. You may have luck on our Slack channel at https://couchdb.apache.org/#chat , where some of its maintainers are.
-
OK, I'll try the Dreyfus folks; since the indexing part is mentioned in the docs, I assumed this was a good starting point.
Thanks!
Raúl
Description
Using the IBM Docker image (CouchDB + Lucene indexing) based on CouchDB 3.1.1 (ibmcom/couchdb3:3.1.1), we are storing a large document in CouchDB.
One of the fields in the document contains an array of arrays created from a 10MiB CSV. When this document is created, container memory usage starts growing until it hits the container memory limit (8GiB), the host limit (running on a 16GiB machine), or the process limit.
The JavaScript fragment in the design document that indexes this field joins position 1 of each array into a variable and then indexes it with a single call to the index function (the built string is ~11MiB). The error is the same if we call the index function once per row of the array instead; a rough sketch of the index function is shown below.
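For context, this is a minimal sketch of the kind of search index function being described; the field name (csv_rows) and the index name are assumptions for illustration, not the exact code from our design document:

```javascript
// Search index function in the design document.
// Assumes the large field is doc.csv_rows, an array of arrays parsed from the 10MiB CSV.
function (doc) {
  if (doc.csv_rows && Array.isArray(doc.csv_rows)) {
    var parts = [];
    for (var i = 0; i < doc.csv_rows.length; i++) {
      // Collect position 1 of every row into one big string (~11MiB in our case).
      parts.push(doc.csv_rows[i][1]);
    }
    // Single call to index() with the joined string; calling index() once
    // per row instead shows the same memory growth.
    index("csv_column", parts.join(" "));
  }
}
```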
To avoid the os_process_error issue, we tried to increase the maximum memory of the couchjs processes with COUCHDB_QUERY_SERVER_JAVASCRIPT, but it seems to have no effect.
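For reference, overriding the JavaScript query server through the environment typically looks like the snippet below; the couchjs path and the -S value (the per-process memory limit in bytes) are illustrative assumptions, not our exact settings:

```sh
# Environment override for the container (illustrative values):
# -S sets the couchjs per-process memory limit in bytes (here ~512MiB).
COUCHDB_QUERY_SERVER_JAVASCRIPT="/opt/couchdb/bin/couchjs -S 536870912 /opt/couchdb/share/server/main.js"
```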
Steps to Reproduce
Expected Behaviour
Your Environment
CouchDB version used: 3.1.1
Docker version: Docker version 20.10.7, build f0df350
Container limits:
CouchDB configuration:
Operating system and version: Ubuntu 20.04
Additional Context