pg_dump while node is running causes locking errors #2417
DerEwige started this conversation in Node operators
I want to migrate my DB from SQLite to PostgreSQL.
I have it running in dual mode with SQLite as primary.
I noticed that I cannot take a dump using pg_dump while eclair is running (it causes locking errors).
Is my only option to run a high-availability PostgreSQL cluster with streaming replication?
Replies: 3 comments
-
Yes, the recommended way to do backups with postgres is to use streaming replication. I believe you don't need a full postgres cluster for that; it's something you should be able to set up easily with just two instances of postgres that you manage yourself, but I haven't run it myself.
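For reference, here's a minimal sketch of such a two-instance setup, assuming PostgreSQL 12+; the hostnames, addresses, paths, and credentials are placeholders, not from this thread:

```bash
# On the primary, allow a standby to connect (postgresql.conf):
#   wal_level = replica
#   max_wal_senders = 3
# and authorize it in pg_hba.conf (192.0.2.10 = the standby's address):
#   host replication replicator 192.0.2.10/32 scram-sha-256

# Create a dedicated replication role on the primary:
psql -c "CREATE ROLE replicator WITH REPLICATION LOGIN PASSWORD 'changeme'"

# On the standby, clone the primary; -R writes the connection settings
# and creates standby.signal so the instance starts as a streaming replica:
pg_basebackup \
  --host=primary.example.com \
  --username=replicator \
  --pgdata=/var/lib/postgresql/standby-data \
  --wal-method=stream \
  -R
```

Once the standby is streaming, you can point pg_dump at it instead of the primary, so backups no longer contend with eclair's connections.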
-
You can try to back up only the critical parts; it should be faster and cause fewer issues:

```bash
pg_dump --format=c \
  --file=$filename \
  --exclude-table=local.htlc_infos \
  --exclude-schema=network \
  --exclude-schema=audit \
  --host=$dbhost \
  --dbname=eclair \
  --username=readonly \
  --no-password # will find password in .pgpass or fail
```

But yes, the proper way is to set up replication.
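A side note on the --no-password flag above: pg_dump then reads credentials from ~/.pgpass, which must not be world-readable. A hypothetical entry and a matching restore command (the host, password, and dump file name are placeholders):

```bash
# ~/.pgpass format: hostname:port:database:username:password
echo 'dbhost.example.com:5432:eclair:readonly:s3cret' >> ~/.pgpass
chmod 0600 ~/.pgpass   # libpq ignores the file if permissions are looser

# Restore a custom-format (--format=c) dump into an existing database:
pg_restore --host=dbhost.example.com --username=eclair \
  --dbname=eclair --clean --if-exists "$filename"
```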