  • I’m aware of pg_dump, but it’s really user-unfriendly and a logistical nightmare compared to just copying the folder. My database runs inside Docker, so I’d need to run pg_dump inside the container, copy the dump out, then tar it and transfer it to my dev machine. Importing is even worse: I’d need to delete the local DB, start a fresh one without data, wait for it to come up, run pg_restore, and then start the app. My current workflow is just copying the prod DB to my dev machine, placing it in a Docker-mounted folder, and starting the app. I don’t like pg_dumpall and have had many issues with it Commented Sep 24, 2022 at 17:08
  • I use borgmatic for backups; it’s just that I want to verify that what I’m doing on dev will work well in prod, and for that all I need is a reasonably accurate snapshot. The only reason I even care that it writes while compressing is that it sometimes causes foreign-key issues I have to fix manually, which is annoying and becomes more common the longer prod runs, since it collects a lot of data every day. Commented Sep 24, 2022 at 17:09
  • Also, I’m not quite sure how compatible pg_dumpall is with Postgres extensions. I’m technically using TimescaleDB, as I have a ton of time-series data. I wouldn’t be surprised if pg_dumpall has no clue what to do with the TimescaleDB data Commented Sep 24, 2022 at 17:11
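For reference, the dump-and-restore round trip described in the first comment can be sketched in a few commands. This is a minimal sketch, not the poster's actual setup: the container name `prod-db`, database name `app`, user `postgres`, and host `dev-machine` are all hypothetical placeholders.

```shell
# Hypothetical sketch: container "prod-db", database "app", user "postgres",
# and host "dev-machine" are placeholder names, not from the post.

# On the prod host: dump in pg_dump's custom format, streamed straight
# out of the container so no copy step inside the container is needed.
docker exec prod-db pg_dump -U postgres -Fc app > app.dump

# Transfer the dump to the dev machine.
scp app.dump dev-machine:~/

# On the dev machine: drop and recreate the database, then restore.
# --create makes pg_restore issue CREATE DATABASE; -d postgres is the
# maintenance database it connects to first.
pg_restore -U postgres -h localhost --clean --create -d postgres app.dump
```

The custom format (`-Fc`) lets pg_restore reorder objects so foreign-key constraints are created after the data, which sidesteps the manual FK fixes mentioned above; note that TimescaleDB documents extra pre/post-restore steps for its hypertables.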