
Is there any way to insert 50k records into a PostgreSQL database using DBeaver? Locally it worked fine for me and took about a minute, because I also changed the memory settings of PostgreSQL and DBeaver. But in our development environment, 50k queries did not work.

Is there a way to do this anyway, or do I need to split the queries and run, for example, 10k queries five times? Any tricks?

EDIT: by "did not work" I mean that after 2500 seconds I got an error saying something like "too much data ranges".

  • Can't you use the COPY statement? That's way faster than INSERT. – Jul 5, 2022 at 8:02
  • "50k queries did not work": please describe what happened. – Jul 5, 2022 at 8:07
  • Use COPY instead of INSERT statements. If that's not possible, split your inserts into small batches. – Jul 5, 2022 at 8:10

1 Answer


If you intend to execute a giant SQL script via the interface: don't even try.

If you have a CSV file, DBeaver gives you an import tool:

[Screenshot: DBeaver data import tool]

Even better, as described in the comments, the COPY command is the right tool.
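For example, a minimal sketch, assuming a table my_table and a file /path/to/data.csv that already exists on the database server (both names are hypothetical placeholders):

-- server-side bulk load; column list and file path are placeholders
COPY my_table (id, name, value)
FROM '/path/to/data.csv'
WITH (FORMAT csv, HEADER true);

Note that COPY ... FROM reads the file server-side and needs the corresponding privileges; if the file lives on your client machine, psql's \copy meta-command does the same thing from the client.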

If you have a giant SQL file, you need to use the command line, for example:

psql -h host -U username -d myDataBase -a -f myInsertFile

Like in this post: Run a PostgreSQL .sql file using command line arguments
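If COPY is not an option and you have to stick with INSERTs, the batching suggested in the comments could look like this rough sketch (table, columns, and batch size are hypothetical): multi-row VALUES lists wrapped in an explicit transaction, so each batch commits only once:

BEGIN;
INSERT INTO my_table (id, name) VALUES
    (1, 'alpha'),
    (2, 'beta'),
    -- ...a few thousand more rows per statement...
    (10000, 'omega');
COMMIT;
-- repeat for the next batch until all 50k rows are loaded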


2 Comments

  • Hi, thanks, but can you elaborate a bit more on what you mean by command line? Doing it via the Windows command line?
  • Done! You should have the tools installed.
