Linked Questions
53 questions linked to/from "How to speed up insertion performance in PostgreSQL"
384 votes · 11 answers · 599k views
What's the fastest way to do a bulk insert into Postgres? [closed]
I need to programmatically insert tens of millions of records into a Postgres database. Presently, I'm executing thousands of insert statements in a single query.
Is there a better way to do this, ...
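The usual answer to this class of question is `COPY ... FROM STDIN` rather than thousands of `INSERT` statements. A minimal sketch, assuming psycopg2 and a hypothetical table `my_table(a, b)`; only the serializer below is shown runnable, and note it does no escaping, so it is safe only for values without tabs, newlines, or backslashes:

```python
import io

def rows_to_copy_buffer(rows):
    """Serialize rows into the tab-separated text format COPY expects.

    NOTE: no escaping of tabs/newlines/backslashes -- simple data only.
    """
    buf = io.StringIO()
    for row in rows:
        buf.write("\t".join(str(v) for v in row) + "\n")
    buf.seek(0)
    return buf

# With a live connection (psycopg2 assumed; table/column names hypothetical):
# with conn.cursor() as cur:
#     cur.copy_from(rows_to_copy_buffer(rows), "my_table", columns=("a", "b"))
# conn.commit()
```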
259 votes · 2 answers · 86k views
Optimise PostgreSQL for fast testing
I am switching to PostgreSQL from SQLite for a typical Rails application.
The problem is that running specs became slow with PG.
On SQLite it took ~34 seconds, on PG it's ~76 seconds which is more ...
41 votes · 2 answers · 31k views
Use binary COPY table FROM with psycopg2
I have tens of millions of rows to transfer from multidimensional array files into a PostgreSQL database. My tools are Python and psycopg2. The most efficient way to bulk insert data is using ...
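For reference, PostgreSQL's binary COPY wire format is simple enough to pack by hand with `struct`. A sketch for a hypothetical table of `(int4, float8)` rows; the column types and the `copy_expert` call are assumptions, not taken from the question:

```python
import io
import struct

PGCOPY_SIGNATURE = b"PGCOPY\n\xff\r\n\x00"  # 11-byte binary-format signature

def pack_pgcopy(rows):
    """Pack (int, float) rows into PostgreSQL's binary COPY format:
    header, then per tuple a field count plus length-prefixed fields
    in network byte order, then a -1 trailer."""
    out = io.BytesIO()
    out.write(PGCOPY_SIGNATURE)
    out.write(struct.pack("!ii", 0, 0))   # flags, header-extension length
    for i, f in rows:
        out.write(struct.pack("!h", 2))   # 2 fields in this tuple
        out.write(struct.pack("!i", 4))   # int4: 4-byte length prefix
        out.write(struct.pack("!i", i))
        out.write(struct.pack("!i", 8))   # float8: 8-byte length prefix
        out.write(struct.pack("!d", f))
    out.write(struct.pack("!h", -1))      # end-of-data trailer
    out.seek(0)
    return out

# With a live connection (psycopg2 assumed; table name hypothetical):
# cur.copy_expert("COPY my_table (a, b) FROM STDIN WITH (FORMAT binary)",
#                 pack_pgcopy(rows))
```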
23 votes · 3 answers · 24k views
Bulk insert a whole DataTable into a PostgreSQL table
In SQL Server we do something like this for a bulk insert from a DataTable:
SqlBulkCopy copy = new SqlBulkCopy(sqlCon);
copy.DestinationTableName = strDestinationTable;
copy.WriteToServer(dtFrom);
...
5 votes · 3 answers · 26k views
postgresql database insert takes too long
Is this slow?
I have a table with 4 columns
ID Surname Coach City
I have an "add data" button that adds 300 rows, each with data in every column, 1200 values in total.
It takes just ...
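A frequent cause of this symptom is inserting each row in its own transaction (autocommit), so every row pays a commit/fsync. Grouping the rows into a single transaction, and into multi-row statements, usually helps. A sketch of the batching, with psycopg2 assumed and the table/column names hypothetical:

```python
def chunked(rows, size):
    """Split rows into batches so each batch becomes one multi-row INSERT."""
    for i in range(0, len(rows), size):
        yield rows[i:i + size]

# With a live connection (psycopg2 assumed; names hypothetical):
# with conn:                      # one transaction for the whole load
#     with conn.cursor() as cur:
#         for batch in chunked(rows, 100):
#             args = ",".join(
#                 cur.mogrify("(%s,%s,%s,%s)", r).decode() for r in batch)
#             cur.execute(
#                 "INSERT INTO people (id, surname, coach, city) VALUES " + args)
```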
18 votes · 1 answer · 10k views
Writing more than 50 million rows from a PySpark DataFrame to PostgreSQL: most efficient approach
What would be the most efficient way to insert millions of records, say 50 million, from a Spark DataFrame into Postgres tables?
I have done this from Spark to MSSQL in the past by making use of bulk ...
3 votes · 2 answers · 11k views
Why is psycopg2 so slow for me?
I have a program that works with Postgres using psycopg2, but inserting into the DB takes too long.
Here are the results of profiling using cProfile.
ncalls tottime percall cumtime percall filename:...
5 votes · 1 answer · 4k views
Slow Simultaneous Writes to Same Table in PostgreSQL Database
I suspect this question may be better suited for the Database Administrators site, so LMK if it is and I'll move it. :)
I'm something of a database/Postgres beginner here so help me out. I have a ...
2 votes · 1 answer · 4k views
Hibernate multi row insert postgresql
Now I have batch inserts working with Hibernate (hibernate.jdbc.batch_size = 50),
but as far as I know Hibernate generates single-row inserts within those batches.
I know that I can tell my db driver to create ...
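The PostgreSQL JDBC driver can perform exactly that rewrite itself: with `reWriteBatchedInserts` enabled it collapses a batch of single-row inserts into multi-row statements. A hedged sketch of the connection URL, with host and database names hypothetical:

```
jdbc:postgresql://localhost:5432/mydb?reWriteBatchedInserts=true
```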
3 votes · 1 answer · 4k views
Bulk insert to postgresql with prepared statement(s)
I'm using libpq (but I am potentially ready to switch to another library).
I have a bunch of similar INSERT queries I want to run; they differ only in their values.
I'd like to use ...
4 votes · 2 answers · 4k views
PostgreSQL shared_buffers on Windows
I'm running 64-bit PostgreSQL 9.1 on Windows Server. I'm trying to improve its performance, especially for heavy writing. I was advised to increase shared_buffers to 25% of RAM, and since I have 32GB ...
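Worth noting: the PostgreSQL documentation of that era cautioned that large `shared_buffers` values are less effective on Windows than on other platforms, suggesting a comparatively low setting and letting the OS cache do more of the work. An illustrative (not prescriptive) fragment, with the values as assumptions for a 32 GB machine:

```
# postgresql.conf -- illustrative values only
shared_buffers = 512MB        # large values are reportedly less effective on Windows
effective_cache_size = 24GB   # lean on the OS file cache instead
```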
1 vote · 1 answer · 2k views
How to write data frame to Postgres table without using SQLAlchemy engine?
I have a data frame that I want to write to a Postgres database. This functionality needs to be part of a Flask app.
For now, I'm running this insertion part as a separate script by creating an ...
0 votes · 2 answers · 5k views
Best way to move data between two databases using SQLAlchemy
(ANSWERED) My answer is posted below; I hope it helps.
I am quite new to SQLAlchemy and Python as a whole and I am looking for some advice. I am looking at moving data from one Postgres DB to ...
0 votes · 1 answer · 2k views
Possible bottlenecks when inserting and updating BYTEA rows?
The project requires storing binary data in a PostgreSQL database. For that purpose we made a table with the following columns:
id : integer, primary key, generated by client
data ...
0 votes · 2 answers · 1k views
Best way to make PostgreSQL backups
I have a site that uses PostgreSQL. All content that I provide on my site is created in a development environment (because it is web-crawler content). The only information created at the ...