Linked Questions

384 votes · 11 answers · 599k views

I need to programmatically insert tens of millions of records into a Postgres database. Presently, I'm executing thousands of insert statements in a single query. Is there a better way to do this, ...
— Ash (25.9k)
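For workloads like the tens-of-millions question above, the usual recommendation is PostgreSQL's COPY rather than many individual INSERT statements. A minimal sketch, assuming psycopg2 and a hypothetical users(id, name) table — only the stdlib buffer-building part runs here; the actual copy call needs a live connection and is shown as a comment:

```python
import csv
import io

def rows_to_copy_buffer(rows):
    """Serialize rows into an in-memory CSV buffer suitable for COPY FROM STDIN."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerows(rows)
    buf.seek(0)  # rewind so the consumer reads from the start
    return buf

buf = rows_to_copy_buffer([(1, "alice"), (2, "bob")])
print(buf.getvalue())

# With a live connection (assumption: psycopg2 installed, `conn` open,
# and a `users(id, name)` table existing):
# with conn.cursor() as cur:
#     cur.copy_expert("COPY users (id, name) FROM STDIN WITH (FORMAT csv)", buf)
# conn.commit()
```

For truly huge datasets, the same buffer can be built and flushed in chunks instead of holding everything in memory at once.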
259 votes · 2 answers · 86k views

I am switching to PostgreSQL from SQLite for a typical Rails application. The problem is that running specs became slow with PG. On SQLite it took ~34 seconds, on PG it's ~76 seconds which is more ...
— Dmytrii Nagirniak
41 votes · 2 answers · 31k views

I have tens of millions of rows to transfer from multidimensional array files into a PostgreSQL database. My tools are Python and psycopg2. The most efficient way to bulk insert data is using ...
— Mike T (44.3k)
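A common pattern for the psycopg2 case above is to stream the array data in fixed-size batches, e.g. for psycopg2.extras.execute_values. A sketch of the batching half in plain Python — the table name and connection in the comment are assumptions:

```python
from itertools import islice

def batched(iterable, size):
    """Yield lists of at most `size` items; keeps memory flat for huge datasets."""
    it = iter(iterable)
    while True:
        chunk = list(islice(it, size))
        if not chunk:
            return
        yield chunk

rows = ((i, i * i) for i in range(10))  # stand-in for the array data
batches = list(batched(rows, 4))
print([len(b) for b in batches])  # three batches: 4 + 4 + 2

# With psycopg2 (assumption: `conn` is open, `row_source` is the data
# iterable, and a `samples(a, b)` table exists):
# from psycopg2.extras import execute_values
# with conn.cursor() as cur:
#     for chunk in batched(row_source, 10_000):
#         execute_values(cur, "INSERT INTO samples (a, b) VALUES %s", chunk)
# conn.commit()
```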
23 votes · 3 answers · 24k views

In SQL Server we do something like this for a bulk insert into a datatable: SqlBulkCopy copy = new SqlBulkCopy(sqlCon); copy.DestinationTableName = strDestinationTable; copy.WriteToServer(dtFrom); ...
— Karan Singh
5 votes · 3 answers · 26k views

Is this slow? I have a table with 4 columns: ID, Surname, Coach, City. I have an add-data button that adds 300 rows, each containing data in each column, so a total of 1,200 values. It takes just ...
— Andrew - Avro Computers
18 votes · 1 answer · 10k views

What would be the most efficient way to insert millions of records, say 50 million, from a Spark dataframe into Postgres tables? I have done this from Spark to MSSQL in the past by making use of bulk ...
— Chetan_Vasudevan
3 votes · 2 answers · 11k views

I have a program that works with Postgres using psycopg2, but insertion into the DB takes too long. Here are the results of profiling with cProfile: ncalls tottime percall cumtime percall filename:...
— Viach Kakovskyi
5 votes · 1 answer · 4k views

I suspect this question may be better suited for the Database Administrators site, so LMK if it is and I'll move it. :) I'm something of a database/Postgres beginner here so help me out. I have a ...
— dmn (995)
2 votes · 1 answer · 4k views

I now have working batch inserts with Hibernate (hibernate.jdbc.batch_size = 50), but as far as I know Hibernate generates single inserts within batches. I know that I can tell my DB driver to create ...
— Jack (455)
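For the Hibernate question above, the PostgreSQL JDBC driver can rewrite a batch of single-row inserts into multi-value inserts on the wire via its reWriteBatchedInserts connection property, which combines with hibernate.jdbc.batch_size. A hedged sketch of the relevant configuration — the host, port, and database name are placeholders:

```properties
# reWriteBatchedInserts tells PgJDBC to collapse batched single-row INSERTs
# into multi-row INSERT ... VALUES (...), (...) statements on the wire
hibernate.connection.url=jdbc:postgresql://localhost:5432/mydb?reWriteBatchedInserts=true
hibernate.jdbc.batch_size=50
# Ordering inserts helps Hibernate fill batches per entity type
hibernate.order_inserts=true
```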
3 votes · 1 answer · 4k views

I'm using libpq (but I am potentially ready to switch to another library). I have a bunch of similar INSERT queries I want to make; they differ only in their values. I'd like to use ...
— RiaD (47.8k)
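For the libpq case above, one approach is to generate a single multi-row INSERT whose $1..$N placeholders line up with a flattened parameter array passed to PQexecParams. The string-building step, sketched in Python with an illustrative table name:

```python
def multirow_insert_sql(table, columns, nrows):
    """Build INSERT ... VALUES ($1, $2), ($3, $4), ... with the numbered
    placeholders that libpq's PQexecParams expects."""
    ncols = len(columns)
    groups = []
    for r in range(nrows):
        start = r * ncols + 1
        placeholders = ", ".join(f"${start + c}" for c in range(ncols))
        groups.append(f"({placeholders})")
    return (f"INSERT INTO {table} ({', '.join(columns)}) "
            f"VALUES {', '.join(groups)}")

sql = multirow_insert_sql("items", ["name", "qty"], 3)
print(sql)
# The matching PQexecParams call then supplies 6 parameter values
# (3 rows x 2 columns) in row-major order.
```

The number of placeholders per statement is bounded (parameters are a 16-bit count in the protocol), so very large loads still need to be chunked.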
4 votes · 2 answers · 4k views

I'm running 64-bit PostgreSQL 9.1 on Windows Server. I'm trying to improve its performance, especially for handling heavy writing. I used to increase shared_buffers to 25% of RAM, and since I got 32GB ...
— Shad (203)
1 vote · 1 answer · 2k views

I have a data frame that I want to write to a Postgres database. This functionality needs to be part of a Flask app. For now, I'm running this insertion part as a separate script by creating an ...
— Underoos (5,266)
0 votes · 2 answers · 5k views

(ANSWERED) My answer to this is down below; hope it helps. I am quite new to SQLAlchemy and Python as a whole, and I am looking for some advice. I am looking at moving data from one Postgres DB to ...
— IronSoul
0 votes · 1 answer · 2k views

The project requires storing binary data in a PostgreSQL database. For that purpose we made a table with the following columns: id : integer, primary key, generated by client data ...
— omittones (867)
0 votes · 2 answers · 1k views

I have a site that uses PostgreSQL. All the content that I provide on my site is created in a development environment (this is because it's webcrawler content). The only information created at the ...
— Andre (439)