
I have a large CSV file and I want to insert it all at once, instead of row by row. This is my code:

import csv
import time

import pypyodbc

# Connect with Windows authentication.
con = pypyodbc.connect('driver={SQL Server};'
                       'server=server_name;'
                       'database=DB-name;'
                       'trusted_connection=true')
cur = con.cursor()

csfile = open('out2.csv', 'r')
csv_data = csv.reader(csfile)

# One INSERT per row -- this is the slow part I want to replace.
for row in csv_data:
    try:
        cur.execute("INSERT INTO Table_name (Attribute, error, msg, Value, "
                    "Success, TotalCount, SerialNo) "
                    "VALUES (?, ?, ?, ?, ?, ?, ?)", row)
    except Exception:
        time.sleep(60)  # back off for a minute if the insert fails

con.commit()
cur.close()
con.close()

2 Answers


BULK INSERT should do it for you.

BULK INSERT CSVTest
FROM 'c:\csvtest.txt'
WITH
(
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n'
)
GO

-- Check the content of the table.
SELECT *
FROM CSVTest
GO

http://blog.sqlauthority.com/2008/02/06/sql-server-import-csv-file-into-sql-server-using-bulk-insert-load-comma-delimited-file-into-sql-server/

Also, check out this link.

https://www.simple-talk.com/sql/learn-sql-server/bulk-inserts-via-tsql-in-sql-server/
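For example, from Python you could run that same statement through pypyodbc instead of looping over the rows. A minimal sketch, assuming a table CSVTest and a file path readable by the SQL Server service account (both are placeholders, as in the T-SQL above):

import pypyodbc

con = pypyodbc.connect('driver={SQL Server};'
                       'server=server_name;'
                       'database=DB-name;'
                       'trusted_connection=true')
cur = con.cursor()

# SQL Server itself reads the file, so the path must be visible
# to the server process, not just to this Python script.
cur.execute("BULK INSERT CSVTest "
            "FROM 'c:\\csvtest.txt' "
            "WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\\n')")

con.commit()
cur.close()
con.close()

If a field can itself contain a comma, picking a different FIELDTERMINATOR (for example '|') avoids the splitting problem raised in the comments below.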


3 Comments

What if there is a comma in one of the fields?
Ouch! Maybe you can use a different field terminator. See this link: howtogeek.com/howto/21456/…
So for Python, you suggest simply replacing the SQL statement in the OP's code with your query (minus the SELECT statement)?

It really depends on your system resources. You can store the whole CSV file in memory and then insert it into the database, but if the file is larger than your RAM you will run into trouble. You can save each row of the CSV file as an element of a Python list. Here is my code:

import csv

csvRows = []
csvFileObj = open('yourfile.csv', 'r')
readerObj = csv.reader(csvFileObj)
for row in readerObj:
    # row is a list of column strings; pick out fields as needed,
    # e.g. element1 = row[0], then collect them as a tuple.
    csvRows.append(tuple(row))
csvFileObj.close()

After that, read the elements of the list and insert them into your database, for example with executemany as sketched below. I don't think there is a direct way to insert all CSV rows into the database at once; you need some preprocessing.
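The batched insert itself can go through cursor.executemany, which pypyodbc provides as part of the DB-API. A minimal sketch, assuming the csvRows list built above and the Table_name columns from the question (note that executemany still issues one INSERT per row on the server; it mainly saves Python-side overhead):

import pypyodbc

con = pypyodbc.connect('driver={SQL Server};'
                       'server=server_name;'
                       'database=DB-name;'
                       'trusted_connection=true')
cur = con.cursor()

# Send every accumulated row with a single parameterized statement.
cur.executemany("INSERT INTO Table_name (Attribute, error, msg, Value, "
                "Success, TotalCount, SerialNo) "
                "VALUES (?, ?, ?, ?, ?, ?, ?)", csvRows)

con.commit()
cur.close()
con.close()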

1 Comment

Thank you for the suggestion, but I think this would take about the same amount of time as the usual row-by-row INSERT statements, so I used BULK INSERT instead.
