
I have a Python script that executes the following command to copy the contents of a CSV file into a database table:

copy_query = "COPY table_name FROM STDIN DELIMITER '%s' CSV HEADER QUOTE '\"';" % (self.delimiter)

table_name represents an already created table with specific column types. The data in the CSV file doesn't always match those types. For example, the CSV file could look like this:

"row1", 123, "2012-11-11"
"row2", 346, "2012-11-12"
"row3", \N,  "2012-11-12"

As you can see, column 2 should be of type int, but since the data in row 3 doesn't match that type, the entire operation fails. Is there a way to reject such a row altogether? I'd prefer to fill in some default value of the appropriate type, but rejecting the row outright is fine too. Thank you for your help!

  • Currently (9.5 and older, at least) PostgreSQL offers no way to let you transform or filter rows in COPY like this. The usual solution is to import to a temporary/unlogged table with all text columns, then do an insert into ... select ... to transform it. See many existing related answers. Commented Oct 31, 2015 at 11:21
  • Possible duplicate of Copy NULL values present in csv file to postgres Commented Nov 2, 2015 at 16:27
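The staging-table workaround mentioned in the first comment can be sketched roughly as follows. This is only an illustration, assuming psycopg2 and a hypothetical all-text `staging_table`; the column names `name`, `amount`, and `created` are placeholders, not taken from the question:

```python
# Hypothetical sketch of the staging-table workaround: COPY everything as
# text, then cast and filter while inserting into the real table.

CREATE_STAGING = """
    CREATE TEMP TABLE staging_table (
        name text,
        amount text,
        created text
    );
"""

# NULL '\N' maps the literal \N marker in the sample file to SQL NULL.
COPY_STAGING = r"COPY staging_table FROM STDIN DELIMITER ',' NULL '\N' CSV;"

# Rows whose second column is not an integer are skipped by the WHERE
# clause; valid rows are cast to the target types.
INSERT_VALID = """
    INSERT INTO table_name (name, amount, created)
    SELECT name, amount::int, created::date
    FROM staging_table
    WHERE amount ~ '^[0-9]+$';
"""

def load_csv(conn, csv_file):
    """Run all three steps in one transaction on a psycopg2 connection."""
    with conn.cursor() as cur:
        cur.execute(CREATE_STAGING)
        cur.copy_expert(COPY_STAGING, csv_file)
        cur.execute(INSERT_VALID)
    conn.commit()
```

Changing the `WHERE` clause to a `CASE` expression in the `SELECT` would instead substitute a default value rather than dropping the row.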

1 Answer


You cannot do this with the COPY command, but you could modify the target table to accept NULL in these numeric fields.
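If the numeric column is made nullable, the literal `\N` markers in the file can be mapped to SQL NULL by adding a `NULL` clause to the COPY command. A minimal sketch, staying close to the question's query string (the `delimiter` variable stands in for `self.delimiter`):

```python
delimiter = ","  # stands in for self.delimiter from the question

# NULL '\N' tells COPY to read the literal marker \N as SQL NULL, so a
# nullable int column accepts row 3 of the sample file. In the legacy
# COPY syntax the NULL clause goes before the CSV keyword.
copy_query = (
    "COPY table_name FROM STDIN "
    "DELIMITER '%s' NULL '\\N' CSV HEADER QUOTE '\"';" % delimiter
)
```

Note that interpolating the delimiter with `%` assumes it is a trusted, single-character value; a user-supplied delimiter should be validated first.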


