I am having trouble splitting the values of a bulk insert: the idea is to issue one INSERT for every 10 values while reading the entire contents of a CSV file.
The code below already reads the whole CSV and inserts everything in a single statement, but I cannot work out how to split the VALUES list so that, later on, I can insert 10 thousand values at a time.
import csv

# On Airflow 1.x the import path is airflow.hooks.mysql_hook instead.
from airflow.providers.mysql.hooks.mysql import MySqlHook


def bulk_insert(table_name, **kwargs):
    mysql_connection = MySqlHook(mysql_conn_id='id_db')
    conn = mysql_connection.get_conn()
    cursor = conn.cursor()
    with open('/pasta/arquivo.csv') as f:
        reader = csv.reader(f, delimiter='\t')
        sql = """INSERT INTO user (id,user_name) VALUES"""
        # Concatenates every row into one giant statement; note that
        # building SQL from raw strings like this is injection-prone.
        for row in reader:
            sql += "(" + row[0] + " , '" + row[1] + "'),"
        cursor.execute(sql[:-1])  # drop the trailing comma
    conn.commit()
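One way to get the batching you describe is to buffer rows and flush them with cursor.executemany, which also parameterizes the values instead of concatenating strings. A minimal sketch, assuming the same table, connection id, and file path as above; the function name bulk_insert_batched and the batch_size parameter are my own (set to 10 here, raise it to 10000 later):

import csv

from airflow.providers.mysql.hooks.mysql import MySqlHook


def bulk_insert_batched(batch_size=10, **kwargs):
    hook = MySqlHook(mysql_conn_id='id_db')
    conn = hook.get_conn()
    cursor = conn.cursor()
    sql = "INSERT INTO user (id, user_name) VALUES (%s, %s)"
    batch = []
    with open('/pasta/arquivo.csv') as f:
        for row in csv.reader(f, delimiter='\t'):
            batch.append((row[0], row[1]))
            if len(batch) >= batch_size:
                cursor.executemany(sql, batch)  # one INSERT per batch
                batch = []
        if batch:  # flush any leftover rows
            cursor.executemany(sql, batch)
    conn.commit()

With the MySQLdb/mysqlclient driver that the hook uses, executemany rewrites the statement into a single multi-row INSERT per call, so each flush is one round trip rather than batch_size separate inserts.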
Use LOAD DATA, MySQL's built-in bulk-insert tool. There is no need to re-invent the wheel by doing this manually from a Python script.
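For completeness, a sketch of that approach through the same hook, assuming local_infile is enabled on both the MySQL server and the client connection (MySqlHook also ships a bulk_load helper that wraps the same statement):

from airflow.providers.mysql.hooks.mysql import MySqlHook


def load_csv(**kwargs):
    hook = MySqlHook(mysql_conn_id='id_db')
    # Hand the whole file to MySQL in one server-side bulk load;
    # LOAD DATA LOCAL INFILE needs local_infile=1 on server and client.
    hook.run(r"""
        LOAD DATA LOCAL INFILE '/pasta/arquivo.csv'
        INTO TABLE user
        FIELDS TERMINATED BY '\t'
        (id, user_name)
    """)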