
I have a Pandas DataFrame in which some columns contain values longer than 65,535 characters. When I exported the data to MySQL with df.to_sql(con=engine, name=table_name, if_exists='replace', index=False), those values were truncated to 65,535 characters, the maximum length of MySQL's TEXT type.

Is there a way to automatically convert a column to LONGTEXT or BLOB (instead of TEXT) when it holds values longer than 65,535 characters, so that the table content won't be truncated?

1 Answer

This might be a workaround. The only catch is that you need to know in advance which columns must be converted to LONGTEXT, and pass them explicitly via the dtype argument of to_sql:

from sqlalchemy.dialects.mysql import LONGTEXT

# Override the default TEXT type for the oversized columns
dtype = {
    "long_column_1": LONGTEXT,
    "long_column_2": LONGTEXT,
}
df.to_sql(con=engine, name=table_name, if_exists='replace', index=False, dtype=dtype)
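If you don't want to maintain that column list by hand, you could detect the oversized columns from the DataFrame itself. The sketch below is one way to do that (the helper name oversized_text_columns is my own, not a pandas API): it scans the string/object columns and returns those whose longest value exceeds the MySQL TEXT limit of 65,535 characters.

```python
import pandas as pd

def oversized_text_columns(df, limit=65535):
    """Return the names of object/string columns whose longest value
    exceeds `limit` characters (MySQL's TEXT limit), i.e. the columns
    that need LONGTEXT to avoid truncation."""
    cols = []
    for col in df.select_dtypes(include="object").columns:
        # astype(str) guards against mixed/NaN values; max length per column
        if df[col].astype(str).str.len().max() > limit:
            cols.append(col)
    return cols
```

You can then build the dtype mapping automatically before exporting, e.g. dtype = {c: LONGTEXT for c in oversized_text_columns(df)} and pass it to to_sql as above. Note the limit is measured in characters here, while MySQL's TEXT limit is 65,535 bytes, so multi-byte text may need a lower threshold.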