
I have been trying to load a huge DataFrame into a Postgres table with SQLAlchemy, but the process always gets killed after a while. Is there a better way to do this with pandas?

...

>>> engine = create_engine('postgresql://stargateuser:5tar9ate@localhost/stargate', encoding='utf-8', echo=True)

>>> MainName.to_sql("landingpage_mainname", con=engine, if_exists="replace")
Killed
Have you tried the chunksize keyword? e.g. MainName.to_sql("landingpage_mainname", con=engine, if_exists="replace", chunksize=100). Here's the doc for to_sql. Commented Aug 22, 2018 at 12:45

1 Answer


This works! Thanks @JohnChing

MainName.to_sql("landingpage_mainname", con=engine,
                if_exists="replace", chunksize=200000)
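With chunksize, pandas issues the INSERTs in batches rather than building one giant statement, which is usually enough to keep memory bounded on the write side. If constructing MainName itself is what exhausts memory, the same idea can be pushed further by never holding the whole DataFrame at once. A minimal sketch, assuming the data originates from a CSV file (the file name and the 200,000-row batch size here are hypothetical, not from the original post):

import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql://stargateuser:5tar9ate@localhost/stargate")

# Read the source in chunks and write each piece to the table,
# so only one chunk is in memory at a time.
reader = pd.read_csv("mainname.csv", chunksize=200_000)  # hypothetical source file
for i, chunk in enumerate(reader):
    chunk.to_sql(
        "landingpage_mainname",
        con=engine,
        if_exists="replace" if i == 0 else "append",  # recreate table on first chunk only
        index=False,
        chunksize=200_000,  # also batch the INSERTs on the write side
    )

If the DataFrame already fits in memory, the single to_sql call above with chunksize is the simpler route.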