I have been trying to load a huge DataFrame into a Postgres table with SQLAlchemy, but the process always gets killed after a while.
Is there a better way to do this with pandas?
...
>>engine = create_engine('postgresql://stargateuser:5tar9ate@localhost/stargate',encoding='utf-8', echo=True)
>>MainName.to_sql("landingpage_mainname", con=engine, if_exists="replace")
Killed
Try passing chunksize so pandas writes the rows in batches of that size instead of serializing the whole frame at once:

MainName.to_sql("landingpage_mainname", con=engine, if_exists="replace", chunksize=100)

Here's the doc for to_sql
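A minimal, self-contained sketch of the chunked write (using an in-memory SQLite engine and a toy DataFrame as stand-ins for the real Postgres engine and MainName frame):

```python
import pandas as pd
from sqlalchemy import create_engine

# Stand-ins for illustration only: the real code points at Postgres
# and uses the existing MainName DataFrame.
engine = create_engine("sqlite:///:memory:")
df = pd.DataFrame({"name": [f"name_{i}" for i in range(1000)]})

# chunksize=100 makes pandas issue the INSERTs in batches of 100 rows,
# so the full frame never has to be held as one giant statement.
df.to_sql("landingpage_mainname", con=engine, if_exists="replace",
          chunksize=100, index=False)

# Verify everything arrived.
row_count = pd.read_sql("SELECT COUNT(*) AS n FROM landingpage_mainname",
                        con=engine)["n"].iloc[0]
```

Tune the chunk size to your available memory; larger chunks mean fewer round trips but a bigger per-batch footprint.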