I'm trying to import the data, but I'm getting a MemoryError. I already increased the virtual memory; the dataset is 2.71 GB. I thought about setting the data types in advance to optimize memory consumption, so I found this article: Optimize Pandas Memory Usage for Large Datasets.
import pathlib
import pandas as pd

base_path = pathlib.Path('dataset')
dfs = []
for file in base_path.iterdir():
    # 'file' is already a full path; DataFrame.append was removed in pandas 2.0,
    # and concatenating once at the end avoids repeated copies of the whole frame
    dfs.append(pd.read_csv(file))
base_airbnb = pd.concat(dfs, ignore_index=True)
display(base_airbnb)
How do I set pandas column types to decrease memory consumption?
ParserError: Error tokenizing data. C error: out of memory
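For reference, here is a minimal sketch of what passing per-column dtypes to `read_csv` looks like. The column names (`price`, `minimum_nights`, `room_type`) are hypothetical placeholders, not the actual Airbnb columns, and the CSV is inlined just to make the example self-contained:

```python
import io
import pandas as pd

# Tiny inline CSV standing in for one of the dataset files
csv = (
    "price,minimum_nights,room_type\n"
    "120.5,2,Entire home\n"
    "80.0,1,Private room\n"
    "95.0,3,Private room\n"
)

# Default parse: pandas infers float64 / int64 / object
default = pd.read_csv(io.StringIO(csv))

# Explicit dtypes: smaller numeric types, 'category' for low-cardinality strings
small = pd.read_csv(
    io.StringIO(csv),
    dtype={"price": "float32", "minimum_nights": "int8", "room_type": "category"},
)

# Compare per-column memory to see the effect of the narrower dtypes
print(default.memory_usage(deep=True))
print(small.memory_usage(deep=True))
```

The same `dtype=` mapping can be passed inside the loop over `base_path.iterdir()`, so each file is parsed directly into the compact types instead of being converted after the fact.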