  • You can change the settings for PHP's execution time limits, which would solve the timeout issues. This question may be better suited to Stack Overflow, though, as it borders on specific implementation details. Commented May 9, 2014 at 4:55
  • You could insert all the new data into a temp table, add all old rows whose id isn't in the temp table to it (with SQL), then delete the real table and switch in the temp table (in a transaction, of course). I would imagine this is far faster than doing 40k selects, inspecting each result, and issuing an update or insert for each one. Commented May 9, 2014 at 6:28
  • Have you considered using ETL tools like Pentaho Data Integration or Talend Open Studio (both are free)? You can do this in 5 minutes, and they do other useful things too. 40k rows is not big data. Commented May 25, 2014 at 15:54
  • Hi @imel96: please see how I did it; I have answered my own question. Also, since I do not have access to change anything on the production server, I cannot use these tools and will stick to conventional techniques. I will check them out in the future. Commented May 26, 2014 at 23:51
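  • The temp-table approach from the second comment can be sketched roughly as follows. This is a minimal illustration using Python's sqlite3 with a hypothetical `products` table keyed by `id` (the thread's actual table and columns are not shown); the "switch" step is simulated by emptying and refilling the real table, whereas on a server like MySQL you would more likely use `RENAME TABLE`:

```python
import sqlite3

# Hypothetical schema: a "products" table keyed by "id".
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE products (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO products VALUES (?, ?)",
                 [(1, "old-a"), (2, "old-b"), (3, "old-c")])

# Incoming data: updates ids 1 and 2, adds id 4; id 3 is untouched.
new_rows = [(1, "new-a"), (2, "new-b"), (4, "new-d")]

with conn:  # one transaction, as the comment suggests
    conn.execute("CREATE TEMP TABLE staging (id INTEGER PRIMARY KEY, name TEXT)")
    conn.executemany("INSERT INTO staging VALUES (?, ?)", new_rows)
    # Carry over old rows whose id is NOT already in the temp table.
    conn.execute("""INSERT INTO staging
                    SELECT * FROM products
                    WHERE id NOT IN (SELECT id FROM staging)""")
    # "Switch in" the temp table: replace the real table's contents.
    conn.execute("DELETE FROM products")
    conn.execute("INSERT INTO products SELECT * FROM staging")
    conn.execute("DROP TABLE staging")

print(sorted(conn.execute("SELECT * FROM products")))
# → [(1, 'new-a'), (2, 'new-b'), (3, 'old-c'), (4, 'new-d')]
```

  The whole load is three set-based statements instead of 40k round trips, which is where the speedup the comment predicts comes from.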