  • This is just wrong. The smaller file needs to fit in memory as an array (hash table) to overcome the ordering problem; the larger file is processed serially, record by record. In extreme conditions, it would be a trivial improvement to split the smaller file into, say, 1 GB sections and make multiple passes over the larger file. The first pass might need to be "special": restructuring the input columns into the "joined" format and placing default values in any columns for which that pass did not contain the required update data. Commented Feb 8, 2020 at 13:58
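For concreteness, here is a minimal sketch of the approach the comment describes: load the smaller file into an in-memory hash table keyed by the join column, then stream the larger file record by record and emit joined rows. The CSV format, the file paths, and the `key` column name are illustrative assumptions, not details from the original post.

```python
import csv

def hash_join(small_path, large_path, out_path, key="id"):
    """Join two files on `key`: the smaller file is held in memory
    as a hash table; the larger file is streamed serially."""
    # Load the smaller file entirely into memory, keyed by the join column.
    small = {}
    with open(small_path, newline="") as f:
        for row in csv.DictReader(f):
            small[row[key]] = row

    # Stream the larger file one record at a time; no sorting required.
    with open(large_path, newline="") as f, open(out_path, "w", newline="") as out:
        writer = None
        for row in csv.DictReader(f):
            match = small.get(row[key])
            if match is None:
                continue  # inner join: skip records with no counterpart
            joined = {**row, **match}
            if writer is None:
                writer = csv.DictWriter(out, fieldnames=list(joined))
                writer.writeheader()
            writer.writerow(joined)

hash_join("small.csv", "large.csv", "joined.csv")
```

If the smaller file exceeded available memory, it could be split into sections and this join repeated once per section, making one pass over the larger file each time, as the comment suggests.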