  • Thanks for the suggestion, gogoud. The only concern I have with this is the creation of a separate file for each URL. Ideally I need a single output file, as I will be performing further global cleaning of the resulting data, and having it in separate files may get messy. I suppose with your suggestion I could then introduce a command to join the data from the separate outputs into a single file, but I will wait and see if anyone provides a simpler solution before I accept this answer. Thanks again. Commented Nov 25, 2015 at 15:27
  • I added a short script that outputs a single stream of data and avoids any leftover files. Commented Nov 25, 2015 at 15:39
  • gogoud - OK mate, going to try this now; I will get back to you, thanks. Commented Nov 25, 2015 at 16:14
  • Thanks gogoud, that is working nicely; thanks for your input, I've accepted your answer :) Commented Nov 25, 2015 at 16:44
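
The accepted script itself is not shown in these comments, but the approach they describe, fetching each URL and emitting one combined stream instead of one file per URL, might look roughly like the sketch below. The file names urls.txt and combined.out, and the use of wget, are assumptions for illustration only, not details taken from the post.

    #!/bin/bash
    # Minimal sketch (assumed setup): read URLs from urls.txt (one per line)
    # and append every page to a single combined stream, so no per-URL files
    # are left behind. "urls.txt" and "combined.out" are hypothetical names.
    while IFS= read -r url; do
        # -q: quiet; -O -: write the downloaded content to stdout
        wget -q -O - "$url"
    done < urls.txt > combined.out

Because everything goes to standard output, the redirection to combined.out could equally be replaced with a pipe straight into whatever global cleaning step follows.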