I have found a website that hosts a few files that I'm after, but there are too many to download them all individually. The filenames take a fairly standard, reproducible form, e.g. 1_a, 1_b, 1_c, etc.
Is there a way, using the Linux command line, to automate downloading them all with wget? I can easily put the filenames in a text file, one entry per line, and point the command at it, but each entry wouldn't be the whole URL, just the bit that changes, so the command would need to look something like:
wget url.com/files/(bit from file).doc sourcefile.txt
and basically be able to substitute each entry from the source file into the bit in the brackets.
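Something like this is what I'm imagining (url.com/files, the .doc extension, and sourcefile.txt are just from my example above, and the three suffixes are a stand-in for the real list):

```shell
# Build full URLs from the suffix list, then hand the whole list to wget -i.
printf '%s\n' 1_a 1_b 1_c > sourcefile.txt            # stand-in for my real list
sed 's|.*|http://url.com/files/&.doc|' sourcefile.txt > urls.txt
# wget -i urls.txt    # downloads every URL listed in urls.txt
```

Is that roughly the right approach, or is there a neater way to do the substitution?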
Also, at one stage a large chunk (a few hundred) of the files are simply sequentially numbered, so could I use a for loop for that part? If so, what would the syntax look like on the command line?
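For the numbered chunk, this is the sort of loop I had in mind (the range 1..300 is made up, and url.com/files is again from my example):

```shell
# Generate the sequentially numbered URLs with a for loop.
for i in $(seq 1 300); do
    printf 'http://url.com/files/%s.doc\n' "$i"
done > urls_seq.txt
# wget -i urls_seq.txt    # then fetch them all in one go
```

I've also seen bash brace expansion mentioned, e.g. wget http://url.com/files/{1..300}.doc, where the shell expands the braces into separate URLs before wget runs. Would that work just as well?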