Using your same find command, this will return the URLs that match the regex:
find . -path "*alder/ * / * .html" -exec grep -oh "http://[^'\"]*" {} +
Unlike find ... -print | xargs command ..., this approach works on files whose names contain whitespace or other difficult characters.
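For comparison, here is a sketch of a pipe-based form that is also safe with such names: it relies on the null-delimited -print0 and -0 options of GNU find and xargs, and the -name pattern is only illustrative.

find . -name '*.html' -print0 | xargs -0 grep -oh "http://[^'\"]*"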
The -o option tells grep to print only the part of the line that matches, rather than the whole line containing the match. The -h option suppresses the file-name prefix that grep would otherwise add when searching multiple files.
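As an illustration, suppose page.html is a hypothetical file containing the line <a href="http://example.com/x">link</a>:

grep "http://[^'\"]*" page.html    # prints the whole line containing the match
grep -o "http://[^'\"]*" page.html # prints only http://example.com/x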
The find command in the OP only matches files whose paths contain spaces. Since I suspect that this is not what you want, here is an alternative form of the find command that finds all .html files, at any depth, under the subdirectories of the current directory whose names end in alder:
find *alder/ -name '*.html' -exec grep -oh "http://[^'\"]*" {} +
More robust approach
To guard against other kinds of malformed HTML, cas suggests letting whitespace or > also terminate a URL, and accepting https as well as http:
find . -path "*alder/ * / * .html" -exec grep -oEh "https?://[^'\"[:space:]>]*" {} +
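If each URL should appear only once, the output can be piped through sort -u; the command below is a usage sketch that combines this with the alternative find form shown above.

find *alder/ -name '*.html' -exec grep -oEh "https?://[^'\"[:space:]>]*" {} + | sort -u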