I'm looking for a utility that will download files from the Web based on lists of URLs stored in text files.
This is for a non-profit organization that has secured permission from the authors to distribute their works to people in poor parts of the world where the Internet is not available.
I seek a modular solution that can accommodate various link types, such as YouTube videos, Google Docs, and FTP. For this project I need handlers for YouTube and Google Docs.
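One way the modular design described above could be sketched: a registry mapping host names to per-site handlers, driven by a text file containing one URL per line. The handler names, host strings, and print-based stubs here are illustrative assumptions, not requirements of the posting.

```python
# Sketch of a modular URL-list downloader: dispatch by host name.
# Handlers below are stubs; a real solution would call an actual
# downloader (these names are hypothetical, not specified in the post).
from urllib.parse import urlparse

def download_youtube(url: str) -> None:
    print(f"would fetch video: {url}")   # stub for a YouTube downloader

def download_google_docs(url: str) -> None:
    print(f"would export doc: {url}")    # stub for a Google Docs exporter

# Map host names to handlers; adding FTP later means adding one entry.
HANDLERS = {
    "www.youtube.com": download_youtube,
    "docs.google.com": download_google_docs,
}

def process_list(path: str) -> None:
    """Read one URL per line from a text file and dispatch by host."""
    with open(path) as f:
        for line in f:
            url = line.strip()
            if not url:
                continue
            handler = HANDLERS.get(urlparse(url).netloc)
            if handler:
                handler(url)
            else:
                print(f"no handler for: {url}")
```

New link types then require only a new handler function and one registry entry, which keeps each downloader independently replaceable.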
For example, the YouTube downloader would take a URL like...
<[url removed, login to view]>
and then store the resulting file in a folder named after the host name ([[url removed, login to view]]), with the filename taken from the video ID (0pL4hQHunxg).
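The host/video-ID mapping described above might look like the following sketch. Because the example URL was redacted from the posting, the `www.youtube.com` host and the `watch?v=` URL shape are assumptions; only the video ID (0pL4hQHunxg) appears in the original text.

```python
# Sketch: map a YouTube watch URL to <host>/<video-id><extension>.
# The URL shape (watch?v=...) and host are assumptions, since the
# posting's example URL was redacted by the site.
from urllib.parse import urlparse, parse_qs
from pathlib import Path

def output_path(url: str, extension: str = ".mp4") -> Path:
    """Build the output path: folder = host name, filename = video ID."""
    parts = urlparse(url)
    video_id = parse_qs(parts.query)["v"][0]  # e.g. 0pL4hQHunxg
    return Path(parts.netloc) / (video_id + extension)

print(output_path("https://www.youtube.com/watch?v=0pL4hQHunxg").as_posix())
# www.youtube.com/0pL4hQHunxg.mp4
```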
The solution can use one or more third-party tools. I prefer open source, but I'm also willing to pay for licensed software.
1) All deliverables will be considered "work made for hire" under U.S. Copyright law. Employer will receive exclusive and complete copyrights to all work purchased. (No GPL, GNU, 3rd party components, etc. unless all copyright ramifications are explained AND AGREED TO by the employer on the site per the worker's Worker Legal Agreement).
2) Complete and fully-functional working program(s) in executable form as well as complete source code of all work done.
3) Deliverables must be in ready-to-run condition, as follows (depending on the nature of the deliverables):
a) For web sites or other server-side deliverables intended to only ever exist in one place in the Employer's environment: deliverables must be installed by the Worker in ready-to-run condition in the Employer's environment.
b) For all others including desktop software or software the employer intends to distribute: A software installation package that will install the software in ready-to-run condition on the platform(s) specified in this project.
Platform: Windows, with a command-line interface.