I am looking for shell or Perl scripts to aggregate data from U.S. Government sites such as [url removed, login to view] and the Library of Congress ([url removed, login to view]), convert it to XML based on the standards set at [url removed, login to view], and finally load it into a MySQL database. I am initially looking for Congressional information: current legislation, legislative history, current and past representatives of the House and Senate, bios, and voting records, all similar to this project: [url removed, login to view]

A really great resource for how to accomplish some of these tasks can be found here: [url removed, login to view]
See also:
- Interagency Committee on Government Information (ICGI), [url removed, login to view]
- Federal CIO Council Website, [url removed, login to view]
- [url removed, login to view], [url removed, login to view]
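As a rough sketch of the scrape-to-XML step, something like the following could sit between the fetch stage (e.g. curl/wget) and the database load. The tab-separated field layout and the element names here are my own assumptions for illustration, not part of any standard from the sites above:

```shell
#!/bin/sh
# Sketch: convert one tab-separated legislator record into XML.
# Assumed field layout: name, chamber, state, party.

record_to_xml() {
    # Read tab-separated fields from stdin and wrap each record
    # in a hypothetical <legislator> element.
    while IFS="$(printf '\t')" read -r name chamber state party; do
        printf '<legislator>\n'
        printf '  <name>%s</name>\n' "$name"
        printf '  <chamber>%s</chamber>\n' "$chamber"
        printf '  <state>%s</state>\n' "$state"
        printf '  <party>%s</party>\n' "$party"
        printf '</legislator>\n'
    done
}

# Example: one record as it might come out of a scraped page.
printf 'Jane Doe\tSenate\tCA\tIndependent\n' | record_to_xml
```

In practice the records would come from parsing the fetched HTML, and the output would be validated against whichever XML schema is chosen before loading.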
1) Complete and fully-functional working program(s) in executable form as well as complete source code of all work done.
2) Deliverables must be in ready-to-run condition, as follows (depending on the nature of the deliverables):
a) For web sites or other server-side deliverables intended to exist only in one place in the Buyer's environment: deliverables must be installed by the Seller in ready-to-run condition in the Buyer's environment.
b) For all others, including desktop software or software the Buyer intends to distribute: a software installation package that will install the software in ready-to-run condition on the platform(s) specified in this bid request.
3) All deliverables will be considered "work made for hire" under U.S. Copyright law. Buyer will receive exclusive and complete copyrights to all work purchased. (No GPL, GNU, 3rd party components, etc. unless all copyright ramifications are explained AND AGREED TO by the buyer on the site per the coder's Seller Legal Agreement).
The server(s) are Debian Linux 4.0.1 running PHP 5, MySQL 5, and Apache 2. I am looking for either shell- or Perl-based scripts, but am open to alternatives if someone has a better idea.
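For the final load into MySQL 5, one simple approach is to have the scripts emit INSERT statements and pipe them into the stock mysql client already on the server. A minimal sketch, assuming a hypothetical `legislators` table and the same tab-separated record format as above:

```shell
#!/bin/sh
# Sketch: turn parsed records into INSERT statements for MySQL.
# The table and column names (legislators, name, chamber, state,
# party) are placeholders; the real schema would follow whatever
# XML standard is adopted.

sql_escape() {
    # Double any single quotes so the value is safe inside '...'
    # (standard SQL quoting, accepted by MySQL).
    printf '%s' "$1" | sed "s/'/''/g"
}

record_to_sql() {
    while IFS="$(printf '\t')" read -r name chamber state party; do
        printf "INSERT INTO legislators (name, chamber, state, party) VALUES ('%s', '%s', '%s', '%s');\n" \
            "$(sql_escape "$name")" "$(sql_escape "$chamber")" \
            "$(sql_escape "$state")" "$(sql_escape "$party")"
    done
}

# The generated SQL could then be piped straight into the client, e.g.:
#   record_to_sql < records.tsv | mysql -u someuser -p somedatabase
printf "Jane O'Doe\tHouse\tNY\tDemocrat\n" | record_to_sql
```

For large batches, LOAD DATA INFILE or mysqlimport on the tab-separated files would be faster than row-by-row INSERTs; the above is just the smallest moving part.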