I need a Perl script (or a suitable alternative app that can be run from Windows) that will download a specific data file from an FTP site daily, re-sort the field data, and apply logic to certain fields to adjust the final output. There are 50 input fields, delimited by character count only (fixed width), and up to 1.4 million lines in the file. The script must take each line and split it by field length into 50 fields, then export the data to a tab-delimited file with approximately 12 fields (most of the original fields are to be ignored), applying certain criteria to the values of certain fields. Most of the fields in the original file will either be ignored or carried through to the final file without change. The script should be fairly fast: able to run through the entire file within approximately one hour.
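For what it's worth, fixed-width records like these are a natural fit for Perl's unpack, and processing one line at a time keeps memory usage flat no matter how large the file is. A minimal sketch, with placeholder field widths, field selections, and file names (the real widths come from the uploaded field-spec file, and the real per-field logic is up to the buyer):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Placeholder template: one "A<width>" per fixed-width field.
# The real script would list all 50 widths from the spec file.
my $template = 'A8 A12 A6';

open my $in,  '<', 'input.dat'  or die "input: $!";
open my $out, '>', 'output.tsv' or die "output: $!";

while ( my $line = <$in> ) {    # stream line by line; never slurp 1.4M lines
    chomp $line;
    my @fields = unpack $template, $line;   # split by character counts
    # ... apply the per-field criteria here, keeping only the ~12 wanted fields ...
    print {$out} join( "\t", @fields[ 0, 2 ] ), "\n";
}
close $in;
close $out;
```

The while-loop form is what keeps this from "processing the file as a single piece or array": only one record is ever in memory at a time.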
1) Complete and fully-functional working program(s) in executable form as well as complete source code of all work done.
2) Deliverables must be in ready-to-run condition, as follows (depending on the nature of the deliverables):
a) For web sites or other server-side deliverables intended to only ever exist in one place in the Buyer's environment: deliverables must be installed by the Seller in ready-to-run condition in the Buyer's environment.
b) For all others, including desktop software or software the buyer intends to distribute: a software installation package that will install the software in ready-to-run condition on the platform(s) specified in this bid request.
3) All deliverables will be considered "work made for hire" under U.S. Copyright law. Buyer will receive exclusive and complete copyrights to all work purchased. (No GPL, GNU, 3rd party components, etc. unless all copyright ramifications are explained AND AGREED TO by the buyer on the site per the coder's Seller Legal Agreement).
* * *This broadcast message was sent to all bidders on Thursday Jun 14, 2007 1:08:55 PM:
A file indicating the input file field specs has been uploaded. The output file is a simple tab-delimited file with about a dozen fields. The speed of the script is not crucial; any well-written script should blaze through this as fast as could be expected. My only concern was that poorly shaped logic could theoretically bog down on a file this size, or that a script might attempt to process the whole file as a single piece or array.
Platform: I have Cygwin installed with Perl, so I can run a script from the command line. Alternatively, a Windows app or other suitable executable would be fine.
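For the daily download step, the Net::FTP module that ships with core Perl works fine under Cygwin. A minimal sketch, assuming a hypothetical host, remote filename, and anonymous login (the real server details and credentials would come from the buyer):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Net::FTP;    # part of core Perl; no extra install needed under Cygwin

# Hypothetical host, credentials, and filenames for illustration only.
my $ftp = Net::FTP->new( 'ftp.example.com', Timeout => 60 )
    or die "connect failed: $@";
$ftp->login( 'anonymous', 'user@example.com' )
    or die "login failed: ", $ftp->message;
$ftp->binary;    # binary mode avoids line-ending translation on the data file
$ftp->get( 'daily_data.dat', 'input.dat' )
    or die "download failed: ", $ftp->message;
$ftp->quit;
```

Scheduled with Windows Task Scheduler or a cron entry under Cygwin, this covers the "daily" part; the parsing script can then run against the local `input.dat`.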