I need a Perl script (or a suitable alternative app that can be run from Windows) that will download a specific data file from an FTP site daily, re-sort the field data, and apply logic to certain fields to adjust the final output. There are 50 input fields, delimited by character count only (fixed-width), and up to 1.4 million lines in the file. The script must take each line and split it by field length into 50 fields, then export the data to a tab-delimited file with approximately 12 fields (most of the original fields are to be ignored), applying certain criteria to the values of certain fields. Most of the fields in the original file will either be ignored or passed through to the final file unchanged. The script should be fairly fast: able to run through the entire file within approximately one hour.
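For illustration only, here is a rough sketch of the kind of line-by-line conversion I have in mind. The field widths, field indices, and the example logic are all placeholders; the real template would be built from the uploaded field-spec file. Reading one record at a time keeps memory flat regardless of file size:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Placeholder fixed-width template: 50 fields, widths to be taken
    # from the uploaded field-spec file ("A10" = a 10-character field).
    my $template = 'A10 A8 A25 A12 ' . ('A15 ' x 46);

    open my $in,  '<', 'input.dat'  or die "Cannot open input: $!";
    open my $out, '>', 'output.tab' or die "Cannot open output: $!";

    while (my $line = <$in>) {
        chomp $line;

        # Split the fixed-width record into its 50 fields.
        my @f = unpack $template, $line;

        # Purely illustrative field-level logic: skip flagged records
        # and reformat a YYYYMMDD date field.
        next if $f[7] eq 'INACTIVE';
        $f[2] =~ s{(\d{4})(\d{2})(\d{2})}{$1-$2-$3};

        # Keep only the ~12 wanted fields, tab-delimited.
        print {$out} join("\t", @f[0, 2, 3, 5, 7, 9, 11, 14, 20, 31, 42, 49]), "\n";
    }

    close $in;
    close $out;

Perl's unpack is fast enough that 1.4 million lines processed this way should finish well inside the one-hour window on ordinary hardware.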
* * * This broadcast message was sent to all bidders on Thursday, Jun 14, 2007, 1:08:55 PM:
A file indicating the input file field specs has been uploaded. The output file is a simple tab-delimited file with about a dozen fields. The speed of the script is not crucial; any well-written script should blaze through this about as fast as could be expected. My only concerns are that the logic could theoretically bog down on a file this size, or that a script might be written that attempts to process the file as a single piece or array rather than line by line.
Platform: I have Cygwin installed with Perl to run a script from the command line. Alternatively, a Windows app or other suitable executable would be fine.
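For the daily download step, a minimal sketch using Net::FTP (which ships with Perl, so it works under Cygwin with no extra installs) would look something like the following; the host, credentials, and paths are placeholders:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Net::FTP;

    # Placeholder connection details.
    my $host   = 'ftp.example.com';
    my $user   = 'anonymous';
    my $pass   = 'user@example.com';
    my $remote = '/pub/data/daily_extract.dat';
    my $local  = 'input.dat';

    my $ftp = Net::FTP->new($host, Timeout => 120)
        or die "Cannot connect to $host: $@";
    $ftp->login($user, $pass) or die 'Login failed: ',    $ftp->message;
    $ftp->binary;             # avoid newline translation surprises
    $ftp->get($remote, $local) or die 'Download failed: ', $ftp->message;
    $ftp->quit;

The whole job could then be run from the Cygwin prompt (say, perl daily_extract.pl) and scheduled daily with Windows Task Scheduler or Cygwin's cron.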