I have 102,000 images in one directory on my Rackspace account. They are NOT sorted into sub-directories; they are all in one giant directory. I need to download these files from my server, but I cannot, because it seems most FTP programs cannot open/process a folder with 102,000 files.
The real problem is that I do NOT have SSH access, which is why I believe I need to use PHP.
I need a PHP script (or whatever you recommend that might work better) to take all 102,000 files and ZIP them into several separate archives that I can download. My main concern is the PHP execution timeout, given how many files there are, so I assume zipping them all into a single archive is not possible.
I can place the PHP file in the directory and run it, I just need it written for me.
Some other ideas that my friend suggested to me:
- Get a list of the filenames and then generate a script that pulls them down one at a time.
- Pull them down using wildcards that match a small number at a time: A*.jpg, B*.jpg, etc.
- Use several cron jobs to build several similarly sized ZIPs using wildcards as above.
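As a sketch of the batched-ZIP idea, something like the following could work. This is a hypothetical example, not a finished deliverable: it assumes the script is placed in the image directory, that the images are `.jpg` files, and that the `zip` extension is available. It creates one archive per run and then exits, so a cron job (or repeated manual requests) can work through all the batches without ever hitting PHP's execution timeout. The batch size and file pattern are assumptions to be tuned.

```php
<?php
// Hypothetical sketch: split the directory's files into fixed-size ZIP batches.
// Re-run this script (manually or via cron) until every batch archive exists.

$batchSize = 1000;            // files per archive; tune to stay under the timeout
$files     = glob('*.jpg');   // adjust the pattern to match your images
sort($files);

// array_chunk() splits the file list into groups of $batchSize.
$batches = array_chunk($files, $batchSize);

foreach ($batches as $i => $batch) {
    $zipName = sprintf('batch_%04d.zip', $i);
    if (file_exists($zipName)) {
        continue;             // already built on a previous run; skip it
    }
    $zip = new ZipArchive();
    if ($zip->open($zipName, ZipArchive::CREATE) !== true) {
        die("Cannot create $zipName\n");
    }
    foreach ($batch as $file) {
        $zip->addFile($file);
    }
    $zip->close();
    // Stop after one archive per request so max_execution_time is never hit;
    // the next run picks up the next unfinished batch.
    exit("Created $zipName\n");
}
echo "All batches done.\n";
```

Because each run is resumable (finished archives are skipped), a timeout mid-batch only costs one partial archive, which the next run would need to delete and rebuild.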
Message me for the [url removed, login to view] file. The server runs PHP 5.3.