[RESOLVED] Downloading to PC
Hi, normally an FTP client is used on a PC to download your folders and files from the server. New security restrictions on my host have led me to see if it is possible to do this with a Perl script instead of my FTP software.
I upload my images with a script, so I'm thinking: can I do something like that, but in reverse?
It would be just for text based files. All my admin is inside a protected directory, so the download script will be there too.
I would need to list the files in two directories (my blog archive and public HTML). That way I can download an individual file when a new blog post or site page is created.
The rest are all in one directory (with files and subdirectories) and so downloading the entire directory to my hard disk will be the issue.
Is this possible with a short script? I'd like to avoid modules unless they are standard ones installed with Perl by default.
Any suggestions or help?
Thanks in advance.
Last edited by edatz; 04-05-2012 at 05:31 AM.
I am looking at Net::FTP, but I would be unable to implement that on my local testbed server (I think). I'm scared of attempting anything on a live server without making sure it works locally first (I brought down a live one once doing that kind of thing).
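For what it's worth, Net::FTP ships with Perl's core libnet distribution, so it fits the "standard modules only" requirement. Here's a minimal, untested sketch of a single-file download; the host, login, directory and filename are all placeholders, not anything from this thread:

```perl
use strict;
use warnings;
use Net::FTP;

# Download one text file over FTP. Every connection detail below is
# a placeholder -- substitute your own host, credentials and paths.
sub fetch_file {
    my ($host, $user, $pass, $dir, $file) = @_;
    my $ftp = Net::FTP->new($host, Timeout => 30)
        or die "Cannot connect to $host: $@";
    $ftp->login($user, $pass) or die 'Login failed: ', $ftp->message;
    $ftp->cwd($dir)           or die 'cwd failed: ',   $ftp->message;
    $ftp->ascii;              # text mode, since these are text files
    $ftp->get($file)          or die 'get failed: ',   $ftp->message;
    $ftp->quit;
}

# Example call (commented out so nothing hits a live server):
# fetch_file('ftp.example.com', 'user', 'pass', '/public_html', 'index.html');
```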
Well, to list a directory should not be a problem. The glob function should help you with that. To download a directory, I'd probably just zip it and stream the zipped thing to the client.
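To illustrate the listing half of that suggestion, here is a minimal sketch using only core Perl; the directory path is a placeholder:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# List the plain files (not subdirectories) in one directory using
# the built-in glob(). The path is a placeholder -- use your own.
my $dir   = '/path/to/public_html';
my @files = grep { -f } glob("$dir/*");
print "$_\n" for @files;
```

From a CGI script inside the protected directory, the same list could be printed as links, one per file, so an individual file can be picked for download.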
Hi Sixtease, I've been messing around and arrived at that conclusion too. Doing tests at the moment to set that all up.
I probably should have thought of that in the first place; it just means I'll still have to FTP on occasion for new site page files.
The zip works, but I get extra empty folders for the path components en route to the actual folder I want to back up (which has everything in it okay).
I'm putting it together on my localhost.
Not sure if there is a way around that easily. Any ideas?
my $zip = Archive::Zip->new();
# "'s for variables
The empty folders are inside f
Size-wise it doesn't really matter; it's just that they're there, and it would be nice not to have them, as long as I don't have to list every file to be backed up (in which case I'd probably just leave it as is).
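One way to avoid those parent-path folders, assuming Archive::Zip as in the snippet above, is addTree(): its second argument is the name the tree gets *inside* the archive, so entries come out as e.g. `backup/...` rather than the full on-disk path. A sketch with placeholder paths:

```perl
use strict;
use warnings;
use Archive::Zip qw(:ERROR_CODES);

# Zip a directory tree without embedding its parent path components.
# addTree()'s second argument renames the tree inside the archive.
sub zip_dir {
    my ($src_dir, $name_in_zip, $out_file) = @_;
    my $zip = Archive::Zip->new();
    $zip->addTree($src_dir, $name_in_zip) == AZ_OK
        or die "addTree failed for $src_dir";
    $zip->writeToFileNamed($out_file) == AZ_OK
        or die "could not write $out_file";
}

# Example (placeholder paths):
# zip_dir('/home/user/site/data', 'backup', 'backup.zip');
```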
Dare I ask what the new security restrictions are and why they prompt you to believe a Perl script is the solution?
It's something which blocks FTP clients from servers because a PC using say FileZilla has been cracked and the passwords grabbed allowing the malware to load up crap to the server. I will not discuss it any further on a public forum.
Perl, because it's a bunch safer than PHP with all those exploits etc. 'Sides, I don't know PHP. Anything I do sits inside a secure area, and I haven't had any problems in over 15 years.
By doing this little zip-and-mail thing, I'll save a lot of the time and hassle associated with FTP'ing into the server. Once implemented, FTP will be a rare thing instead of a daily one.
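For the "mail" half of that, a message with a zip attachment can be built with only core MIME::Base64 and piped to sendmail. The poster's actual script isn't shown, so this is just one possible sketch; the addresses, boundary string and sendmail path are all placeholders:

```perl
use strict;
use warnings;
use MIME::Base64 qw(encode_base64);

# Build a simple MIME message carrying the zip as a base64 attachment.
# Addresses and names are placeholders.
sub build_mail {
    my ($to, $from, $subject, $zip_bytes, $zip_name) = @_;
    my $boundary = 'zip-backup-boundary';
    return join "\r\n",
        "To: $to",
        "From: $from",
        "Subject: $subject",
        'MIME-Version: 1.0',
        qq{Content-Type: multipart/mixed; boundary="$boundary"},
        '',
        "--$boundary",
        'Content-Type: text/plain',
        '',
        'Backup attached.',
        "--$boundary",
        qq{Content-Type: application/zip; name="$zip_name"},
        'Content-Transfer-Encoding: base64',
        qq{Content-Disposition: attachment; filename="$zip_name"},
        '',
        encode_base64($zip_bytes),
        "--$boundary--",
        '';
}

# To send, pipe the string to sendmail (path is a placeholder):
# open my $mail, '|-', '/usr/sbin/sendmail -t' or die $!;
# print {$mail} build_mail('you@example.com', 'site@example.com',
#                          'Backup', $zip_bytes, 'backup.zip');
# close $mail;
```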
If it were a Perl FTP client inside the secure area, I would go to it and upload/download my files and not have to worry, as it's not on my PC. So far I've only been cracked once, and that was because of AVG, which I dumped immediately in favour of Avast; no problems since.
Sorry to press the matter, but this is an absurdly silly project. It's common practice to FTP/SFTP to put/get files from a web server. If you have issues securing your local machine, fooling around with Perl scripts to manage the process isn't going to help.
We run AVG at my office and have no issues. We're even using straight FTP, the remote server doesn't support SFTP.
Stop fooling around and get your local machine in order. If it's been compromised, securing your FTP client is undoubtedly the least of your concerns.
Empty folders problem solved.