
Thread: [RESOLVED] Downloading to PC

  1. #1
    Join Date
    Jun 2008
    Posts
    223

    [RESOLVED] Downloading to PC

    Hi, normally an FTP client is used on a PC to download your folders and files from the server. New security restrictions on my host have led me to see if it is possible to do this with a Perl script instead of my FTP software.

    I upload my images with a script, so I'm wondering whether I can do something like that, but in reverse.

    It would be just for text-based files. All my admin is inside a protected directory, so the download script will be there too.

    1)
    In two directories (my blog archive and public HTML), I would need to see the files listed, so that I can download an individual file whenever a new blog or site page is created.

    2)
    The rest are all in one directory (with files and subdirectories), so downloading that entire directory to my hard disk will be the issue.

    Is this possible with a fairly short script? I'd like to avoid modules unless they are standard ones installed by default with Perl.

    Any suggestions or help?

    Thanks in advance.
    Last edited by edatz; 04-05-2012 at 04:31 AM.

  2. #2
    Join Date
    Jun 2008
    Posts
    223
    I am looking at Net::FTP, but I don't think I would be able to test that on my local testbed server. I'm scared of attempting anything on a live server without making sure it works locally first (I brought a live one down once doing that kind of thing).
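
    For reference, the sort of Net::FTP download I have in mind would look roughly like this (host, login and paths are placeholders, not my real details):

    Code:
    use strict;
    use warnings;
    use Net::FTP;   # ships with Perl, so no extra module to install
    
    # Placeholders -- swap in the real host, account and paths.
    my $ftp = Net::FTP->new('ftp.example.com', Passive => 1)
        or die "Cannot connect: $@";
    $ftp->login('username', 'password') or die "Login failed: ", $ftp->message;
    $ftp->cwd('/public_html/blog')      or die "cwd failed: ",   $ftp->message;
    $ftp->ascii;                        # text mode, since these are text-based files
    $ftp->get('index.html', 'f:/backup/index.html')
        or die "get failed: ", $ftp->message;
    $ftp->quit;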
    Last edited by edatz; 04-05-2012 at 07:41 AM.

  3. #3
    Join Date
    Oct 2007
    Location
    Vienna, Austria
    Posts
    389
    Well, listing a directory should not be a problem; the glob function should help you with that. To download a directory, I'd probably just zip it and stream the zipped file to the client.
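
    For the listing part, something like this bare glob sketch should do (the path is just a placeholder):

    Code:
    use strict;
    use warnings;
    
    # Placeholder path -- point it at the blog archive or public HTML directory.
    my $dir = '/home/site/public_html';
    
    # glob matches files and subdirectories; -f keeps only plain files.
    my @files = grep { -f $_ } glob("$dir/*");
    print "$_\n" for @files;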

  4. #4
    Join Date
    Jun 2008
    Posts
    223
    Hi Sixtease, I've been messing around and arrived at that conclusion too. Doing tests at the moment to set that all up.

    I probably should have thought of that in the first place; it just means I will still have to FTP on occasion for new site page files.

  5. #5
    Join Date
    Jun 2008
    Posts
    223
    The zip works, but I get extra empty folders for the path leading to the actual folder I want to back up (which itself has everything in it okay).

    I'm putting it together on my localhost.

    Code:
    use strict;
    use warnings;
    use Archive::Zip qw( :ERROR_CODES );  # addTree() now lives in Archive::Zip itself;
                                          # Archive::Zip::Tree is deprecated
    
    my $buf = "f:/xyz/apache/cgi-bin";    # base path on the local test server
    my $bud = "$buf/some/folder";         # directory to back up
    my $bua = "$buf/ready";               # where the zip is written
    
    bakit();
    
    sub bakit {
        my $zip = Archive::Zip->new();
        # Add the whole directory tree; the archive keeps the full path.
        $zip->addTree($bud);
        $zip->writeToFileNamed("$bua/folder.zip") == AZ_OK
            or die "could not write $bua/folder.zip";
        exit(0);
    }
    
    The empty folders are the nested path, starting at f:
    f 
     xyz 
      apache 
       cgi-bin 
        some 
         folder
    Not sure if there is a way around that easily. Any ideas?

    Size-wise it doesn't really matter; it's just that they are there, and it would be nice not to have them, as long as I don't have to list every file to be backed up (which I may have to do, in which case I'll probably just leave it as is).

  6. #6
    Join Date
    Jan 2007
    Location
    Wisconsin
    Posts
    2,120
    Dare I ask what the new security restrictions are and why they prompt you to believe a Perl script is the solution?
    Jon Wire

    thepointless.com | rounded corner generator

    I agree with Apple. Flash is just terrible.

    Use CODE tags!

  7. #7
    Join Date
    Jun 2008
    Posts
    223
    It's something that blocks FTP clients from the server, because PCs running, say, FileZilla have been cracked and the passwords grabbed, allowing malware to upload junk to the server. I won't discuss it any further on a public forum.

    Perl, because it's a lot safer than PHP with all those exploits, etc. Besides, I don't know PHP. Everything I do sits inside a secure area and hasn't had any problems in over 15 years.

    By doing this little zip-and-mail thing, I'll save a lot of the time and hassle associated with FTPing into the server. Once implemented, FTP will be a rare thing instead of a daily one.
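
    For the mail step, roughly what I have in mind, using MIME::Lite (not a core module, so only an option if the host already has it; the addresses and path are placeholders):

    Code:
    use strict;
    use warnings;
    use MIME::Lite;   # not in core Perl -- check that the host provides it
    
    # Placeholders -- the addresses and zip path are not real.
    my $msg = MIME::Lite->new(
        From    => 'backup@example.com',
        To      => 'me@example.com',
        Subject => 'Site backup',
        Type    => 'multipart/mixed',
    );
    $msg->attach(
        Type     => 'application/zip',
        Path     => '/path/to/ready/folder.zip',
        Filename => 'folder.zip',
    );
    $msg->send;   # defaults to sendmail on most hosts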

    If it were a Perl FTP client inside the secure area, then I could go to it to upload and download my files and not have to worry, since it wouldn't be on my PC. So far I've only been cracked once, and that was because of AVG, which I dumped immediately in favour of Avast; no problems since.

  8. #8
    Join Date
    Jan 2007
    Location
    Wisconsin
    Posts
    2,120
    Sorry to press the matter, but this is an absurdly silly project. It's common practice to FTP/SFTP to put/get files from a web server. If you have issues securing your local machine, fooling around with Perl scripts to manage the process isn't going to help.

    We run AVG at my office and have no issues. We're even using straight FTP; the remote server doesn't support SFTP.

    Stop fooling around and get your local machine in order. If it's been compromised, securing your FTP client is undoubtedly the least of your concerns.
    Jon Wire

    thepointless.com | rounded corner generator

    I agree with Apple. Flash is just terrible.

    Use CODE tags!

  9. #9
    Join Date
    Jun 2008
    Posts
    223
    Empty folders problem solved.
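
    For anyone who lands here later: one way to avoid those extra path folders is to pass addTree a second argument, which sets the name the tree is stored under inside the archive. A minimal sketch, reusing the same placeholder paths as above:

    Code:
    use strict;
    use warnings;
    use Archive::Zip qw( :ERROR_CODES );
    
    my $buf = "f:/xyz/apache/cgi-bin";
    my $bud = "$buf/some/folder";
    my $bua = "$buf/ready";
    
    my $zip = Archive::Zip->new();
    # Second argument: store the tree under "folder/" instead of the full path.
    $zip->addTree($bud, 'folder');
    $zip->writeToFileNamed("$bua/folder.zip") == AZ_OK
        or die "could not write $bua/folder.zip";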
    Last edited by edatz; 04-08-2012 at 12:35 PM.
