I'm a web/database developer in a specific office at a large academic medical center. Our central ITS doesn't give "field" people cPanel access to the web servers, so I can't schedule cron jobs for things like MySQL backups.
They do back up for us, but onto tape, which is of questionable reliability, and restoration can take days or weeks since they serve the whole institution.
I've been manually dumping to .sql through phpMyAdmin, but what I'd really like to do is automate this.
I'd like to create a PHP script to do the dump, but somehow call it from a desktop machine or by some other remote method. I have my own personal web server for various other things. Is there any way I could schedule that other server, or a desktop PC, to call a backup script in that way?
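One common pattern for exactly this situation is to protect the dump script with a secret token and then hit its URL on a schedule from any machine that does have cron (your personal server would do). A sketch, where the URL, token, and schedule are all hypothetical:

```shell
# Hypothetical crontab entry on the personal server (or any box with cron):
# every night at 2:30, fetch the protected backup URL and discard the output.
# The token keeps the script from being triggered by casual visitors;
# serving it over HTTPS keeps the token off the wire.
30 2 * * * curl -fsS "https://work-server.example.edu/backup.php?token=SECRET" >/dev/null 2>&1
```

The PHP script on the work server would check the token, run the dump (e.g. via `mysqldump` or a query loop), and write the result somewhere you can retrieve it.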
If you are the developer, you can create a page that displays the table and scrape the data off the page into a text file. If you want to save it so that it can be restored, you will need to wrap the data in the appropriate SQL (INSERT INTO, etc.).
In that case, the only thing I could suggest is that you do as I suggested before, but instead write the information to a file on the server. You could give it an extension of .dat. Another page could be set up to give you the option of downloading the file.
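If the dump does land in a web-readable file like that, pulling a copy off-site can also be automated from the remote box. A sketch (the URL and paths are assumptions), keeping dated copies so a bad dump never overwrites the last good one:

```shell
# Hypothetical nightly fetch of the server-side dump to local, dated storage.
out="backup-$(date +%Y-%m-%d).dat"
curl -fsS "https://work-server.example.edu/dumps/backup.dat" -o "$HOME/backups/$out"
```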
You won't be able to automate it from the web page alone; you would have to access the page each time. The database won't automate it either without some form of trigger. If you are trying to bypass the DBAs, then you will have to do a workaround.
Here's what I'd do. Include a function call with an onLoad event handler to a function in an external file. Then have that file call a PHP script containing the queries to back up the database, with a conditional so it runs only once per day and only on certain days (e.g., the 1st, 7th, 14th, 21st, and 28th of every month).
Put that on a frequently visited page, and voila. It's a chance-based approach, but if your page gets at least one hit a day you'll be fine.
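The "only on certain days, only once" guard in that suggestion can be sketched in shell to show the logic (the poster suggested PHP; the day list and stamp path here are illustrative assumptions):

```shell
#!/bin/sh
# Sketch of the run-once-per-listed-day guard described above.
# should_backup DAY STAMPFILE -> succeeds only on listed days of the month,
# and only if the stamp file (touched after a successful run) is absent.
should_backup() {
  day=${1#0}          # strip a leading zero so "07" matches "7"
  stamp=$2
  case "$day" in
    1|7|14|21|28) [ ! -e "$stamp" ] ;;
    *) return 1 ;;
  esac
}

# Typical call: if it succeeds, run the dump and touch the stamp so later
# hits on the same day become no-ops.
if should_backup "$(date +%d)" "/tmp/backup-$(date +%Y-%m-%d).done"; then
  : # mysqldump the database here, then touch the stamp file
fi
```

The stamp file is what makes repeated page hits on the same day harmless: only the first visitor after midnight on a listed day actually triggers the dump.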