I don't know scripting languages, but with the help of many people this code was created step by step as a solution to my case. This PHP file backs up the camera image files via a cron job running every minute.
<?php
// Source folder with the uploaded camera images and destination folder
// for the timestamped copies
$src_folder  = '/home4/user2/public_html/.mydomain/cam';
$dest_folder = '/home4/user2/public_html/.mydomain/cam/old';

if (!is_writable($dest_folder)) {
    die("$dest_folder not writable");
}

// Base names of the four camera image files
$cams = array('cam_office1', 'cam_office2a', 'cam_office2b', 'cam_home');

foreach ($cams as $cam) {
    if (file_exists("$src_folder/{$cam}.jpg")) {
        // Copy with a timestamp suffix; 'H' (24-hour) avoids am/pm name collisions
        copy("$src_folder/{$cam}.jpg", "$dest_folder/{$cam}_" . date('ymdHis') . ".jpg");
    } else {
        echo "can't find file {$src_folder}/{$cam}.jpg<br />";
    }
}
?>
But there are 2 problems with this solution:
1) In one minute there can be 10 different uploads, but this way I can catch and copy only 1 image in that time period.
2) If there is no change in the camera image file (no new upload triggered by detected motion), it is useless to save a copy of the image every minute.
For the first item I think it is more difficult to find a solution, because the script would have to be triggered by every new upload, and that is not easy to do.
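One rough workaround, just as a sketch: since cron cannot run more often than once per minute, the script itself could poll several times inside the one-minute slot. Here backup_cams() is a hypothetical wrapper around the copy loop above, not a real function from my script:
<?php
// Sketch only: poll every 5 seconds within the one-minute cron window,
// so uploads that arrive between cron runs are not missed.
$end = time() + 55;        // stay just under the next cron run
while (time() < $end) {
    backup_cams();         // hypothetical wrapper around the copy loop above
    sleep(5);              // polling interval; adjust as needed
}
?>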
But for the second item there is still something that can be done. For example, these 2 things below could help to manage the high quantity of uploaded files:
a) An addition to the script that checks whether the file is still the same or not. If it is the same, no copy will be saved, even though the script runs every minute. (A sketch for this follows after the folder examples below.)
b) Modifying the current code to save the image files every hour into a newly created folder instead of one fixed folder; then it would be possible to manage the many saved files. (A sketch for this follows as well.)
I mean, for every day one parent folder for that day would be created automatically by the PHP file on the first file save after midnight. Folders as below:
121116
121117
121118
.......
And in each daily folder, 24 subfolders would be created one by one by the PHP file whenever the time comes to save an image belonging to a new hour, as:
00
01
02
03
....
...
22
23
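For (a), a minimal sketch of the unchanged-file check, assuming a small sidecar file per camera that stores the MD5 hash of the last backed-up image (the sidecar file name is my own invention):
<?php
// Sketch for (a): skip the copy when the image has not changed since the
// last run. A sidecar file per camera stores the previous MD5 hash.
$src_folder  = '/home4/user2/public_html/.mydomain/cam';
$dest_folder = '/home4/user2/public_html/.mydomain/cam/old';
$cams = array('cam_office1', 'cam_office2a', 'cam_office2b', 'cam_home');

foreach ($cams as $cam) {
    $src = "$src_folder/{$cam}.jpg";
    if (!file_exists($src)) {
        echo "can't find file {$src}<br />";
        continue;
    }
    $hash     = md5_file($src);
    $hashfile = "$dest_folder/{$cam}.last_md5";   // hypothetical sidecar file
    // Copy only when the content differs from the last saved copy
    if (!file_exists($hashfile) || trim(file_get_contents($hashfile)) !== $hash) {
        copy($src, "$dest_folder/{$cam}_" . date('ymdHis') . ".jpg");
        file_put_contents($hashfile, $hash);
    }
}
?>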
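And for (b), a sketch of the day/hour folder creation; mkdir() with its recursive flag set can create both folder levels in one call:
<?php
// Sketch for (b): save into <dest>/<yymmdd>/<HH>/, creating folders on demand.
$dest_root = '/home4/user2/public_html/.mydomain/cam/old';
$day_dir   = $dest_root . '/' . date('ymd');   // daily folder, e.g. 121116
$hour_dir  = $day_dir . '/' . date('H');       // hourly subfolder, 00 .. 23

if (!is_dir($hour_dir)) {
    mkdir($hour_dir, 0755, true);              // true = create parent folders recursively
}

// Then copy as in the script above, but into the hour folder:
// copy($src, "$hour_dir/{$cam}_" . date('ymdHis') . ".jpg");
?>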
If anyone can help me with modifying the file, thanks very much in advance.