It would take too much typing to explain in full, but in a nutshell: work out the number of milliseconds in a standard day, which is 24 * 60 * 60 * 1000 = 86,400,000.
From this you can work out an average for a year.
Keep in mind the accuracy will be off by as much as 6 hours per year, because leap years introduce roughly 24 hours of variance every 4 years: 24 / 4 = 6 hours per year on average.
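To make the arithmetic above concrete, here is a minimal sketch of those constants in JavaScript (the constant names are my own, not from any library):

```javascript
// Milliseconds in a standard day: 24 hours * 60 minutes * 60 seconds * 1000 ms.
const MS_PER_DAY = 24 * 60 * 60 * 1000; // 86,400,000

// Average year, spreading the leap day (24 extra hours) across 4 years.
const MS_PER_YEAR = 365.25 * MS_PER_DAY;

// If you used 365 days flat instead, the average drift would be:
const AVG_DRIFT_HOURS_PER_YEAR = 24 / 4; // 6 hours per year

console.log(MS_PER_DAY);              // 86400000
console.log(MS_PER_YEAR);             // 31557600000
console.log(AVG_DRIFT_HOURS_PER_YEAR); // 6
```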
I am currently working on a way of ironing out this difference, and I suggest you look at improving the accuracy too.
What I suggest is that you start from one large number, the total in milliseconds, then deduct years, months, weeks, etc. as millisecond values.
Once the script is functioning the way you want, you can then start improving it.
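The subtract-as-milliseconds approach could be sketched like this (the `breakDown` helper is my own illustration; months are skipped here because their length varies, which is part of the accuracy problem mentioned above):

```javascript
const MS_PER_SECOND = 1000;
const MS_PER_MINUTE = 60 * MS_PER_SECOND;
const MS_PER_HOUR   = 60 * MS_PER_MINUTE;
const MS_PER_DAY    = 24 * MS_PER_HOUR;
const MS_PER_WEEK   = 7 * MS_PER_DAY;
const MS_PER_YEAR   = 365.25 * MS_PER_DAY; // average year; refine later

// Repeatedly take out the largest unit, then subtract it in milliseconds.
function breakDown(totalMs) {
  let remaining = totalMs;

  const years = Math.floor(remaining / MS_PER_YEAR);
  remaining -= years * MS_PER_YEAR;

  const weeks = Math.floor(remaining / MS_PER_WEEK);
  remaining -= weeks * MS_PER_WEEK;

  const days = Math.floor(remaining / MS_PER_DAY);
  remaining -= days * MS_PER_DAY;

  const hours = Math.floor(remaining / MS_PER_HOUR);
  remaining -= hours * MS_PER_HOUR;

  return { years, weeks, days, hours, remainingMs: remaining };
}

// Example: exactly 1 average year + 1 week + 1 day + 1 hour.
console.log(breakDown(MS_PER_YEAR + MS_PER_WEEK + MS_PER_DAY + MS_PER_HOUR));
// { years: 1, weeks: 1, days: 1, hours: 1, remainingMs: 0 }
```

Getting this crude version working first gives you something to test your later accuracy improvements against.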