Once set, setInterval will fire its callback every 'nth' milliseconds that you set it for, on a fixed schedule.
A chained setTimeout will only fire after 'nth' milliseconds have elapsed, and the gap between one call ending and the next timer being re-armed adds extra time on top of the 'nth' milliseconds per call.
Creating and dispatching each timer event also takes a small amount of time, so with setTimeout you could have 250 milliseconds added on top of a 1000 millisecond tick. It is not necessarily your computer that will have the problem; the older PCs that other people may be using are the real consideration.
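To see how that extra time stacks up, here is a small synchronous simulation of the drift described above. The numbers (a 1000 ms tick with a hypothetical 250 ms of handler/interpreter lag on a slow machine) are illustrative assumptions, not measurements:

```javascript
// Simulate 10 ticks of a 1000 ms countdown where each tick's handler
// costs 250 ms on a slow machine (hypothetical figures for illustration).
const DELAY = 1000; // requested interval in ms
const WORK = 250;   // assumed per-tick lag on an older PC

// Chained setTimeout: the next timer is only armed AFTER the handler
// finishes, so every tick drifts by WORK milliseconds.
let chained = 0;
const chainedTimes = [];
for (let i = 0; i < 10; i++) {
  chained += DELAY + WORK; // wait DELAY, then spend WORK in the handler
  chainedTimes.push(chained);
}

// setInterval: ticks are aimed at a fixed grid, so the target times
// stay at exact multiples of DELAY (while WORK < DELAY).
const intervalTimes = [];
for (let i = 1; i <= 10; i++) {
  intervalTimes.push(i * DELAY);
}

console.log(chainedTimes[9] - intervalTimes[9]); // → 2500 (ms behind after 10 ticks)
```

After only ten ticks the chained-setTimeout countdown is two and a half seconds behind the wall clock, which is exactly the kind of visible jump described below.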
You have to remember that some people can't afford new PCs and inherit hand-me-downs, or manage to gather the parts to build one, and these older machines still make up a large share of what's out there. My first PC was such a machine; my second was a self-build out of a mixed bag of new and second-hand parts; my third and fourth PCs were hand-me-downs; my fifth was another home-brew kit of spares I had amassed plus a board that someone had chucked out, whose only problem was that the CPU wasn't properly inserted!!! My sixth PC was a barebones system that I bought new and fitted with new parts.
In CPU terms, I started with an Intel 8086 @ 113 MHz, then an AMD at 650 MHz, then made it past the 1 GHz mark with an Intel 1.3 GHz, and then leapt up to an Intel 3.1 GHz.
What this means is that while architectures have grown fast enough that a modern machine's browser can interpret the script and react quickly, the amount of interpreter lag that gets introduced grows larger the slower the system.
So what happens? Users could very well see your countdown jump by 2 or 3 seconds at a time, depending on how fast the machine is and when the next timer cycle is triggered.
setInterval overcomes a lot of these problems: the interval is initiated once and the browser calls the desired function at that set interval. The only element of lag is in the initial setup of the function; after that the timer is not re-armed on every tick, so the per-call lag doesn't accumulate.
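As a minimal sketch of that approach, here is a countdown wired to a single setInterval. The names (nextSecond, startCountdown, onTick) are illustrative, not any standard API; the per-second step is kept as a plain function so the logic can be checked without waiting in real time:

```javascript
// One countdown step as a pure function: given the seconds remaining,
// return the next value, or null when the countdown has finished.
function nextSecond(remaining) {
  return remaining > 0 ? remaining - 1 : null;
}

// Arm the interval ONCE; the browser then aims for a fixed 1000 ms
// grid instead of re-queuing a fresh timer (plus its lag) every tick.
function startCountdown(seconds, onTick) {
  let remaining = seconds;
  onTick(remaining); // show the starting value immediately
  const id = setInterval(() => {
    remaining = nextSecond(remaining);
    if (remaining === null) {
      clearInterval(id); // stop the timer when we hit zero
      return;
    }
    onTick(remaining);
  }, 1000);
  return id; // caller can clearInterval(id) to cancel early
}

// Usage: startCountdown(10, s => console.log(s + " seconds left"));
```

Note that if the callback ever takes longer than the interval itself, even setInterval will skip or bunch ticks, so keep the per-tick work light.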