Browser Cookies speed test
Just for fun, there are two awards I've given out: "Cookie Monster" and "Ginger Cookie" (because I love ginger cookies).
C.M. is based on a browser's ability to manage a large pile of cookies fast.
G.C. is based on a browser's ability to manage just one cookie, really fast.
The G.C. award exists because a site normally has at least one cookie (e.g. a user or session ID), and if you really wanted to you could make everything work with just that one cookie.
Both awards' ratings are relative to the leader, so they fall on a scale of 0.0 to 1.0 where higher is better.
The computer used is a Dell Inspiron 9100: 3.2GHz P4, 2GB RAM, WinXP. It's loud, bulky, not very energy efficient, and old, but it has held up very well over the past half-decade. I hope your spouse has done better over the same stretch of your marriage.
Seriously though, props to Dell for this model.
BATCH LOOP: 100 R & W function calls
COOLDOWN: 1 second
REPEATED: 100 times
(there is a jpg file with charts attached)
Basically the test takes 100 seconds. Each second, the read and write operations are repeated 100 times in a loop. The cookie used was predefined and created so that it would sit at the end of the cookie string, so that having many cookies would affect the read operation.
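The harness itself wasn't posted, so here's a rough sketch of the loop described above. The names (timeBatch, runBenchmark, readFn, writeFn) are mine, not from the original test; plug in whatever R and W functions you're timing.

```javascript
// Time one batch: N back-to-back read+write calls, return elapsed ms.
function timeBatch(readFn, writeFn, calls) {
  var start = Date.now();
  for (var i = 0; i < calls; i++) {
    readFn();
    writeFn();
  }
  return Date.now() - start;
}

// Repeat the batch with a 1-second cooldown between runs (the COOLDOWN
// above), then hand the list of per-batch timings to a callback.
function runBenchmark(readFn, writeFn, batches, calls, done) {
  var times = [];
  (function next() {
    times.push(timeBatch(readFn, writeFn, calls));
    if (times.length < batches) setTimeout(next, 1000); // cooldown
    else done(times);
  })();
}
```

With batches = 100 and calls = 100 this matches the BATCH LOOP / COOLDOWN / REPEATED setup above; averaging the 100 batch timings gives the R+W figures in the tables.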
The CPU usage is a rough estimate, so don't take it to heart. I just used Task Manager and gathered several values over the course of the test. It is a VERY rough gauge. I would just like to mention that IE8 did peak above 10% quite a few times.
IE8 Internet Explorer 8
OP9 Opera 9
OPX Opera 10 beta
SF4 Safari 4
FF3 FireFox 3
GC2 Google Chrome 2
I used the max number of cookies each browser would allow, but Safari has no hard max, so I did several tests with it.
      R(ms)  W(ms)  CPU  #Cookies  CPC   C.M. Rating
OP9:   33.4   36.7  ~7%     30     2.34  0.19  :o
OPX:   29.1   35.1  ~7%     30     2.14  0.20  :[
IE8:   16.5   51.8  ~8%     50     1.37  0.32  :[
SF4:   38.8   12.4  ~5%     50     1.02  0.43  :(
SF4:   70.1   11.9  ~6%    100     0.82  0.53  :|
SF4:  103.5   12.0  ~9%    150     0.77  0.57  :|
FF3:   14.3   17.7  ~6%     50     0.64  0.69  :]
GC2:   24.5    6.2  ~6%     70     0.44  1.00  :D
Opera, I respect your innovation and sexy interface, but the cookie thing is sad.
IE definitely enjoys picking its nose during writes... ouch.
Safari's read time grows linearly with the cookie count, which is expected, but the slope is a tad high.
FireFox is on Fire with the reads!
Google Chrome has amazing overall performance. However, it does exhibit very strange cookie creation behavior. When I created 70 cookies in a loop I got 70 saved, and 69 got 69, but 71 got 51, 72 got 52... 90 got 70, etc. It's very strange, and I should probably penalize GC for being a tad weird like that, perhaps treating its limit as 50 and not 70. However, I'm sure that will be fixed soon because it's a relatively new browser.
"CPC" is Cost-Per-Cookie and I calculate it as (R+W)/Cookies.
If I could gauge the CPU usage more accurately I'd apply it as a tax on top of the R+W values, but I can't.
Also, one could argue that reading cookies is more important than writing them, because reads happen more often as the user browses, so the read value should carry more weight in the CPC. I say it all depends on the site, so I don't feel right making that weight up.
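For reference, the CPC formula as defined above, written out as a trivial helper (the function name is mine, not from the post):

```javascript
// "Cost-Per-Cookie": total read+write time in ms divided by the number
// of cookies in play. Lower is cheaper.
function costPerCookie(readMs, writeMs, cookieCount) {
  return (readMs + writeMs) / cookieCount;
}
```

For example, Chrome's row works out to (24.5 + 6.2) / 70 ≈ 0.44, matching the table.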
Since Safari is kind enough to have no hard limit on the number of cookies, I ran the test with 500 of them. Note that sending too many cookies to a server can cause errors: the request will be rejected because the headers become too large.
      R(ms)  W(ms)  CPU   #Cookies
SF4:  350.0   12.0  ~21%    500     (first ~15 seconds)
SF4:  871.7   29.6  >50%    500     (after ~15 seconds)
The slowdown will happen in any browser, and not just when dealing with cookies. I've seen this "breaking point" hit at around >300ms in other code benchmarks with these browsers. The code runs at a steady rate for a little while, then it starts to spike sharply, and can reach well over 1 second, during which the page is practically unresponsive. So be careful: the JS architecture needs a cooldown to clean up and run smoothly.
Here is the Ginger Cookie test (just 1 cookie)
BTW it wasn't until I finished writing this post that I noticed Google Chrome and Ginger Cookie have the same initials.
      R(ms)  W(ms)  CPC   G.C. Rating
OP9:   33.2   36.2  69.4  0.26  :o
IE8:   16.6   45.5  62.1  0.29  :[
OPX:   27.2   34.8  62.0  0.29  :[
FF3:   10.8   18.2  29.0  0.62  :]
SF4:   10.9   12.0  22.9  0.79  :)
GC2:   12.8    5.2  18.0  1.00  :D
Also I was originally considering calling the award the Golden Cookie award, to the same effect no doubt. But you can't eat a Golden Cookie, you gotta sell it first and buy edible cookies.
Final Score (C.M. & G.C. average)
You'll notice multiple cookies hardly have an effect in some browsers during reads. This is due to the code I'm using for getting the cookie. I've avoided using the typical split by ";" method. The code I like only uses indexOf and substring to extract the cookie information. Doing it this way saves system resources by not making potentially unused array elements and is overall faster because the operations are simpler.
There is one functional limitation of this method in IE and other browsers that omit the '=' for empty cookies: you get undefined rather than an empty string for the empty cookie, even though its name exists. I'm OK with that because it's hardly make-or-break functionality. Furthermore, most *smarter* browsers leave the '=' in because they know leaving it out causes more trouble than keeping it.
It's a backwards feature, and more of a glitch really.
Here is the get code I like to read cookies with:
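(The exact snippet didn't survive, so take this as a hedged sketch of the indexOf/substring approach: the cookie string is passed in as a parameter so you'd hand it document.cookie in a browser, and encodeURIComponent stands in for the TSpecials.encode helper mentioned below, which isn't defined in this thread.)

```javascript
// Sketch of an indexOf/substring cookie getter: no split(";"), no arrays.
// cookieStr is the raw cookie string (document.cookie in a browser).
function getCookie(cookieStr, name) {
  // Cookies are separated by "; ", so searching for " name=" avoids
  // matching a cookie whose name merely ends with ours (e.g. "ab" vs "b").
  var n = ' ' + encodeURIComponent(name) + '=';
  var s = ' ' + cookieStr; // pad so the very first cookie also matches
  var i = s.indexOf(n);
  if (i < 0) return undefined; // absent, or an empty cookie with no '='
  i += n.length;
  var j = s.indexOf(';', i);   // value runs to the next ';' or the end
  if (j < 0) j = s.length;
  return decodeURIComponent(s.substring(i, j));
}
```

Note this returns undefined for an empty cookie stored without an '=', which is exactly the IE quirk described above.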
I've come to believe TSpecials is what should be used on cookie names and values. Here is more about TSpecials.
n=' ' + TSpecials.encode(n) + '=';
The document.cookie model is a garbage implementation; it's been the same for a long time now and has issues and limitations. Browsers should expose their cookie-manipulation functions through a standard naming convention so that we can read all cookie attributes, and find cookies by path too. I hope the W3C has plans to come up with that new standard...
Last edited by qak_duk; 06-25-2009 at 04:27 AM.
localStorage and globalStorage are already out...
Originally Posted by qak_duk
globalStorage[document.domain] == localStorage;
Thanks dude, I'll look into that.
Though unlike cookies, that seems to be for local client storage, not data shared with the server.
I'd rather go with Flash's local storage; it's shared across every browser the client has, making it a very flexible caching mechanism.
Last edited by qak_duk; 06-25-2009 at 11:48 PM.
Originally Posted by qak_duk
if there was a way to do this for free, i think this could be a VERY popular and viable solution. if you get something up and running and can share, it would be a virtuous and benevolent act to post code or links.
please and thank you.
It's just an idea I've been toying around with in my mind for a while now. If I ever did make a working draft of some kind I would share it for sure.
The concept would be very friendly with textual resources: actual web page HTML, user data, work data, even CSS and JS code. Everything can be loaded through Flash and JS interacting, and if one browser gets it, they all do, because it's a common data pool.
On a side note:
One thing that really bugs me about the way browsing works these days is when you view a different page all JS code is reset. It would be nice if the bigger code snippets didn't have to re-initialize themselves all the time.
One option would be to have a global domain JS sandbox, where the code state persists as long as the user has at least one tab open of that site. That way your most common libraries are active and ready and willing. This functionality would also rock because it would allow various tabs to interact with each other quite easily.
The other option I wish for is that browsers implement a way to just *passively* modify the address bar: anything, so long as you are still in that domain. It would be necessary to specify how the URL change affects the browser's history and back/forward buttons.
Currently you can only modify the "#" part of the URL without reloading the page. While this can be made to deliver the correct content if you pass the URL on to a friend, search engines don't care about anything past the hash sign.
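As a sketch of that hash technique: the view state lives after '#', and you restore it on load. Parsing is parameterized here so it's easy to follow; in a browser you'd feed it location.href (and note the hashchange event wasn't yet reliable across 2009-era browsers, so polling location.hash was the common workaround).

```javascript
// Pull the view state out of a URL's fragment, e.g.
// "http://example.com/gallery#photos/42" -> "photos/42".
// The "photos/42" scheme is invented for this sketch.
function viewFromUrl(url) {
  var i = url.indexOf('#');
  return i < 0 ? '' : url.substring(i + 1);
}
```

On page load you'd call viewFromUrl(location.href) and render the matching view with JS, so a shared link lands on the right content even though the server only ever saw the pre-hash part.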
If you could actually change the real address stuff without reloading the page you would accomplish:
A) being search engine friendly
B) making your website conform to the URL browsing flow (sharing the current view as a link) and
C) having site code that is not constantly going to bed and waking up every time the user changes the page (avoid re-initializing).
This would make a HUGE difference for sites that have a huge utility code base and are AJAX heavy.
Last edited by qak_duk; 06-26-2009 at 12:56 AM.
If you reuse the <head>, you could loop through all the links to local pages and rewrite them with AJAX commands that load the link's data and replace the <body> of the current page with the extracted <body> of the linked page.
Change document.title and voila.
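A rough sketch of that idea follows. extractBetween is a helper invented for this sketch, and its string slicing is naive (it assumes one body/title per page); the XHR wiring is browser-only and shown for shape, not as a hardened implementation.

```javascript
// Grab the text between an opening tag (attributes allowed) and its
// closing tag, using only indexOf/substring. Returns '' if not found.
function extractBetween(html, openTag, closeTag) {
  var i = html.indexOf('>', html.indexOf(openTag)); // end of opening tag
  var j = html.lastIndexOf(closeTag);
  return (i < 0 || j < 0) ? '' : html.substring(i + 1, j);
}

// Browser-only wiring: fetch a same-site page and swap in its body/title
// without a full reload. You'd call this from rewritten link handlers.
function loadPage(url) {
  var xhr = new XMLHttpRequest();
  xhr.open('GET', url, true);
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
      document.body.innerHTML = extractBetween(xhr.responseText, '<body', '</body>');
      document.title = extractBetween(xhr.responseText, '<title', '</title>');
    }
  };
  xhr.send(null);
}
```

Since the <head> (and all the JS loaded in it) survives each swap, your libraries stay initialized across "page" changes, which is the whole point of the trick.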