Browser freeze when manipulating the DOM?
Basically I've got a form that a user can either fill out from scratch or populate from forms they have previously saved. If they select a stored form, I iterate through the input fields in the existing form, remove them using parentElement.removeChild(this), and then add the new fields in. This works fine when the user's saved form only contains a handful of items (< 10), but when it contains 10+ items the browser freezes for 10 seconds while it adds the fields to the form. I tried showing a little loading gif while the form is being loaded, but that freezes too.
My general (and sadly not specific) question is: is there a way to do this more efficiently? E.g., clone the existing form, operate on the clone, and then replace the existing form with the modified clone? (I'm not even sure that would provide any benefit; I'd try it out to test it, but I'm not quite certain how that would work.)
Any insight is greatly appreciated.
Every time you make (visible) changes to the DOM, the browser has to re-render the page. It shouldn't be a problem in your case, though - I have never run into performance problems like that unless manipulating a huge number of nodes. Are you sure your code for iterating is working as it should? The problem might lie in a loop that isn't properly constructed... just a guess.
Thanks much - I appreciate the input. I don't *think* the issue lies in a poorly constructed loop; unfortunately, I'm modifying an existing internal app, and the DOM manipulation that is slowing everything down is handled by code that predates my tenure on the project. Basically, it uses Prototype.js to serialize a form, then takes that result and deserializes it, iterating over the elements and replacing/attaching them to the DOM. This looks to be what is consuming the vast majority of the time in the action.
Sadly, I know that isn't much to go on when offering guidance. I'm trying to dig through the code that handles this now, but not being a JS expert, it'll take me a while to actually notice if something is being handled in a less than efficient manner (I'd copy and paste, but I'm at home).
Is it possible to clone the form I'm trying to replace, modify the clone, and then swap it in for the existing form wholesale? Would that even provide any sort of performance benefit?
It is more efficient to write a new value to a form element's value attribute than to remove/replace childNodes using the DOM. But 10 replaceChild calls should not cause a noticeable delay - something else is going on.
Are you using childNodes.length as the loop limit while you change that length by adding/deleting nodes?
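To make that loop bug concrete, here's a sketch of it. The `makeParent` helper below is just a stand-in for a DOM parent node (so the snippet runs outside a browser); its `childNodes` getter behaves like the real thing in the one way that matters here - it's a live list that shrinks as you remove children, which is exactly what trips up a forward loop.

```javascript
// Stand-in for a DOM parent: `childNodes` is "live", i.e. it reflects
// removals immediately, just like a real NodeList does.
function makeParent(n) {
  return {
    children: Array.from({ length: n }, (_, i) => ({ id: i })),
    get childNodes() { return this.children; },
    removeChild(node) {
      this.children.splice(this.children.indexOf(node), 1);
    },
  };
}

// Buggy: reads childNodes.length while the list shrinks underneath it,
// so the loop skips every other node and leaves half of them behind.
function clearForward(parent) {
  for (let i = 0; i < parent.childNodes.length; i++) {
    parent.removeChild(parent.childNodes[i]);
  }
}

// Fixed: iterate backwards (or repeatedly remove the first child), so
// the shrinking length never causes a node to be skipped.
function clearBackward(parent) {
  for (let i = parent.childNodes.length - 1; i >= 0; i--) {
    parent.removeChild(parent.childNodes[i]);
  }
}
```

A bug like this wouldn't explain a 10-second freeze by itself, but it would leave stale fields in the form and could make surrounding code do far more work than expected.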
Last edited by mrhoo; 07-18-2007 at 03:18 PM.
I don't know about the performance of cloning. What you can do is build your new form without appending the elements until the form is complete:
Create element 1
Create element 2
...
Create element n
Append the elements to the form, and finally append the form to the document. Then the browser only has to re-render once.
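That batching pattern can be sketched as follows. The "form" and "fields" here are plain-object stand-ins (so the snippet runs outside a browser) that count how many times the live form is touched; in real browser code the off-DOM batch would be a `DocumentFragment`, built with `document.createDocumentFragment()` and appended to the form in a single call.

```javascript
// Stand-in for a form that's already in the document: every appendChild
// on it is a "touch" that could trigger a re-render in a real browser.
function makeLiveForm() {
  return {
    nodes: [],
    touches: 0,
    appendChild(node) {
      this.touches += 1;
      // A DocumentFragment empties its children into the parent when
      // appended; mimic that for batch objects.
      if (node.children) {
        this.nodes.push(...node.children);
      } else {
        this.nodes.push(node);
      }
    },
  };
}

function buildFields(n) {
  return Array.from({ length: n }, (_, i) => ({ name: 'field' + i }));
}

// Naive: append each new field straight to the live form - n touches,
// and potentially n re-renders.
function appendOneByOne(form, fields) {
  for (const f of fields) form.appendChild(f);
}

// Batched: collect the fields off-DOM first, then append once - 1 touch.
// In a browser the batch would be a DocumentFragment.
function appendBatched(form, fields) {
  const batch = { children: fields.slice() };
  form.appendChild(batch);
}
```

The design point is simply that work done on detached nodes is invisible to the renderer; only the final append to an in-document node costs a re-render.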
@mrhoo - I'll have to check that in the morning; I don't recall anything like that being done.
@Dok, I was thinking of something like that, but it just seems so darned frustrating that something so trivial should freeze the whole browser - replacing 25-odd DOM elements doesn't seem like it should lock up the browser for 20-odd seconds (or whatever the exact time period was).
I'll have to dig in to the existing code tomorrow and post back with some more questions...
Just realized, as I was about to crash, that it might be helpful to add some more context.
The form that gets serialized can have a dynamic number of fields: users can add additional search constraints by clicking a button. The sticky wicket in the deserialization method written by my predecessors was dealing with those dynamic form elements. I'm not sure that provides anything useful toward fix suggestions, but hey, it might clue someone in (because it certainly hasn't clued me in).
Well, I still haven't found the spot causing the slowdown, but I thought it was interesting to note that it's actually quicker to set a bunch of breakpoints in the Firebug debugger and just "play" through all of them than it is to let the script load all the elements outright.
Is there any reason why this would be the case? Some sort of timing sensitivity in the script somewhere?