
Thread: loading a table

  1. #1
    Join Date
    Mar 2009
    Posts
    36

    loading a table

    I am looking for a way to have my table start appearing in the browser while the data is still being loaded.

    The tricky part is that for each row in the table the script does some work,
    and that work can take 1-3 seconds per row.

    While the user is waiting for the complete result, it would be great if the rows collected so far were already visible on the page.
    I have been looking at flush() and ob_flush(), but for some reason they don't do the trick.

    Any help pointing me in the right direction is more than welcome.
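
    For reference, this is roughly the pattern I have been trying (a minimal sketch; do_slow_work() is a stand-in for the real per-row job):

        <?php
        // Send each row to the browser as soon as it is built.
        while (ob_get_level() > 0) {   // close any output buffers PHP opened
            ob_end_flush();
        }
        ob_implicit_flush(true);       // flush automatically after every echo

        echo "<table>\n";
        for ($i = 0; $i < 200; $i++) {
            $row = do_slow_work($i);   // stand-in for the 1-3 second job
            echo '<tr><td>' . htmlspecialchars($row) . "</td></tr>\n";
            flush();                   // push this row down the wire now
        }
        echo "</table>\n";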

    /Fons

  2. #2
    Join Date
    Jul 2009
    Location
    Falls Church, Va.
    Posts
    780
    Couple of comments...

    What on Earth takes 3 seconds to load for a single row? Explain the data involved and why the load time is so long (technically: is a database involved?).

    Why do I ask? Sometimes the logic behind how data is organized and displayed deserves re-evaluation; we might be able to offer suggestions that reduce load times and make the data easier for the end user to browse. Sometimes it's as simple as pagination. Follow?

    If your answer above includes graphics or links, keep in mind that tables are intended for tabular data, not any other type of content. Should you be using tables at all?

    Now for some generic advice, without knowing what exactly you are doing:

    1) Introduce a caching mechanism
    2) Optimize any database queries involved
    3) If images are involved, are they optimized as well? Nothing crazy like creating thumbnails simply by resizing full-size images via HTML attributes?
    4) Consider using Ajax to dynamically load paginated content, keeping track of the offset so content is loaded one page at a time instead of all at once (see the sketch after this list; plus it's cool looking)
    5) Your server is ridiculously slow (low on resources, shared, free webhost, etc.)
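
    To illustrate item 4, the server side of such an endpoint might look roughly like this (a sketch only; the DSN, table, and column names are made up):

        <?php
        // page.php -- returns one page of table rows as an HTML fragment,
        // requested via Ajax with an increasing ?offset= value.
        $pageSize = 20;
        $offset   = isset($_GET['offset']) ? max(0, (int)$_GET['offset']) : 0;

        $pdo  = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
        $stmt = $pdo->prepare('SELECT name, value FROM items ORDER BY id LIMIT :lim OFFSET :off');
        $stmt->bindValue(':lim', $pageSize, PDO::PARAM_INT);
        $stmt->bindValue(':off', $offset, PDO::PARAM_INT);
        $stmt->execute();

        foreach ($stmt as $row) {
            echo '<tr><td>' . htmlspecialchars($row['name']) . '</td><td>'
               . htmlspecialchars($row['value']) . "</td></tr>\n";
        }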

    So consider this a way to get the conversation started, but we need more input and info from you, no doubt.

    -jim

  3. #3
    Join Date
    Mar 2009
    Posts
    36
    Jim, thanks for your reply.

    Loading can take between 1 and 3 seconds per row because the script reads external data in XML format.
    The timeout per request is 5 seconds and is rarely hit (99.99% of requests succeed).

    Using a table is required to format the result data correctly.

    For example, the script looks like this:
    - fetch data from the external source with cURL
    - parse the XML data
    - save the XML data in the database if updates are needed
    - format the collected data as plain HTML (no links etc.)
    - echo the formatted data
    - back to step 1 until all data has been parsed (about 200 iterations)

    As you can see, nothing really fancy is going on. The only thing I am looking for is a way to present (echo) each row as soon as its parsing is done,
    for each of the ~200 times the script collects data from the external source.
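
    In code, one pass looks roughly like this (simplified; the feed URL, the save_if_updated() helper, and the title node are stand-ins for my real ones):

        <?php
        // One iteration of the loop; the script repeats this ~200 times.
        $ch = curl_init('http://example.com/feed.xml');   // placeholder URL
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);   // return, don't print
        curl_setopt($ch, CURLOPT_TIMEOUT, 5);             // the 5 second timeout
        $raw = curl_exec($ch);
        curl_close($ch);

        $xml = simplexml_load_string($raw);               // parse the XML
        save_if_updated($xml);                            // stand-in: DB update check
        echo '<tr><td>' . htmlspecialchars((string)$xml->title) . "</td></tr>\n";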

    Hope this gives you better insight into what I am looking for.
    Including Ajax seems like a good option; I'm diving into that as we speak.
    /Fons

  4. #4
    Join Date
    Jul 2009
    Location
    Falls Church, Va.
    Posts
    780
    This part of my answer is based on using your existing procedures:

    I'm leaning towards caching and also, if possible, using a cron job to do all the work on a schedule during off hours and save the output as a .php file or whatever. Caching implies a simple method: compare the cached file's timestamp to determine its age, then either display the cached file or reload the data accordingly.
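
    The age check itself is only a few lines, something like this (a sketch; the cache path and maximum age are arbitrary, and build_table_from_feed() stands in for the full job):

        <?php
        // Serve the cached copy while it is fresh; otherwise rebuild it.
        $cacheFile = '/tmp/table_cache.html';   // arbitrary location
        $maxAge    = 3600;                      // one hour; tune to the data

        if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < $maxAge) {
            readfile($cacheFile);               // cheap path: stream the file
        } else {
            $html = build_table_from_feed();    // stand-in for the 200-request job
            file_put_contents($cacheFile, $html);
            echo $html;
        }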

    As to improving your existing procedures, let me think about it some more.

    Please wait for other replies, this is just one opinion of many. I'd like to hear what others have to say on this one. Thanks for explaining.

    -jim

  5. #5
    Join Date
    Jul 2009
    Location
    Falls Church, Va.
    Posts
    780
    After some thought, only one idea came to mind for optimizing your procedure (beyond the caching idea), and you might have already thought of it:

    Focusing on this statement you made: "format collected data in HTML coding plain html no links etc" ...

    If you're writing code to generate HTML line by line, that's costly in resources compared to simply taking the XML you've already got and applying a stylesheet to it via XSL. There are instructions here if you're not familiar, and polite apologies if you are; some of the stuff you listed is more advanced than what we usually help with, but others might find the link useful. BTW, this assumes the XML node structure is relatively static, of course. If not, disregard!
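
    In PHP the transform itself boils down to a few lines (a sketch, assuming the XSL extension is enabled; feed.xml and table.xsl are placeholder file names):

        <?php
        // Apply an XSL stylesheet to the fetched XML to produce the HTML table.
        $xml = new DOMDocument();
        $xml->loadXML(file_get_contents('feed.xml'));   // the XML you already have

        $xsl = new DOMDocument();
        $xsl->load('table.xsl');                        // the stylesheet you write

        $proc = new XSLTProcessor();
        $proc->importStylesheet($xsl);
        echo $proc->transformToXML($xml);               // emits the finished table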

    As to Ajax, it doesn't speed anything up now that I know your procedure, but it will, without doubt, make the page rank higher on the "cool looking" scale! In truth it's overhead; I suggested it on the assumption that the user could query the data and control what was loaded from the database (i.e. pagination or constrained search results) instead of receiving the complete result set on one page.

    Beyond this, I'm out of ideas and encourage anyone else to chime in as I noted before.

    -jim
    Last edited by SrWebDeveloper; 02-23-2010 at 04:55 PM. Reason: added link

  6. #6
    Join Date
    Jan 2009
    Posts
    3,346
    Is there a reason you have to make ~200 external calls for each call to your script?

    - fetch data from the external source with cURL
    - parse the XML data
    - save the XML data in the database if updates are needed
    - format the collected data as plain HTML (no links etc.)
    - echo the formatted data
    - back to step 1 until all data has been parsed (about 200 iterations)
    Would there be a way to reduce that to a single call that retrieves all ~200 results in one go? Combined with the caching of data already suggested, you should have little to no wait time.

  7. #7
    Join Date
    Mar 2009
    Posts
    36
    I guess we are losing the key question (while digging up some other really cool stuff in between).

    - fetch data from the external source with cURL
    - parse the XML data
    - save the XML data in the database if updates are needed
    - format the collected data as plain HTML (no links etc.)

    - echo the formatted data
    - back to step 1 until all data has been parsed (about 200 iterations)

    I have set the steps apart above where my issue is at the moment.
    Collecting the data is as-is; there is no way to do it differently, as the data provider has limits on what he provides.

    The key issue is that I would like the script to actually present each row in the user's web browser as soon as that row's data has been processed.

    As said, I tried using flush() and ob_flush(), but they do nothing for me.

    I guess today will be Ajax day, to see if that can provide a solution.
    If anyone knows other ways to get flush() to work, I am open to trying anything.

    /Fons

  8. #8
    Join Date
    Nov 2008
    Posts
    2,477
    It's not a good idea to rely on flush(); there are too many variables involved with the specific server, OS, and client that can affect it (output compression, proxy buffering, and some browsers won't render a table until its closing tag arrives). I would also lean towards an AJAX approach. You will be making many more HTTP requests, but as this page is going to take forever to complete anyway, I'd see that as acceptable.
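
    The server side of that can be a tiny script handling one item per request, roughly like this (a sketch; row.php and both helper functions are invented names):

        <?php
        // row.php -- processes ONE external item and returns its table row.
        // A client-side script requests index 0, 1, ... 199 in turn and
        // appends each returned <tr>, so rows appear as they finish.
        $index = isset($_GET['index']) ? max(0, (int)$_GET['index']) : 0;

        $xml = fetch_and_parse_item($index);   // stand-in: the curl + XML parse step
        save_if_updated($xml);                 // stand-in: the DB update step
        echo '<tr><td>' . htmlspecialchars((string)$xml->title) . "</td></tr>\n";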

    As others have mentioned though, I'd personally be looking at ways to minimize the load time, and caching and cron would be where I'd start.
    The first rule of Tautology Club is the first rule of Tautology Club.
