Web app best practice for dynamically assigned content
I'm looking to create a web app for delivering post items to users.
Users can be assigned to multiple categories.
Post items can be sent to as many categories as the subject requires.
Post items contain a title, a description (up to 4,000 characters), the categories it was sent to, the publisher's contact details, an issue date, and a deletion date.
When a user logs into the frontend they can view post items assigned to the categories that they are members of.
When they view a post item they can move it to 'archive' to indicate that they've read it, i.e. move it out of their 'inbox'. Items are deleted automatically based on a deletion date set by the post creator.
It's kinda like a centrally-managed email system with automatic deletion.
At times users could have up to a few hundred post items assigned to them, with any number of those in the user's 'archive' folder.
My question is: what is the best way to architect this?
Should it cache an XML file at first logon containing all of the user's post items, which would make searching lightning fast (a filter/search bar will be used quite frequently)? I expect that when a user moves a post item to 'archive', the app would either have to re-create the XML or update the cached XML via Ajax, as well as updating the database.
Or should the entire thing hit the database every time something changes, i.e. whenever a post item gets archived? This would be the easier option to code, I think, but would mean a database hit for every page load or post search.
Any other suggestions are welcome, as are pointers to existing software that does something similar. It will be used in a secure corporate environment, so it cannot rely on anything hosted externally.
I'm not looking for code, only suggestions of best practice for dealing with content dynamically assigned to a user so that it performs well.
There are many caching options available. You can cache at the DB level, within the PHP layer, or at the application level (as you described with XML files). Memcached, Zend Cache, etc. are pre-built caching options. Maybe someone else will jump in here with other suggestions, as I usually make use of the pre-built ones, create static versions (XML, HTML, etc.) of things, and update those static versions as needed.
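To make the "static versions" idea concrete, here is a minimal sketch of the read path: serve a pre-built file if one exists, otherwise build it from the database and save it for next time. This is Python for illustration only (your stack is PHP, but the pattern is identical), and the names `render_inbox`, `fetch_posts`, and `CACHE_DIR` are all hypothetical.

```python
import json
import os

CACHE_DIR = "cache"

def fetch_posts(user_id):
    # Stand-in for the real database query against the post tables.
    return [{"id": 1, "title": "Welcome", "archived": False}]

def render_inbox(user_id):
    path = os.path.join(CACHE_DIR, f"inbox_{user_id}.json")
    if os.path.exists(path):            # cache hit: no database work at all
        with open(path) as f:
            return json.load(f)
    posts = fetch_posts(user_id)        # cache miss: hit the DB once
    os.makedirs(CACHE_DIR, exist_ok=True)
    with open(path, "w") as f:          # store the static version for next time
        json.dump(posts, f)
    return posts
```

Every page load or search after the first then reads the flat file instead of querying the database.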
Given the limited info I've offered so far, and your experience with XML files: would you use PHP to work with the data from the XML file server-side, which would reduce download size but increase processing time? Or would you use XSL to work with the XML data client-side, which I assume means the user would download the XML once, cache it in their temp files, and have the XSL transform it as many times as required?
I use XML files to cache query results from the DB server so I don't have to re-query for things that won't change often; when I need to refresh the cache, I just delete the XML file. Theoretically you could store entire parsed pages as XML and use XSLT to render them for the user. In practice, though, I usually find that one section of most pages updates very frequently, so the largest benefit comes from caching the majority of the page and loading those few truly dynamic bits on the fly every time.
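The invalidation half of that approach can be sketched just as simply: when something changes (e.g. a user archives a post), update the database first, then delete that user's cached file so the next request rebuilds it. Again this is an illustrative Python sketch, and `archive_in_db` is a hypothetical stand-in for the real UPDATE query.

```python
import os

CACHE_DIR = "cache"

def archive_in_db(user_id, post_id):
    # Placeholder for: UPDATE posts SET archived = 1 WHERE ...
    pass

def archive_post(user_id, post_id):
    archive_in_db(user_id, post_id)     # update the source of truth first
    cache_file = os.path.join(CACHE_DIR, f"inbox_{user_id}.xml")
    try:
        os.remove(cache_file)           # invalidate: next read re-queries and rebuilds
    except FileNotFoundError:
        pass                            # nothing was cached; that's fine
```

Deleting rather than rewriting the file keeps the write path trivial and avoids serving a stale or half-written cache.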