Shared-process HTTP server
I have developed a simple Perl event server that handles incoming connections and messages over IO::Socket::INET sockets for any number of chat-room channels, then dispatches streaming event triggers to all connected clients in the form of JavaScript commands. This eliminates the overhead of an interval-based AJAX poller that repeatedly queries a script, which in turn checks the database for changes, and it also delivers messages without delay.
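For reference, the core of a single-process server like this can be sketched with IO::Socket::INET plus IO::Select. This is a minimal illustration only, not my actual code: the ephemeral port, line-delimited message framing, and the self-connecting demo client are all assumptions made for the sake of a runnable example.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use IO::Socket::INET;
use IO::Select;

# One process, one select() loop: accept new connections and read
# line-delimited messages from existing ones. Framing is an assumption.
my $server = IO::Socket::INET->new(
    LocalAddr => '127.0.0.1',
    LocalPort => 0,          # ephemeral port, for this demo only
    Listen    => 5,
    Proto     => 'tcp',
    ReuseAddr => 1,
) or die "listen failed: $!";

my $select = IO::Select->new($server);

# Self-contained demo: connect one client to our own listener.
my $client = IO::Socket::INET->new(
    PeerAddr => '127.0.0.1',
    PeerPort => $server->sockport,
    Proto    => 'tcp',
) or die "connect failed: $!";
print $client "hello channel\n";

my $received;
EVENT_LOOP: while (my @ready = $select->can_read(2)) {
    for my $fh (@ready) {
        if ($fh == $server) {
            # New connection: add it to the watched set.
            my $conn = $server->accept or next;
            $select->add($conn);
        } else {
            # Existing connection: one line per message in this sketch.
            my $line = <$fh>;
            if (defined $line) {
                chomp $line;
                $received = $line;
                last EVENT_LOOP;
            } else {
                $select->remove($fh);   # client went away
                close $fh;
            }
        }
    }
}
print "received: $received\n";
```

Because a single select() loop multiplexes every connection in one process, the per-connection cost is just a file descriptor and a little bookkeeping, rather than an Apache worker.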
Unfortunately, Apache cannot handle the number of open connections this requires: each persistent connection ties up an Apache worker (a process or thread) for its entire lifetime, so if every user holds one open, the worker pool is quickly exhausted and Apache croaks. I am looking for an HTTP server that instead allows a connection to a single script to be shared among, or broadcast to, many users.
For instance, all users connected to a given channel would receive streaming data from a single event-listener script that distributes events for the entire channel, rather than spawning an individual listener script per user, with each one delivering the exact same message separately.
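The per-channel fan-out described above amounts to keeping a map from channel names to connected client handles and writing each event once per handle. A rough sketch, where the channel name, the appendMessage() JS call, and the helper names are all hypothetical:

```perl
#!/usr/bin/perl
use strict;
use warnings;

my %channels;    # channel name => array ref of client filehandles

# Register a connected client's handle under a channel.
sub join_channel {
    my ($channel, $client_fh) = @_;
    push @{ $channels{$channel} }, $client_fh;
}

# Send one event to every client in the channel as a JS command,
# dropping clients whose handles can no longer be written to.
sub broadcast {
    my ($channel, $message) = @_;
    my $payload = qq{<script>appendMessage("$message");</script>\n};
    my @alive;
    for my $fh (@{ $channels{$channel} || [] }) {
        push @alive, $fh if print {$fh} $payload;
    }
    $channels{$channel} = \@alive;
    return scalar @alive;    # number of clients reached
}

# Demo with in-memory handles standing in for real sockets.
open my $a_fh, '>', \my $a_buf or die $!;
open my $b_fh, '>', \my $b_buf or die $!;
join_channel('lobby', $a_fh);
join_channel('lobby', $b_fh);
my $reached = broadcast('lobby', 'hello');
close $a_fh;
close $b_fh;
print "reached $reached clients\n";
```

The point of the pattern is that the event is produced once per channel and the loop only repeats the cheap write, instead of running a full listener script per user.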
I am looking for recommendations for a server that can do this, or that can simply handle a very large number of concurrent connections. A quick Google search turned up Medusa; if anyone has experience with it or similar servers, I would like to know whether this type of approach is supported, or whether there are better options.