Using comet with PHP?


I was thinking of implementing real time chat using a PHP backend, but I ran across this comment on a site discussing comet: My understanding is that PHP is a terrible language for Comet, because Comet requires you to keep a persistent connection open to each browser client. Using mod_php this means tying up an Apache child full-time for each client, which doesn't scale at all.

The people I know doing Comet stuff are mostly using Twisted Python, which is designed to handle hundreds or thousands of simultaneous connections. Is this true, or is it something that can be configured around?

Agreeing with and expanding on what has already been said, I don't think FastCGI will solve the problem.
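To make the problem concrete, here is a minimal sketch of the kind of long-polling PHP script being discussed. The helper new_messages_since() is hypothetical and only stands in for whatever data source you poll; the point is that, under mod_php, each client running this loop occupies one Apache child for the full duration of the poll.

<?php
// Minimal long-polling sketch (hypothetical helper, assumed timings).
// Under mod_php this script ties up one Apache child per connected client.

set_time_limit(35);                          // allow the script to outlive the default limit
$since = isset($_GET['since']) ? (int)$_GET['since'] : 0;

function new_messages_since($since)
{
    // Placeholder: a real application would check a file, a database, or a queue here.
    return null;
}

for ($i = 0; $i < 30; $i++) {                // poll for up to ~30 seconds
    $messages = new_messages_since($since);
    if ($messages !== null) {
        header('Content-Type: application/json');
        echo json_encode(['messages' => $messages, 'since' => time()]);
        exit;
    }
    sleep(1);                                // the worker sleeps, but the child stays occupied
}

header('Content-Type: application/json');
echo json_encode(['messages' => [], 'since' => $since]);   // timed out; the client simply re-polls

The client re-issues the request as soon as one returns, so the number of held workers is roughly the number of connected users.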

Apache

Each request into Apache will use one worker thread until the request completes, which may be a long time for COMET requests. Others have described using COMET on Apache and found it difficult. The problem isn't specific to PHP; it applies to any back-end CGI module you may want to use on Apache. The suggested solution was to use the event MPM, which changes the way requests are dispatched to worker threads. This MPM tries to fix the 'keep alive problem' in HTTP. After a client completes the first request, the client can keep the connection open and send further requests using the same socket. This can save significant overhead in creating TCP connections.

However, Apache traditionally keeps an entire child process/thread waiting for data from the client, which brings its own disadvantages. To solve this problem, this MPM uses a dedicated thread to handle both the Listening sockets, and all sockets that are in a Keep Alive state.

Unfortunately, that doesn't work either, because the MPM will only 'snooze' after a request is complete, waiting for a new request from the client.

PHP

Now, considering the other side of the problem: even if you resolve the issue of holding up one thread per comet request, you will still need one PHP thread per request, which is why FastCGI won't help. You need something like continuations, which allow the comet requests to be resumed when the event they are triggered by is observed. AFAIK, this isn't something that's possible in PHP; I've only seen it in Java (see Apache Tomcat, for example).

Edit: There's a note about using a load balancer to allow you to run both an Apache server and a comet-enabled server (e.g. Jetty or Tomcat for Java) on port 80 of the same machine.

PHP

I found a funny little screencast explaining simple comet. As a side note, I really think this is going to kill your server under any real load.

If you just have a couple of users, I would say to go for this solution. It is really simple to implement (the screencast only takes 5 minutes of your time).
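For readers who can't watch the screencast, here is a minimal sketch of what such a file-based long poll might look like; the file name, timings, and response shape are assumptions of mine, not the screencast's actual code.

<?php
// File-based 'simple comet' sketch (assumed file name and timings).
// The client sends the last modification time it has seen; the server
// waits until the shared file changes and then returns its contents.

$file  = '/tmp/chat_data.txt';               // assumed shared data file
$since = isset($_GET['since']) ? (int)$_GET['since'] : 0;

for ($i = 0; $i < 30; $i++) {                // long-poll for up to ~30 seconds
    clearstatcache();                        // filemtime() results are cached otherwise
    $mtime = file_exists($file) ? filemtime($file) : 0;
    if ($mtime > $since) {
        header('Content-Type: application/json');
        echo json_encode([
            'mtime' => $mtime,
            'data'  => file_get_contents($file),
        ]);
        exit;
    }
    usleep(200000);                          // check roughly five times per second
}

header('Content-Type: application/json');
echo json_encode(['mtime' => $since, 'data' => null]);   // timed out; the client re-polls

The writing side only needs something like file_put_contents($file, $message), which updates the file's modification time and is picked up by waiting pollers on their next check.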

But as I said previously, I don't think it is good for a lot of concurrent users (I guess you should benchmark it) because:

• It uses file I/O, which is much slower than just getting data from memory, for example via functions like filemtime().
• Second, but not least, PHP does not have a decent thread model. PHP was not designed for this anyway because of its shared-nothing architecture. As the slides say, 'Shared data is pushed down to the data-store layer', for example to MySQL (see the MySQL-based sketch after the alternatives list below).

Alternatives

I really think you should try the alternatives if you want to do any comet/long polling. You could use many languages, for example:

• Java/JVM: Jetty.
• Python: Dustin's.

• Erlang: a popular language for comet and the like.
• Lua, Ruby, C, Perl, just to name a few.

Just performing a simple Google search will show you a lot of alternatives, also for PHP (which I think will kill your server under any big load).
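To illustrate the 'shared data is pushed down to the data-store layer' point quoted above, here is a hedged sketch that swaps the file for a MySQL table queried through PDO. The table name, columns, and credentials are assumptions made for illustration, and the one-worker-per-client limitation discussed earlier still applies.

<?php
// Long poll backed by MySQL instead of a file (assumed schema and credentials).

$pdo   = new PDO('mysql:host=localhost;dbname=chat', 'chat_user', 'secret');
$since = isset($_GET['since']) ? (int)$_GET['since'] : 0;

$stmt = $pdo->prepare('SELECT id, body FROM messages WHERE id > :since ORDER BY id');

for ($i = 0; $i < 30; $i++) {                // long-poll for up to ~30 seconds
    $stmt->execute(['since' => $since]);
    $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
    if ($rows) {
        $last = end($rows);
        header('Content-Type: application/json');
        echo json_encode(['messages' => $rows, 'since' => (int)$last['id']]);
        exit;
    }
    sleep(1);                                // still one busy PHP worker per waiting client
}

header('Content-Type: application/json');
echo json_encode(['messages' => [], 'since' => $since]);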
