At 2:19 AM +0000 26-02-2001, Sam Varshavchik wrote:
>Adam Sherman writes:
>>What does the timeout default to? The server *is* on a crappy
>>connection, a 1.3 Mbps DSL line... (-: But I average about 30KBps on
>The default is 5 minutes, per byte. The clock resets with each byte
>received. If you didn't get a single byte in 5 minutes, chances are
>you're not going to get anything at all.
>The bandwidth isn't the issue here, it's packet loss. I've seen
>these kinds of things occur when there's a pretty heavy packet loss
>on the line.
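For anyone curious what a "per-byte" timeout looks like in practice, here's a minimal sketch in Python. This is my own illustration, not the server's actual code; the 5-minute figure and the function name are taken as assumptions from the explanation above. The key point is that the timeout bounds the gap between bytes, not the whole transfer, so a slow-but-alive link never trips it.

```python
import socket

# Hypothetical 5-minute per-byte idle timeout, per the description above.
IDLE_TIMEOUT = 5 * 60


def read_with_idle_timeout(sock: socket.socket) -> bytes:
    """Read until EOF, resetting the idle clock on every chunk received.

    settimeout() applies to each recv() call individually, so the
    clock effectively restarts whenever any data arrives.
    """
    sock.settimeout(IDLE_TIMEOUT)
    data = bytearray()
    while True:
        try:
            chunk = sock.recv(4096)
        except socket.timeout:
            # Not a single byte for IDLE_TIMEOUT seconds: give up.
            raise TimeoutError("peer idle too long")
        if not chunk:  # orderly EOF from the peer
            return bytes(data)
        data += chunk
```

A transfer crawling along at a few bytes per second would still complete under this scheme; only a connection that goes completely silent for the full 5 minutes gets dropped.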
Hmmm, I can't seem to ping any of the hosts that are causing
problems. So, unless they're *all* behind the same type of firewall...
Very weird. How can I prevent users from receiving those empty messages?
President & Technology Architect
+1 (613) 255-5164