Is it just me, or does this function fail under Linux? I've written a
small server (source available upon request, if you feel that would
help - only about 100 lines) that allows multiple clients to connect
to it. If I call setSoTimeout with a non-zero value, the client gets
disconnected after the timeout, but it is also disconnected as soon as
the next client connects, even though its timeout has not yet expired!
So the server is effectively limited to one client at a time.
Under Windows 95, or when the timeout value is set to zero, this does
not happen.
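To give an idea of what I mean, here is a stripped-down sketch of the
kind of accept-and-poll loop I'm describing (not my actual source; the
class name, port and timeout values are just placeholders):

import java.io.*;
import java.net.*;
import java.util.*;

// Simplified sketch: a single thread accepts connections and then polls
// every connected client in turn, relying on setSoTimeout so that no
// single accept() or read() can block the loop forever.
public class PollingServer {
    public static void main(String[] args) throws IOException {
        ServerSocket server = new ServerSocket(4321);  // port is arbitrary
        server.setSoTimeout(500);       // let accept() return periodically
        Vector clients = new Vector();

        while (true) {
            try {
                Socket s = server.accept();
                s.setSoTimeout(500);    // the call that seems to misbehave
                clients.addElement(s);
            } catch (InterruptedIOException e) {
                // no new connection this time round; go poll the clients
            }

            for (int i = clients.size() - 1; i >= 0; i--) {
                Socket s = (Socket) clients.elementAt(i);
                try {
                    int b = s.getInputStream().read(); // returns within 500 ms
                    if (b == -1) {                     // client closed its end
                        clients.removeElementAt(i);
                        try { s.close(); } catch (IOException ignored) {}
                    }
                } catch (InterruptedIOException e) {
                    // timeout with no data: the client should stay connected,
                    // but under Linux the connection seems to get dropped here
                } catch (IOException e) {
                    clients.removeElementAt(i);        // real error: drop client
                    try { s.close(); } catch (IOException ignored) {}
                }
            }
        }
    }
}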
So:
1. Is this problem specific to Linux, or does it affect other unix
systems as well?
2. Is there some kind of workaround? I guess I could always use
_another_ thread for each client (see the second sketch below), but
isn't there a better way?
3. Any idea whether this problem can or will be fixed in future
versions? (I'm using JDK 1.1.3, which I believe is the latest for
Java.)
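As for question 2, the thread-per-client workaround I have in mind
would look roughly like this (again only a sketch with made-up names,
not something I have actually tested on this setup):

import java.io.*;
import java.net.*;

// Sketch of the thread-per-client workaround: the main thread only
// accepts connections, and each client gets its own handler thread,
// so one blocking read() can never stall the other clients.
public class ThreadedServer {
    public static void main(String[] args) throws IOException {
        ServerSocket server = new ServerSocket(4321);  // port is arbitrary
        while (true) {
            final Socket s = server.accept();
            s.setSoTimeout(30000);     // per-client idle timeout, if wanted
            new Thread(new Runnable() {
                public void run() {
                    try {
                        InputStream in = s.getInputStream();
                        int b;
                        while ((b = in.read()) != -1) {
                            // handle the data here (echo, parse, etc.)
                        }
                    } catch (InterruptedIOException e) {
                        // this client was idle longer than the timeout
                    } catch (IOException e) {
                        // connection error; just fall through and close
                    } finally {
                        try { s.close(); } catch (IOException ignored) {}
                    }
                }
            }).start();
        }
    }
}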
Platform independence? Sure, once all virtual machines act in exactly
the same way. But not this millennium.