Hi,
I am experimenting with running an FTP site on my Linux box. The system is a
P100 with 16 MB of RAM, 24 MB of swap, kernel version 1.3.20 (ELF) and most of
the Slackware 3 distribution.

In the beginning all was fine and the system could easily handle 30
concurrent FTP users. But after two days my computer was clogged with
hundreds of wu.ftpd and /usr/sbin/wu.ftpd processes. These ate up all memory,
causing excessive swapping and eventually out-of-memory errors. I noticed that
some FTP users issue several connection requests per second. Each request
starts a new wu.ftpd process that lingers for a few minutes before it exits,
and in the meantime hundreds of other wu.ftpd processes get started. These
slow the computer down, which in turn makes the old processes exit even more
slowly. So the number of processes keeps building up, to the point where my
computer becomes unusable.

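For what it's worth, this is roughly how I watch the pile-up (just counting
processes every few seconds; the [w] keeps grep from matching its own command
line):

    while true; do ps ax | grep -c '[w]u.ftpd'; sleep 5; done
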
Now, what can I do about it? When I impose access restrictions (max 7
concurrent users) the problem is solved. But I want to serve more users than
that. How can I do that?

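For reference, the restriction I mentioned is the usual ftpaccess limit (this
assumes wu.ftpd is started with -a so it actually reads /etc/ftpaccess; the
message path is just the one from the wu.ftpd examples):

    class  all  real,guest,anonymous  *
    limit  all  7  Any  /etc/msgs/msg.toomany
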
Can I convince wu.ftpd to get lost immediately? Is it normal for an FTP user
to request hundreds of connections per minute? How does he do that, and can I
prevent it?

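One thing I am considering (untested, and the exact syntax may vary between
inetd versions) is inetd's spawn-rate limit: a `.max' suffix on the
wait/nowait field is supposed to make inetd disable a service that gets
invoked more than that many times per minute, instead of letting it fork
without bound. Something like this in /etc/inetd.conf:

    ftp  stream  tcp  nowait.60  root  /usr/sbin/tcpd  wu.ftpd -l -a

(The tcpd wrapper is just how my Slackware ships it; the interesting part is
the `nowait.60'.)
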
Another oddity: when about 10 FTP users are logged on, it takes me several
minutes to log into my own computer as root, yet I can still log in
immediately under my own username.

If you know anything more about this, I would really appreciate some advice!

Ewald