Post by blue » Tue, 08 Sep 1998 04:00:00


Why does a forked child become <defunct>, and how can I prevent it?



<defunct> processes from a server

I have a server process that listens on a port and forks a new child process
whenever it gets a connection, while the parent continues listening for the
next connection.  Whenever a child process exits, it leaves a <defunct> process
in the process list.  The only way I have found to get rid of these <defunct>s
is to kill the parent.  How can I exit my child so it won't linger around?
I tried a simple test program with a fork and it does the same.  I am running
on SunOS 4.1.3.

Thanks in advance for any help.

Here is the toy program:

#include <stdio.h>
#include <stdlib.h>
#include <sys/types.h>
#include <unistd.h>

int main(void)
{
  pid_t pid;

  if ((pid = fork()) < 0) {
    printf("Fork error\n");
  } else if (pid == 0) {
    /* Child */
    printf("I am a child!\n");
  } else {
    printf("I am the parent!\n");
  }
  exit(0);
}

Jean-Francois Theoret, ift.a.   | Institut de recherche d'Hydro-Quebec
VA2JFT                          | 1800 Montee Ste-Julie
