Two processes redirect output to the same file

Post by alois egge » Tue, 13 Apr 1999 04:00:00



Hello,
I want to redirect the output of two or more processes to the same file.
What happens if the information is written at the same time?
Can I rely on the operating system (AIX 4.x) to ensure that all of the
information is written into the output file?

Thanks

 
 
 

Two processes redirect output to the same file

Post by Jens-Uwe Mager » Tue, 13 Apr 1999 04:00:00



>I want to redirect the output of two or more processes to the same file.
>What happens if the information is written at the same time?
>Can I rely on the operating system (AIX 4.x) to ensure that all of the
>information is written into the output file?

If you open the file for append (>> in the shell), then the OS will make
sure that output is always appended to the end of the file.
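
For illustration, a minimal C sketch of doing the same thing by hand (the
file name here is just a placeholder): opening with O_APPEND makes the
kernel position every write() at the current end of the file, so
concurrent appenders do not overwrite each other.

    #include <fcntl.h>
    #include <string.h>
    #include <unistd.h>

    int main(void)
    {
        /* O_APPEND: the kernel seeks to end-of-file before each write(),
           so output from concurrent appenders is not overwritten. */
        int fd = open("shared.log", O_WRONLY | O_CREAT | O_APPEND, 0644);
        if (fd == -1)
            return 1;

        const char *msg = "one complete log line\n";
        write(fd, msg, strlen(msg));    /* one line per write() call */

        close(fd);
        return 0;
    }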

--
Jens-Uwe Mager  <pgp-mailto:62CFDB25>

 
 
 

Two processes redirect output to the same file

Post by Scott L. Field » Tue, 13 Apr 1999 04:00:00


Yes, but you can't guarantee that the data will be in the order you want.

Further, the granularity is that of a single write() call, not any
logical unit you have in mind. I.e., if a sentence takes two write()
system calls to print, a write() from the other process might land
between them.

Further, the printf() functions are buffered, and by default are fully
buffered when writing to a file. You either have to force the write or
depend on it reaching the actual file only every 4K of data.
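
A minimal sketch of the usual workaround, assuming you can change the
program (the file name is a placeholder): format the complete line into
one buffer first, then hand it to the kernel in a single write() on an
O_APPEND descriptor, so there is no point mid-line where the other
process can slip in.

    #include <fcntl.h>
    #include <stdio.h>
    #include <unistd.h>

    int main(void)
    {
        int fd = open("shared.log", O_WRONLY | O_CREAT | O_APPEND, 0644);
        if (fd == -1)
            return 1;

        /* build the complete line first ... */
        char line[256];
        int len = snprintf(line, sizeof(line),
                           "pid %ld: one whole message\n", (long)getpid());

        /* ... then emit it in a single write(), so another process
           cannot interleave in the middle of the line */
        if (len > 0)
            write(fd, line, (size_t)len);

        close(fd);
        return 0;
    }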


>Hello,
>I want to redirect the output of two or more processes to the same file.
>What happens if the information is written at the same time?
>Can I rely on the operating system (AIX 4.x) to ensure that all of the
>information is written into the output file?

>Thanks

 
 
 

Two processes redirect output to the same file

Post by Flemming Josephse » Tue, 13 Apr 1999 04:00:00


This is true. I have a series of scripts that depend on a common log
file. I managed to get around the problem of 'funny sequences and
broken lines' by observing the processes and putting a strategically
placed sleep in here and there. By the grace of the great CPU this
worked, so, even if it's nothing to brag about, it did the trick and
I'm suitably thankful.

> Yes, but you can't guarantee that the data will be in the order you want.

> Further, the granularity is that of a single write() call, not any
> logical unit you have in mind. I.e., if a sentence takes two write()
> system calls to print, a write() from the other process might land
> between them.

> Further, the printf() functions are buffered, and by default are fully
> buffered when writing to a file. You either have to force the write or
> depend on it reaching the actual file only every 4K of data.


> >Hello,
> >I want to redirect the output of two or more processes to the same file.
> >What happens if the information is written at the same time?
> >Can I rely on the operating system (AIX 4.x) to ensure that all of the
> >information is written into the output file?

> >Thanks

 
 
 

Two processes redirect output to the same file

Post by Juergen Gmeine » Wed, 14 Apr 1999 04:00:00



> > Further, the printf() functions are buffered, and by default are fully
> > buffered when writing to a file. You either have to force the write or
> > depend on it reaching the actual file only every 4K of data.

my "solution" to the problem is not to redirect the
output of the scripts (which would make the output buffered
and therefore lead to scrambled output), but
in the scripts via "exec >> mylogfileorwhatever 2>&1".
this redirects the output, yet it remains line-buffered.

i only used this with scripts started interactively, though.
it probably won't work if you start them via cron or at.

and it only affects the scripts output ... programs called
within the script still do buffered io, unless you use
setlinebuf()
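
If you can change such a program, a minimal C sketch of forcing line
buffering on stdout, so that every completed line is flushed even when
stdout goes to a file, might look like this:

    #include <stdio.h>

    int main(void)
    {
        /* When stdout is redirected to a file it is fully buffered by
           default; switch it to line buffering so every '\n' flushes.
           setvbuf() must be called before any other use of the stream
           and is the portable spelling of setlinebuf(stdout). */
        setvbuf(stdout, NULL, _IOLBF, 0);

        printf("this line reaches the log file right away\n");
        return 0;
    }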

I still find it useful, sometimes.

regards,
juergen

 
 
 

Two processes redirect output to the same file

Post by Tony R. Benne » Thu, 15 Apr 1999 04:00:00





>> > Further, the printf() functions are buffered, and by default are fully
>> > buffered when writing to a file. You either have to force the write or
>> > depend on it reaching the actual file only every 4K of data.

>My "solution" to the problem is not to redirect the output of the
>scripts from the outside (which would make the output buffered and
>therefore lead to scrambled output), but to redirect inside the scripts
>via "exec >> mylogfileorwhatever 2>&1". This redirects the output, yet
>it remains line-buffered.

>I have only used this with scripts started interactively, though. It
>probably won't work if you start them via cron or at.

>And it only affects the script's own output ... programs called within
>the script still do buffered I/O, unless you use setlinebuf().

>I still find it useful, sometimes.

>regards,
>juergen

The only solution I have found that is guaranteed not to 'interleave'
lines from two processes is to control access to the print file with a
semaphore... this of course will not work from a shell script.
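
A rough sketch of that idea in C, using a POSIX named semaphore for
brevity (the semaphore and file names are placeholders; depending on the
system you may need to link with -lrt or -lpthread, or use the System V
semget()/semop() calls instead):

    #include <fcntl.h>
    #include <semaphore.h>
    #include <string.h>
    #include <unistd.h>

    int main(void)
    {
        /* one named semaphore shared by every writer, initial value 1 */
        sem_t *lock = sem_open("/shared_log_lock", O_CREAT, 0644, 1);
        if (lock == SEM_FAILED)
            return 1;

        int fd = open("shared.log", O_WRONLY | O_CREAT | O_APPEND, 0644);
        if (fd == -1)
            return 1;

        const char *line = "a line that must not be interleaved\n";

        sem_wait(lock);                    /* enter critical section    */
        write(fd, line, strlen(line));     /* only one writer at a time */
        sem_post(lock);                    /* leave critical section    */

        close(fd);
        sem_close(lock);
        return 0;
    }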

--


 
 
 

Two processes redirect output to the same file

Post by Joerg Bruehe » Fri, 23 Apr 1999 04:00:00


Hi!

Jens-Uwe Mager wrote:


> >I want to redirect the output of two or more processes to the same file.
> >What happens if the information is written at the same time?
> >Can I rely on the operating system (AIX 4.x) to ensure that all of the
> >information is written into the output file?

> If you open the file for append (>> in the shell), then the OS will make
> sure that output is always appended to the end of the file.

I am sure this depends on the shell used; I had bad luck when
I tried that (sorry - I do not remember the environment used).
It seems that some shells do not set the O_APPEND flag for a file
opened with '>>', which IMHO they should.

I take it that the original question refers to concurrent processes.

My approach would be:
1) Explicitly 'open()' the file in the program, using the O_APPEND
   flag, maybe even O_SYNC (probably overkill - try both with and
   without).
2) Explicitly use 'write()', not the STDIO-functions.

If you want to use STDIO functions like 'fprintf()', also call
'fflush()' at proper places - before the buffer fills!
You would then need to combine 'open(..., O_APPEND)' with
'fdopen()'; I have never done that myself.
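
That combination might look roughly like this (only a sketch, with a
placeholder file name):

    #include <fcntl.h>
    #include <stdio.h>
    #include <unistd.h>

    int main(void)
    {
        /* open with O_APPEND so the kernel always writes at end-of-file */
        int fd = open("shared.log", O_WRONLY | O_CREAT | O_APPEND, 0644);
        if (fd == -1)
            return 1;

        /* wrap the descriptor in a stdio stream for fprintf() */
        FILE *log = fdopen(fd, "a");
        if (log == NULL)
            return 1;

        fprintf(log, "pid %ld: a complete log line\n", (long)getpid());
        fflush(log);    /* flush before the stdio buffer (about 4K) fills */

        fclose(log);    /* also closes the underlying descriptor */
        return 0;
    }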

HTH
Joerg Bruehe

--
Joerg Bruehe, SQL Datenbanksysteme GmbH, Berlin, Germany
     (speaking only for himself)

 
 
 
