filtering for unique lines

filtering for unique lines

Post by aw.. » Thu, 28 Oct 1999 04:00:00



How do you filter a file for unique lines?  For example if the
file 'file.txt' contains:

line1
line1
line1
line2
line2
line3
line3

I want the output to be:

line1
line2
line3

Thanks in advance.

Sent via Deja.com http://www.deja.com/
Before you buy.

 
 
 

filtering for unique lines

Post by John Doher » Thu, 28 Oct 1999 04:00:00



> How do you filter a file for unique lines?  For example if the
> file 'file.txt' contains:

> line1
> line1
> line1
> line2
> line2
> line3
> line3

> I want the output to be:

> line1
> line2
> line3

man uniq
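In this particular example the duplicate lines are already next to each
other, so uniq alone is enough (a minimal sketch, using the file name
from the question):

uniq file.txt
# line1
# line2
# line3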

--

 
 
 

filtering for unique lines

Post by Charles Demas » Thu, 28 Oct 1999 04:00:00





>> How do you filter a file for unique lines?  For example if the
>> file 'file.txt' contains:

>> line1
>> line1
>> line1
>> line2
>> line2
>> line3
>> line3

>> I want the output to be:

>> line1
>> line2
>> line3

>man uniq

uniq will only remove repeated lines that are adjacent (one right after
the other).  If you want to remove duplicate lines that are not
necessarily next to each other, you must either sort the file first or
use another tool, such as awk or gawk:

gawk 'a[$0]++ == 0 {print}' infile

or written more compactly/cryptically:

gawk '!a[$0]++' infile
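The same idea written out a bit more verbosely, in case the one-liners
look like line noise (same behaviour, just with the implicit parts made
explicit):

# a[$0] counts how many times this exact line has been seen so far;
# a line is printed only when its count was still 0 (first occurrence)
gawk '{ if (a[$0]++ == 0) print $0 }' infile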

Chuck Demas
Needham, Mass.

--
  Eat Healthy    |   _ _   | Nothing would be done at all,
  Die Anyway     |    v    | That no one could find fault with it.

 
 
 

filtering for unique lines

Post by Suds » Thu, 28 Oct 1999 04:00:00



> How do you filter a file for unique lines?  For example if the
> file 'file.txt' contains:

> line1
> line1
> line1
> line2
> line2
> line3
> line3

> I want the output to be:

> line1
> line2
> line3

> Thanks in advance.


cat <filename> | sort | uniq
 
 
 

filtering for unique lines

Post by Charles Demas » Fri, 29 Oct 1999 04:00:00





>> How do you filter a file for unique lines?  For example if the
>> file 'file.txt' contains:

>> line1
>> line1
>> line1
>> line2
>> line2
>> line3
>> line3

>> I want the output to be:

>> line1
>> line2
>> line3

>cat <filename> | sort | uniq

If you're willing to sort, then this is a better way:

sort -u <filename>

man sort

There's no need for cat here, unless you want a UUOC award (Useless Use Of Cat).
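For the sample file above, both forms give the same three lines; sort -u
just does it in one process (a quick sketch):

sort -u file.txt        # sort and drop duplicates in one step
sort file.txt | uniq    # same result, with an extra process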

Chuck Demas
Needham, Mass.

--
  Eat Healthy    |   _ _   | Nothing would be done at all,
  Die Anyway     |    v    | That no one could find fault with it.

 
 
 

filtering for unique lines

Post by Joel Hatto » Fri, 29 Oct 1999 04:00:00



Quote:

> no need for cat here, unless you want a UUOC award (Useless Use Of Cat).

This seems to be a common occurrence here. I'll (mis)quote someone else:

"If you're using cat with one argument, you probably shouldn't be using
it"

joel

--
|HelpDesk, ITS, Uni of Qld, Australia - phone [+61] [07] 33654400|
|opinions expressed herein are mine alone and may not be forwarded!|
|plaintext/ascii messages only, all unsolicited attachments deleted|
|to send me a file/document see http://www.uq.edu.au/~uqjhatto/#ftp|

 
 
 

filtering for unique lines

Post by fred smit » Fri, 29 Oct 1999 04:00:00


: How do you filter a file for unique lines?  For example if the
: file 'file.txt' contains:

: line1
: line1
: line1
: line2
: line2
: line3
: line3

: I want the output to be:

: line1
: line2
: line3

: Thanks in advance.

Look up the 'uniq' command.

Fred

--

    "Not everyone who says to me, 'Lord, Lord,' will enter the kingdom of
     heaven, but only he who does the will of my Father who is in heaven."
------------------------------ Matthew 7:21 (niv) -----------------------------

 
 
 

filtering for unique lines

Post by Frank J. Perricon » Fri, 29 Oct 1999 04:00:00



Quote:

> cat <filename> | sort | uniq

sort <filename | uniq

--

MIS Manager                                               802-828-4926
Vermont Department of Liquor Control                 Fax: 802-828-2803
http://www.state.vt.us/dlc/            http://www.sover.net/~hawthorn/

 
 
 

filtering for unique lines

Post by John Doher » Fri, 29 Oct 1999 04:00:00






> > cat <filename> | sort | uniq

> sort <filename | uniq

If you can't assume the input file is already sorted (and so can't
simply give it to uniq as is), then you can usually do this with sort
by itself, e.g., "sort -u file".
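For example (a sketch, assuming a hypothetical 'unsorted.txt' in which
equal lines are scattered rather than adjacent):

uniq unsorted.txt     # only collapses adjacent duplicates, so some remain
sort -u unsorted.txt  # sorts first, so each distinct line appears once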

--

 
 
 

filtering for unique lines

Post by Ken Pizzini » Sat, 30 Oct 1999 04:00:00


On Thu, 28 Oct 1999 11:14:51 -0400,



>> cat <filename> | sort | uniq

>sort <filename | uniq

sort -u filename

                --Ken Pizzini

 
 
 
