## World's best lossless image compression?

### World's best lossless image compression?

Hi everyone,

I have been doing some preliminary research into lossless image
compression. I've come far enough that I can calculate how much my images
will compress: typically 2:1. Since I'm studying and otherwise busy I
don't feel I have the time to implement it, unless it can compare in
performance with the state of the art. I'm doing this mostly for fun, but
if my method turns out to be comparable to other methods out there I might
spend some time working on it.
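Incidentally, a common quick way to estimate achievable compression before
implementing anything is the zeroth-order entropy of the data. A minimal
Python sketch (this is not the poster's method, which he doesn't describe;
context-modeling coders usually beat this figure on images):

```python
import math
from collections import Counter

def entropy_bits_per_byte(data: bytes) -> float:
    """Zeroth-order (memoryless) entropy of a byte stream, in bits per byte."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def estimated_ratio(data: bytes) -> float:
    """Compression ratio achievable if each byte were coded independently."""
    h = entropy_bits_per_byte(data)
    return 8.0 / h if h > 0 else float("inf")
```

Running this on prediction residuals rather than raw pixels gives a much
better estimate for images, since neighboring pixels are highly correlated.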

I have the following questions: What is the best lossless compression
algorithm for images (continuous tone) on the market (in the labs?) today,
and how well does it perform (compression ratio)? Can a working version be
downloaded for lossless compression of images containing e.g. diagrams,
text etc., and how do these perform? Are there any compression algorithms
for lossless compression of video, and how do these perform?

I've heard that an algorithm by the name LOCO is supposed to be very good,
anybody heard of it?

Who has an interest in lossless image compression? That is, who needs
lossless compression? I'm interested in knowing what the market is for a
lossless image compressor.

I have a lot of questions, and I hope somebody can answer at least some of
them.

Greetings from

John Reidar Mathiassen
Department of Engineering *netics (ITK)
Norwegian University of Science and Technology (NTNU)

### World's best lossless image compression?

: I've heard that an algorithm by the name LOCO is supposed to be very
: good, anybody heard of it?

You can learn about it (and JPEG-LS, the LOCO-based new standard for
lossless/near-lossless image compression) in:

<http://www.hpl.hp.com/loco/>

--
Marcelo Weinberger
Information Theory & Algorithms
Hewlett-Packard Laboratories
Palo Alto, California

### World's best lossless image compression?

CALIC and TMW are the bad-asses in lossless image compression.

They're useful for medical image compression.

--------------------------------------------

http://www.cco.caltech.edu/~bloom/index.html
Dogs love me cuz my brains are sniffable.

### World's best lossless image compression?

Hi,
The Pegasus Medical Image Toolkit has a couple of lossless compression
options built in, including JPEG-LS.  If you would like to try one out:
http://www.veryComputer.com/
Best wishes,
jack
--
|   Check out "BETTER JPEG" live: http://www.veryComputer.com/; |
|- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - |
| Jack Berlin - Pegasus Imaging Corp - the BETTER JPEG people! |
|--------------------------------------------------------------|

> Hi everyone,

> I have been doing some preliminary research into lossless image
> compression. I've come far enough that I can calculate how much my images
> will compress: typically 2:1. Since I'm studying and otherwise busy I
> don't feel I have the time to implement it, unless it can compare in
> performance with the state of the art. I'm doing this mostly for fun, but
> if my method turns out to be comparable to other methods out there I
> might spend some time working on it.

> I have the following questions: What is the best lossless compression
> algorithm for images (continuous tone) on the market (in the labs?) today,
> and how well does it perform (compression ratio)? Can a working version be
> downloaded for lossless compression of images containing e.g. diagrams,
> text etc., and how do these perform? Are there any compression algorithms
> for lossless compression of video, and how do these perform?

> I've heard that an algorithm by the name LOCO is supposed to be very good,
> anybody heard of it?

> Who has an interest in lossless image compression? That is, who needs
> lossless compression? I'm interested in knowing what the market is for a
> lossless image compressor.

> I have a lot of questions, and I hope somebody can answer at least some of
> them.

> Greetings from

> John Reidar Mathiassen
> Department of Engineering *netics (ITK)
> Norwegian University of Science and Technology (NTNU)

### World's best lossless image compression?

The world's best compression technique is the bzip2 algorithm. It is used
by bzip2 (duh!), a Linux compression program. Have a look at
ftp://ftp.kernel.org and go to the Linux kernels to see the differences
between gzip (like pkzip) and bzip2. I don't think it's a good idea to use
it as an image compressor, because it's extremely slow, but it compresses
like no other.

l8er... [zaxe] ^ nebula

### World's best lossless image compression?

|
| CALIC and TMW are the bad-asses in lossless image compression.
|
| It's usefull for medical image compression.

That's putting it mildly. Give a sleazy lawyer the term "lossy
compression", plus the makeup of the typical jury after the lawyers strike
everyone who finished high school, and no reasonable company wants to be a
test case.
--

"If I were a diplomat, in the best case I'd go hungry.  In the worst
case, people would die."
-- Robert Lipe

### World's best lossless image compression?

| The world's best compression technique is the bzip2 algorithm. It is
| used by bzip2 (duh!), a Linux compression program. Have a look at
| ftp://ftp.kernel.org and go to the Linux kernels to see the differences
| between gzip (like pkzip) and bzip2. I don't think it's a good idea to
| use it as an image compressor, because it's extremely slow, but it
| compresses like no other.

Boy, are you going to get an education...

The bzip2 algorithm is not even as good as bzip, which was a research
algorithm and infringed patents. I had a good exchange of mail with the
author when bzip came out, and I believe the docs shipped with bzip2
include this information.
--

"If I were a diplomat, in the best case I'd go hungry.  In the worst
case, people would die."
-- Robert Lipe

### World's best lossless image compression?

Hi again,

> The world's best compression technique is the bzip2 algorithm. It is
> used by bzip2 (duh!), a Linux compression program. Have a look at
> ftp://ftp.kernel.org and go to the Linux kernels to see the differences
> between gzip (like pkzip) and bzip2. I don't think it's a good idea to
> use it as an image compressor, because it's extremely slow, but it
> compresses like no other.

I checked out bzip2, and its own documentation says that "bzip2 is a
freely available, patent free (see below), high-quality data compressor.
It typically compresses files to within 10% to 15% of the best available
techniques", so by its own account it is not the best compression
technique in the world.

Just thought I'd mention it.

John Reidar

### World's best lossless image compression?

> I have the following questions: What is the best lossless compression
> algorithm for images (continuous tone) on the market (in the labs?) today,
> and how well does it perform (compression ratio)? Can a working version be
> downloaded for lossless compression of images containing e.g. diagrams,
> text etc., and how do these perform? Are there any compression algorithms
> for lossless compression of video, and how do these perform?

Best continuous-tone image compressor wrt compression ratio? CALIC.
See http://www.elis.rug.ac.be/~denecker/doc/papers/prorisc97/
For other types of images (e.g. GIF: 256-colormap) the answer is
more complicated.

> I've heard that an algorithm by the name LOCO is supposed to be very good,
> anybody heard of it?

It is a very good tradeoff between good compression ratio, low memory
consumption and high coding/decoding speed.
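LOCO's predictor is simple enough to show in full. A sketch of the median
edge detector (MED) used by LOCO-I/JPEG-LS, which predicts each pixel from
its left (a), upper (b) and upper-left (c) neighbors:

```python
def med_predict(a: int, b: int, c: int) -> int:
    """LOCO-I / JPEG-LS median edge detector (MED).

    Equivalent to the median of a, b, and the planar estimate a + b - c:
    it falls back to min/max of the neighbors when c suggests an edge."""
    if c >= max(a, b):
        return min(a, b)
    if c <= min(a, b):
        return max(a, b)
    return a + b - c
```

In JPEG-LS the residuals from this predictor are then context-modeled and
coded with Golomb-Rice codes, which is where the speed comes from.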

> Who has an interest in lossless image compression? That is, who needs
> lossless compression? I'm interested in knowing what the market is for a
> lossless image compressor.

Medical imaging. And they tend to use images with anywhere from 256 up to
4096 or 65536 gray levels (some people claim to see more than 8 bits).
Some of the least-significant bits are usually noise, and the underlying
physics, scanners and software cause a much greater variation in the
imagery than the undetectable loss introduced by near-lossless or lossy
compression schemes. It's mainly a matter of responsibility and legal
arguments. Another reason for lossless compression is that even though our
eyes don't see the subtle gray levels, statistical image processing (or
other algorithms) might, and you never know what the images from today
will be used for in the future.

Other fields might include: prepress imagery (especially the high-end)
and digital printing, satellite images.

Many companies/organizations are very reluctant to adopt compression
algorithms which are covered by patents, so don't expect to make a lot of
money from a new world-record-breaking algorithm which is 10% better than
the state of the art. The economical value of a good compression scheme
lies in its "added value": fast, transparent in use, portable to a whole
gamut of architectures ranging from palm-top computer communication to
really huge archiving systems, applicable to all types of data,
progressive reconstruction, randomly accessible, etc. When introducing
lossless compression, people only want the benefits, not the drawbacks.
The research field of lossless compression is comparable to trying to
reach the "0 Kelvin point". By tuning and optimizing, you could maybe do
10% or 20% better than CALIC, but never expect to compress twice as well
as CALIC does. No one can compress noise.

--

Vriendelijke groeten,
Kind Regards,

Koen Denecker                                      RUG - ELIS - MEDISIP

Tel: ++32(9)264.89.08    GSM: ++32(477)55.39.04   Fax: ++32(9)264.35.94
University of Ghent - Department of Electronics and Information Systems
Sint-Pietersnieuwstraat 41       B-9000 Gent                    Belgium
=======================================================================

### World's best lossless image compression?

> The results don't support the assertion that bzip2 is
> the world's best image compressor.  Also, bzip2 decompression
> is relatively slow.

I am working on compression of PL3D data utilizing the block-sorting
algorithm, which is the key technique in bzip2. I was told this algorithm
is patent-free. The slow speed of bzip2 is due to the block-sorting
algorithm, which includes long string comparisons.
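For readers unfamiliar with it, the block-sorting (Burrows-Wheeler)
transform at the heart of bzip2 fits in a few lines. A naive Python
sketch, assuming the input contains no NUL byte (bzip2 itself uses a far
faster suffix-sorting scheme than this O(n^2 log n) one):

```python
def bwt(s: bytes) -> bytes:
    """Naive Burrows-Wheeler transform: sort all rotations of the input
    (with a unique sentinel appended) and take the last column.
    Sorting the rotations involves the long string comparisons that
    make block sorting slow."""
    s = s + b"\x00"  # sentinel, assumed absent from the input
    rotations = sorted(s[i:] + s[:i] for i in range(len(s)))
    return bytes(r[-1] for r in rotations)
```

The output groups similar bytes together, which is what makes the
subsequent move-to-front and entropy-coding stages so effective.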

Yanlin

### World's best lossless image compression?

> (some people claim to see more than 8 bits).

BTW, it's a myth that humans can't distinguish better than 1/256
increments of the CIE white.  You can test this easily by filling an
image with gray value 127, then painting half of it with gray 128.
You will clearly be able to see the line where the colors change.
It would be nice to go to 11-10-11 bit hardware (colors in 32-bit
words), except for the computational complexity on the software
side...
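That test image takes only a few lines to generate. A sketch that writes
it as a binary PGM file (pure Python, filename is arbitrary):

```python
def make_test_image(path: str, w: int = 256, h: int = 128) -> None:
    """Write a grayscale PGM: left half at gray 127, right half at 128.
    If the vertical boundary is visible on your display, you can resolve
    steps finer than 1/256, at least around mid-gray."""
    row = bytes([127] * (w // 2) + [128] * (w - w // 2))
    with open(path, "wb") as f:
        f.write(b"P5\n%d %d\n255\n" % (w, h))  # binary PGM header
        f.write(row * h)
```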

--------------------------------------------

http://www.its.caltech.edu/~bloom/index.html

### World's best lossless image compression?

Personally, I would like to see the native display be in YCbCr(12|10|10).
That would give you 12 bits where they're needed most.

>> (some people claim to see more than 8 bits).

>BTW it's a myth that humans can't distinguish better than 1/256
>increments of the CIE white.  You can test this easily by filling
>an image at gray value 127, then paint half of it with gray 128.
>You will clearly be able to see the line where the colors change.
>It would be nice to go to an 11-10-11 bit hardware (colors in 32-bit
>words) except for the computational complexity on the software
>side...

>--------------------------------------------

>http://www.its.caltech.edu/~bloom/index.html

### World's best lossless image compression?

>Personally, I would like to see the native display be in YCbCr(12|10|10).
>That would give you 12 bits where they're needed most.

Yeah, I was thinking that, but that would require some sort of conversion
somewhere.  Of course it would be more natural if more people used
YCC ... (for example, as the default internal format in paint
programs, and so on)

--------------------------------------------

http://www.its.caltech.edu/~bloom/index.html

### World's best lossless image compression?

> > (some people claim to see more than 8 bits).

> BTW it's a myth that humans can't distinguish better than 1/256
> increments of the CIE white.  You can test this easily by filling
> an image at gray value 127, then paint half of it with gray 128.
> You will clearly be able to see the line where the colors change.
> It would be nice to go to an 11-10-11 bit hardware (colors in 32-bit
> words) except for the computational complexity on the software
> side...

True. What I meant is the following: any given "typical" medical image
viewed under "typical" circumstances will not be "visually different" to
an "average viewer" whether it is represented using 8 bits or more.

Basically, the question of how many bits are needed is very complex, and
the answer depends on a lot of factors, such as: the origin of the image
(tomographic, digital photographic, scanned from film), the use of the
image (standard viewing, enhanced viewing on a workstation, pre- and/or
postprocessing, image processing techniques) and the final destination of
the image (mere archival, clinical follow-up), as well as many other
factors I have not mentioned. Considering the future workflow of medical
imaging, 12 bits might be needed for the most demanding applications.

--

Vriendelijke groeten,
Kind Regards,

Koen Denecker                                      RUG - ELIS - MEDISIP

Tel: ++32(9)264.89.08    GSM: ++32(477)55.39.04   Fax: ++32(9)264.35.94
University of Ghent - Department of Electronics and Information Systems
Sint-Pietersnieuwstraat 41       B-9000 Gent                    Belgium
=======================================================================

### World's best lossless image compression?

Does anyone know of any research done on "progressive lossless image
compression"? That is, a compression method that allows one to transmit an
image from low resolution, to higher resolution, to the original image.

A simple solution would be pyramid coding (e.g. wavelet) plus entropy
coding of residuals, but I suppose we can do better than that.
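The pyramid-plus-residuals idea can be made exactly invertible in integer
arithmetic. A sketch of one level of the S-transform (the integer Haar
step used in reversible wavelet coders), assuming an even-length signal;
the lows form the half-resolution preview and the highs are exactly the
extra data the decoder needs to restore the original:

```python
def s_transform(x):
    """One level of the lossless S-transform (integer Haar):
    lows = truncated pair averages, highs = pair differences."""
    lows = [(a + b) >> 1 for a, b in zip(x[0::2], x[1::2])]
    highs = [a - b for a, b in zip(x[0::2], x[1::2])]
    return lows, highs

def s_inverse(lows, highs):
    """Exact reconstruction of the full-resolution signal."""
    out = []
    for l, h in zip(lows, highs):
        a = l + ((h + 1) >> 1)  # recover the first element of each pair
        out.extend([a, a - h])
    return out
```

Applying the transform recursively to the lows gives more pyramid levels;
entropy-coding the highs per level gives the progressive bitstream.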

I know that some wavelet-based algorithms can encode an image in such a
way that the decoder can decode it at any resolution. But what if one
already has a lossy version of the image and would like to reconstruct
the original? How much more data does the decoder need in order to
reconstruct the original one?

Thanks.