Fog Creek Software
Discussion Board




Compression Algorithm Suggestion


Can someone recommend an algorithm
for compressing 15,000 JPG files?

I'm aware that JPG is already a compressed format, but
can I reduce the size of those 15,000 JPG files any further by compiling them into one single file?

What I would like to do then ...
is to uncompress a certain area of this big file (which contains the 15,000 pics) so I can display the pic on a form
(it's for a Pocket PC application).

But maybe what I'm trying to achieve is not technically possible?

CycleBurner
Friday, December 19, 2003

You can't compress them further unless you decode and re-encode them with higher compression settings (which will degrade image quality), resize them smaller, or use a "better" compression codec than the one used originally.

Rick
Friday, December 19, 2003

Rick:

Yes, I want to decode them (in one single file) and then recode them later (in RAM).

CycleBurner
Friday, December 19, 2003

Actually, I just want to recode ONE SINGLE PICTURE IN RAM at a time ...

But I'm not sure if that's technically possible.

CycleBurner
Friday, December 19, 2003

While you may have to do some work to get it working on the Pocket PC, couldn't you accomplish this using any of the zip/rar/whatever compression packages out there?  Throwing all of the jpgs into a zip file, and then just pulling out the jpg you need when you need it seems like exactly what you want to do.


Friday, December 19, 2003

It sounds more like you just need to zip them up (using a zip library that allows NO compression) and then retrieve them by name. I wouldn't bother trying to compress the images further; JPEG is pretty good for photos (assuming they are photos).

Or you could write your own archiver :)

Dominic Fitzpatrick
Friday, December 19, 2003

arrrgh, can we have timestamps to prove I posted that at the same time :)

Dominic Fitzpatrick
Friday, December 19, 2003


I'm not sure if WinZip is available for Pocket PC

CycleBurner
Friday, December 19, 2003

"I'm aware that JPG is already a compressed format but
Can I reduce the size any further of those 15000 JPG files by compiling them into one single file."

No.  Image compression works a bit differently from (zip-style) file compression so you don't get any extra compression from compiling them into a single file.

Almost Anonymous
Friday, December 19, 2003

Cycle, "decode" means to decompress, and "recode" (encode) means to re-compress. To draw the image, you (or the system) must DEcode, not "recode".

Rick
Friday, December 19, 2003

Not sure about the file system on the Pocket PC, but putting them in a single file will probably save space just by getting rid of the wasted space at the end of each cluster caused by storing the files individually.

DJ
Friday, December 19, 2003

Phhh... zipping a jpeg file gives you minimal, if any, size reduction.

DP
Friday, December 19, 2003

To OP:  For this additional compression, are you looking for lossy compression or lossless compression?

Michael Kale
Friday, December 19, 2003

Just concatenate the jpgs into a single file?

A table listing the offset of each image at the front of the file should make seeking a breeze.

i like i
Friday, December 19, 2003
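
If you roll your own, the file layout really is trivial. Here's a minimal sketch in Python of what that could look like (the pack_images/read_image names and the little count-plus-offset-table header are just made up for illustration, not any standard format):

import struct

def pack_images(paths, out_path):
    # Header: image count, then one (offset, length) pair per image,
    # followed by the raw JPEG bytes back to back.
    with open(out_path, "wb") as out:
        out.write(struct.pack("<I", len(paths)))
        table_pos = out.tell()
        out.write(b"\x00" * (8 * len(paths)))   # reserve room for the offset table
        entries = []
        for p in paths:
            data = open(p, "rb").read()
            entries.append((out.tell(), len(data)))
            out.write(data)
        out.seek(table_pos)                      # go back and fill in the table
        for offset, length in entries:
            out.write(struct.pack("<II", offset, length))

def read_image(archive_path, index):
    # Seek straight to one picture; nothing else gets read or decoded.
    with open(archive_path, "rb") as f:
        f.seek(4 + 8 * index)
        offset, length = struct.unpack("<II", f.read(8))
        f.seek(offset)
        return f.read(length)

On the Pocket PC you'd do the same thing in whatever language the app is written in; the only real requirement is a file API that can seek.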

For better compression you could try out JPEG 2000.

ZIP won't help with file size any more (you'd best turn off compression entirely), but it'd neatly solve the problem of having everything in one big file. And with no compression the file format is _really_ easy: basically a header with some offsets. I once implemented such a reader and it was half an hour's work, an hour with testing.

Sebastian Wagner
Friday, December 19, 2003
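
For what it's worth, here's roughly what the store-only zip version looks like with Python's standard zipfile module (just an illustration; whatever zip library is available for the Pocket PC should have an equivalent "no compression" mode):

import zipfile

def build_archive(jpg_paths, archive_path):
    # ZIP_STORED turns compression off; the JPEGs wouldn't shrink anyway.
    with zipfile.ZipFile(archive_path, "w", zipfile.ZIP_STORED) as zf:
        for p in jpg_paths:
            zf.write(p)

def load_one(archive_path, name):
    # Pull out just the one picture you need, by name.
    with zipfile.ZipFile(archive_path, "r") as zf:
        return zf.read(name)   # raw JPEG bytes, ready for an image decoder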

Unless you are willing to have a loss of image quality that would result from decoding them and reencoding them with more compact settings, you aren't going to get much more than 1 or 2% compression by zipping them or using any other compression.

If you need them in 1 file for convenience, just zip or tar them up into 1 file (with compression turned off).

T. Norman
Friday, December 19, 2003

As everyone has pointed out, you cannot compress JPEGs any more without using lossy compression. I would recommend that you just choose a different image file format, such as JPEG2000.

runtime
Friday, December 19, 2003
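
If you do go the JPEG 2000 route, the conversion itself is nearly a one-liner in most imaging libraries. A sketch with the Pillow library (assuming a build with JPEG 2000/OpenJPEG support; note that re-encoding an already lossy JPEG costs a little more quality):

from PIL import Image   # assumes Pillow with JPEG 2000 support

def convert_to_jp2(jpg_path, jp2_path):
    img = Image.open(jpg_path)
    # Saving with the "JPEG2000" format; lossy rate/quality settings
    # are available if the default output isn't small enough.
    img.save(jp2_path, "JPEG2000")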

Another thought is to make sure that the images are sized appropriately and that you are not scaling them down to fit on the screen of the pocket pc.

I realize that this is bonehead-obvious, yet I've seen the same mistake repeatedly in some programs from vertical vendors.

Steve Barbour
Friday, December 19, 2003

If those JPEGs are similar in nature, you could put them together as a movie (an MPEG file), which would let you compress not only within one picture, as JPEG does, but across multiple pictures, since each one would be a frame in the movie.

Interestingly, MPEG applies the same intra-frame compression as JPEG to each frame.

Code Monkey
Friday, December 19, 2003

oh my... the things people do just to carry their porn collection on their pocket device...

.NET Developer
Friday, December 19, 2003

Decode all your JPEGs partially, to the pre-Huffman coding stage. From there you can use arithmetic coding and/or do an MPEG-like motion vector search. Theoretically that can compress better without losing quality. It really depends on how much processing time you're willing to spend.

JP2K Coder
Saturday, December 20, 2003

well, there might be a way, i think... if the jpgs are all similarly sized and have similar content (e.g., backgrounds are often the same), i would think you could break apart each jpg and interleave it into the larger repository file.

so the top-left pel of the first image would sit next to the top-left pel of the second image, followed by the top-left pel of the third image, and so forth.  in this manner, you might be able to use a zip algorithm to deflate certain areas of the interleaved jpgs...

again, depending upon the similarity of size and content...

dir at badblue com
Saturday, December 20, 2003

I don't agree that what the OP wants is impossible. It is true that putting one JPEG in a zip file will not significantly reduce its size; however, that is not what the OP wanted. He wanted to put a whole bunch in there, and that certainly gives the possibility of further compression, since although there is little redundancy within the individual files, there is surely some redundancy across all the files.

As a proof of concept, what if two of the pictures were identical? In that case the second picture could simply be a reference to the first. (And what I just described is, in very simplistic form, exactly how LZW compression works.) Of more practical significance, what if the pictures are all very similar? Then that similarity could be extracted as additional redundancy.

It would certainly be worth dumping all the pics into a zip file and trying to compress them.

Two other alternatives you might try: convert all the files back to an uncompressed format such as BMP, and then put all of those files into the zip (this will give the compressor access to more redundant data). Or, tessellate the pictures into one gigantic picture, and save that using GIF, JPEG, or zip.

I don't guarantee any of these suggestions will work, however I am sure that it is theoretically possible to get more compression than simply concatenating all the compressed files together.

Jessica Boxer
Saturday, December 20, 2003

Actually, in retrospect, using zip will probably not do it for you, because I suspect that zip compresses files individually and then concatenates them, so cross-file redundancy will not be eliminated.

One approach that may work would be the tessellation approach I mentioned earlier, or, more simply: convert the files to an uncompressed format such as BMP, concatenate them together (making an index of where each file starts), and then compress that one big file. Then you could see some significant reduction.

Jessica Boxer
Saturday, December 20, 2003
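
As a rough illustration of that last idea, here is a Python sketch (it assumes the Pillow imaging library for decoding, and uses zlib as a stand-in for whatever compressor you'd actually use):

import zlib
from PIL import Image   # assumed available; any JPEG decoder would do

def compress_across_files(jpg_paths):
    index = []              # remembers where each image's pixels start
    blob = bytearray()
    for p in jpg_paths:
        img = Image.open(p).convert("RGB")
        raw = img.tobytes()
        index.append((p, img.size, len(blob), len(raw)))
        blob.extend(raw)
    # One compression pass over everything, so redundancy shared between
    # pictures (repeated backgrounds, etc.) can actually be exploited.
    return index, zlib.compress(bytes(blob), 9)

Whether the result ends up smaller than the original JPEGs depends entirely on how similar the pictures really are; decoded pixels are much bigger than the JPEGs they came from, so the cross-file redundancy has a lot of ground to make up.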

Well, you can't find WinZip for the Pocket PC, but there are other programs. Total Commander is capable of zipping/unzipping if I'm not mistaken, and Resco Explorer does it as well.

Lucid Nightmare
Thursday, June 10, 2004
