> Was hoping to come up with some image compression scheme to save anything I
> could on the file size, for my proprietary file type for Compositor.

I have never studied compression algorithms, but just off the top of my head, I would experiment with something like this:

1. Determine the least frequently used byte value (LFV).
2. Insert a zero byte after each literal occurrence of the LFV to indicate it should retain its byte value.
3. Find the largest blocks repeated the most times (e.g., 10 blocks of 5 bytes would take precedence over 5 blocks of 9 bytes).
4. Use the LFV as a flag to indicate that the next byte (if non-zero) is an index into a block table, whose entries hold a 2-byte length and a 4-byte offset to the first instance of the block.
5. Replace each of the repeated blocks (after the first) with the LFV (1 byte) plus an index (1 byte) into the separate table.

You could iterate this process up to 255 times, ending up with repeated blocks which themselves hold markers for other repeated blocks.

The amount of compression would depend on how many times the LFV occurs (because you add a byte for each occurrence) and on how many times blocks are repeated: each replaced block shrinks to a 2-byte marker, so every replacement saves blocklength - 2 bytes, and the first replacement also pays for the 6-byte table entry, saving blocklength - 8.

Just an idea. I don't know how effective it would be, or how difficult it would be to implement. Could be this is the wheel some of the schemes have been riding on all along. :-)
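For what it's worth, here's a rough single-pass sketch of the idea in Python. It's a simplification of the steps above, not a full implementation: it handles only one table entry (so only marker index 1 is ever used), doesn't iterate, and uses a fixed block length. All the function and variable names are made up for illustration.

```python
from collections import Counter

def compress(data: bytes, block_len: int = 4):
    """One pass of the scheme: escape the least-frequent byte value
    (LFV), then replace repeats of the most common block with a
    2-byte marker (LFV + table index)."""
    # Step 1: the least frequently used byte value becomes the flag.
    counts = Counter(data)
    lfv = min(range(256), key=lambda b: counts.get(b, 0))

    # Step 3 (simplified): most-repeated block of block_len bytes,
    # skipping blocks that happen to contain the LFV.
    blocks = Counter(data[i:i + block_len]
                     for i in range(len(data) - block_len + 1)
                     if lfv not in data[i:i + block_len])
    best, best_count = (blocks.most_common(1) or [(None, 0)])[0]
    if best_count < 2:
        best = None                      # nothing worth replacing

    # Step 4: table entry = 2-byte length + 4-byte offset of the
    # first instance (offset is into the original data).
    table = b""
    if best is not None:
        first = data.find(best)
        table = len(best).to_bytes(2, "big") + first.to_bytes(4, "big")

    # Steps 2 and 5: escape literal LFVs with a zero byte, keep the
    # first instance of the block, replace later ones with a marker.
    out = bytearray()
    i = 0
    seen_first = False
    while i < len(data):
        if best is not None and data[i:i + block_len] == best:
            if not seen_first:
                seen_first = True
                out += best              # first instance kept as-is
            else:
                out += bytes([lfv, 1])   # marker: flag + table index
            i += block_len
        elif data[i] == lfv:
            out += bytes([lfv, 0])       # escaped literal LFV
            i += 1
        else:
            out.append(data[i])
            i += 1
    return bytes(out), table, lfv

def decompress(comp: bytes, table: bytes, lfv: int) -> bytes:
    out = bytearray()
    i = 0
    while i < len(comp):
        if comp[i] == lfv:
            idx = comp[i + 1]
            if idx == 0:                 # escaped literal LFV
                out.append(lfv)
            else:                        # block reference via table
                entry = table[(idx - 1) * 6: idx * 6]
                length = int.from_bytes(entry[:2], "big")
                offset = int.from_bytes(entry[2:6], "big")
                out += out[offset:offset + length]
            i += 2
        else:
            out.append(comp[i])
            i += 1
    return bytes(out)
```

Note the decoder resolves a marker by copying from its own already-decoded output, which works because the first instance of a block always precedes any reference to it. The escaping in step 2 is what makes that safe: a literal LFV byte can never be confused with a marker.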