
Efficient data compression of 100 bytes

Status
Not open for further replies.

123jack

Can anyone suggest the most efficient means of compressing 100 bytes of data?
By efficient I mean both the runtime code and the compressed result should be
as small as possible. The data must be accurately recoverable.
Time to compress is not a serious consideration, so it doesn't have to be very fast.

It should be implementable in C if possible.

Obviously I've looked around at the popular stuff, but I wondered (given that
time is not a great issue) whether there is an algorithm that stands out for this.


jack
 

Compression is based on the statistical properties of your data, so you need to know those properties before implementing a compression algorithm. Look, for example, at Huffman coding and the Lempel-Ziv algorithms.
Do you want lossless compression? Or lossy compression?
 

Even with compression, the overhead to compress 100 bytes would be larger than the amount of saving. In general, compression would add a block of data to the beginning of the 100 bytes, holding information on the decompression method and block identifiers. In only 100 bytes it would be difficult to make the overall result smaller. Compression works best on large data sets where the header is small in comparison to the part being reduced.

Best case is when the 100 bytes are all the same value, worst case is when they are all random. As 'htg' states, you need to know more about the properties of the data before even considering if compression is worth the effort.

Brian.
 

No guys - as I said, the data must be accurately recoverable.
And yes, I know LZ will be inefficient here, which is why I'm asking.
LZ is one form of compression; at its core it's related to Huffman
coding and to the scheme used for GIF images (LZW). There are many
other, very different methods, and I've implemented those myself in the past.

There are a couple of methods for doing this, but I'm not into
information theory, which is why I'm asking whether the more
math-oriented among you know of something for this sort of thing.
Remember, 100 bytes is only 800 bits of data.

jack
 

