Huffman Compression
TO DO

Build an array-based binary tree using each symbol's codeword.

Then, walk that tree (again and again) to decode the message.
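The two steps above can be sketched as follows. This is a minimal illustration, not the assignment's solution: the codeword table is hypothetical, the tree is stored in an array with the children of index i at 2*i + 1 and 2*i + 2, and leaves hold their symbol while internal slots hold None.

```python
codes = {"a": "0", "b": "10", "c": "11"}  # hypothetical codeword table

# Build the array: walk each codeword from the root (index 0) and place
# the symbol at the index the walk ends on.
size = 2 ** (max(len(c) for c in codes.values()) + 1) - 1
tree = [None] * size
for symbol, bits in codes.items():
    i = 0
    for bit in bits:
        i = 2 * i + 1 if bit == "0" else 2 * i + 2
    tree[i] = symbol

def decode(bitstream, tree):
    """Walk the array tree bit by bit; on reaching a leaf, emit and restart."""
    out, i = [], 0
    for bit in bitstream:
        i = 2 * i + 1 if bit == "0" else 2 * i + 2
        if tree[i] is not None:      # reached a leaf
            out.append(tree[i])
            i = 0                    # restart at the root
    return "".join(out)

print(decode("01011", tree))  # -> abc
```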


Not counting any of the overhead bits, calculate the compression ratio: decoded (i.e., ASCII) bits minus encoded bits, divided by decoded bits.
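A one-line sketch of that ratio, assuming 8 bits per ASCII character (the function name and sample counts are made up for illustration):

```python
def savings_ratio(num_chars, encoded_bits, bits_per_char=8):
    """Fraction of the uncompressed size saved: (decoded - encoded) / decoded."""
    decoded_bits = num_chars * bits_per_char
    return (decoded_bits - encoded_bits) / decoded_bits

# e.g. a 1000-character message (8000 ASCII bits) encoded in 4400 bits
print(savings_ratio(1000, 4400))  # -> 0.45
```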


Calculate the theoretical (Shannon) minimum number of bits to encode this message with any scheme. How close was Huffman?


minimum number of bits = -SUM(frequency * log2(probability))
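The formula above can be computed directly from a frequency table, where each symbol's probability is its frequency over the total count (the sample frequencies here are made up):

```python
import math

def shannon_minimum_bits(freqs):
    """-SUM(f * log2(f / total)), i.e. SUM(f * log2(total / f))."""
    total = sum(freqs.values())
    return sum(f * math.log2(total / f) for f in freqs.values())

print(shannon_minimum_bits({"a": 5, "b": 2, "c": 1}))  # about 10.39 bits
```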


Now instead start with plain text and encode it using Huffman.
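The encoding direction can be sketched with a standard heap-based Huffman build. This is one common approach, not necessarily the one intended here (e.g., it uses tuples rather than a TreeNode class), and the sample string is arbitrary:

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Build a Huffman codeword table for the symbols in text."""
    counts = Counter(text)
    if len(counts) == 1:                  # degenerate single-symbol message
        return {next(iter(counts)): "0"}
    # Heap entries: (frequency, tiebreaker, subtree). A subtree is either a
    # symbol (leaf) or a (left, right) pair (internal node).
    heap = [(f, i, s) for i, (s, f) in enumerate(counts.items())]
    heapq.heapify(heap)
    next_id = len(heap)
    while len(heap) > 1:                  # repeatedly merge the two rarest
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, next_id, (left, right)))
        next_id += 1
    codes = {}
    def walk(node, prefix):               # read codewords off the tree
        if isinstance(node, tuple):
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:
            codes[node] = prefix
    walk(heap[0][2], "")
    return codes

codes = huffman_codes("abracadabra")
encoded = "".join(codes[c] for c in "abracadabra")
print(len(encoded))  # 23 bits, versus 88 bits of 8-bit ASCII
```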

TreeNode


28 August 2017