Chapter 5: Q13E (page 162)
A long string consists of the four characters A, C, G, T; they appear with frequency 31%, 20%, 9%, and 40%, respectively. What is the Huffman encoding of these four characters?
Short Answer
One valid Huffman encoding of the characters is T: 0, A: 10, C: 110, G: 111, respectively (codeword lengths 1, 2, 3, and 3 for frequencies 40%, 31%, 20%, and 9%; the 0/1 labels on each branch may be swapped).
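The greedy construction behind this answer can be checked directly: repeatedly merge the two lowest-frequency subtrees, then read codewords off the resulting tree. A minimal sketch using Python's heapq (the function name and tie-breaking counter are my own, not from the textbook):

```python
import heapq
from itertools import count

def huffman_codes(freqs):
    """Build a Huffman code for a {symbol: frequency} map."""
    tie = count()  # unique tiebreaker so the heap never compares tree tuples
    heap = [(f, next(tie), sym) for sym, f in freqs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        f1, _, a = heapq.heappop(heap)  # two lightest subtrees...
        f2, _, b = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, next(tie), (a, b)))  # ...get merged
    codes = {}
    def assign(node, prefix):
        if isinstance(node, tuple):       # internal node: recurse both ways
            assign(node[0], prefix + "0")
            assign(node[1], prefix + "1")
        else:                             # leaf: record its codeword
            codes[node] = prefix or "0"
    assign(heap[0][2], "")
    return codes

# Frequencies from the exercise (percentages; only ratios matter).
codes = huffman_codes({"A": 31, "C": 20, "G": 9, "T": 40})
```

Whatever 0/1 labeling comes out, the codeword lengths are forced: 1 bit for T, 2 for A, and 3 each for C and G.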
In this problem, we will develop a new algorithm for finding minimum spanning trees. It is based upon the following property:
Pick any cycle in the graph, and let e be the heaviest edge in that cycle. Then there is a minimum spanning tree that does not contain e.
(a) Prove this property carefully.
(b) Here is the new MST algorithm. The input is an undirected graph G = (V, E) (in adjacency list format) with edge weights {w_e}.

sort the edges according to their weights
for each edge e, in decreasing order of w_e:
    if e is part of a cycle of G:
        G = G - e (that is, remove e from G)
return G

Prove that this algorithm is correct.
(c) On each iteration, the algorithm must check whether there is a cycle containing a specific edge e. Give a linear-time algorithm for this task, and justify its correctness.
(d) What is the overall time taken by this algorithm? Explain your answer.
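The algorithm in part (b) is the reverse-delete algorithm. A compact sketch of it (function names are mine), where the part-(c) cycle check is done the obvious way: edge e = (u, v) lies on a cycle exactly when v is still reachable from u after deleting e, which one DFS over the remaining edges decides in linear time:

```python
def connected(n, edges, s, t):
    """DFS: is t reachable from s using only the given edges?"""
    adj = [[] for _ in range(n)]
    for _, u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    seen, stack = {s}, [s]
    while stack:
        u = stack.pop()
        if u == t:
            return True
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                stack.append(v)
    return False

def reverse_delete_mst(n, edges):
    """edges: list of (weight, u, v) tuples over vertices 0..n-1.
    Returns the surviving edges: an MST if the input graph is connected."""
    result = sorted(edges, reverse=True)  # heaviest first
    for e in list(result):
        rest = [x for x in result if x is not e]
        if connected(n, rest, e[1], e[2]):  # e lies on a cycle: safe to drop
            result = rest
    return result
```

On the 4-cycle 0-1-2-3-0 with a chord, the two heaviest cycle edges are deleted and the three lightest survive, matching what Kruskal's algorithm would keep. This sketch pays one DFS per edge, illustrating the structure rather than the best achievable running time.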
Entropy: Consider a distribution over n possible outcomes, with probabilities p_1, p_2, ..., p_n.
a. Just for this part of the problem, assume that each p_i is a power of 2 (that is, of the form 1/2^k). Suppose a long sequence of m samples is drawn from the distribution and that for all i, the ith outcome occurs exactly m * p_i times in the sequence. Show that if Huffman encoding is applied to this sequence, the resulting encoding will have length

    m * sum_{i=1}^{n} p_i * log2(1 / p_i)

b. Now consider arbitrary distributions - that is, the probabilities p_i are not restricted to powers of 2. The most commonly used measure of the amount of randomness in the distribution is the entropy:

    sum_{i=1}^{n} p_i * log2(1 / p_i)
For what distribution (over outcomes) is the entropy the largest possible? The smallest possible?
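The extremes asked about can be checked numerically. A minimal sketch (the helper name is mine): the uniform distribution over n outcomes attains the maximum, log2(n), while a point mass on a single outcome attains the minimum, 0.

```python
from math import log2

def entropy(ps):
    """Shannon entropy sum_i p_i * log2(1/p_i), in bits.
    Terms with p_i = 0 contribute nothing, by convention."""
    return sum(p * log2(1 / p) for p in ps if p > 0)

# Uniform over 4 outcomes: entropy log2(4) = 2 bits, the maximum for n = 4.
# Point mass [1.0]: entropy 0, the minimum for any n.
```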
Suppose you are given a weighted graph G with a distinguished vertex s, in which all edge weights are positive and distinct. Is it possible for a tree of shortest paths from s and a minimum spanning tree of G to share no edges? If so, give an example; if not, give a reason.
Under a Huffman encoding of n symbols with frequencies f_1, f_2, ..., f_n, what is the longest a codeword could possibly be? Give an example set of frequencies that would produce this case.
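A quick experiment supports the worst case of n - 1 bits: if each frequency is at least the sum of all smaller ones (as in 1, 1, 2, 4, 8, ...), every merge folds in just one new leaf and the tree degenerates into a path. A sketch that tracks only subtree depth during the merges (the function name is mine):

```python
import heapq
from itertools import count

def max_codeword_length(freqs):
    """Huffman-merge the frequencies, recording each subtree's depth."""
    tie = count()  # unique tiebreaker for equal frequencies
    heap = [(f, next(tie), 0) for f in freqs]  # (frequency, tiebreak, depth)
    heapq.heapify(heap)
    while len(heap) > 1:
        f1, _, d1 = heapq.heappop(heap)
        f2, _, d2 = heapq.heappop(heap)
        # Merging two subtrees makes the deeper one one level deeper still.
        heapq.heappush(heap, (f1 + f2, next(tie), max(d1, d2) + 1))
    return heap[0][2]

# Each frequency >= sum of all smaller ones forces a path-shaped tree,
# so the rarest symbol's codeword reaches the maximum possible length.
freqs = [1, 1, 2, 4, 8, 16, 32]  # n = 7 symbols
```

By contrast, equal frequencies give a balanced tree with depth about log2(n).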
Give a linear-time algorithm that takes as input a tree and determines whether it has a perfect matching: a set of edges that touches each node exactly once.
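One linear-time approach (a sketch of the standard greedy idea, not necessarily the textbook's intended solution): root the tree, process vertices from the leaves upward, and match any still-unmatched vertex to its parent; if the parent is already taken, or an unmatched vertex has no parent, no perfect matching exists. Function and variable names here are mine:

```python
def has_perfect_matching(n, edges):
    """Greedy leaf-to-parent matching on a tree with vertices 0..n-1."""
    if n % 2:
        return False  # a perfect matching needs an even number of vertices
    adj = [[] for _ in range(n)]
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    # BFS from vertex 0 records each vertex's parent and a top-down order.
    parent, seen, order = [-1] * n, [False] * n, [0]
    seen[0] = True
    for u in order:  # appending during iteration is the usual BFS idiom
        for v in adj[u]:
            if not seen[v]:
                seen[v] = True
                parent[v] = u
                order.append(v)
    matched = [False] * n
    for u in reversed(order):  # children are handled before their parents
        if not matched[u]:
            p = parent[u]
            if p == -1 or matched[p]:
                return False  # u has no available partner
            matched[u] = matched[p] = True
    return True
```

Correctness hinges on the observation that a leaf, if matched at all, must be matched to its parent, so the greedy choice is forced at every step; each vertex and edge is touched a constant number of times, giving linear time.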
A feedback edge set of an undirected graph G(V,E) is a subset of edges E' that intersects every cycle of the graph. Thus, removing the edges E' will render the graph acyclic.
Give an efficient algorithm for the following problem:
Input: Undirected graph G(V,E) with positive edge weights {w_e}.
Output: A feedback edge set E' of minimum total weight.
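One way to see the structure of this problem: what remains after removing E' must be acyclic, so a minimum-weight feedback edge set is the complement of a maximum-weight spanning forest. A sketch using Kruskal's rule on edges in decreasing weight with a union-find (function names are mine; whether this matches the intended solution is an assumption):

```python
def min_feedback_edge_set(n, edges):
    """edges: list of (weight, u, v) with positive weights, vertices 0..n-1.
    Keep a maximum-weight spanning forest; return everything left over."""
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x
    feedback = []
    for w, u, v in sorted(edges, reverse=True):  # heaviest edges first
        ru, rv = find(u), find(v)
        if ru == rv:
            feedback.append((w, u, v))  # edge would close a cycle: remove it
        else:
            parent[ru] = rv  # edge joins two components: keep it
    return feedback
```

On a triangle with weights 1, 2, 3, the forest keeps the two heavier edges and the feedback set is the single lightest edge, which is the cheapest way to break the only cycle.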