Q19E Entropy: Consider a distribution over n possible outcomes

Entropy: Consider a distribution over n possible outcomes, with probabilities p_1, p_2, …, p_n.

a. Just for this part of the problem, assume that each p_i is a power of 2 (that is, of the form 1/2^k). Suppose a long sequence of m samples is drawn from the distribution and that for all 1 ≤ i ≤ n, the ith outcome occurs exactly m·p_i times in the sequence. Show that if Huffman encoding is applied to this sequence, the resulting encoding will have length

∑_{i=1}^{n} m p_i log(1/p_i)

b. Now consider arbitrary distributions; that is, the probabilities p_i are not restricted to powers of 2. The most commonly used measure of the amount of randomness in the distribution is the entropy:

∑_{i=1}^{n} p_i log(1/p_i)

For what distribution (over outcomes) is the entropy the largest possible? The smallest possible?

Short Answer

  1. If every p_i is a power of 2, the Huffman encoding of the sequence has total length ∑_{i=1}^{n} m p_i log(1/p_i).
  2. The uniform distribution p_i = 1/n has the largest entropy (log n); the point mass p_k = 1, p_i = 0 for i ≠ k has the smallest (0).

Step by step solution

01

Step 1: Explain the given information

Consider a distribution over n possible outcomes with probabilities p_1, p_2, …, p_n. Assume that each p_i is a power of 2. A long sequence of m samples is drawn from the distribution, and for all 1 ≤ i ≤ n the ith outcome occurs exactly m·p_i times in the sequence.

02

Step 2: Show the length of the encoding

(a)

Write p_i = 1/2^{k_i}, so the codeword we want for outcome i has length k_i = log(1/p_i). We first argue that in the Huffman tree every outcome with probability 1/2^k ends up at level k (taking the root to be level 0).

Since the p_i are powers of 2 summing to 1, the smallest probability 1/2^{k_max} must occur an even number of times: multiplying ∑ 1/2^{k_i} = 1 through by 2^{k_max} gives ∑ 2^{k_max − k_i} = 2^{k_max}, and every term is even except those with k_i = k_max, so the number of such terms is even. Huffman's algorithm therefore merges two outcomes of probability 1/2^{k_max} into a node of probability 1/2^{k_max − 1}, which is again a power of 2, and the same argument applies to the smaller instance. By induction, each outcome i sits at depth k_i in the final tree and receives a codeword of length k_i = log(1/p_i).

Outcome i occurs exactly m·p_i times in the sequence, so the total length of the Huffman encoding is ∑_{i=1}^{n} m p_i log(1/p_i).

Therefore, the length of the encoded sequence is ∑_{i=1}^{n} m p_i log(1/p_i).
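The identity can be checked numerically. The sketch below (an illustrative example, not part of the textbook solution) builds a Huffman tree with Python's standard heapq module for an assumed dyadic distribution and sequence length m, and compares the actual encoded length against ∑_{i=1}^{n} m p_i log(1/p_i).

```python
import heapq
from math import log2

def huffman_code_lengths(probs):
    """Build a Huffman tree over the given probabilities and return
    the codeword length (tree depth) assigned to each outcome."""
    # Heap entries: (probability, tie-breaker, {outcome: depth so far})
    heap = [(p, i, {i: 0}) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, d1 = heapq.heappop(heap)
        p2, j, d2 = heapq.heappop(heap)
        # Merging pushes every outcome in both subtrees one level deeper.
        merged = {k: d + 1 for k, d in {**d1, **d2}.items()}
        heapq.heappush(heap, (p1 + p2, j, merged))
    return heap[0][2]

# Dyadic distribution: every p_i is a power of 2 (illustrative values).
probs = [1/2, 1/4, 1/8, 1/8]
m = 8000                      # chosen so that every m * p_i is an integer
depths = huffman_code_lengths(probs)

encoded_length = sum(m * probs[i] * depths[i] for i in range(len(probs)))
predicted      = sum(m * p * log2(1 / p) for p in probs)
print(encoded_length, predicted)   # both 14000.0
```

Here the codeword lengths come out to 1, 2, 3, 3, exactly log(1/p_i) for each outcome, as the proof above predicts.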

03

Step 3: Determine the largest and smallest possible entropy

(b)

The entropy ∑_{i=1}^{n} p_i log(1/p_i) is largest for the uniform distribution p_i = 1/n, where it equals log n: spreading the probability evenly maximizes the average surprise per outcome. It is smallest for a point mass, p_k = 1 and p_i = 0 for all i ≠ k, where it equals 0: the outcome is certain, so there is no randomness.

Therefore, the largest possible entropy is log n (uniform distribution) and the smallest is 0 (point mass).
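As a quick sanity check on the two extremes, the entropy can be computed directly; the helper below and the choice n = 8 are illustrative.

```python
from math import log2

def entropy(probs):
    # H(p) = sum of p_i * log2(1/p_i), with the convention 0 * log(1/0) = 0
    return sum(p * log2(1 / p) for p in probs if p > 0)

n = 8
uniform = [1 / n] * n                   # p_i = 1/n for every outcome
point_mass = [1.0] + [0.0] * (n - 1)    # p_k = 1, all other p_i = 0

print(entropy(uniform))     # log2(8) = 3.0, the maximum over 8 outcomes
print(entropy(point_mass))  # 0.0, the minimum
```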


Most popular questions from this chapter

Consider an undirected graph G=(V,E) with nonnegative edge weights w_e ≥ 0. Suppose that you have computed a minimum spanning tree of G, and that you have also computed shortest paths to all nodes from a particular node s ∈ V. Now suppose each edge weight is increased by 1: the new weights are w'_e = w_e + 1.

(a) Does the minimum spanning tree change? Give an example where it changes or prove it cannot change.

(b) Do the shortest paths change? Give an example where they change or prove they cannot change.
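For intuition on part (b): paths with different numbers of edges gain different amounts when every weight grows by 1, so the shortest path can change (whereas every spanning tree has exactly |V| − 1 edges, so the relative order of spanning-tree weights, and hence the MST, is preserved). A toy numerical check, with made-up weights:

```python
# Two candidate s-t routes in a toy graph: the direct edge (s, t),
# and the two-hop path s-a-t. Weights chosen so the answer flips.
direct = [2.5]        # weight of edge (s, t)
two_hop = [1.0, 1.0]  # weights of edges (s, a) and (a, t)

before = min(sum(direct), sum(two_hop))            # 2.0, via s-a-t
after  = min(sum(w + 1 for w in direct),           # 3.5, via s-t
             sum(w + 1 for w in two_hop))          # 4.0
print(before, after)  # 2.0 3.5 -> the shortest path switched routes
```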

Give the state of the disjoint-sets data structure after the following sequence of operations, starting from singleton sets {1}, …, {8}. Use path compression. In the case of ties, always make the lower numbered root point to the higher numbered one.

union(1,2), union(3,4), union(5,6), union(7,8), union(1,4), union(6,7), union(4,5), find(1)
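One way to carry out this sequence is union by rank with path compression, breaking rank ties by pointing the lower-numbered root at the higher-numbered one as the problem specifies. The sketch below is one such implementation (the use of union by rank is an assumption, though it is the convention in this textbook) and prints the final parent pointers.

```python
def find(parent, x):
    # Find the root of x, applying path compression along the way.
    root = x
    while parent[root] != root:
        root = parent[root]
    while parent[x] != root:            # second pass: point every node on
        parent[x], x = root, parent[x]  # the search path directly at root
    return root

def union(parent, rank, a, b):
    ra, rb = find(parent, a), find(parent, b)
    if ra == rb:
        return
    if rank[ra] > rank[rb]:
        ra, rb = rb, ra                 # lower-rank root gets absorbed
    elif rank[ra] == rank[rb]:
        if ra > rb:                     # tie: lower-numbered root points
            ra, rb = rb, ra             # to the higher-numbered one
        rank[rb] += 1
    parent[ra] = rb

parent = {i: i for i in range(1, 9)}
rank = {i: 0 for i in range(1, 9)}
for a, b in [(1, 2), (3, 4), (5, 6), (7, 8), (1, 4), (6, 7), (4, 5)]:
    union(parent, rank, a, b)
find(parent, 1)
print(parent)  # {1: 8, 2: 8, 3: 4, 4: 8, 5: 8, 6: 8, 7: 8, 8: 8}
```

The final find(1) compresses the path 1 → 2 → 4 → 8, so only 3 (pointing at 4) is left off the root.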

A prefix-free encoding of a finite alphabet Γ assigns each symbol in Γ a binary codeword, such that no codeword is a prefix of another codeword. A prefix-free encoding is minimal if it is not possible to arrive at another prefix-free encoding (of the same symbols) by contracting some of the codewords. For instance, the encoding {0, 101} is not minimal, since the codeword 101 can be contracted to 1 while still maintaining the prefix-free property.

Show that a minimal prefix-free encoding can be represented by a full binary tree in which each leaf corresponds to a unique element of Γ, whose codeword is generated by the path from the root to that leaf (interpreting a left branch as 0 and a right branch as 1 ).

You are given a graph G=(V,E) with positive edge weights, and a minimum spanning tree T=(V,E') with respect to these weights; you may assume G and T are given as adjacency lists. Now suppose the weight of a particular edge e is modified from w(e) to a new value w'(e). You wish to quickly update the minimum spanning tree T to reflect this change, without recomputing the entire tree from scratch. There are four cases. In each case give a linear-time algorithm for updating the tree.

(a) e ∉ E' and w'(e) > w(e).

(b) e ∉ E' and w'(e) < w(e).

(c) e ∈ E' and w'(e) < w(e).

(d) e ∈ E' and w'(e) > w(e).

Give a linear-time algorithm that takes as input a tree and determines whether it has a perfect matching: a set of edges that touches each node exactly once.
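One linear-time approach (a standard greedy, offered here as a sketch rather than the textbook's intended solution): repeatedly match a leaf with its only neighbor and delete both; the tree has a perfect matching iff this process matches every vertex.

```python
from collections import deque

def tree_has_perfect_matching(n, edges):
    """Greedily match leaves with their only neighbors, working inward.
    A tree has a perfect matching iff this never strands a vertex."""
    if n % 2:
        return False                    # odd vertex count: impossible
    adj = {v: set() for v in range(n)}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    matched = [False] * n
    leaves = deque(v for v in range(n) if len(adj[v]) == 1)
    while leaves:
        leaf = leaves.popleft()
        if not adj[leaf]:               # already disconnected: skip
            continue
        partner = next(iter(adj[leaf]))
        if matched[leaf] or matched[partner]:
            return False                # leaf's only partner is taken
        matched[leaf] = matched[partner] = True
        # Remove both endpoints; neighbors of `partner` may become leaves.
        adj[leaf].clear()
        for w in adj[partner]:
            adj[w].discard(partner)
            if len(adj[w]) == 1:
                leaves.append(w)
        adj[partner].clear()
    return all(matched)

# Path on 4 vertices 0-1-2-3: perfect matching {(0,1), (2,3)}.
print(tree_has_perfect_matching(4, [(0, 1), (1, 2), (2, 3)]))  # True
# Star on 4 vertices: the center can match only one of its 3 leaves.
print(tree_has_perfect_matching(4, [(0, 1), (0, 2), (0, 3)]))  # False
```

The greedy is safe because a leaf has only one possible partner, so any perfect matching must use that edge.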

A feedback edge set of an undirected graph G=(V,E) is a subset of edges E' ⊆ E that intersects every cycle of the graph. Thus, removing these edges will render the graph acyclic.

Give an efficient algorithm for the following problem:

Input: Undirected graph G=(V,E) with positive edge weights w_e.

Output: A feedback edge set E' ⊆ E of minimum total weight ∑_{e∈E'} w_e.
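A standard approach (a sketch, not necessarily the intended solution): a minimum-weight feedback edge set is the complement of a maximum-weight spanning forest, so run Kruskal's algorithm on the edges in decreasing weight order and collect the rejected edges.

```python
def min_feedback_edge_set(n, edges):
    """Kruskal over edges in decreasing weight order keeps a maximum-weight
    spanning forest; the discarded edges form a minimum-weight feedback
    edge set, since every cycle must lose at least one edge."""
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x

    feedback = []
    for w, u, v in sorted(edges, reverse=True):
        ru, rv = find(u), find(v)
        if ru == rv:
            feedback.append((w, u, v))      # would close a cycle: remove it
        else:
            parent[ru] = rv                 # keep it in the forest
    return feedback

# Triangle 0-1-2 plus a pendant edge: only the cheapest triangle edge
# needs to be removed (example weights are illustrative).
edges = [(5, 0, 1), (4, 1, 2), (1, 0, 2), (7, 2, 3)]
print(min_feedback_edge_set(4, edges))  # [(1, 0, 2)]
```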
