Define self-information, entropy of long independent messages, information rate, symbol rate, and mutual information.
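For reference, the standard definitions asked for above can be written compactly (base-2 logarithms, units of bits; r_s denotes the symbol rate in symbols/s):

```latex
I(x_i) = \log_2 \frac{1}{p_i}                 % self-information of symbol x_i
H(X)   = \sum_i p_i \log_2 \frac{1}{p_i}      % entropy (bits/symbol)
R      = r_s \, H(X)                          % average information rate (bits/s)
I(X;Y) = H(X) - H(X/Y)                        % mutual information
```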

The output of an information source consists of 128 symbols, 16 of which occur with a probability of 1/32 and the remaining occur with a probability of 1/224. The source emits 1000 symbols per second. Assuming that the symbols are chosen independently, find the average information rate of this source.
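This problem is fully specified, so the arithmetic can be checked directly; a short sketch in Python:

```python
from math import log2

# 16 symbols with probability 1/32 and the remaining 112 with probability 1/224
probs = [1/32] * 16 + [1/224] * 112
assert abs(sum(probs) - 1.0) < 1e-12   # probabilities must sum to 1

# Entropy in bits/symbol: H = sum p * log2(1/p)
H = sum(p * log2(1/p) for p in probs)

# Average information rate R = r_s * H, with r_s = 1000 symbols/s
R = 1000 * H
print(f"H = {H:.4f} bits/symbol, R = {R:.1f} bits/s")
```

The two symbol groups each carry half the probability mass, so H = 0.5·log2(32) + 0.5·log2(224) ≈ 6.404 bits/symbol, giving R ≈ 6404 bits/s.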

For the Markov source model shown in the following figure: i) Compute the state probabilities. ii) Compute the entropy of each state. iii) Compute the entropy of the source.

State the properties of entropy.

A source emits one of the five symbols A, B, C, D & E with probabilities 1/4, 1/8, 1/8, 3/16 and 5/16 respectively in an independent sequence of symbols. Using Shannon's binary encoding algorithm, find the code word for each symbol. Also find the coding efficiency and redundancy.
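A minimal sketch of Shannon's binary encoding for this source: sort the probabilities in descending order, give each symbol a length l_i = ⌈log2(1/p_i)⌉, and take the first l_i bits of the binary expansion of the cumulative probability.

```python
from math import ceil, log2

def shannon_code(probs):
    """Shannon's binary encoding from a {symbol: probability} dict."""
    items = sorted(probs.items(), key=lambda kv: -kv[1])
    codes, F = {}, 0.0
    for sym, p in items:
        l = ceil(log2(1 / p))
        bits, frac = "", F
        for _ in range(l):              # binary expansion of F to l places
            frac *= 2
            bits += "1" if frac >= 1 else "0"
            frac -= int(frac)
        codes[sym] = bits
        F += p                          # cumulative probability
    return codes

probs = {"A": 1/4, "B": 1/8, "C": 1/8, "D": 3/16, "E": 5/16}
codes = shannon_code(probs)
H = sum(p * log2(1 / p) for p in probs.values())   # source entropy
L = sum(probs[s] * len(codes[s]) for s in probs)   # average code length
print(codes, f"efficiency = {H / L:.3%}")
```

With these probabilities the code lengths come out as {2, 2, 3, 3, 3}, L = 2.4375 bits/symbol, and the efficiency H/L is roughly 91%; redundancy is 1 − H/L.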

Construct a Shannon-Fano ternary code for the following ensemble and find the code efficiency and redundancy. Also draw the corresponding code tree.

Show that H(X,Y) = H(Y)+H(X/Y)
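One way to sketch the proof, starting from the definition of joint entropy and factoring p(x,y) = p(y) p(x/y):

```latex
H(X,Y) = -\sum_{x,y} p(x,y)\log_2 p(x,y)
       = -\sum_{x,y} p(x,y)\log_2\bigl[p(y)\,p(x/y)\bigr]
       = -\sum_{x,y} p(x,y)\log_2 p(y) \;-\; \sum_{x,y} p(x,y)\log_2 p(x/y)
       = -\sum_{y} p(y)\log_2 p(y) \;+\; H(X/Y)
       = H(Y) + H(X/Y)
```

The marginalization step uses \(\sum_x p(x,y) = p(y)\).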

The noise characteristics of a non-symmetric binary channel are given in the following figure. i) ii) Also find the capacity of the channel.

A source has an alphabet consisting of seven symbols A, B, C, D, E, F & G with probabilities 1/4, 1/4, 1/8, 1/8, 1/8, 1/16 and 1/16 respectively. Construct a quaternary Huffman code. Find the coding efficiency.
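A sketch of the D-ary Huffman construction for this source: repeatedly merge the D least probable nodes, padding with zero-probability dummies when (n − 1) mod (D − 1) ≠ 0 (here (7 − 1) mod 3 = 0, so no padding is needed). Only the code-word lengths are tracked, which is all the efficiency calculation requires.

```python
import heapq
from fractions import Fraction as F

def huffman_lengths(probs, D):
    """Code-word lengths of a D-ary Huffman code for {symbol: probability}."""
    heap = [(p, i, [s]) for i, (s, p) in enumerate(probs.items())]
    while (len(heap) - 1) % (D - 1) != 0:
        heap.append((F(0), len(heap), []))      # zero-probability dummy
    heapq.heapify(heap)
    depth = {s: 0 for s in probs}
    tick = len(heap)                            # unique tie-breaker for merged nodes
    while len(heap) > 1:
        merged, syms = F(0), []
        for _ in range(D):                      # merge the D least probable nodes
            p, _, ss = heapq.heappop(heap)
            merged += p
            syms += ss
        for s in syms:
            depth[s] += 1                       # merged symbols move one level down
        heapq.heappush(heap, (merged, tick, syms))
        tick += 1
    return depth

probs = dict(zip("ABCDEFG",
                 [F(1, 4), F(1, 4), F(1, 8), F(1, 8), F(1, 8), F(1, 16), F(1, 16)]))
depth = huffman_lengths(probs, D=4)
L = sum(probs[s] * depth[s] for s in probs)     # average quaternary digits/symbol
print(depth, "L =", L)
```

Here L = 11/8 quaternary digits/symbol; with H = 2.625 bits = 1.3125 quaternary units, the efficiency is about 95.5%.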

State Shannon-Hartley theorem and explain its implications.

A Gaussian channel has a bandwidth of 4 kHz and a two-sided noise power spectral density of watts/Hz. The signal power at the receiver has to be maintained at a level less than or equal to 1/10th of a milliwatt. Calculate the capacity of this channel.
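The PSD value in this question did not survive in this copy, so the calculation below uses an assumed N0/2 = 1e-9 W/Hz purely for illustration of the Shannon-Hartley formula C = B log2(1 + S/N):

```python
from math import log2

def awgn_capacity(bandwidth_hz, signal_w, psd_two_sided_w_per_hz):
    """Capacity of an AWGN channel by the Shannon-Hartley theorem."""
    # Noise power in bandwidth B for two-sided PSD N0/2: N = 2 * (N0/2) * B
    noise_w = 2 * psd_two_sided_w_per_hz * bandwidth_hz
    return bandwidth_hz * log2(1 + signal_w / noise_w)

# B = 4 kHz, S = 0.1 mW, N0/2 = 1e-9 W/Hz (assumed value)
C = awgn_capacity(4e3, 1e-4, 1e-9)
print(f"C = {C:.1f} bits/s")
```

With the assumed PSD, N = 8 μW, S/N = 12.5, and C ≈ 15 kbits/s; substitute the actual PSD from the question paper to get the intended answer.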

Explain the properties of mutual information.

What are the types of errors and types of codes in error control coding?

Consider a (6,3) linear code whose generator matrix is. i) Find all the code vectors. ii) Find all the Hamming weights. iii) Find the minimum weight and the parity check matrix. iv) Draw the encoder circuit for the above code.
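The generator matrix did not survive in this copy; as an illustration of steps i)–iii), the sketch below assumes a systematic (6,3) matrix G = [I3 | P] and enumerates all code vectors c = mG (mod 2):

```python
from itertools import product

# Assumed generator matrix (hypothetical, for illustration only)
G = [[1, 0, 0, 1, 1, 0],
     [0, 1, 0, 0, 1, 1],
     [0, 0, 1, 1, 0, 1]]

# All code vectors: c = m G (mod 2) over every 3-bit message m
codewords = [tuple(sum(m[i] * G[i][j] for i in range(3)) % 2 for j in range(6))
             for m in product([0, 1], repeat=3)]

weights = sorted(sum(c) for c in codewords)           # Hamming weights
d_min = min(w for w in weights if w > 0)              # min distance = min nonzero weight
print(codewords, weights, d_min)
```

For this assumed G the eight code vectors have weights {0, 3, 3, 3, 3, 4, 4, 4}, so d_min = 3; repeat the enumeration with the G given in the question paper.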

The parity check bits of a (7,4) Hamming code are generated by, where are the message bits. i) Find the generator matrix and the parity check matrix. ii) Prove that

Define Binary cyclic codes. Explain the properties of cyclic codes.

A (15,5) linear cyclic code has a generator polynomial. i) Draw the block diagram of an encoder for this code. ii) Find the code vector for the message polynomial in systematic form. iii) Is a code polynomial?
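The generator polynomial did not survive in this copy; the sketch below assumes g(x) = x^10 + x^8 + x^5 + x^4 + x^2 + x + 1 (the usual (15,5) BCH generator) to illustrate systematic encoding, c(x) = x^(n−k) m(x) + [x^(n−k) m(x) mod g(x)]:

```python
def gf2_mod(dividend, divisor):
    """Remainder of GF(2) polynomial division; polynomials as integers."""
    dlen = divisor.bit_length()
    while dividend.bit_length() >= dlen:
        dividend ^= divisor << (dividend.bit_length() - dlen)
    return dividend

g = 0b10100110111           # assumed g(x) = x^10 + x^8 + x^5 + x^4 + x^2 + x + 1
m = 0b10101                 # example 5-bit message (hypothetical)
shifted = m << 10           # x^(n-k) * m(x), n-k = 10
c = shifted ^ gf2_mod(shifted, g)   # append the parity (remainder) bits
print(f"codeword = {c:015b}, remainder check = {gf2_mod(c, g)}")
```

By construction the codeword is divisible by g(x) (the remainder check is 0), and the top 5 bits are the message itself, which is what "systematic form" means. Part iii) is answered the same way: a polynomial is a code polynomial iff its remainder modulo g(x) is zero.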

Write short notes on: a. BCH codes b. RS codes c. Golay codes d. Burst error correcting codes.

What are convolutional codes? Explain encoding of convolutional codes using transform domain approach.

Consider the (3,1,2) convolutional code with and i) Draw the encoder block diagram. ii) Find the generator matrix. iii) Find the code word corresponding to the information sequence (1 1 1 0 1) using the time domain approach.
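The generator sequences did not survive in this copy; the shift-register sketch below assumes g(1) = (1,1,0), g(2) = (1,0,1), g(3) = (1,1,1), a common textbook choice, to illustrate the time-domain encoding of part iii):

```python
# Assumed generator sequences for a (3,1,2) encoder (illustrative only)
G = [(1, 1, 0), (1, 0, 1), (1, 1, 1)]

def conv_encode(bits, gens, m=2):
    """Rate-1/3 convolutional encoding with m memory elements."""
    state = [0] * m
    out = []
    for b in list(bits) + [0] * m:      # append m zeros to flush the register
        window = [b] + state            # current input plus register contents
        out += [sum(g[i] * window[i] for i in range(m + 1)) % 2 for g in gens]
        state = window[:-1]             # shift the register
    return out

code = conv_encode([1, 1, 1, 0, 1], G)
print(code, len(code))                  # 3 output bits per input (and flush) bit
```

With 5 information bits and m = 2 flush bits, the output is 3 × 7 = 21 bits; swap in the generator sequences from the question paper to get the intended code word.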
