
Define self-information, entropy of long independent messages, information rate, symbol rate and mutual information.

(5 marks)

The output of an information source consists of 128 symbols, 16 of which occur with a probability of 1/32 while the remaining 112 occur with a probability of 1/224. The source emits 1000 symbols per second. Assuming that the symbols are chosen independently, find the average information rate of this source.
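A quick way to check the arithmetic (not an official solution) is to evaluate H = Σ pᵢ log2(1/pᵢ) over all 128 symbols and multiply by the symbol rate:

```python
import math

# Sanity check: 16 symbols at p = 1/32 and the remaining 112 at
# p = 1/224 (the probabilities sum to 1/2 + 1/2 = 1).
H = 16 * (1/32) * math.log2(32) + 112 * (1/224) * math.log2(224)
R = 1000 * H  # bits per second at 1000 symbols/s
print(f"H = {H:.4f} bits/symbol, R = {R:.1f} bits/s")
```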

(5 marks)

For the Markov source model shown in the following figure:
i) Compute the state probabilities.
ii) Compute the entropy of each state.
iii) Compute the entropy of the source.
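The figure is not reproduced in this copy, so the sketch below works parts (i)-(iii) for a hypothetical two-state Markov source; the transition matrix P is an assumption, not the one from the question:

```python
import math

# Hypothetical two-state Markov source (an assumption; the actual
# figure is not available). Row i gives transition probs out of state i.
P = [[0.7, 0.3],
     [0.4, 0.6]]

# i) Stationary state probabilities: for a 2-state chain the balance
# equation pi_0 * P[0][1] = pi_1 * P[1][0] gives the ratio directly.
ratio = P[1][0] / P[0][1]
pi = [ratio / (1 + ratio), 1 / (1 + ratio)]

# ii) Entropy of each state = entropy of its outgoing transition probs.
def H(row):
    return -sum(p * math.log2(p) for p in row if p > 0)
state_entropy = [H(row) for row in P]

# iii) Source entropy: stationary average of the state entropies.
H_source = sum(p_i * h_i for p_i, h_i in zip(pi, state_entropy))
print(pi, state_entropy, H_source)
```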

(10 marks)

State the properties of entropy.

(4 marks)

A source emits one of five symbols A, B, C, D and E with probabilities 1/4, 1/8, 1/8, 3/16 and 5/16 respectively, in an independent sequence of symbols. Using Shannon's binary encoding algorithm, find the code word for each symbol. Also find the coding efficiency and redundancy.
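One way to sketch Shannon's binary encoding: sort the probabilities in descending order, assign each symbol a length l_i = ceil(log2(1/p_i)), and read its code word off the first l_i bits of the binary expansion of the cumulative probability of the symbols before it.

```python
import math

def shannon_code(probs):
    """Shannon's binary encoding for a dict {symbol: probability}."""
    order = sorted(probs, key=probs.get, reverse=True)
    codes, F = {}, 0.0
    for s in order:
        length = math.ceil(math.log2(1 / probs[s]))
        # first `length` bits of the binary expansion of F
        bits, f = "", F
        for _ in range(length):
            f *= 2
            bits += str(int(f))
            f -= int(f)
        codes[s] = bits
        F += probs[s]
    return codes

probs = {"A": 1/4, "B": 1/8, "C": 1/8, "D": 3/16, "E": 5/16}
codes = shannon_code(probs)
L = sum(p * len(codes[s]) for s, p in probs.items())  # average code length
H = -sum(p * math.log2(p) for p in probs.values())    # source entropy
print(codes, "efficiency =", H / L, "redundancy =", 1 - H / L)
```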

(8 marks)

Construct a Shannon-Fano ternary code for the following ensemble and find the code efficiency and redundancy. Also draw the corresponding code tree, with

(8 marks)

Show that H(X,Y) = H(Y) + H(X|Y).
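The identity can also be checked numerically on any joint distribution; the 2x2 joint table below is an arbitrary example, not data from the question:

```python
import math

# Numerical check of the chain rule H(X,Y) = H(Y) + H(X|Y)
# on an example joint distribution p(x, y).
pxy = {(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.4}

def H(dist):
    """Entropy in bits of a distribution given as {outcome: probability}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Marginal p(y).
py = {}
for (x, y), p in pxy.items():
    py[y] = py.get(y, 0.0) + p

# H(X|Y) = sum over y of p(y) * H(X | Y = y).
H_X_given_Y = sum(
    pyv * H({x: pxy[(x, y)] / pyv for x in (0, 1)})
    for y, pyv in py.items()
)

print(H(pxy), "=", H(py) + H_X_given_Y)
```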

(5 marks)

The noise characteristics of a non-symmetric binary channel are given in the following figure.

i)
ii) Also find the capacity of the channel with
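The channel figure and its parameters are not reproduced here, so the sketch below assumes a hypothetical transition matrix and finds the capacity of a binary channel by brute-force maximisation of I(X;Y) over the input distribution:

```python
import math

# Hypothetical non-symmetric binary channel (an assumption; the actual
# figure is not available). Rows of P are P(y|x) for x = 0 and x = 1.
P = [[0.9, 0.1],
     [0.2, 0.8]]

def mutual_info(p0):
    """I(X;Y) in bits for the input distribution (p0, 1 - p0)."""
    px = [p0, 1 - p0]
    py = [sum(px[x] * P[x][y] for x in range(2)) for y in range(2)]
    I = 0.0
    for x in range(2):
        for y in range(2):
            pj = px[x] * P[x][y]
            if pj > 0:
                I += pj * math.log2(pj / (px[x] * py[y]))
    return I

# Capacity = max of I(X;Y) over input priors; a grid search suffices here.
C = max(mutual_info(i / 10000) for i in range(10001))
print(f"C = {C:.4f} bits/channel use")
```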

(10 marks)

A source has an alphabet consisting of seven symbols A, B, C, D, E, F and G with probabilities 1/4, 1/4, 1/8, 1/8, 1/8, 1/16 and 1/16 respectively. Construct a quaternary Huffman code and find the coding efficiency.
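A sketch of the r-ary Huffman construction (with r = 4; since (7 − 1) is divisible by r − 1 = 3, no dummy symbols are needed for this alphabet):

```python
import heapq
import math

# r-ary Huffman coding: repeatedly merge the r least probable nodes,
# prepending a distinct digit 0..r-1 to every symbol inside each node.
def huffman(probs, r):
    heap = [(p, [s]) for s, p in probs.items()]
    # pad with zero-probability dummies until (n - 1) % (r - 1) == 0
    while (len(heap) - 1) % (r - 1):
        heap.append((0.0, []))
    heapq.heapify(heap)
    codes = {s: "" for s in probs}
    while len(heap) > 1:
        merged_p, merged_syms = 0.0, []
        for digit in range(r):
            p, syms = heapq.heappop(heap)
            for s in syms:
                codes[s] = str(digit) + codes[s]
            merged_p += p
            merged_syms += syms
        heapq.heappush(heap, (merged_p, merged_syms))
    return codes

probs = {"A": 1/4, "B": 1/4, "C": 1/8, "D": 1/8, "E": 1/8,
         "F": 1/16, "G": 1/16}
codes = huffman(probs, 4)
L = sum(p * len(codes[s]) for s, p in probs.items())  # quaternary digits/symbol
H = -sum(p * math.log2(p) for p in probs.values())    # bits/symbol
print(codes, "efficiency =", H / (L * math.log2(4)))
```

Efficiency compares the entropy against L × log2(4) bits, since each quaternary digit carries up to two bits.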

(5 marks)

State the Shannon-Hartley theorem and explain its implications.

(8 marks)

A Gaussian channel has a bandwidth of 4 kHz and a two-sided noise power spectral density of  watts/Hz. The signal power at the receiver has to be maintained at a level less than or equal to one tenth of a milliwatt. Calculate the capacity of this channel.
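The PSD value is missing from this copy of the question, so the sketch below assumes a hypothetical two-sided PSD of 1e-9 W/Hz purely to illustrate the Shannon-Hartley computation C = B log2(1 + S/N):

```python
import math

# Shannon-Hartley capacity sketch. The PSD value below is an assumption;
# the actual number is missing from the question statement.
B = 4000          # bandwidth, Hz
S = 1e-4          # received signal power, W (0.1 mW as stated)
N0_half = 1e-9    # two-sided noise PSD, W/Hz (assumed value)

# In-band noise power over the band (-B, B) of the two-sided spectrum.
N = N0_half * 2 * B
C = B * math.log2(1 + S / N)
print(f"S/N = {S/N:.1f}, C = {C:.0f} bits/s")
```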

(6 marks)

Explain the properties of mutual information.

(6 marks)

What are the types of errors and types of codes in error control coding?

(4 marks)

Consider a (6,3) linear code whose generator matrix is.

i) Find all the code vectors.
ii) Find all the Hamming weights.
iii) Find the minimum weight and the parity check matrix.
iv) Draw the encoder circuit for the above code.
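The generator matrix itself is not reproduced here, so the sketch below uses a hypothetical systematic G = [I3 | P] to illustrate parts (i)-(iii):

```python
import itertools

# Hypothetical systematic generator matrix G = [I3 | P] (an assumption;
# the actual matrix is not available in this copy of the question).
G = [[1, 0, 0, 1, 1, 0],
     [0, 1, 0, 0, 1, 1],
     [0, 0, 1, 1, 0, 1]]

# i) All code vectors: c = m G (mod 2) over the 2^3 = 8 messages.
codewords = []
for m in itertools.product([0, 1], repeat=3):
    c = [sum(m[i] * G[i][j] for i in range(3)) % 2 for j in range(6)]
    codewords.append(c)

# ii) Hamming weights; the smallest nonzero weight equals d_min
# for a linear code.
weights = [sum(c) for c in codewords]
d_min = min(w for w in weights if w > 0)

# iii) For systematic G = [I | P], one parity check matrix is H = [P^T | I3].
H = [[G[i][3 + j] for i in range(3)] + [int(j == k) for k in range(3)]
     for j in range(3)]

# Sanity check: every codeword satisfies H c^T = 0 (mod 2).
assert all(sum(h[j] * c[j] for j in range(6)) % 2 == 0
           for c in codewords for h in H)
print(weights, "d_min =", d_min)
```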

(10 marks)

The parity check bits of a (7,4) Hamming code are generated by

where  are the message bits.
i) Find the generator matrix and the parity check matrix.
ii) Prove that

(6 marks)

Define binary cyclic codes. Explain the properties of cyclic codes.

(8 marks)

A (15,5) linear cyclic code has a generator polynomial,

i) Draw the block diagram of an encoder for this code.
ii) Find the code vector for the message polynomial  in systematic form.
iii) Is  a code polynomial?
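The generator polynomial is missing from this copy; as a stand-in, the sketch below uses the generator commonly quoted for the (15,5) triple-error-correcting BCH code, g(x) = 1 + x + x^2 + x^4 + x^5 + x^8 + x^10 (an assumption), and shows systematic encoding by polynomial division over GF(2):

```python
# Stand-in generator polynomial (an assumption; the actual g(x) is
# missing from this copy). Bit i of the integer is the coefficient of x^i:
# g(x) = 1 + x + x^2 + x^4 + x^5 + x^8 + x^10.
g = 0b10100110111
n, k = 15, 5

def poly_mod(a, b):
    """Remainder of GF(2) polynomial division a(x) mod b(x)."""
    db = b.bit_length() - 1
    while a.bit_length() - 1 >= db:
        a ^= b << (a.bit_length() - 1 - db)
    return a

def encode_systematic(m):
    """ii) Systematic encoding: c(x) = x^(n-k) m(x) + [x^(n-k) m(x) mod g(x)]."""
    shifted = m << (n - k)
    return shifted ^ poly_mod(shifted, g)

def is_codeword(c):
    """iii) c(x) is a code polynomial iff g(x) divides c(x)."""
    return poly_mod(c, g) == 0

m = 0b10011  # example message polynomial (also an assumption)
c = encode_systematic(m)
print(f"codeword: {c:015b}, valid: {is_codeword(c)}")
```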

(12 marks)

Write short notes on:
a. BCH codes.
b. RS codes.
c. Golay codes.
d. Burst error correcting codes.

(20 marks)

What are convolutional codes? Explain the encoding of convolutional codes using the transform domain approach.

(8 marks)

Consider the (3, 1, 2) convolutional code with
and
i) Draw the encoder block diagram.
ii) Find the generator matrix.
iii) Find the code word corresponding to the information sequence (1 1 1 0 1) using time domain approach.
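The generator sequences are missing from this copy, so the sketch below assumes hypothetical generators g1 = (1,1,0), g2 = (1,0,1), g3 = (1,1,1) and encodes the stated information sequence by time-domain (mod-2) convolution:

```python
# Hypothetical generator sequences for a (3,1,2) code (an assumption;
# the actual sequences are missing): memory m = 2, three output streams.
G = [(1, 1, 0), (1, 0, 1), (1, 1, 1)]

def conv_encode(u, generators, m=2):
    """Time-domain encoding: each output stream is u convolved (mod 2)
    with one generator sequence; m trailing zeros flush the memory."""
    u = list(u) + [0] * m
    out = []
    for t in range(len(u)):
        for g in generators:
            v = sum(g[i] * (u[t - i] if t - i >= 0 else 0)
                    for i in range(m + 1)) % 2
            out.append(v)
    return out

u = [1, 1, 1, 0, 1]  # the information sequence stated in the question
code = conv_encode(u, G)
print(code)
```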

(12 marks)