
    UNIT 1 - INFORMATION THEORY (Science, University)

    by VIGNESH M · 20 questions

    1. For M equally likely messages, the average amount of information H is

    answer choices

    H = log₁₀M

    H = log₂M

    H = log₁₀M²

    H = 2 log₁₀M

    2. The channel capacity is

    answer choices

    The maximum information transmitted by one symbol over the channel

    Information contained in a signal

    The amplitude of the modulated signal

    All of the above

    3. According to the Shannon–Hartley theorem,

    answer choices

    The channel capacity becomes infinite with infinite bandwidth

    The channel capacity does not become infinite with infinite bandwidth

    There is a tradeoff between bandwidth and signal-to-noise ratio

    The channel capacity does not become infinite with infinite bandwidth, and there is a tradeoff between bandwidth and signal-to-noise ratio

    4. The channel capacity according to Shannon's equation is

    answer choices

    Maximum error free communication

    Defined for optimum system

    Information transmitted

    All of the above

    5. The technique that may be used to increase average information per bit is

    answer choices

    Shannon-Fano algorithm

    ASK

    FSK

    Digital modulation techniques

    6. The code rate r, with k information bits and n total bits, is defined as

    answer choices

    r = k/n

    k = n/r

    r = k × n

    n = r × k

    7. Information rate is defined as

    answer choices

    Information per unit time

    Average number of bits of information per second

    rH

    All of the above

    8. The mutual information

    answer choices

    Is symmetric

    Is always non-negative

    Is symmetric and always non-negative

    None of the above

    9. Entropy is

    answer choices

    Average information per message

    Information in a signal

    Amplitude of signal

    All of the above

    10. A memoryless source refers to

    answer choices

    No previous information

    No message storage

    Emitted message is independent of previous message

    None of the above

    11. The expected information contained in a message is called

    answer choices

    Entropy

    Efficiency

    Coded signal

    None of the above

    12. In a discrete memoryless source, the current letter produced by the source is statistically independent of _____

    answer choices

    Past output

    Future output

    Past output & Future output

    None of the above

    13. The Huffman coding technique is adopted for constructing the source code with ________ redundancy.

    answer choices

    Maximum

    Constant

    Minimum

    Unpredictable

    14. Which type of channel does not represent any correlation between input and output symbols?

    answer choices

    Noiseless Channel

    Lossless Channel

    Useless Channel

    Deterministic Channel

    15. In a digital communication system, the smaller the code rate, the _________ are the redundant bits.

    answer choices

    less

    more

    equal

    unpredictable

    16. In the channel coding theorem, the channel capacity decides the _________ permissible rate at which error-free transmission is possible.

    answer choices

    Maximum

    Minimum

    Constant

    None of the above

    17. According to Shannon's second theorem, it is not feasible to transmit information over the channel with ______ error probability, whatever coding technique is used.

    answer choices

    small

    large

    stable

    unpredictable

    18. Which among the following is/are the essential condition/s for a good error control coding technique?

    answer choices

    Faster coding & decoding methods

    Better error correcting capability

    Maximum transfer of information in bits/sec

    All of the above

    19. Shannon and Weaver’s (1949) model of communication has the following components: ______.
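    Several of the questions above (Q5, Q11, Q13) turn on source coding: Huffman coding constructs a minimum-redundancy prefix code by repeatedly merging the two least probable symbols. A minimal Python sketch of the idea, using only the standard library; the symbol probabilities below are made-up example values, not taken from the quiz:

        import heapq

        def huffman_code(probs):
            # Each heap entry: (probability, tie-breaker, {symbol: codeword}).
            heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
            heapq.heapify(heap)
            count = len(heap)
            while len(heap) > 1:
                p1, _, c1 = heapq.heappop(heap)  # least probable subtree
                p2, _, c2 = heapq.heappop(heap)  # next least probable subtree
                # Merge: prefix '0' to one subtree's codewords, '1' to the other's.
                merged = {s: "0" + w for s, w in c1.items()}
                merged.update({s: "1" + w for s, w in c2.items()})
                heapq.heappush(heap, (p1 + p2, count, merged))
                count += 1
            return heap[0][2]

        probs = {"a": 0.4, "b": 0.3, "c": 0.2, "d": 0.1}  # example distribution
        codes = huffman_code(probs)
        avg_len = sum(probs[s] * len(w) for s, w in codes.items())
        print(codes, avg_len)

    For this example the average codeword length is 1.9 bits against an entropy of about 1.85 bits; the average length can never fall below the entropy, which is the sense in which Huffman coding achieves minimum redundancy (Q13).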

    Source: quizizz.com

    [Solved] The average information associated with an extremely likely message

    Question


    The average information associated with an extremely likely message is zero. What is the average information associated with an equally likely message?

    This question was previously asked in

    ESE Electronics 2012 Paper 2: Official Paper


    Zero

    Infinity

    Depends on total number of messages

    Depends on speed of transmission of the message

    Answer (Detailed Solution Below)

    Option 3 : Depends on total number of messages


    Detailed Solution


    The entropy of a probability distribution is the average amount of information obtained when drawing a symbol from that distribution.

    It is calculated as:

    $H = \sum_{i=1}^{M} p_i \log_2 \frac{1}{p_i}$ bits

    where $p_i$ is the probability of occurrence of the $i$-th symbol.

    The number of outcomes = M.

    Since all the outcomes are equally likely, the probability of each outcome happening is:

    $p = \frac{1}{M}$

    The average information will then be:

    $H = \sum_{i=1}^{M} \frac{1}{M} \log_2 \frac{1}{1/M} = \log_2 M$

    Hence, the entropy depends on the total number of messages.
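    This result is easy to verify numerically. A minimal Python sketch, where the message count M = 8 is an arbitrary example value:

        import math

        def entropy(probs):
            # H = sum of p_i * log2(1/p_i) over all symbols, in bits.
            return sum(p * math.log2(1 / p) for p in probs if p > 0)

        M = 8                      # example number of equally likely messages
        H = entropy([1 / M] * M)   # uniform distribution: p_i = 1/M
        print(H, math.log2(M))     # both print 3.0, confirming H = log2(M)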


    More Entropy Coding Questions

    Q1. _______ tells if the transmission rate is less than channel capacity, then there exists _______ that permit error-free transmission.
    Q2. In Huffman coding, data in a tree always occur at -
    Q3. Which of the following algorithms is the best approach for solving Huffman codes?
    Q4. Huffman coding gives -
    Q5. If the constraint length of an (n, k, L) convolution code is defined as the number of encoder output bits influenced by each message bit, then the constraint length is given by -
    Q6. Match List-I with List-II and select the correct answer using the code given below the lists:
    List-I: (A) Huffman code, (B) Error-correcting, (C) NRZ-coding, (D) Delta modulation
    List-II: (1) Elimination of redundancy, (2) Reduces bit rate, (3) Adapts the transmitted signal to the line, (4) Channel coding
    Q7. The Shannon-Hartley law -
    Q8. Let H(X) denote the entropy of a discrete random variable X taking K possible distinct real values. Which of the following statements is/are necessarily true?
    Q9. Let U and V be two independent and identically distributed random variables such that P(U = +1) = P(U = -1) = 1/2. The entropy H(U + V) in bits is
    Q10. What is the entropy of a communication system that consists of six messages with probabilities 1/8, 1/8, 1/8, 1/8, 1/4, and 1/4 respectively? (see the worked check below)
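    As a worked check for Q10 above, applying the entropy formula from the detailed solution:

    $H = 4 \cdot \tfrac{1}{8}\log_2 8 + 2 \cdot \tfrac{1}{4}\log_2 4 = 4 \cdot \tfrac{3}{8} + 2 \cdot \tfrac{1}{2} = 2.5$ bits.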

    More Information Theory Questions

    Q1. _______ tells if the transmission rate is less than channel capacity, then there exists _______ that permit error-free transmission.
    Q2. An event has two possible outcomes with probabilities P1 = 1/2 and P2 = 1/64. The rate of information with 16 outcomes per second is -
    Q3. In Huffman coding, data in a tree always occur at -
    Q4. Which of the following algorithms is the best approach for solving Huffman codes?
    Q5. Channel capacity is a measure of -
    Q6. Huffman coding gives -
    Q7. If the constraint length of an (n, k, L) convolution code is defined as the number of encoder output bits influenced by each message bit, then the constraint length is given by -
    Q8. Match List-I with List-II and select the correct answer using the code given below the lists:
    List-I: (A) Huffman code, (B) Error-correcting, (C) NRZ-coding, (D) Delta modulation
    List-II: (1) Elimination of redundancy, (2) Reduces bit rate, (3) Adapts the transmitted signal to the line, (4) Channel coding
    Q9. The Shannon-Hartley law -
    Q10. Directions: In the following question there are two statements, one labelled as 'Statement (I)' and the other as 'Statement (II)'. You are to examine these two statements carefully and select the answers to these items using the code given below. Statement (I): Shannon has shown that it is possible to achieve error-free communication by adding sufficient redundancy. Statement (II): The addition of an extra check digit increases redundancy.

    Source: testbook.com

    CareerRide--Digital Communication


    Created by Ernest_Villanueva

    Terms in this set (124)

    In a uniform quantization process,

    a. The step size remains the same

    b. Step size varies according to the values of the input signal

    c. The quantizer has linear characteristics

    d. Both a and c are correct

    d
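    Both correct choices can be seen in a minimal uniform (mid-rise) quantizer sketch in Python; the step size and sample values below are example numbers, not from the flashcard:

        import math

        def uniform_quantize(x, step):
            # Mid-rise uniform quantizer: every input falls into a bin of
            # constant width `step` and is mapped to that bin's midpoint,
            # so the input-output staircase is linear across the range.
            return (math.floor(x / step) + 0.5) * step

        step = 0.25  # example (fixed) step size
        for x in [0.1, 0.4, -0.3]:
            print(x, "->", uniform_quantize(x, step))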

    The process of converting the analog sample into discrete form is called

    a. Modulation

    b. Multiplexing

    c. Quantization

    d. Sampling

    c

    The characteristics of the compressor in μ-law companding are

    a. Continuous in nature

    b. Logarithmic in nature

    c. Linear in nature

    d. Discrete in nature

    a

    The modulation techniques used to convert an analog signal into a digital signal are

    a. Pulse code modulation

    b. Delta modulation

    c. Adaptive delta modulation

    d. All of the above

    d

    The sequence of operations in which PCM is done is

    a. Sampling, quantizing, encoding

    b. Quantizing, encoding, sampling

    c. Quantizing, sampling, encoding

    d. None of the above

    a

    In PCM, the parameter varied in accordance with the amplitude of the modulating signal is

    a. Amplitude

    b. Frequency

    c. Phase

    d. None of the above

    d

    One of the disadvantages of PCM is

    a. It requires large bandwidth

    b. Very high noise

    c. Cannot be decoded easily

    d. All of the above

    a

    The expression for bandwidth BW of a PCM system, where v is the number of bits per sample and fm is the modulating frequency, is given by

    a. BW ≥ v·fm

    b. BW ≤ v·fm

    c. BW ≥ 2v·fm

    d. BW ≥ (1/2)v·fm

    a
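    As a quick worked example with assumed values of v = 8 bits per sample and fm = 4 kHz, choice (a) gives BW ≥ v·fm = 8 × 4 kHz = 32 kHz.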

    The error probability of a PCM is

    a. Calculated using noise and inter symbol interference

    b. Gaussian noise + error component due to inter symbol interference

    c. Calculated using power spectral density

    d. All of the above

    d

    In Delta modulation,

    a. One bit per sample is transmitted

    b. All the coded bits used for sampling are transmitted

    c. The step size is fixed

    d. Both a and c are correct

    d
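    The two correct choices are easy to see in a minimal Python sketch of a delta modulator: each sample produces exactly one bit, and the approximation moves by a fixed step. The input samples and step size below are example values:

        def delta_modulate(samples, step):
            # One bit per sample: emit 1 if the input is at or above the
            # running approximation, else 0; the approximation then moves
            # up or down by the fixed step in the indicated direction.
            approx, bits = 0.0, []
            for x in samples:
                bit = 1 if x >= approx else 0
                approx += step if bit else -step
                bits.append(bit)
            return bits

        samples = [0.2, 0.5, 0.7, 0.6, 0.3]  # example input samples
        print(delta_modulate(samples, step=0.25))  # -> [1, 1, 1, 0, 0]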


    Source: quizlet.com
