Shannon entropy of logic gate output.

In summary, the conversation below works through the Shannon entropies of the outputs of two logic gates. The first gate (A' = NOT A, B' = NOT B) maps the four equally likely input combinations one-to-one onto four output combinations, so the output entropy equals the input entropy of 2 bits. The second gate (A' = A OR B, B' = A AND B) produces only three distinct joint outputs, giving an output entropy of 3/2 bits. The 'events' in the entropy formula are the joint input or output combinations, and in the second example the outputs A' and B' are not independent.
  • #1
Dazed&Confused

Homework Statement


A particular logic gate takes two binary inputs [itex] A[/itex] and [itex] B [/itex] and has two binary outputs [itex]A'[/itex] and [itex]B'[/itex]. I won't reproduce the truth table; suffice it to say every combination of [itex] A[/itex] and [itex]B[/itex] is given. The output is produced by [itex] A' = \text{NOT} \ A[/itex] and [itex] B' = \text{NOT} \ B [/itex]. The input has a Shannon entropy of 2 bits. Show that the output also has a Shannon entropy of 2 bits.

A second logic gate has output produced by [itex] A' = A \ \text{OR} \ B[/itex] and [itex] B' = A \ \text{AND} \ B [/itex]. Show that the output now has an entropy of [itex] \frac32 [/itex] bits.

Homework Equations


[tex] S = -k \sum_{i} P_i \log P_i [/tex]

The Attempt at a Solution


From what I (don't) understand, [itex] P = \frac12 [/itex] in the first example for [itex] A, B, A', B' [/itex], so the total number of bits is the same for both input and output. For the second example, I would say [itex] P_{A'} = \frac34 [/itex] and [itex] P_{B'} = \frac14 [/itex], but that does not produce the correct number of bits.
 
  • #2
It helps greatly if you draw up a truth table to represent the inputs and outputs. The [itex]P_{i}[/itex]'s that appear in the formula for Shannon entropy represent the probabilities of the various outcomes, so expressions like [itex]P_{A'}[/itex] make no sense at all.
As a quick example to illustrate the use of the Shannon entropy formula, consider the flipping of a fair coin. There are two possible outcomes - heads or tails - each of which occurs with 50% probability. The Shannon entropy is then [itex]S = - \frac{1}{2} \log_{2} \frac{1}{2} - \frac{1}{2} \log_{2} \frac{1}{2} = 1 [/itex] bit.
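If it helps to check such numbers directly, here is a minimal sketch in Python (my own illustration, not from the thread; the helper name is made up) that evaluates the Shannon entropy formula for a list of outcome probabilities:
[code=python]
from math import log2

def shannon_entropy(probs):
    # Shannon entropy in bits: S = -sum_i P_i * log2(P_i).
    # Zero-probability outcomes are skipped, since they contribute nothing.
    return -sum(p * log2(p) for p in probs if p > 0)

# Fair coin: two outcomes, each with probability 1/2 -> 1 bit
print(shannon_entropy([0.5, 0.5]))  # 1.0
[/code]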
 
  • #3
Thanks. There is a truth table, so do you mean I should add one here? By [itex] P_{A'} [/itex] I meant the probability of [itex] A' [/itex] being 1, but I agree this is confusing. I'm still not sure about the answer, though. It works out fine for the first example, as [itex] S = -4 \times \frac12 \log_2 \frac12 = 2 [/itex], but for the second example I think it is [tex]
S = - \frac32 \log_2 \frac34 - \frac12 \log_2 \frac14. [/tex]
 
  • #4
Take note that there are three outcomes for the second example. Unlike the first example, the 'events' A' and B' are not independent.
Your result doesn't agree with mine, so perhaps you might want to write out the truth table for this example?
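To make the point about joint outcomes concrete, here is a small sketch in Python (my own illustration; the thread itself only hints at the truth table) that enumerates the four equally likely inputs, tallies the joint outputs, and applies the entropy formula:
[code=python]
from collections import Counter
from math import log2

def shannon_entropy(probs):
    # S = -sum_i P_i * log2(P_i), in bits
    return -sum(p * log2(p) for p in probs if p > 0)

# Four equally likely inputs (A, B); the joint output is (A OR B, A AND B).
# Only three distinct outputs occur, so A' and B' are not independent.
outputs = Counter((a | b, a & b) for a in (0, 1) for b in (0, 1))
probs = [n / 4 for n in outputs.values()]  # (0,0): 1/4, (1,0): 1/2, (1,1): 1/4
print(shannon_entropy(probs))  # 1.5
[/code]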
 
  • #5
Oh, I see where I went wrong: I was confusing what the 'events' are. I have the correct answer now.
 

Related to Shannon entropy of logic gate output.

1. What is the Shannon entropy of logic gate output?

The Shannon entropy of logic gate output is a measure of the uncertainty or randomness in the output of a logic gate. It is named after Claude Shannon, who developed the concept of information entropy in communication theory.

2. How is Shannon entropy calculated for logic gate output?

Shannon entropy is calculated by taking the negative sum of the probabilities of each possible output multiplied by the logarithm of those probabilities: [itex] H = -\sum_x p(x) \log_2 p(x) [/itex], where [itex]p(x)[/itex] is the probability of each output and the base-2 logarithm gives the result in bits.
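For example, using the OR/AND gate from the thread above, whose three joint outputs occur with probabilities 1/4, 1/2, and 1/4:

[tex] H = -\tfrac14 \log_2 \tfrac14 - \tfrac12 \log_2 \tfrac12 - \tfrac14 \log_2 \tfrac14 = \tfrac12 + \tfrac12 + \tfrac12 = \tfrac32 \ \text{bits}. [/tex]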

3. What does a higher Shannon entropy value indicate for logic gate output?

A higher Shannon entropy value indicates a higher degree of uncertainty or randomness in the output of a logic gate. This means that the output is less predictable and contains more information.

4. How is Shannon entropy useful in evaluating logic gate performance?

Shannon entropy is useful in evaluating logic gate performance because it provides a quantitative measure of the uncertainty in the output. A lower entropy value indicates a more predictable output, which is desirable in many applications.

5. Can Shannon entropy be used to compare different logic gates?

Yes, Shannon entropy can be used to compare different logic gates. It allows for a standardized measurement of the randomness in the output of different gates, making it a useful tool for evaluating and comparing their performance.
