
Entropy Demystified

Naoki
10 min read · Jul 23, 2018


Is it disorder, uncertainty, or surprise?

The idea of entropy is confusing at first because so many words are used to describe it: disorder, uncertainty, surprise, unpredictability, amount of information, and so on. If the word “entropy” has left you confused, you are in the right place. I am going to demystify it for you.

Who Invented Entropy and Why?

In 1948, Claude Shannon introduced the concept of information entropy in his paper “A Mathematical Theory of Communication”.

Claude Shannon (source: https://en.wikipedia.org/wiki/Claude_Shannon)

Shannon was looking for a way to efficiently send messages without losing any information.

Shannon measured efficiency in terms of average message length, so he was thinking about how to encode the original message into the smallest possible representation. At the same time, he required that there be no information loss: the decoder at the destination must be able to restore the original message exactly.

Shannon defined entropy as the smallest possible average size of a lossless encoding of the messages sent from the source to the destination. He also showed how to calculate it, which is useful for making efficient use of a communication channel.
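For the curious, the standard formula for this quantity is H = −Σ p(x) log₂ p(x), measured in bits. Below is a minimal Python sketch of that calculation; the function name and the example distributions are my own illustration, not taken from this article.

```python
import math

def entropy(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Two equally likely outcomes need 1 bit per message on average.
print(entropy([0.5, 0.5]))  # 1.0

# A heavily skewed source can be encoded in well under 1 bit on average.
print(entropy([0.9, 0.1]))  # about 0.469
```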

This definition of entropy might not be obvious at this point. We will walk through many examples to see what it means to send messages efficiently and losslessly.

How Do We Make an Efficient and Lossless Encoding?

Suppose you want to send a message from Tokyo to New York regarding Tokyo’s weather today.

Imagine sending the full sentence, say “Today, Tokyo’s weather is fine.” Is this efficient?

In this scenario, the sender in Tokyo and the receiver in New York both know that they are communicating only about today’s weather in Tokyo. So we don’t need to send words like “Today”, “Tokyo’s”, and “weather”. We only need to say “Fine” or “Not fine”, and that’s it.
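If we treat the two outcomes as the only possibilities, a single bit is enough, provided both sides agree on the mapping in advance. Here is a minimal sketch of such a one-bit encoding; the encode/decode function names are illustrative, not from the article.

```python
# One-bit code for today's Tokyo weather, agreed on by sender and receiver.
CODE = {"Fine": 0, "Not fine": 1}
DECODE = {bit: weather for weather, bit in CODE.items()}

def encode(weather: str) -> int:
    return CODE[weather]    # only a single bit needs to be transmitted

def decode(bit: int) -> str:
    return DECODE[bit]      # the receiver restores the original message

assert decode(encode("Fine")) == "Fine"  # lossless round trip
```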
