I live the chaos, but I don't understand it :(
-
I have to use up my unused vacation days for 2023. I usually take this as an opportunity to do a little further training, and at the end of 2023 I decided to finally understand the term entropy in information technology once and for all. But I fail once again :( All I can confirm so far is the statement of John von Neumann (John von Neumann - Wikipedia[^]) to Claude Shannon (Claude Shannon - Wikipedia[^]): "You should call it entropy. [...] Nobody knows what entropy really is, so in a debate you will always have the advantage." :sigh: :laugh: [Edit] Btw., any idea which forum here is OK for asking questions about this topic? [Edit1] Now I think this has grown beyond a 'lounge' discussion, sorry for that. Still, I'm looking for a place where one can discuss it. I'm pretty sure that @KornfeldEliyahuPeter can help me with this ;)
-
There is a simple equation that defines entropy. Chemical Engineers make use of the term to describe the behavior of substances. We use it, for example, to evaluate the performance of a steam turbine. It has been misused by zealous promoters to obfuscate information.
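For reference, the simple equation alluded to is presumably the classical thermodynamic one (Clausius), and Boltzmann's statistical form is what connects it to counting states; a minimal LaTeX sketch:

```latex
% Clausius (macroscopic) definition: entropy change along a reversible path
\[ dS = \frac{\delta Q_{\mathrm{rev}}}{T} \]

% Boltzmann (statistical) definition, W = number of accessible micro-states
\[ S = k_B \ln W \]
```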
-
I think I'm quite comfortable with binary numbers. But even with the Google translation, I don't get the point of this:
Quote:
Since it is well known that 2^n different bit patterns can be formed with n bits, it is immediately clear that a system with 2^n states can be completely described with an n-bit file, so that in this case H=n
How can one describe 256 conditions in a file with one byte? I'm pretty sure I have a problem understanding the article, but I'd also be very happy if somebody could explain what I'm misinterpreting ;)
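For reference, the H in the quoted passage is presumably Shannon's entropy; for 2^n equally likely states it does work out to exactly n bits:

```latex
% Shannon entropy of a source with symbol probabilities p_i
\[ H = -\sum_i p_i \log_2 p_i \]

% With 2^n equally likely states, every p_i = 2^{-n}:
\[ H = -\sum_{i=1}^{2^n} 2^{-n} \log_2 2^{-n} = 2^n \cdot 2^{-n} \cdot n = n \]
```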
-
Thanks for this. My usual problem is that they state 'the probability is this and that' but give no explanation of why it is ;)
In the case of the video, the probabilities were made up to create two different machines. They then went on to illustrate how you could calculate the information entropy for each machine to compare them. The machine with the lower calculated entropy was more predictable, or better organized. The higher the calculated entropy, the more disorganized a system is, so the harder it is to predict what the output of the next cycle will be.
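A minimal sketch of that comparison in Python (the probabilities below are made up for illustration and are not necessarily the video's exact numbers):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)), skipping zero-probability symbols."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Two hypothetical "machines", each emitting one of four symbols per cycle:
machine_a = [0.25, 0.25, 0.25, 0.25]   # uniform -> hardest to predict
machine_b = [0.5, 0.25, 0.125, 0.125]  # skewed  -> more predictable

print(shannon_entropy(machine_a))  # 2.0 bits
print(shannon_entropy(machine_b))  # 1.75 bits -> lower entropy, "better organized"
```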
-
[4.6 Entropy - University Physics Volume 2 | OpenStax](https://openstax.org/books/university-physics-volume-2/pages/4-6-entropy)
-
My understanding is that a system at any particular moment is in a state, a single state. Of course a system cannot be in more than one state at any moment, unless we are discussing quantum mechanics, which I presume we are not. If it is known that the system can be in any one of 256 possible states, then at any moment only 8 bits are required to specify that state. QED
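A quick check in Python, assuming all 256 states are equally likely:

```python
import math

n_states = 256

# 8 bits give every one of the 256 states its own bit pattern:
print(math.log2(n_states))   # 8.0
print(format(0, '08b'))      # '00000000' -> state 0
print(format(255, '08b'))    # '11111111' -> state 255

# And if all 256 states are equally likely, the Shannon entropy is also 8 bits:
H = -sum((1 / n_states) * math.log2(1 / n_states) for _ in range(n_states))
print(H)                     # 8.0
```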
-
The video below gives a nice explanation of entropy in information theory: Information entropy | Journey into information theory | Computer Science | Khan Academy - YouTube[^]
-
Entropy is an attempt to give a macroscopic number that describes a large number of microscopic states. Taking the "classic" deck of cards, there is only 1 way of arranging the cards in suit order (Clubs, Diamonds, Hearts, Spades) with increasing order within each suit (2, 3, ..., J, Q, K, A). We take the log of the number of states, and the entropy is 0. If the suits can be in any order, we have 4! possibilities, with an entropy of log(4!). If the cards are in the order red-black-red-black, we have 26! * 26! possibilities, with an entropy of 2*log(26!). If the cards can be in any order, we have 52! possibilities, with an entropy of log(52!). This is also the maximum entropy for the card system. Each of the different ways to arrange the cards in the above examples is called a "micro-state". In physics/chemistry we usually multiply the entropy calculated above by Boltzmann's constant, to fit it in with other units such as temperature, energy, etc. The connection to information theory comes from the fact that the lower the entropy of the system, the easier it is to predict the next card.
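A quick numerical check of these figures (a sketch in Python, taking base-2 logs; using base e or multiplying by Boltzmann's constant only rescales the units):

```python
import math

def entropy_bits(microstates):
    """Entropy as the base-2 log of the number of micro-states."""
    return math.log2(microstates)

print(entropy_bits(1))                       # fully sorted deck: 0.0
print(entropy_bits(math.factorial(4)))       # suits in any order: log2(4!) ~ 4.58
print(2 * entropy_bits(math.factorial(26)))  # red-black-red-black: 2*log2(26!) ~ 176.8
print(entropy_bits(math.factorial(52)))      # any order at all: log2(52!) ~ 225.6 (the maximum)
```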
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows. -- 6079 Smith W.
-
Thank you very much for your answer. I will go through the card deck example and hope I will understand it in detail. Anyway, statements like
Quote:
Entropy is an attempt to give a macroscopic number that describes a large number of macroscopic states.
make it hard to understand. What exactly do 'macroscopic' and 'macroscopic' mean...?
-
The second 'macroscopic' should have been '_micro_scopic'. :-O Corrected in the original.
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows. -- 6079 Smith W.
-
You make an excellent observation. As with measurements of entropy, programs with lower cyclomatic complexity are more stable (better organized), their behavior is more predictable, and you have higher confidence in predicting their processing results. In both cases, lower is better. Thanks, I did not see the connection.
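For reference, cyclomatic complexity is McCabe's count of independent paths through a program's control-flow graph; assuming the standard definition:

```latex
% E = edges, N = nodes, P = connected components of the control-flow graph
\[ M = E - N + 2P \]
```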