I live the chaos, but I don't understand it :(
-
Here is a link to a YouTube video on Shannon information entropy. See if this makes more sense: Information entropy | Journey into information theory | Computer Science | Khan Academy - YouTube[^]
-
Learning this could be a first step https://machinelearningmastery.com/what-is-information-entropy/[^]
Thank you for this. I read a lot, but I'm stumbling again and again. E.g. from here: https://www.physik.uni-wuerzburg.de/fileadmin/11030300/_imported/fileadmin/tp3/ThermoEDynamik/Entropie.pdf[^]
Quote:
Since, as is well known, 2^n different bit patterns can be formed with n bits, it is immediately clear that a system with 2^n states can be completely described with an n-bit file, so that in this case H = n
For me nothing is clear... :( Especially that one can describe e.g. the 2^8 states in an 8-bit file. But most probably this is a lack of my understanding.
-
I have to use up my unused vacation for 2023. I usually use this as an opportunity to do a little further training. At the end of this year 2023, I have decided to finally understand the term entropy in information technology once and for all. But I fail once again :( All I can confirm so far is the statement of John von Neumann (John von Neumann - Wikipedia[^]) to Claude Shannon (Claude Shannon - Wikipedia[^]): "You should call it entropy. [...] Nobody knows what entropy really is, so in a debate you will always have the advantage." :sigh: :laugh: [Edit] Btw. any idea which forum here is ok to ask questions about that theme? [Edit1] Now I think it goes beyond a 'lounge' discussion, sorry for that. Still, I'm looking for a place where one can discuss it. I'm pretty sure that @KornfeldEliyahuPeter can help me with this ;)
0x01AA wrote:
Nobody knows what entropy really is
As a Mechanical Engineer by education, I can only say that Second Law of Thermodynamics dictates that the entropy of the Universe is continuously increasing, dS > 0.
-
As a suggestion: attempting to learn one part of a field well is probably always going to require learning more about the system that contains it first. So study information theory first.
-
I'm sure you understand it (but you don't know that you do)! Word width: 8 bits. Signed range: -128 to 127. Unsigned range: 0 to 255. 2^8 = 256, so 256 distinct values can be encoded with 8 bits, as shown above with 0 to 255. Now you should know that you already knew that :-D
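As a sketch of the counting above (plain Python, nothing beyond the usual two's-complement convention):

```python
# With n bits you can form 2^n distinct bit patterns.
n = 8
patterns = 2 ** n
print(patterns)  # 256

# Unsigned interpretation: 0 .. 2^n - 1
print(0, patterns - 1)  # 0 255

# Signed (two's complement) interpretation: -2^(n-1) .. 2^(n-1) - 1
print(-(patterns // 2), patterns // 2 - 1)  # -128 127
```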
-
I think I'm quite confident with binary numbers. But again, translated by Google, I don't get the point of this:
Quote:
Since it is well known that 2^n different bit patterns can be formed with n bits, it is immediately clear that a system with 2^n states can be completely described with an n-bit file, so that in this case H = n
How can one describe 256 states in a file with one byte? I'm pretty sure I have a problem understanding the article, but I'd also be very happy if somebody could explain what I'm misinterpreting ;)
-
There is a simple equation that defines entropy. Chemical Engineers make use of the term to describe the behavior of substances. We use it, for example, to evaluate the performance of a steam turbine. It has been misused by zealous promoters to obfuscate information.
-
Thanks for this. My usual problem is they state 'the probability is this and that' but give no explanation of why it is ;)
In the case of the video, the probabilities were made up to create two different machines. They then went on to illustrate how you could calculate the information entropy for each machine to compare them. The machine with the lower calculated entropy was more predictable, or better organized. The higher the calculated entropy, the more disorganized a system is, and the harder it is to predict what the output of the next cycle will be.
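A minimal sketch of that comparison (Python; the two distributions are made up here, just as in the video):

```python
import math

def entropy_bits(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Two hypothetical 4-symbol machines:
machine_a = [0.25, 0.25, 0.25, 0.25]   # uniform: least predictable
machine_b = [0.5, 0.25, 0.125, 0.125]  # skewed: more predictable

print(entropy_bits(machine_a))  # 2.0 bits
print(entropy_bits(machine_b))  # 1.75 bits (lower entropy, better organized)
```

The uniform machine needs the full 2 bits per symbol on average, while the skewed one can in principle get by with 1.75.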
-
[4.6 Entropy - University Physics Volume 2 | OpenStax](https://openstax.org/books/university-physics-volume-2/pages/4-6-entropy)
-
My understanding is that a system at any particular moment is in a state, a single state. Of course a system cannot be in more than one state at any moment, unless of course we are discussing quantum mechanics, which I presume we are not. If it is known that the system can be in any one of 256 possible states, then at any moment only 8 bits are required to specify that state. QED
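That argument in a tiny Python sketch (the state value 173 is just an arbitrary example of my own):

```python
import math

# A system with 256 possible states: labeling any single state
# takes ceil(log2(256)) = 8 bits, i.e. exactly one byte.
states = 256
bits_needed = math.ceil(math.log2(states))
print(bits_needed)  # 8

# Encoding one concrete state in a single byte:
state = 173
encoded = state.to_bytes(1, "big")
print(len(encoded))  # 1
print(int.from_bytes(encoded, "big") == state)  # True
```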
-
The video below gives a nice explanation of entropy in information theory: Information entropy | Journey into information theory | Computer Science | Khan Academy - YouTube[^]