Book covering basic physics/electronics relevant to computer science?
-
After programming for a number of years without coming from a comp sci background, I've recently been spending some time reading more theoretical books about how processors work, how networks/data communications work, etc. So currently I'm learning top-down and it's going OK. I feel, though, that as I didn't take physics at school beyond age 13 there are some fundamental foundations I still don't have a handle on that would help me understand the whole picture. Particularly the absolute basics of electronics: what electricity is, what voltage is, how logic gates are physically implemented, explained in a way a layman can understand. I'm also flaky on things like the physics of sound waves and how they relate to various audio formats. Has anyone come across a good book that explains basic physics for computer science from the ground up?
-
Physics for computer science? Not much, from what I remember. In CompSci we never really dug below high-level logic-gate stuff (basic layout with VHDL); that was left for the engineers. I suggest going back to basic school books (high school, early college courses) and seeing whether they bring back what you learned at the time.
-
MartinSmith wrote:
Particularly the absolute basics of electronics. What electricity is, What voltage is, how logic gates are physically implemented in a way a layman can understand.
Unfortunately, transistors work on quantum mechanical principles; according to classical physics they wouldn't work at all. The closest you're going to get in layman's terms is: "When a voltage is applied to the control lead, MAGIC HAPPENS and it connects the input and output leads. When the voltage is removed from the control, the connection is broken." At the level of how logic gates are built up from transistors (as switches), any decent computer architecture book should cover it.
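That "magic switch" picture is actually enough to build gates in code. Here's a toy behavioral sketch - the function names `nmos`, `pmos` and `cmos_nand` are my own, not from any library, and real devices have thresholds and analog behavior this completely ignores:

```python
# Toy model: a transistor as an ideal voltage-controlled switch
# (the "MAGIC HAPPENS" box above). Booleans stand in for high/low voltage.

def nmos(gate: bool) -> bool:
    """NMOS-style switch: conducts when its gate is high."""
    return gate

def pmos(gate: bool) -> bool:
    """PMOS-style switch: conducts when its gate is low."""
    return not gate

def cmos_nand(a: bool, b: bool) -> bool:
    """CMOS NAND: two PMOS in parallel pull the output high,
    two NMOS in series pull it low."""
    pull_up = pmos(a) or pmos(b)      # parallel PMOS network
    pull_down = nmos(a) and nmos(b)   # series NMOS network
    assert pull_up != pull_down       # exactly one network conducts
    return pull_up

for a in (False, True):
    for b in (False, True):
        print(int(a), int(b), int(cmos_nand(a, b)))
```

Everything else (AND, OR, adders...) can be wired up from NAND alone, which is why the architecture books stop at this level.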
Today's lesson is brought to you by the word "niggardly". Remember kids, don't attribute to racism what can be explained by Scandinavian language roots. -- Robert Royall
-
I've got an EE degree, so I should know most of this stuff, but it's all gotten a little foggy in the last 8 years or so. You can figure out what voltage and electricity physically are by reading a circuit analysis or first-year college physics text. Basically, all you're going to get is that voltage is the electrical potential difference between two points, current is the movement of electrons past a point, and electricity is electrons, I guess. How a logic gate is physically implemented is something I don't think anybody should really want to know about :( Although if you understand diodes you can get a qualitative understanding of how transistors work, and then you can move on to groups of transistors forming logic gates. If you really want to know exactly how a transistor works, you're going to be dealing with insanely long equations, and things only get worse when you put them together into a gate.
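Those quantities fit into the two formulas any first-year circuits text opens with: Ohm's law V = I·R and power P = V·I. A minimal sketch, with arbitrary example values:

```python
# Ohm's law and electrical power, as plain functions.

def current(voltage_v: float, resistance_ohm: float) -> float:
    """Ohm's law: current (amps) through a resistor."""
    return voltage_v / resistance_ohm

def power(voltage_v: float, current_a: float) -> float:
    """Power (watts) dissipated across a voltage drop."""
    return voltage_v * current_a

i = current(5.0, 1000.0)   # 5 V across 1 kOhm -> 5 mA
p = power(5.0, i)          # -> 25 mW
print(i, p)
```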
-
Thanks all, I've ordered myself a couple of general textbooks on physics and electronics; I'll see how far they go in bridging the gaps. I'm quite happy to treat some of it as a black box (or magic!), especially if it starts veering into hard-core maths, as really I'm just trying to reduce the size of that gap in my knowledge and get at least some sort of conceptual overview. I'd still be grateful for recommendations of any particular texts I should be looking at.
-
Here's a link to the text we used in a class I took on designing logic circuits with transistors: here[^]. It's going to be very technical, but it won't go too deep into the quantum effects that make transistors work. I really don't know of any books that are more qualitative in describing these kinds of circuits. The best explanation I can give of a transistor is to put two diodes facing each other:
a -|>|--x--|<|-- b
and put a wire going into that x in the middle. Now you have a voltage-controlled switch with the control point at x. If you put a signal at point a with x connected to ground, you don't get anything at b; but if you connect x to a voltage, the switch closes and you get a -> b.
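That switch behavior can be written down as a one-liner. Purely illustrative - the diode-pair picture is only a mnemonic for the junction structure, and two physical diodes wired like that would not actually make a working transistor:

```python
# Behavioral model of the voltage-controlled switch described above:
# the signal at a reaches b only when the control point x is driven high.

def transistor_switch(a_signal: bool, x_control: bool) -> bool:
    """Pass a's signal through to b when x is high; block it when x is grounded."""
    return a_signal and x_control

print(transistor_switch(True, False))  # x grounded: nothing at b
print(transistor_switch(True, True))   # x at a voltage: a -> b
```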
-
I am an electronics engineer who programs - so the exact opposite of you! At college I used 'The Art of Electronics' by Horowitz and Hill extensively, and I still pick it up from time to time. You will find it anywhere; it is often considered 'the' electronics reference, and I think it is quite readable. See how you get on with the books you have ordered, and keep the title in mind for the future. Happy reading! :)
Ali
-
I don't know the book you referred to, but I happen to know its author - he is a Belgian guy I went to uni with, and is now a professor at Berkeley. I'm sure he knows all you want to know and more; I don't know whether he manages, or even intends, to explain it in layman's terms, though. Wouldn't a stroll through Wikipedia be more appropriate for satisfying some curiosity? :)
Luc Pattyn [Forum Guidelines] [My Articles]
I use ListBoxes for line-oriented text output (not TextBoxes), and PictureBoxes for pictures (not drawings).
-
thebeekeeper wrote:
a -|>|--x--|<|-- b
I can only add one thing to this discussion, especially for a newbie. The circuit notation for a diode has an arrow pointing in one direction or the other. The notation was developed in the dark ages, when electricity was thought to flow from + to -, and the arrow indicated that direction of positive current flow. When more recent (middle ages) electronics discovered that it is really electron flow from - to +, it was too late to change the notation, so the arrow on a diode will forever indicate the direction of conventional (positive) current flow; electron flow is always against the arrow. Dave.
-
Not quite. The flow-of-positive-charge convention predates the diode, and so does the discovery that it is actually the negative charge that moves. The positive-charge-flow convention does have some benefits over the negative-flow convention, which is probably a big factor in why it persists - most notably that it makes the direction of the induced magnetic field around a wire a right-hand-rule phenomenon, just like torque in rotational systems and vector cross products. Doing it the other way would splatter an extra batch of negative signs into Maxwell's equations, which are already enough of a pain without having to worry more about sign conventions.
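The sign-convention point can be made concrete with the Lorentz force F = q·v×B: flip the sign of the charge and the force direction flips, which is exactly what sticking with conventional (positive) current saves you from tracking. A small sketch with arbitrary values:

```python
# Lorentz force F = q * (v x B): the cross product follows the
# right-hand rule for a positive charge; a negative charge (an actual
# electron) feels the opposite force.

def cross(u, v):
    """3-D vector cross product."""
    return (u[1]*v[2] - u[2]*v[1],
            u[2]*v[0] - u[0]*v[2],
            u[0]*v[1] - u[1]*v[0])

def lorentz_force(q, v, b):
    """Force on a charge q moving with velocity v in field b."""
    return tuple(q * c for c in cross(v, b))

v = (1.0, 0.0, 0.0)   # velocity along +x
B = (0.0, 1.0, 0.0)   # magnetic field along +y

print(lorentz_force(+1.0, v, B))  # points along +z (right-hand rule)
print(lorentz_force(-1.0, v, B))  # an electron is pushed along -z
```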
-
I think about the only overlapping science is Boolean algebra. IMO you don't even need more than high-school math, and science would only be beneficial if you actually had to apply it.
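That overlap is small enough to demonstrate in a few lines - for example, checking De Morgan's laws, a classic Boolean-algebra identity, by exhaustive truth table:

```python
# Verify De Morgan's laws over every combination of inputs.
from itertools import product

for a, b in product((False, True), repeat=2):
    assert (not (a and b)) == ((not a) or (not b))
    assert (not (a or b)) == ((not a) and (not b))

print("De Morgan's laws hold for all inputs")
```

With only two variables there are just four cases, so brute force settles the identity completely.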
xacc.ide - now with TabsToSpaces support
IronScheme - 1.0 beta 1 - out now!
((lambda (x) `((lambda (x) ,x) ',x)) '`((lambda (x) ,x) ',x))
-
What! You mean that everything my daddy taught me when I was 7 years old wasn't the whole truth? Of course he was often heard to opine "Gimmy my horse". He didn't get along well with technology.
-
I am a 3rd-year EE major at UC San Diego. I also do programming. I've been on both sides of the fence and can tell you that you don't need an understanding of electronics at the transistor level to program. I find it helpful to know the gate level and everything up, especially the von Neumann architecture. But if you are curious about the magic underneath, here are some books I would recommend:
- Semiconductor Device Fundamentals, Pierret
- Microelectronic Circuits, Sedra and Smith
- Digital Integrated Circuits, Rabaey et al.
The first will give you a good understanding of how the ideal device equations are derived from first principles. The second will introduce you to all the major devices - diodes, BJTs, MOS, CMOS - and how to use them at the circuit level; it leans towards analog circuits like amplifiers. The third is probably what you want: it goes over the CMOS manufacturing process, the MOS transistor, interconnects, combinational logic, sequential logic, arithmetic building blocks, and memory structures. But if I were you, I would just read Wikipedia, because electronics texts are long, dry, and hard to read.
-
+1 - I was just going to post that. "Code: The Hidden Language of Computer Hardware and Software" by Charles Petzold[^] is a great book that goes from the ground up. I would highly recommend it to any level of programmer. In the book, he constructs a binary adding machine out of telegraph relays - it is pretty informative, to say the least :D
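For anyone curious what Petzold's relay machine boils down to logically, here is a sketch (my own function names, not from the book) of a ripple-carry adder built only from gate-level operations - no arithmetic below the bit level:

```python
# A binary adding machine in gate logic: AND (&), OR (|), XOR (^) on
# single bits, chained full adders carrying into one another.

def full_adder(a: int, b: int, carry_in: int):
    """One-bit full adder: returns (sum_bit, carry_out)."""
    s = a ^ b ^ carry_in
    carry_out = (a & b) | (carry_in & (a ^ b))
    return s, carry_out

def add_bits(x_bits, y_bits):
    """Add two equal-length little-endian bit lists, ripple-carry style."""
    carry = 0
    out = []
    for a, b in zip(x_bits, y_bits):
        s, carry = full_adder(a, b, carry)
        out.append(s)
    out.append(carry)  # final carry becomes the top bit
    return out

# 5 + 3, with bits least-significant first:
print(add_bits([1, 0, 1, 0], [1, 1, 0, 0]))  # -> [0, 0, 0, 1, 0], i.e. 8
```

In the book the same structure is realized with relays instead of Python operators; the wiring diagram and this code compute the identical function.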
DrewG, MCSD .Net
-
I was going to suggest that book as well. It's a great read and very entertaining. It's not so much a physics book, but it does start out very simply with basic logic gates and progresses to explain the circuitry and logic of how computers work. I enjoyed this book so much that I'm going to read his latest book on Alan Turing. Linky[^]
-
I would also vote for "The Art of Electronics" - great book. I originally started out studying electronics and then somehow fell into more computer-related programming/scripting/application packaging and support work. It is one of the few books that still has pride of place on the shelf, and it occasionally gets thumbed through even 15+ years after its purchase (OMG!!! I hadn't realised how old it was... :) )
-
The basics of computer operation don't rely on electronics - there are many ways to make logic gates, with many different technologies. Electronics is just the handiest way to do it: size, speed and cost are all optimized. But things change, and who knows how we'll be making them in the future.

For example, imagine making logic gates with water, pipes and valves - just like your home plumbing system. If we have a piece of piping with two valves in series, then water can only flow through the pipe if both valves are open - this is essentially an AND gate. If the two valves had been in parallel, then opening either of them would allow water to flow - an OR gate. Making an inverter (NOT gate) is trickier - imagine a valve in the pipe which works when it is turned the opposite way from normal. The cunning thing is to make the valves operated by water pressure rather than by hand: now the output of one bit of plumbing can control another bit of plumbing. Since any computer can be built from logic gates, you end up being able to build a computer - very large, very slow... and a bit wet.

Transistors used in digital electronics are doing just the same thing. (Most other people here have been describing "bipolar transistors", which are rarely used in digital logic these days.) Nearly everything uses CMOS transistors now, which are very close to the water valves described above. The 5-volt power supply is equivalent to the water supply: the more volts, the greater the pressure. (Too many volts and you get a leak... well, an arc, or a failure of some sort.) The flow of electricity is equivalent to the flow of water. CMOS transistors just act as valves - but valves controlled by a voltage (pressure) supplied from another part of the circuit. Their operation is a bit more like treading on a garden hose, but the principle is the same: pressure (voltage) controls flow (current).

When you design a CMOS transistor circuit, you generally try to avoid constantly running current - instead you attempt to route voltage (pressure), or the lack of it, around the place, rather than dealing with flowing currents. Why? Well, the pipes can be thinner, the supply doesn't have to provide much water, and pressure changes travel quickly. In most CMOS circuits, current only flows in tiny amounts, and only during some transition - the slight amount of water needed to pressurize the next valve down the line, for example. This tells you why the power consumption of a circuit increases as it is clocked at higher speeds.
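That last point - power rising with clock speed - is usually captured by the first-order dynamic power formula P = alpha * C * V^2 * f: charge moved per transition, times how often transitions happen. A sketch with made-up ballpark numbers, just to show the trend:

```python
# First-order CMOS switching power: activity factor * capacitance *
# supply voltage squared * clock frequency. Values below are invented
# illustrative numbers, not measurements of any real chip.

def dynamic_power(activity: float, capacitance_f: float,
                  v_supply: float, freq_hz: float) -> float:
    """Dynamic power (watts) of a switched CMOS node: alpha * C * V^2 * f."""
    return activity * capacitance_f * v_supply**2 * freq_hz

for f in (1e9, 2e9, 4e9):
    p = dynamic_power(activity=0.1, capacitance_f=1e-9, v_supply=1.0, freq_hz=f)
    print(f"{f/1e9:.0f} GHz -> {p:.2f} W")
```

Doubling the clock doubles the transitions per second and so doubles this term, which is the trend the paragraph above describes (real chips add leakage current on top).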
-
Boylestad - search for it. It's not an easy read, but that's about as dumbed-down as you can get and still be able to apply the physics behind the reality. Electrical theory has to come first; then you can build the fundamentals of gate theory on top of it. Without the electrical theory down pat, gate theory is NOT going to make any sense. I was weaned on Boylestad and came out with an understanding that I haven't seen in many other people, including masters and doctorates.