Code Project
Book covering basic physics/electronics relevant to computer science?

The Lounge · 34 posts, 24 posters
Tags: learning, game-dev, help, question
  • M MartinSmith

Thanks all, I've ordered myself a couple of general textbooks on physics and electronics; I'll see how far they go in bridging the gaps. I'm quite happy to treat some of it as a black box (or magic!), especially if it starts veering into hardcore maths, as really I'm just trying to reduce the size of that gap in my knowledge and have at least some sort of conceptual overview. I'd still be grateful for recommendations of any particular texts that I should be looking at.

thebeekeeper (#6)

Here's a link to the text that we used in a class I took on designing logic circuits with transistors: here[^] That one is going to be very technical, but it won't go too deep into the quantum effects that make transistors work. I don't know of any books that are more qualitative in describing these kinds of circuits. The best explanation I can give of a transistor is two diodes placed back to back:

    a -|>|--x--|<|-- b

    and put a wire going into that x in the middle. Now, you have a voltage controlled switch with the control point at x. If you put a signal at point a with x connected to ground, you don't get anything at b, but if you connect x to a voltage, the switch closes and you get a -> b.
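thebeekeeper's voltage-controlled switch can be sketched as a toy model in code (an idealized on/off switch that ignores all real device physics; the function name is mine):

```python
def transistor(x_high: bool, a: bool) -> bool:
    """Idealized voltage-controlled switch: the signal at `a` only
    reaches `b` while the control point `x` is driven high."""
    return a if x_high else False

# x connected to ground: nothing gets through to b
assert transistor(x_high=False, a=True) is False
# x connected to a voltage: the switch closes and you get a -> b
assert transistor(x_high=True, a=True) is True
```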

• M MartinSmith

LittleYellowBird (#7)

I am an electronics engineer who programs - so the exact opposite of you! At college I used the book 'The Art of Electronics' by Horowitz and Hill extensively, and I still pick it up now from time to time. You will find it anywhere; it is often considered 'the' electronics reference, and I think it is quite readable. See how you get on with the books you have ordered, and keep this title in mind for the future. Happy reading! :)

      Ali

• T thebeekeeper

Luc Pattyn (#8)

I don't know the book you referred to, but I happen to know its author: he is a Belgian guy I went to uni with, and he is now a professor at UC Berkeley. I'm sure he knows all you want to know and more; I don't know whether he manages, or even intends, to explain it in layman's terms, though. Wouldn't a stroll through Wikipedia be more appropriate for satisfying some curiosity? :)

        Luc Pattyn [Forum Guidelines] [My Articles]


        I use ListBoxes for line-oriented text output (not TextBoxes), and PictureBoxes for pictures (not drawings).


• T thebeekeeper

Member 4194593 (#9)

thebeekeeper wrote:

a -|>|--x--|<|-- b

I can only add one thing to this discussion, especially for a newbie. The circuit notation for a diode has an arrow pointing in one direction or the other. This notation was developed in the dark ages, when electricity was thought to flow from + to -. The arrow indicated this direction of positive current flow. When (middle-ages) electronics discovered that it is really electron flow from - to +, it was too late to change the notation, so the arrow in a diode symbol will forever indicate the direction of conventional (positive) current flow; electron flow is always against the arrow. Dave.

• M Member 4194593

Dan Neely (#10)

Not quite. The flow-of-positive-charge convention predates the diode, and so does the discovery that it is actually the negative charge that moves. The positive-charge-flow convention does have some benefits over the negative-flow convention, which is probably a big factor in why it persists: most notably, it makes the direction of the induced magnetic field around a wire a right-hand-rule phenomenon, just like torque in rotational systems and vector cross products. Doing it the other way would splatter an extra batch of negative signs into Maxwell's equations, which are already enough of a pain without more sign conventions to worry about.
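The right-hand-rule point can be illustrated with a small cross-product check (a toy direction calculation only; magnitudes and physical constants are ignored):

```python
def cross(u, v):
    """Right-handed cross product u x v for 3-vectors."""
    return (u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0])

# Conventional current along +x; for a field point on the +y axis,
# the Biot-Savart direction dl x r points along +z, i.e. the field
# circles the wire exactly as the right hand predicts.
dl = (1, 0, 0)   # direction of (conventional) current flow
r = (0, 1, 0)    # unit vector from the wire to the field point
assert cross(dl, r) == (0, 0, 1)
# Flip the current convention and the sign flips too:
assert cross((-1, 0, 0), r) == (0, 0, -1)
```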

            • M MartinSmith

After programming for a number of years without coming from a comp-sci background, I've recently been spending some time reading more theoretical books about how processors work, how networks and data communications work, etc. So currently I'm learning top-down and it's going OK. I feel, though, that as I didn't take physics at school beyond age 13, there are some fundamental foundations I still don't have a handle on that would help me understand the whole picture. Particularly the absolute basics of electronics: what electricity is, what voltage is, how logic gates are physically implemented, in a way a layman can understand. I'm also flaky on things like the physics of sound waves and how it relates to various audio formats. Has anyone come across any good book that explains basic physics for computer science from the ground up?

leppie (#11)

I think about the only overlapping science is Boolean algebra. IMO you don't even need more than high-school math, and science would only be beneficial if you actually had to apply it.

              xacc.ide - now with TabsToSpaces support
              IronScheme - 1.0 beta 1 - out now!
              ((lambda (x) `((lambda (x) ,x) ',x)) '`((lambda (x) ,x) ',x))

• D Dan Neely

Member 4194593 (#12)

What! You mean that everything my daddy taught me when I was 7 years old wasn't the whole truth? Of course, he was often heard to opine "Gimme my horse". He didn't get along well with technology.

• M MartinSmith

Barry H (#13)

                  Try Code by Charles Petzold.

• M MartinSmith

makkak (#14)

I am a 3rd-year EE major at UC San Diego. I also do programming. I've been on both sides of the fence and can tell you that you don't need an understanding of electronics at the transistor level to program. I find it helpful to know the gate level and everything up, especially the von Neumann architecture. But if you are curious about the magic underneath, here are some books I would recommend:

  • Semiconductor Device Fundamentals, Pierret
  • Microelectronic Circuits, Sedra and Smith
  • Digital Integrated Circuits, Rabaey et al.

The first will give you a good understanding of how the ideal device equations are derived from first principles. The second will introduce you to all the major devices (diodes, BJTs, MOS, CMOS) and how to use them at the circuit level; this book leans towards analog circuits like amplifiers. The third is probably what you want: it goes over the CMOS manufacturing process, the MOS transistor, interconnects, combinational logic, sequential logic, arithmetic building blocks, and memory structures. But if I were you, I would just read Wikipedia, because electronics texts are long, dry, and hard to read.

                    • B Barry H

                      Try Code by Charles Petzold.

werD (#15)

+1 I was just going to post that. "Code: The Hidden Language of Computer Hardware and Software" by Charles Petzold[^] is a great book that goes from the ground up. I would highly recommend it to any level of programmer. In the book, he constructs a binary adding machine out of telegraph relays; it is pretty informative, to say the least :D
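The relay adding machine Petzold builds up can be sketched with Boolean functions standing in for relay networks (a rough sketch of the idea, not the book's actual construction):

```python
# Each gate is something a relay network can implement directly.
def AND(a, b): return a and b
def OR(a, b):  return a or b
def XOR(a, b): return a != b

def half_adder(a, b):
    """Add two bits: returns (sum, carry)."""
    return XOR(a, b), AND(a, b)

def full_adder(a, b, carry_in):
    """Chain two half adders to add three bits."""
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, OR(c1, c2)

# 1 + 1 + 1 = 3 = binary 11: sum bit 1, carry bit 1
assert full_adder(True, True, True) == (True, True)
# 1 + 0 + 0 = 1: sum bit 1, no carry
assert full_adder(True, False, False) == (True, False)
```

Cascading full adders, one per bit position, gives a multi-bit adder, which is essentially what the telegraph-relay machine in the book does.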

                      DrewG, MCSD .Net

                      • B Barry H

                        Try Code by Charles Petzold.

Jason Denton (#16)

                        I was going to suggest that book as well. It's a great read and very entertaining. It's not so much a physics book, but it does start out very simple with basic logic gates and progresses to explain the circuitry and logic of how computers work. I enjoyed this book so much that I’m going to read his latest book on Alan Turing. Linky[^]

                        • M MartinSmith

                          After programming for a number of years without coming from a comp sci background. I've recently been spending some time reading some more theoretical books about how processors work, how networks/data communications work etc. So currently I'm learning from top down and it's going OK. I feel though that as I didn't take physics at school beyond age 13 there are some fundamental foundations that I still don't have a handle on, that would help me understand the whole picture. Particularly the absolute basics of electronics. What electricity is, What voltage is, how logic gates are physically implemented in a way a layman can understand. Also I'm flaky on things like the physics of sound waves and how it relates to various audio formats. Has anyone come across any good book that explains basic physics for computer science from the ground up?

sibrowne (#17)

                          Not a book, but the Royal Institution Christmas Lectures this year were about microprocessors. They were broadcast on Channel 5 in the UK and look like they are still available to view online here[^]. I think it's UK only though. The first couple covered the basics like how transistors work.

• L LittleYellowBird

MattBeavis (#18)

I would also vote for "The Art of Electronics": great book. I originally started out studying electronics and then somehow fell into more computer-related programming/scripting/application-packaging and support work. It is one of the few books that still has pride of place on the shelf, and it occasionally gets thumbed through even 15+ years after its purchase (OMG!!! I hadn't realised how old it was... :) )

• M MartinSmith

mi5ke (#19)

The basics of computer operation do not rely on electronics: there are many ways to make logic gates, with many different technologies. Electronics is just the handiest way to do it (size, speed, and cost are all optimized). But things change, and who knows how we'll be making them in the future.

For example, imagine making logic gates with water, pipes, and valves, just like your home plumbing system. If we have a piece of piping with two valves in series, then water can only flow through the pipe if both valves are open; this is essentially an AND gate. If the two valves had been in parallel, then opening either of them would allow water to flow; this is an OR gate. Making an inverter (NOT gate) is trickier: imagine having a valve in the pipe which works when it is turned the opposite way from normal. The cunning thing is now to make the valves be operated by water pressure, rather than by hand. Now the output of one bit of plumbing can control another bit of plumbing. Since any computer can be built from logic gates, you end up being able to build a computer: very large, very slow... and a bit wet.

Transistors used in digital electronics are just doing the same thing. (Most other people here have been describing "bipolar transistors", which are rarely used for digital logic these days.) Everything uses CMOS transistors now, which are very close to the water valves described above. The 5-volt power supply is equivalent to the water supply; the more volts, the greater the pressure. (Too many volts and you get a leak... well, an arc, or a failure of some sort.) The flow of electricity is equivalent to the flow of water. CMOS transistors just act as valves, but ones that can be controlled by voltage (pressure) supplied from another part of the circuit. Their operation is a bit more like treading on a garden hose, but the principle is the same: pressure (voltage) controls flow (current).

When you design a CMOS transistor circuit, you generally try to avoid constantly running current; instead you attempt to route voltage (pressure), or the lack of it, around the place, rather than having to deal with flowing currents. Why? Well, the pipes can be thinner, the supply doesn't have to provide much water, and pressure changes travel quickly. In most CMOS circuits, current only flows in tiny amounts, and only during some transition: the slight amount of water needed to pressurize the next valve down the line, for example. This tells you why the power consumption of circuits increases as they are run at higher speeds.
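The plumbing analogy maps directly onto code, with booleans standing in for water pressure (a sketch of the analogy only, not of real hydraulics or real CMOS):

```python
def series(a, b):
    """Two valves in series: water flows only if both are open (AND)."""
    return a and b

def parallel(a, b):
    """Two valves in parallel: water flows if either is open (OR)."""
    return a or b

def inverting_valve(pressure):
    """The 'opposite-way' valve: pressure on the control shuts it (NOT)."""
    return not pressure

# Pressure-operated valves let one stage drive the next, e.g. a NAND,
# from which every other gate (and hence a whole computer) can be built:
def nand(a, b):
    return inverting_valve(series(a, b))

assert series(True, False) is False    # one closed valve blocks the pipe
assert parallel(True, False) is True   # either open valve lets water flow
assert nand(True, True) is False
```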

• M MartinSmith

Brian W King (#20)

Boylestad. Search for it. It's not an easy read, but that's about as dumbed down as you can get and still be able to apply the physics behind the reality. Electrical theory has to come first; then you can build the fundamentals of gate theory on top of that. Without the electrical theory down pat, though, gate theory is NOT going to make any sense. I was weaned on Boylestad and came out with an understanding that I haven't seen in many other people, including masters and doctorates.

• M MartinSmith

tom1443 (#21)

I'm an EE with an MS in Comp Sci who works as an embedded programmer. Honestly, after the first year of college practically nobody except research scientists and chip developers gives a rat's ass about the physics of electronics. Well, maybe that is not totally true, but close.

If your purpose is to learn some practical electronics that can help you be a better programmer, I would get one of those science kits from Radio Shack that lets you breadboard a bunch of circuits. They usually come with an experiment book and all the components, including resistors, capacitors, LEDs, potentiometers, timer/counters, op-amps, transistors, and logic gates, and will give you a good start on the basics. I bought one for my teenage son and he loved it. The one I bought had a built-in power supply, voltmeter, and current meter. Unless you want to interface complex external devices to a computer yourself, that is probably all you need.

Once you feel good about that, I'd look for a really good computer architecture book. There you can learn about machine language design and how it relates to the hardware, memory layouts, bus architecture, chip selects and chip-select logic, keyboard scanning, and the like. That will get you to the point where you can read a computer processor manual, or just about any chip manual, and understand it.

After that I'd think about buying a single-board computer. There are a bunch of them available in the $100-$200 range that have some built-in I/O and prototype areas. They let you program on a PC and download to the SBC via RS232. Stick with an 8051-based board; it will be simple to learn, lots of free tools are available, and it is still a pretty widely used chip. I'd stay away from the Stamp processors at first; they are a more specialized type of processor. Now you can play around with some assembly (or even C code) and wiggle some lights or read some switches. That will marry up the electronics with the programming and put you well on the way to being a robotics programmer.

• M MartinSmith

DiscoJimmy (#22)

A really good basic overview without getting too complex would be the Radio Shack series on electronics by Forrest M. Mims III. They're really basic, with lots of illustrations, but they give you a good grounding in how electricity, basic components, and logic gates work.

• M MartinSmith

Charles Pikell (#23)

:thumbsup: I would also highly recommend the Feynman Lectures: easy and comprehensive for both electronics and physics. Charles Pikell

                                      • D Dan Neely

                                        MartinSmith wrote:

                                        Particularly the absolute basics of electronics. What electricity is, What voltage is, how logic gates are physically implemented in a way a layman can understand.

Unfortunately, transistors work on quantum mechanical principles; according to classical physics they won't work. The closest you're going to get in layman's terms is "When voltage is applied to the control lead, MAGIC HAPPENS and it connects the input and output leads. When the voltage is removed from the control, the connection is broken." As for how logic gates are built up from transistors (as switches), any decent computer architecture book should cover that.

                                        Today's lesson is brought to you by the word "niggardly". Remember kids, don't attribute to racism what can be explained by Scandinavian language roots. -- Robert Royall

Charvak Karpe (#24)

dan neely wrote:

Unfortunately transistors work based on quantum mechanical principles. [...]

Modern 45 nm transistors may involve quantum mechanical effects, but I think regular NPN transistors and MOSFETs can be understood classically:

1. Pure silicon barely conducts, because there are no free electrons.
2. By "doping" silicon with impurities from one column to the left or right of it on the periodic table, you create atoms in the lattice with extra free electrons, or with missing electrons (known as holes).
3. Look up the operation of a P-N junction. Roughly: current can only flow from the positively doped (P-type) silicon to the negatively doped (N-type) silicon, because the P-type side has no excess electrons to carry current the other way. If you make the junction too small or raise the voltage too high, you get quantum tunnelling of electrons across the P-type silicon, but that's not an issue with transistors big enough to see.
4. MOSFETs are fun. The name, Metal Oxide Semiconductor Field Effect Transistor, describes their construction: an N-P-N arrangement of silicon, a layer of oxide covering the trio, and a layer of metal above that. If you try to pass current through the N-P-N silicon, the reverse-biased N-P junction blocks it. But if you apply a positive voltage to the metal on top, the whole thing acts as a capacitor, attracting a layer of electrons to the surface of the P-type silicon and creating a temporary N-type channel right up against the oxide layer. Current can now flow, because there is an N-N-N path above the N-P-N.

Martin, I highly commend you for your curiosity and your desire to learn the basic physics that powers our computers. The knowledge itself is not necessary to be a coder, but that inquisitive mentality is what makes someone an excellent problem solver.

Sort of changing the subject, but still on the topic of physics basics: how do people feel about the general level of understanding of conservation of energy, heat flow, etc.? How many people realize that energy-saving light bulbs don't help in rooms that are simultaneously running electric heaters? Or, why is running an air conditioner in reverse bet…
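The light-bulb point is just conservation of energy: in a room held at temperature by a thermostat-controlled resistive electric heater, every watt a bulb "wastes" as heat is a watt the heater no longer has to supply. A toy calculation with made-up numbers (it assumes all of the bulb's power ends up as heat inside the room):

```python
HEAT_DEMAND_W = 1000.0  # heat the room loses, which the heating must replace (watts)


def total_electric_draw(bulb_w):
    """Total power drawn while a bulb of the given wattage runs in the room.
    The thermostat backs the resistive heater off by exactly the bulb's
    heat output, so the sum stays constant."""
    heater_w = HEAT_DEMAND_W - bulb_w
    return bulb_w + heater_w


print(total_electric_draw(60))  # incandescent: 1000.0 W total
print(total_electric_draw(10))  # LED: 1000.0 W total -- no net saving while heating
```

(The saving reappears in summer, of course, when the bulb's heat has to be pumped back out by the air conditioner.)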

                                        • C Charvak Karpe


                                          A Offline
                                          A Offline
                                          Alan Balkany
                                          wrote on last edited by
                                          #25

                                          Very interesting, especially the parts about applying physics to daily activities. You should write some articles on this! If you did, I'd be interested in reading them.
