Actually, it's not that philosophical. There is a minimum amount of energy that electrical engineers and solid-state physicists agree is needed to reliably represent a digital quantity. We are still a few orders of magnitude above that limit, but the gap is closing fast. Additionally, when designing a power supply circuit, delivering maximum power from a voltage source means dissipating (emitting as heat) as many watts inside the supply as in the target circuit; that's the maximum power transfer condition.

And why do computers use energy in the first place? Computers used to use far more energy before they moved to CMOS (complementary metal-oxide-semiconductor) technology. Thanks to CMOS, energy is, to a first approximation, only dissipated when a logic gate switches state. A CMOS gate is larger, more difficult to fabricate, and slower than gates in other logic families, but because it overcomes the heat and energy barriers, practical CMOS circuits can be fabricated smaller and faster overall.
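Since CMOS dissipates energy mainly on switching, its power draw is usually estimated with the standard dynamic-power relation P = a·C·V²·f (activity factor, switched capacitance, supply voltage, clock frequency). A minimal back-of-the-envelope sketch, where all the numeric values are illustrative assumptions rather than measurements of any real chip:

```python
# Dynamic power of a CMOS circuit: P = alpha * C * V^2 * f.
# alpha = activity factor (fraction of capacitance switched per cycle),
# C = total switched capacitance (farads), V = supply voltage (volts),
# f = clock frequency (hertz). All example figures below are made up.

def dynamic_power(alpha, capacitance_f, voltage_v, frequency_hz):
    """Average CMOS switching power in watts."""
    return alpha * capacitance_f * voltage_v ** 2 * frequency_hz

# Illustrative numbers: 10% activity, 1 nF switched capacitance,
# 1.0 V supply, 3 GHz clock.
p = dynamic_power(0.1, 1e-9, 1.0, 3e9)
print(f"{p:.2f} W")  # 0.30 W
```

Note the V² term: this is why lowering the supply voltage has historically been the most effective lever for cutting power, and why gates that never switch cost (almost) nothing.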
Posts by marr75

Why does electrical equipment get hot?
Plagiarism Detection

There is a huge message from industry targeted right at universities, and the message is clear: the graduates you are pumping out are not prepared to work here. They do not have the networking, code-reuse, application, or quick decision-making skills to cut the mustard. One of the most valuable skills I ever learned was how to read other people's code, so plagiarism-checking applications with complex algorithms are the wrong tool. A quick program that totals whitespace and tokens, so that a professor can identify suspicious submissions, followed by a short interview to make sure both students understand the code they submitted, is more than sufficient to establish that a student did not simply copy the solution. And if a friend aided him by providing source and making sure he understood its operation and mechanics, then they both gained valuable knowledge from the exercise.
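The "quick program" described above can be sketched in a few lines. This is only an illustration of the idea, not a real detector; the file names, tolerance, and token definition (word characters or single punctuation marks) are hypothetical choices:

```python
# Sketch of a whitespace/token-count screen for student submissions:
# compute a (whitespace, token) profile per file and flag pairs whose
# profiles are suspiciously close. Purely illustrative.
import re
from itertools import combinations

def profile(source: str) -> tuple[int, int]:
    """Return (whitespace character count, token count) for source text."""
    whitespace = sum(ch.isspace() for ch in source)
    tokens = len(re.findall(r"\w+|[^\w\s]", source))
    return whitespace, tokens

def suspicious_pairs(submissions: dict[str, str], tolerance: int = 2):
    """Yield pairs of submission names whose profiles nearly match."""
    profiles = {name: profile(src) for name, src in submissions.items()}
    for (a, pa), (b, pb) in combinations(profiles.items(), 2):
        if abs(pa[0] - pb[0]) <= tolerance and abs(pa[1] - pb[1]) <= tolerance:
            yield a, b

# Hypothetical submissions: two near-identical, one genuinely different.
subs = {
    "alice.c": "int main() { return 0; }",
    "bob.c":   "int main() { return 1; }",
    "carol.c": "int main(int argc, char **argv) { return argc; }",
}
print(list(suspicious_pairs(subs)))  # [('alice.c', 'bob.c')]
```

A hit from a screen like this is only a prompt for the interview the post describes, not evidence of copying by itself.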