Code Project
John Wilder Tukey: Contribution to Computer Science

The Lounge
Tags: comm, cp, hardware, question, learning
19 Posts 8 Posters 0 Views 1 Watching
  • R raddevus

    While reading the fantastic book, Code: The Hidden Language of Computer Hardware and Software (Developer Best Practices) by Charles Petzold[^] I stumbled upon this:

    Charles Petzold said:

    Sometime around 1948, the American mathematician John Wilder Tukey (born 1915) realized that the words binary digit were likely to assume a much greater importance in the years ahead as computers became more prevalent. He decided to coin a new, shorter word to replace the unwieldy five syllables of binary digit. He considered bigit and binit but settled instead on the short, simple, elegant, and perfectly lovely word bit.

    Verified by Wikipedia. John Tukey - Wikipedia[^]

    Wikipedia:

    While working with John von Neumann on early computer designs, Tukey introduced the word "bit" as a contraction of "binary digit".[8] The term "bit" was first used in an article by Claude Shannon in 1948.

    Have any of you read Code? It's really fantastic.

    Slacker007 wrote (#5):

    "Joan Wilder!" "Thee Joan Wilder?!....Juuaniiita."

    • D David Crow

      raddevus wrote:

      Have any of you read Code?

      Yes.

      "One man's wage rise is another man's price increase." - Harold Wilson

      "Fireproof doesn't mean the fire will never come. It means when the fire comes that you will be able to withstand it." - Michael Simmons

      "You can easily judge the character of a man by how he treats those who can do nothing for him." - James D. Miles

      raddevus wrote (#6):

      +1 for you! :) But you got that +1 from reading the book already so I'm late to the game. :)

      • R raddevus


        Sander Rossel wrote (#7):

        You verify a book with Wikipedia? When I was in school we verified Wikipedia with books! :sigh:

        Best, Sander arrgh.js - Bringing LINQ to JavaScript SQL Server for C# Developers Succinctly Object-Oriented Programming in C# Succinctly

          • Sander Rossel


          raddevus wrote (#8):

          Sander Rossel wrote:

          When I was in school we verified Wikipedia with books! :sigh:

          This cracked me up! I'm in pieces here. I know. I did that for the younger script kiddies here who cannot/will not read books and who only believe it if it has an http at the beginning. :laugh: I also tweeted the message and snapchatted and instagrammed it.

            • R raddevus


              PIEBALDconsult wrote (#10):

              raddevus wrote:

              Have any of you read Code?

              I have. It's one of the things that got me modeling relays, gates, adders, etc. in C#. :-D
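              Modeling those building blocks in code is a fun exercise. A minimal sketch (in Python rather than C#, with names of my own choosing — not from the book or that project) might look like:

```python
# Gates built from Boolean primitives, then composed into adders,
# roughly the way Petzold builds hardware up from relays.
# All names here are illustrative, not from the book.

def AND(a, b): return a & b
def OR(a, b): return a | b
def XOR(a, b): return a ^ b

def half_adder(a, b):
    """Add two single bits; returns (sum, carry)."""
    return XOR(a, b), AND(a, b)

def full_adder(a, b, carry_in):
    """Add two bits plus a carry-in; returns (sum, carry_out)."""
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, OR(c1, c2)

def ripple_add(a_bits, b_bits):
    """Add two equal-length bit lists (least significant bit first)."""
    carry, out = 0, []
    for a, b in zip(a_bits, b_bits):
        s, carry = full_adder(a, b, carry)
        out.append(s)
    return out, carry
```

              Chaining full adders like this is the same ripple-carry design the book walks through; e.g. `ripple_add([1, 0, 1, 0], [1, 1, 0, 0])` adds 5 and 3 bit by bit.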

                • Sander Rossel

                PIEBALDconsult wrote (#11):

                Once a book is wrong, it stays wrong.

                • R raddevus


                  Lost User wrote (#12):

                  Interesting. When I saw the title of your post I thought you might be referring to one of his other achievements. In the '70s I wrote a BASIC program for a PDP-11[^] taken directly from this paper: An Algorithm for the Machine Calculation of Complex Fourier Series on JSTOR[^]

                  Peter Wasser "The whole problem with the world is that fools and fanatics are always so certain of themselves, and wiser people so full of doubts." - Bertrand Russell
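                  That paper is the origin of the Cooley–Tukey FFT. For anyone curious, its core idea can be sketched in a few lines of modern Python — a minimal radix-2 version, assuming the input length is a power of two, and surely nothing like the original BASIC on a PDP-11:

```python
import cmath

def fft(x):
    """Recursive radix-2 Cooley-Tukey FFT; len(x) must be a power of two."""
    n = len(x)
    if n == 1:
        return list(x)
    evens = fft(x[0::2])   # transform of the even-indexed samples
    odds = fft(x[1::2])    # transform of the odd-indexed samples
    out = [0j] * n
    for k in range(n // 2):
        # Twiddle factor e^(-2*pi*i*k/n) combines the two half-size transforms.
        t = cmath.exp(-2j * cmath.pi * k / n) * odds[k]
        out[k] = evens[k] + t
        out[k + n // 2] = evens[k] - t
    return out
```

                  Splitting an n-point transform into two n/2-point transforms at every level is what brings the cost down from O(n²) to O(n log n).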

                  • L Lost User


                    raddevus wrote (#13):

                    That's really cool. Must've been a serious challenge. We're so lucky today to have so much access to compute time and be able to REPL through code and try things over and over. Seems like it would've been so hard to do things back then. Plus no Internet to look up answers & code on StackOverflow. :laugh:

                    • P PIEBALDconsult


                      raddevus wrote (#14):

                      I really love the way Petzold builds up the "story of computing". I think it opens a lot of understanding about why many things are the way they are in computing. It is also interesting that way down there at the bottom the computer is still the same thing it always was: just a bit machine.

                      • R raddevus


                        Lost User wrote (#15):

                        raddevus wrote:

                        like it would've been so hard to do things back then

                        It wasn't hard, it was fun. No CodeProject or StackOverflow. The main resources were piles of manuals, which usually contained the information one needed, and other PDP-11 users, of which most campuses had a few.

                        • P PIEBALDconsult


                          Sander Rossel wrote (#16):

                          I know, and a book author is not such a different person from a Wikipedia author, so books will have mistakes too. A study suggested that Wikipedia is only slightly less accurate than the Britannica encyclopedia, and Wikipedia has a lot more info. It's just that these arrogant academics feel threatened by Wikipedia, so it's forbidden to use even though that makes no sense. :sigh: We still used it, of course, and then cited the Wikipedia sources as though we had consulted them instead of Wikipedia. And even though they hate to admit it, the teachers used Wikipedia in their research too, in exactly the way we used it.

                          • R raddevus


                            Nathan Minier wrote (#17):

                            I've given out two copies to friends who were thinking about getting into IT. It's a great starting point for those who don't go to school for CS.

                            "There are three kinds of lies: lies, damned lies and statistics." - Benjamin Disraeli

                            • N Nathan Minier


                              raddevus wrote (#18):

                              Nathan Minier wrote:

                              It's a great starting point for those that don't go to school for CS.

                              I agree. The build-up of the story of computing is really fantastic. I actually read the section on relays years ago and my head exploded because it made so much sense. :)

                              • L Lost User


                                raddevus wrote (#19):

                                You are honestly part of an elite few. Really great stuff. Thanks for sharing.

                                1 Reply Last reply
                                0