Utilization

The Lounge · Tags: graphics, design, asp-net, com, sysadmin · 46 Posts · 20 Posters
honey the codewitch (#1)

Edit: To be clear, I'm talking about user-facing machines rather than server or embedded, and a hypothetical ideal. In practice CPUs need about 10% off the top to keep their scheduler working, for example, and there are a lot of details I'm glossing over in this post, so it would be a good idea to read the comments before replying. There has been a lot of ground covered since.

When your CPU core(s) aren't performing tasks, they are idle hands. When your RAM is not allocated, it's doing no useful work. (Still drawing power, though!) While your I/O sits idle, it could be preloading something for you.

I see people complain about resource utilization in modern applications, and I can't help but think of the above. RAM does not work like non-volatile storage, where it's best to keep some free space available. Frankly, in an ideal world, your RAM allocation would always be 100%. Assuming your machine is performing any work at all (and not just idling), ideally it would do so utilizing the entire CPU, so it could complete quickly. Assuming you're going to be using your machine in the near future, your I/O may be sitting idle, but ideally it would be preloading the things you were planning to use, so they could launch faster.

My point is this: utilization is a good thing, in many if not most cases. What's that old saw? Idle hands are the devil's playground. Your computer is like that. I like to see my CPU work hard when it works at all. I like to see my RAM utilization be *at least* half even at idle. I like to see my storage ticking away a bit in the background, doing its lazy writes. This means my computer isn't wasting my time. Just sayin'
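To make the CPU part concrete, here's a minimal sketch (mine, purely illustrative; the workload is made up) of what "use the entire CPU so the work completes quickly" looks like in C#:

```csharp
using System;
using System.Threading.Tasks;

class Saturate
{
    static void Main()
    {
        var input = new double[10_000_000];
        var output = new double[input.Length];

        // Parallel.For partitions the index range across the available
        // cores, so the whole job finishes in roughly 1/N the wall-clock
        // time instead of leaving N-1 cores as idle hands.
        Parallel.For(0, input.Length, i =>
        {
            output[i] = Math.Sqrt(input[i] + i);
        });

        Console.WriteLine($"Done, using up to {Environment.ProcessorCount} cores.");
    }
}
```

Same number of instructions either way; the time you sit waiting shrinks.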

    Check out my IoT graphics library here: https://honeythecodewitch.com/gfx And my IoT UI/User Experience library here: https://honeythecodewitch.com/uix


fgs1963 (#2, replying to #1)

      I recall reading a short essay years ago by a senior OS engineer (Microsoft or Apple, not sure) that said much the same. It makes good sense IMO. Thanks for the reminder.


Kornfeld Eliyahu Peter (#3, replying to #1)

I think you didn't think that through to the end... If any single piece of software took up all the resources, it would kill any real productivity... Let's say VS takes all the memory just from opening a solution. Now I ask it to compile that solution. VS - by default, IIRC - will compile 8 projects in parallel, so it will try to fire up 8 instances of msbuild... But there is no memory, so before each and every one of those 8 instances starts, the OS will do a memory swap... And the swap for the 4th instance of msbuild may take memory from the 1st instance, since that one may be blocked in I/O and considered inactive... And memory swapping is very expensive... I do agree that any app should utilize all the resources it needs, but it also should release them the moment it no longer needs them...
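(As an aside, a minimal sketch of one guard .NET provides for exactly this over-commit scenario: MemoryFailPoint lets a process probe for headroom before starting a big job instead of starting it and thrashing. The size below is a made-up per-build estimate, not anything VS or msbuild actually does.)

```csharp
using System;
using System.Runtime;

class BuildGate
{
    static void Main()
    {
        const int estimatedMegabytes = 512; // hypothetical cost of one build job

        try
        {
            // Probes whether this much memory is likely available without
            // heavy paging; throws InsufficientMemoryException up front.
            using (new MemoryFailPoint(estimatedMegabytes))
            {
                Console.WriteLine("Headroom available; start the build.");
                // ... run the msbuild-like work here ...
            }
        }
        catch (InsufficientMemoryException)
        {
            Console.WriteLine("No headroom; queue this job rather than swap.");
        }
    }
}
```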

        "If builders built buildings the way programmers wrote programs, then the first woodpecker that came along would destroy civilization." ― Gerald Weinberg

        "It never ceases to amaze me that a spacecraft launched in 1977 can be fixed remotely from Earth." ― Brian Cox


honey the codewitch (#4, replying to #3)

          I did. I said in an ideal world RAM utilization would always be at 100%. That's a hypothetical. It's not intended to be real world, but rather illustrative of a point: RAM is always drawing power, even at idle. The most efficient way to use it is to allocate it for something, even if you do so ahead of time. I did not say that it would or even should be utilized by one application.



den2k88 (#5, replying to #1)

If the firmware I have to develop ends up using 100% of resources, it's a disaster: it would be impossible to add functionality without changing hardware or messing with already-available features. Also, a personal computer is a flexible tool, and flexibility requires that resources be available at any given time.

            GCS/GE d--(d) s-/+ a C+++ U+++ P-- L+@ E-- W+++ N+ o+ K- w+++ O? M-- V? PS+ PE Y+ PGP t+ 5? X R+++ tv-- b+(+++) DI+++ D++ G e++ h--- r+++ y+++*      Weapons extension: ma- k++ F+2 X The shortest horror story: On Error Resume Next


honey the codewitch (#6, replying to #5)

Firmware has other considerations. I'm talking PCs primarily - user machines. If those resources are queued up and preallocated, they are that much *more* ready the moment you suddenly need gigs of RAM than if nothing were waiting in the wings. This is precisely why modern apps and frameworks (like .NET) do it.
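A small concrete instance of that pattern in .NET is ArrayPool: buffers stay allocated between uses, queued up and ready, instead of being handed back just to be re-requested. A minimal sketch:

```csharp
using System;
using System.Buffers;

class PooledBuffers
{
    static void Main()
    {
        // The shared pool retains returned arrays, so repeated rentals
        // reuse already-allocated memory rather than hitting the
        // allocator (and ultimately the OS) every time.
        byte[] buffer = ArrayPool<byte>.Shared.Rent(64 * 1024);
        try
        {
            // ... fill and use the buffer here ...
            Console.WriteLine($"Rented {buffer.Length} bytes (may be more than requested).");
        }
        finally
        {
            ArrayPool<byte>.Shared.Return(buffer);
        }
    }
}
```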



jschell (#7, replying to #1)

                honey the codewitch wrote:

                I like to see my CPU work hard when it works at all.

In the space that I work in - which is different from yours - I like it when the CPU load is less than 50%. That gives me a buffer for when the new feature I added starts, for some reason, chewing up that additional capacity. And for a database I want to see it even lower than that: a similar reason, but I expect more surprises with the db than with the application. It gets real scary when the database is running at a sustained utilization of 80%.


honey the codewitch (#8, replying to #7)

I probably should have been clearer that I'm primarily talking about traditionally user-facing machines like desktops and laptops here, rather than servers and embedded. Utilization is important in those arenas too, but both how you achieve it and where you want it are going to be dramatically different. I sure hope that when I'm searching a distributed partitioned view in SQL Server, all the logical "spindles" it's partitioned across are speeding right along together. I also expect a database server to be less CPU-heavy and more storage-heavy, meaning your primary utilization metric will be storage and I/O. That's how you know your queries are being properly parallelized, for example. They're different considerations to be sure, even if utilization sits at the center of all of them.



Gary Wheeler (#9, replying to #1)

                    At one time unused physical RAM in Windows machines was used for disk cache, thereby keeping RAM utilization at 100% for all intents and purposes.

                    Software Zen: delete this;


Mircea Neacsu (#10, replying to #4)

If I may interject: memory is always used at 100% by an app called "operating system". Parts that are not urgently needed are relinquished to other apps upon request. In the scenario Peter pointed out, how is VS going to know how much memory the MSBuild instances are going to need? Should they ask VS pretty please to release the memory? Is VS going to act as some type of surrogate OS? Memory hogging is not a disease of VS only; it's a virus that has spread to browsers and many others.

                      Mircea


honey the codewitch (#11, replying to #10)

I'm not necessarily endorsing this approach so much as observing it, since I haven't run any performance metrics on alternatives. But it seems to me that the OS effectively has the information it needs, due to paging. It doesn't play out as directly as an app knowing how much memory is free, *but* there are plenty of ways to get an idea of general "memory pressure" in Windows, and paging allows an app to preallocate and let the OS manage which parts are actually in RAM at any given moment. Does it work? Well, if it worked perfectly, people wouldn't keep needing new computers every 5 years, I suppose. (I'm half kidding here; there's a lot that goes into that.) I don't know. But that seems to be how things operate now, say with VS, Chrome and Windows for example.
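For what it's worth, a minimal sketch of reading that kind of pressure signal from managed code. GC.GetGCMemoryInfo is the real API (.NET Core 3.0 and later); what a well-behaved app should do with the numbers is the open question:

```csharp
using System;

class PressureCheck
{
    static void Main()
    {
        // The GC exposes the same memory-load figures it uses internally
        // to decide how aggressively to collect.
        GCMemoryInfo info = GC.GetGCMemoryInfo();

        Console.WriteLine($"Memory visible to the process: {info.TotalAvailableMemoryBytes:N0} bytes");
        Console.WriteLine($"Current memory load:           {info.MemoryLoadBytes:N0} bytes");
        Console.WriteLine($"High-load threshold:           {info.HighMemoryLoadThresholdBytes:N0} bytes");

        bool underPressure = info.MemoryLoadBytes >= info.HighMemoryLoadThresholdBytes;
        Console.WriteLine(underPressure
            ? "Under pressure: a polite app would trim its caches now."
            : "Headroom available: preallocating and caching is cheap right now.");
    }
}
```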



honey the codewitch (#12, replying to #9)

                          That's actually in theory a good idea. I wonder why they stopped allocating all of it.



Gary Wheeler (#13, replying to #12)

At one time they did use all of it, minus a fraction kept handy as a reserve. In today's world, with SSDs and much faster 'disk' interfaces, I don't know if this is still valuable or not. The fact that the unallocated RAM was used for disk cache wasn't visible to the user or to applications.



honey the codewitch (#14, replying to #13)

I don't even necessarily mean for disk cache - just for *something*.



trønderen (#15, replying to #1)

I have the same relationship with my car. When I am not driving it, its total utilization is falling, so I keep it running as much as possible. I pick up four friends to go with me on trips, to utilize the seat capacity as much as possible. To utilize the engine to the maximum extent, we have to go out on the highway (otherwise we would break the speed limit all the time). This part of the year, I am happy about the utilization factor of the headlights; I keep them on at all times to raise utilization. Also, with lots of rain, the windshield wipers are another component that can contribute to the total utilization. Obviously, the car stereo is active all the time, to make sure it is utilized to the fullest extent possible. Making the maximum possible use of everything you have at your disposal is essential for a good life. Keep your fridge and freezer filled up to utilize their capacity. If you have spare beds in your home, invite someone to sleep in them. Keep all your electric lights at maximum utilization. Maybe even your SO!


trønderen (#16, replying to #4)

                                  honey the codewitch wrote:

                                  I said in an ideal world RAM utilization would always be at 100%.

                                  A perfectly balanced system has bottlenecks everywhere.


dandy72 (#17, replying to #1)

                                    honey the codewitch wrote:

                                    Assuming you're going to be using your machine in the near future, your I/O may be sitting idle, but ideally it would be preloading things you were planning to use, so it could launch faster.

That's all fine and dandy when it *knows* what it is I'm going to be using, but making the wrong guess means someone's expended resources to do that work for nothing - and left less memory for other things that could've been cached. It's really all a balancing act; every OS has its own guidelines explaining what each app should do or avoid in order to be a good citizen. Then it's up to the OS to juggle it all and try to make the correct guesses. Bottom line, I'm with you: if you have the resources, by all means use them. But the key, as already mentioned, is that you have to be smart about it. You can't act as if you're the only one around, 'cuz everybody else is trying to do the same...


honey the codewitch (#18, replying to #17)

Of course - and by "ideally" I do mean a hypothetical scenario with ideal conditions, illustrative of a point rather than an attempt to reflect reality. That point is that if you can get your I/O to do useful work when it's not doing anything else, that is typically a net win. Even if a given guess misses, as long as you win more times than you lose, it's still a win - like card counting in blackjack: done right, you come out ahead overall. But again, these situations are only intended as illustrative hypotheticals, and broadly articulated ones at that. I didn't want to get lost in the weeds.
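To put a little flesh on the preloading hypothetical, a minimal sketch that warms the OS file cache during idle time. The file list here is invented; a real prefetcher would predict it from usage history:

```csharp
using System;
using System.IO;
using System.Threading.Tasks;

class Prefetcher
{
    // Hypothetical guesses; a real prefetcher would learn these.
    static readonly string[] LikelyNext =
    {
        @"C:\work\big-solution-metadata.bin",
        @"C:\work\assets.pak"
    };

    static async Task Main()
    {
        foreach (string path in LikelyNext)
        {
            if (!File.Exists(path)) continue;

            // Sequentially reading the file pulls it into the OS file
            // cache, so the later "real" open is served from RAM.
            using var fs = new FileStream(path, FileMode.Open, FileAccess.Read,
                                          FileShare.Read, 1 << 16, FileOptions.SequentialScan);
            var buffer = new byte[1 << 16];
            while (await fs.ReadAsync(buffer, 0, buffer.Length) > 0)
            {
                // Discard the data; populating the cache is the point.
            }
        }
        Console.WriteLine("Warm-up pass complete.");
    }
}
```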



honey the codewitch (#19, replying to #15)

                                        I think you may be having a laugh at me, I'm not sure. :laugh: But either way, I agree with parts of what you wrote.

                                        trønderen wrote:

                                        If you have spare beds in your home, invite someone to sleep in them.

I used to do this when I was younger and could get away with it. I was a homeless teenager, so when I was in my twenties and living in Seattle among a sea of homeless young adults, I'd let them crash where I lived. I lost some stuff to theft, and a little peace to some drama, but I'm still glad I did it, because once or twice I met someone who did that for others when I needed a place to stay. Imagine a world where people with more than they need were very open to sharing their excess with others.



dandy72 (#20, replying to #18)

Sure. Cache hits are a thing, and so are cache misses. That doesn't mean we shouldn't try to cache anything at all - just that the algorithm used to decide what to cache versus what to let go of is very much still in development. I'm not aware of any magic bullet.
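For illustration, the classic baseline those decide-what-to-let-go-of algorithms get compared against is least-recently-used eviction. A minimal hand-rolled sketch (production code would use a vetted implementation):

```csharp
using System;
using System.Collections.Generic;

// LRU cache: on overflow, evict the entry untouched the longest.
class LruCache<TKey, TValue> where TKey : notnull
{
    private readonly int _capacity;
    private readonly Dictionary<TKey, LinkedListNode<(TKey Key, TValue Value)>> _map = new();
    private readonly LinkedList<(TKey Key, TValue Value)> _order = new();

    public LruCache(int capacity) => _capacity = capacity;

    public bool TryGet(TKey key, out TValue value)
    {
        if (_map.TryGetValue(key, out var node))
        {
            _order.Remove(node);   // touched: move to the front
            _order.AddFirst(node);
            value = node.Value.Value;
            return true;           // cache hit
        }
        value = default!;
        return false;              // cache miss
    }

    public void Put(TKey key, TValue value)
    {
        if (_map.TryGetValue(key, out var existing))
        {
            _order.Remove(existing);
            _map.Remove(key);
        }
        else if (_map.Count >= _capacity)
        {
            var oldest = _order.Last!;   // least recently used
            _order.RemoveLast();
            _map.Remove(oldest.Value.Key);
        }
        _map[key] = _order.AddFirst((key, value));
    }
}
```

Usage would be something like `var cache = new LruCache<string, byte[]>(100);`, with TryGet before an expensive load and Put after. The whole open problem lives in that eviction policy; LRU is merely the time-honored default.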
