Vista memory usage

  • Patrick Etc wrote:

    John C wrote:

    This got me curious as to why it does use approx 1gb of memory after boot and here is at least in part the answer:

    This seems like a fantastically bad idea to me. OK, it may work well for the "average" user who is almost never going to run a high-memory-load application like a game or Visual Studio, but for everyone else it's going to get in the way when the app you're loading suddenly asks for 500MB or 1GB of RAM. Not to mention that there is a distinct benefit to keeping some of that RAM unused, even if it is powered: lower power usage and longer hardware life expectancy. It's an idea that makes sense, and yet it seems to be lacking something.


    It has become appallingly obvious that our technology has exceeded our humanity. - Albert Einstein

    Member 96 replied (#18):

    I honestly haven't seen that as a problem. The only thing that has annoyed me about it is the constant hard drive access when I'm doing nothing that should be accessing the hard drive. I think it won't be long (hopefully) before hard drives with moving parts are obsolete anyway, so it's all kind of a moot point then.


    All programmers are playwrights and all computers are lousy actors.

  • Paul Sanders (the other one) wrote:

      That someone was me, and there seems to be a bit of FUD concerning the article you cite. When Vista reports 500MB (or whatever) of memory used, that is what it means, and this figure _excludes_ any memory used for caching data read from disk. On the other hand, when Vista reports 3MB free, it is trying to tell you that it has used all your spare RAM for caching, which is exactly what it should do. So I make two observations:

      1. Vista uses a lot more memory than XP.
      2. Vista caches more effectively than XP.

      I think you make a lot of good points but the bottom line is simple. To run Vista effectively, you need at least 1GB, and preferably 2GB, of RAM. This doesn't seem to stop hardware vendors badging machines with 512MB of RAM as 'Vista capable'; Vista incompetent would be a better description. Personally, I can put up with this. I know the score and my own machines are up to the job (some of them, anyhow). But for many it means a trip to the computer shop, and I really can't condone such profligacy by Microsoft's development teams.

      When I have time, I might do a bit of benchmarking to see if Vista is really as slow as it feels on modest (but far from uncommon) hardware. I know for a fact that the audio system has higher CPU overheads than XP because I have (informally) benchmarked that already.

      Paul Sanders http://www.alpinesoft.co.uk

      Member 96 replied (#19):

      Paul Sanders (AlpineSoft) wrote:

      To run Vista effectively, you need at least 1GB, and preferably 2GB of RAM

      No argument here.

      Paul Sanders (AlpineSoft) wrote:

      This doesn't seem to stop hardware vendors badging machines with 512MB of RAM as 'Vista capable'; Vista incompetent would be a better description.

      Yeah and this is entirely at the feet of Microsoft. They set the bar too low and are now paying the price with bad publicity.

      Paul Sanders (AlpineSoft) wrote:

      I know for a fact that the audio system has higher CPU overheads than XP because I have (informally) benchmarked that already.

      Probably all the DRM stuff. I've done some informal benchmarking, which is easy here because we have test systems for testing our software before release (although that's all going virtual these days), and it's easy for me to load up an identical computer with XP, 95, 2000, 2003, Vista, etc. In my own testing I was mostly interested in the performance of our .NET 2.0 app, which is a pretty big multi-tier business application. I found it to be very noticeably faster when running under Vista on the same hardware.

      But the bottom line for me as a developer is that I knew our customers were going to be using it (in fact they were running the beta before we had even installed it over a year ago, as many of our customers are computer networking service companies) and I knew I needed to support it, so I went out and bought the fastest computer I could get my hands on, within reason, that was certified for Vista, and have had no issues.

      Again, as I said before, any of us old enough have seen this whole discussion go around and around before. When Microsoft releases an OS they have to build against what will be common hardware at some predetermined sweet spot in the life of that OS. When Windows 95 came out, all the same arguments about it being a hog came out. When XP was released the fervor was equal to or perhaps even a little higher than it is now for Vista, and it was all down to existing hardware being underpowered for the features of the OS. This time I think Microsoft was a little more optimistic than they have been in the past in what they say is approved for use with Vista, but I bet you one dollar that when the big successor to Vista comes out we will see the same arguments again, with people saying how much they loved Vista and what a hog the new OS is and how slow it is, etc. etc.

    • Luis Alonso Ramos wrote:

        David Lockwood wrote:

        because it's waiting there to be filled with my stuff when I choose

        As I said in a post above, I don't know the answer, but I believe that loading your stuff into memory takes virtually the same time whether that memory is "used" or "free" (marked as such in the OS's internal tables), since freeing used memory is logical only and the OS doesn't actually go and set every bit to 0 -- correct me if I am wrong. And one day, when by chance the stuff you need is already in memory, you'll see a speed improvement. Maybe for your case it's hard to predict, but what about those people that only use IE, Outlook, Word and Excel? They are predictable and thus might see an improvement.

        Luis Alonso Ramos Intelectix Chihuahua, Mexico

        My Blog!

        Paul Sanders (the other one) replied (#20):

        You're right - caching is important. Windows would run like a dog without it. It's easy to demonstrate just how important it is, like so:

        1. Restart your computer.
        2. WAIT, until the hard disk light stops flashing.
        3. Start up IE and time how long it takes.
        4. Shut down IE, then start it up again, again timing how long it takes.

        The difference is startling. That's caching for you. Utilising all your RAM on the offchance that it might avoid a disk access is a no-brainer. The clever part is deciding what to hang on to and what to throw away. Is Vista driving down the cost of RAM, like XP once did? Methinks it is.
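        A minimal C++ sketch of the same experiment (not from the original post; the file path is a placeholder - point it at any large file). Run it right after a reboot: the first read comes off the disk, and the immediate second read should be dramatically faster because it is served from the file cache:

        // Times two back-to-back reads of the same large file to show the
        // effect of the OS file cache.
        #include <chrono>
        #include <cstdio>
        #include <fstream>
        #include <vector>

        static double readAll(const char* path)
        {
            auto start = std::chrono::steady_clock::now();
            std::ifstream f(path, std::ios::binary);
            std::vector<char> buf(1 << 20); // read in 1 MB chunks, discard the data
            while (f.read(buf.data(), static_cast<std::streamsize>(buf.size())))
                ;
            auto stop = std::chrono::steady_clock::now();
            return std::chrono::duration<double>(stop - start).count();
        }

        int main()
        {
            const char* path = "C:\\test.bin"; // placeholder: any file of a few hundred MB
            std::printf("first read:  %.2f s\n", readAll(path)); // hits the disk
            std::printf("second read: %.2f s\n", readAll(path)); // served from cache
        }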

        Paul Sanders http://www.alpinesoft.co.uk

    • Thunderbox666 replied (#21, to Member 96's post #19 above):

          John C wrote:

          I bet you one dollar that when the big successor to Vista comes out we will see the same arguments again with people saying how much they loved vista and what a hog the new os is and how slow it is etc etc.

          WOW, big bet, you must be confident... You won't hear me ever say how much I loved Vista... EVER. Sure the next one will get crap thrown all over it when it comes out (that's happened every time), but Vista seems to have copped it worst so far, and it is taking far more time to get accepted than XP or any of the previous versions have.

          "There are three sides to every story. Yours, mine and the truth" ~ unknown

    • Paul Sanders (the other one) replied (#22, to Member 96's post #19 above):

            Yeah, I've more or less come round to your point of view actually. I just feel sorry for Joe Public who will be told at his local PC emporium that he / she needs a new machine, yet again. Like I say, we are the lucky ones.

            Paul Sanders http://www.alpinesoft.co.uk

    • martin_hughes wrote:

              People do keep complaining about Vista's memory usage, and perhaps fairly. However, memory is so cheap that, in my opinion, it's no great shakes to upgrade. When I built my new computer recently I knew I'd be getting a 64-bit O/S and saw no reason whatsoever not to get 8 GB of RAM. With all that available space, Vista with VS2008 running consumes approx 1.71 GB... and it runs faster than just about any other computer I've ever used.

              "On one of my cards it said I had to find temperatures lower than -8. The numbers I uncovered were -6 and -7 so I thought I had won, and so did the woman in the shop. But when she scanned the card the machine said I hadn't. "I phoned Camelot and they fobbed me off with some story that -6 is higher - not lower - than -8 but I'm not having it." -Tina Farrell, a 23 year old thicky from Levenshulme, Manchester.

              Daniel Vaughan replied (#23):

              For sure, desktop memory is very cheap. My laptop, on the other hand, only has two memory slots, and memory for it costs a bit more. 8 GB... yihaa! :) Daniel


              Daniel Vaughan
              LinkedIn Profile ShelfSpy

    • Patrick Etc replied (#24, to Member 96's post #18 above):

                John C wrote:

                I think it won't be long before hard drives with moving parts are obsolete (hopefully) anyway so it's all kind of a moot point then.

                True, because preloading everything into a cache would be no faster and no more efficient than loading it from the drives themselves.


                It has become appallingly obvious that our technology has exceeded our humanity. - Albert Einstein

    • Ri Qen Sin replied (#25, to Member 96's post #18 above):

                  John C wrote:

                  I think it won't be long before hard drives with moving parts are obsolete (hopefully) anyway so it's all kind of a moot point then.

                  HyperOS has a product called HyperDrive. It's a hard drive in every respect, except that instead of platters it uses RAM. The data transfer rate is actually fast enough to keep the CPU busy. Hopefully RAM gets cheap enough to get a decent-sized one for less than $500.

                  ROFLOLMFAO

    • Luis Alonso Ramos wrote:

                    A few questions, though (I genuinely don't know the answers). Does it take more time to load some data into "used" memory than into free memory? Also, I suppose "free" memory is simply marked as such (in OS tables), but electrically it still contains some random information. So, if memory is used, does it really consume more power and reduce its life expectancy?

                    Luis Alonso Ramos Intelectix Chihuahua, Mexico

                    My Blog!

                    Mike Dimmick replied (#26):

                    The answers to some of your questions may be found in "Windows Internals, Fourth Edition". The only difference with loading data into 'used' memory rather than 'free' memory is that the 'used' memory has to be taken away from whatever working set it belongs to first.

                    Windows basically has a few categories of memory status: assigned to one or more working sets; 'standby' (trimmed from a working set, either read-only or writable but with changes already written back to either the original file [memory-mapped files] or to the pagefile); 'modified' (trimmed from a working set, unsaved changes not yet written back); 'free' (link back to the working set no longer retained, contents unknown); or 'zero' (all bytes known to be zero because zeros were written by the idle thread).

                    When allocating physical memory to a working set, if the memory is going to be used by kernel-mode code only, or immediately filled by data coming from disk, the OS takes a page from the 'free' list. Otherwise it will take memory from the 'zero' list to ensure that the process can't see another process's data - this is for security. If the appropriate list is exhausted, the OS tries the other one, zeroing a page from the free list if necessary. If that is also exhausted, it then tries the standby list, but to use a page from that list it has to unlink it from the invalid page table entry that was pointing to it, which takes a bit more time. If no pages are available from the standby list, the OS will write out a page from the modified list so it can reuse it - this is the only time that eager swapping occurs.

                    When the OS decides, periodically or if memory demands get too great, that a working set is too big, the least-recently-used pages are trimmed; that is, the corresponding page table entries are marked inactive. (The processor will then generate a page fault the next time any code touches the page.) If the page was modified since it was last written (tracked automatically by the processor setting a bit in the page table entry), it goes onto the modified list, otherwise onto the standby list. Background threads then lazily write back data from the modified list, at which point those pages are moved to the standby list. However, the page table entry still records which physical page was used. When a page fault occurs, the page fault handler first checks whether the data is actually still in memory on the standby or modified lists and, if so, simply fixes up the PTE to be valid again, takes the page off the corresponding list, and dismisses the fault without going to disk (a soft page fault).
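                    For anyone who wants to watch these lists in aggregate, here is a minimal Win32 sketch (not from the original post). GlobalMemoryStatusEx and GetPerformanceInfo are the documented calls, and SystemCache roughly corresponds to the standby pages plus file cache:

                    // Prints the system-wide counters behind Task Manager's memory numbers.
                    // Build: cl /EHsc memstat.cpp psapi.lib
                    #include <windows.h>
                    #include <psapi.h>
                    #include <cstdio>

                    int main()
                    {
                        MEMORYSTATUSEX ms{};
                        ms.dwLength = sizeof(ms);
                        PERFORMANCE_INFORMATION pi{};
                        if (!GlobalMemoryStatusEx(&ms) || !GetPerformanceInfo(&pi, sizeof(pi)))
                            return 1;

                        const double MB = 1024.0 * 1024.0;
                        const double page = static_cast<double>(pi.PageSize);
                        // ullAvailPhys includes standby and zeroed pages, which is why
                        // "available" stays high even when "free" is only a few MB.
                        std::printf("physical total:      %8.0f MB\n", ms.ullTotalPhys / MB);
                        std::printf("physical available:  %8.0f MB\n", ms.ullAvailPhys / MB);
                        std::printf("system cache:        %8.0f MB\n", pi.SystemCache * page / MB);
                        std::printf("commit charge/limit: %.0f / %.0f MB\n",
                                    pi.CommitTotal * page / MB, pi.CommitLimit * page / MB);
                    }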

    • Member 96 wrote:

                      A few threads ago someone was slamming Vista again, and one of the topics that keeps getting referenced is that it uses too much memory, so it is inherently bad. This got me curious as to why it does use approx 1 GB of memory after boot, and here is at least in part the answer: the question shouldn't be "Why does Vista use all my memory?", but "Why the heck did previous versions of Windows use my memory so ineffectively?" [^]


                      All programmers are playwrights and all computers are lousy actors.

                      Andy Brummer replied (#27):

                      That sounds like it is a much better way to handle memory in principle, and I'd applaud it if it didn't require adding an extra gig of memory and another hard drive just to get back to XP levels of performance. Performance optimizations should improve performance for a common range of uses, not just a few cases. [edit]Just like you would think that moving a site from ASP to ASP.NET would improve performance, and provide more options for usability. Sorry Chris.[/edit]


                      I can imagine the sinking feeling one would have after ordering my book, only to find a laughably ridiculous theory with demented logic once the book arrives - Mark McCutcheon

    • Member 96 replied (#28, to Thunderbox666's post #21 above):

                        Thunderbox666 wrote:

                        it is taking far more time to get accepted than XP or any of the previous versions have

                        Hmm... maybe my memory is going, but I seem to recall back in the old Usenet days when I ran a BBS there was an ongoing "holy war" over Windows 95 that lasted at least 2 years. People talked of how it was crazy resource hungry and said Microsoft was going to have to keep up support for Windows 3.1 because it was so much faster and more efficient. Sound familiar? And in those days Windows 95 was being attacked heavily by the OS/2 guys, who (rightfully) showed many things they did better and scoffed at Windows 95 calling itself object oriented. I remember a lot of holdouts over XP and a lot of bitching about RAM use etc. etc. Sorry, but none of this is new and I can guarantee you I'll win that buck. :)


                        All programmers are playwrights and all computers are lousy actors.

    • Member 96 replied (#29, to Andy Brummer's post #27 above):

                          Andy Brummer wrote:

                          Just like you would think that moving a site from ASP to ASP.net would improve performance, and provide more options for usability

                           Boo! :) Having been a person who attempted *really* hard over a 4-month period to write an ASP site in C++ back in the day for a semi-sophisticated application, then realized it was actually easier to write the entire web *server* myself (and I did), and having now done nearly the same task in ASP.NET in a matter of a few weeks last year, I'm dead certain Chris made a fantastic decision. ;)


                          All programmers are playwrights and all computers are lousy actors.

    • Andy Brummer replied (#30, to Member 96's post #29 above):

                            John C wrote:

                             Having been a person who attempted *really* hard over a 4-month period to write an ASP site in C++ back in the day

                            masochist

                            John C wrote:

                             now having done nearly the same task in ASP.NET in a matter of a few weeks last year I'm dead certain Chris made a fantastic decision.

                            That's a given. It's still frustrating to watch it unfold. CP is going to rock when all the kinks get worked out.

                            I can imagine the sinking feeling one would have after ordering my book, only to find a laughably ridiculous theory with demented logic once the book arrives - Mark McCutcheon

    • Thunderbox666 replied (#31, to Member 96's post #28 above):

                              John C wrote:

                              Sound familiar?

                               Nope :-D I would have been just starting primary school, lol. I'm still only 19.


                              "There are three sides to every story. Yours, mine and the truth" ~ unknown

    • Cyrilix replied (#32, to Mike Dimmick's post #26 above):

                                Very nice post, Mike.

    • martin_hughes wrote:

                                  CataclysmicQuantums wrote:

                                  When are programmers going to make computers use their resources to do more useful and sophisticated things instead of being lazy asses and writing things like this...

                                  Poor coding is one thing, wanting to watch full screen HD movies on your computer is quite another.

                                  CataclysmicQuantums wrote:

                                   Say that to the family who worked hard to save up just enough money to buy a family computer.

                                   I will. Furthermore, I'll tell them to seek expert advice on the specifications of their new PC before parting with the cash. Besides which, owning a computer is more affordable now than it has ever been; an additional 2 GB of RAM from Crucial costs $117.99.

                                  "On one of my cards it said I had to find temperatures lower than -8. The numbers I uncovered were -6 and -7 so I thought I had won, and so did the woman in the shop. But when she scanned the card the machine said I hadn't. "I phoned Camelot and they fobbed me off with some story that -6 is higher - not lower - than -8 but I'm not having it." -Tina Farrell, a 23 year old thicky from Levenshulme, Manchester.

                                   Cyrilix replied (#33):

                                   $117.99 for 2 GB? I'm sure if you search harder, you can find much better deals than that. :)

    • Dirk Higbee wrote:

                                     Why oh why does everyone seem to have trouble with the OS? Does anyone do a custom install and proper config? I am currently running Vista on a P4 with 1 GB of memory. I have iTunes running, I'm here roaming around CP, and I am doing Google searches in another browser, and I am using about 45% of my memory. I can open VS2008 Express and work on a project too and still not use all my memory. What is everyone doing that is giving them problems?

                                    If you can read, you can learn

                                     Rocky Moore replied (#34):

                                     That 45% can be tricky. You need to pay attention to the swap file also; virtual RAM can be storing a bunch and making the RAM consumption figure look lower than it really is. I know on my system running Vista Ultimate 64 with only 1 GB RAM (I have a bad memory bank on my board and do not have time to fix it currently), I often had my system come to almost an entire halt. Of course, having several IE instances, SQL Server, SQL Management Studio and Visual Studio (2008 beta back then) all going at the same time was a good part of the problem :). Anyway, I was getting to the point where I figured I had to fix the system so I could add more RAM, but thought I would give ReadyBoost a try, so I purchased a 4 GB ReadyBoost-compatible flash drive and gave it a whirl. While the system can still churn out a lot of virtual RAM use at times, it no longer locks up; it remains mostly usable even under load. That is about the only difference I noticed using ReadyBoost - it just made the virtual RAM access much more survivable - but that is enough; it justified the purchase.
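                                     A quick way to see the split Rocky is describing for a single process - resident working set versus pagefile-backed commit - is GetProcessMemoryInfo; a minimal sketch, not from the original post:

                                     // A process can look small in RAM while holding a lot against the
                                     // pagefile; a big gap here means paging pressure is possible even
                                     // when the RAM figure looks modest.
                                     // Build: cl /EHsc procmem.cpp psapi.lib
                                     #include <windows.h>
                                     #include <psapi.h>
                                     #include <cstdio>

                                     int main()
                                     {
                                         PROCESS_MEMORY_COUNTERS pmc{};
                                         pmc.cb = sizeof(pmc);
                                         if (!GetProcessMemoryInfo(GetCurrentProcess(), &pmc, sizeof(pmc)))
                                             return 1;

                                         const double MB = 1024.0 * 1024.0;
                                         std::printf("working set (in RAM):  %6.1f MB\n", pmc.WorkingSetSize / MB);
                                         std::printf("pagefile-backed size:  %6.1f MB\n", pmc.PagefileUsage / MB);
                                         std::printf("page faults so far:    %lu\n", pmc.PageFaultCount);
                                     }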

                                    Rocky <>< Blog Post: Silverlight goes Beta 2.0 Tech Blog Post: Cheap Biofuels and Synthetics coming soon?

    • Rocky Moore replied (#35, to Member 96's post #18 above):

                                       Constant HD access is not always for this reason. If virtual RAM is moving, it can cause it. Then there is the Search Index service that can keep it spinning, along with Defender and that one SVN client (Tortoise or something like that). There was one more along these lines, but I do not recall what it was. The old SysInternals drive monitor was handy in finding out all the services pulling my drive around. At least now my HD light gets a bit of a rest :)

                                      Rocky <>< Blog Post: Silverlight goes Beta 2.0 Tech Blog Post: Cheap Biofuels and Synthetics coming soon?

    • Member 96 replied (#36, to Rocky Moore's post #35 above):

                                         It was either SuperFetch or indexing. I fired up Process Monitor (the replacement for the old SysInternals thing), narrowed it down to one or the other or both, and shut them both down a while back. I have a quiet office and the constant rumbling from my SATA array was pissing me off. My computer is plenty fast without those features turned on.


                                        When everyone is a hero no one is a hero.

    • Luis Alonso Ramos replied (#37, to Mike Dimmick's post #26 above):

                                          Wow Mike, great post! I just ordered Windows Internals (the Windows Internals I read was written in 1993 by Matt Pietrek about Win3.1). I guess it will be an interesting read. Thanks for your post, it really cleared many things up. I'll bookmark this in my blog in case I want to refer to it later. :)

                                          Luis Alonso Ramos Intelectix Chihuahua, Mexico

                                          My Blog!
