To defrag an SSD or not to defrag an SSD

Posted in The Lounge
Tags: performance, algorithms, question
Cp Coder (#1):

A couple of threads earlier I asked whether one should defrag an SSD or not. I got different answers, so I tried an experiment. I regularly use Macrium Reflect to create images of my C: drive, which is an NVMe M.2 SSD; Reflect is extremely fast and will create an image as quickly as C: can feed it data. I created a system image and noted the speed at which Reflect was writing it to the target: it reached a maximum of 6.7 GB/s. Then I ran "defrag C:" from a command prompt and got a report that the C: drive was 20% fragmented before it was successfully defragged. I ran Reflect again, and this time it reached a maximum of 7.8 GB/s! It seems to me that the speed at which an SSD can read large volumes of data is affected by fragmentation. Note that I ran the trim command on the same drive yesterday, and it did not remedy the fragmentation. Thanks to all who expressed an opinion on SSD fragmentation, but I will be running defrag from time to time. If that shortens the life of the SSD, well, they are cheap and easy to replace! :)

Windows reported as follows after defragging the C: drive:

Pre-Optimization Report:

    Volume Information:
        Volume size             = 930.65 GB
        Free space              = 868.64 GB
        Total fragmented space  = 20%
        Largest free space size = 863.72 GB

    Note: File fragments larger than 64MB are not included in the fragmentation statistics.

The operation completed successfully.

Post Defragmentation Report:

    Volume Information:
        Volume size             = 930.65 GB
        Free space              = 868.64 GB
        Total fragmented space  = 0%
        Largest free space size = 863.75 GB

    Note: File fragments larger than 64MB are not included in the fragmentation statistics.
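For reference, these are the standard Windows commands for each of the operations described above (a sketch; the post only quotes a plain "defrag C:", so the analyze and retrim variants are shown for completeness rather than as the exact invocations used). All of them need an elevated prompt:

    # Analyze only: report fragmentation without moving any data
    defrag C: /A /U /V

    # Traditional defragmentation (what a plain "defrag C:" performs)
    defrag C: /U /V

    # Retrim: send TRIM for the free space; this does not consolidate fragments
    defrag C: /L

    # PowerShell equivalents from the Storage module
    Optimize-Volume -DriveLetter C -Analyze -Verbose
    Optimize-Volume -DriveLetter C -Defrag  -Verbose
    Optimize-Volume -DriveLetter C -ReTrim  -Verbose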

    Ok, I have had my coffee, so you can all come out now!

Maximilien (#2), in reply to Cp Coder (#1):

      The only reason we used to use the defrag tool was to watch the animation of the blocks being moved around.

      CI/CD = Continuous Impediment/Continuous Despair

Dave Kreskowiak (#3), in reply to Cp Coder (#1):

        Did you run the tests more than once before and after the defrag?

Asking questions is a skill | CodeProject Forum Guidelines | Google: C# | How to debug code. Seriously, go read these articles. Dave Kreskowiak

Cp Coder (#4), in reply to Dave Kreskowiak (#3):

A valid question! I just now ran the "after" test again and got the same result. The "before" test I had run many times over the preceding weeks and never got the speed I am getting now. Also, I did a clean install on this machine three days ago, which may explain the 20% fragmentation.
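For a repeatable check that does not depend on Reflect at all, a rough PowerShell sketch like the one below times sequential reads of a single large file over several runs (the file path is a placeholder; note that Windows caches file data aggressively, so a file much larger than RAM, or a reboot between runs, gives more honest numbers):

    $testFile = 'C:\SomeLargeFile.bin'    # placeholder: any multi-GB file on the drive
    $buffer   = New-Object byte[] (8MB)   # 8 MB read buffer

    foreach ($run in 1..3) {
        $stream = [System.IO.File]::OpenRead($testFile)
        $sw     = [System.Diagnostics.Stopwatch]::StartNew()
        $total  = 0L
        while (($n = $stream.Read($buffer, 0, $buffer.Length)) -gt 0) { $total += $n }
        $sw.Stop()
        $stream.Dispose()
        'Run {0}: {1:N2} GB/s' -f $run, (($total / 1GB) / $sw.Elapsed.TotalSeconds)
    }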

          Ok, I have had my coffee, so you can all come out now!

Jeremy Falcon (#5), in reply to Cp Coder (#1):

Cp-Coder wrote:

It seems to me that the speed at which an SSD can read large volumes of data is affected by fragmentation.

People who pretend to be experts and claim that, because an SSD has no mechanical parts, fragmentation is no longer an issue clearly don't understand that the file system software can also be a bottleneck. But the fact is, nobody verifies anything and just repeats crap. That being said, it's much less of an issue these days. Back in the day, if you had a fragmented filesystem... you would know. SSDs work substantially quicker than mechanical drives, in theory at the speed of electricity. It's still a tradeoff between defragging and a shorter lifespan of the drive, though. I mean, they're much cheaper now and last a long time, but the tradeoff is still worth knowing about. Just a tip to keep the FS from fragmenting: if you have a bunch of files _that you move around a lot_, you can always dump them in a zip file that's lightly compressed. It'll spare your real file system. Granted, it's probably better to do this for tiny text files that aren't source controlled, so maybe it's not practical.
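A hedged sketch of that zip tip using the built-in archive cmdlets (the paths are placeholders, and Fastest corresponds to the "lightly compressed" setting):

    # Bundle a folder of small, frequently shuffled files into one lightly
    # compressed archive, so the churn happens inside a single file instead
    # of scattering new fragments across the file system.
    Compress-Archive -Path 'C:\Scratch\notes\*' -DestinationPath 'C:\Scratch\notes.zip' -CompressionLevel Fastest

    # Pull the files back out when you need to work on them again.
    Expand-Archive -Path 'C:\Scratch\notes.zip' -DestinationPath 'C:\Scratch\notes' -Force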

            Jeremy Falcon

fgs1963 (#6), in reply to Cp Coder (#1):

So I have this Ferrari Daytona SP3... and I was wondering if I could use jet fuel instead of gasoline. I asked online, and most people suggested that jet fuel was probably a bad idea since it would likely shorten the car's life. I decided to test it at the track and discovered that using gasoline got me from 0-100 mph in about 5.8 s and topped out at about 210 mph. However, if I used jet fuel it got me from 0-100 mph in 4.7 s and topped out at 242 mph. Awesome! I'm sticking with jet fuel!! Never mind that I live in the city and my average car trip is less than 3 miles (round trip). Bottom line: I'm not sure that using Macrium Reflect is the best judge of your real-world system performance. Just saying...

Cp Coder (#7), in reply to fgs1963 (#6):

                Since I use Macrium Reflect almost on a daily basis, it is a valid metric for me. I will continue doing what is best for me, and you can do whatever works for you!

                Ok, I have had my coffee, so you can all come out now!

dandy72 (#8), in reply to Cp Coder (#1):

I would imagine there's some overhead in processing a ton of file pointers to determine where the next chunk of a badly fragmented file is, as opposed to having a file stored in one continuous chain. Would that account for the difference? I have no idea.

Still, I don't know about Macrium's internals, but in theory, if a backup program worked by copying entire disks/partitions, as opposed to reading file systems, then it wouldn't matter how fragmented (or not) a disk is, or whether the software even needs to understand what file system is being used. Of course, that means backing up a 1 TB drive that's only 10% full will back up 1 TB and not 100 GB.

I have a 2-disk USB enclosure that's like this. There's a button on the front that, if held when powering up, will blindly clone one drive to the other, regardless of file system (assuming the target is the same or larger capacity). And if the source drive has tons of fragmentation, the individual cloned files will be just as badly fragmented.

fgs1963 (#9), in reply to Cp Coder (#7):

                    Cp-Coder wrote:

                    I will continue doing what is best for me, and you can do whatever works for you!

                    Of course... BTW - I was joking with my "analogy"; hence the "joke" icon of my post. Out of curiosity, what role / function does this PC perform that necessitates such heavy use of Reflect?

dandy72 (#10), in reply to Cp Coder (#7):

Ultimately that's all that matters, isn't it? If you have a measurable difference in performance, stick with your current method.

Cp Coder (#11), in reply to fgs1963 (#9):

I keep my data on a separate drive, which I back up separately, so the C: drive only has Windows and the applications. So my Macrium images take less than a minute to create. Since it hardly takes any time, I take an image first thing every morning, and I can restore my machine to a previous state if I pick up anything nasty or unwelcome.

                        Ok, I have had my coffee, so you can all come out now!

Cp Coder (#12), in reply to dandy72 (#8):

No. I have set up Macrium so that it only includes actual files in the image, so my images reflect the size of the used parts of the disk, not the entire disk. This works very well, and I have restored my C: drive from such images dozens of times. Macrium also includes all partitions on the system drive by default. It is really a fantastic utility for restoring your machine in case of some disaster.

                          Ok, I have had my coffee, so you can all come out now!

jschell (#13), in reply to Cp Coder (#11):

                            Cp-Coder wrote:

                            So my Macrium images take less than a minute to create

So if it slowed down by, say, 10%, would that be too slow?

jschell (#14), in reply to fgs1963 (#6):

                              fgs1963 wrote:

                              Awesome! I'm sticking with jet fuel!!

                              Next test is with the lawn mower?

Lost User (#15), in reply to Cp Coder (#1):

                                No "pointer space" reclaimed apparently. Some sort of reorg based on size and or frequency of use?

                                "Before entering on an understanding, I have meditated for a long time, and have foreseen what might happen. It is not genius which reveals to me suddenly, secretly, what I have to say or to do in a circumstance unexpected by other people; it is reflection, it is meditation." - Napoleon I

Daniel Pfeffer (#16), in reply to dandy72 (#8):

                                  dandy72 wrote:

                                  I would imagine there's some overhead in processing a ton of file pointers to determine where the next chunk of a badly fragmented file is, as opposed to having a file stored in one continuous chain. Would that account for the difference? I have no idea.

The actual read operation(s) might not take much longer, but the host turnaround time will certainly affect things.

Optimized disk:
    single read operation of N blocks

Fragmented disk:
    read operation of N1 blocks (host turnaround time 1)
    read operation of N2 blocks (host turnaround time 2)
    ...
    where N1 + N2 + ... = N

Note that if the hardware implements read gather/write scatter operations and the O/S supports them, this may mitigate much of the host overhead due to fragmentation. I know that eMMC has such operations, and I assume that most other current protocols do, too.
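As a back-of-the-envelope illustration of that model (every number below is an assumption, not a measurement: the amount of data read, the sustained sequential rate, and the per-command host turnaround), the per-fragment overhead only becomes visible once the fragment count gets large:

    $dataBytes    = 50GB   # amount of data read for the image (assumed)
    $bandwidth    = 7GB    # sustained sequential read rate, bytes per second (assumed)
    $turnaroundUs = 20     # host turnaround per read command, microseconds (assumed)

    foreach ($fragments in 1, 1000, 100000) {
        $seconds   = ($dataBytes / $bandwidth) + ($fragments * $turnaroundUs / 1e6)
        $effective = ($dataBytes / 1GB) / $seconds
        '{0,7} fragments: {1,6:N2} s total, {2:N2} GB/s effective' -f $fragments, $seconds, $effective
    }

With those made-up numbers, 100,000 fragments drops the effective rate from about 7.0 GB/s to roughly 5.5 GB/s, so a fragment count in that range could plausibly produce a gap like the one reported at the top of the thread.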

                                  Freedom is the freedom to say that two plus two make four. If that is granted, all else follows. -- 6079 Smith W.

den2k88 (#17), in reply to Maximilien (#2):

And the soothing grinding noise of the HDD.

                                    GCS/GE d--(d) s-/+ a C+++ U+++ P-- L+@ E-- W+++ N+ o+ K- w+++ O? M-- V? PS+ PE Y+ PGP t+ 5? X R+++ tv-- b+(+++) DI+++ D++ G e++ h--- r+++ y+++*      Weapons extension: ma- k++ F+2 X The shortest horror story: On Error Resume Next

den2k88 (#18), in reply to Cp Coder (#1):

A long chunk of contiguous data makes the best use of the drive's onboard read-ahead caching, allowing the disk to reach its rated speeds (e.g. 5500 MB/s reads), and it MAY be a significant improvement in loading times for large games or large video files. Defragging will shorten the SSD's lifespan, especially on cheap drives (many use denser TLC or QLC NAND to increase capacity at the cost of longevity), so if you really need to work on large video files, or regularly play huge games with long loading times even on an SSD (bad game design), I'd consider buying a high-quality, lower-capacity drive for that kind of work.

                                      GCS/GE d--(d) s-/+ a C+++ U+++ P-- L+@ E-- W+++ N+ o+ K- w+++ O? M-- V? PS+ PE Y+ PGP t+ 5? X R+++ tv-- b+(+++) DI+++ D++ G e++ h--- r+++ y+++*      Weapons extension: ma- k++ F+2 X The shortest horror story: On Error Resume Next
