C# and Large Images

Tags: graphics, csharp, winforms, performance, help

Luc Pattyn:

Hi, I would try to avoid such a situation completely, probably by starting from a grid of smaller images. Except maybe for a database, I wouldn't want to have objects that are larger than main memory. What is the size of your image, what is its source, and what is your application? :)

Luc Pattyn [Forum Guidelines] [My Articles]

The quality and detail of your question reflect on the effectiveness of the help you are likely to get. Show formatted code inside PRE tags, and give clear symptoms when describing a problem.


awaldro (#3):

Unfortunately I can't really decrease the size of the images, at least not as they enter the software. Are there any other solutions that might work with super large images?

[further detail on the issue] I'm working with scanned images. My users believe (incorrectly) that to get a good scan they need to work at 1000 dpi. Scanning an entire page yields a 250 MB image. I keep the original intact for good measure and only work with a copy of it, which means another 250 MB. Finally, the users make numerous partial copies of the original image to do whatever they do, which probably means another 50 MB x 10 copies = 500 MB. Total memory for this scenario so far is somewhere around 1 GB. If the user applies filters, these can eat up extra memory and push it over the edge. Even if the software doesn't crash in a single scenario like that, the user will probably repeat it a few times, possibly in a few different tabs at once. Long story short, it takes some work to get the software to crash, but sooner or later they swamp it with all the images they are using. No one seems too happy about the whole ordeal...

Alan Balkany (#4):

Would compression help? TIFF images support LZW compression. Maybe skip keeping the original intact to save 250 MB, and reload from disk when you want to restore it. Another idea: instead of partial copies, create a class that provides a virtual partial copy: a link to the whole image, and a Rectangle that defines a portion of the image. If a page contains only black and white text, you could use one bit per pixel, which would give you 32X compression and faster processing.
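
A rough System.Drawing sketch of both ideas might look like the code below; SaveTiffLzw and VirtualCrop are made-up names for illustration, not an existing API, and error handling is omitted.

    using System.Drawing;
    using System.Drawing.Imaging;
    using System.Linq;

    static class LargeImageHelpers
    {
        // Save a bitmap as an LZW-compressed TIFF via the built-in GDI+ TIFF encoder.
        public static void SaveTiffLzw(Bitmap bmp, string path)
        {
            ImageCodecInfo tiff = ImageCodecInfo.GetImageEncoders()
                                                .First(c => c.MimeType == "image/tiff");
            using (EncoderParameters p = new EncoderParameters(1))
            {
                p.Param[0] = new EncoderParameter(Encoder.Compression,
                                                  (long)EncoderValue.CompressionLZW);
                bmp.Save(path, tiff, p);
            }
        }
    }

    // A "virtual partial copy": no pixel data of its own, just a reference to the
    // whole image plus the rectangle of interest; pixels exist only on demand.
    class VirtualCrop
    {
        public Bitmap Source { get; private set; }
        public Rectangle Region { get; private set; }

        public VirtualCrop(Bitmap source, Rectangle region)
        {
            Source = source;
            Region = region;
        }

        // Renders just the region into a real bitmap when a filter or the display needs it.
        public Bitmap Materialize()
        {
            Bitmap copy = new Bitmap(Region.Width, Region.Height, PixelFormat.Format32bppArgb);
            using (Graphics g = Graphics.FromImage(copy))
            {
                g.DrawImage(Source,
                            new Rectangle(Point.Empty, Region.Size), // destination: the whole copy
                            Region,                                   // source: the region of interest
                            GraphicsUnit.Pixel);
            }
            return copy;
        }
    }

That way ten "partial copies" cost a few bytes each instead of 50 MB each; real pixel buffers only exist while the result of Materialize() is alive.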

awaldro (#5):

Thanks for these great ideas - just a few thoughts on them:
  • I'm not sure about compression; I shall have to try it.
  • I have a cache already; I shall check it to make sure it operates correctly.
  • Sometimes my users delete the "original" from the screen, so "virtual" copies wouldn't work well then.
  • The 1 bpp idea is probably the best one, but all of the filters I have right now only work on 32 bpp ARGB :( I shall have to rewrite them so I can directly modify a 1 bpp picture. By the way, can .NET/GDI+ work directly with 1 bpp? (See the sketch below.)
Keep the ideas coming!
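
On the 1 bpp question: GDI+ can hold a PixelFormat.Format1bppIndexed bitmap, but Graphics.FromImage and SetPixel refuse indexed formats, so the bits have to be written through LockBits. A rough sketch of a threshold conversion (deliberately using slow GetPixel on the source to keep it short):

    using System.Drawing;
    using System.Drawing.Imaging;
    using System.Runtime.InteropServices;

    static class OneBppConverter
    {
        // Converts any bitmap to 1 bpp black/white by thresholding its brightness.
        // The default 1 bpp palette is index 0 = black, index 1 = white.
        public static Bitmap ToBlackAndWhite(Bitmap source, int threshold)
        {
            Bitmap result = new Bitmap(source.Width, source.Height, PixelFormat.Format1bppIndexed);
            BitmapData data = result.LockBits(new Rectangle(0, 0, result.Width, result.Height),
                                              ImageLockMode.WriteOnly, PixelFormat.Format1bppIndexed);
            byte[] bits = new byte[data.Stride * result.Height]; // 1 bit per pixel, rows padded to Stride

            for (int y = 0; y < source.Height; y++)
            {
                for (int x = 0; x < source.Width; x++)
                {
                    Color c = source.GetPixel(x, y); // LockBits the source as well in real code
                    if ((c.R + c.G + c.B) / 3 > threshold)
                        bits[y * data.Stride + (x >> 3)] |= (byte)(0x80 >> (x & 7)); // set bit = white
                }
            }

            Marshal.Copy(bits, 0, data.Scan0, bits.Length);
            result.UnlockBits(data);
            return result;
        }
    }

At 1 bpp a letter-size page at 1000 dpi (roughly 8,500 x 11,000 pixels) is only about 11 MB, versus several hundred MB at 32 bpp ARGB.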

Huxley Stronghead (#6):

awaldro wrote:

"I'm working with scanned images. My users believe (incorrectly) that to get a good scan they need to work at 1000 dpi. Scanning an entire page yields a 250 MB image."

If I may ask, what is on their pages that would require such a resolution? The last time I scanned at such a resolution, I could count almost every single atom! If you know what I mean. :omg:

awaldro (#7):

I certainly do - the resolution you can get from a reasonable scanner is quite amazing! The users are scanning latent prints from crime scenes and lift cards. Unfortunately, none of them (or you) would be happy with an invalid positive ID - an "inclusion" - so they use 1000 dpi. Additionally, there is a standards board regulating the latent-print software market that 'requires' 1000 dpi. It's really funny - some users think they are going "above and beyond" by scanning at 1200 dpi! I believe I am going to try writing my own C++ code (piecing together parts of FreeImage and a few other free libraries I have found). Who knows how this experiment shall turn out. Of course, if there are any other ideas - I am still open!

Huxley Stronghead (#8):

Why is 1 GB of memory a problem? Are you talking about RAM? :confused:

I came, saw, and then coded.

awaldro (#9):

The problem isn't "1 GB". The problem is "1 GB per Tab" - which is a bit different!

Huxley Stronghead (#10):

Oh yes, the tabs you mentioned. How much RAM do the machines have? 3 GB of RAM plus virtual memory may still be enough. Have a look at the virtual memory; if that fills up, then you have a problem.

I came, saw, and then coded.

Huxley Stronghead (#11):

OK, I have my doubts that you actually have a problem here, but this could be a solution if you really want to code something... This idea has already been mentioned, but I shall explain it a bit deeper; I would like to call it the Google Maps approach. The idea is to cut the big monster image into smaller pieces, store those pieces on the HDD as temporary files, and only load the ones that are within the view. That means you will need some sort of zooming function too, and you will have to map the visible pixels in the view onto the pixels of the image pieces you use. That mapping also tells you whether downsizing could help; of course, the downsizing would only happen to the pieces loaded into RAM, not to the original or to the cut-up version on the HDD.

If they apply a filter, then you load all the image pieces from the HDD, process them, and save them back. There may be some lag because of the loading time from the HDD, but hey, they want to work with monster images! In essence this approach is a kind of streaming: you only load what you actually need to work with, not the entire thing. I never coded it before, but I think Google did. :^)

I came, saw, and then coded.
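
A minimal sketch of that tiling idea (names like TileStore are made up for illustration; the one-time cutting step still needs the whole image in memory once, or a library that can read regions directly):

    using System.Collections.Generic;
    using System.Drawing;
    using System.Drawing.Imaging;
    using System.IO;

    // Cuts a huge image into fixed-size tile files once, then hands back only the
    // tiles that intersect the current viewport, so RAM holds just the visible part.
    class TileStore
    {
        const int TileSize = 1024;
        readonly string folder;
        public Size ImageSize { get; private set; }

        public TileStore(string sourcePath, string tileFolder)
        {
            folder = tileFolder;
            Directory.CreateDirectory(folder);
            using (Bitmap source = new Bitmap(sourcePath)) // one-time cut: needs the full image once
            {
                ImageSize = source.Size;
                for (int ty = 0; ty * TileSize < source.Height; ty++)
                    for (int tx = 0; tx * TileSize < source.Width; tx++)
                    {
                        Rectangle r = Rectangle.Intersect(
                            new Rectangle(tx * TileSize, ty * TileSize, TileSize, TileSize),
                            new Rectangle(Point.Empty, source.Size));
                        using (Bitmap tile = source.Clone(r, source.PixelFormat))
                            tile.Save(TilePath(tx, ty), ImageFormat.Png);
                    }
            }
        }

        string TilePath(int tx, int ty)
        {
            return Path.Combine(folder, tx + "_" + ty + ".png");
        }

        // Yields (position, tile) pairs for the tiles overlapping the viewport; the caller
        // draws each tile at its position and disposes it afterwards.
        public IEnumerable<KeyValuePair<Point, Bitmap>> TilesFor(Rectangle viewport)
        {
            for (int ty = viewport.Top / TileSize; ty * TileSize < viewport.Bottom; ty++)
                for (int tx = viewport.Left / TileSize; tx * TileSize < viewport.Right; tx++)
                {
                    string path = TilePath(tx, ty);
                    if (File.Exists(path))
                        yield return new KeyValuePair<Point, Bitmap>(
                            new Point(tx * TileSize, ty * TileSize), new Bitmap(path));
                }
        }
    }

With 1024-pixel tiles a screen-sized viewport only ever touches a handful of tiles, so memory use is bounded by the view instead of by the scan.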

awaldro (#12):

The computers have 4 GB of physical memory (32-bit CPUs), but I certainly don't get 4 GB of it. Actually, it seems like I get roughly 1.1 GB. I presumed this was a .NET thing. Is there a site anywhere that actually says how much memory the .NET runtime can allocate?
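
One way to see the real ceiling is simply to measure it. A throwaway probe like the sketch below (not production code) keeps allocating 50 MB chunks until the CLR gives up; by default a 32-bit process is capped by its 2 GB user address space, and fragmentation usually makes a single huge Bitmap fail well before that.

    using System;
    using System.Collections.Generic;

    class MemoryProbe
    {
        static void Main()
        {
            List<byte[]> chunks = new List<byte[]>();
            try
            {
                while (true)
                    chunks.Add(new byte[50 * 1024 * 1024]); // keep references so the GC can't reclaim them
            }
            catch (OutOfMemoryException)
            {
                Console.WriteLine("Allocated roughly {0} MB before running out.", chunks.Count * 50);
            }
        }
    }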

awaldro (#13):

This is pretty much in line with what I was thinking I might have to do. There is quite a bit of work to it, but it just might be the way to do it! Thanks for the assistance.

Huxley Stronghead (#14):

No problem, great minds think alike. :-D

I came, saw, and then coded.

Huxley Stronghead (#15):

I found something after a short google: http://bytes.com/groups/net-vb/381325-2gb-memory-limit Apparently a 32-bit .NET process can only use a maximum of 2 GB, because the 4 GB of address space is split between user mode and kernel mode... so now I get it. And yeah, 4 GB of RAM on 32-bit Windows usually shows up as only about 3 GB; apparently Windows reserves the rest for device address ranges. :confused:

I came, saw, and then coded.
