Storing huge numbers of files without creating directory hierarchy

IT & Infrastructure | question | 5 Posts, 4 Posters
seasaw wrote (#1):

I have an application that needs to store over 500,000 PDF files and be able to get at them easily. I do not want to have to create and manage multiple level directory structures just to split up the files. Can anyone recommend a solution (open source or third party) that can deal with storing, indexing, and retrieving this many files?

Pete OHanlon wrote (#2):

Well - you're going to need some directory structure because, as far as I remember, you can't store that many files in one directory. I may be wrong, but there used to be a limit in NT and I haven't read anything to suggest that this limit has been removed.

Deja View - the feeling that you've seen this post before.

My blog | My articles
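
As a rough illustration of the point above (not from the thread), here is a minimal Python sketch of a hash-based fan-out: the code derives a two-level directory from a hash of each document ID, so the tree is created and looked up automatically rather than designed or managed by hand, and no single directory ever holds more than a small slice of the 500,000 files. STORE_ROOT, put_pdf, and get_pdf are hypothetical names chosen for the example.

```python
# Hypothetical sketch (not from the thread): map each document ID to
# <root>/<aa>/<bb>/<doc_id>.pdf via its MD5 hex digest, so the fan-out
# is computed, not hand-maintained.
import hashlib
import shutil
from pathlib import Path

STORE_ROOT = Path("pdf_store")  # assumed location, adjust as needed

def _slot(doc_id: str) -> Path:
    """Derive the hashed two-level path for a document ID."""
    digest = hashlib.md5(doc_id.encode("utf-8")).hexdigest()
    return STORE_ROOT / digest[:2] / digest[2:4] / f"{doc_id}.pdf"

def put_pdf(doc_id: str, source: Path) -> Path:
    """Copy a PDF into its hashed slot, creating directories on demand."""
    target = _slot(doc_id)
    target.parent.mkdir(parents=True, exist_ok=True)
    shutil.copy2(source, target)
    return target

def get_pdf(doc_id: str) -> Path:
    """Return the stored path for a document ID (raises if never stored)."""
    target = _slot(doc_id)
    if not target.exists():
        raise FileNotFoundError(doc_id)
    return target
```

With two hex characters per level there are 65,536 buckets, so 500,000 files average fewer than ten per directory, well clear of the per-directory slowdowns discussed later in the thread.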

David Crow wrote (#3):

Pete O'Hanlon wrote:

...there used to be a limit in NT...

Wasn't that for the root folder only?

"Love people and use things, not love things and use people." - Unknown

"To have a respect for ourselves guides our morals; to have deference for others governs our manners." - Laurence Sterne

Member 4194593 wrote (#4):

Why not try WinZip 10.0?

seasaw wrote (#5):

Interesting idea - thanks. The old ZIP format could only store 65K files, but looking at the WinZip web site, their latest format can store "unlimited" files.

For those who replied about the number of files NTFS can manage: according to the Microsoft web site (reference below), NTFS can handle 4,294,967,295 files per volume (2^32 - 1), and it does not matter whether these are in the root or in sub-directories. The problem is that when the number of files in a single directory approaches 300,000, maintaining the 8.3 short file name cross reference causes NTFS to get very inefficient. So storing the PDF files in a ZIP file may be the way to go (I will test this out). THANKS for the response...

http://www.microsoft.com/technet/prodtechnol/windows2000serv/reskit/prork/prdf_fls_pxjh.mspx?mfr=true

Maximum Sizes on NTFS Volumes

In theory, the maximum NTFS volume size is 2^64 clusters. However, there are limitations to the maximum size of a volume, such as volume tables. By industry standards, volume tables are limited to 2^32 sectors. Sector size, another limitation, is typically 512 bytes. While sector sizes might increase in the future, the current size puts a limit on a single volume of 2 terabytes (2^32 * 512 bytes, or 2^41 bytes). For now, 2 terabytes is considered the practical limit for both physical and logical volumes using NTFS. Table 17.5 lists NTFS size limits.

Description            Limit
Maximum file size      2^64 - 1 KB (theoretical); 2^44 - 64 KB (implementation)
Maximum volume size    2^64 clusters (theoretical); 2^32 clusters (implementation)
Files per volume       2^32 - 1
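
As a rough sketch of the ZIP-archive idea above (using Python's zipfile module purely for illustration; seasaw mentions WinZip, but any ZIP64-capable tool or library should behave similarly), one archive can hold all of the PDFs in a flat namespace, with the ZIP central directory serving as the index. ARCHIVE, add_pdf, read_pdf, and list_ids are hypothetical names for the example; ZIP64 is what lifts the old 65K-entry limit.

```python
# Illustrative sketch: one ZIP64 archive as a flat store for many PDFs.
import zipfile
from pathlib import Path

ARCHIVE = Path("pdf_store.zip")  # assumed archive name

def add_pdf(doc_id: str, source: Path) -> None:
    """Append one PDF to the archive under a flat, ID-based entry name."""
    with zipfile.ZipFile(ARCHIVE, "a", compression=zipfile.ZIP_DEFLATED,
                         allowZip64=True) as zf:
        zf.write(source, arcname=f"{doc_id}.pdf")

def read_pdf(doc_id: str) -> bytes:
    """Fetch one PDF's bytes by ID; the ZIP central directory is the index."""
    with zipfile.ZipFile(ARCHIVE, "r") as zf:
        return zf.read(f"{doc_id}.pdf")

def list_ids() -> list:
    """Enumerate stored document IDs without extracting anything."""
    with zipfile.ZipFile(ARCHIVE, "r") as zf:
        return [name[:-4] for name in zf.namelist() if name.endswith(".pdf")]
```

One practical caveat: reopening the archive in append mode for every file is slow for a bulk load, so in practice you would keep a single ZipFile handle open while importing the 500,000 files.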
