Storing huge numbers of files without creating directory hierarchy
-
I have an application that needs to store over 500,000 PDF files and be able to get at them easily. I do not want to have to create and manage multiple level directory structures just to split up the files. Can anyone recommend a solution (open source or third party) that can deal with storing, indexing, and retrieving this many files?
-
I have an application that needs to store over 500,000 PDF files and be able to get at them easily. I do not want to have to create and manage multiple level directory structures just to split up the files. Can anyone recommend a solution (open source or third party) that can deal with storing, indexing, and retrieving this many files?
Well - you're going to need some directory structure because, as far as I remember, you can't store that many files in one directory. I may be wrong, but there used to be a limit in NT and I haven't read anything to suggest that this limit has been removed.
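If some hierarchy does turn out to be unavoidable, the usual workaround is to derive a shallow fan-out from a hash of the file name, so the layers are created and navigated by code rather than managed by hand. A minimal sketch (the function name and layout here are illustrative, not from any particular product):

```python
import hashlib
import os

def bucket_path(root, filename, levels=2, width=2):
    """Map a file name to a shallow directory fan-out derived from its
    MD5 hex digest, e.g. root/ab/cd/report.pdf. Two 2-hex-digit levels
    give 65,536 buckets, so 500,000 files average under 8 per directory."""
    digest = hashlib.md5(filename.encode("utf-8")).hexdigest()
    # Take consecutive slices of the digest as directory names.
    parts = [digest[i * width:(i + 1) * width] for i in range(levels)]
    return os.path.join(root, *parts, filename)
```

The application only ever computes `bucket_path(...)` to store or retrieve a file, so the hierarchy never has to be maintained manually.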
Deja View - the feeling that you've seen this post before.
-
Well - you're going to need some directory structure because, as far as I remember, you can't store that many files in one directory. I may be wrong, but there used to be a limit in NT and I haven't read anything to suggest that this limit has been removed.
Deja View - the feeling that you've seen this post before.
Pete O'Hanlon wrote:
...there used to be a limit in NT...
Wasn't that for the root folder only?
"Love people and use things, not love things and use people." - Unknown
"To have a respect for ourselves guides our morals; to have deference for others governs our manners." - Laurence Sterne
-
I have an application that needs to store over 500,000 PDF files and be able to get at them easily. I do not want to have to create and manage multiple level directory structures just to split up the files. Can anyone recommend a solution (open source or third party) that can deal with storing, indexing, and retrieving this many files?
Why not try WinZip 10.0?
-
Why not try WinZip 10.0?
Interesting idea - thanks. The old ZIP format could only store 65,535 files, but looking at the WinZip web site, their latest format can store an "unlimited" number of files.

For those who replied about the number of files NTFS can manage: according to the Microsoft web site (reference below), NTFS can handle 4,294,967,295 files per volume (2^32 - 1), and it does not matter whether they are in the root or in sub-directories. The real problem is that when the number of files in a single directory approaches 300,000, maintaining the 8.3 short-name cross reference makes NTFS very inefficient. So storing the PDF files in a ZIP file may be the way to go (I will test this out). THANKS for the response...

http://www.microsoft.com/technet/prodtechnol/windows2000serv/reskit/prork/prdf_fls_pxjh.mspx?mfr=true

Maximum Sizes on NTFS Volumes

In theory, the maximum NTFS volume size is 2^64 clusters. However, there are limitations to the maximum size of a volume, such as volume tables. By industry standards, volume tables are limited to 2^32 sectors. Sector size, another limitation, is typically 512 bytes. While sector sizes might increase in the future, the current size puts a limit on a single volume of 2 terabytes (2^32 * 512 bytes, or 2^41 bytes). For now, 2 terabytes is considered the practical limit for both physical and logical volumes using NTFS. Table 17.5 lists NTFS size limits.

Description          Limit
Maximum file size    2^64 - 1 KB (theoretical); 2^44 - 64 KB (implementation)
Maximum volume size  2^64 clusters (theoretical); 2^32 clusters (implementation)
Files per volume     2^32 - 1
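As a quick way to test the ZIP-as-container idea without WinZip itself, Python's standard `zipfile` module supports the ZIP64 extensions that lift the 65,535-entry limit. A minimal sketch (the archive path and entry names are illustrative):

```python
import zipfile

def store_pdf(archive_path, name, data):
    """Append one PDF to the archive. allowZip64=True enables the ZIP64
    extensions, lifting the classic 65,535-entry limit."""
    with zipfile.ZipFile(archive_path, mode="a", allowZip64=True) as zf:
        # PDFs are already compressed internally, so store them as-is
        # rather than paying for deflate on each write.
        zf.writestr(name, data, compress_type=zipfile.ZIP_STORED)

def fetch_pdf(archive_path, name):
    """Retrieve one PDF by name; the ZIP central directory is the index."""
    with zipfile.ZipFile(archive_path) as zf:
        return zf.read(name)
```

One caveat to check in testing: appending rewrites the central directory on every close, so batching many writes into a single `ZipFile` session will be much faster than one open/close per file.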