Question: Number of folders on a server

GaryPanic

Guys - quickie - is there a maximum number of folders you can have on a server?

Reason:

I have a database set up to create a folder per record to store PDF/email/Word files in, and I can envisage around 2,000-3,000 records per year.
At the end of a 12-month period I can archive off about 60-70%,
but over a 3-4 year period I can see 10,000 folders quite easily - will this cause a network problem?
 
Gary

No idea about the folders question, but just a little thought.

If you use a naming convention for the various files that includes the record reference, then you may be able to get away with a lot fewer folders.

I did a similar sort of thing with 60,000 drawings. I used the Drg prefix as the designated folder name in which the drawing was to be filed.

Using a naming convention also allows automatic creation of hyperlinks etc.

Please ignore this chat if you have already gone through this, but I just like to help if I can :) and I was bored :)

L
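The prefix-as-folder convention above could be sketched along these lines (a hypothetical Python sketch; the 'Drg' numbering format and helper name are invented for illustration):

```python
import os

def drawing_path(root, drawing_no):
    """Map a drawing number like 'Drg-12345' to its filing folder,
    using the leading part of the number as the folder name,
    e.g. 'Drg-12345' -> root/Drg-12/Drg-12345.pdf."""
    prefix = drawing_no[:6]              # e.g. 'Drg-12'
    return os.path.join(root, prefix, drawing_no + ".pdf")
```

Because the path is derived purely from the reference, the same rule can also generate hyperlinks back to the file automatically.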
 
Thanks - I already create a folder based on a number, and also open the folder based on that number, all within Access. I don't think I will have a problem, but I thought it worth asking before I go down this route.
I think 10,000 will be my limit, because I am going to archive and burn off onto disc, and have a disc reference to the old file store on my record (something along the lines of 'Archived - disc 45 / 46 / 47' etc.)
 
Addition: within each folder will be up to 30 docs/emails/PDFs. Some may have more and some might only have 3-4, but the average is on the high side (one of my paper files is 3 inches thick, and another is 5 pages).
So you can see 10,000 × 30 = 300,000 files of mixed types.

But thanks, Len.
 
The limit on folders is pretty high; I'm not sure if there is a theoretical limit on folders. There is definitely a limit on the number of files (of which "folders" are a sub-set). However, there is a HUGE, and I mean GARGANTUAN, performance hit to be taken when you have huge numbers of folders.

See also this post: http://forums.yellowworld.org/archive/index.php/t-19104.html

The file system treats a folder like a file. (Well, ... DUH, that's what it is...). To find a file in a folder, the folder scanner reads the folder into memory and tries to find what you asked for by name. If you have, say, 60,000 such folders, then you have to search a file with 60,000 entries each time you want to open a new file. Don't even ASK what it is like when you have to insert a file reference. Deleting files and printing a directory listing will also become cumbersome.

It doesn't get any better if you have just one folder and 60,000 file names (with varying file types). Same problem, different file type. Bloated is bloated however you cut it.
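A rough back-of-envelope model of that cost, under the linear-scan assumption described above (a sketch, not a measurement of any real file system; the fan-out and folder counts are made up):

```python
import math

def entries_scanned_flat(n_folders):
    """One directory holding every record folder: a worst-case
    name lookup that scans linearly touches every entry."""
    return n_folders

def entries_scanned_split(n_folders, fanout=100):
    """Two-level split (e.g. by digits of the record number):
    scan up to `fanout` entries at the top level, then the
    remaining entries spread across the chosen subfolder."""
    return fanout + math.ceil(n_folders / fanout)
```

With 10,000 record folders, the flat layout scans up to 10,000 entries per lookup, while the two-level split scans around 200 - which is why subdividing pays off even before ZIP files enter the picture.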

Putting this on another drive wouldn't be so bad if it is local to the machine doing this. But remember, Access isn't the ONLY thing that bogs down with file servers. Opening a window to a folder with 60,000 items in it when running over a network would consume a significant part of your youth, assuming you still had it.

I don't know that I have an answer, but you might be able to get somewhere if you could perhaps use WinZip AND if there is an ActiveX library for it that you could use as a reference. Then at least the individual files for each number become less of an issue and you become better able to subdivide the folders. Say, a top-level folder of the first digit or first two digits of your index number, at most a second-level folder of the next two digits, and then store the raw files (or .ZIP file) if possible. The more you can store in a ZIP archive, the better off you are in terms of files, waste of slack space, and in being able to find the files you want in the first place.

Just because you CAN make a s**tpot load of files doesn't mean you really should.
 
Big thanks - I had considered the WinZip option. I doubt I will get to 10,000 folders, but putting a huge buffer limit of 10k seemed like a good idea.

What I will be doing is, as each case is renewed, I will archive off (in batches) onto DVDs - this will then reduce the server storage. However, some files I will need to keep for 6 years+.

(Insurance related and claims related.)
i.e. a PL claim can take 6 years to complete:
Before anyone complains about the length of time - it's in the claimant's interest to have it this way; insurance companies would prefer 3 years.

A claim can be made up to 3 years after the event, and it can take 3 years to settle.

Each year it takes to settle adds more money onto the claim.

i.e. if you lost a hand 6 years ago you would have got £5-7.5k - now £25k.
So delaying the claim makes sense for the claimant, but for the insurer it is a nightmare, as they have to reserve quite high on a premium that's relatively small.
 
