Dan25e
Hi folks,

I'm in the process of redeveloping my company's document library database. It currently holds hyperlinks to documents stored on a network drive. This has obvious drawbacks, the most annoying being users bypassing the library and browsing the network drive directly for the document they want (and sometimes complaining about how the file has been named!). I want to use the Attachment field type in Access, doing away with the network drive location and the hyperlinks. The problem is that there are over 20,000 documents. So, is there a way of bulk-adding attachments to multiple database records, as opposed to adding them one at a time to individual records? Could the existing hyperlinks be useful in assisting with this task?

Thanks in advance
Dan
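[Editor's note: bulk-loading attachments is technically possible via DAO's `Field2.LoadFromFile`, and the existing hyperlink field can indeed supply the paths. Below is a minimal, untested sketch; the table `tblDocuments`, attachment field `Docs`, and hyperlink field `DocPath` are all placeholder names, not names from this thread.]

```vba
' Rough sketch only - tblDocuments, Docs and DocPath are assumed names;
' adjust to your own schema. Uses DAO (referenced by default in Access).
Public Sub BulkLoadAttachments()
    Dim db As DAO.Database
    Dim rs As DAO.Recordset2      ' parent table
    Dim rsAtt As DAO.Recordset2   ' child recordset behind the attachment field
    Dim strPath As String

    Set db = CurrentDb
    Set rs = db.OpenRecordset("tblDocuments", dbOpenDynaset)

    Do Until rs.EOF
        ' A hyperlink field stores "display text#address#subaddress",
        ' so keep only the address portion if a # is present.
        strPath = Nz(rs!DocPath, "")
        If InStr(strPath, "#") > 0 Then strPath = Split(strPath, "#")(1)

        If Len(strPath) > 0 Then
            If Len(Dir(strPath)) > 0 Then      ' file actually exists
                rs.Edit
                Set rsAtt = rs!Docs.Value      ' open the attachment recordset
                rsAtt.AddNew
                rsAtt!FileData.LoadFromFile strPath
                rsAtt.Update
                rsAtt.Close
                rs.Update
            End If
        End If
        rs.MoveNext
    Loop

    rs.Close
    Set db = Nothing
End Sub
```

Whether you *should* run this against 20,000 files is a separate question; note the replies in this thread about the 2 GB file-size limit first.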
 
This has obvious drawbacks, the most annoying of them being users bypassing the library and looking in the network drive for the document they want
It's not obvious to me why that is a drawback. Why is it one?

You really really do not want to do what you are proposing.


  1. Attachment fields are an Access-only feature, not portable to any other RDBMS.
  2. Documents consume quite some space; you may hit the 2 GB file-size limit sooner than expected.
  3. You have a lightweight setup now; it will not remain so after stuffing it with attachments.
  4. You have something working now. You want to invest effort, but afterwards you will have less functionality than before, not more. What is there to justify that expenditure?
Update: Windows has much better indexed search facilities for stored files than you could ever build yourself within Access.

Overall, do you have other - not mentioned here - reasons for wanting to stuff docs into Access?
 
I'm proposing this method for security reasons.

The current database has ten or so tables in it. I can work around the 2 GB limit by splitting each table into its own back end, giving me effectively 20 GB to play with, which is about 8 GB more than I need.

What actually happens when a database nears its size limit?

The intention is to upsize to SQL Server in the near future. Do the same attachment-stuffing issues apply?
 

It looks like you had made up your mind prior to writing here. Spike's first point was that attachments are native to Access and therefore not portable. So your near-future "upsizing" to SQL Server will really mean scrambling for another improbable solution.

FWIW, I did a similar project (a document management library) at a Canadian federal department three years ago. I used Access as an indexing and search engine/document server for the library. The network path to each document was a text field in a table, and you could retrieve the docs by either shelling out or hyperlinking to them in their native format. No "security" issues existed: the share containing the library was protected under the government's standard network security protocol.

But as I said, you seem dead set on looking for trouble. So I wish you good luck, whichever way that may apply in your case.

Best,
Jiri
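[Editor's note: a sketch of the path-in-a-text-field approach Jiri describes. The control name `txtDocPath` and button name `cmdOpenDoc` are invented for illustration.]

```vba
' Assumed sketch: the full UNC path lives in a plain text field, bound
' to a control called txtDocPath, and a button opens the file in its
' native application.
Private Sub cmdOpenDoc_Click()
    On Error GoTo ErrHandler
    If Len(Nz(Me!txtDocPath, "")) > 0 Then
        Application.FollowHyperlink Me!txtDocPath
    End If
    Exit Sub
ErrHandler:
    MsgBox "Could not open document: " & Err.Description, vbExclamation
End Sub
```

The database stays small because it stores only paths, and the documents remain in their native formats on the share.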
 
On the contrary, I appreciate your and Spike's input, and as such I'm starting to shy away from the idea.
 
I'm still interested to know what happens when a database bloats out. Could you enlighten me?

Speaking metaphorically, it's a good idea to tell someone to avoid sharks, but you also need to show them what they look like.
 

I am not sure I can enlighten you. Speaking metaphorically, it's really hard to say where the sharks are when the water is totally muddy. But if you want to avoid shark-infested waters in database design generally, you may want to follow these few maxims:

1. Keep things simple!
2. Index wisely!
3. Small tables are beautiful!
4. Beware of sprawling networks and creeping elegance!
5. If it ain't broke, don't fix it!
6. Don't use maxed-out strings for state abbreviations!
7. Minor improvements are often a way of begging for big disasters!

Best of luck!

Jiri
 

If it exceeds 2 GB, it just stops. You may find it is then completely unusable, even as a source to extract data from (i.e. as an import source). I found that out myself.

So make sure you have an adequate backup strategy.
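[Editor's note: one way to act on that advice before trouble hits is a periodic size check. A hedged sketch; note that `CurrentDb.Name` returns the path of the file the code runs in, so in a split design you would substitute the back end's path.]

```vba
' Assumed sketch: warn when the database file approaches the 2 GB limit.
' VBA's FileLen returns a Long, which itself overflows near 2 GB, so use
' the Scripting runtime's FileSystemObject instead.
Public Sub CheckDatabaseSize()
    Const WARN_BYTES As Double = 1.5 * 1024 ^ 3   ' warn at 1.5 GB
    Dim fso As Object
    Dim dblSize As Double

    Set fso = CreateObject("Scripting.FileSystemObject")
    dblSize = fso.GetFile(CurrentDb.Name).Size

    If dblSize > WARN_BYTES Then
        MsgBox "Database is " & Format(dblSize / 1024 ^ 3, "0.00") & _
               " GB. Compact it, and rethink storage before it hits 2 GB."
    End If
End Sub
```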

-----
SUGGESTION

Assuming you go with a link/hyperlink method, then out of interest: who puts the files (links) into the Access database? If the users don't do this themselves, you may be able to conceal the actual location of the files from them.

Maybe you could put the actual files in a folder they do not have permission to use, and just grant the permission in order for your hyperlink to work, then rescind it again.

just thinking out loud!
 
I took on a client db a few months ago and immediately saw a size issue they had. I produced an image (sensitive data blurred out) to show what would happen in 6-8 weeks' time if they carried on at the same rate.

Just to mention also that you cannot have referential integrity between tables in different databases, although who would want to maintain a mess like that?
Like Solo says, it is best to keep it simple.
 

Attachments

  • 2GB.jpg
