Huge MDB (Need to create Installer or updater)

RainX

Hi all,

I'm in a bit of a dilemma here. We currently have an MDB which includes a form and 4 tables. The entire MDB is around 800 megs. What we used to do is distribute this MDB to users in completely different locations on a CD, but now it has become too large for a CD to hold. Our manager wants us to figure out a way to distribute this MDB using only 1 CD, without zipping it. How would I go about doing that? Is there an installer that can download the database from our FTP site and then insert the tables, or is there any other, easier way I can get this done?

P.s. The database will likely get larger and larger every year

Thanks in advance
 
You are starting to get into the realm of having to consider:

Either a DVD burner or
Web Based Application or
Terminal Servers

It really boils down to how much time is spent propagating these databases versus providing a different approach. I think you need to weigh up the options: a compressed file may work now, but perhaps a more long-term strategy is needed.

Simon
 
Have you tried to compact the db to reduce the size? Why not compress the file?
 
I have compacted it using the Compact & Repair option in Access. Is there any other way to compact it?


@Simon, what do you mean by Terminal Servers? How would that work?
 
When you blow past the limits of a CD burner, your choices are limited. What I don't understand is why your bosses don't want to run a zip of the data - or, better yet, a self-extracting zip.
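To put some numbers behind the zip argument: database exports with many similar rows tend to compress extremely well. A quick sketch (in Python, just to illustrate the idea - the sample data is made up, and real MDB compression ratios will vary):

```python
import zlib

# Fabricate some repetitive "table-like" data; exports with many
# near-identical rows are exactly what deflate compression loves.
rows = b"".join(b"%08d,Widget,19.99,In Stock\r\n" % i for i in range(50_000))

compressed = zlib.compress(rows, level=9)

print(f"raw:  {len(rows):,} bytes")
print(f"zip (deflate): {len(compressed):,} bytes")
print(f"ratio: {len(compressed) / len(rows):.1%}")
```

On data like this the compressed copy is a small fraction of the original, which is why an 800 meg MDB often fits comfortably on one CD once zipped.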

Still, there are options of a backwards sort. This might (repeat MIGHT) buy you some time until you can switch to another method.

Export your tables to .txt files, delimited and with quotes around text.

Build your database empty but with indexes and relationships defined.

Write a macro that will import the .TXT files.

As part of your installation process, include a BAT file that executes Access on this database using the /x macroname option (see Command Line Options in Access Help).

Distribute the empty database and the text files. It is possible that, because of indexes, you can get back some space by building the indexes on the fly. If your tables AREN'T indexed to begin with, don't bother - this won't buy you anything. Unless... you distribute the empty DB and the exported tables split across that CD and a second CD.
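If two CDs are acceptable, the exported text files can be split mechanically and re-joined at install time. A rough sketch of the split/join step (Python here just to show the idea; a real installer could do the same from a BAT file). The chunk size is tiny for the demo - for a CD you'd use something like 650 * 1024 * 1024:

```python
import os

def split_file(path, chunk_size):
    """Split `path` into path.part0, path.part1, ... of at most chunk_size bytes."""
    parts = []
    with open(path, "rb") as src:
        i = 0
        while True:
            data = src.read(chunk_size)
            if not data:
                break
            part = f"{path}.part{i}"
            with open(part, "wb") as dst:
                dst.write(data)
            parts.append(part)
            i += 1
    return parts

def join_files(parts, out_path):
    """Reassemble the parts (in order) into out_path."""
    with open(out_path, "wb") as dst:
        for part in parts:
            with open(part, "rb") as src:
                dst.write(src.read())

# Demo with a small file and a tiny chunk size.
with open("export.txt", "wb") as f:
    f.write(b"x" * 1000)
parts = split_file("export.txt", 300)   # 300 + 300 + 300 + 100 bytes
join_files(parts, "rejoined.txt")
```

The parts just need to land in one folder before the import macro runs.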

If the powers that be don't want to use 2 CDs or a ZIP of the database, the next question is how fast a network you have to your sites where this is going. If you have a fast connection, downloading isn't so bad. If there is a dial-up modem in this mix, you are looking at a large number of connect minutes during which you are susceptible to the vagaries of the internet path you are using. Even though network transmissions are protected by checksums per block, your problem is that you still have the potential for some serious corruption.
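On the corruption point: on top of the per-block checksums the network gives you, it's worth shipping your own end-to-end hash so the receiving site can verify the whole file after the download. A minimal sketch using Python's hashlib (the filename is a placeholder):

```python
import hashlib

def sha256_of(path, bufsize=1 << 20):
    """Hash a file in chunks, so even an 800 MB MDB never needs to fit in RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(bufsize):
            h.update(chunk)
    return h.hexdigest()

# The sender publishes this digest alongside the file...
with open("backend.mdb", "wb") as f:
    f.write(b"pretend this is the 800 MB database")
published = sha256_of("backend.mdb")

# ...and the receiving site recomputes it after the FTP download.
downloaded = sha256_of("backend.mdb")
ok = (downloaded == published)
```

If the digests don't match, re-download before anyone opens the database.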

Terminal Services relates to using something like CITRIX as a way to communicate with a place where the database is kept so that your users can run it remotely. This is not advisable for several reasons including the need for a special Access license to run Access over CITRIX. (Normally, the end user license agreement says "one user per copy of Access".)

Also, depending on what you want to do, this might not work well anyway because there can be many mapping issues with TS connections and CITRIX as an environment. The success rate of Terminal Services connections can be best described as ... questionable at best. They are strongly dependent on whether the network admins allow CITRIX to do what it needs to do in that environment and whether the person managing the CITRIX server is very familiar or NOT very familiar with CITRIX.

Personally, I'd vote for the DVD burner since most recent Windows boxes will read DVDs - I'd say for the last two or three years minimum. Of course, if you have older machines, that becomes a problem too. If DVD is not right, the next best thing in my book would be multiple CDs and then at the other end, import the data files you need into the tables.
 
the next question is how fast a network you have to your sites where this is going.
The files are distributed to offices in our neighboring cities. They all have pretty fast connections (at the offices). We already have an FTP site running where people with a username/password download various stuff.
What I had planned was to put the code for the form & reports and a layout of the tables in an MDE and distribute that on the CDs. Then create an installer that would put the MDE in a certain location on the user's computer and download the tables/back-end MDB (username/password somehow encrypted), then insert them into the MDE - or maybe download a back-end MDB with linked tables.
Is this feasible, or am I in way over my head with this?
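For what it's worth, the download step of an installer like that is not much code. A sketch using Python's ftplib - the host, credentials, paths, and folder layout are all placeholders, and in practice you'd prompt for the password rather than embed it:

```python
import os
from ftplib import FTP

def install_path(base_dir, filename="backend.mdb"):
    """Where the installer drops the back-end file (hypothetical layout)."""
    return os.path.join(base_dir, "MyApp", "Data", filename)

def download_backend(host, user, password, remote_path, local_path):
    """Fetch the back-end MDB from the FTP site in binary mode."""
    os.makedirs(os.path.dirname(local_path), exist_ok=True)
    with FTP(host) as ftp:
        ftp.login(user, password)
        with open(local_path, "wb") as f:
            ftp.retrbinary(f"RETR {remote_path}", f.write)

# Example call (placeholder values - this would really connect):
# download_backend("ftp.example.com", "user", "secret",
#                  "/data/backend.mdb", install_path(r"C:\Program Files"))
```

So yes, feasible - the harder parts are keeping the credentials out of plain text and handling a dropped connection mid-transfer.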
 
^^^ I'll give it a try, but the .txt files for the tables themselves hit about 400-500 megs.
 
The decompile helps a ton if you have a lot of programming in the background. If it's almost all data, though, then the decompile doesn't do too much.

Also, what was wrong with the self-extracting zip idea? Your boss telling you to not compress the file makes no sense. If s/he's just being a hardass for no reason, then explain that you need to upgrade everyone to DVD drives and you need a burner. Money has a tendency to talk. If s/he's afraid that some users may not be familiar with zip (sounds stupid, but it's true -- I have users that send me 8MB attachments that zip down to 600KB), the self-extracting zip only requires them to double-click it for it to install. I use 7-Zip because it's better than regular WinZip, but either will do.
 
^^ Yeah, he is being a hard-ass.
I've been trying to convince him about the self-extracting zip idea... I'll keep at it and hope he agrees...


Thanks
 
A few tips for convincing:

- Expense. Any user that doesn't have a DVD drive will need a DVD drive. If it's a small company, then it's easy. If it's a huge company like what I work for, then it's a major hassle because there's more red tape than a ticker-tape parade down Times Square and somehow, a $30 DVD drive becomes $200 per user.

- Convenience. All your users are used to getting their data on a CD. If you change that, it's going to cause a maintenance/support center nightmare. "Why is this different?" blah blah blah. Sounds stupid, but it's true. A lot of users don't know the difference between RAM and a hard drive, much less file sizes.

- Maintenance. Does he want you to spend your time figuring out ways around his stupid rule or working? You work to get the job done, not to fill some artificial need. Not compressing is like saying, "I don't care if your low fuel light is on. Do not stop for gas."

- Portability. If you need to work from home (which I do a lot), then zipping the file makes sense. Having a huge file to transfer around via DVDs, FTP, or otherwise is ridiculous when better solutions exist. (FYI: I use 8GB USB key drives to move stuff around. Maybe that's an option? I reviewed a few here and here.)

Note that the second one (the Kingston DTSP) is better for huge companies because of the built-in encryption.

If all of that fails, tell him to transfer it himself, line by line. ;)
 
If your sites have fast broadband connections, I would seriously consider deploying a Terminal Server, with or without Citrix. I would also suggest running VPN tunnels between sites.

The other advantage is that all the sites not only have a common database but also a common repository for any other files. Printers can be associated with each Terminal Server session.

Basically, this is how it works from a user's point of view:

Using Remote Desktop Connection, log on to the Terminal Server.
On the Terminal Server are the applications and the Front End database.
On a File Server is the data.
Map the data directory and run your Front End pointing to the specified database with the tables.

There is no duplication, it is totally real time and saves a huge amount of time.

From a security point of view, the VPN works on the basis of a specific public IP (generally the WAN address of each SonicWall), the local IP range, and a shared secret.

We have galleries in NY, Central London and the main gallery in East London, and 15,000 images. The Terminal Server solution is so much more efficient than having to distribute or synchronise databases.

Simon
 
I would go for the thumb drive idea. Thumb drives are not expensive any longer, and you can send one to each site and ask for it back after the copy is completed. They are recyclable, inexpensive, and come in various sizes.

I would push this idea if the file compression is not a go!

René
 
The USB drives are a life saver, but I tell you...

Reviewing those things can become harder than DB development. How many ways can I state transfer speeds, security, and construction? ;) The example I used with the editor was, "This is like trying to review a CD. One side is shiny, and the other you can write on!"

Not too much to do with those things, but their usefulness is like the mouse wheel. I never knew I needed one until I had one.
 
Another reason for using zip compression is that you can password-protect the file. Unless the .MDB is password protected, anyone can read your company's data. Tell that to your boss.
 
You say you have FTP - why not distribute via that? I know it isn't a CD, but when you run out of room... you run out of room.
You could set up a back-end/front-end scenario where everything except the tables of data you are sending is in the front end, and your changed data is in the back end. When the back-end DB is FTP'd, it would replace the old back end. As long as the new DB has the exact same name as the old one, the tables should link up OK with the front end.
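The "same name" trick can also be made safe against half-finished transfers: download to a temporary name first, then swap it into place in one step, so the front end never sees a partially written back end. A sketch (filenames hypothetical; this assumes Access doesn't have the back end open at swap time):

```python
import os

BACKEND = "backend.mdb"            # the name the front end's linked tables point to
INCOMING = "backend.mdb.incoming"  # temporary download target

# Simulate the existing back end and a freshly FTP'd replacement.
with open(BACKEND, "wb") as f:
    f.write(b"old data")
with open(INCOMING, "wb") as f:
    f.write(b"new data")

# os.replace is atomic on the same volume: the linked tables keep
# pointing at BACKEND, which now holds the new data.
os.replace(INCOMING, BACKEND)
```

One caveat: on Windows the replace will fail if anyone has the back end open in Access, so the installer should check for the .ldb lock file first.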
 
