remote sites, secure data

peffersp

New member
Local time
Today, 10:19
Joined
Feb 28, 2007
Messages
5
I would like some options on how to set up this db... I have a dozen or so locations that will be doing data entry (into Access). Every day, I would like to connect them (via VPN) to a server so that they can upload their data. I would like each site to be separate (they cannot access each other's data). What are my options?

Make each site a replica?

Or would it be better to write some VBA code to append the tables from the sites to the server?

The two main concerns here are security and reliability.

Thanks for any advice!

Pete
 

The_Doc_Man

Immoderate Moderator
Staff member
Local time
Today, 09:19
Joined
Feb 28, 2001
Messages
27,223
I'll muse on the subject, but this should be taken as musing and not a strict design effort.

If your issues include connecting only at certain times, then your VPN might as well be treated like an FTP-class connection. In a way, this makes it easy to choose a design of sorts. The trick will be to keep everyone using the same versions of everything.

Twelve sites, twelve databases, twelve LOCAL copies of the application. Performance will not be an issue. If they are to stay separate, then let them always and forever be separate. Once per day, start up a procedure via the Windows Task Scheduler on each machine. Read up on Access Help for Command Line Options, specifically /X {macro-name} as a way to run something macro-ish in your databases. (This presumes the machines will be running untended OR that you will have someone leave them on until a time after this process is done.)

So... build a macro to EXPORT via query the files needed for your database update process. In the batch file triggered by the Windows Task Scheduler, run Access with /x to trigger the export to a file. Then use some sort of copy command or an FTP command or SOMETHING to actually transfer the file. If it is a VPN and you have a mapped shared drive, this would be easier but not absolutely required. Just lots easier.
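As a rough sketch of that batch file (the Office path, database name, macro name ExportDaily, and drive letters are all placeholders you would substitute for your own setup):

```bat
@echo off
rem Run Access with /x to execute the export macro in the local database.
rem Paths and names here are hypothetical - adjust for your installation.
"C:\Program Files\Microsoft Office\OFFICE11\MSACCESS.EXE" "C:\Data\SiteDB.mdb" /x ExportDaily

rem Copy the exported file to the mapped VPN share, into a per-site
rem folder so the central server can tell the uploads apart.
copy /Y "C:\Data\Exports\daily.txt" "S:\Uploads\Site01\daily.txt"
```

If you don't have a mapped drive, the copy line becomes an FTP script call instead; the Access /x part stays the same either way.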

Anyway, this macro runs on each remote machine to take a daily data dump and send it to the central site as a file. Make each file name predictable either by file name, location (sub-folder name) on the central server, or both. Put some import code in your centralized MDB to pick up the files so happily sent to you from the other sites.
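The pickup code on the central side can be quite short if the file names are predictable as described. A minimal sketch, assuming hypothetical folder, table, and import-spec names (twelve sites, a saved import specification called "DailySpec", and a master table tblMaster):

```vba
' Sketch of central-site pickup code. Loops over the per-site
' subfolders and appends each exported text file into the master table.
Public Sub ImportSiteFiles()
    Dim i As Integer
    Dim strFile As String
    For i = 1 To 12
        strFile = "C:\Uploads\Site" & Format(i, "00") & "\daily.txt"
        If Len(Dir(strFile)) > 0 Then
            ' Append the delimited export into tblMaster using a saved
            ' import specification defined once in Access.
            DoCmd.TransferText acImportDelim, "DailySpec", "tblMaster", strFile
            ' Rename the processed file so it is not imported twice.
            Name strFile As strFile & ".done"
        End If
    Next i
End Sub
```

Run that from your own macro or an AutoExec on the central MDB, whichever fits your schedule.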

Just musing in overview mode, you understand. Run that around a bit and see if it makes sense in your environment. If so, explore the options. If not, it was just a thought that MIGHT have worked.
 

peffersp

New member
Local time
Today, 10:19
Joined
Feb 28, 2007
Messages
5
Thanks, Doc_Man...

The machines at the remote sites will be shut down at night, so I will have the user activate the transfer manually.

I'll have to look into how large the tables get. If I export them like you suggest, I would have to send the entire table every time a transfer is initiated - unless I only export the changes, which I suppose might be possible.

I suppose the challenge would be writing the code for the pickup and somehow having it append the changes to the server database...

I'm sure that I'll have more questions when I get into it.

Thanks again!
Pete

Any more ideas, let me know...
 

The_Doc_Man

Immoderate Moderator
Staff member
Local time
Today, 09:19
Joined
Feb 28, 2001
Messages
27,223
(Only export the changes....)

Put a flag in the remote databases - Yes/No field - that gets updated by a query under the same macro to drive the export. I.e. export the NEW rows only based on the flag being NO. Then update the Yes/No to YES. Make the default for this field NO. Also, you could make the field get RESET to NO if someone updates the local copy. (Presumes everyone works through forms.)
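A minimal sketch of that flag pattern (table, field, query, and file names are all hypothetical - a table tblOrders with a Yes/No field Exported defaulting to No, and a saved query qryNewRows selecting only the unflagged rows):

```vba
' Export only the rows still flagged Exported = False, then flag
' them True so they are not sent again tomorrow.
Public Sub ExportNewRows()
    Dim db As DAO.Database
    Set db = CurrentDb()

    ' qryNewRows is a saved query along the lines of:
    '   SELECT * FROM tblOrders WHERE Exported = False
    DoCmd.TransferText acExportDelim, "DailySpec", "qryNewRows", _
        "C:\Data\Exports\daily.txt"

    ' Mark everything just exported.
    db.Execute "UPDATE tblOrders SET Exported = True " & _
               "WHERE Exported = False", dbFailOnError
End Sub
```

Hang this off the same macro the batch file triggers with /x, and the flag reset on edits goes in the form's AfterUpdate event.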
 

dfenton

AWF VIP
Local time
Today, 10:19
Joined
May 22, 2007
Messages
469
I have to disagree with DocMan's suggestion. Programming your own synchronization routines is enormously complex, especially if there are updates, adds and deletes at all sites. Deletes are especially difficult. There was a very long discussion about how to manage this, and the various problems it entails, in the microsoft.public.access.replication newsgroup starting around Christmas. Here's the starting post of the thread:

http://groups.google.com/group/microsoft.public.access.replication/msg/50e977955daabd5a

The thread went on through all of January, with some followup in March. I've done this kind of thing in pure master/slave situations and that was hard enough. Merging updates/adds/deletes from even two databases would be *very* difficult.

But indirect replication works perfectly well for this exact situation, where you have a VPN over the Internet as your only connection. I have explained how to accomplish this without having Replication Manager:

http://groups.google.com/group/microsoft.public.access.replication/msg/3b7167c1462b2d2f

and someone else goes through the same information in a different fashion here:

http://www.vsj.co.uk/articles/display.asp?id=560

Replication would be far, far easier than programming your own synch code, even with all the gotchas and special cases that are involved with replication.

Of course, I'm making one big assumption:

It's not possible to host the application in one location with Windows Terminal Server and have all the remote sites connect across the Internet to the Terminal Server and run the app there. If that's a possibility, it's far easier than any synchronization scheme, including Jet replication.

--
David W. Fenton
David Fenton Associates
http://dfenton.com/DFA/
 
