Access offline

Gismo

Registered User.
Local time
Today, 14:48
Joined
Jun 12, 2017
Messages
1,298
Hi all,

Just a general question.
I have a back end saved on the server. When the engineers need to use a front end, a batch file makes a copy onto the user's remote desktop.

In the case of internet issues or network problems, how would you implement a backup system where the user can still enter data and, when the system comes back online, have it populated back to the server?

The form the engineer would use collects quite a bit of data from the back end and uses individual sequence numbers per product registration.

Any ideas would be helpful to get me thinking in a specific direction for accomplishing this.
 

Ranman256

Well-known member
Local time
Today, 07:48
Joined
Apr 9, 2015
Messages
4,337
I have a switch the users can use that relinks the tables from SQL to local.
Then, when they return to the server, they can relink back.

Though if you use AutoNumber ID fields, you need to use queries to transfer the data correctly.
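A minimal DAO sketch of that kind of switch (the connect string, server name, and local path below are placeholders, and it assumes the table names match on both backends):

Code:
Public Sub RelinkTable(ByVal tblName As String, ByVal useLocal As Boolean)
    ' DAO will not change an ODBC link into a Jet/ACE link in place,
    ' so drop the existing link and recreate it against the chosen backend.
    Dim db As DAO.Database
    Dim td As DAO.TableDef

    Set db = CurrentDb
    db.TableDefs.Delete tblName

    Set td = db.CreateTableDef(tblName)
    If useLocal Then
        td.Connect = ";DATABASE=C:\Offline\Backend_Local.accdb"
    Else
        td.Connect = "ODBC;DRIVER={ODBC Driver 17 for SQL Server};" & _
                     "SERVER=MyServer;DATABASE=MyDb;Trusted_Connection=Yes;"
    End If
    td.SourceTableName = tblName    ' same table name on both backends
    db.TableDefs.Append td
End Sub

Loop that over every linked TableDef to flip the whole front end at once.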
 

theDBguy

I’m here to help
Staff member
Local time
Today, 04:48
Joined
Oct 29, 2018
Messages
21,467
It may depend on what sort of actions the engineers will perform on the data. Will they be adding new data or simply updating old data? Will each engineer only update specific data associated with them, or could anyone update any data?
 

Pat Hartman

Super Moderator
Staff member
Local time
Today, 07:48
Joined
Feb 19, 2002
Messages
43,257
In the case of internet issues or network problems, how would you implement a backup system where the user can still enter data and, when the system comes back online, have it populated back to the server?
Does this happen often enough to justify building a solution? Synchronizing disconnected databases is not trivial. If this is an unplanned outage, how would you have current data in the local database? Everything would be out of context.

There are valid reasons for disconnected operation. For example, our visiting nurses might not have an internet connection when they are visiting a client, so data is ALWAYS collected off-line. At the beginning of the week, they download current data for each client they will visit. Then, when they can connect at the end of the day, either from home or the office, the new data is transferred. And if any updates were made to the demographic data, the app applies them as long as the office version has not been updated since the data was downloaded. If it has been, then the two versions are printed out and a human has to make the updates.
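That "apply only if the office copy hasn't changed" rule is classic optimistic concurrency. A sketch of how the guarded update might look, assuming a LastModified column and hypothetical table and field names:

Code:
Public Function ApplyDemographicUpdate(ByVal clientID As Long, _
        ByVal newPhone As String, ByVal downloadedAt As Date) As Boolean
    ' Apply the off-line edit only if the office record has not been
    ' touched since the data was downloaded.
    Dim db As DAO.Database
    Set db = CurrentDb

    db.Execute _
        "UPDATE tblClients " & _
        "SET Phone = '" & Replace(newPhone, "'", "''") & "', " & _
        "LastModified = Now() " & _
        "WHERE ClientID = " & clientID & " " & _
        "AND LastModified <= #" & _
        Format(downloadedAt, "yyyy-mm-dd hh:nn:ss") & "#;", dbFailOnError

    ' Zero rows affected means the office copy changed first:
    ' route the record to manual reconciliation instead.
    ApplyDemographicUpdate = (db.RecordsAffected > 0)
End Function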
 

The_Doc_Man

Immoderate Moderator
Staff member
Local time
Today, 06:48
Joined
Feb 28, 2001
Messages
27,156
I'm with Pat on this one. The U.S. Navy had a related problem with network connections. We had something like 18 different agencies on at least a dozen different network segments in eight cities, such as Washington DC, Baltimore, Alexandria, Norfolk, and Philadelphia, and we never could keep them all online at once. In essence, you really CANNOT do this easily if you want the combination of connected data and disconnected abilities. Access isn't that forgiving. Neither ORACLE nor SQL Server was any better, and ShareBase (which, through typical corporate evolution, eventually became SYBASE) couldn't do better either. (And that's just the platforms I knew about.)

One solution, though totally non-trivial, is to ALWAYS gather data locally and then, as a separate transaction, upload it for assimilation. We used to call it reconciliation for some of the linkages, but it was always the same idea: we were essentially turning bulk updates into batch operations.
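In Access/VBA terms, that gather-locally, upload-later pattern might look like the sketch below. The table names, column list, and Uploaded flag are assumptions, and the local AutoNumber ID is deliberately excluded, per Ranman256's point:

Code:
Public Sub UploadLocalData()
    ' Push locally collected rows to the linked server tables in one batch,
    ' wrapped in a Workspace transaction so a mid-upload failure leaves
    ' nothing half-done. (Transaction support on ODBC links depends on
    ' the driver; test with your backend.)
    Dim ws As DAO.Workspace
    Dim db As DAO.Database

    Set ws = DBEngine.Workspaces(0)
    Set db = CurrentDb

    ws.BeginTrans
    On Error GoTo RollbackAll

    ' Exclude the local AutoNumber ID so the server assigns its own keys.
    db.Execute _
        "INSERT INTO tblRegistrations (EngineerID, ProductID, RegDate, Notes) " & _
        "SELECT EngineerID, ProductID, RegDate, Notes " & _
        "FROM tblRegistrations_Local WHERE Uploaded = False;", dbFailOnError

    db.Execute _
        "UPDATE tblRegistrations_Local SET Uploaded = True " & _
        "WHERE Uploaded = False;", dbFailOnError

    ws.CommitTrans
    Exit Sub

RollbackAll:
    ws.Rollback
    MsgBox "Upload failed and was rolled back: " & Err.Description
End Sub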

Going the other way also presented problems. We had to download "unofficial" copies of the significant historical data we needed, but understood that the historical data could not be changed. This led to the concept of "definitive source." If you were not the definitive source, you had to send transactions back to that source if you thought something had to be changed, and there were complex transaction sequences to accomplish that. But they were file-based transactions that would get loaded to temp tables for final inclusion. And you had to design it to work that way because "multiple direct transactions over long-haul networks" is kind of an oxymoron.
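The temp-table step translates naturally to Access as well. A sketch, assuming a delimited transaction file and hypothetical staging and history tables; only the definitive source would run the apply step:

Code:
Public Sub LoadTransactionFile(ByVal filePath As String)
    ' Stage the incoming file-based transactions, then apply them in one pass.
    DoCmd.TransferText acImportDelim, , "tmpTransactions", filePath, True

    ' Apply the staged changes to the definitive table.
    CurrentDb.Execute _
        "UPDATE tblHistory INNER JOIN tmpTransactions AS t " & _
        "ON tblHistory.RecordID = t.RecordID " & _
        "SET tblHistory.FieldValue = t.NewValue;", dbFailOnError

    ' Clear the staging table for the next file.
    CurrentDb.Execute "DELETE FROM tmpTransactions;", dbFailOnError
End Sub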

Anyway, if you have enough disconnections that you need to consider an alternative method of operation, you also need to watch out for database corruption after each network loss.
 
