Question: Remote Database

PNGBill

Hi Forum. Access 2010. We have moved our "head office" to another country but still want to maintain a reasonably up-to-date database in the "branch office".
VPN is costly due to the high rate per MB for internet usage.

Any suggestions on how to handle this issue for a small business - 3 to 4 PCs?

The database is 300 MB, which again would be costly to transfer on a daily basis.

If we could get it to transfer just the changes made, on a daily basis - is this practical?

I could email reports on a daily basis but this wouldn't be the same as being able to menu through the database.

Appreciate any advice. Regards, Bill
 
Are changes being made at Head Office only or in both offices? How many tables are involved?
 
Changes are made at Head Office only.
5 tables for normal day to day business.
 
Just distribute a database of the new and updated records and run update queries at the branches.
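
Roughly like this at the branch end - a minimal sketch, assuming the emailed file is linked in as a table of changed rows (tblOrders and tblOrders_Changes are just example names, not your actual tables):

Code:
' Branch side: apply the changed/new rows from the emailed update file.
' Table and field names are examples only.
Public Sub ApplyUpdates()
    Dim db As DAO.Database
    Set db = CurrentDb

    ' Overwrite existing rows that were changed at Head Office
    db.Execute _
        "UPDATE tblOrders INNER JOIN tblOrders_Changes AS c " & _
        "ON tblOrders.OrderID = c.OrderID " & _
        "SET tblOrders.CustomerID = c.CustomerID, " & _
        "tblOrders.OrderDate = c.OrderDate, " & _
        "tblOrders.Amount = c.Amount", dbFailOnError

    ' Append rows the branch doesn't have yet
    db.Execute _
        "INSERT INTO tblOrders (OrderID, CustomerID, OrderDate, Amount) " & _
        "SELECT c.OrderID, c.CustomerID, c.OrderDate, c.Amount " & _
        "FROM tblOrders_Changes AS c LEFT JOIN tblOrders AS t " & _
        "ON c.OrderID = t.OrderID " & _
        "WHERE t.OrderID IS NULL", dbFailOnError
End Sub

One update query and one append query per table, run after linking the emailed file.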
 
A small database will just have records, say, for the last 24 hrs.

Email this to the branch and they update.

Sounds practical and economical as far as internet MBs go.

Sorry for being a bit dumb, but what are some key steps in implementing this?

Can Access tell when a record was created, apart from where a date is a field?

Regards,
Bill
 
You might need to add a date stamp so you can determine when records were added/changed.
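
Something along these lines in each Head Office data-entry form would do it - DateCreated and DateModified are just example field names you would add to each table:

Code:
' Stamp the record every time it is saved at Head Office.
Private Sub Form_BeforeUpdate(Cancel As Integer)
    If Me.NewRecord Then
        Me!DateCreated = Now()   ' first save only
    End If
    Me!DateModified = Now()      ' every save
End Sub

A default value of Now() on the field only catches new records; edits need the form event (or an update query) to touch DateModified, and anything changed outside the forms won't be stamped.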
 
I guess I need to do some tinkering with the database for this to work.
Should keep me off the streets for a while.

Thanks for the advice
 
A similar idea to Galaxiom's, and it depends how big the database is - but just download the whole backend every night, then reconnect your frontend to that.

Then you don't have to try and merge data at all, or amend it to include date stamps. Far simpler this way.
 
A similar idea to Galaxiom's, and it depends how big the database is - but just download the whole backend every night, then reconnect your frontend to that.

Unfortunately PNGBill is in Papua New Guinea where they are charged jaw-dropping data rates.
 
Then remember to zip/rar the thing with max compression, not just the default one.
 
300 MB each day will cost USD 30, so over 20 days that is USD 600 per month at least.

Two years ago it would have been 4 times this cost. :mad:

Also, the link isn't 100% and the speed isn't the greatest, so you could be looking at a 2 to 3 hour download each day, and if there are problems you could easily double your MBs. :(

If we could just transfer key data changes/additions to allow the branch office to have a more up-to-date database, we would be OK until local resources allow better results.
 
Then remember to zip/rar the thing with max compression, not just the default one.

I wondered about zipping. If the updates can be zipped then we have a very economical transfer system.

Another issue may be the technical skills at the branch office.

If they somehow miss an update, the database could get a little :eek:

I could number the updates and have a table in the database that will alert the user to an out-of-sequence update?
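
Something like this is what I have in mind - tblSyncStatus and LastUpdateNo are just names I'm making up:

Code:
' Branch side: refuse to apply an update file that is out of sequence.
Public Function UpdateIsInSequence(ByVal UpdateNo As Long) As Boolean
    Dim lastApplied As Long
    lastApplied = Nz(DLookup("LastUpdateNo", "tblSyncStatus"), 0)

    If UpdateNo = lastApplied + 1 Then
        UpdateIsInSequence = True
    Else
        MsgBox "Expected update #" & (lastApplied + 1) & _
               " but this file is #" & UpdateNo & ". Contact Head Office.", vbExclamation
        UpdateIsInSequence = False
    End If
End Function

After a successful import the branch would bump LastUpdateNo by one.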
 
How about using SQL Server (or another back end) so that you query only the records that have changed since the last visit?
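
Even in plain Access, once a date stamp is in place you can pull just the changes - e.g. at the Head Office end, something like this (the paths, table and field names are only examples):

Code:
' Head Office side: copy only the last day's changes into a small transfer file.
Public Sub ExportChanges()
    Dim target As String
    target = "C:\Transfer\Updates_" & Format(Date, "yyyymmdd") & ".accdb"

    ' Create an empty transfer database in 2007+ format, then SELECT ... INTO it
    DBEngine.CreateDatabase target, dbLangGeneral, dbVersion120
    CurrentDb.Execute _
        "SELECT * INTO tblOrders_Changes IN '" & target & "' " & _
        "FROM tblOrders WHERE DateModified >= Date() - 1", dbFailOnError
End Sub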

As Spikepl said, zipping will give you a massive size reduction. You can easily code the zipping/unzipping so that the user doesn't have to get their hands dirty.
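
For example, using Windows' built-in zip folders, no extra software needed (the paths here are just illustrations):

Code:
' Zip a file using Windows' built-in zip folder support (late binding, no references).
Public Sub ZipFile(ByVal srcFile As String, ByVal zipPath As String)
    Dim sh As Object, vZip As Variant, f As Integer

    ' Write the header of an empty zip file
    f = FreeFile
    Open zipPath For Output As #f
    Print #f, "PK" & Chr$(5) & Chr$(6) & String(18, 0)
    Close #f

    vZip = zipPath
    Set sh = CreateObject("Shell.Application")
    sh.Namespace(vZip).CopyHere srcFile

    ' CopyHere runs asynchronously - crude wait until the file lands in the zip
    Do While sh.Namespace(vZip).Items.Count < 1
        DoEvents
    Loop
End Sub

The back end is mostly text, so the daily transfer file should shrink to a small fraction of its size.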

Is Sharepoint an option?

Chris
 
I would be on another learning curve with SQL Server.

Not sure how Sharepoint works. Will only new or amended data be transferred?

I looked at a thread where Bob Larson was assisting in May with Sharepoint but would like to hear about an example of how much data is transferred.

Would it be like you are on Facebook all the time? We have had monthly accounts the size of telephone numbers where staff had been using the www.
 
Sharepoint? I.e. accessing your db via the web? You mentioned that you have a flaky connection, and this would add to the learning curve, and Sharepoint is not free, and somebody has to set it up, ... and and and ...

If you need a here-and-now solution, then I'd go with the lot on this: make a DB that only contains updates, zip/mail it, and update the local DB from that. One way to keep staff out of it would be to automate this and bypass emails: pull the data from the head office using a Scheduled Task running FTP.
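
A rough sketch of the pull, using the ftp.exe client that ships with Windows (the server name, login and paths are placeholders only):

Code:
' Branch side: pull the latest zipped update from Head Office over FTP.
Public Sub PullLatestUpdate()
    Dim scriptPath As String, f As Integer
    scriptPath = Environ$("TEMP") & "\pull_update.ftp"

    ' Write a command script for the built-in Windows ftp.exe client
    f = FreeFile
    Open scriptPath For Output As #f
    Print #f, "open ftp.example.com"
    Print #f, "branchuser"
    Print #f, "secret"
    Print #f, "binary"
    Print #f, "get /updates/latest.zip C:\Transfer\latest.zip"
    Print #f, "bye"
    Close #f

    ' -i suppresses prompts, -s runs the script
    Shell "ftp -i -s:" & scriptPath, vbHide
End Sub

A Scheduled Task could run this nightly - e.g. open Access with a macro that calls it - or run an ftp script directly without Access involved at all.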
 
Sharepoint? I.e. accessing your db via the web? You mentioned that you have a flaky connection, and this would add to the learning curve, and Sharepoint is not free, and somebody has to set it up, ... and and and ...
I was really thinking of Sharepoint on an in-house server. I agree, it's only an option if the company already has Sharepoint installed on an in-house server. Not really an option otherwise.
 
Sharepoint? I.e. accessing your db via the web? You mentioned that you have a flaky connection, and this would add to the learning curve, and Sharepoint is not free, and somebody has to set it up, ... and and and ...

If you need a here-and-now solution, then I'd go with the lot on this: make a DB that only contains updates, zip/mail it, and update the local DB from that. One way to keep staff out of it would be to automate this and bypass emails: pull the data from the head office using a Scheduled Task running FTP.

This sounds like a good explanation.

I will work on the database to get it fit for extracting recent activity and take it from there.
No urgency for a quick fix, as we could do weekly or bi-weekly updates of a zipped database in the meantime and refer to Head Office for the latest data if required.
 
I was really thinking of Sharepoint on an in-house server. I agree, it's only an option if the company already has Sharepoint installed on an in-house server. Not really an option otherwise.

Sharepoint is not installed, and we really want the business to run as simply as possible.
Tech support here is expensive and not really up to date. A few times I have found our system to be more advanced than what our Tech support has knowledge of.

If it isn't off the shelf at Microsoft, you may not get support here.

It once cost USD 1,000 to resolve an issue on our Linux server which I could have fixed in 30 mins (run a check disk), but I was overseas at the time. :(
 
