Question: Global database access times slow; is there a way to separate and run batched merges?

mrb783

Registered User · Joined Oct 28, 2008 · Messages: 40
I have a database split into a backend holding all the data and a frontend with all of the forms/reports/macros/etc., and it is used globally by my team. All of the US and EU offices can access it fine with no latency issues, but our Chennai office is experiencing slow access times and it takes them a significant amount of time to make even minor updates. The backend is stored on a high-availability network share with adequate bandwidth, so I can't imagine that is the issue, but of course anything is possible. The hang-up occurs when users attempt to write to the backend; apparently they can access the frontend fine. Regardless, network-related issues are not the point of my question.

Anyway, what I am curious about is whether I can create a duplicate backend database stored on site at each of my global locations and then set up a batched job to merge any changes on a weekly basis. I'm not sure this is possible given my database design (specifically the Autonumber fields tying all of the tables together), but it would be nice.
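One way around the Autonumber problem with per-site copies, sketched very roughly below, is to key the consolidated rows on a site code plus the original local ID rather than on the Autonumber alone. The table name tblIssues, the SiteCode/LocalID columns and the paths are assumptions for illustration only, not part of the actual design.

Code:
' Illustrative merge only, not a drop-in: assumes the master backend has a
' consolidated tblIssues with a SiteCode text column and a LocalID long
' column, where (SiteCode, LocalID) is unique and LocalID stores the
' Autonumber value from the site copy.
Public Sub MergeSiteBackend(ByVal sitePath As String, ByVal siteCode As String)
    Dim db As DAO.Database
    Dim sql As String

    Set db = CurrentDb   ' run from the master backend, or a frontend linked to it

    ' Append only rows not already imported from this site.
    sql = "INSERT INTO tblIssues (SiteCode, LocalID, Title, Status) " & _
          "SELECT '" & siteCode & "', s.ID, s.Title, s.Status " & _
          "FROM [;DATABASE=" & sitePath & "].tblIssues AS s " & _
          "WHERE NOT EXISTS (SELECT * FROM tblIssues AS m " & _
          "WHERE m.SiteCode = '" & siteCode & "' AND m.LocalID = s.ID)"

    db.Execute sql, dbFailOnError
End Sub

This would run once per site backend on whatever schedule the batch job uses; updates and deletes would need similar statements, which is where this approach starts to get painful.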
 
Each individual user should have their own copy of the front-end database; this will reduce the chances of database corruption. If you still have issues after that, then I would consider replicating the back-end DB and using the Synchronise facility to keep the copies in step. I would, however, synchronise once a day rather than once a week, as that will make it easier to resolve any conflicts.
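If you do go the replication route, the synchronisation itself can be scripted with DAO and run as a scheduled daily job. A minimal sketch, assuming the design master sits on the network share; both paths below are placeholders:

Code:
' Daily sync sketch using DAO's built-in replication support.
' The backend must have been made replicable first; paths are placeholders.
Public Sub SyncChennaiReplica()
    Dim db As DAO.Database

    ' Open the design master (the "hub" copy on the network share).
    Set db = DBEngine.OpenDatabase("\\server\share\Backend_DM.mdb")

    ' Exchange changes in both directions with the Chennai replica.
    db.Synchronize "\\chennai-srv\data\Backend_Replica.mdb", dbRepImpExpChanges

    db.Close
    Set db = Nothing
End Sub

Note that when a database is made replicable, Jet changes incrementing Autonumber fields to random values, which is also how it avoids the key-collision problem mentioned above.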
 
Currently all users do have their own front-end copy that they work on for just that purpose.

As for the Synchronization feature you mention, is that in Tools -> Replication? Sadly, I'm grossly unfamiliar with this.
 
If, as you say, many international sites are doing OK and only one site isn't, this is not a database issue; it is a network issue. The modern-day adage is that a network is no faster than its slowest link. If there is a way for you to run a ping test, check the path from your problem site to your backend server and compare that to a similar test from some sites that work OK.

I'm only guessing here, but when I run into this kind of issue with my field users, the problem is usually a specific ISP or gateway device that is far slower than any other. You want to see the hops and the timing for each hop; one of them will be radically different from all the others. The problem would then be to find a route that avoids the slow device.
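If it is easier to capture those numbers from inside Access than at a command prompt, something like the rough sketch below works; it just shells out to the standard Windows ping and tracert commands and dumps their output to the Immediate window. The host name backend-server is a placeholder for whatever machine holds the backend share.

Code:
' Rough ping/traceroute capture; "backend-server" is a placeholder.
Public Sub TestBackendRoute()
    Dim sh As Object
    Dim output As String

    Set sh = CreateObject("WScript.Shell")

    ' Ten echo requests: compare the average round-trip time between sites.
    output = sh.Exec("ping -n 10 backend-server").StdOut.ReadAll
    Debug.Print output

    ' Hop-by-hop timings: the slow hop should stand out here.
    output = sh.Exec("tracert -d backend-server").StdOut.ReadAll
    Debug.Print output
End Sub

Run the same thing from a US or EU office and from Chennai and compare the two outputs.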

This always leads to a problem that will be forwarded to your networks team and their managers. Usually it devolves into a P|$$ing match between your company and the service provider for that site. If you have an agreement that specifies a certain level of service for that site, your ping test will be the evidence that starts you on a trouble call to their service department for failure to provide promised (contracted) service. This might turn into a "hold their feet to the fire" situation and has far-reaching ramifications if you had an "umbrella" contract for all sites from a single provider. Beware that you are approaching an industrial-sized can of worms with your can opener.
 
Yes, that's exactly what I meant; Tools -> Replication is where the Synchronise facility lives. It's worth searching this forum for advice. David Fenton has some good info on his website; the link is in his signature.
 
Your problem office could also be on the end of a long phone line run to the exchange.

Can you copy/drag files from the server to a local PC OK, and quickly? (A rough way to time this is sketched below.)
Can office A print to the problem office OK, and vice versa?

Is the office with the problem in a city or out in the sticks (an outlying or rural set-up)? Phone lines need a booster every x miles to make up for signal loss; if you are at the end of the run, the phone company might not have put the booster in...

I am at an office that is at the end of a run, and remote servers are incredibly slow, to the point where erosion is faster than the server...
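To put a number on the file-copy test above, you could time a copy of a known-size file from the backend share to the local machine. This is only a rough sketch; the UNC path and file name are placeholders, and any single run is only an indication.

Code:
' Crude throughput check: copy a test file from the server share to a
' local folder and report how long it took. Paths are placeholders.
Public Sub TimeServerCopy()
    Const SRC As String = "\\server\share\testfile.dat"   ' e.g. a ~10 MB file
    Const DST As String = "C:\Temp\testfile.dat"
    Dim t As Single

    t = Timer
    FileCopy SRC, DST
    Debug.Print "Copy took " & Format(Timer - t, "0.0") & " seconds"

    Kill DST   ' clean up the local copy
End Sub

Run it from a US/EU office and from Chennai; a big difference points at the link rather than the database.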
 
I would not advise using direct replication over a WAN of any type. Fixed offices should be using Windows Terminal Server, which saves a whole lot in administration. Indirect replication works fine over a WAN, but it is a helluva lot of work to set up and keep running smoothly. WTS hosting is much simpler.
 
The Terminal Server is the best bet. There are products around to test broadband speeds, and in many cases the contention ratio, i.e. how much competition there is for the broadband, can be the issue. Domestic broadband can be at 50:1 contention and business lines anywhere from 20:1 down to 1:1; on a highly contended line, up to 50 users are sharing the same bandwidth, so a nominal 8 Mbit/s line can drop to roughly 160 kbit/s per user at busy times.

The Terminal Server is essentially a server-based solution that sends only screen updates to the remote user. All the data connectivity is local to that server and therefore fast.

Simon
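For what it is worth, the client side of a Terminal Server setup is just an RDP connection: the Chennai users would open a session like the one sketched below and run the Access frontend on the server, right next to the backend, instead of opening files across the WAN. The server name and user name are placeholders.

Code:
screen mode id:i:2
desktopwidth:i:1280
desktopheight:i:1024
full address:s:terminal-server.example.local
username:s:DOMAIN\someuser

Saved with a .rdp extension, double-clicking the file opens the remote desktop session.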
 
