Remote Front Ends

NauticalGent

Good morning Access Forums,

I would like some opinions on a matter, if you please. I have inherited an Access application that is VERY VBA-intensive. At least two different people have worked on it at different times, and it is easy to tell where one left off and the other began. The second individual was more Access/VBA savvy (object naming conventions), and the majority of his code can be found in these forums, especially the FE updater. Pretty slick the way he set it up. It is obvious that he, like me, has used this forum extensively!

The BE of this application resides on a server that is centrally located, with two remote sites that are in different countries. The server and all infrastructure are US Government, with a T1 connection at all three locations. The Office suite is 2010, with the possible exception of SharePoint; I have been told it is a version or two behind, but I am not sure.

The previous guy tried to have the remote FEs use the central BE, but performance was so degraded that the remote sites no longer use the application; updates are sent via e-mail or other means and then hand-jammed into the DB.

My task is to fix this. I have pondered the issue, and here is where I am at. Please keep in mind that the info from the other sites needs to be transmitted only once a day (in the morning) for briefing and reporting purposes. For that reason I am leaning towards deploying a site-specific BE at each site and building in a function that will transfer the data either via e-mail, directly to a directory on the central server, or to a SharePoint list that could be linked to the central BE.
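To make that concrete, here is a rough sketch of the kind of transfer function I have in mind. The table name and server path are made up for illustration; the real app would use its own staging table and drop directory:

' Rough sketch of a once-a-day push from a site BE to the central server.
' tblDailyUpdates and the UNC path are placeholders, not the real app's names.
Public Sub PushDailyUpdates()
    Dim strTarget As String
    strTarget = "\\CentralServer\AppDrop\Site1_" & Format(Date, "yyyymmdd") & ".xlsx"

    ' Export the day's staging records as a spreadsheet the central site can import
    DoCmd.TransferSpreadsheet acExport, acSpreadsheetTypeExcel12Xml, _
        "tblDailyUpdates", strTarget, True

    ' Clear the staging table so tomorrow's run starts clean
    CurrentDb.Execute "DELETE FROM tblDailyUpdates", dbFailOnError
End Sub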

This is less than ideal because the big-wigs were looking for something more dynamic and real time.

Any suggestions would be enthusiastically received and maybe rewarded with Challenge Coin!
 
Have you considered using a product like Citrix?
 
No, I haven't. I am not sure what that is, and I doubt it is available on the server. I did ask the IT dept about SQL and they looked at me with a blank stare.

It's all they can do to support SharePoint, and the whole command has a handful of Access applications that "some guy" from the past developed and is no longer around to support. It didn't take long for word to spread that there was a new guy (me) who knows "a lot about Access". I have been so busy fixing other people's stuff that I have barely had time to do what I was hired for.

However, it HAS landed me a new position with work I enjoy. I HAVE broached the subject of bringing in a contractor to get a handle on things but that went over like a lead balloon. Anytime you talk about new software on Gov machines the powers that be get all loopy. :banghead:
 
Citrix is a remote desktop style app.
If you have Windows Server 2008 or later, you can use Terminal Services to act as a remote desktop server. That way you are only moving the desktop image over the network - not any data - and the program is running on the server. We use it for remote workers with an Access DB front end and it is really straightforward.

You can also use RemoteApp, which is an even more stripped-down version of the same thing.
 
We have used Citrix with Marian software; it was very slow.
 
The trouble is that using Access over a WAN is inherently slow.

Instead, using a Terminal Server (Citrix) allows a remote user to connect to a session at the same location as the datastore, so the database works much more quickly. All the connection needs to do is screen updating, rather than moving a lot of data (as Minty said). There may be issues with moving files and printing at the remote end, as it depends on how the database/server end "sees" the remote end.

There are licensing and other support issues also.

The other alternative is moving to a web-based solution, but that is a completely different technology.

You might be able to use a database over a WAN by writing it very carefully, in order to move the minimum amount of data over the network. It will be slower than a LAN, but may be usable.

Integrating several databases into a single database is also not without difficulty. The main issue concerns the integrity of new data: how do you resolve and integrate new data added at the remote ends?
 
The last time I was at a place that attempted to use an Access database over a WAN, the System Analyst running the show wound up converting the whole thing to a SQL Server back end in order to fix the incredible lag problem it caused in Access.
 
Having suffered through a problem with Access over a really slow network before, I can tell you that there is no perfect solution. A bum physical-link-layer connection is a bum connection. You'll be subject to all sorts of poor transmission problems including corrupted records, broken (and therefore hanging) sessions, and lots of user collisions.

The solution will depend heavily on analyzing ways to minimize the amount of data to be exchanged between FE and BE, because remember that you have two nasty facts with which you must contend.

First, Access uses SMB protocols to bring the database to YOU on your remote machine. (Yes, I said that correctly: YOU are the remote system; your BE is the central system.) You have to transmit pretty much the entire table in a block-wise protocol in order to have the data that will support the action you must eventually take. That is because YOU are executing the query on the FE machine.

Second, unless I miss my guess, there is no version of Access on which you can rely on the BE machine, so you can't easily do a "batch-mode" operation for your import/export process on the central system. In order to do anything at all, you MUST have a remote system do the work for you.

When I was faced with this problem on a U.S. D.o.D. system, I ended up using a lot of local tables in the FE file that were caches of the translation tables in the BE (because the translation tables didn't change often.) I also had to split tables in such a way as to minimize the data actually queried.
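If it helps, the cache refresh can be as simple as something like this. Table names are placeholders for illustration, not what I actually used; tblRanks_Local is a real local table and tblRanks is the linked BE lookup:

' Sketch of refreshing a local cache of a slowly-changing BE lookup table
Public Sub RefreshLookupCache()
    With CurrentDb
        .Execute "DELETE FROM tblRanks_Local", dbFailOnError
        .Execute "INSERT INTO tblRanks_Local SELECT * FROM tblRanks", dbFailOnError
    End With
End Sub

Run it once at startup (or once a day) and your combo boxes never touch the wire.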

As to the updates, you might need to consider that someone local to the BE site might have to be your "operator" now and then for daily maintenance functions. If you can get files sent via a TCP method (FTP or SFTP come to mind), remember that TCP includes message integrity checking whereas general IP transfers do not. If you have a secured method of transmission (SFTP comes to mind), you can send files and know they made it because of the protocols used.

All I can say is, I've been on a SHARED T1 line before and it was no fun in paradise, I can assure you. A dedicated T1 line would not be much better. This is where you also need to put a bee in someone's bonnet to make a priority decision over how important this application is to your mission... and you have to be prepared to live with the answer.

Personally, I'm surprised that you are using multi-national BE locations because of the security implications thereof. SMB traffic isn't even allowed past our mid-level firewalls. We had to move our stuff to a different network and make some application mods in order to be able to run on something a bit hotter than a single T1.
 
I have Access front ends that work pretty well over a WAN, but I use SQL Server as the back end, which gives me more control over what goes back and forth over the wire. As others have mentioned, with an Access back end it is typically too slow. The emailing scenario you describe is certainly possible, and could probably be automated at both ends.
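For what it's worth, the relinking side of a SQL Server BE is only a few lines of DAO. A rough sketch, with made-up server and database names:

' DSN-less relink of all ODBC-linked tables; server/database names are hypothetical
Public Sub RelinkToSqlServer()
    Dim tdf As DAO.TableDef
    Const CNN As String = "ODBC;DRIVER={SQL Server};" & _
        "SERVER=MyCentralServer;DATABASE=MyAppDB;Trusted_Connection=Yes;"

    For Each tdf In CurrentDb.TableDefs
        If Left$(tdf.Connect, 5) = "ODBC;" Then
            tdf.Connect = CNN
            tdf.RefreshLink   ' re-point the link without recreating the table
        End If
    Next tdf
End Sub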
 
Great info, all of you, and although I do not have my answer yet, you guys have given me a good starting point. While I was poking around I found this website:

http://www.kallal.ca/wan/wans.html

Right now I am leaning towards a SharePoint solution; it seems to be the only application my IT shop is knowledgeable about. They do have XenDesktop (Citrix), but no one I have talked to today seems to know anything about it except for imaging machines.

SQL Server was a non-starter from the word go, unless I am able to convince a few Admirals/GS-15s that it is a necessity.

At any rate, I do appreciate all the feedback and I will re-post my progress and the results!
 
It just occurred to me that this thread should be moved to the Access Web Forum. Apologies for placing it here, but until you all chimed in I didn't know any better!
 
I can move it if you like. It seemed like you were exploring alternatives, of which a web app is one.
 
Admirals? Are you riding the NMCI network or is this a dedicated research network? If you are using SMB, I know you can't be considered secure because "plain" SMB is inherently not securable unless you can develop a VPN over a faster network backbone.

FYI, I'm with SPAWAR LANT at NEDC NO. If you can decipher that, you know enough to tell that I have some experience in a Navy environment.
 
pbaldy, no need to move it unless you think it needs to be. If the regulars here think it's fine then so do I!

Doc_Man: Close, this is on One-Net SIPR. The remote sites are Rota and Bahrain, I am in Naples. I asked them if they considered SPAWAR but at this point they don't want to shell out the bucks...
 
Oh, yeah, I can understand shelling out the bucks. SPAWAR hosting services ain't cheap. Since you are riding SIPR (Secure Internet for our UK friends) vs. NIPR (non-secure...), you can probably evade having DCAO (Datacenter Consolidation folks) breathing down your neck. It also doesn't hurt that you are probably considered as a relatively small, site-oriented application. DCAO won't move stuff to a consolidation center when the added network traffic delays of remote service would make operation of the application's mission (more) untenable.

That SIPR environment is also why you can use Access at all. On NIPR, the firewalls would stop you dead at the 10/11 zone layers because of using SMB protocols - but on a secured network, there is already data-in-motion security so you are relatively safe.

Back to the more general problem at hand...

I don't know of a way to improve speed in your solution unless you can:

1. Boost to a higher-capacity line like T3 (or <gasp> OC-3).
2. Work really hard to minimize what you are sending "behind the scenes".
3. Localize everything that CAN be localized (i.e. local copies of translations, updated once per day at most) to further minimize the SMB traffic.
4. Localize any heavy-duty maintenance operations (local in the sense of no external network hops, staying on your local network segment if possible).

For #2, that MIGHT mean denormalizing the database slightly to split a long table into two tables having a 1-to-1 relationship where you only grab stuff from ONE of the split tables strictly for reducing the size of transmissions. You would have to process the rejoined tables for local maintenance, but if you can designate a subset of your table fields for your operational queries, it might provide a good return on effort.
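To illustrate with made-up names: if the remote operational queries only ever need a handful of fields from a wide personnel table, the FE touches only the narrow half of the split.

' Hypothetical illustration of point #2: query only the narrow split table,
' so far fewer data pages cross the T1. All names are invented for the example.
Public Sub ExampleNarrowQuery()
    Dim rs As DAO.Recordset
    Set rs = CurrentDb.OpenRecordset( _
        "SELECT PersonID, LastName, RateRank, UnitCode, StatusDate " & _
        "FROM tblPersonnelCore WHERE UnitCode = 'NAPLES'", dbOpenSnapshot)
    Debug.Print rs.RecordCount
    rs.Close
End Sub

Local maintenance screens would rejoin tblPersonnelCore to its 1-to-1 partner (say, tblPersonnelDetail) when they need the rest of the fields.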

For #4, you would need one trusted user with Access at the site of the BE files - and this user would do complex things (like your imports) that are NOT on the external backbone. You want to avoid jumping onto your remote network, because each hop is tremendously expensive when T1s are involved. Or did you mean that even your in-house network is on a T1? (Oh, I hope not, for your sake.)

Switching to CITRIX might be a "big bucks" issue if your site can't even jump to a higher T-level because you would need a multi-user license. On the other hand... if you are on an NMCI SIPR connection, you might want your site security team or your network administrator to ask if you already have a network-based CITRIX license AND a network-wide multi-user Access license. You might not have known this, but NMCI (Navy/Marine Corps Intranet, for our UK friends) has several products licensed Navy-wide. If you could use CITRIX, you would need Access on the central server but would only need Windows Terminal Services for your remote sites.

As to data imports, it would depend on the formats. But if you can send, say, a spreadsheet with your updates, perhaps as an e-mail attachment, you can import the spreadsheet directly to a temporary table, use a query to distribute the temporary table's contents to the working tables, and then just dump the temporaries. It would be a manual process, but the manipulations required to diddle with an Outlook message attachment might be a bit tedious to program, particularly since you would need TWO application objects - one for Outlook and one for Excel (if you were using spreadsheets).
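The spreadsheet half is only a few lines; something along these lines, where the temp table, saved append query, and file argument are all placeholders, and the attachment has been saved to disk by hand first:

' Sketch of the import path: spreadsheet -> temp table -> append query -> cleanup
Public Sub ImportDailyUpdates(strFile As String)
    ' Land the spreadsheet in a scratch table
    DoCmd.TransferSpreadsheet acImport, acSpreadsheetTypeExcel12Xml, _
        "tblImportTemp", strFile, True

    ' Distribute to the working tables with a saved append query
    CurrentDb.Execute "qryAppendUpdates", dbFailOnError

    ' Dump the temporaries
    CurrentDb.Execute "DELETE FROM tblImportTemp", dbFailOnError
End Sub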

Of course, it is always your choice as to how to continue. But if you want to get this updated system running sooner, avoid direct application object tinkering at first and come back later to retrofit in a more controlled manner.
 
Wow, Doc_Man, this is a lot to digest (I wish I could honestly say I understood all of it!). I will break this down and work with our N6/NCTS folks to see what is in the realm of possible at this time.

At any rate, I thank you for taking the time to write that War and Peace of a reply!

Shoot me a PM with your Official Address, I have a CNE-CNA-C6F coin with your name on it!
 
Dang, I wanted that coin for my daughter (USAF officer). You're going to let the fact that Doc knows 50 times more than I do sway your judgement?!? :p :D
 
Thanks for the laugh! Send me a PM as well; can't let the AF go without!
 
First, pbaldy exaggerates. I doubt I know 50 times what he does. Maybe 1.0000050 times, at most...

Second, Gent, I've been where you are in terms of fighting slow systems.

We started out here in New Orleans as a site with a T3 line, and we were damned lucky (and very grateful) to have it. But that was before Hurricane Katrina hit our fair city. After we came back from an extended run at our COOP site (you might recognize the acronym COOP = Continuity of Operations Plan), we were tapped by the D.o.D.'s "Fairy Godmother" department for a reward of sorts. We had kept the US Naval Reserve's personnel management systems running (with a total of about 3 hours of downtime during the switch to the COOP site) even though the city had been hit about as hard as a city can be hit and still survive.

Several officers made their careers (and earned their Captain's Eagles!) running a site that served the nation during one of the worst natural disasters in modern history. At least two of them retired to the lecture circuit as consultants on data center disaster survival. Even though the HQ for USNR moved to Norfolk after that, NEDC NO stuck around because we obviously knew how to survive a disaster.

Once that happened, our networks switched to some pretty hot stuff. We are now running Gigabit Ethernet for our in-house backbone. The site hosts nearly 70 USN (and other agency) projects. But I remember what it was like when we honestly thought that we could "sneakernet" something faster than we could digitally send it. So I can commiserate with you on slow networks.

Good luck on whatever you have to do. BTW, your PM is sent with my official business mailing address and NMCI e-mail address.
 
