Performance issue causing Access to crash and burn

LaBelette

Hi folks!

I've been experiencing a very frustrating problem recently.

One of my databases crashes randomly, and I'd like to know if any of you have tips to help me.

The database is split into a front-end and a back-end, both on the server. I know keeping the front-end on the client PC would improve performance, but due to the high number of users (about 60 to 80) and the need to do frequent updates, installing the front-end on each machine is not an option for now.

Users are not all directly connected to the server. Those who are directly connected to it (in building A) have no problem. Those who aren't (building B) have long lags on opening (about 30 seconds), and on just about anything that runs a heavy query, most often leading to a crash. Access displays the message saying that the database needs to be repaired, and restarts.

I tried several things to improve my queries, like eliminating most of the GROUP BY and UNIQUE clauses and avoiding the use of functions.

I have a lot of conditional formatting in the forms, but I don't think this is the problem, since it is executed on the local machine (as far as I know).

My theory is that the lags cause some calls to the DB to time out. I just don't know how to fix that. Is there an option to increase the timeout delay?

My other theory is that the problem is caused by packets dropped during the connection to the DB. I tried downloading large files (100 MB to 250 MB) from the server several times and never had a problem. Although the connection is quite slow, the ping is very low, below 10-20 ms. Are there better ways to test a network connection, if possible without installing software (I have limited rights on the machine)?
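For what it's worth, here is roughly how I ran the ping test from the VBA window, since I can't install anything ("ServerA" is a placeholder for the real server name):

Code:
Sub TestLink()
    Dim sh As Object, ex As Object, sLine As String
    Set sh = CreateObject("WScript.Shell")
    ' fire 100 echoes so intermittent loss has a chance to show up
    Set ex = sh.Exec("ping -n 100 ServerA")
    Do While Not ex.StdOut.AtEndOfStream
        sLine = ex.StdOut.ReadLine
        ' the summary line reports the packet loss, e.g. "(0% loss)"
        If InStr(sLine, "loss") > 0 Then Debug.Print sLine
    Loop
End Sub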

Would using replicas be a solution? What are the pros and cons? I've never used them and know very little about them. My idea would be to put a back-end on both servers, but I guess the data would not be as live.

BTW, I am using Access 2002 (2000 file format) on an XP SP2 PC.

Thanks to all of you patient enough to read all of this! I hope some of you have answers, tips or tricks!
 
You need to find out what is different about the building B users' connection to the server where the db's are located. Access will time out if the user loses the connection to either the front end and/or the back end [even for a brief moment]. Once the connection is lost, Access will either lock up or give an error message like runtime error #3043 - Disk or network error.

Splitting your db is a must in a multi-user environment. Updating each user's front end is very easy to do. There are a lot of posts on how to do it [automate the process].

Your problems will not go away if you continue to have the users share the front end. It must be installed on each user's computer, and each front end must be linked to the one back end on the server, along the lines of the relink sketch below.
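If it helps, here is a bare-bones sketch of a DAO relink routine for Jet tables (you pass in your own back end's path):

Code:
Sub RelinkTables(strBackEnd As String)
    Dim tdf As DAO.TableDef
    For Each tdf In CurrentDb.TableDefs
        If Len(tdf.Connect) > 0 Then   ' only touch linked tables
            tdf.Connect = ";DATABASE=" & strBackEnd
            tdf.RefreshLink
        End If
    Next tdf
End Sub

Call it once after copying the front end down, e.g. RelinkTables "\\YourServer\Share\BackEnd.mdb".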
 
Thanks for the info, GHudson! I never get any runtime error when it crashes, but that might only be because of the error handling in my forms' code. I'll check for that.
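Most of my handlers look more or less like the sketch below; I'll change them to at least log the error number, so a 3043 can't slip by unseen:

Code:
Private Sub Form_Current()
    On Error GoTo ErrHandler
    ' ... the actual form code ...
    Exit Sub
ErrHandler:
    ' log instead of swallowing, so a "Disk or network error" shows its number
    Debug.Print Now, Err.Number, Err.Description
    MsgBox "Error " & Err.Number & ": " & Err.Description
End Sub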

The difference between the two buildings? About 20 miles... :D

Seriously, the back-end is located on the server in building A, so the users in that building are connected over the LAN. The users in building B are connected via a VPN or some kind of remote connection.

I'll search for the posts about automating the front-end update process. One thing I did not mention is that, most of the time, there are only one or two users connected simultaneously, and the problem occurs even when there is only one user. Most users avoid using the DB since it crashes... what a shame for me! :(

I'll see if putting the front-end on the local machines resolves the problem. Meanwhile, if anyone has any other ideas, please let me know. :o
 
To back up ghudson, here are MVP Allen Browne's thoughts on the subject. And here is MVP Tony Toews' Auto FE Updater.
 
Thanks!

Thanks for the links; now I have something to back up the hours I'll be spending on this. It will help my client understand the need for a front-end on each machine.

I don't know if I'll use the Auto FE Updater or write my own. For security reasons, my client won't allow me to use an application taken from the net unless I have, have read, and have understood the whole source code. I can understand why!
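If I go the home-made route, I picture something as simple as this in a small launcher mdb (all the paths here are made up; Office10 is where my Access 2002 lives):

Code:
Sub UpdateAndLaunch()
    Const MASTER As String = "\\ServerA\Apps\MyApp.mdb"   ' made-up paths
    Const LOCALFE As String = "C:\Apps\MyApp.mdb"
    ' pull down the master copy if the local one is missing or older
    If Dir(LOCALFE) = "" Then
        FileCopy MASTER, LOCALFE
    ElseIf FileDateTime(MASTER) > FileDateTime(LOCALFE) Then
        FileCopy MASTER, LOCALFE
    End If
    Shell """C:\Program Files\Microsoft Office\Office10\msaccess.exe"" """ & LOCALFE & """", vbNormalFocus
End Sub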

Thanks for your help, guys!
 
LaBelette,

I hope that you have the situation under control. Ghudson and RG are valuable assets to this community.

BUT, you don't "LOOSE" things!

This has been covered in the WaterCooler extensively! Not to make you "LOSE"
your train of thought here, but I am sick and tired of hearing that people
loose things.

Seriously, though, your remote users will not be happy if you:

1) Base your forms on entire tables. This requires the entire contents of the table to be transferred over the net to the front-end. Instead, you should have the front-end form use queries to restrict the number of records that it sees (see the sketch after this list). Forms based on entire tables will seriously hamper performance.

2) Combo/Listboxes can be held in front-end tables, if the information does
not change.

3) If this is a SQL Server app, you can increase the timeout value for the
front-ends that are very remote.

4) Sorry G, the "LOOSE" thing really bothers me.
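To illustrate point 1, a minimal sketch (the form and control names are invented):

Code:
' open the form on just the rows the user asked for, not the whole table
Private Sub cmdOpenOrders_Click()
    DoCmd.OpenForm "frmOrders", acNormal, , "CustomerID = " & Me.cboCustomer
End Sub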

Wayne
 
What about the "LOOSE"?

I don't base forms on whole tables, and I don't use SQL Server. Just a good ol' Access back-end. Thanks for the tricks anyway.

About putting the tables that feed my combo/listboxes in the front-end: I never thought of that, and I think it's brilliant, since my front-end is going to be on the local machines soon. The data they contain changes from time to time, but I can update the front-end when that happens.
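I figure refreshing the local copies could be as simple as this, run at startup (table names invented; tblStatus would be the linked back-end table, tblStatusLocal the local copy):

Code:
Sub RefreshLocalLookups()
    With CurrentDb
        .Execute "DELETE FROM tblStatusLocal", dbFailOnError
        .Execute "INSERT INTO tblStatusLocal SELECT * FROM tblStatus", dbFailOnError
    End With
End Sub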

I just don't get what you are talking about with the "LOOSE" thing. Is it from when I talked about dropped packets? Did I say something rude or inappropriate, or did I mention a subject that has already been discussed over and over again? I am a French speaker, so it might just be a misunderstanding. Sorry about that anyway.

One last thing: I am very grateful to GHudson and RG. Keep up the good work, guys; the Internet needs more guys like you! :D And you too, WR!
 
LaBelette,

I am not sure what your problem is, but I have some hunches.

You say you have 60-80 users. The question isn't how many you have, but how many you have at one time.

Issues that can come up require you to think about what is really happening inside your machines.

You have a file server sharing its little heart out.

You have a bunch of workstations clawing and scratching for information from the file server.

You have a network with all these users whacking away at the same file on the same disk using the same heads and putting shared information in buffers that start in the same pool of memory.

THEN you add to the mix that you are running Access, which means that each query must pass the entire contents of the affected (afflicted?) tables to each workstation running that query, because the shared machine in question is a FILE server, not an Apps server.

I see a system with HEAVY contention for resources. You are looking at disk cycles, CPU cycles, network cycles, and memory buffers all being in high demand. If the server in question isn't robust enough to guarantee a high SUPPLY of these resources, you will find out just how unstable Windows can get. Because a Windows system under strain eventually dies on you.

Why? Because of a little thing buried in many of the resource routines that run behind the scenes in Windows - a timer. If you don't get a resource you need in a certain time, bad things happen as I/O rundowns get triggered, or as processes go into memory-wait states while the swapper goes nuts trying to find room in memory for one more process.

When that happens, one of your resources runs out eventually. When it does, you get a process or system crash.

The way to minimize this (not prevent it, for you cannot) is to be sure that the file server has LOTS of memory and LOTS of CPU power. It also would not hurt to be sure it has the fastest possible connections to its disks, such as fiber-channel or better, and that the disks are at least 7200 rpm if not faster. (I've seen 10000 rpm and have heard of 15000 rpm.) You see, the thing you DON'T want to happen is for a program to wait too long, so that one of those timers I mentioned goes off.

People often make the mistake of finding an old, tired machine to be the file server for a complex. I don't know if your folks did this, but if they did, it is part of your problem.

A machine for file services should have at least a couple of GB of RAM, plus at least another GB for every 5 or 6 users who share the machine simultaneously. At least. You should have a very fast disk dedicated to the swap file function. Nothing else on the disk. Just one humongous swap file. If you can get a multi-CPU system, or at least a Hyper-Threading system with a dual-core CPU, that helps on the CPU side. But make it a fast CPU. Not less than a couple of GHz. The network card should never be less than 100 Mbit, and if you could land a Gbit card for everyone and a network backbone capable of Gbit speeds, that would also help.

If the machine is a file server for 60-80 people, it should be a DEDICATED file server. No other function. Don't make it divert its attention to any other purpose like DHCP or DNS or Firewall or (heavens forefend) Web services. Particularly not Web services.

The machine should not be exposed to the internet, only your intranet. It should be behind a really good firewall. You should have your sysadmin sweep the machine for running services that it doesn't need. For example, if you are going to have a fixed IP address, disable DHCP on that machine. Turn off network services like SMTP or SNMP if the machine is going to be dedicated in purpose. Your one goal is to ensure that the machine has nothing else to do but serve files. Then be sure it has the resources it needs to do so.
 
In a utopian world...

Thanks for that extensive reply, Doc! It's just too bad that the actual server belongs to my client.

If it were mine, I'd be more than willing to set it up like you proposed! In fact, I am about to build one of my own, and your tips will be very useful! I like the idea of putting the swap file on a dedicated drive. I guess I found a use for those two Raptor drives and that RAID controller!

The problem I have is that this server is a file server (file server only, no other application) over which I have absolutely no control. I already proposed that my client get another server dedicated to Access applications, but it was refused, for security and performance reasons. They say putting another server on their lines would be too heavy for the network... so we put all those files on the same server, which can't cope and becomes slow; sometimes it even just "dies".

I guess my clients prefer having me charge $10,000 for work I could have avoided over paying $5,000 for a killer Access-dedicated file server. :D Kind of sad. :(

Has anyone ever faced this kind of situation? I wish I were able to show my client the importance of having a dedicated machine, but I am not very persuasive... any tips?
 
LaBelette said:
Has anyone ever faced this kind of situation? I wish I were able to show my client the importance of having a dedicated machine, but I am not very persuasive... any tips?

Let their server crash and burn? I wish I were in the position of taking tens of thousands of dollars from stupid people (even at the current exchange rate) ;-)
 
reclusivemonkey said:
Let their server crash and burn? I wish I were in the position of taking tens of thousands of dollars from stupid people (even at the current exchange rate) ;-)

Err... something honest? ;)
 
