Performance Analytics

systemx

Hi all,

I work in a medium-sized (80-100 people) Government Contact Centre. As some people may know from previous posts, for several months on and off I have been developing a database which could best be described as a 'ticket of work' system for many of our transaction channels.

I am currently rolling out the 'Beta' version of the database, and am noticing some slowdown in performance.

The database is Access 2000, one F/E linked to two B/Es (writing to one back end 90% of the time), with the B/Es sitting on one of the networked drives (the F/E is local).

I have addressed some of the issues that relate to performance (such as persistent locking of the B/E) and am seeking some further advice. Unfortunately, as our I.S. department does not support Access use, I am unable to seek advice at work.

1. Does anyone know of any tools or methods for measuring database performance?

I am particularly interested in response times and how these are impacted by traffic/load. It would be particularly beneficial if there were a tool available that recorded data automatically (e.g. in an Excel worksheet) for later analysis.

2. A question re: code efficiency. I am primarily using ADO to open and manipulate recordsets. This may be a silly question, but being self-taught I have missed a lot of the obvious stuff along the way. With ADO, I can either open an entire table, or use an SQL SELECT to open only a specific record. Do the two methods differ greatly with respect to performance (especially from the perspective of other users)?

Sorry about the long post, and thank you in advance for any advice you can offer!

Cheers,

Rob
 
I don't think your post was large; it is concise and to the point. Compliments on your post.

1. Performance.
I find performance is usually restricted to one or two issues.
a) Database size: make sure to compact both the F/E and B/E.
b) Network traffic: Access relies on the stability and availability of the network and is not very efficient in its use of network resources. This is something you can monitor using any old network monitoring tool... probably one your IT department can supply.

2. ADO
If you open the whole table, you then need to do the searching afterwards in code. If you are allowing the user to browse the table, then opening the full table or full query is the way to go.
If, on the other hand, you only need the one record, the query is the way to go.
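To illustrate the difference, here is a minimal VBA sketch of the two approaches. The table name `tblTickets` and field `TicketID` are hypothetical, and this assumes a reference to the Microsoft ActiveX Data Objects library:

```vba
' Sketch only: tblTickets and TicketID are made-up names.
Sub OpenOneRecord(lngID As Long)
    Dim rs As ADODB.Recordset
    Set rs = New ADODB.Recordset

    ' Option 1: open the whole table. Jet pulls the data across the
    ' network and any filtering happens afterwards in your code.
    ' rs.Open "tblTickets", CurrentProject.Connection, _
    '         adOpenKeyset, adLockOptimistic, adCmdTable

    ' Option 2: SELECT only the record you need. With an index on
    ' TicketID, far less data has to travel over the wire.
    rs.Open "SELECT * FROM tblTickets WHERE TicketID = " & lngID, _
            CurrentProject.Connection, adOpenKeyset, adLockOptimistic, adCmdText

    If Not rs.EOF Then Debug.Print rs!TicketID
    rs.Close
    Set rs = Nothing
End Sub
```

For multi-user work, Option 2 also tends to be kinder to other users, since shorter reads mean fewer and briefer page locks on the B/E.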

All of the above is based on my experience and my opinion; by no means take it as the one and only absolute truth.
 
The only thing to add is to consider indexes on tables.

L
 
I would amplify the issues.

1. Indexes are important. You want indexes based on the following ideas.
1.a - For tables that do mostly lookup and very little updating, you can live with multiple indexes. For tables that do a lot of updating, the fewer indexes the better.
1.b - Use the shortest keys possible for any index.
1.c - Remember that when working through JOINs, the indexes on all participating tables also participate.
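As a small sketch of point 1, here is how an index could be added with Jet DDL. The table and field names are made up, and note that Jet DDL cannot target linked tables through the front end, so this would be run from within the back-end database itself:

```vba
' Hypothetical sketch: index a field you search or join on.
' Run from the back-end database, not against a linked table.
Sub AddStatusIndex()
    CurrentProject.Connection.Execute _
        "CREATE INDEX idxTicketStatus ON tblTickets (Status)"
End Sub
```

Rule of thumb per 1.a and 1.b: index the fields you search and join on, prefer short keys such as Long integers, but keep indexes to a minimum on heavily-updated tables, since every update must also maintain each index.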

2. Maintenance issues
2.a - Every database needs a regular regimen of cleanup and maintenance. The more you update it, the more maintenance it will need. The more users you have, the more maintenance it will need. The more ad-hoc reports you allow, the more maintenance it will need.
2.b - BACKUP, BACKUP, BACKUP. Can't say it enough. BACKUP YOUR WORK!!!
2.c - In a complex environment, you need to be extra careful in terms of your user base and the "principle of least permission." Make sure that everyone has exactly what they need to do their job. Make sure that NO ONE has any more than that.

3. Growth issues.
3.a - A database that size is going to grow. You are going to face two issues much sooner than you probably wanted.
3.b - You will have to archive your database on a regular schedule to shorten it, reduce table sizes, etc., or you will be eaten alive by historical data.
3.c - You will have to seriously look into converting an Access BE to something like an SQL Server backend.

4. Performance
4.a - It isn't the users that kill you. It is the SIMULTANEOUS users. Once that gets over 10-15, you start running into SERIOUS locking issues because of the probability of locking tables. Particularly since call centers tend to look at the most recent entries in preference to older entries. (I.e. locus of attention is focused rather than diffuse.)
4.b - Going to a SQL Server back end changes locking tremendously, since you no longer need to lock ANYTHING quite as long. (The query's returned recordset remains locked, but the base tables don't have to be.)
4.c - The problem with Access network performance has ALWAYS been that to Access, the database is FILE-served, not database-served. Processing is ALWAYS done on each user's local machine, with a massive hit to the network any time you do a large query. Going to a SQL BE means the processing is done on the BE server, not on the FE workstation. Network traffic drops like a stone. So down the road, you might really, seriously need to consider upgrading the BE to something other than Access.

Going to a new BE server concept is not the be-all and end-all solution. But it is a big step towards efficiency on all levels.
 
Hail to that, Doc... Great response! Very detailed and all...

2.b. is something people tend to not do
4.c. is something most people do not realize.
 
A huge thanks The_Doc_Man for the comprehensive response.

This has been my first serious database - so a learn as I go project and much rework along the way as I have researched and developed better methods for getting from point A to point B.

I have taken some reasonable measures in that -

1. F/E is compacted upon exit
2. B/E is compacted manually daily
3. B/E is backed up twice daily (automated as long as the Reporting Engine - a separate F/E for pulling data out - is open)

I have put a great deal of thought into access levels and, as all forms are unbound, I am controlling what the user gains access to.

With respect to archiving - that is one of the main reasons I was hoping to find a performance monitoring tool (I'm thinking of attempting to build something to do this in Access as a future project).

I am keen to find the 'sweet spot' as far as performance goes and set up a second B/E for archived records, still accessible through the F/E but perhaps with a 'View All Records' switch to query either only live records, or live and archived records. I still have to read up on best practice as far as archiving is concerned.
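The 'View All Records' switch described above could be sketched roughly like this, assuming a checkbox on the form and hypothetical table names `tblTickets` (live) and `tblTicketsArchive` (archive B/E, linked in the F/E), with identical column layouts so the UNION lines up:

```vba
' Sketch: toggle the form between live-only and live + archived data.
' All object and table names here are assumptions.
Private Sub chkViewAll_AfterUpdate()
    If Me.chkViewAll Then
        Me.RecordSource = "SELECT * FROM tblTickets " & _
                          "UNION ALL SELECT * FROM tblTicketsArchive"
    Else
        Me.RecordSource = "SELECT * FROM tblTickets"
    End If
    Me.Requery
End Sub
```

Keeping the default view live-only means the archive B/E is not touched at all in normal use, which is where most of the performance win comes from.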

As for SQL Server... I want it. Oh, how I want it. I.T. and I do not see eye to eye on this one though :( I'm working on it, and hopefully within 6 months I can have a win!

Thanks again to all for the advice, I am always amazed at the wealth of knowledge you have!

Cheers,

Rob
 
Rob, you are welcome. As to the depth of my knowledge, I have lots of experience.

You know the Devil's Dictionary definition of experience, don't you? ...

Experience is the ability to recognize your mistakes when you make them again.
 
On a different note, performance monitoring tools are going to be rare for Access other than its own internal database analyzer stuff under Tools.

However, depending on your workstation setup, you COULD run the system performance analyzer and look at network load, CPU load, and some of the disk-related stats. Each version of Windows differs slightly in what it will show you. Since you are unlikely to find an inside-out performance tool, your only other bet is the outside-in viewpoint of the Windows System Performance tools.

Side note: If you REALLY like VBA, there are threads here you could look up to do some sleuthing from inside a hidden form that samples your user actions. Or you can just make your forms do invisible logging with time stamps. The trick will be to get TWO time stamps. One, the date. The other, a time stamp that is milliseconds since midnight. I think the NOW function or the TIME function will do this. If not, there are Windows system calls you can make as externals to do the same thing.

Logging to the millisecond gives as much performance data as you can hope for in terms of when individual events occur. Like, taking a time stamp BEFORE and AFTER a particular event that opens, updates, and closes a given recordset. Beyond that, performance is an iffy thing.
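The before/after time-stamp idea could be sketched in VBA like this, using the built-in Timer function (seconds since midnight, as a Single with sub-second precision) paired with Date to cover midnight rollover. The log table `tblPerfLog` and its fields are hypothetical:

```vba
' Sketch: time one operation and log the result. tblPerfLog,
' tblTickets, and the field names are made-up examples.
Sub LogTimed(strEvent As String)
    Dim sngStart As Single, sngElapsed As Single
    Dim rs As ADODB.Recordset

    sngStart = Timer

    ' ... the operation being measured, e.g. opening a recordset ...
    Set rs = New ADODB.Recordset
    rs.Open "SELECT * FROM tblTickets", CurrentProject.Connection
    rs.Close
    Set rs = Nothing

    sngElapsed = Timer - sngStart
    CurrentProject.Connection.Execute _
        "INSERT INTO tblPerfLog (LogDate, EventName, Seconds) " & _
        "VALUES (Date(), '" & strEvent & "', " & sngElapsed & ")"
End Sub
```

Logging to a table rather than straight to Excel also means the results can later be queried or exported for analysis.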
 
Does anyone know how the performance is with Office 12 - Access? I am running into performance issues now too. I haven't deployed my database to my users yet, but in terms of my forms and reports the design view is sluggish and lagging by about 5 seconds or so.

Didn't know if anyone here uses Access 07 and whether they have addressed any of the performance issues?
 
The_Doc_Man said:
Logging to the millisecond gives as much performance data as you can hope for in terms of when individual events occur. Like, taking a time stamp BEFORE and AFTER a particular event that opens, updates, and closes a given recordset. Beyond that, performance is an iffy thing.
You are looking for "Timer"

Performance is all in the eye of the beholder.
If a user has something he does manually that takes, say, 4 hours and you can automate it down to 2, that is GREAT!
Then if the performance drops to 2 hours and 5 minutes... then it is performing badly.

And no one is saying that 2 hours is actually optimal; it still seems great! When actually it could be done in 1 hour, but no one knows... so who cares?

As long as users are happy that is all (or most) that counts.
 
As long as users are happy

Does that ever happen?

I once completely rewrote an application from the ground up. I won't go into the bloody gory details. Before I touched it - 22 minutes 15 seconds on the test file. After - 17 seconds. So we published the new utility program. What happened?

Three days later one of the developers passed me in the hallway and said, "You no-good S.O.B., thanks for nothing."

So of course I asked what brought that on? His reply:

"Before you stuck your finger in the pie, we had a guaranteed coffee break while that program was chugging away. Now we don't even get a pee break."

You just can't win some days.
 
The_Doc_Man said:
Does that ever happen?
LOL @ Doc...

Yes it does... Sometimes... But this clearly falls into the category of: "If it ain't broke, don't fix it."

Users are strange creatures... They can be perfectly happy with a crappy solution and totally mad at you for the perfect one.

Just the other day I had a user saying: WTF, you are done already? That's way too fast (2 hours of work on a HIGH priority problem)... Now I have to rush and do the final work on this while I haven't got time :(

Then the same afternoon, different issue, same user... Me: Sorry, but this is going to take at least an hour!
User: Unacceptable, we need this up and running within 15 minutes...

They just don't have a proper concept of things...
Men are from Mars, Women from Jupiter
Programmers are from earth, users are from ....???
 
