Access and LOTS of users

JohnW63

I have read the threads on this topic and I think I know the answer to this but I just wanted to be sure.

I work for a school district that has a newly purchased student attendance system based on MS Access (2002). The standard users grumble a bit about speed. They all have Access on their machines and just connect to shared data on our network. At the high schools, we may have 20-30 of these direct users. The real snag is that the system has a web-based feature that lets teachers take attendance in the classroom from a web browser. At the beginning of each class, the system gets 100+ teachers all trying to submit attendance at the same time. Of course the direct users are still connected too. This makes for a VERY slow system. Add to that a built-in Gradebook screen that the teachers can also reach via a web browser. All pointed at the same set of databases.

We have found that performance really drops as the user count gets much beyond 10 or so.

Our suspicion is that this is way too many concurrent users for Access and that we should be using a bigger db engine for this.

What is a reasonable # of users for this setup ?
Is this about what to expect with just Access ?
What product would be the best to port this to ?

JohnW63
 
Your problem is because of the way Access handles the configuration you just described.

When the data files are shared but Access is located on each machine, you are essentially using the shared machine as a raw file server. Every query has to touch the same data tables, so you are getting file locks all over the place.

What you really want is a robust, centralized application server based on something like MS SQL Server or ORACLE (with ODBC connections), because you don't want every copy of Access reading through the tables itself when it runs a query. (Sadly, that is EXACTLY what you are doing right now.) You want to send SQL commands and have only the input data or the answer data travel over your network. That will greatly reduce your network contention, which, in the configuration you describe, must be horrendous.
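A minimal sketch of that difference, using Python's built-in sqlite3 as a stand-in for a server-side engine. The table, column names, and row counts are all made up for illustration; the point is only the contrast between shipping SQL and shipping whole tables:

```python
import sqlite3

# sqlite3 stands in for a server-side engine here: the engine evaluates
# the WHERE clause itself, so only matching rows come back to the caller.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE attendance (student_id INTEGER, period INTEGER, present INTEGER)"
)
conn.executemany(
    "INSERT INTO attendance VALUES (?, ?, ?)",
    [(i, i % 7, i % 2) for i in range(10_000)],
)

# Client/server style: ship the SQL, receive only the answer.
absent = conn.execute(
    "SELECT COUNT(*) FROM attendance WHERE period = 1 AND present = 0"
).fetchone()[0]

# File-server style (what a shared .mdb forces on Access): every row
# crosses the "network" to the client, which then filters locally.
all_rows = conn.execute("SELECT * FROM attendance").fetchall()
absent_local = sum(
    1 for (_sid, period, present) in all_rows if period == 1 and present == 0
)

# Same answer either way; wildly different amounts of data moved.
print(absent, absent_local, len(all_rows))
```

Both paths compute the identical count, but the second one drags all 10,000 rows to the client to find it, which is the pattern that clogs a shared network.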
 
Doc_Man,

The shared data is housed on a Novell file server with lots of RAM (512 MB or 1 GB, I think).

The web access to the data is a Win2000 Server box that connects to the Novell Server via ODBC.

From what I have read, these are NOT the bottleneck. Is that correct ?

JohnW63
 
You didn't mention ODBC usage in your first post, so I said what I said. The fact that you are using ODBC to an Access Server setup probably means you are about as efficient as this thing is going to get, in terms of data flow, for your web pages.

However, something you said in your first post still makes me pause. The web access isn't through direct Access access. (Don'tcha hate Bill Gates for using product names like that...?) But the 20-30 other users who have Access on their terminals still connect through direct Access operation if I read you correctly. THIS is the part where you are using the Novell machine as a simple file server.

Unless you have a licensing issue on the Access server's concurrent user limits, consider splitting those 20-30 users into a Front-End/Back-End case (and the front end file can still be a single file shared on the Novell box), then make the back-end an ODBC connection through your Access server.

As long as Access has to read a database file remotely and not through ODBC operation, it has to read the database file all over the place through the network. It DOESN'T MATTER that the database is on a shared drive. It is the local copies of Access that are eating your lunch. You are not only reading the queries, forms, reports, etc. - you are also reading indexes and scanning tables across the network for those local copies.

Which doesn't preclude the possibility that either your network, your server, or your database architecture are contributing to slowdowns. But I have to reiterate, Access isn't noted for handling large numbers of users at one time. It is an OFFICE product - not a departmental or enterprise product. Don't get me wrong. I like Access. It can offer lots of really great abilities. It does a lot, and very quickly at that. But somewhere along the line, the helter-skelter layout internal to Access gets in the way.

My best advice: Take a good, hard look at growth potential for your demand. Unless you are already topped out in terms of how many folks will use this application at once, it is time to seriously consider something like MS SQL Server or ORACLE for your future environment.

Now, the remote users of Access, if they all switch to ODBC, could use your Access server, SQL Server, or ORACLE without much change. Just buy the appropriate ODBC library for your product. Link via ODBC to the tables you want. Access can handle that without much change to that front-end. And you MIGHT get some performance back by forcing the remote Access users to ODBC right now.

Another thing that might get back some performance is a regular regimen of database backup and compaction (Access's Compact & Repair; back up first in case the compact fails). As Pat Hartman reminded me, compacting rewrites all tables in order of their primary keys. This can be a good thing in terms of performance, despite the fact that disks are random-access devices.

Also, sweep through your queries to make sure that they base their major operation on fields having indexes. Make sure that no one regularly runs queries that have to do flat table scans. Look REAL HARD at table normalization, because that usually saves you space in any one table. When a table is smaller, more of it fits in buffers at one time, thus speeding up all searches and sorts. So normalization is both a space-saver AND a speed-saver.
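A quick way to spot the flat-scan problem is to ask the engine for its query plan before and after adding an index. A hedged sketch with Python's sqlite3, using a hypothetical table standing in for one of the attendance tables:

```python
import sqlite3

# Hypothetical table standing in for one of the attendance tables.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE grades (student_id INTEGER, course TEXT, grade TEXT)")
conn.executemany(
    "INSERT INTO grades VALUES (?, ?, ?)",
    [(i, f"C{i % 50}", "A") for i in range(5_000)],
)

def plan(sql):
    # EXPLAIN QUERY PLAN reveals whether the engine can use an index
    # or must fall back to a full table scan.
    return " ".join(str(row) for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

before = plan("SELECT * FROM grades WHERE student_id = 1234")
conn.execute("CREATE INDEX idx_student ON grades(student_id)")
after = plan("SELECT * FROM grades WHERE student_id = 1234")

print("before:", before)  # a SCAN of the whole table
print("after: ", after)   # a SEARCH using the new index
```

The same idea applies in Access: a query whose criteria fields are indexed can seek, while an unindexed criterion forces a scan of every row, and in the shared-file setup every scanned row crosses the network.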

You probably have done at least some of these things already - but now, if you are having performance issues, it is time to get positively paranoid about these issues. (Yeah, I know you didn't want to hear that.... but please don't shoot the messenger.)
 
Doc_Man,

That's interesting. We haven't looked at the structure or normalization of the db yet because it was developed by a third party and was just sold to us as a complete product.

I think that my boss has done a lot of testing on the system over the last month and has found that performance drops considerably as the user count goes up. I don't know if he has run an "only web access users" test to see how that fares with no "client" users working on the system.

I have done a little db programming with another product, but practically nothing with Access, so I could use some clarification on this statement :

"...consider splitting those 20-30 users into a Front-End/Back-End case (and the front end file can still be a single file shared on the Novell box), then make the back-end an ODBC connection through your Access server. "

How do you know what to access directly and what files to get via the web ? I talked to my boss about the structure of the DB system. All of the forms, queries and reports are on the directly connected users local drive. Only the tables are on the Novell File Server. There are about 135 tables and so many relationships that they exceeded the maximum allowed and had to make secondary tables to make them all.

From that, I am getting the impression that the more we funnel through the Win2000 Server box (which has no other function than to handle web requests, using ASPs, and then sending and receiving data to and from our Novell server via ODBC), the better the speed will be. Is that correct ? We've thought it was the other way around: that a direct connection (from client computer to a mapped network drive on the Novell box) was the faster method. Hmmmm.

We've been trying everything to speed up the web access. We tried a 1.8 GHz, 512 MB box with a gigabit Ethernet card connected to the Novell server through another gigabit card, and found little improvement. When we moved the shared data onto the Win2000 "web server" itself, the speed jumped up; then we found that the number of users connected to that copy of the data never got beyond 1 during the test, so it wasn't a real test. From what you're saying, if we ran a test of just web users accessing the data files, even with the files still on the Novell server and the "web server" reaching them via ODBC calls, the speed might be much higher. It's when we throw in the directly connected users that the problems start.

If MS Access is not so good at handling larger numbers of users, how hard would it be to use SQL Server or Oracle as the engine and still retain all the forms, reports, queries, etc... ?

Thanks for the insights so far,

JohnW
 
Sorry I didn't get back right away. I take things one day at a time, but lately my days have been ganging up on me...

How do you know what to access directly and what files to get via the web ? I talked to my boss about the structure of the DB system. All of the forms, queries and reports are on the directly connected users local drive. Only the tables are on the Novell File Server. There are about 135 tables and so many relationships that they exceeded the maximum allowed and had to make secondary tables to make them all.

Sounds like you may already have done a front-end/back-end job. Do the persons with local copies link to external tables as though they were flat files, or do they link using ODBC to the AC2K server? This makes a huge difference.

Again, it doesn't matter where the files are. It matters very much how they are queried. If the desktop copies of Access have to scan the tables as though the files are just links to separate files, you haven't bought a lot by having the Novell server manage the files. The only thing that helps is if you have Access running on the Novell server. But I don't think you can do that. We have Novell machines here. Novell doesn't run Access 2K Server that I know of. So that means you are using the Novell machine as a disk connected via networking. Which is a bad thing when the number of users goes up.

This next explanation is not intended to insult your intelligence, but it sounds to me like we aren't fully communicating on what happens in a remotely networked set of files with Access. This is what is going on...

1. User launches a query

2. Access wants to generate a resultant recordset, so...

3. Access starts reading all the tables across the network, keeping pointers to what it wants.

4. When the scan is complete, Access steps through the result set based on its local list of pointers to remotely located files.

5. The user examines the query results based on the result set pointers and the remote files.

Here is what happens when an ODBC or SQLnet operation has been set up.

1. User launches a query.

2. Access wants to generate a resultant recordset, so...

3. Access sends SQL commands to the remote server.

4. The remote server locally reads the query, then scans the files local to itself to find the result set.

5. ONLY THE RESULT SET is transmitted back to the user. The user examines the query results based on pointers stored at the remote server.

So the difference in the two methods is in steps 3, 4, and 5. The amount of data transferred is RADICALLY less. If you have shared files but local query processing, your network is getting blasted by the transfer of the ENTIRE DATA TABLE (or combined tables) as part of the query processing.
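Some back-of-the-envelope arithmetic makes the gap concrete. All the numbers here are illustrative assumptions, not measurements from your site:

```python
# Illustrative, assumed numbers: a 50,000-row table at ~200 bytes per row,
# queried by 30 concurrent users who each need ~40 matching rows.
rows, row_bytes, users, hits = 50_000, 200, 30, 40

# File-server model: each user's local copy of Access scans the whole
# table across the wire.
file_server_traffic = rows * row_bytes * users

# Client/server model: only each user's result set crosses the wire.
client_server_traffic = hits * row_bytes * users

print(f"file-server  : {file_server_traffic / 1e6:8.2f} MB moved")
print(f"client/server: {client_server_traffic / 1e6:8.2f} MB moved")
```

With those assumptions, that is roughly 300 MB of traffic versus about a quarter of a megabyte, for the same answers. Joins multiply the first figure further, because joined tables may be scanned more than once.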

The part that gives me the heebie-jeebies is that you have so many relationships. This usually implies lots of tables that have to be joined. And when you do joins like this, you have to iteratively scan each table several times across the net when doing joined queries.

I cannot emphasize enough that you need to consider which of these two scenarios applies to your site. Because you can do a LOT of good with conversion to ODBC - but not if you are there already. And Access works correctly with ODBC databases assuming you bought the appropriate library files to support that operation. The table setup wizard can help you set up the ODBC links if you need that kind of help.

By the way, the raw numbers matter here. Fast (100 Mbit) Ethernet tops out at 12.5 Mbytes/second before any hand-shaking overhead, and even gigabit Ethernet's theoretical 125 Mbytes/second erodes quickly on a busy shared segment. A modern SCSI bus, by contrast, can sustain disk transfer rates of 20 Mbytes/second or more to local disk, particularly SCSI-3 or higher.

All you do when you use that hot Novell box is swap to a slower interface - the shared network. And if it is Ethernet, it is CSMA/CD - with the last two letters being really important: "CD=Collision Detect". As network traffic increases, you approach the saturation level. Past some point, the odds of any single message being successfully forwarded start to drop quickly - hence re-transmits and other things that cause delays.
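A little arithmetic on link speeds shows the squeeze. The payload size and link rates below are illustrative assumptions (nominal rates, before overhead and collisions):

```python
# Illustrative assumption: time to move a 10 MB table image across various
# links, ignoring protocol overhead and collisions (real shared-Ethernet
# throughput degrades further as the segment saturates).
payload_mb = 10
links_mb_per_s = {
    "10 Mbit Ethernet": 1.25,
    "100 Mbit Ethernet": 12.5,
    "Gigabit Ethernet": 125.0,
    "Local SCSI disk": 20.0,
}
for name, rate in links_mb_per_s.items():
    print(f"{name:>18}: {payload_mb / rate:5.2f} s per full scan")
```

Multiply the per-scan time by 20-30 simultaneous users, each triggering scans as they work, and the shared segment saturates long before any single transfer looks slow on paper.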
 
Doc_Man,

I think you've got it right.

The Novell file server just houses the tables for the attendance system. All other files (queries, reports, forms, etc.) are on each LAN user's local hard drive. The files are accessed by just pointing MS Access at a remote data location (drive letter K:, in most cases). No ODBC connection for these LAN users. Now that you've detailed the process of running queries, I can see how there would be a LOT of network traffic generated by relatively simple db tasks. Each person's copy of Access would be chasing pointers into the remote tables and doing all sorts of scans and joins to answer queries or fill screens with info that may come from various tables. Considering we have a LOT of tables with a LOT of relationships, this could get hairy very quickly.

Now add to that mix, the teachers in classrooms that want to submit attendance via a web browser, every 60 minutes or so, plus a web based gradebook. These 100+ people all pull up a web page served up by a Win2000 Server box that sends their "posts" to the Novell server via ODBC. ( Of course the campus network has the job of handling roving profiles to 100s of computers and other normal LAN stuff too. )

I guess the trick is to reduce the Access-induced traffic as much as possible. One way to do that would be to have everyone run via a web interface. Unfortunately, that is not a feature of this program; the only web capabilities are the class attendance and gradebook. The other way would be to have all the table work done on the server and only the results sent back to the user's application. That would require a server running a DB server application: something like SQL Server, Oracle, or some other db product. There is a version of Oracle that runs under Novell, and something called Pervasive SQL. The problem would be converting the tables, reports, queries, forms, .... Then support for the system becomes OUR responsibility, not that of the vendor we bought the program from.

On that thought, if we were to have the LAN people not run this Access-based program, and then have the teachers submit attendance and run the web-based gradebook, it should be noticeably faster, right ? It was the web-based people who found the huge slowdown, so we've been thinking THEIR connectivity was the problem. If the web access was actually just the last straw on an already loaded system, I'd like to have a way to test that.

It really seems like the program the school district purchased was not designed for as many users as we have on it. An elementary school with only around 5 LAN-connected users and, say, 30 web users would be OK. At our high schools, it's very overloaded.

Does it sound like I have the problem understood ?

( p.s. Feel free to assume very little knowledge on my part , as far as Access goes. I have installed it many many times and worked with it sparingly, but I have never gotten any further than that. My job is IT support and repair with some Novell admin thrown in. )


Thanks for the on going info. I have been sharing your ideas with the boss.

JohnW
 
John, I think you've got the idea.

Feel free to share my thoughts with your management. I'll try to keep it relatively clean. (I can sometimes get a bit colorful, but I now consider myself warned.)

The web traffic probably isn't really that much of an issue. It is just that, with all the file content traffic cluttering the "data highway", the web traffic is encountering a traffic jam. The earlier comment about "collision detect" - for those members of your management who aren't familiar with network details - is what happens when you try to take an on-ramp to a freeway and the traffic is so heavy that you can't see an opening into which you could merge.

With regard to the load of "roving profiles" - don't worry about that traffic. It might experience a momentary delay - but once you log in, that is all low-level stuff. Your problem is raw data traffic between the workstations and the file server.

I understand your point about support responsibility shifting onto you and away from the vendor who sold you this package. "You break it, you bought it." Very common attitude, I might add. But don't downplay that vendor's ability to help you. If the package is already split into front-end/back-end as you describe, it is worth asking them a point-blank question about where their support really ends.

In other words, ask them if converting the direct access link to an ODBC and SQL Server or ORACLE Server configuration breaks their product. If they say "Yes" - ask them why. Don't be afraid to play some hardball on this issue. Because, you see, Access definitions work pretty well in either situation. Pat Hartman might have more experience on this than I do.

You've already indicated that an ODBC link exists because of the way your web stuff works.

Now add to that mix, the teachers in classrooms that want to submit attendance via a web browser, every 60 minutes or so, plus a web based gradebook. These 100+ people all pull up a web page served up by a Win2000 Server box that sends their "posts" to the Novell server via ODBC.

OK, if ODBC links already exist, find out from your vendor if they have a way to exploit those links! Explain to them that the traffic load is a problem and that ODBC links have been proposed as a way to reduce the traffic load you see.

You also commented

if we were to have the LAN people not run this Access based program and then have the teachers submit attendance and run the web based gradebook , it should be noteably faster, right ?

Maybe it would. It is testable in exactly the way you suggested. But this might be a rather Procrustean solution. (Procrustes was an innkeeper from ancient times. He didn't like criticism. If you complained that the bed was too short, he would cut off your legs to make you fit.)

Crippling the product by turning off an important feature is NOT an acceptable long-term solution, I'm sure. So I would first consider working with your original vendor on a way to make your solution fit the environment with reasonable performance. I might not know the details of your product, but I've been on both sides of the software product issue so I know a LOT of the procedural issues when a product doesn't quite work the way you wanted. I spent over 12 years as an employee of a commercial software vendor before I started working with the U.S. Government as a services provider.

Perhaps the vendors can offer an upgrade to make other functions web based that are currently being handled by networked file operation through Access. Or perhaps they can offer an ODBC upgrade. Be willing to flex with them a little bit. Be aware that they might wish to charge you for the added work. But if it makes things work more smoothly, perhaps it is worth it. That is your call, though - not mine.

For example, if the vendors say that they need to make the back-end files reside on a Windows box instead of a Novell file server box in order to offer a workable ODBC solution, that is a reasonable statement. (Lack of products on the Novell side is a legit issue.) Inability to provide an ODBC-based linkage AT ALL, on the other hand, is probably NOT a valid statement. It would reflect very negatively on that vendor's knowledge of the commercial product on which they based their offering to you. You would be justified in asking why they offer a value-added product built on a widespread commercial product if they don't understand that product (Access) well enough to offer the service you need in this complex environment.

I don't know the laws in your state, but if you have to get REALLY "hardball" with them, a comment about "selling a product unsuitable for the specified task" thus leading to a legal action called "redhibition" usually gets someone's attention. You want to get your legal staff involved before going down this road. The law in question varies from state to state and country to country.

BE WARNED! That is a last-straw action, to be used only if the NEXT thing you are about to do is to break the warranty because you just can't live with the situation any more AND can't get any satisfaction from the vendors.

To keep the explanation simple, redhibition is what happens when someone sells a product unsuitable for a specified task. You can legally FORCE them to take the product back and simultaneously return the purchase price in full. It is a spin-off of laws against false advertising. Which is why you would need your legal staff involved, to decide whether redhibition is even going to work at your location. And REMEMBER - this is the act of desperation. It ONLY WORKS ONCE if it works at all.
 
Doc_Man,

Rumor has it that the vendor is working on an MS SQL Server version for a future release. When that will actually see the light of day has not been revealed.

The problem with that is that we run all of our schools on Novell servers and have them all within a single remotely manageable "tree". There is no way to include an MS server within the Novell NDS tree. That means our ability to control access to that server and its volumes would end, short of driving out to that school and sitting in front of the box, of course. If we can get it into a product like Oracle, which I believe still has Novell compatibility, we would be happier. (Oracle has a data migration product for converting from Access, by the way, but to have EVERYTHING changed requires some programming by Oracle "partners".)

By the way, I wasn't implying that we stop using the Access program and only use the web features as a solution, just as a 15-minute test to see if the web people can tell the difference. I was looking for concrete verification of the bottleneck. We will probably need some "proof" before we ask pointed questions of the vendor.

Which leads to another question. Do you know if there is a limit to the ability of an ASP running on a web server to send and receive ODBC traffic to the file server ? For attendance, ALL the teachers hit this page at about the same time when taking roll, every 60 minutes or so.


Thanks for the help,

JohnW
 
Sorry, none of my own apps are web-based. Can't help you there. But perhaps others reading this thread could comment.
 
