Split Database Runs Slowly under Different Versions of Access

MSAccessRookie

Greetings to all.

I have been assigned the task of supporting an Application written in Access 2003, and I have made modifications to the Application for user enhancements as well as bug fixes.

The Application was never split into an FE/BE format, and because only one or two users were using it at any given time, it did not create many problems for them. Recently, an influx of new users has been creating the expected problems, so I arranged to Split the Database and send it to a Test Environment, where it runs so slowly that it sometimes appears to have stopped.

I have spent some time researching the issue and upgrading the Database to comply with expert recommendations, such as the ones contained in the link below. As a result of the modifications (some of which are listed below), the Split Database is now functioning at an acceptable level (at least 80% as efficient as the original).


http://www.granite.ab.ca/access/performancefaq.htm
  • The Database Back End was relocated as close to the Root of the Network Device as IT Security would allow.
  • I have been assured by Corporate IT that Anti-Virus Software ONLY checks Local Drives and that Network Drives are handled through other processes.
  • Subdatasheet Names have been set to [None] (see the code sketch after this list).
  • Use of Domain Functions in Queries was changed to JOINed Tables.
  • Indexes were added where appropriate.
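Since the Subdatasheet change has to be made on every Table, here is a minimal sketch of how it can be applied programmatically (my own illustration, assuming DAO and that it is run inside the Back End; system tables are skipped):

Code:
Public Sub TurnOffSubdatasheets()
    ' Sets the SubdatasheetName property of every non-system table to "[None]".
    ' Run this in the Back End (or repeat in the Front End for local tables).
    Dim db As DAO.Database
    Dim tdf As DAO.TableDef
    Dim prp As DAO.Property

    Set db = CurrentDb
    For Each tdf In db.TableDefs
        If Left$(tdf.Name, 4) <> "MSys" Then
            On Error Resume Next
            tdf.Properties("SubdatasheetName") = "[None]"
            If Err.Number <> 0 Then             ' property does not exist yet
                Err.Clear
                Set prp = tdf.CreateProperty("SubdatasheetName", dbText, "[None]")
                tdf.Properties.Append prp
            End If
            On Error GoTo 0
        End If
    Next tdf
    Set db = Nothing
End Sub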
I set it up for testing by other people. It was at this time that a new issue came to light.
  • The Database Functions slowly under Access 2007.
  • The Database Functions even more slowly under Access 2010.
Testing by other users under Access 2003 produces similar results to my own testing.

Has anyone else experienced similar issues, and if so, do you have any suggestions or recommendations?

-- Rookie
 
It doesn't look like you implemented the first recommendation, the persistent connection? I've not had this problem myself, but I've heard that it will often resolve speed issues.
 

Pbaldy,

Thanks for the suggestion. It was my mistake not to say that I did that as well. I implemented the Persistent Connection by creating a Hidden Form bound to a Table in the Back End of the Database, but since it seemed to make no difference to the overall performance, I failed to mention it. The items that I listed were the ones that had the greatest effect in improving performance.
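For completeness, an alternative way to hold the Persistent Connection (a minimal sketch rather than what I actually used; the Back End path below is only a placeholder) is to keep a DAO.Database object open against the Back End for the life of the session:

Code:
Option Compare Database
Option Explicit

' Held open for the entire session so the lock file on the Back End
' is not created and destroyed for every query.
Private dbPersist As DAO.Database

Public Sub OpenPersistentConnection()
    ' Call from the startup form's Open event. The UNC path is a placeholder.
    If dbPersist Is Nothing Then
        Set dbPersist = DBEngine.OpenDatabase("\\Server\Share\AppData_BE.mdb", False, False)
    End If
End Sub

Public Sub ClosePersistentConnection()
    ' Call from the startup form's Close event.
    If Not dbPersist Is Nothing Then
        dbPersist.Close
        Set dbPersist = Nothing
    End If
End Sub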

-- Rookie
 
UPDATE to the issue:

During additional testing, some debug tracking was added to the VBA code, and it was determined that the code itself was running efficiently; it was the MS Access part (loading the Form) that was taking an unusually long amount of time.


As an example, when a Form was selected, the following occurred:
  1. It took up to 10 seconds to display the Form on the screen.
  2. It took less than 1 second to Fill the Form with up to 300 items.
  3. It took up to 5 more seconds before the user had input control.
Before the split, the entire process took about 3 seconds. If the Loading of the Data is not the problem, then Access is taking over 10 seconds longer to do its part.
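For reference, here is a minimal sketch of the kind of Timer-based event logging that can produce numbers like the ones above (my own sketch, not the actual tracking code); the output goes to the Immediate window:

Code:
Option Compare Database
Option Explicit

Private mSngStart As Single    ' set when the form starts opening

Private Sub Form_Open(Cancel As Integer)
    mSngStart = Timer
    Debug.Print "Open:     0.00s"
End Sub

Private Sub Form_Load()
    Debug.Print "Load:     " & Format(Timer - mSngStart, "0.00") & "s"
End Sub

Private Sub Form_Current()
    Debug.Print "Current:  " & Format(Timer - mSngStart, "0.00") & "s"
End Sub

Private Sub Form_Activate()
    Debug.Print "Activate: " & Format(Timer - mSngStart, "0.00") & "s"
End Sub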

Can anyone clarify what might be going on during the (up to) 15 seconds that Access appears to be involved in the delay, so I can have a better idea as to what I should be looking for?

-- Rookie
 
If forms are slow to open, I have found some benefit in setting the Record Source of the form in the form's On Open event. Do this for subforms as well.
Setting the Row Source of combo boxes and list boxes in the form's On Open event can also speed things up.
 
In my experience, some of the methods you describe - including users having their own front end - can help, but more often than not the only thing that has cured painfully slow Access 2007 installations has been updating the hardware. This can mean the individual's PC, the server, or both.
 

Thanks for the reminder about this. I had already created two Procedures and implemented them in the On Load and On Unload Events for the Forms, but had commented out the code since it appeared to make no difference. I restored the code (displayed below) and it still seems to make no difference.

-- Rookie

Code:
' LoadDropdowns binds the Form, its Subforms, and its Combo/List Boxes to the
' sources stored in their Tag properties; UnloadDropdowns clears them again
' so nothing is loaded before the Form is ready.
' The procedures are called from each Form's On Load and On Unload events:
'     Call LoadDropdowns(Me)
'     Call UnloadDropdowns(Me)

Public Sub LoadDropdowns(theForm As Form)
    Dim ctl As Control
    
    If Nz(theForm.Tag, "") <> "" Then
        theForm.RecordSource = theForm.Tag
    End If
    For Each ctl In theForm.Controls
        Select Case ctl.Properties("ControlType")
        Case acComboBox, acListBox
            If Nz(ctl.Tag, "") <> "" Then
                ctl.RowSource = ctl.Tag
            End If
        Case acSubform
            If Nz(ctl.Tag, "") <> "" Then
                ctl.Form.RecordSource = ctl.Form.Tag
            End If
        Case Else
            'do nothing
        End Select
    Next ctl
    
    Set ctl = Nothing
End Sub
 
Public Sub UnLoadDropdowns(theForm As Form)
    Dim ctl As Control
    If Nz(theForm.Tag, "") <> "" Then
        theForm.RecordSource = ""
    End If
    For Each ctl In theForm.Controls
        Select Case ctl.Properties("ControlType")
        Case acComboBox, acListBox
            If Nz(ctl.Tag, "") <> "" Then
                ctl.RowSource = ""
            End If
        Case acSubform
            If Nz(ctl.Tag, "") <> "" Then
                ctl.Form.RecordSource = ""
            End If
        Case Else
            'do nothing
        End Select
    Next ctl
    
    Set ctl = Nothing
End Sub
 
You didn't say whether the FEs are local or on the server. They should be local so that each user has a separate copy.

Is the database slow for everyone, all the time, or does the speed vary? One problem I've run into is that when using my laptop without a printer, Access slows to a crawl. So check to make sure that everyone has a default printer set up that is always accessible. If no real printer is accessible, try changing the default to the PDF printer. You can still use the print dialog to print to a real printer, but leaving the PDF as the default may speed things up.
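A quick way to see what Access thinks the default printer is (a minimal sketch; Application.Printer is available in Access 2002 and later) is to run this from the Immediate window:

Code:
Public Sub ShowDefaultPrinter()
    ' Prints the default printer, or a warning if none is configured.
    On Error GoTo NoPrinter
    Debug.Print "Default printer: " & Application.Printer.DeviceName
    Exit Sub
NoPrinter:
    Debug.Print "No default printer is configured on this workstation."
End Sub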
 

Pat,

Thanks for the reply. I have the standard setup for a Split Database.

  • Each user has their own copy of the Front End on their desktop.
  • The Back End is located on the Server, as close to the root as IT Security will allow.
  • I also have Bob Larson's FE Update Tool installed.
I will look at the printer issue tomorrow, but for today, I believe I have made a great deal of progress.

I determined that a big reason the Switchboard was taking so long to load some pages was the OnCurrent Event of the Form. Using the default Switchboard request to open a Form, I found that the OnCurrent Event occurred up to three times before the Form was opened. I was able to get it down to one (most of the time), and the load time was cut to about 30% of what it was.

Checking the OnCurrent Events for other Forms, I was able to eliminate several additional bottlenecks as well. I am still not completely satisfied, but it is getting much better.
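For anyone chasing the same issue, this is roughly how the repeated firings can be counted (a simplified sketch, not the production code):

Code:
Private Sub Form_Current()
    ' Counts how many times the Current event fires while this form is open.
    Static lngFires As Long
    lngFires = lngFires + 1
    Debug.Print Me.Name & ": Current has fired " & lngFires & " time(s)"
End Sub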

-- Rookie
 
Hi Rookie,

I develop and sell an Access 2010 solution for a niche market and have observed that as soon as we put a back end on a network, performance drops considerably compared to a purely local installation. I'd guess it averages about 3X - 10X faster locally than over a network. The problem with Access is that all the data must flow over the network to the front end on the user's PC to be processed, and then back again. I still have some users on 10 Mbps networks and it's horrifying. 100 Mbps networks usually work okay, and 1000 Mbps is, of course, ideal. That said, most users seem to get accustomed to their general network speed and I've never heard a complaint. I have to get online with them occasionally, and it's only then that I see what they're living with. I have a truly huge application developed over 20 years, and the cost to port it to SQL Server has been quoted in the range of $25k to $50k USD, which is way more than I, or my customers, want to spend.
 
Perhaps you could consider dumping the switchboard - and what exactly is it loading?

Forgive my ignorance, but I have never used a switchboard; I developed a Menu system instead.

It is also rare to need to access data without qualifying it with search criteria, which greatly reduces the traffic. Opening up the Clients with 30,000+ records and then finding a particular client seems slower than asking for the actual Client in the first instance when opening the Form.
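As a rough illustration of that point (the form and field names are made up), asking for the actual Client up front looks like this:

Code:
Public Sub OpenOneClient(lngClientID As Long)
    ' Opens the form already filtered to a single client instead of
    ' loading all 30,000+ records and then searching.
    DoCmd.OpenForm "frmClients", acNormal, , "ClientID = " & lngClientID
End Sub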

Simon
 
@cyberman - did the folks who quoted $25-$50 k give you any details about what they intended to do? My own apps are always SQL Server ready out of the box. That is because I build my forms to always use selection criteria to minimize the rows returned. I always use DAO (I never switched to ADO since I never had any problem with DAO) and to change the DAO code only requires adding an argument to the .OpenRecordset command when the tables use Autonumbers. The wizard upsizes the tables. It rarely takes me more than two tries to get the tables upsized without errors. I always clean up in Access, delete the SQL server database and run the upsize wizard again. So, bottom line is, the most time consuming part of the conversion will be changing any forms that currently are bound to naked tables or queries without criteria. Usually the easiest solution is to add a few intermediary forms that collect criteria and pass it to the form using the where argument.
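For anyone following along, the extra argument referred to above is dbSeeChanges, which SQL Server requires when a table has an identity (Autonumber) column. A minimal sketch with a hypothetical table name:

Code:
Public Sub OpenLinkedRecordset()
    Dim db As DAO.Database
    Dim rs As DAO.Recordset

    Set db = CurrentDb
    ' tblOrders is a hypothetical linked SQL Server table whose key is an
    ' identity (Autonumber) column, so dbSeeChanges is required.
    Set rs = db.OpenRecordset("tblOrders", dbOpenDynaset, dbSeeChanges)

    ' ... work with rs here ...

    rs.Close
    Set rs = Nothing
    Set db = Nothing
End Sub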
 

I have to agree with Pat that the effort is as straightforward as she suggests. I have had similar success with MS Access to SQL Server conversions. The one wildcard is the interpretation of what "a truly huge application" consists of. It has been my experience that one programmer's "truly huge" can be the same as another programmer's "pretty big", while yet another programmer might consider the same database "above average".

Out of sheer curiosity, what does your Database look like in terms of Tables, Queries, VBA Code, Macros, and Other Resources?

-- Rookie
 
Hi and thanks for taking the time to answer and ask questions. In terms of the sheer size of this beast, I keep track of things on the about page:

Currently:

No. of Forms/Subforms: 521
No. of Reports/Subreports: 334
No. of Queries: 3,484
No. of Tables: 414 (many local, perhaps 90 needed on the server)
No. of Lines of Code: 172,000 (thanks to Allen Browne's "countlines" function: http://allenbrowne.com/vba-CountLines.html)

All forms and reports are based on queries, so that isn't the issue. I think the big issue is that I have no experience with SQL Server and need it handed to me on a silver platter. The quote also included making the system "multi-customer", in the sense that every table would have to carry an additional column to differentiate each customer's data from other customers' data. My thinking is that it would end up being a web-based port of my current system with multiple customers logging in. If that were taken out of the equation, the cost would be much less (and it is still an option). The range of cost depends on how much optimization would have to be done. The thing really does consume CPU cycles with all the calculations it does. It's not just easy input and output such as a bank may need.
 
You're correct, that is a pretty large Access application.
Is the schema properly normalized?

Once you upsize the BE (all tables to the server), you can create a web-based FE for certain aspects. You probably don't want to open everything to the world, so a hybrid solution is probably more appropriate.
 
Thanks Pat, and you are correct, very insightful. Not only is this too large, I have two competitors, both cleaning chemical manufacturers, who tried to put an app like this on the web - both spent a ton of money, and both failed. One is trying again, and the other now suggests their software be hosted locally, on the customer's server.

So, yes, I think this would be something like trying to put AutoCAD on the web; it's just too complex for the web tools available. Even mighty Google really can't compete with Microsoft in either Word or Excel if one needs the full power of the desktop application. I expect that will change in the years to come, but I just don't see it as feasible now.

The tables are normalized. The only design rule I broke years ago is that I store some calculated values in tables. It takes about 30 seconds on a fast desktop to run the main query sequence, so I store the values (needed for reports and on-screen display) and have triggers set up so the app "knows" when to do the heavy lifting. 15 years ago, given the speed of hardware, I had to split the space inventory (buildings, floors and rooms) into up to 10 separate databases in order to handle the speed issue. Then I got the idea of storing the calculated values, and some reports that used to take 10 minutes or more began appearing in seconds.

Hybrid solution: it's interesting you mention that; I've been thinking I really only need some aspects of the software, primarily communications, on the web. For example, when one wants to send a service order out, it would be nice to generate it on the desktop and then provide for someone in the field to log in and get the information. Each customer currently does have a unique "identifier" in the form of the license key generated by the internet licensing system I "rent" from Sagekey.com; otherwise, I suppose that could be generated easily enough. So, if I had a small SQL Server setup on the web, and someone to code it, I could get there in short order. The only drivers to port the whole BE are the questions I get from IT people who prefer the back-ups and, of course, some speed issues in a network environment.

From a business-model perspective, I understand the drivers to put applications on the web - but only if you come up with applications thousands of people want (and are willing to pay subscriptions for). I have only a few aspects of my application that could possibly be utilized to make some (reasonably) big bucks. One, probably by far the biggest, is the need for inspections and the transfer of inspection data; the other is 2-way communications, including complaints, requests, etc. The only thing holding this back is my lack of skills. But, hopefully, I'll either find the time to learn, find a partner, or find an Angel.

My motto should be: "what I lack in skills, I make up for in sheer determination", so, we'll see.
 
The inmates are running the asylum. That is why I left traditional IT years ago. Everyone is obsessed with using the latest and greatest tool and they give no regard to appropriateness. IMNSHO - web forms leave a lot to be desired. They are very much like the CICS applications I was writing 40 years ago except they are prettier. There are obviously uses for web apps but not every application benefits from a conversion and many are significantly less user friendly and lose features in the conversion.

PS - just using queries as the form RecordSources isn't good enough. In order to take advantage of SQL Server (or Oracle, DB2, etc), you must use criteria to restrict the rows returned. If your app works that way now, it should take less than a week to convert to SQL Server and if your table row counts are large, you will see a performance improvement. Jet and ACE are frequently faster than SQL Server, especially for a local database. But once your row counts get higher than a few hundred thousand, SQL Server will frequently be faster because if your queries are written correctly, they will pull significantly less data over the "wire".

I have an application that I support which is sold to large corporations. It will run with ACE or SQL Server as the BE. I include a form that allows swapping one BE for another and you can do the same thing to give your clients the option.
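A minimal sketch of the relinking idea behind such a form (my own illustration rather than the actual form; the connect strings are placeholders):

Code:
Public Sub RelinkTables(strNewConnect As String)
    ' Repoints every linked table at a different back end. Pass either an
    ' ACE/Jet file path (e.g. "C:\Data\App_BE.accdb") or an ODBC connect
    ' string (e.g. "ODBC;DRIVER={SQL Server};SERVER=MyServer;DATABASE=AppDB;Trusted_Connection=Yes").
    Dim db As DAO.Database
    Dim tdf As DAO.TableDef

    Set db = CurrentDb
    For Each tdf In db.TableDefs
        If Len(tdf.Connect) > 0 Then              ' linked tables only
            If Left$(strNewConnect, 5) = "ODBC;" Then
                tdf.Connect = strNewConnect
            Else
                tdf.Connect = ";DATABASE=" & strNewConnect
            End If
            tdf.RefreshLink
        End If
    Next tdf
    Set db = Nothing
End Sub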
 
Thanks Pat, I can't comment on IT folks since my degree is in Mechanical Engineering, but I don't doubt you. In my profession, there seemed to be various levels of expertise, with the docs (professors and super-experts) followed by the guru class (capable of solving the messes of others), followed by the vast majority of "others" with skills ranging from very competent to not competent. There is a need for all levels except the "not competent". Given the pace of change in IT, it is very probable that few really know what's going on.

I understand the attraction of the web from a business standpoint. I have a very mature product that takes some training and smarts to operate and costs from $1500 to $9000 or so. And, believe me, sales are too sparse to drive development at the pace I'm going. In comparison, a couple of ex-building-service contractors put up a site to help small contractors bid on jobs, and they charge $30 a month. I'd wager they have a good 1000 users or so, and that's some serious cash flow.

Please tell me, do you offer consulting services (on a paid basis)? I can use some experienced advice from time to time.
 
...do you offer consulting services (on a paid basis)? I can use some experienced advice from time to time.

You can always come back here and ask the lot of us. There is nearly always someone around who will be willing to respond to your Questions.

:)
 
