Access Code Slows way down Second time it is run

Daveyk01


Hey there,

I have one form with a rather large amount of code behind it collecting data from an electronic instrument (via serial commands to that instrument and via a GPIB interface to some test equipment). Anyway...

The first time I run this test, it runs fine. The second time it is a little slower, and the third time much slower.

I closed the form between runs. No change.

If I exit Access 2000 and re-load the database, that form is very fast again.

When the program is slowed down by multiple runs, every test form after that one is slow too. If I only run the test forms once between instruments and then exit and re-load Access, there are no perceived slowdowns and my routines all run fast.

Does Access 2000 have a memory leak? The form in question does save data in arrays, and when it's time, the data from the arrays is saved to tables and the form closes.

When the slowdown occurs, it makes the remaining three test forms take maybe 20 minutes to complete their tests, whereas if they run at full speed the first time through, they take 5 minutes.

It's like I used up the memory and, instead of releasing it, I just keep taking more.

The form code that I think is causing this error does have variables that are dimensioned to be used by all subs and functions within that module:

Option Compare Database
' Module-level arrays and scalars, shared by every sub and function in this module:
Dim Results(6, 42) As Single, CourseResults(6, 9) As Single
Dim GainCntlStartGain As Single
Dim NoiseStartGain(7) As Single, NoiseExtAtt(7) As Integer, NoiseGenV(7) As Single, NoiseMeasNoise(7) As Integer, NoisePass(7) As Boolean
Dim FilterPassed(7) As Boolean, GainControlMaxDeviation(7) As Single, GainControlCumulativeError(7, 4) As Single
'
'===================================================================================
Private Function CGP75(Freq As String) As Boolean

I am using a lot of variable arrays dimensioned as Single, but I really didn't think they were that much. When the form closes, they would all be wiped from memory anyway, wouldn't they?
 
I've read several times that there definitely is a memory leak in Access. However, the context in which I read this has always been in designing forms, i.e. going in and out of design view of a form can really eat it up... you should close and open Access once or twice a day if ... etc. I don't know if this is related, exactly.

Just in case, from Help:
"A module-level variable differs from a static variable. In a standard module or a class module, it retains its value until you stop running your code. In a class module, it retains its value as long as an instance of the class exists. Module-level variables consume memory resources until you reset their values, so use them only when necessary."
 
Running lots of queries in association with forms opening and closing burns up the memory.

In my telemarketing DB, if the telemarketer elects to have his call results (measured against objectives) updated after each call, then the DB has slowed way down after about 60 records. At the start, when he clicks Next Record, it takes about 1 second to get to the next record. Once he has done about 60 calls, it is taking about 3 seconds to get to the next record.
 
Davey,

That is definitely NOT a large amount of memory that your arrays are demanding!

You could be doing something weird in your code that's eating resources,
but that's hard to tell without looking at your code.

I'd think that the problem really has something to do with communications
with your external devices.

Have you run your code in debug mode?
Where's the code spending all of its time?

Can you stress test the device(s) to see if repeated use over a connection
makes THEM slow down?

Have your code log its activity with a date/time stamp at key points. That
should let you see if the whole process is "slower" or just a specific point.
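Something as simple as this sketch would do (the file path and procedure name are just examples):

Sub LogStep(stepName As String)
    Dim f As Integer
    f = FreeFile
    Open "C:\TestLog.txt" For Append As #f
    Print #f, Format(Now, "yyyy-mm-dd hh:nn:ss") & "  " & stepName
    Close #f
End Sub

' Call it at key points, e.g.:
'   LogStep "Start CGP75"
'   LogStep "Serial read complete"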

Need more info.

Wayne
 
One function slowing down? I don't think so; anything I do with the program after that data collection page is sloowwwww, anything. The whole system slows to a creep until I exit the database and restart it.

I have been going through the code, making sure that I close all recordsets and database objects after opening them.

I seem to think this occurred after adding the last few form/module-wide dimensioned variables. As you said, it is not that much. I could re-work the subs and functions to pass those variables among themselves without dimensioning them for the whole form to use.

Once that test form closes and the next form loads, any variables dimensioned in those forms, either at the sub/function or at the module level, should flush out of the system, should they not?

This is starting to frustrate me. I always seem to be running this form 15 minutes before it's time to go home, and then it turns into a slug. I think I am going to print out every line of code (need a wide-carriage line printer, lol) and go through it line by line by line...
 
Davey,

The memory usage (as stated earlier) is definitely not large at all.

Do your devices come with some support DLLs that you're calling?

Maybe invoke Task Manager and look for some really weird memory/CPU
requirements as the software runs.

It'd be nice to see it first hand, but that's not really possible.

"Typical" Access apps, don't exhibit this kind of behaviour, although there's
really no definition of "Typical". I'd still start the investigation with the
performance of your third-party devices/software.

Do they have a support site?

Wayne
 
I monitor the Performance tab of the Task Manager. Available memory and the other readings were not drastically affected.

In my code, I read a "signal height" from our instrument via the serial port. In high-gain situations, I have that routine distinguish the signal from the noise. Now, I do not know how this happened, but it looks like I was comparing a Variant variable to a variable dimensioned as Single in numerous places in that routine. I have corrected that, and now all the variables that are compared are dimensioned as Single. That seems to have helped tremendously.
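For anyone curious, here is a minimal sketch of the kind of mismatch I mean (the names are made up):

Dim SignalHeight                 ' no type given, so this is a Variant
Dim Threshold As Single

Sub CompareReadings()
    Threshold = 1.5
    SignalHeight = 1.75
    If SignalHeight > Threshold Then    ' the Variant is coerced on every comparison
        Debug.Print "Signal above noise"
    End If
End Sub

' Declaring SignalHeight As Single avoids the coercion entirely.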

Now here's the kicker: that code has been like that for about a year. That was one of the first complex routines that I "perfected" last year. Why is it only now causing problems? The program code has grown tremendously since then, and a lot in the last few weeks. Could I have hit a precipice in code size that has sent me over a performance edge?

If this has truly been the fix, I am baffled. I need to run this over and over again in the next day or so.

Dave
 
Davey,

Without seeing it, or knowing the sample rate or duration, all I can do is
conjecture from this vantage point.

Hope I've helped so far,
Wayne
 
My question is this: since you have already told us you build large arrays, are you releasing them between runs of this collection form? It might be instructive to have the Task Manager showing the tasks and watch the size of Access. I'm betting it grows each time you run this form. I would be interested in knowing whether the Commit Charge numbers change each time you get this form going. The Commit Charge represents committed memory in the page file (virtual space). The Task Manager's task panel will show you virtual memory size.

I'm going to assume that you do close things, because you said you were busy checking that everything you opened was closed. If you dissolve the arrays, you are fragmenting virtual memory. The next time you do that and try to release something, that memory fragmentation will eat your socks, because to release something in a badly fragmented memory area, you have to step through a series of released data structures to find the place to put the next thing you are releasing. If I'm right, your problem isn't a memory leak per se - it is garbage collection. The solution is to plan to recycle.

That big set of arrays you mentioned... is there a chance that you create them big enough that you could simply re-use them for each run? I.e., instead of releasing them, keep them around but leave behind a marker to show where you stopped writing. That means you have to change the arrays to be bound "permanently" for the life of the Access session. It also means you have to make them big enough to hold the biggest data set you are likely to see. Then use the marker to show how much data you actually took. (This is called "high watermarking" as a method of managing over-sized arrays.)
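A rough sketch of the idea, with illustrative names and sizes only:

Dim Results(6, 255) As Single    ' sized once, for the largest run you expect
Dim RowsUsed As Integer          ' the "high watermark": rows actually written

Sub StartRun()
    RowsUsed = 0                 ' recycle the array instead of releasing it
End Sub

Sub RecordRow(a As Single, b As Single)
    Results(0, RowsUsed) = a
    Results(1, RowsUsed) = b
    RowsUsed = RowsUsed + 1
End Sub

' Later, only columns 0 to RowsUsed - 1 get saved to the table.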

I would not take this approach if your runs don't reveal a growing Commit Charge and a growing virtual memory size in the relevant Task Manager panels. But if each iteration consumes more virtual memory and commits more swap space, then I might consider NOT releasing that memory array. Which means you have to manage it from a general module, not a class module.
 
Thanks for the detailed reply. Unfortunately, I am packing up this morning for a quick service run to a customer, and will re-read your reply when I get a chance.

I did capture Task Manager screen shots while the program was running. I do not understand a lot of it, but I have uploaded them here.

"If you dissolve the arrays, you are fragmenting virtual memory. "

How do I dissolve them? When the form closes, does not everything behind the form, VBA code and all, go "poof" (for lack of a technical phrase <g>)? Those variables were declared at the form module level so that all subs and functions behind that form would have access to that data without passing it along via the function call.

If everything goes "poof" as I was thinking, then why is it important to close out recordsets and database objects? That blows a hole in my "poof" theory, does it not? 'Cause I know it is very important to close both out and set them to Nothing.

I think I am very good at figuring out the puzzles of code and how to analyse collected data, etc., but the advanced topics of memory management I haven't learned yet.

"The solution is to plan to recycle."

I could make the arrays global (I always thought that was a bad thing to do). The data can be over-written in them if the routine is run a second time. After the routine runs its 20-minute test, a recordset is opened, a new record added, and the arrays committed to the database; then the form closes, opens the next form automatically, and a different test proceeds from there.

Well, that took longer than I thought. I must run now, but will check back tonight. Thanks very much for your help and explanations.
 


It is not clear from what you posted. A better thing to see would have been the Processes tab with a "before first run", "after first run", and "after second run" - with the Access process showing, where we can see the virtual memory size.

To answer the question... there are two ways to clean memory: wholesale and retail. Wholesale means you delete the entire process in which that memory resides. But that is not what happens when you take down the form but leave Access itself open.

When you allocate a chunk of memory, you get a chunk header to go along with the chunk itself. I.e., it isn't an "amorphous" chunk; there is a structural component that says what you allocated. When you close a structure, you deallocate that component. But if you close multiple structures in a different order than you opened them, you run into the headache that Access (or in fact MOST systems, including *NIX and OpenVMS) cannot merge two chunks together very well if there are little chunks still allocated in the middle of the memory heap. So if you do a lot of allocation and deallocation, you end up with fragged memory: a patchwork of released and still-active chunks.

I've got another question. By any chance do your big arrays include strings or variants? See, that will EAT YOUR SOCKS every time, because the string paradigm is that a string variable is a pointer to the real string somewhere in the heap. If you do a lot of string modifications in your processing, you do the same sort of patchworking to the heap.

If it is at all possible to define a data structure with a fixed-size string and use LSET to do the string work, you do not thrash the string. Then the only catch is to ensure that the fixed-size strings are long enough for your "real" longest string. Unlike an ordinary string assignment, which destroys the old string in the heap and allocates a new one, an LSET overlays the existing string without touching the heap layout again.
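For example (the sizes and names here are only illustrative):

Dim dbError(10, 16) As String * 8      ' fixed-length: the storage lives inside the array itself

Sub StoreResult(i As Integer, j As Integer, txt As String)
    LSet dbError(i, j) = txt           ' overlays in place, padding with spaces; no heap churn
End Sub

' An ordinary assignment to a variable-length string would instead release the
' old heap string and allocate a new one, patchworking the heap over time.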

I once used a variant of this LSET method to fix a program with a slow-down problem. It was on a VAX, but the concept was the same because the string paradigm was the same. (Yes, it was "ANSI standard" BASIC on the VAX.) Before applying LSET, the program took 22 minutes 15 seconds to run. After changing all the direct assignments to LSET, it took 17 seconds.

Again, this is a case where recycling rather than discard/remake was the preferred solution because of heap churning. I suspect that is what is happening, though I can't tell whether it is the big arrays themselves or something associated with them. But if you have arrays of strings that get "edited" during processing, that's where I'd put my money and attention.
 
Going through code now... I found one place so far, in a different procedure that runs three or four forms before the one in question. It was:
Dim dBerror(10, 16) As String

I changed it to: Dim dBerror(10, 16) As String * 8 <- that is probably one byte more than needed, but I trim it during the save process.

I tried to look LSET up in Help and could find nothing in the index or search. I typed it in the routine, like LSET dbError(1, X), and then Help found it just fine.

Why that variable is a string is a mystery to me, and I am the one that created it. I could have made it a Single and then added & "dB" in the procedure report print-out. I guess I just got lazy; I am recording stuff like "2.38dB" in it. Duh. I would change it over to a Single variable now, but don't want to have to ripple-change a bunch of stuff (at least until I am done with the search and re-working the 27-page report in other areas).

Thanks very much for the information and advice.
 
Is this the kind of screen shot you wanted to see? It is not when the massive TestGen form and routines are running (I cannot do that here at home); however, a CPU-intensive test is running.

I found numerous poorly dimensioned string arrays and have converted them all to fixed-length string arrays (took about an hour). There were two string arrays in a public function that was called quite often by most forms. I changed them to fixed length and use LSET everywhere to assign string arrays.

So far, from what I can test here at home:
A Phased Array Linearity test routine used to take 10 seconds per beam (64 beams); it now takes 6 seconds per beam. That is a savings of about 256 seconds.

Another ASTM PA Linearity test is now blazing. It was pretty fast before, but now much better.

I cannot test the EN12668-1 Generator Test page that was slowing Access down until maybe tomorrow, or probably Friday.

I have high hopes. I remember back in the early '90s with VB for DOS always dimensioning string arrays as fixed length for memory consumption problems, and having a white paper from Microsoft on variable memory usage. I guess I forgot about all that, or have gotten very lazy with VB6 and VBA, since this is the first such problem that has hit me. These more modern machines have a lot more memory and better processors. I don't know how much of that memory Access is capable of using. If someone develops a memory-hog VBA application on a computer with a gig of RAM, could it have problems on a puter with only 512 megs of RAM? (Only... lol, I remember buying a $700 RAM upgrade for my OSI Superboard/C1P to take it to 16K RAM. I think before it had 4K?)
 


As to the question of whether a memory hog running on a 1 GB machine will run on half a gig: damfino! The best solution is to try it, but be prepared to think about array access order. Look out for having a bunch of apparently parallel arrays that you traverse. Here's the hint: in memory, they are parallel and LINEAR AS A GROUP. Picture a train on a track. In memory (as the track), the cars of the train are stretched out. NOW have several trains but a single track (that single track being the imaginary line representing low memory to high memory). If you have parallel arrays, that is what you really have.

BUT if you could make one array of a structure that contains one of each element of the parallel arrays, you have everything you need at hand for a single item. I know that might not work so well, but if it is possible, it would be good should you start paging or swapping to any degree.
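Something along these lines, in a standard module (the field names are only an illustration):

Private Type BeamReading
    Gain As Single
    Noise As Single
    Passed As Boolean
End Type

Dim Beams(63) As BeamReading    ' one array; each element keeps its fields together in memory

Sub Example()
    Beams(0).Gain = 2.38        ' everything for beam 0 sits in one place
    Beams(0).Passed = True
End Sub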

Here's the catch: if you really only use 68 MB of RAM as shown in your display, you don't need to do this. But if you get into the hundreds of MB, then my suggestion makes sense. Hold it on the back burner until you decide you need it.

See, if you have enough memory, you don't swap anyway. But if you start needing memory, you start swapping, and that is when your machine will go nuts. Here's a rule based on the priorities of your swapper process: when a process swaps, processing stops. This is because of the really high priority assigned to memory management functions. So watch your statistics for swapping. If it starts, you are going to be using a lot of virtual memory, and that might not be so good.

As to how much memory Access can use? All of it, up to about 2 GB. Trust me, though, you won't like the performance of a system using memory at that level. Given that memories are probably still at 833 MHz, it would take... 2.4 seconds, I think... for one scan of memory.
 
Finally got back to using the program.

That routine is slowing down to a crawl again. Here is the memory usage. I need to investigate some more line by line.
 


Printed out the code of the module on 8.5x11 paper in landscape mode. Took 150 pages. I am about three-quarters of the way through the code. So far I have seen no mistakes; however, I see places where I passed the Single-dimensioned array Results() to local functions.

Results was dimensioned at the very beginning of the code module so that all functions could use it. I would hope passing it as an argument to a function doesn't create another instance of it, gobbling memory.
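If I understand it right, VBA always passes arrays by reference, so a sketch like this should not duplicate the array:

Dim Results(6, 42) As Single          ' the module-level array

Private Function Analyze(arr() As Single) As Single
    arr(0, 0) = 1.23                  ' writes through to the caller's array: ByRef, no copy
    Analyze = arr(0, 0)
End Function

Sub Test()
    Debug.Print Analyze(Results)      ' passes a reference to Results, not a duplicate
End Sub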

This routine has grown over a year. At one point, Results was not dimensioned at the beginning of the module, so I passed it along the command line of the function. I guess when I made it "global" (to that module only) I didn't update a number of functions.

I couldn't tell you if that is a problem or not. All I know is that after running this code module, Access's operations are very slow, so I exit Access, re-start it, and continue my testing at the next test form. It then runs at full speed again for the remainder of the tests.

Dave
 
Re: Access Code Slows way down Second time it is run

Okay Doc Man, here is another clue...

Looking at memory usage: when my first form loads, it is 20,000K. Then I load an adjustment form, and it goes up to 27,000K. When I exit and close that form, it only drops back down to 25,000K.

I would bet each form chews up some more memory, and then it is not all released?

What poor practices could I be doing that could cause that?
 
This slow-down is still an issue, even using an MDE version of the program. I am wondering about the Access 2000 Runtime but can't seem to find that on Microsoft's site anymore.

After the first three pages of test routines, the program starts operating sloowwww. Exit Access, restart the program, start at page four, and it is up to full speed again.

The amount of memory usage I see in the Task Manager does seem to be related to this issue. Running the MDE version, after page three, 32,xxxK is in use. Running the straight MDB version, 45,xxxK is in use. CPU usage is consistently rather low; Access just starts slowing up.

This is still frustrating.
 
I am wondering about the Access 2000 Runtime but can't seem to find that on Microsoft's site anymore.
It never was on Microsoft's site. The runtime for every version up UNTIL 2007 was something you had to purchase by buying the Developer Version of Access (and it was a pretty penny at that). You might be able to find someone with a copy you can purchase on eBay or something, but you need to be careful, as there is a lot of counterfeit merchandise there as well and you can get burned.
 
Just getting in on the tail end here, but a couple of observations:
I'm curious why you're not using LabView instead of VBA to interface to the machine. LabView is made to handle large amounts of data from a GPIB.

But, like you, I opted to use VB instead of LabView because we didn't want to purchase the LabView/G object for writing to Oracle. And I learned quite a bit.

In my experience, I found that our code could become re-entrant, and that could cause some problems with bloating after several runs. We also had some problems with fragmenting memory (similar to fragmenting a disk drive), as already alluded to, and I believe it was directly caused by the re-entrant code.

In order to fix the re-entrant problem, you can create a static variable at the front of each subroutine/function that won't allow you to be "inside" any sub/function more than once at a time (you have to code this; it is not automatic). With the information gained from disallowing re-entrant code, you can more easily troubleshoot where in the code you are having problems; see the sketch below.
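Something like this sketch (the procedure name is just an example):

Sub CollectData()
    Static busy As Boolean
    If busy Then
        Debug.Print "CollectData was re-entered!"   ' log it so you can troubleshoot
        Exit Sub
    End If
    busy = True
    ' ... the real work goes here ...
    busy = False
End Sub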

Of course, recursive code can cause similar problems but that is not what I'm (necessarily) talking about.

This is in addition to the already great advice you've received from Doc, wazz, and Wayne.
 
