Run Module within another database?

cyborgoddess

I have a module that maxes out my database about 2/3 of the way through running it. I then have to compact and try to pick up where I left off. The total time is close to 5 hours. I thought if I broke the module up across different databases and then linked them back to one shell, I'd stop maxing out the database. That works fine, except for one problem: I am now referencing a module in an external database that uses the "create table" command, and that table is getting created back in the primary database, defeating the purpose entirely.

Is there a way to call a module and have it run within the second database and not within the primary?
 
I don't think it's the module that's "maxing out" your db, but what it performs.

First, what do you mean by "maxing out"? Is it increasing over 1GB (a97) or 2GB (a2k+)?

If that's the case, your module is perhaps creating the bloat, but it's probably not the code itself that's bloating the file; it's more likely the creation of tables, temporary objects, recordsets, etc. Could it be an option to create whatever tables you create in a linked database?
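For instance, a make-table query can drop its output straight into another file. A rough sketch, where the path and all the names are just placeholders:

' Rough sketch: aim the make-table at a side database instead of
' the primary one. Path, table and query names are placeholders.
Sub MakeTableElsewhere()
    CurrentDb.Execute _
        "SELECT * INTO tmpReturns IN 'C:\Data\Scratch.mdb' " & _
        "FROM qryMonthlyReturns", dbFailOnError
End Sub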

To the question of running code in another database: it depends on how you open it, I think. If you Shell it, it will probably be its own instance (and outside your control), but with linking, I think you're either driving a second instance of Access from the first, or running both within the same instance of Access. So memory-wise it will probably perform worse than having all the code within one database, and I don't think it'll do anything "bloat-wise" either.
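A third option I believe works is Automation: have the primary database start a second Access instance and run a procedure that lives in the other file. A sketch, where the path and procedure name are placeholders:

' Sketch: run a procedure inside another database via Automation.
' The path and procedure name below are placeholders.
Sub RunInOtherDb()
    Dim appAcc As Object   ' Access.Application
    Set appAcc = CreateObject("Access.Application")
    appAcc.OpenCurrentDatabase "C:\Data\Satellite.mdb"
    appAcc.Run "BuildMonthlyReturns"   ' a public Sub in the other db
    appAcc.Quit
    Set appAcc = Nothing
End Sub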
 
Hi Roy, thanks for the quick answer. I mistyped; I do know what is creating the bloat. I've never found a fantastic, simple way to create five-year monthly returns for a universe of 15,000 funds. So what I do is create one month's return for 1/10th of the database, paste it into a new table, calculate a new return off that, append it, clean out the duplicates, then repeat the process 59 more times per step, for 10 more steps. Then do it all again for the 3-year numbers. The database can make it through those calculations, but then shuts down before the next round of commands.

Unless I am mistaken, there is no way to compact the database mid-module, which would instantly solve my problem.
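(One thought I'm chewing on: from what I've read, code can't compact the database it is running in, but DBEngine.CompactDatabase can compact a different, closed mdb mid-module. A rough sketch, with made-up paths:)

' Rough sketch: compact a *different* mdb mid-module. The side db
' must be closed (no open connections to it). Paths are made up.
Sub CompactSideDb()
    Const SRC As String = "C:\Data\Scratch.mdb"
    Const TMP As String = "C:\Data\Scratch_tmp.mdb"
    On Error Resume Next
    Kill TMP                    ' clear any leftover temp file
    On Error GoTo 0
    DBEngine.CompactDatabase SRC, TMP
    Kill SRC                    ' swap the compacted copy in
    Name TMP As SRC
End Sub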

I've now got almost everything running through linked tables, except when it comes to the above-mentioned procedure. I've set it up to be linked, but whenever it comes to the rename-table command, it turns the linked table into a local table and ends up returning no values at all.

My first goal would be to find a way to compact that database midstream; after that, it would be to send a command (Shell, as you suggested) to run the monthly return pieces (I've split the 5 & 3 year calculations into their own databases), then link them back into the main database and continue along.

Actually, my first goal would be to find a way to write monthly return code that didn't involve seven tables, seven lines of SQL code, and the inevitable bloat that comes with deleting multiple entries from the same table over and over again, but I've been asking for two years and haven't found the answer to that one yet.

I'll do a search on Shell and if I can't find help, I'll be back. Thanks again!

Cheers,
Amy
 
One way to do this is to set up a batch file in DOS and have the batch file open the Access database several times in a row. If you set the option in Access to compact on close, the database will open and close through the batch file and compact between separate runs. The macro would be named AutoExec, and you would set its conditions so that it runs a different macro each time through. For example, you could make a table with a field that gets incremented on each pass and reset to 0 on the last step. The conditions in the AutoExec macro would then run the right macro based on that value.
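The counter function could look roughly like this, called via RunCode at the top of AutoExec; tblStep and StepNo are made-up names:

' Sketch of the counter idea: each pass through the batch file,
' AutoExec calls this and its conditions branch on the result.
' Table and field names (tblStep / StepNo) are made up.
Public Function NextStep() As Long
    Dim db As DAO.Database
    Set db = CurrentDb
    NextStep = Nz(DLookup("StepNo", "tblStep"), 0)
    If NextStep >= 10 Then      ' last step: reset for the next run
        db.Execute "UPDATE tblStep SET StepNo = 0", dbFailOnError
    Else
        db.Execute "UPDATE tblStep SET StepNo = StepNo + 1", dbFailOnError
    End If
End Function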

Also, there are ways in VBA to delete the temp tables, which takes care of part of what compacting reclaims. Running this code in the middle of the macro is possible if you create it as a function in a module and then run that function through "RunCode" in the macro.
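A sketch of such a RunCode-able function, assuming the temp tables share a "tmp" prefix (my assumption):

' Sketch: drop all temp tables from the current database. Assumes
' temp tables are named with a "tmp" prefix (an assumption).
Public Function DropTempTables() As Boolean
    Dim db As DAO.Database
    Dim i As Integer
    Set db = CurrentDb
    ' Loop backwards, since deleting shrinks the collection:
    For i = db.TableDefs.Count - 1 To 0 Step -1
        If db.TableDefs(i).Name Like "tmp*" Then
            db.TableDefs.Delete db.TableDefs(i).Name
        End If
    Next i
    DropTempTables = True
End Function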

Here's another idea:

Export the data to a text file and then link to it. Also, use links wherever possible in the database so that space stays available for the make-table results you are working with.
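In VBA that's just a couple of TransferText calls; a sketch, with the import/export spec left at the default and the names and paths made up:

' Sketch: export a table to text, then link to the text file.
' Table names and the path are placeholders.
Sub ExportAndLink()
    DoCmd.TransferText acExportDelim, , "tblReturns", "C:\Data\Returns.txt", True
    DoCmd.TransferText acLinkDelim, , "lnkReturns", "C:\Data\Returns.txt", True
End Sub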

I know your pain. I have several databases that size out every month and must be compacted. Have you ever thought of running the data in SQL Server and storing it there so it can be linked to from Access? SQL Server for a single PC is not that pricey!

Good luck!
 
If all you want is to show the data in different ways, what is wrong with having a plain select query instead of making a heap of tables?
 
To expand on that thought: Access treats queries as if they were tables for most purposes. You may be creating temporary tables (and causing needless bloat) for no reason. If a process takes a series of steps, try nesting queries instead of creating temporary tables. Also, there is no reason to limit yourself to a single back end; you can use several back-end databases to spread the bloat around.

When I had an application that required daily uploads of data from the production system ("they" wouldn't let me link to the tables; "they" insisted on doing extracts to keep the production data "safe"), I solved the bloat problem by creating a "table template" back end. The BE contained only empty tables defined to hold the daily upload. Each day, the upload process would copy a fresh copy of the table-template db from a protected directory, essentially blowing away yesterday's full database by overwriting it. Then the process would run the TransferText method to append the .csv files to the empty tables. Since data wasn't constantly being added to and deleted from the BE, no bloat occurred. The BE was only as big as it needed to be to hold a single copy of each table.
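The core of the daily process was only a couple of lines; roughly this, with paths and names made up:

' Sketch of the "table template" trick: overwrite yesterday's back
' end with a fresh empty copy, then append the day's extract.
' All paths and names are placeholders.
Sub RefreshBackEnd()
    FileCopy "C:\Protected\Template.mdb", "C:\Data\Daily.mdb"
    ' tblUpload is a table linked to Daily.mdb, so the rows land
    ' in the back end, not the front end:
    DoCmd.TransferText acImportDelim, , "tblUpload", "C:\Data\upload.csv", True
End Sub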
 
Great ideas - still no solution

Thanks guys, these are great ideas. I'm going to try to work my way through them and see what sticks.

I can't do this as a series of select queries because I need to keep a table with the running monthly return values. What drives me the most crazy is that I can't be the only one out there trying to calculate monthly returns, shouldn't this be easier?
 
Although calculating running totals in queries is possible, it REQUIRES a unique ID and is a very inefficient process. Calculating running totals is simple and efficient when done in a report. Why not rethink what you're doing and put off calculating the running totals until print time?
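For illustration, the query version usually looks something like this sketch (table and field names are assumptions); DSum re-runs for every row, which is why it crawls on big tables, whereas a report just needs a text box's RunningSum property set:

' Sketch of the query-based running total (slow: DSum re-runs
' per row). tblReturns, ID and Ret are assumed names.
Sub MakeRunningTotalQuery()
    CurrentDb.CreateQueryDef "qryRunningTotal", _
        "SELECT ID, Ret, " & _
        "DSum('Ret','tblReturns','ID <= ' & [ID]) AS RunTot " & _
        "FROM tblReturns ORDER BY ID;"
End Sub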

Storing calculated values violates basic normalization principles of good database design. Do some reading on database design and normalization.
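As one example of computing instead of storing: a rolling 36-month return can, at least in principle, be built on demand by compounding in SQL. A sketch, assuming a tblReturns(FundID, RetDate, Ret) layout that is purely my guess:

' Sketch: compound a 36-month window on demand instead of storing
' it. Assumes tblReturns(FundID, RetDate, Ret) with Ret a decimal
' monthly return -- all names and structure are assumptions.
' (Log fails on a return of -100% or worse.)
Sub MakeRollingReturnQuery()
    CurrentDb.CreateQueryDef "qryRolling36", _
        "SELECT FundID, Exp(Sum(Log(1 + Ret))) - 1 AS Ret36 " & _
        "FROM tblReturns " & _
        "WHERE RetDate > DateAdd('m', -36, Date()) " & _
        "GROUP BY FundID HAVING Count(*) = 36;"
End Sub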
 
Pat Hartman said:
Although calculating running totals in queries is possible, it REQUIRES a unique ID and is a very inefficient process. Calculating running totals is simple and efficient when done in a report. Why not rethink what you're doing and put off calculating the running totals until print time?
This could also be done in a form, having the calculations in a text box, couldn't it?
 
The data is never delivered in a printed report - the numbers are the basis for a whole host of other calculations as well as to plot graphs. Having the numbers in a table is crucial.
 
cyborgoddess said:
The data is never delivered in a printed report - the numbers are the basis for a whole host of other calculations as well as to plot graphs. Having the numbers in a table is crucial.

I think it may be better for you to do your project in Excel. Access is meant for processing data and simple calculations; it's not meant for the amount of calculation that you want to do.

If that is not how you wish to go, I still think you could set up a query that does your calculations and then view the result in PivotChart view.
 
Very good point Mercer

My last company did the same. I would hold a lot of the data in Access and export a lot of information to Excel for users to play with. If you divide the two up well, you alleviate a lot of strain on the db.

Excel will also be faster for many separate cases of user needs.

If you are doing graph reports on this data and the reports are already built, and rebuilding them becomes too tough or too extensive in production time requirements...

You can link to the Excel sheets and use the same reports... maybe needing to change some naming conventions...
 
I don't think Excel can hold the amount of data I need. Also, there aren't multiple users; it is just me. Once the final numbers are produced, I export just the core data to a new database and send that off to our web developers, who import it into MySQL.

The core problem is coming up with the rolling 5-year & 3-year monthly returns for 35,000 datasets. Once those numbers get produced, they stay fixed until the next round (we run the reports quarterly).

I appreciate all the offers, but I still haven't seen anything that will make this easier/ more efficient.
 
cyborgoddess said:
I don't think Excel can hold the amount of data I need.
...
The core problem is coming up with the rolling 5-year & 3-year monthly returns for 35,000 datasets. Once those numbers get produced, they stay fixed until the next round (we run the reports quarterly).

I appreciate all the offers, but I still haven't seen anything that will make this easier/ more efficient.
You are not thinking outside the square. If the data you want to hold cannot fit on one sheet, what is wrong with using another sheet?

e.g. you could limit one sheet to one year's worth of data, or if need be go down to a quarter, or if you are really desperate, down to one month.

I am very sure there has to be a better way of doing it than the way you are.
 
still plugging away

I'm not sure how I could go about doing it in Excel. I've got just about enough VB skills to make myself dangerous, and I'm not sure I can get up to speed with doing VB in Excel by the time our next production cycle comes along.

I originally had the numbers exported to Excel, then manually calculated the returns, then imported them back into Access, but the manual input took too long and defeated the purpose.

In the meantime, I've decided to go with the Shell...wait command, but am seeing all sorts of different ways of formatting it and can't figure out which will work for me. Right now I am using the one from the Knowledge Base, edited to open the database I need, with an AutoExec in that database, but it goes through two steps and then tells me the database is read-only, even though I've set no permissions to that effect. When I click through the error messages and get back to the primary database, the system has crashed and I have to do a Ctrl+Alt+Delete, End Now to get out of it.

GRRR...
 
Although the obvious answer is that the database is so bloated it is crashing, it could be something else too, like bad coding. Having more code means more problems for you to troubleshoot.

Food for thought
 
Oh, I don't doubt there is bad coding in there; I've cobbled this thing together from imported macros and MS Access help forums. There is a lot I don't know, and coming here has helped tremendously.

My short-term solution is to try to figure out how to make the Shell...wait command work. Long-term: find a better way to calculate monthly returns.
 
My Fix

I've found the other code posted here really helpful, so I thought I would add the one I used. This is from the Knowledge Base at Microsoft.

Option Explicit

' Structures used by CreateProcessA (from the Microsoft KB sample):
Private Type STARTUPINFO
    cb As Long
    lpReserved As String
    lpDesktop As String
    lpTitle As String
    dwX As Long
    dwY As Long
    dwXSize As Long
    dwYSize As Long
    dwXCountChars As Long
    dwYCountChars As Long
    dwFillAttribute As Long
    dwFlags As Long
    wShowWindow As Integer
    cbReserved2 As Integer
    lpReserved2 As Long
    hStdInput As Long
    hStdOutput As Long
    hStdError As Long
End Type

Private Type PROCESS_INFORMATION
    hProcess As Long
    hThread As Long
    dwProcessID As Long
    dwThreadID As Long
End Type

Private Declare Function WaitForSingleObject Lib "kernel32" (ByVal _
    hHandle As Long, ByVal dwMilliseconds As Long) As Long

Private Declare Function CreateProcessA Lib "kernel32" (ByVal _
    lpApplicationName As Long, ByVal lpCommandLine As String, ByVal _
    lpProcessAttributes As Long, ByVal lpThreadAttributes As Long, _
    ByVal bInheritHandles As Long, ByVal dwCreationFlags As Long, _
    ByVal lpEnvironment As Long, ByVal lpCurrentDirectory As Long, _
    lpStartupInfo As STARTUPINFO, lpProcessInformation As _
    PROCESS_INFORMATION) As Long

Private Declare Function CloseHandle Lib "kernel32" (ByVal _
    hObject As Long) As Long

Private Const NORMAL_PRIORITY_CLASS = &H20&
Private Const INFINITE = -1&

' Runs cmdline and does not return until that process finishes.
Public Sub ExecCmd(cmdline$)
    Dim proc As PROCESS_INFORMATION
    Dim start As STARTUPINFO
    Dim ReturnValue As Integer

    ' Initialize the STARTUPINFO structure:
    start.cb = Len(start)

    ' Start the shelled application:
    ReturnValue = CreateProcessA(0&, cmdline$, 0&, 0&, 1&, _
        NORMAL_PRIORITY_CLASS, 0&, 0&, start, proc)

    ' Wait for the shelled application to finish
    ' (258 = WAIT_TIMEOUT; DoEvents keeps Access responsive):
    Do
        ReturnValue = WaitForSingleObject(proc.hProcess, 0)
        DoEvents
    Loop Until ReturnValue <> 258

    ReturnValue = CloseHandle(proc.hProcess)
End Sub

' Substitute the full path to your .mdb for "Database":
Sub VYr()
    ExecCmd "C:\Program Files\Microsoft Office\OFFICE11\MSACCESS.EXE Database"
End Sub


I had originally added the second database as a reference in my library, but had to remove it because each time I launched this it told me my file was read-only. Now it works great, but it does cause the huge memory suck mentioned up above.

I am still looking at ways to minimize that, but thought I'd share what I learned.
 
tinkering

I've returned to this problem with a fresh eye. An earlier poster proposed exporting the data to Excel for calculations; the problem I had at the time was inconsistent start dates. Having gotten my mind around exporting the data as a crosstab query has opened me up to what appears to be a fantastic solution: set sheet 1 up to paste the data in, set sheet 2 up with formulas, and finally have sheet 3 with the values (copy, paste special, values should do the trick). But I have no idea how to convert that new crosstab table back into a normal table (as in evidence here: Related Post).

To recap, I've got sixty columns of data going across (60 months of data) and over 5,000 user IDs going down. When I import that back in, there should be 60 entries for each ID, with date and value columns.

How would you do it?
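The best I've come up with so far is building a UNION query in code to un-pivot the 60 columns; a rough sketch, where the table and column names (tblWide, FundID, M01...M60) are made up:

' Sketch: un-pivot 60 month columns back to 60 rows per ID by
' building a UNION ALL query in code. Assumes the re-imported
' crosstab is tblWide with columns FundID, M01..M60 (made up).
' Jet may balk at very long UNIONs; split into two if it complains.
Sub MakeUnpivotQuery()
    Dim sql As String, i As Integer
    For i = 1 To 60
        If i > 1 Then sql = sql & " UNION ALL "
        sql = sql & "SELECT FundID, " & i & " AS MonthNo, " & _
              "M" & Format(i, "00") & " AS Ret FROM tblWide"
    Next i
    CurrentDb.CreateQueryDef "qryTall", sql
End Sub

A make-table query off qryTall would then give me the 60 rows per ID, I think.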

Thanks,
Amy
 
Wow, you've been on this problem for a few months now; I thought you would have solved it by now.

In answer to your question about crosstab queries: it's a topic of Access I know little about, but I can say I think you are on the right track.
 
