Need Help controlling the size of my Access Form DB

AConrad00

Hi All, First Post.

I am creating a form that contains 6 inputs PER CRITERIA (4 list boxes and 2 text boxes) that feed into different queries, which run in sequential order to create a list build for marketing purposes. The master file is created in DatabaseA and is exported into DatabaseB. DatabaseB contains only the master file, this form, and all the associated queries/tables. DatabaseB is my front-end database.

Each "criteria" for the list build functionality runs 4 queries. In turn, these 4 queries create four tables. The last table/query creates a new "MASTER"-file (that removes the marketing subjects for that first criteria), exports the file, and then deletes all the temporary tables I created. This process then repeats for "Criteria2" and then again for "Criteria3". As of now, it stops at criteria3, but would be optimal to go all the way up to 10.

Currently, this process works GREAT! When I run my form, I get three separate, non-duplicated lists of customers that export neatly into a workbook on my desktop (the file path comes from a user-input text box), with the customer lists split into separate sheets labeled "offer1", "offer2", and "offer3".
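
For reference, the export step boils down to a few DoCmd.TransferSpreadsheet calls, roughly like this (just a sketch with placeholder names: txtExportPath stands in for my user-input text box and Offer1/2/3 for the export queries):

Dim strPath As String
strPath = Me.txtExportPath    ' placeholder: text box holding the output workbook path

' Each call writes one query to its own worksheet, named after the query.
DoCmd.TransferSpreadsheet acExport, acSpreadsheetTypeExcel12Xml, "Offer1", strPath, True
DoCmd.TransferSpreadsheet acExport, acSpreadsheetTypeExcel12Xml, "Offer2", strPath, True
DoCmd.TransferSpreadsheet acExport, acSpreadsheetTypeExcel12Xml, "Offer3", strPath, True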

Here is my issue:
My raw database file is now 234 MB (after a compact and repair). When I run one list build and delete all the temporary tables, the database is 650 MB. When I run another list-build exercise (I am doing a lot of testing, so I run it often), it grows to 750 MB. I can reach the 2 GB limit very quickly and run into problems. I did hit the 2 GB file limit and experienced some data corruption, but I was able to repair the database, run a compact and repair, and all is well. I have backups of my work, so I am not worried on that front.


However, are there tips/tricks to keep this smaller? I do not have much background in VBA, though I am willing to learn. Is there something I can do (outside of running a compact and repair) to make my process better?
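
Right now the only thing I know to do is the manual compact and repair. From what I've read, a compact can also be run from code against a closed copy of the file, something like this (just a sketch; the paths are made up, and you can't compact the database your code is currently running from):

Dim strSrc As String, strDst As String
strSrc = "C:\Data\DatabaseB.accdb"              ' made-up path to the bloated file
strDst = "C:\Data\DatabaseB_compacted.accdb"    ' made-up path for the compacted copy

If Dir(strDst) <> "" Then Kill strDst           ' CompactDatabase needs a target that doesn't exist yet
DBEngine.CompactDatabase strSrc, strDst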




4 queries do this:
1- Customers who bought productA
2- Customers who bought productB
3- Customers who bought ProductA but not ProductB (join the two with "bring in ALL customers who bought ProductA and only those that match from ProductB", then keep only the rows where the ProductB side Is Null).
4- Take the old master, remove the customers who qualify for the previous criteria (query 3), and create a new master list for the next list-build criteria.

These queries all make tables, and the tables are what I use for my joins. I was thinking that maybe I could just chain the queries together instead, although I'd imagine it would take a lot longer. (A rough sketch of step 4 as a make-table query follows.)
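
To make step 4 concrete, it's roughly this shape (a sketch only; the table names match the #3 query I post further down, and Next_Master is just a placeholder for the reduced master that SELECT INTO creates):

Dim strSQL As String
strSQL = "SELECT New_Master.* INTO Next_Master " & _
         "FROM New_Master LEFT JOIN Criteria_Combine1_Table " & _
         "ON New_Master.[Card Code] = Criteria_Combine1_Table.[Criteria_Yes1_Table_Card Code] " & _
         "WHERE Criteria_Combine1_Table.[Criteria_Yes1_Table_Card Code] Is Null"
' Keeps every master row whose customer is NOT in the list pulled for this criteria.
CurrentDb.Execute strSQL, dbFailOnError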

Any Help?! :)


EDIT: I wanted to add that the list-build marketing design involves several customer criteria, but the main idea, which explains why there are so many queries, is that I want customers who BOUGHT ProductA but did not buy ProductB. I have to isolate those customers (each customer can have thousands of lines of data in the MASTER file) in each query and then work from there to get the information I need.
 
If your temp tables blow up your system, then put the temp tables in a separate db, one that you simply delete when done: a "side-end" (one per list build?). Also, perhaps some or all of the "temp tables" should be permanent? After all, each new day presumably does not change already existing data but just adds new rows.
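
Something along these lines (only a sketch: the path, the Product field, and the table names are placeholders):

Dim strTemp As String
Dim dbTemp As DAO.Database

strTemp = "C:\Data\ListBuild_Temp.accdb"          ' placeholder path for the scratch "side-end"

If Dir(strTemp) <> "" Then Kill strTemp           ' start clean each run
Set dbTemp = DBEngine.CreateDatabase(strTemp, dbLangGeneral)
dbTemp.Close

' Aim the make-table queries at the scratch file via the IN clause.
CurrentDb.Execute "SELECT * INTO Criteria_Yes1_Table IN '" & strTemp & "' " & _
                  "FROM New_Master WHERE [Product] = 'ProductA'", dbFailOnError

' ... run the rest of the criteria's queries and the export, then throw the file away.
' (Access can hold a lock on the scratch file briefly, so deleting at the very end,
' or at the start of the next run as above, is safest.)
Kill strTemp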
 

Spike, thank you for the suggestion.

The temp tables will always have the same column headers, but the underlying data will change depending on the particular selection criteria from one day to the next.

So I think I have two options:

- Have the queries feed new tables into an outside DB and then delete that 'temp' DB (new to me, doesn't sound too bad though)
- Instead of deleting my tables, I should just clear them upon exit, since the tables themselves will always be the same in terms of structure.

I'm kinda leaning towards the first option; I'll have to try it tomorrow to see how it works.

On a side note, I re-created my database (not working in the production DB) to see what would happen if I just chained the queries together without creating tables... the three criteria ran about twice as slowly and the file size was even bigger :)

Thanks for the suggestion, I'll report back tomorrow.
 
Don't use make-table queries. Clear out the data in the tables and repopulate them with append queries.
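
In other words, something like this, run for each working table (a sketch; the table and query names are placeholders):

CurrentDb.Execute "DELETE * FROM Criteria_Yes1_Table", dbFailOnError

DoCmd.SetWarnings False                ' suppress the append confirmation prompt
DoCmd.OpenQuery "qryAppendYes1"        ' placeholder name for the saved append query
DoCmd.SetWarnings True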
 
Hi Bob,

Thank you for the reply. I will try this method out today and see if it helps contain the file-size growth.
 
Spike,
How would I go about creating a table in a separate DB?

EDIT - Wow, so simple. Surprised that I didn't catch that earlier.

EDIT EDIT - New question: I am having trouble accessing the new 'back-end' temporary file from my front-end DB's form. Any resources for this? It's truly just a simple sequential series of DoCmd.OpenQuery calls.
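
(From what I can tell so far, the two usual ways to reach that side-end file from the front end are a linked table or an IN clause in the SQL, roughly like the sketch below; the path is made up.)

Dim strBE As String
strBE = "C:\Data\ListBuild_Temp.accdb"      ' made-up path to the side-end file

' Option 1: link the table into the front end so the form and saved queries can see it.
DoCmd.TransferDatabase acLink, "Microsoft Access", strBE, acTable, _
                       "Criteria_Yes1_Table", "Criteria_Yes1_Table"

' Option 2: skip the link and address the remote table directly in the SQL.
CurrentDb.Execute "DELETE * FROM Criteria_Yes1_Table IN '" & strBE & "'", dbFailOnError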

Bob,
Instead of using just append queries, I was thinking it would be a good idea to clear the data from the tables upon closing. Does this make sense? If it does, is there a way to quickly clear multiple tables without creating a delete query for each individual table I'd like to clear?

EDIT - Append queries on their own will not work, as I am not adding additional lines to the data but actually replacing the existing data with completely fresh data.
 
You would just clear the tables either before appending or upon close. I personally clear them just before appending, as you never know what might interfere with code that runs on close.

CurrentDb.Execute "Delete * From TableNameHere", dbFailOnError
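
If there are several working tables, that same line can just run in a loop instead of a separate delete query per table (sketch; substitute your own table names):

Dim varTbl As Variant
For Each varTbl In Array("Criteria_Yes1_Table", "Criteria_No1_Table", "Criteria_Combine1_Table")
    CurrentDb.Execute "DELETE * FROM [" & varTbl & "]", dbFailOnError
Next varTbl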
 

Thanks Bob

The delete command works great, but the append query workaround does not fit my needs, since both my third and fourth queries contain multiple fields.

I tried Spike's idea of having my queries make tables in an outside DB, but that seems more 'intense' than this solution.

Any ideas for a work around?


This is my #3 query:


INSERT INTO Criteria_Combine1_Table ( [Criteria_Yes1_Table_Card Code], [Customer Name], E_Mail )
SELECT DISTINCT Criteria_Yes1_Table.[Card Code], New_Master.[Customer Name], New_Master.E_Mail
FROM New_Master
INNER JOIN (Criteria_Yes1_Table
    LEFT JOIN Criteria_No1_Table
        ON Criteria_Yes1_Table.[Card Code] = Criteria_No1_Table.[Card Code])
    ON New_Master.[Card Code] = Criteria_Yes1_Table.[Card Code]
WHERE Criteria_No1_Table.[Card Code] Is Null;


The logic at the bottom is this: isolate all customers who bought ProductA, isolate all customers who bought ProductB, and bring in the customer information. Whoever bought A but also bought B is removed by that Criteria_No1_Table "Is Null" condition.

Sorry if this is super confusing.
 
Thanks to all who posted in this thread - the issue has been solved (last week; I thought I'd update for anyone with a similar situation).

Bob hit the nail on the head: I needed to clear all of my working tables and simply append the new information to them. Being an Access newbie, it took me a little while to figure out, but now this is working great. I also removed all excess tables and now re-use the same 3 tables for each "search criteria", clearing them with a delete query before appending the new data.

Also, each of my four MASTER files (customer, product, sales, and marketing) is housed in a middle-end database (with all calculations done in back-end databases) and linked to the front-end form. I no longer have any issues with file-size management; the form is now working as intended, and I am able to include an entire 10-criteria search form without issue.

Thanks everyone for the responses, appreciate the help!!
 
