change query source

supmktg

I have a bunch of queries and reports based on a very large ODBC linked table with over 120,000 records and growing. The table's size causes both speed problems and dropped records.

The solution that I came up with is to create a new table containing only the records I really need through a Make Table query, and then base those queries and reports on this new table. I have three questions:

Is a Make Table query the best way to limit the records accessed (via date criteria) in my ODBC table?

Is there an easy way to direct my existing Queries to this new table without rebuilding them from scratch?

Is there a way to hide the message box warning that you are about to delete the existing table and create a new one?

Thanks, Neal
 
What is the data source of your linked table? SQL Server? There are other solutions available depending on what your data source is.
 
The source is not SQL Server, it is a relatively little known database by Omnis (aka Raining Tree Data).

Neal
 
supmktg -

Our company ran into similar issues when I ran queries against our DB2 databases, but we were dealing with multiple joins to tables with millions of records. To speed up working with that amount of data, especially if you're working with complex queries, we found that running make-table queries worked well. It also allows us to use that data for other purposes.

However, it sounds odd that you would have speed issues with a table of 120,000 records; that is a number Access can easily handle. I am not familiar with the database you mentioned, but if possible, you may want to try indexing some of the fields that you query most. On a more technical note, you may need to adjust the size of your buffer pools. If you have a lot of fields and those fields are quite long, an undersized buffer pool can limit the number of records you can read at one time.
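
If you can't touch the source database, one thing you can still control is indexing the local copy after the make-table query runs (the make-table drops and re-creates the table, so any index goes with it). Something along these lines, where the table and field names are only examples, not anything from your database:

    ' Re-create the index on the local extract after each make-table run,
    ' so the filtered queries and reports don't scan the whole table.
    CurrentDb.Execute "CREATE INDEX idxOrderDate ON tblLocalExtract (OrderDate)", dbFailOnError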

Although I can't say exactly why you're having the problem, at least you may like to know that we make good use of make-table queries against our ODBC databases.
 
Thanks mstorer & pdx_man for your help!

I've solved my issues. For anyone trying to resolve any of the same issues, I've described my solutions below.

My DB is basically a report generator that reports on data in an ODBC-compliant database that I have no control over. The particular table I am linking to is very large. On opening the database, I run two queries that filter the data and build the lists that populate two drop-down list boxes on the opening form. These opening queries take as much as 3-4 minutes to run, and I have found that some records are dropped in the process. I don't have the answer as to why it takes so long or drops records, but here's the solution that worked for me.

On open, I run a make-table query to extract and create a much smaller table (filtered by date) that contains only the records and fields that I need for my reports. Then I run the queries that populate the list boxes against this new table. Doing this cut the processing time to under 1 minute, and no longer results in missing records.
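
For anyone who wants to see it, the make-table query is just a SELECT ... INTO with a date filter. The table and field names below are only placeholders, not my real ones:

    SELECT Src.OrderID, Src.OrderDate, Src.CustomerName
    INTO tblLocalExtract
    FROM tblLinkedOrders AS Src
    WHERE Src.OrderDate >= DateAdd("m", -6, Date());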

I did not have to rebuild my queries. In each query's design view I was able to add the new table and delete the original (linked) table.

I run the make-table query (along with the other queries and the opening of the main form) through an AutoExec macro. I eliminate that pesky message box about deleting and re-creating the table by including a SetWarnings action set to No at the beginning of the macro.
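
If anyone prefers code to a macro, the same sequence could be written as a VBA function and called from AutoExec with a RunCode action. This is just a sketch; the query and form names are placeholders:

    Public Function RefreshLocalExtract()
        ' Suppress the delete/append confirmation dialogs
        DoCmd.SetWarnings False
        ' Re-create the small local table from the linked ODBC table
        DoCmd.OpenQuery "qryMakeLocalExtract"
        ' Turn the warnings back on so other actions still confirm
        DoCmd.SetWarnings True
        ' Open the main form, whose list boxes now query the local table
        DoCmd.OpenForm "frmMain"
    End Function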

I've found a lot of help on this forum and I really appreciate it!
I hope this info might be helpful to someone that is having similar issues.

Neal
 
SetWarnings

Regarding the SetWarnings action at the beginning of the macro: I included SetWarnings set to No and I still get one message saying the original table will be deleted before I run the query. Any ideas?
 
Set warnings to False within the code

Set the warnings to False within VBA code rather than relying only on the macro action.
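
In VBA that looks something like the lines below (the query name is only a placeholder). An alternative is to run the saved query through DAO's Execute method, which raises errors instead of showing confirmation prompts:

    ' Option 1: suppress the prompts around the make-table query
    DoCmd.SetWarnings False
    DoCmd.OpenQuery "qryMakeLocalExtract"
    DoCmd.SetWarnings True

    ' Option 2: run the saved query through DAO, which never shows
    ' the delete/create confirmation dialogs
    CurrentDb.Execute "qryMakeLocalExtract", dbFailOnError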
 
