Solved Export data from Passthrough Query to CSV

Nicha
I need to export data from an ORACLE database, and I'm using Access VBA. This involves running SQL via ODBC, using a Passthrough query that brings in about 1,300,000 records and takes about 15 minutes to run.

I need to find the best way to do this.

I'll describe what I did, but I'm not sure if it's the best option.

1) I create a QueryDef for the passthrough "QRY_Extract"
Set qdfPassThrough = CurrentDb.CreateQueryDef("QRY_Extract")

2) I set the query properties - The query is complex and has several joins. To simplify, I just call it 'OracleTable'
qdfPassThrough.Connect = SConnect1
qdfPassThrough.SQL = "Select fields from OracleTable;"
qdfPassThrough.ReturnsRecords = True

qdfPassThrough.ODBCTimeout = 1500


3) I define a second query that will be a 'Make Table', exporting the records from the "QRY_Extract" passthrough to a local table in Access. This query is named "QRY_Local_Temp".
Set qdfLocal = CurrentDb.CreateQueryDef("QRY_Local_Temp")
qdfLocal.SQL = "Select fields Into LocalTable From QRY_Extract;"

4) I execute "qdfLocal", which creates LocalTable with the 1,300,000 records
qdfLocal.Execute dbFailOnError
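Putting steps 1–4 together, here is a minimal consolidated sketch. The connect string and field lists are placeholders (as in the post), and it assumes neither QueryDef already exists; delete them first if they do:

```vba
Sub ExtractFromOracle()
    Dim db As DAO.Database
    Dim qdfPassThrough As DAO.QueryDef
    Dim qdfLocal As DAO.QueryDef

    ' Placeholder connect string - substitute your real DSN/credentials
    Const SConnect1 As String = "ODBC;DSN=MyOracleDSN;UID=user;PWD=pass"

    Set db = CurrentDb

    ' 1-2) Pass-through query against Oracle
    Set qdfPassThrough = db.CreateQueryDef("QRY_Extract")
    qdfPassThrough.Connect = SConnect1
    qdfPassThrough.SQL = "Select fields from OracleTable;"
    qdfPassThrough.ReturnsRecords = True
    qdfPassThrough.ODBCTimeout = 1500

    ' 3-4) Make-table query pulling the pass-through results into Access
    Set qdfLocal = db.CreateQueryDef("QRY_Local_Temp")
    qdfLocal.SQL = "Select fields Into LocalTable From QRY_Extract;"
    qdfLocal.Execute dbFailOnError
End Sub
```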


*******************************************
I'd like to know the best procedure for exporting to CSV:

1) Directly from the "QRY_Extract" Passthrough, without having created the temporary local table "LocalTable," which will be deleted at the end of the export.
2) Export from the created temporary table "LocalTable."
3) I'm using DAO; would it be better to use ADO?

Last question:
The purpose of the CSV file is to feed an MS query associated with an Excel table, which uses parameters that extract subsets from the table.
Would it be better to not export to CSV or link the MS Query directly to the Access table?

Thank you in advance for your valuable help.
Thank you very much.
 
For my Excel Blood Pressure charts, I read directly from my Diabetes DB.
How often is the table going to change? I would expect you need to create the table just so the Excel can query it reasonably quickly?
 
The extraction will occur on a daily basis. So @Gasman, in your opinion, regarding my question "Would it be better to not export to CSV or link the MS Query directly to the Access table?", there's no need to export to CSV; it's better to link directly to the Access table, which seems to be faster...
 
That would be the way I would at least try first.
I am unsure how a csv would be better?
With a table, you can have indices, not so with a csv?
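To illustrate the indexing point: once the data sits in a local table, you can index the columns your Excel parameters filter on, something a CSV cannot offer. A sketch, with placeholder table and field names:

```vba
' Index the column(s) the downstream MS Query filters on,
' so parameter-driven subsets return quickly
CurrentDb.Execute _
    "CREATE INDEX idxFilterField ON LocalTable (FilterField);", _
    dbFailOnError
```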
 
I'll try your suggestion @Gasman. Even so, can I ask you the best way (VBA code) to export from an Access table to CSV?
And which is faster: exporting from the Access temporary table or from the pass-through?
 
Look at TransferText command.
I would do it via the query. No point of an extra step?
Code:
Sub ExportToCSV()
    Dim exportPath As String
    Dim tableName As String

    ' Set your export path and table/query name
    exportPath = "C:\Users\YourUsername\Documents\ExportedData.csv"
    tableName = "YourTableOrQueryName" ' Change this to your table or query

    ' Export the data
    DoCmd.TransferText _
        TransferType:=acExportDelim, _
        TableName:=tableName, _
        FileName:=exportPath, _
        HasFieldNames:=True

    MsgBox "Export complete!"
End Sub
 
So, summarising your idea for my last question: first create the table from the pass-through query, and then export to CSV using your function.
Doesn't your function work when exporting directly from the pass-through query?

Thank you once more @Gasman
 
Not my function but ChatGPTs
I would not export to csv. I would just use the table.
That function does export from table or query. Read the code.

I would not set the qdf each time, unless it is likely to change. Just use a saved query.
 
Ok @Gasman. My best regards.
I'll wait for someone to answer my initial questions to build the complete process of extraction from the pass-through. And as you mentioned, I'll always use the same one, created manually, instead of creating the pass-through every time it runs.
Thank you once more.
 
I have never used PassThrough queries, so cannot advise on that part of the process. Sorry.
 
The pass-through can be used as a direct source for the export to CSV; however, it would probably be better to store the data locally in a table, since if you need to requery it, filter it, etc., the PT query will be fired again.

So to confirm @Gasman's option: create the local copy, then simply link to that directly from MS Query.
Avoid the CSV export as it's not gaining you anything.
 
Ok @Gasman and @Minty, I've tried your suggestions, and I decided to follow that path.

Regarding the question of using DAO or ADO, can you help?

I've tested with DAO and worked fine.
 
DAO is the preferred/default method. I don't think with a pass-through it will make much odds to be honest.
If it works stick with it.
 
Somehow missed this yesterday. From a theoretical viewpoint, the key question relates to the number of steps involved.

You've got data in an ORACLE database. You want to eventually end up with a .CSV file because you have something happening in Excel. You say your path (excluding the query parameter setup) is

1. Using a complex JOIN-related query, extract data from ORACLE to form an intermediate table.
2. Using the intermediate table, export to the .CSV file.
3. Rinse and repeat daily.

My question is this: Is there a step between 1 & 2 where you process, modify, filter, or otherwise interact with the detail-level content of the intermediate table?

If the answer is YES, then I don't see an easy path better than as stated without first considering details of that filtration or processing.

If the answer is NO (because that first query does all the work, and I DO mean ALL of the work), then you should be able to perform the .CSV export either directly from the 1st query, or at worst, directly from a second query that names the first query as its source as opposed to that intermediate table.

The goal for efficiency is to reduce the number of "touches" required. You touch it once to get it from ORACLE. Using the intermediate, you touch it a second time.

There is another consideration that might be significant: Compacting and repairing necessitated by creating 1.3 million of ANYTHING on a daily basis. If you do this every day from a fresh COPY of your working DB, that is OK. But I would think that within a few days (if that table has ANY complexity at all), it would quickly blow out your 2 GB limit for a single table. That is worthy of consideration, too.
 
Smart words and smart advice @the_do.
Answering your question: no, there's no processing done on the intermediate query.
At this point I will create the table in order to feed the Excel MS Query, instead of creating a CSV for that purpose.
As Gasman said, the advantage of using the table is being able to use indexes on fields.
The table created will be placed in an external database that will be created/deleted every day. The database where the code runs will only hold the table until it is moved to the external database created on every extraction.
 
So, summarising your idea for my last question: first create the table from the pass-through query, and then export to CSV using your function.
Doesn't your function work when exporting directly from the pass-through query?
No, That forces Access to read/write all 1.3 million rows TWICE. Use the name of the query instead of a table name in the export command. The data then goes from the server to the .csv directly. No intermediate save!
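Concretely, that direct export is just the earlier TransferText call pointed at the pass-through query's name instead of a table. A sketch, with a placeholder file path:

```vba
' Export the pass-through query straight to CSV - Access streams the
' server results into the file with no intermediate local table
DoCmd.TransferText _
    TransferType:=acExportDelim, _
    TableName:="QRY_Extract", _
    FileName:="C:\Exports\OracleData.csv", _
    HasFieldNames:=True
```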

Also, since the pass-through query has several joins, you might want to build a view so that the joins get optimized. That might speed up the process.
 
The ONLY time you would ever consider a make table query is if you had to process the data locally - MULTIPLE times. If you are just passing it through, then the make table is a waste of time and only bloats your database.

PS, in the cases where you do need a make-table query, you should use a separate BE to hold the make table data so you don't bloat your FE or BE. You can simply replace the side-table database from a stored, empty template each time you want to import the data.
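One way to implement that template swap, as a sketch with placeholder paths:

```vba
Sub ResetSideBE()
    ' Paths are placeholders - point them at your real template and side BE
    Const TEMPLATE_PATH As String = "C:\Data\SideBE_Template.accdb"
    Const SIDE_BE_PATH As String = "C:\Data\SideBE.accdb"

    ' Delete yesterday's bloated copy, then restore the empty template,
    ' so the daily 1.3M-row make-table never bloats the FE or main BE
    If Dir(SIDE_BE_PATH) <> "" Then Kill SIDE_BE_PATH
    FileCopy TEMPLATE_PATH, SIDE_BE_PATH
End Sub
```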
 