Pass-Through Query or ADO, is there a way?

helen1977

Hello All,


I have to run a query against a DB2 table and join it to a local table. My dilemma is that the DB2 table has more than 255 columns, some of which I need to return in my query results, and those columns are not visible in Access.

The DB2 table is very large, and my query has no parameters other than the join between the local table and the DB2 table.

Is there an efficient way to perform this query via a pass-through, ADO, or any other mechanism in Access? :confused:
Writing the local table to DB2 is not an option....
 
I was going to suggest writing the table to DB2 ... erm...

If your local table is fairly limited you can try using some code to embed the (local) join into your ODBC WHERE clause...
I.e. if your local table holds a couple of products, say Apples, Oranges and Pineapple...
You don't need to fetch the entire table over ODBC; rather...
SELECT * FROM thatTable WHERE Produce IN ('Apples', 'Oranges', 'Pineapple')
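
A rough VBA sketch of that idea - the query, table, field and DSN names (qryDB2PassThrough, tblLocalProduce, Produce, MyDB2DSN) are all made up for illustration:

Code:
' Build an IN list from the local table and push it into a saved
' pass-through query's SQL (all object names here are hypothetical).
Sub BuildPassThroughFilter()
    Dim db As DAO.Database
    Dim rs As DAO.Recordset
    Dim qdf As DAO.QueryDef
    Dim inList As String

    Set db = CurrentDb()

    ' Collect the join values from the local table
    Set rs = db.OpenRecordset("SELECT DISTINCT Produce FROM tblLocalProduce", dbOpenSnapshot)
    Do While Not rs.EOF
        If Not IsNull(rs!Produce) Then
            inList = inList & "'" & Replace(rs!Produce, "'", "''") & "', "
        End If
        rs.MoveNext
    Loop
    rs.Close
    If Len(inList) = 0 Then Exit Sub
    inList = Left$(inList, Len(inList) - 2)   ' trim the trailing ", "

    ' Point the saved pass-through at DB2 and filter server-side
    Set qdf = db.QueryDefs("qryDB2PassThrough")
    qdf.Connect = "ODBC;DSN=MyDB2DSN;"        ' assumption: your own DSN name
    qdf.SQL = "SELECT * FROM SCHEMA1.THATTABLE " & _
              "WHERE PRODUCE IN (" & inList & ")"
    qdf.ReturnsRecords = True
End Sub

Because the filter runs on the server, the pass-through only returns the matching rows, and you can then join it to the local table in an ordinary Access query if you still need local columns.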

Then maybe you can also use a record created date to limit it to, say, the last year or so of data?
 
I agree that you ideally want to avoid the heterogeneous join. (Such an inefficient query request.)
Building an IN clause by which to filter will be a reasonable solution if
a) you don't have all that many local records, and
b) you don't need to include fields from the local table (I kinda get the feeling that you do want that, though).

Not being familiar with DB2, I can't offer specifics.
However, you should be able to create temp tables as part of a procedure and join to those (or even table-type variables, if they're supported).
You'd just have to push the local data into those server-side temp tables before joining it to the server-based data - there's a rough sketch of the idea below.
Not necessarily a fast act in itself - again, it depends on how many rows you're holding locally.
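
For what it's worth, here's a rough sketch of that staging idea using pass-through statements from VBA - the DSN, schema and table names are invented, and the staging table itself would need creating on the server first (with whatever temp-table syntax DB2 offers):

Code:
' Push local key values into a server-side staging table via
' pass-through statements (all names here are hypothetical).
Sub PushKeysToStaging()
    Dim db As DAO.Database
    Dim rs As DAO.Recordset
    Dim qdf As DAO.QueryDef

    Set db = CurrentDb()
    Set qdf = db.CreateQueryDef("")           ' temporary, unsaved QueryDef
    qdf.Connect = "ODBC;DSN=MyDB2DSN;"        ' assumption: your own DSN name
    qdf.ReturnsRecords = False

    ' Clear out any previous run
    qdf.SQL = "DELETE FROM SCHEMA1.STAGING_KEYS"
    qdf.Execute dbFailOnError

    ' One INSERT per local row - still row-by-row over ODBC,
    ' so only sensible for a modest number of local records
    Set rs = db.OpenRecordset("SELECT KeyValue FROM tblLocal", dbOpenSnapshot)
    Do While Not rs.EOF
        qdf.SQL = "INSERT INTO SCHEMA1.STAGING_KEYS (KEYVALUE) VALUES ('" & _
                  Replace(rs!KeyValue, "'", "''") & "')"
        qdf.Execute dbFailOnError
        rs.MoveNext
    Loop
    rs.Close

    ' A second pass-through can now join STAGING_KEYS to the wide DB2
    ' table entirely on the server and return just the columns you need.
End Sub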

A pass-through in itself won't be the ultimate solution, due to the 255-column limit on what an Access query returns (which is the same as Jet's limit, even though a pass-through bypasses Jet).
You could use a form bound to an ADO recordset though, provided you can get the data "together".
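
As a minimal sketch of that, assuming a DB2 ODBC DSN and a reference to the ADO (ActiveX Data Objects) library - note that forms bound to ADO recordsets are typically read-only against non-SQL-Server sources:

Code:
' In the form's Open event: bind the form straight to an ADO recordset
' pulled from DB2 (the DSN, schema and column names are assumptions).
Private Sub Form_Open(Cancel As Integer)
    Dim cn As ADODB.Connection
    Dim rs As ADODB.Recordset

    Set cn = New ADODB.Connection
    cn.Open "Provider=MSDASQL;DSN=MyDB2DSN;"   ' ODBC via the OLE DB wrapper

    Set rs = New ADODB.Recordset
    rs.CursorLocation = adUseClient            ' client-side cursor for form binding
    rs.Open "SELECT COL1, COL2, COL300 FROM SCHEMA1.THATTABLE", _
            cn, adOpenStatic, adLockReadOnly   ' add your server-side filter here

    Set Me.Recordset = rs                      ' bind the form to the recordset
End Sub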

Cheers.
 
Thank you for your replies...

I was able to get write access to DB2, and I've created a table in the database. However, I now need to get my local table into the DB2 table. I tried doing this via an append query, but it took about 25 minutes for about 8,000 records.

Does anyone have suggestions for improving the efficiency, or a better way to do this? My local table comes from another data source; it isn't DB2...

Thanks All! ;)
 
Hi
Yeah, 25 minutes does sound pretty lame - but then that's still over 5 rows a second. Bearing in mind that each one is performed as an individual insert over ODBC, it becomes less shocking. It's exactly that kind of latency that means you want to avoid the local join, and the 8K records mean you don't want an IN clause. ;-)

You didn't mention if you need columns from the local table included in the resultset though. That isn't as big a factor given the 8K rows by which you'd need to filter anyway. But better 8K than 80K or 800K or...

The suggestion by Mailman in the meantime is what I'd suggest too.
Either use some server-specific functionality to accept a stream of data and transform it (for example into XML), or output the data to a file the server can read and import it using a bulk method.
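
As a very rough sketch of the file route - the path, table names and the exact DB2 import syntax are assumptions you'd want to check against your DB2 version:

Code:
' Export the local table to a delimited file somewhere the DB2
' server (or a machine with the DB2 client) can read it.
Sub ExportForBulkLoad()
    ' Path and table name are hypothetical
    DoCmd.TransferText acExportDelim, , "tblLocal", "\\server\share\tblLocal.csv", True
End Sub

' Then, from a DB2 command window, something along these lines
' (treat this as a starting point, not gospel):
'   db2 CONNECT TO MYDATABASE
'   db2 IMPORT FROM "\\server\share\tblLocal.csv" OF DEL INSERT INTO SCHEMA1.STAGING_KEYS

If the volumes grow, DB2's LOAD utility is reportedly the faster cousin of IMPORT, though with more restrictions - worth checking the DB2 documentation rather than taking my word for it.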

Again, my lack of familiarity with DB2 means I can't offer anything more definitive.
See what the suggested import utility offers.

Cheers.
 
