Linking TXT Files to Access

hamed_gan

Registered User
Joined
Nov 26, 2004
Messages
10
Hello everybody,

I have a problem which should be simple.

I have a txt file (exported from Oracle) containing data I want to add to Access. What I usually did was the following:

1. Import the txt file into Excel,
2. Use the Excel Text Import Wizard to set the breaks for the columns,
3. Sort the data and remove the unwanted "crap" lines/data,
4. Save the file as an Excel worksheet,
5. Link the file to Access. In Access, the data in the Excel file would be taken, manipulated, and appended to the table.

But now I have a problem: the txt file has more than 65,536 lines, which we all know is the maximum number of rows Excel can handle in one sheet.

What to do now? The file is a txt file, so I can use the Import function for txt files. However, the Access Import Text Wizard works differently than the one in Excel. And I don't really want to import the data into Access, as it first needs to be manipulated in a query before being written to a table. So I need to link it. But linking would result in the same problem.

Could anybody recommend a program that is capable of doing the same as the Excel Text Import Wizard, only with large files? That is, editing a txt file in such a way as to remove the crap/unwanted data while keeping the "columns" in the txt file intact?

Any suggestions would be welcome,

Regards,
Zurvy
 

Pat Hartman

Super Moderator
Staff member
Joined
Feb 19, 2002
Messages
28,533
You can use the import wizard to define your .txt file and then save the import spec. Once you do that, you can automate the import/link process and use the TransferText Method or Action. This Method allows you to link or import. You can then use a query to take the linked file and select the appropriate data and append it to the permanent table.
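For instance, a minimal sketch of what this looks like in VBA, assuming you have already saved a fixed-width import spec from the wizard. The spec name "OracleSpec", the table names, and the file path below are all placeholders — substitute your own:

```vba
' Sketch only: "OracleSpec", "tblOracleLink", "tblPermanent", and the
' file path are hypothetical names -- use your saved spec and real paths.

' Link the fixed-width text file as a table, using the saved spec:
DoCmd.TransferText TransferType:=acLinkFixed, _
    SpecificationName:="OracleSpec", _
    TableName:="tblOracleLink", _
    FileName:="C:\data\export.txt"

' Then append just the rows you want with a query against the linked table:
CurrentDb.Execute "INSERT INTO tblPermanent (Field1, Field2) " & _
    "SELECT Field1, Field2 FROM tblOracleLink " & _
    "WHERE Field1 IS NOT NULL", dbFailOnError
```

Use acImportFixed instead of acLinkFixed if you decide to import rather than link, or the acLinkDelim/acImportDelim constants for a delimited file.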
 

The_Doc_Man

Happy Retired Curmudgeon
Joined
Feb 28, 2001
Messages
15,540
When dealing with this amount of data, the only other methods I can think of would be to either

a. Break the file up with an editor into 50,000 line chunks and do several spreadsheets in sequence.

b. Write VBA code to do the equivalent of the import.

1. Open a recordset into a pre-defined table.

2. Open the file.

3. Read (INPUT) a line unless you hit the EOF.

4. Skip your infamous crap lines, assuming there is a programmatic way to identify them.

5. Parse the line into pieces-parts. For instance, if it is comma-delimited, look for the commas using InStr to identify the fields: do one InStr call for each comma, then use that location + 1 as the starting point for the next InStr call. Pick apart each field and store it, either as text or in the eventually desired format, in variables inside your VBA module. Remember that the position InStr returns is the position of the comma you were searching for, so you may have to exclude the comma itself when extracting the string.

6. Do an .AddNew on your recordset

7. Set the field values in the new record based on the pieces-parts you parsed earlier.

8. Do an .Update on your recordset

9. Go back and do it again unless you hit the EOF.

10. Close the file.

11. Close the recordset.
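The steps above can be sketched roughly as follows — assuming a comma-delimited file at C:\data\export.txt and a pre-defined table named tblImport with three text fields; all of these names are placeholders to rename to suit:

```vba
Sub ImportBigTextFile()
    Dim rs As DAO.Recordset
    Dim f As Integer
    Dim strLine As String
    Dim p1 As Long, p2 As Long

    Set rs = CurrentDb.OpenRecordset("tblImport", dbOpenDynaset) ' 1. open recordset
    f = FreeFile
    Open "C:\data\export.txt" For Input As #f                    ' 2. open the file

    Do Until EOF(f)                                              ' 3 & 9. loop to EOF
        Line Input #f, strLine
        ' 4. Skip the crap lines -- here, anything without a comma:
        If InStr(strLine, ",") > 0 Then
            ' 5. Parse: find each comma, take the text before it
            p1 = InStr(strLine, ",")
            p2 = InStr(p1 + 1, strLine, ",")
            rs.AddNew                                            ' 6. new record
            rs!Field1 = Left$(strLine, p1 - 1)                   ' 7. set fields
            rs!Field2 = Mid$(strLine, p1 + 1, p2 - p1 - 1)
            rs!Field3 = Mid$(strLine, p2 + 1)
            rs.Update                                            ' 8. save record
        End If
    Loop

    Close #f                                                     ' 10. close file
    rs.Close                                                     ' 11. close recordset
    Set rs = Nothing
End Sub
```

For a fixed-width file you would replace the InStr parsing with Mid$ calls at known column positions.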

If you don't feel comfortable with VBA, then option "a" might be your only shot. I don't recall any interfaces for Windows that allow you to use raw text (delimited or not) as a database.
 
