CSV mass import...with a twist (unfortunately).

Syllinger

New member
Local time
Today, 18:59
Joined
May 6, 2012
Messages
3
Hey folks,

I have about 30 different master tables and there are about 100 individual data files that need to be inserted into each master.

So far, I have created a macro in Excel to list all of the possible field names in the data files, and I have derived my master tables from the output.

So, why did I need to do this? Here's the unfortunate twist: the individual data files each list, in alphabetical order, only the fields that actually contain data. There are approximately 40 fields in total across all of the data files, but each file usually only contains about 25 of them. Don't ask me why they are formatted this way; I can't figure out the logic either.

My issue is that I now have to get the data from these 1200 individual tables and organize them into their respective masters.

So, I imagine I have two methods of doing this. I could clean up the data in Excel, creating some kind of standard template that I can mass-upload using an import spec, but I would prefer to do all of the legwork in Access.

Has anyone ever attempted this, or know of a method by which to do this?

Finally, to throw another curve-ball your way: there are several fields that need to be changed from "Text" to "Memo", as they contain more than 255 characters.

Any help/ideas about how to approach this would be appreciated.

Thanks for reading!
 

gemma-the-husky

Super Moderator
Staff member
Local time
Today, 22:59
Joined
Sep 12, 2006
Messages
15,614
can you clarify this again?

the header row defines how many fields, and the field order?
are the data files csvs?

anyway, definitely do not touch them with excel. excel can silently change data (dates, leading zeros and the like). you are better off keeping the raw data intact

you need to understand what you have.

in this case, probably do the import manually

read the header.
store the column heads in an array

read the data lines a row at a time
write a sql statement to insert the row into the correct table(s) based on the structure of the header.

you need to make sure your table structure works for all variations of data file you are likely to get.

you may be able to import directly to a single master table, as long as all the column headers are present.
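
a rough sketch of the manual route - untested, the names are examples, and it assumes simple comma-delimited files with quoted values, no commas buried inside the values, and a value present for every header column:

Code:
' rough sketch - reads one csv and appends every data row to the given
' master table. assumes simple quoted, comma-delimited files with no
' embedded commas inside the values, and that every row has a value
' for every header column. proper type handling (dates, numbers) is
' left out - everything is inserted as text here.
Sub ImportOneCsv(strFile As String, strTable As String)
    Dim intFile As Integer
    Dim strLine As String
    Dim astrHeads() As String
    Dim astrVals() As String
    Dim strSQL As String
    Dim i As Integer

    intFile = FreeFile
    Open strFile For Input As #intFile

    ' first line = the column headings for this particular file
    Line Input #intFile, strLine
    astrHeads = Split(Replace(strLine, """", ""), ",")

    ' remaining lines = data rows
    Do While Not EOF(intFile)
        Line Input #intFile, strLine
        If Len(Trim$(strLine)) > 0 Then
            astrVals = Split(strLine, ",")
            strSQL = "INSERT INTO [" & strTable & "] ([" & _
                     Join(astrHeads, "],[") & "]) VALUES ("
            For i = LBound(astrVals) To UBound(astrVals)
                strSQL = strSQL & "'" & _
                    Replace(Replace(astrVals(i), """", ""), "'", "''") & "',"
            Next i
            strSQL = Left$(strSQL, Len(strSQL) - 1) & ")"
            CurrentDb.Execute strSQL, dbFailOnError
        End If
    Loop

    Close #intFile
End Sub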
 

Syllinger

New member
Local time
Today, 18:59
Joined
May 6, 2012
Messages
3
Hi Gemma. Thank you for the response!

1. I have 400ish folders named according to each individual customer.

2. Each folder contains data files named according to the table the data belongs to.

3. The field names for each data file are found in the first row in alphabetical order, BUT only the fields that contain data are included in the extract.

This example should highlight what I am talking about:

Folder structure:
C:\ABC Company\CustomerInfo.csv
C:\ABC Company\OrderInfo.csv
C:\XYZ Co\CustomerInfo.csv
C:\XYZ Co\OrderInfo.csv

File structure:
C:\ABC Company\CustomerInfo.csv contains:
"Field1","Field3","Field5"

C:\XYZ Co\CustomerInfo.csv contains:
"Field2","Field4"

CustomerInfo (master table) contains:
"Field1","Field2","Field3""Field4","Field5"

As you can see, the master table contains all field names across all files found in the 400ish folders. I created this table using the Excel macro I described in the original post.

Your approach sounds bang on, but how do I do that? Do I create a VBA macro similar to the one I did in Excel that will open every similarly named CSV across all folders, save the field names to an array, and then write INSERT statements for each row based on the field-name configuration?

I.E. Open all CustomerInfo files and INSERT depending on what is in the first row?

For C:\ABC Company\CustomerInfo.csv:
INSERT INTO CustomerInfo (Field1,Field3,Field5)
VALUES (...)

For C:\XYZ Co\CustomerInfo.csv:
INSERT INTO CustomerInfo (Field2,Field4)
VALUES (...)

Something like this?
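
For the folder-walking part, I'm imagining something like this (totally untested; "C:\Extracts\" is just a stand-in for wherever the 400ish folders live, and ImportOneCsv stands for a routine like the one you sketched above):

Code:
' untested idea - collect the customer folder names first (Dir can't
' be nested), then import the CustomerInfo.csv from each folder.
' "C:\Extracts\" is a made-up root path.
Sub ImportAllCustomerInfo()
    Dim colFolders As New Collection
    Dim strEntry As String
    Dim vFolder As Variant

    strEntry = Dir("C:\Extracts\", vbDirectory)
    Do While Len(strEntry) > 0
        If strEntry <> "." And strEntry <> ".." Then
            If (GetAttr("C:\Extracts\" & strEntry) And vbDirectory) = vbDirectory Then
                colFolders.Add strEntry
            End If
        End If
        strEntry = Dir()
    Loop

    For Each vFolder In colFolders
        If Len(Dir("C:\Extracts\" & vFolder & "\CustomerInfo.csv")) > 0 Then
            ImportOneCsv "C:\Extracts\" & vFolder & "\CustomerInfo.csv", "CustomerInfo"
        End If
    Next vFolder
End Sub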
 

gemma-the-husky

Super Moderator
Staff member
Local time
Today, 22:59
Joined
Sep 12, 2006
Messages
15,614
you could try designing a standard table, with all 5 fields

then try to design an importspec that will import each variant, even though some columns are missing

you will have to try this to see if it works

if it does then you can just do

docmd.transfertext and do the whole thing (for each file) with one line of code
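
something along these lines - "CustomerInfoSpec" is a made-up name, you'd set the real spec up once through the import wizard's advanced options and then reuse it for every file:

Code:
' spec name is made up - create the real spec once via the import wizard
DoCmd.TransferText acImportDelim, "CustomerInfoSpec", "CustomerInfo", _
    "C:\ABC Company\CustomerInfo.csv", True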

i don't quite understand why some columns of data are missing, if they are all important

.....

if you find you cannot import the whole thing - then the trickier alternative is to read successive lines, and write the sql insert statements
 

Syllinger

New member
Local time
Today, 18:59
Joined
May 6, 2012
Messages
3
The reason that all fields are not being pulled is because the person extracting the information is incompetent, and apparently this is the only way to pull it out of the current database.

At first, I thought that using a loop to insert the data record by record would work, but I feel that this is going to create some massive processing overhead.

Is it not possible to simply have dynamic INSERT INTO statements generated via VBA?
 

Pat Hartman

Super Moderator
Staff member
Local time
Today, 18:59
Joined
Feb 19, 2002
Messages
42,976
You can let the query engine help you with this by creating a generic append query and letting it figure out all the columns. Of course this will ONLY work if the column names are IDENTICAL in all the tables.
Code:
INSERT INTO [yourTotablename]
SELECT [yourFromTablename].*
FROM [yourFromTablename];
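
For example, you could pull each file into a scratch table and then let the append query do the column matching (the table and path names here are just placeholders):

Code:
' Sketch only: import one csv into a scratch table, append it to the
' master by letting Jet match the column names, then drop the scratch
' table so the next file starts clean.
Sub AppendOneFile(strFile As String)
    DoCmd.TransferText acImportDelim, , "tmpImport", strFile, True
    CurrentDb.Execute "INSERT INTO CustomerInfo SELECT tmpImport.* FROM tmpImport;", dbFailOnError
    CurrentDb.Execute "DROP TABLE tmpImport;", dbFailOnError
End Sub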
 
