CSV semicolon delimited without header import and all data in first table column

Thibasch
New member · Joined Feb 23, 2023 · Messages: 3
Hi all,

I developed some VBA code to import a CSV file (semicolon delimited, no header line, no text qualifier) into an existing table (named VERIDASS_Imported_File) in an Access database.
My input file has 70 columns, as does the existing table. In the existing table the columns are named F1, F2, ..., F70 (initially they were named "Field1, Field2, ...", but that produced run-time error 2391).

The data imports, but everything ends up in the first column (the semicolon is not treated as a separator). I'm pretty sure it is just one small thing that I have to adapt, but I can't find it.
You'll find my code below:

Public Sub ImporterFichier()
    ' Open a dialog box to select a file
    Dim fd As FileDialog
    Set fd = Application.FileDialog(msoFileDialogFilePicker)
    fd.Title = "Sélectionner le fichier à importer"
    fd.InitialFileName = ""
    fd.Filters.Clear
    fd.Filters.Add "Fichiers CSV", "*.csv"
    fd.Filters.Add "Tous les fichiers", "*.*"
    If fd.Show = True Then
        ' If a file is selected, import it into the VERIDASS_Imported_File table:
        ' semicolon delimited, no header line, no text qualifier
        DoCmd.TransferText acImportDelim, , "VERIDASS_Imported_File", fd.SelectedItems(1), False, ""
    End If
    MsgBox "Import Successful!"
End Sub
Thanks in advance for your help and have a nice day

Thibaut
 
Data are well imported but everything in first column
You need to use an import specification. Among other things, the column separator is defined in that specification.

initially they were named "Field1, Field2, ..."
Do you want to import into an existing table with field names you assigned yourself?

If you import a text file (CSV) without field names, Access assigns field names automatically when it accesses the file, in exactly the way you experienced (F1, F2, F3, ...).

Possibilities:
- Import into a temporary table, then write its contents to the target table via an append query (mapping each source field to a target field).
- Add the line of field names to the CSV (this can also be done with VBA) and then import the whole thing with field names.
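The second possibility can be sketched in VBA like this. This is only a minimal sketch: the file paths and the procedure name PrependHeaderLine are illustrative (not from the thread), and the header line only covers the first three of the 70 columns for brevity.

```vba
' Sketch: copy the CSV into a new file with a header line prepended,
' so the subsequent import can run with field names.
' Paths and procedure name are illustrative assumptions.
Public Sub PrependHeaderLine()
    Dim srcFile As Integer, dstFile As Integer
    Dim textLine As String

    srcFile = FreeFile
    Open "C:\Data\VERIDASS.csv" For Input As #srcFile
    dstFile = FreeFile
    Open "C:\Data\VERIDASS_with_header.csv" For Output As #dstFile

    ' Header line: extend to F1;F2;...;F70 for the real file
    Print #dstFile, "F1;F2;F3"

    ' Copy the original rows unchanged below the header
    Do While Not EOF(srcFile)
        Line Input #srcFile, textLine
        Print #dstFile, textLine
    Loop

    Close #dstFile
    Close #srcFile
End Sub
```

The import would then point at the new file and pass True for HasFieldNames.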

My input file has 70 columns
That is a lot. Depending on the subsequent use, you may need some normalization steps, i.e. splitting the data into several tables and creating the keys that link those tables.
 
The default list separator in Windows is a comma. If you want to change the default you need to mess around with the registry - generally not a good idea for the odd file, but if all your files are semicolon delimited, it might be worth doing.

Other options:

Simply create a linked table to your file (provided the file name doesn't change, you only need to do this once) and use an append query.

Or create a schema.ini file before importing.

The benefit of a schema.ini is that you don't need to store the import specification - useful if your app is distributed. The downside is that you need to create it every time, because it is file-name specific - but that is not a problem if you use the same file name each time, as with a linked table - and in reality it takes but a moment to create.
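For reference, a schema.ini for the file described in this thread might look like the following. This is a sketch: the file name VERIDASS.csv is an assumption, and the schema.ini must sit in the same folder as the CSV it describes.

```ini
[VERIDASS.csv]
Format=Delimited(;)
ColNameHeader=False
```

The section name must match the CSV file name exactly; Format=Delimited(;) sets the semicolon as the separator, and ColNameHeader=False tells the text driver there is no header line.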
 
the benefit of a schema.ini is you don't need to store the import specification
When you save a specification, the definitions for reading the text file are stored in two system tables. Since a database application can handle tables quite well, the advantage of storing the same information in an external file instead - which would also have to be included in the distribution - is not obvious. At this point I don't see any.
 
The definitions for reading the text file are stored in two system tables for a specification.
Agreed - but those two IMEX tables need to be created by Access. Creating them manually or via VBA does not 'register' them with Access; they can only be created by creating an import specification. Once created, they can be edited if you know what you are doing. Yes, you can include them when distributing a front end, which is fine when there will be no variation. I have to cater for clients who receive the same type of data in multiple formats from different sources, and I don't need to get involved if the format or structure changes.

an external file instead, which would also have to be included in the distribution
A couple of lines of VBA code can create it as required - no need to distribute it.

but those two IMEX tables need to be created by Access
Access does what I want (hopefully).
Create an import specification of a DB in VBA - I can create and fill tables both by hand and by code. Of course, without a wizard I have to know exactly what I'm doing - but the same is true when writing a Schema.ini. Apart from you, who are probably fully at home in this material: who writes a suitable Schema.ini off the cuff and places it correctly in the file system?

But what I take from the application you mention: it is convenient there that only the necessary definitions - in the extreme case only the column separator - have to be specified via the Schema.ini, which increases the flexibility of the application.

The import specification, on the other hand, is always COMPLETE: all field names are named and fixed together with their data types. That result is then usually immediately usable in practice.
 
I can create and fill tables by hand
When I tried creating the tables by hand, they were not recognised as supplying an import specification. I'm on 2010, so perhaps this is resolved in later versions. (And yes, I do know what I'm doing :)

Who writes a suitable Schema.ini from the wrist and places it correctly in the file system?
It's done in VBA - as you say, in most cases all you need to state is the file name and the delimiter.
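A minimal sketch of that idea - the folder path, CSV file name, and procedure name WriteSchemaIni are assumptions for illustration:

```vba
' Sketch: write a minimal schema.ini next to the CSV it describes.
' Only the section name, separator and header flag are set here;
' further keys (column types etc.) could be added the same way.
Public Sub WriteSchemaIni(folderPath As String, csvName As String)
    Dim f As Integer
    f = FreeFile
    ' schema.ini must live in the same folder as the CSV
    Open folderPath & "\schema.ini" For Output As #f
    Print #f, "[" & csvName & "]"
    Print #f, "Format=Delimited(;)"
    Print #f, "ColNameHeader=False"
    Close #f
End Sub
```

Usage would be something like: WriteSchemaIni "C:\Data", "VERIDASS.csv" - called just before the import.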
 
The import only has to be done ONCE manually to create the proper import spec. This is the OLD type of spec, not the NEW type. You get to it from the wizard: before you reach the end, press the Advanced button. There you can specify the delimiter, and also name the spec. Do yourself a favour and keep the name on the short side.

Then whenever you use the TransferText method, you reference the import spec.

Rather than appending the data directly to the production table, you can resolve the column-name issue by using an append query. This also gives you some ability to clean or validate the data as it is imported. So I would link to the file and then run an append query to get it into the target table.
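A sketch of that link-then-append approach. It assumes an import specification named VERIDASS_ImportSpecification already exists; the linked-table name VERIDASS_Linked, the procedure name, and the two-column SELECT are illustrative (extend the field list to all 70 columns in practice):

```vba
' Sketch: link the CSV as a table, then append from it into the
' target table via a query. Names below are assumptions.
Public Sub LinkAndAppend(csvPath As String)
    ' Link (not import) the CSV, using the saved import specification
    DoCmd.TransferText acLinkDelim, "VERIDASS_ImportSpecification", _
        "VERIDASS_Linked", csvPath, False

    ' Append from the linked table into the target table,
    ' mapping source fields to target fields
    CurrentDb.Execute _
        "INSERT INTO VERIDASS_Imported_File (F1, F2) " & _
        "SELECT F1, F2 FROM VERIDASS_Linked;", dbFailOnError
End Sub
```

The append query is where cleaning or validation expressions would go.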
 
Hi ebs17,

Thank you for your idea. As the imported file's structure is always the same, I followed it, created an import specification and adapted my VBA code. However, even so, the imported data still ends up in the first column.

Public Sub ImporterFichier()
    ' Open a dialog box to select a file
    Dim fd As FileDialog
    Set fd = Application.FileDialog(msoFileDialogFilePicker)
    fd.Title = "Sélectionner le fichier à importer"
    fd.InitialFileName = ""
    fd.Filters.Clear
    fd.Filters.Add "Fichiers CSV", "*.csv"
    fd.Filters.Add "Tous les fichiers", "*.*"
    If fd.Show = True Then
        ' If a file is selected, import it into the VERIDASS_Imported_File table:
        ' semicolon delimited, no header line, no text qualifier
        DoCmd.TransferText acImportDelim, "VERIDASS_ImportSpecification", "VERIDASS_Imported_File", fd.SelectedItems(1), False, ""
    End If
    MsgBox "Import Successful!"
End Sub
I thought it would work
 
created an import specification
This is very likely not yet correct.

If in doubt, upload the CSV here (two or three data rows are enough) together with a database reduced to the table and your code (and that still contains the hidden system tables). Then the details can be assessed better.

@CJ_London:
Before we start talking past each other: at the moment we are taking very different approaches.

If only the column delimiter is defined, you will be able to view and process any CSV. So, structurally, if diverse or even arbitrary CSVs are supplied, that would be a good first step towards accessing and processing them.

But for me, import does not mean dragging any file into my database as any table and being happy about this "success". An import is when the data from the external file ends up in the tables that correspond to my planned data model and can therefore be used directly and in context. Thus, data should be in the right fields, with the right names and the right data types, and comply with all defined validity rules.
For this, a higher degree of standardization of such external data-supply files should be aimed for and adhered to as closely as possible. In other words: standardization instead of arbitrariness.
So, if the structure of the CSV is kept constant, much more functionality can be put into the first access to the file. Processing is more direct, faster and less error-prone.

My practice is to link such external files and write the data into the table(s) via queries.
Importing the source table as-is only makes sense to me if, for example, I have to index specific fields in it in order to achieve acceptable processing times for larger amounts of data.
 
I merely suggested that there is more than one way to break an egg; I'm not pushing the alternatives as better - just use the right tool for the job. The OP has not acknowledged me anyway, so happy for you to have the floor :)

However, I agree with you - it sounds like the OP has not created a valid specification. Since the spec is created as part of the import process, it would be useful to know whether they actually finished that process and checked that it did what was required.
 
I merely suggested that there is more than one way to break an egg
That's right. In addition, since Access 2007 there are also saved imports, again with their own peculiarities - new advantages (e.g. a specification for Excel) as well as their own disadvantages.
 
Must admit, my standard process is to use a SQL query to link to the source file and append/update/upsert as required.
 
