Table import from text file that 'loses' spaces

A systematic analysis would separate the steps of the import:

1) Reading the CSV via query/link: do all the spaces arrive intact?
2) Testing the database table: can it store spaces as desired with the given field definitions? Did you test that manually?

You got the hints, but didn't really pay attention to them.

why the native TransferText function didn't import correctly
A standard method handles standard cases. Preserving spaces was evidently not something the designers considered standard behaviour.
 
No, you just use Nz(somefield, " ") on all text fields in the append query.

When you use a RAD tool, you get what you get. This particular tool will NOT import ZLS values. If you were using a fixed file format, I think the import would import the ZLS. Most people think this is a good thing. I've never run into a situation where I actually wanted to keep "spaces" from the imported data.

So, if the Nz() on each text field doesn't do it, you'll need to roll your own import. This will be a PITA because although you can use FSO to read a .csv file and load each record into an array using the split function, the quotes become part of the data and you will need to get rid of them as you write the record to the database table.
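A minimal sketch of such a hand-rolled import, assuming a semicolon delimiter and made-up table/field names (Items, fld1..fld3) that would need to match the real structure:

Sub ImportCsvKeepSpaces(ByVal filePath As String)
    ' Read the .csv with FSO, split each record on the delimiter and strip
    ' the surrounding double quotes ourselves, so space-only values survive.
    Dim fso As Object, ts As Object
    Dim db As DAO.Database, rs As DAO.Recordset
    Dim parts() As String, i As Long

    Set fso = CreateObject("Scripting.FileSystemObject")
    Set ts = fso.OpenTextFile(filePath, 1)              ' 1 = ForReading
    Set db = CurrentDb
    Set rs = db.OpenRecordset("Items", dbOpenDynaset)   ' placeholder target table

    ts.SkipLine                                         ' skip the header row
    Do While Not ts.AtEndOfStream
        parts = Split(ts.ReadLine, ";")                 ' naive split: assumes no ";" inside quoted values
        For i = LBound(parts) To UBound(parts)
            ' drop the text delimiter but keep whatever is inside, including spaces
            If Len(parts(i)) >= 2 And Left$(parts(i), 1) = """" And Right$(parts(i), 1) = """" Then
                parts(i) = Mid$(parts(i), 2, Len(parts(i)) - 2)
            End If
        Next i
        rs.AddNew
        rs!fld1 = parts(0)                              ' placeholder field names
        rs!fld2 = parts(1)
        rs!fld3 = parts(2)
        rs.Update
    Loop

    ts.Close
    rs.Close
End Sub

Whether a space-only value written this way is actually kept also depends on the table's field definitions, so it is worth testing that assignment manually first, as suggested above.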
 
so I'll have to do it all again by hand
Fact of life. Of course, you could just quickly write a routine to create the append query - then you'll have it for good.
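A rough sketch of such a routine, assuming the imported/linked source table and the destination table use the same field names (the names below are placeholders):

Sub BuildAppendQuery(ByVal srcTable As String, ByVal tgtTable As String)
    ' Generate an INSERT ... SELECT that wraps every text field in Nz(field, " ")
    ' so values that arrived as Null are turned back into a single space.
    Dim db As DAO.Database, fld As DAO.Field
    Dim fieldList As String, selectList As String

    Set db = CurrentDb
    For Each fld In db.TableDefs(tgtTable).Fields
        fieldList = fieldList & ", [" & fld.Name & "]"
        If fld.Type = dbText Or fld.Type = dbMemo Then
            selectList = selectList & ", Nz([" & fld.Name & "], "" "")"
        Else
            selectList = selectList & ", [" & fld.Name & "]"
        End If
    Next fld

    On Error Resume Next
    db.QueryDefs.Delete "qryAppendImport"        ' ignore the error if it does not exist yet
    On Error GoTo 0

    db.CreateQueryDef "qryAppendImport", _
        "INSERT INTO [" & tgtTable & "] (" & Mid$(fieldList, 3) & ") " & _
        "SELECT " & Mid$(selectList, 3) & " FROM [" & srcTable & "];"
End Sub

Note this only restores a single space; the original number of spaces in a field is already gone by the time the data reaches the linked table.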
 
@Pat Hartman:
Space <> ZLS <> NULL

You should take that into account. As I understand it, the whole point was preserving the existing spaces.
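A quick way to see the difference in the VBA Immediate window:

' a space, a zero-length string and Null are three different things
? " " = ""                  ' False: a space is not a zero-length string
? Len(" "), Len("")         ' 1  0
? IsNull(" "), IsNull("")   ' False  False
? IsNull(Null)              ' True
? Nz(Null, " ") = " "       ' True: Nz can turn a Null back into a single space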
 
A systematic analysis would separate the steps of the import:

1) Reading the CSV via query/link: do all the spaces arrive intact?
2) Testing the database table: can it store spaces as desired with the given field definitions? Did you test that manually?

You got the hints, but didn't really pay attention to them.


A standard method handles standard cases. Preserving spaces was evidently not something the designers considered standard behaviour.

1) No
2) Yes

A standard method handles standard cases => so the TransferText function didn't import correctly

"..You got the hints, but didn't really pay attention to them..." ???
Which hints exactly are you referring to?
 
No, you just use Nz(somefield, " ") on all text fields in the append query.

When you use a RAD tool, you get what you get. This particular tool will NOT import ZLS values. If you were using a fixed file format, I think the import would import the ZLS. Most people think this is a good thing. I've never run into a situation where I actually wanted to keep "spaces" from the imported data.

So, if the Nz() on each text field doesn't do it, you'll need to roll your own import. This will be a PITA because although you can use FSO to read a .csv file and load each record into an array using the split function, the quotes become part of the data and you will need to get rid of them as you write the record to the database table.

Is a string of 10 space characters a zero-length string?
I don't believe so.
It is evidently a defect of the native TransferText command.
I say this because the export procedure correctly writes the contents of the space-only fields between double quotes,
exactly as required by the export specification.
My question was to understand whether there is any special parameter that can be passed to the import procedure.
If not, I think there is no other way than to redo the import procedure without using TransferText.
 
If I take that as given, then the text file content is already being read incorrectly, so one would have to start there.

Looking at the sample in #1, it is a text file with fixed-length fields. The only exception is the header row, but that can be removed.
Fixed lengths are defined lengths. I would therefore try padding incomplete reads with spaces using the Space() or String() function, of course in a query embedded in an append query.
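As a rough illustration of that padding idea (LinkedCsv, Items, fld1 and fld2 are made-up names, and 10 is an assumed fixed field width):

' Pad short or Null reads back out to the defined width while appending
DoCmd.SetWarnings False
DoCmd.RunSQL _
    "INSERT INTO Items (fld1, fld2) " & _
    "SELECT fld1, Left(Nz(fld2, '') & Space(10), 10) " & _
    "FROM LinkedCsv;"
DoCmd.SetWarnings True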

"So I would dare to try" ... I can't, of course, because I'm not recreating environments that are foreign and available elsewhere.

Overall, I don't understand the intended meaning and use of these spaces, and I'm sure I'm not the only one.

TransferText function didn't import correctly
No. It imports differently than you imagine.
 
If I take that as given, then the text file content is already being read incorrectly, so one would have to start there.
If with "..Reading the CSV via query/link.." you mean a command like this
DoCmd.TransferText acLinkDelim, "my_spec","myTable", file_name, True
then yes, the space is lost

Looking at the sample in #1, it is a text file with fixed-length fields. The only exception is the header row, but that can be removed.
Fixed lengths are defined lengths. I would therefore try padding incomplete reads with spaces using the Space() or String() function, of course in a query embedded in an append query.
Neither the export nor the import was performed with fixed-length fields; a special character is used as the field delimiter.

"So I would dare to try" ... I can't, of course, because I'm not recreating environments that are foreign and available elsewhere.

Overall, I don't understand the intended meaning and use of these spaces, and I'm sure I'm not the only one.
This is irrelevant; it has nothing to do with the question posed.

No. It imports differently than you imagine.
No.
The import procedure works incorrectly: if the designers' intent had been "..all fields containing only space characters will be lost..", then the export procedure would also lose the space characters, and it does not.
 
I say this because the export procedure correctly writes the contents of the space-only fields between double quotes
This is the important part.

Text fields in a CSV should be enclosed in double quotes so the parser can be sure they are text.

What is the source of your csv? Did you create it yourself? Can you influence whoever/whatever created it to make sure all text fields are properly delimited?
 
This is the important part.
Text fields in a CSV should be enclosed in double quotes so the parser can be sure they are text.
An example of the csv file that I would like to import is the one attached to the first post.
The text field containing a space is highlighted in red.

What is the source of your csv? Did you create it yourself? Can you influence whoever/whatever created it to make sure all text fields are properly delimited?
That file was created by the export procedure to export a db data table, with a command like:
DoCmd.TransferText acExportDelim, "my_spec", "Articoli", file_name, True
 
An example of the csv file that I would like to import is the one attached to the first post
OK, so your space is quote delimited and still the import gets rid of it - I misunderstood what you meant in the part of your reply which I quoted in Post #29.

Yes, it appears to be a deficiency in the Access import procedure that it does not respect the data being fed to it.

A manual solution appears to be your only option.
 
A basic question: do you need the space itself, or do you really need a char (fixed-length) data field instead of a varchar data field?
The export looks like an export of a table with char data fields.
 
it is a text file with fixed-length fields
Not quite - the first row contains a date value in the time column; the rest of the rows just show a time value.

However it is irrelevant to the question which has been answered many times - which is that fields with only spaces as a value get imported as null.

Whether this was intended by design or simply not considered is anyone's guess - you would need to put the question to the original designers. I've tried using other (database) methods for importing/viewing text files and all produce the same result - space-only fields all return null.

I did try opening my text file example in Excel.

With the other prompts (field type, etc.) this then opens the file with a space in fld2. However, if the file is subsequently saved from Excel, the file format is modified.

And it doesn't really prove anything, since Excel does not have strong data typing whereas databases do.

Since the OP is responsible for generating the file and is not keen to 'manually' correct for spaces, I would suggest changing the space 'marker' to a suitable character instead (perhaps a pipe or another character outside of normal usage).
 
I'm using the TransferText function to import the data contained within a file; basically:

DoCmd.TransferText acImportDelim, "my_spec", "Items", file_name, True

In my_spec these variables are set:
DateFourDigityear = true
DateLeadingZeroes = true
DateOrder = false
DecimalPoint = ,
FieldSeparator = ;
TextDelim = "
TimeDelim = :

(attachment: example text file)

I would expect the contents of the fields in the Items table to be exactly the same as what the text rows contain.
But it is not so: some fields that previously contained spaces (for example ...;" ";" ";...) are empty after the import.
And this causes me problems, because some queries rely on the presence of the space character inside some fields.

The question is: how do I prevent the import command from treating fields that contain only spaces as empty?
Personally, I hate specs. They are not reliable.

I usually link to the Source from a query rather than as a linked Table. I will call that a base query. I will then create a second query to do all the data cleansing and go from there.
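A minimal sketch of that pattern; the folder, file name and field names are assumptions, and a semicolon-delimited file would normally need a matching schema.ini in the same folder for the Text driver to parse it correctly:

' Base query: read the csv directly via the Text ISAM
CurrentDb.CreateQueryDef "qryCsvBase", _
    "SELECT * FROM [Text;FMT=Delimited;HDR=Yes;DATABASE=C:\Import].[Articoli#csv];"

' Cleansing query: do the fix-ups on top of the base query,
' e.g. turn the Nulls that used to be space-only fields back into single spaces
CurrentDb.CreateQueryDef "qryCsvClean", _
    "SELECT fld1, Nz(fld2, ' ') AS fld2clean FROM qryCsvBase;"

(Re-running this errors once the queries already exist; delete or rename them first.)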
 
Personally, I hate specs. They are not reliable.
They've always worked fine for me. If the file format is unreliable, the spec can't fix it if that is what you are expecting it to do.
 
I can't swear to this of course, but it seems to me that earlier versions of Access imported spaces, at least in fixed-width files, which is what I was getting as an export from the mainframe application I was interfacing with. My problem at the time was getting rid of the spaces rather than keeping them. I was new to Access, so this was quite a while ago, and I know the trailing spaces were being imported; otherwise, I wouldn't have spent so much time trying to get rid of them.

I tried a bunch of stuff today and I can't get Access to import or link using TransferText or the GUI import/export and keep the spaces. Trailing spaces are dropped and that means fields with only spaces end up as null. I didn't try code using FSO. I did look at the files using a hex editor and the bytes were actually space characters.

The other problem, if you insist on preserving the trailing spaces, is that when you type them into a table/query/form, Access ignores them.
 
The question is: how do I prevent the import command from treating fields that contain only spaces as empty?
Is there any chance you can simply search and replace all spaces first (before import) with some other special character that won't be dropped like the space will? Then convert back to spaces once imported into Access, as desired.
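A hedged sketch of that pre/post step, using a pipe as the stand-in character and assuming space-only fields always appear as a single quoted space (table, field and spec names are placeholders):

Sub ImportWithSpaceMarker(ByVal filePath As String)
    Dim fso As Object, ts As Object, txt As String

    ' 1) Before import: swap quoted space-only fields for a pipe marker
    '    (this rewrites the file in place)
    Set fso = CreateObject("Scripting.FileSystemObject")
    Set ts = fso.OpenTextFile(filePath, 1)            ' ForReading
    txt = ts.ReadAll
    ts.Close
    txt = Replace(txt, """ """, """|""")              ' " " -> "|"
    Set ts = fso.CreateTextFile(filePath, True)       ' overwrite
    ts.Write txt
    ts.Close

    ' 2) Import as usual
    DoCmd.TransferText acImportDelim, "my_spec", "Items", filePath, True

    ' 3) After import: turn the markers back into spaces (repeat per text field)
    CurrentDb.Execute "UPDATE Items SET fld2 = ' ' WHERE fld2 = '|';", dbFailOnError
End Sub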
 
Is there any chance you can simply search and replace all spaces first (before import) with some other special character that won't be dropped like the space will? Then convert back to spaces once imported into Access, as desired.

Yes, this is also a possibility.
But if you have to do some manual work anyway, you might as well redo the import routine completely.
 
