Solved Key Violation Error Trapping During TransferText Operation

CranstonSnord

I have reviewed many threads across this site and elsewhere, but didn't see an answer that fit my exact situation.

I have a form with a multi-select listbox that lets me easily import and export table content from my database using an import/export specification. Here is a small snippet of the code I use to perform the import:

Code:
ElseIf Me!ogChoice.Value = 1 Then ' (Import)
    ' If DCount("*", "[" & Me!lstMetadata.ItemData(varItem) & "]") > 0 Then
    '     MsgBox "Stop! Make sure these rows" & vbCr & "don't already exist in this table. ", vbCritical, gstrAppTitle
    ' Else
        DoCmd.TransferText acImportDelim, Me!lstMetadata.ItemData(varItem) & "_IMEX_Spec", _
            Me!lstMetadata.ItemData(varItem), _
            "C:\Temp\Initial_Dev\" & Me!lstMetadata.ItemData(varItem) & ".txt"
    ' End If

If the target table already has content and the file being imported contains a duplicate, Access raises the dreaded "key violations" error. I have tried to trap this error both in the executing sub and in the Form_Error sub in order to make it more user-friendly, but to no avail.

As can be seen above, for the moment (and perhaps going forward) I've added a simple DCount test to see whether the target table already has content, and I block the import if it does; but this also prevents me from pulling in non-matching content. Can a key violation error generated during a TransferText operation be trapped?
 
Can a key violation error generated during a TransferText operation be trapped?
Quite easy with a small variation: link the text file directly within a query. That lets you write an append query that inserts only the records that do not already exist.
Sample query
Code:
Dim db As DAO.Database
Dim sSQL As String

Set db = CurrentDb
sSQL = "INSERT INTO DestinationTable ([FieldList])" & _
    " SELECT T.[FieldList]" & _
    " FROM [Text;DSN=NameSpec;FMT=Delimited;HDR=yes;IMEX=2;CharacterSet=850;DATABASE=C:\Temp\Initial_Dev\].[datei01.txt] AS T" & _
    " LEFT JOIN DestinationTable AS D ON T.Key = D.Key WHERE D.Key Is Null"
' Debug.Print sSQL
db.Execute sSQL, dbFailOnError
You can incorporate the variable parts into the SQL statement using VBA.
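
For instance, a sketch of how the variable parts from the original TransferText call could be spliced in (sTable is a hypothetical variable; the naming scheme follows the snippet in post #1, and [FieldList] and T.Key/D.Key still need to be replaced with the real column names):

```vba
Dim db As DAO.Database
Dim sTable As String
Dim sSQL As String

Set db = CurrentDb
sTable = Me!lstMetadata.ItemData(varItem)   ' table name selected in the listbox

sSQL = "INSERT INTO [" & sTable & "] ([FieldList])" & _
    " SELECT T.[FieldList]" & _
    " FROM [Text;DSN=" & sTable & "_IMEX_Spec;FMT=Delimited;HDR=no;IMEX=2;" & _
    "CharacterSet=850;DATABASE=C:\Temp\Initial_Dev\].[" & sTable & ".txt] AS T" & _
    " LEFT JOIN [" & sTable & "] AS D ON T.Key = D.Key" & _
    " WHERE D.Key Is Null"
db.Execute sSQL, dbFailOnError
```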
 
Wow, thanks for your quick response!

All the above looks really cool, but there are some moving parts that I don't know how to fit into the puzzle. When I generate the sSQL string, it looks like this (for my test sample table):
INSERT INTO AL_TR_AL_Link (ALBUM_ID, TRACK_ID, ARTIST_ID, LOCN_ID) SELECT T.ALBUM_ID, T.TRACK_ID, T.ARTIST_ID, T.LOCN_ID FROM [Text;DSN=AL_TR_AL_Link_IMEX_Spec;FMT=Delimited;HDR=no;IMEX=2;CharacterSet=850;DATABASE=C:\Temp\].[AL_TR_AR_Link.txt] AS T LEFT JOIN AL_TR_AL_Link AS D ON T.Key = D.Key WHERE D.Key Is Null

I'm familiar with an INSERT query and I love the idea, but I'm not sure how to create the Text DSN; I thought it was just another way of referring to my IMEX specification, but apparently not. If I could figure out how the DSN part of the statement above works, this approach would work like a champ.
 
Thanks Pat. The only portion of the solution that @ebs17 posted that I'm still unsure of is how to create the DSN NameSpec. Would I have to create a different DSN for each and every different table that I'm importing?
 
...;DSN=NameSpec;...
This corresponds to your Me!lstMetadata.ItemData(varItem) & "_IMEX_Spec"; you should already have these specifications, so insert that name at the marked place. The import specification defines how the text file is to be read as a table. If you have text files with different structures, you also need different specifications.
 
Got it. I have already created the IMEX spec, so that's not the issue. Here's what sSQL looks like using my data:

INSERT INTO AL_TR_AL_Link (ALBUM_ID, TRACK_ID, ARTIST_ID, LOCN_ID) SELECT T.ALBUM_ID, T.TRACK_ID, T.ARTIST_ID, T.LOCN_ID FROM [Text;DSN=AL_TR_AL_Link_IMEX_Spec;FMT=Delimited;HDR=no;IMEX=2;CharacterSet=850;DATABASE=C:\Temp\].[AL_TR_AR_Link.txt] AS T LEFT JOIN AL_TR_AL_Link AS D ON T.Key = D.Key WHERE D.Key Is Null

And this is the result when the code attempts to execute the query:

[screenshot attachment: error dialog]
 
The same import specification that you would use with DoCmd.TransferText is required. If an error occurs here, the same error would occur with your original code above.

And for reassurance: the name of an import specification is not the same thing as the name of a saved import, which is created when you save a one-time manual import (including its specification) at the end of the wizard dialog.
 
The spec absolutely exists, because I have already imported/exported with it many times successfully.
[screenshot attachment: the existing import/export specifications]

However, Access only returns the 3625 error code when attempting to use it to link to the text file. I will do some Googling on my end to see if I can better understand how to open a text file as an Access table.

Thank you for all your help!!! I don't want to waste too much of your time - since this isn't a production database, it's not the end of the world if I can't get it to work exactly as I might like. Thank you again for the idea; I will continue to play with the code on my end to see if I can get it to work.
 
DoCmd.TransferText acLinkDelim,...
This would link the text file as a table. If that works, the query must also work. It's the same technique.
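
For completeness, the link-then-append variant can be sketched like this (not the poster's actual code; "tblTextLink" is a hypothetical name for the temporary linked table, and the join on ALBUM_ID alone is an assumption, since a composite key would need all key columns in the join and the Is Null test):

```vba
Dim db As DAO.Database
Set db = CurrentDb

' Link the text file as a table, using the existing import specification
DoCmd.TransferText acLinkDelim, "AL_TR_AL_Link_IMEX_Spec", "tblTextLink", _
    "C:\Temp\AL_TR_AR_Link.txt"

' Append only rows whose key is not yet in the destination table
db.Execute "INSERT INTO AL_TR_AL_Link (ALBUM_ID, TRACK_ID, ARTIST_ID, LOCN_ID)" & _
    " SELECT T.ALBUM_ID, T.TRACK_ID, T.ARTIST_ID, T.LOCN_ID" & _
    " FROM tblTextLink AS T LEFT JOIN AL_TR_AL_Link AS D" & _
    " ON T.ALBUM_ID = D.ALBUM_ID WHERE D.ALBUM_ID Is Null", dbFailOnError

' Remove the temporary link again (the text file itself is untouched)
DoCmd.DeleteObject acTable, "tblTextLink"
```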
 
I would like to point out once again that the suggestion in #2 is not untested "air code": provided the necessary details are observed, it is a working solution that I use often in exactly this form.

At best, the problem lies in the concrete individual implementation.
If in doubt, you can upload a sample database here with the (empty) table, the import specification, and an example text file, so that a third party can look at the overall situation. Discussing the problem is easier when everyone has the same objects in front of them.
 
Thank you once again for all your time and help. Unfortunately I'm enough of a newbie that I have no clue how to upload an empty database with the necessary objects to this site, otherwise I would do so.

I'm starting to wonder if I have a too-old version of Access (MS Office Professional 2010, Version 14.0.7268.5000) or my operating system (Windows 7) is not up to the task. I very much like your solution, and hope that by playing around with the code I can convince my installation of Access to connect to a text file so I can implement it.

If I'm not successful, no harm done, it's not the end of the world if this particular portion of my application doesn't work exactly as I'd like. Once again, many thanks for your patience and sharing of a great idea. :geek: At this point, I'm willing to mark this question as "Solved" and move on.
 
After much dinking around, I was able to get Access to link to the text file using my IMEX spec in one statement, and then use the linked table in your INSERT query to proceed. WooHoo! I am very pleased - thanks again for sharing your knowledge.
 
"Efficient" is actually getting the error and ignoring it.
This needs some explanation for me.
Have you ever driven a car? Did you hit a wall and ignore it?
I think it is better to prevent errors from occurring in the first place rather than ignoring them, and in some cases ignoring them can be very difficult.

Specifying the file name the way ebs did is NOT more efficient than simply linking to the text file.
But it is no less efficient either.
There are two reasons why I like to use the variant shown (connection data directly in the query).

1) The text file is linked only for the moment of query access, so I save the explicit linking and unlinking steps. This makes it easy to process entire batches of import files with a simple loop.
And if the text file were later removed, a dangling link to it would not be a good situation either. Excel tables can be processed analogously.
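
Such a batch loop might look like this sketch (the folder path and the file-name-equals-table-name convention are assumptions taken from the thread):

```vba
Dim db As DAO.Database
Dim sFile As String
Dim sTable As String

Set db = CurrentDb
sFile = Dir("C:\Temp\Initial_Dev\*.txt")
Do While Len(sFile) > 0
    sTable = Left$(sFile, Len(sFile) - 4)   ' strip ".txt" to get the table name
    db.Execute "INSERT INTO [" & sTable & "] ([FieldList])" & _
        " SELECT T.[FieldList]" & _
        " FROM [Text;DSN=" & sTable & "_IMEX_Spec;FMT=Delimited;HDR=no;" & _
        "DATABASE=C:\Temp\Initial_Dev\].[" & sFile & "] AS T", dbFailOnError
    sFile = Dir()   ' next file; no links to clean up afterwards
Loop
```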

2) I don't have to link both the backend table and the text file in my frontend in order to access both at the same time. As is well known, a DAO reference or an ADODB connection can only point at one backend at a time. If I link to the second source directly in the query, I don't need a third database into which both external backends are linked.

Overall, this gives me more flexibility.
 
Your explanation doesn't explain anything to me.

Either we talk about Access, since we are in an Access forum, or we talk about something else. Stirring everything together with a large mixing spoon doesn't make much sense.
Simple things like not checking for a duplicate before inserting a row actually saves a lot of time once you are in a busy environment.
I do not believe that. How can you say that without measuring it?
Not checking tends to save the developer's intellectual effort. Is that the desired efficiency?
An error is an error, including an index error. That's how I see it, and @CranstonSnord sees it that way too.
It may be that you can often get away with ignoring errors. Anyone who consciously allows that can live with the consequences.

if the BE is SQL Server
That would be a different topic, namely the use of T-SQL.

However, the queries above with linked tables are Jet SQL, and since Jet 4.0 (MS Access 2000) Jet SQL has allowed external databases to be integrated directly into a query formulation. Jet SQL executes locally, even with linked tables on a SQL Server.
 
"Efficient" is actually getting the error and ignoring it.
Allowing and then ignoring errors is an expression of efficiency? Of course the database engine rejects anything invalid, and the database doesn't collapse right away.
But what about data integrity, such as complete imports?
Has the subsequent processing of error tables (assuming they recorded all errors) been included in the efficiency analysis?

The focus here was on defending against or avoiding unwanted duplicates, and thus index errors from violations of a unique index.
Can someone now explain how to ignore exactly this error while still taking other errors into account (index errors from RI violations, data type errors, conversion errors, validation rule violations)?
But if the database engine is going to trap certain errors, it is hubris to think that you must trap them also to "save the day".
How can you be sure that the only errors that occur are ones for which the database engine's own error handling is sufficient?
Or do we forget about errors altogether so that an import is quick and therefore "efficient", because correct and complete data no longer matters?

As far as I know, dbFailOnError causes a rollback in the event of an error, i.e. the table is reset to its initial state. You identify the error, correct it in a suitable way, and run the query again from the beginning. That is easier than analyzing and cleaning up the table afterwards when an import is aborted and incomplete.
Expected errors can be avoided during the import process, and violations of a unique index are expected errors, simply because a double import can occur (same data in different files, or a repeat import of the same file).
If I force a rollback because of expected problems, progress and efficiency are gone.
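
For an all-or-nothing import, the Execute can additionally be wrapped in an explicit transaction; a sketch, assuming sSQL was built as shown earlier:

```vba
Sub ImportAllOrNothing(ByVal sSQL As String)
    Dim ws As DAO.Workspace
    Dim db As DAO.Database

    Set ws = DBEngine(0)
    Set db = CurrentDb

    ws.BeginTrans
    On Error GoTo RollbackImport
    db.Execute sSQL, dbFailOnError
    ws.CommitTrans
    Exit Sub

RollbackImport:
    ws.Rollback   ' destination table is left in its initial state
    MsgBox "Import failed: " & Err.Description, vbExclamation
End Sub
```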

It's a question of style and it has practical implications if you strive to be error-free. What's not to understand there?
 
We are talking here about forms not duplicating
No, the topic was importing from a text file into a table. Where does a form appear, and what kind? At most, I need a button click to invoke the import action.
If you leave this film and switch to another one, you can of course tell a different story. But if you do something like that, you should also say so EXPLICITLY.
It is not convincing to accuse others of lack of clarity and stupidity when you yourself are driving the wrong way down the road.

We are talking about importing a file in a situation where the import may contain duplicates. We are not talking about anything else.
I would never be so presumptuous as to assume that there is exactly one error I have to deal with. By that logic you would never need general error handling anywhere; you would get along just fine with the one famous error you know.

In my mind, this is about importing from an environment with weaker standardization into a database table (or several tables) with stricter standardization. There are systematic problems there.
There are plenty of threads in this forum where simply pressing the standard button when importing from a text file was not enough; there are many opportunities to encounter errors.

"We are talking about" => no, you are having your own conversations elsewhere.

My point is that I/O is the most expensive operation your code performs.
In practice, the proposed additional test only needs to be questioned if it produces a NOTICEABLE increase in runtime, and I have never experienced that in my practice.
Granted, importing from a 2 GB file would require some different considerations. In terms of the technique required, it makes a difference whether I pick up an earthworm or an elephant.
 
Your solution BTW only dealt with duplicates
Code:
db.Execute "AppendQuery", dbFailOnError
This form of execution also handles other possible errors. Note that RunSQL or OpenQuery does not appear there.
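
That difference can be shown in a sketch: Execute with dbFailOnError raises a trappable VBA error, and RecordsAffected reports what was actually appended (db is assumed to be an open DAO.Database, as above):

```vba
On Error Resume Next
db.Execute "AppendQuery", dbFailOnError
If Err.Number <> 0 Then
    MsgBox "Append failed (" & Err.Number & "): " & Err.Description
Else
    MsgBox db.RecordsAffected & " new record(s) appended."
End If
On Error GoTo 0
```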

You have to run a VBA loop
Isn't that a recommendation regarding efficiency?

why in this specific case, it is OK to allow the database engine to catch the duplicates
However, it takes extra effort to limit the import to this specific case.

I'm sure you can find timings done by someone that tell you the cost of a set statement or an identity or an If or a select.
I would be happy to see that, but I'm skeptical, simply because the execution plan is a different thing from my SQL statement.
I have never seen convincing measurements in this direction.
 
SQL Server table
Where does this suddenly come from?
If I want to communicate efficiently with SQL Server, T-SQL and T-SQL methods are required. An MS Access method such as TransferText with Jet SQL is then out of consideration, or at least not the first choice.

My experience of less than 50 years teaches me to know what I am touching before selecting methods for necessary actions.
 
It must be difficult to never be able to learn from others.
That is a very arrogant statement.
Any expert will confirm that you should keep a healthy amount of residual skepticism even about their statements.
My experience (yes, it does exist) shows me which experts I follow almost unconditionally and where, on the other hand, I differentiate statement by statement.
 
