TransferText : Lookup and remove duplicate records automatically

leongkeat · Registered User · Joined Dec 19, 2007 · Messages: 15 · Local time: Today, 11:49
Dear all,

I just attended a VBA programming class, but I am still very, very new to VBA code.

I have one table that consists of two fields (Field1, DateAdded):

Field1    DateAdded
------    --------------
ABC       1/5/2009 10:30
ABC       1/5/2009 12:30

P/S: The default value of DateAdded is set to Now()

I import a text file every day that consists of [Field1] data only. I would like the Access program to check whether duplicate data was imported, and then immediately delete whichever record is earlier (based on DateAdded).

Take the above case, for example: only the ABC 1/5/2009 12:30 record should be retained in the table. How should I go about it?

The following is the code I used.

Code:
Private Sub Import()
 
 Dim myfile As String
 Dim db As Database
 
 Set db = DBEngine(0)(0)
 
    ' OpenFile is a custom file-picker helper; it returns "" if cancelled
    myfile = OpenFile("c:\", "txt", "*.txt", 0)
    If myfile = "" Then
      Exit Sub    ' Exit Sub is safer than End, which halts all running code
    End If
 
On Error GoTo ImportPREVV_Err
    DoCmd.SetWarnings False
    ' Import the file the user actually picked, not a hard-coded path
    DoCmd.TransferText acImportFixed, "PREVV", "PREVV", myfile, False, ""
 
ImportPREVV_Exit:
    DoCmd.SetWarnings True    ' always restore warnings on the way out
    Exit Sub
ImportPREVV_Err:
    MsgBox Error$
    Resume ImportPREVV_Exit
 
End Sub
 
Do you want to actually delete it? Or just retrieve the latest value?

To retrieve the latest value would be best IMHO, just make a query to get the max value and export that query.
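For instance, the group-by query could look something like this (assuming the table is named PREVV, as in the import code above):

Code:
SELECT Field1, Max(DateAdded) AS LatestDate
FROM PREVV
GROUP BY Field1;

That gives one row per Field1 with only the most recent DateAdded, which you can then export.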
 
Well noted, but I actually want to delete it, because once the new records come in we no longer need the old records. If we keep them, the database is going to get really HUGE~

Could anyone help?

P/S: By the way, could you please enlighten me, what is "IMHO"?
 
A common way to delete duplicates is to have an autonumber field in the table. Then use a GROUP BY to establish which columns constitute a duplicate. For example, let's say a dup is any two records with the same last name and first name:

DELETE FROM Customers WHERE AutoNum NOT IN
(SELECT Max(AutoNum) FROM Customers GROUP BY FirstName, LastName)

I suppose it doesn't have to be autonumber, but it would have to be some kind of a unique identifier, perhaps a Customer ID# of some kind.
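For the table in the original post you may not even need an autonumber: DateAdded itself can act as the tie-breaker. A sketch, assuming the table is named PREVV and no two rows with the same Field1 share an identical DateAdded:

Code:
DELETE FROM PREVV
WHERE DateAdded NOT IN
    (SELECT Max(P2.DateAdded) FROM PREVV AS P2
     WHERE P2.Field1 = PREVV.Field1);

The correlated subquery finds the latest DateAdded for each Field1, and every row that isn't the latest gets deleted.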
 
IMHO = In My Humble Opinion

The best way to deduplicate this is actually pretty easy...

1)
make a group by query that selects the maximum date/time per "Field1"

2)
"Outer join" query 1 to your table and delete any records that do not match, no need to add an autonumber to your table :)

An alternative would be to import your data to a staging/import table. Then first delete any "Field1" that is in your staging table from your target table before appending your data to the target.
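In Access SQL, step 1 above might look like this (table name PREVV assumed, saved as a query called qryLatest):

Code:
SELECT Field1, Max(DateAdded) AS LatestDate
FROM PREVV
GROUP BY Field1;

and step 2 like this:

Code:
DELETE FROM PREVV
WHERE DateAdded NOT IN
    (SELECT LatestDate FROM qryLatest
     WHERE qryLatest.Field1 = PREVV.Field1);

One caveat: Access can refuse a DELETE joined directly to an aggregate query ("Could not delete from specified tables"), which is why this sketch uses a subquery rather than a literal outer join.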
 

Thanks Jal and Namliam for your swift responses.

Hmm... I'd rather not add any autonumbering; I usually run into challenges when I want to reuse the table elsewhere because of autonumber restrictions. :p

Namliam,

I can understand 1). But how do I delete the records that do not match? Match on what, DateAdded or Field1?

I will assume it is DateAdded? But how? :confused:
 
hmz, on second thought.... use a staging table during import... much easier ;)
 
not that profound... widely used...

1) Import the file to a temporary, loading (staging) table.
2) Compare your staging table to your target; in this case, delete anything that needs deleting.
3) Append your data to the target table.
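A minimal sketch of those three steps, assuming a staging table tblStaging and a target table tblTarget (both names hypothetical) with matching fixed-width import spec, and reusing the OpenFile helper from the import code earlier in the thread:

Code:
Private Sub ImportViaStaging()

    Dim db As DAO.Database
    Dim myfile As String

    Set db = DBEngine(0)(0)
    myfile = OpenFile("c:\", "txt", "*.txt", 0)
    If myfile = "" Then Exit Sub

    ' 1) Load the file into the (emptied) staging table
    db.Execute "DELETE FROM tblStaging", dbFailOnError
    DoCmd.TransferText acImportFixed, "PREVV", "tblStaging", myfile, False, ""

    ' 2) Delete any Field1 from the target that is about to be replaced
    db.Execute "DELETE FROM tblTarget WHERE Field1 IN " & _
               "(SELECT Field1 FROM tblStaging)", dbFailOnError

    ' 3) Append the new rows; DateAdded fills in via its Now() default
    db.Execute "INSERT INTO tblTarget (Field1) " & _
               "SELECT Field1 FROM tblStaging", dbFailOnError

End Sub

Because the old rows are deleted before the append, only the newest copy of each Field1 ever sits in the target, so no separate dedup pass is needed.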
 
