Append query: how to prevent duplicates

tomtom1126

Registered User.
Joined
Aug 11, 2015
Messages
27
Dear all,

I need help with a query that returns grouped data and appends it to another table. We need to prevent users from importing the same data more than once.

The query has around 15 fields, so there is no way to use a primary key or index to prevent record duplication. Besides, the grouped data it returns has no
unique field.

Given the above, is there any way to prevent a duplicate import?

Thanks so much for the help.

Regards
Tom
 
Why can't you make a unique index of all of the fields? Is the limit below 16?

Yeah, OK, I tried it. The limit is ten.

But you can't find uniqueness with ten?
 
Last edited:
Dear Sneuberg,

Thanks for your feedback!

The original data file does have a unique field, but each file contains about 400–600 records, and every week we import around 10 to 15 files. So I use a group (totals) query to store a summary of the data.

Each row returned by the group query should be unique, and I use an append query to add this result to the table for final storage.

Given the above, is there any way to prevent this group query from appending duplicates? Or is this approach wrong to begin with?

Thanks for your support.

Regards
Tom
 
Sorry, but I don't understand your situation. Perhaps you can expand on it.

"The original data file does have a unique field."

Is this an Access table? Does it have fifteen fields? Is this what's being appended to?

"but each file contains about 400–600 records, and every week we import around 10 to 15 files"

What are these files? Access tables, Excel spreadsheets, text files?
How many fields do they have?
How is the data in these files different from the original?

"So I use a group (totals) query to store a summary of the data."

What is this and exactly what does it do?
 
Dear Sneuberg,

1. I have an Excel file containing 400–600 records, and it has a number field (a unique number).

2. The Excel file is imported into a temp table, and I run a group query to summarize it (reducing it to around 60–80 records) in order to minimize the file size, so the unique number field is not included.

3. The group query result is appended to a new table for final storage.
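The steps above can be sketched outside Access, for example in SQLite; the column names (vessel, voyage, qty) are invented for illustration, and only the GROUP BY idea carries over:

```python
import sqlite3

# Hypothetical sketch of the summarizing step, using SQLite instead of Access.
# Column names (vessel, voyage, qty) are invented stand-ins for the real file.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE tmpVslData (container_no TEXT, vessel TEXT, voyage TEXT, qty INTEGER)")
con.executemany(
    "INSERT INTO tmpVslData VALUES (?, ?, ?, ?)",
    [("C001", "Ever Given", "V01", 2),
     ("C002", "Ever Given", "V01", 3),
     ("C003", "Maersk Alta", "V07", 1)],
)

# The group (totals) query: container_no is dropped, so the summary rows
# have no unique field left -- the root of the duplicate-import problem.
summary = con.execute(
    "SELECT vessel, voyage, SUM(qty) FROM tmpVslData GROUP BY vessel, voyage"
).fetchall()
print(summary)  # two summary rows instead of three detail rows
```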

Hope this is clear.

I am a beginner with Access, so I'm not sure whether this is the right way to do it.

Regards
Tom
 
Sorry, I still don't understand what's going on, but I gather that the unique number disappears during the summarizing done by this group query.

Still, among the remaining 60–80 summary records, I would think there must be something that distinguishes them from each other and from the next group that is imported. What would that be?

If you could show me (upload) some of this data and the group query, maybe I could understand.
 
Dear Sneuberg,

Thanks again. The attached file may make this clearer.

1. The Excel file has already been imported into the table "tmpVslData".
2. The "Container no" field in the table could be the primary key or an index, but after I group the data, the "Container no" field is no longer included.
3. I use a query called "qryAppend" to group the data and append it to a new table called "tblVslData".

The problem is how to prevent this query, "qryAppend", from appending duplicate data to the new table.

Thanks again for your great help!

Regards
Tom
 

Attachments

Well, I can't see any candidate keys in the consolidated table tblVsData, which makes me wonder how useful this data is.

That being said, I have a way for you to accomplish what you want, albeit somewhat of a Rube Goldberg machine. A union query without the ALL keyword eliminates duplicate records. So you can do this by creating a union between the existing tblVsData table and the output of qryAppend, and then making a new table out of the result. Then delete the tblVsData table and rename the new table to tblVsData. I've uploaded a modified test database that has all of this except the code to delete and rename the tables. If you need help with that, let me know.

The upload contains:

Part 1 of Union QRY: This is just a query of the tblVsData table; it is copied and pasted into the qry Union of Existing with New.

qry Part 2 of Union QRY: This is qryAppend with the INSERT part removed; it is copied and pasted into the qry Union of Existing with New.

qry Union of Existing with New: Combines the existing tblVsData with the grouped data from the tmpVsData table.

qry Make New tbVsDate: Creates a new table named tbVsDataNew, which is the result of the qry Union of Existing with New.

You don't get back any space when you delete a table until you compact and repair, so I suggest doing that often. I'd also consider making a backup before this operation.

Based on what you said, you might import up to about half a million records a year. That sounds like a lot, but storage is so cheap nowadays that I suggest considering keeping all of the data from the imported spreadsheets. As the consolidated data grows, this consolidation process will get slower and slower.
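The union trick can be sketched in SQLite (Access semantics are analogous); the table and column names here are simplified stand-ins for tblVsData and the output of qryAppend:

```python
import sqlite3

# Sketch of the UNION-based dedup: UNION without ALL removes duplicate rows,
# so unioning the existing consolidated table with the newly grouped rows,
# then rebuilding the table from the result, keeps each summary row once.
# Names are invented stand-ins for tblVsData / the qryAppend output.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE tblVslData (vessel TEXT, voyage TEXT, qty INTEGER)")
con.execute("INSERT INTO tblVslData VALUES ('Ever Given', 'V01', 5)")

# Pretend this is the grouped output for a re-imported file:
con.execute("CREATE TABLE newSummary (vessel TEXT, voyage TEXT, qty INTEGER)")
con.executemany("INSERT INTO newSummary VALUES (?, ?, ?)",
                [("Ever Given", "V01", 5),      # duplicate of an existing row
                 ("Maersk Alta", "V07", 1)])    # genuinely new row

# Make a new table from the UNION, then swap it in for the old one.
con.execute("""CREATE TABLE tblVslDataNew AS
               SELECT * FROM tblVslData
               UNION
               SELECT * FROM newSummary""")
con.execute("DROP TABLE tblVslData")
con.execute("ALTER TABLE tblVslDataNew RENAME TO tblVslData")

rows = con.execute("SELECT * FROM tblVslData ORDER BY vessel").fetchall()
print(rows)  # the duplicate row appears only once
```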
 

Attachments

Dear Sneuberg,

Thanks again for your reply.

You are right, storage is cheap nowadays, so I will consider keeping all of the Excel data records. But I wonder whether the following would work to prevent duplicate appends by "qryAppend":

1. I will create another table named "tblVslSkd", which has the field "vvd" as its primary key.

2. "qryAppend" also has the field "vvd", and it is indexed (duplicates allowed).

3. I will create a new append query, "qryAppendVvd", to add only the vvd data to the new table "tblVslSkd".

4. First, the system checks that the "vvd" value does not already exist in the table "tblVslSkd"; if it does not, it runs "qryAppend" to append the data into the table "tblVslData".


The above is just my concept, and I am not sure whether it is workable. If it is still not the best way to do this, I will take your suggestion and keep all of the original data.
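The guarded-append idea above can be sketched in SQLite; the names tblVslSkd and vvd follow the thread, while the data columns and the vvd values are invented for illustration:

```python
import sqlite3

# Sketch of the guarded append: record each import's "vvd" key in a small
# key table, and only run the append when that key has not been seen yet.
# tblVslSkd and vvd come from the thread; other names are invented.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE tblVslSkd (vvd TEXT PRIMARY KEY)")
con.execute("CREATE TABLE tblVslData (vvd TEXT, vessel TEXT, qty INTEGER)")

def import_summary(con, vvd, rows):
    """Append summary rows only if this vvd has not been imported before."""
    seen = con.execute("SELECT 1 FROM tblVslSkd WHERE vvd = ?", (vvd,)).fetchone()
    if seen:
        return False                      # already imported: skip the append
    con.execute("INSERT INTO tblVslSkd VALUES (?)", (vvd,))
    con.executemany("INSERT INTO tblVslData VALUES (?, ?, ?)", rows)
    return True

first = import_summary(con, "EVG-V01", [("EVG-V01", "Ever Given", 5)])
second = import_summary(con, "EVG-V01", [("EVG-V01", "Ever Given", 5)])
count = con.execute("SELECT COUNT(*) FROM tblVslData").fetchone()[0]
print(first, second, count)  # True False 1
```

Note that, as Sneuberg points out below the attachments, this only prevents re-importing a file whose vvd has been seen before; it does not deduplicate individual rows.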

Really thanks for your information again.

Regards
Tom
 

Attachments

If the value of vvd is different in each import, then your method would at least prevent the same file from being imported more than once. Is that what you want? Is that going to be enough?

I don't know anything about your data, so I can't tell you whether your concept will work or not.
 
