I have a database for metallurgical defects in castings. It contains fields for the casting part number, lot number, defect code, defect zone, inspection cycle, and defect size. A single lot can have multiple defects, so none of the fields can serve as a key where a unique entry is required.
Once a week, the users will import an Excel file with the new data. I want to prevent duplicates from being appended to my master table.
I'm currently importing the data to a temporary table, performing several manipulations, then appending it to my master.
I've searched the forum and have seen lots of posts on preventing duplicates one record at a time during data entry, or on performing a one-time mass deletion.
Any ideas on the best way to prevent duplicates, or to perform a mass duplicate deletion on a regular basis?
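For context, the kind of unmatched-append query I have in mind during the weekly import is sketched below. The table and field names are just placeholders (tblImport for the temp table, tblMaster for the master); the join matches on every field that would define a duplicate, and only rows with no match in the master get appended:

INSERT INTO tblMaster (PartNumber, LotNumber, DefectCode, DefectZone, InspectionCycle, DefectSize)
SELECT i.PartNumber, i.LotNumber, i.DefectCode, i.DefectZone, i.InspectionCycle, i.DefectSize
FROM tblImport AS i
LEFT JOIN tblMaster AS m
  ON  i.PartNumber = m.PartNumber
  AND i.LotNumber = m.LotNumber
  AND i.DefectCode = m.DefectCode
  AND i.DefectZone = m.DefectZone
  AND i.InspectionCycle = m.InspectionCycle
  AND i.DefectSize = m.DefectSize
WHERE m.PartNumber IS NULL;

Not sure if comparing on all six fields is the right duplicate test, or if there's a better approach than re-running this every week.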