Primary Key Problem

Aristillus

Hi,
I'm hoping someone can help with a problem that I've been trying to solve for some time. It seems to me that my problem can't be that uncommon, and yet I can't find a solution anywhere.

I will simplify what I'm trying to do here for the sake of clarity. I have one table with around 30 fields. Each entry is a record of a quality check on an individual. I have created several queries and created various reports and a form in order to make data entry simple. So far so good.

Now, these quality checks are performed by different people. Let's say that 5 people all have a copy of the database (front end and back end) on their laptops. Now let's say each person records 2 quality checks. These have a primary key (autonumber) of 1 and 2 on each laptop.

The records have to be combined, i.e. uploaded onto a central database. Obviously they can't just be added, or I would have 5 entries with "1" as the PK and 5 entries with "2" as the PK. So I realised I needed to use an append query but leave off the PK field. This means that the master database re-generates PKs as the records are appended. So let's say these 5 quality inspectors upload their 2 records to the main database. This now consists of 10 records with PKs of (say) 1 to 10.
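For example, the append query looks something like this (field names simplified for illustration):

INSERT INTO tblChecks_Master (InspectorRef, CheckDate, Result, Comments)
SELECT InspectorRef, CheckDate, Result, Comments
FROM tblChecks_Local;

Because the autonumber PK is left out of the field list, the master database assigns a fresh value to every appended record.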

Now, the quality inspectors need to update some data on each of their quality checks on their laptops. When they come to upload the amended records to the main database, the PKs no longer match up (i.e. Quality Inspector number 2 has 2 records with PKs 1 & 2, but those very same records on the main database have PKs of 3 & 4).

After a lot of head scratching and googling, I decided that I could create a unique primary key calculated from an identifier for each inspector and the autonumbered PK. That is to say, the 1st inspector is given a ref of 123. This is his number and no-one else's. He inputs this when creating a new record, and an autonumbered field creates a unique number for that record. I thought I could then multiply the inspector number by 1,000 and add the autonumber to create 123001 as a PK. The next inspector will be given a ref of 124, so his first record will be 124001.

I thought this was a good idea because (as long as no inspector inputs more than 1,000 quality checks) there will be no duplicates: each record will be unique and will have the same PK on the main database, so it can be updated.

However, I can't work out how to do this, and the Access fraternity will not contemplate calculated PKs and would rather execute anyone who asks how to do this than answer the question. Hence I am stuck.

Any advice very gratefully received.

Sorry for the long post but wanted to try to explain fully what I need to achieve.
 
This is silly:
the Access fraternity will not contemplate calculated PKs and would rather execute anyone who asks how to do this than answer the question.
There is no Access fraternity. People, by and large, are conservative and opinionated. Furthermore, creative and unconventional thinkers are commonly burned at the stake. And this is true in every serious field of study: first you are schooled in the rules, and then you go about learning when and where it pays to break them.

So do whatever you must to uniquely identify a record. I've used a date/time value before, so capture that when your user enters a new record. Then upload the record, creating a new ID in the master table and ensuring the date/time stamp is also unique. Then distribute a copy of that new MasterDb so your user edits historical records, and save a new date/time for edits. Save a new date/time for deletes too. Then run code when your user returns his next remote Db, and you can compare IDs. If they match, compare DateTimeCreated, and if those match, this is an edit. If DateTimeDeleted is present, find the ID and the DateTimeCreated and delete from the master.

See where this is going? The chance that two remote users perform the same operation on the same record at the same second is very, very low, so you can tell who edited a record last. If the MasterDb record (say RecordID 20456, DateTimeCreated 11/15/11 14:56:34) has a DateTimeEdited that is later than a new update (possible if two remote users edit the same record), then prompt the current uploader to manually decide which record to keep.
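In rough VBA (DAO), the upload pass could look something like this - every table and field name here is just a placeholder:

Sub UploadRemoteChecks()
    Dim db As DAO.Database
    Dim rsRemote As DAO.Recordset, rsMaster As DAO.Recordset
    Set db = CurrentDb
    Set rsRemote = db.OpenRecordset("tblChecks_Remote", dbOpenSnapshot)
    Do Until rsRemote.EOF
        ' Match on the creation stamp, which is unique per record
        Set rsMaster = db.OpenRecordset( _
            "SELECT * FROM tblChecks_Master WHERE DateTimeCreated = #" & _
            Format(rsRemote!DateTimeCreated, "mm/dd/yyyy hh:nn:ss") & "#")
        If rsMaster.EOF Then
            ' No match: a brand new record, so append it
            rsMaster.AddNew
            rsMaster!DateTimeCreated = rsRemote!DateTimeCreated
            rsMaster!Result = rsRemote!Result
            rsMaster.Update
        ElseIf Not IsNull(rsRemote!DateTimeDeleted) Then
            ' The remote user deleted it, so remove it from the master
            rsMaster.Delete
        ElseIf Nz(rsMaster!DateTimeEdited, 0) > Nz(rsRemote!DateTimeEdited, 0) Then
            ' The master already holds a later edit (two remote users
            ' touched the same record), so let the uploader decide
            If MsgBox("Master has a newer edit for this record. Overwrite?", _
                      vbYesNo) = vbYes Then
                rsMaster.Edit
                rsMaster!Result = rsRemote!Result
                rsMaster!DateTimeEdited = rsRemote!DateTimeEdited
                rsMaster.Update
            End If
        Else
            ' Same creation stamp, no conflict: a straightforward edit
            rsMaster.Edit
            rsMaster!Result = rsRemote!Result
            rsMaster!DateTimeEdited = rsRemote!DateTimeEdited
            rsMaster.Update
        End If
        rsRemote.MoveNext
    Loop
End Sub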
Making sense?
 
Lagbolt, my quip about the Access fraternity was not meant to be taken seriously. However, my problem is serious, as I am totally stuck. Thank you for your suggestion, but it seems very long-winded and complicated. My idea of using a PK calculated from a quality checker ID and an autonumber will definitely provide the necessary uniqueness. But I don't know how to do it. When I search for an answer, I can only find comments such as "it's bad practice to use a calculated key", and no one will explain how to do it (hence my quip).
What I would like is for someone to show me how to make a calculated PK directly in the table. If no one is prepared to do that because they think it's bad practice, then can anyone see another solution to my problem?

Thanks
 
Sorry, I should have added that the main database cannot be distributed out to all the checkers because of a confidentiality issue, which means that each checker is only allowed to see the checks they did themselves. Also, the checkers have no knowledge of Access, so I need to keep things easy for them.
Thanks
 
You can find out how to make your own key by searching this forum for "custom autonumber".

An alternative solution is to get the GlobalID (if we call the PK in the main DB "GlobalID") for each new record you just appended to the main db, and store it in a field in your local record. Then all local records with a GlobalID already exist in the main db and just need updates, whereas the ones without one are new and need to be appended to the main db.
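In VBA/DAO terms, roughly like this (assuming you add a GlobalID field to the local table; all names are examples):

Sub PushToMainDb()
    Dim db As DAO.Database
    Dim rsLocal As DAO.Recordset, rsMain As DAO.Recordset
    Set db = CurrentDb
    Set rsLocal = db.OpenRecordset("tblChecks_Local", dbOpenDynaset)
    Set rsMain = db.OpenRecordset("tblChecks_Main", dbOpenDynaset) ' linked main table
    Do Until rsLocal.EOF
        If IsNull(rsLocal!GlobalID) Then
            ' New record: append it, then store the main db's PK locally
            rsMain.AddNew
            rsMain!InspectorRef = rsLocal!InspectorRef
            rsMain!Result = rsLocal!Result
            rsMain.Update
            rsMain.Bookmark = rsMain.LastModified ' jump to the row just added
            rsLocal.Edit
            rsLocal!GlobalID = rsMain!ID
            rsLocal.Update
        Else
            ' The main db already knows this record, so just update it
            db.Execute "UPDATE tblChecks_Main SET Result = """ & rsLocal!Result & _
                """ WHERE ID = " & rsLocal!GlobalID, dbFailOnError
        End If
        rsLocal.MoveNext
    Loop
End Sub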
 
You can very easily make a unique index out of multiple fields. Put a table in design view, find the Index button on a toolbar or a ribbon, which opens a dialog. In the first column type a name for a new index. In the next column select a field name. At the bottom of the form, set the IsUnique property to Yes. Go to the next row and select another field name (without creating a new index name) and you have a composite index that requires the values of the two fields you selected be unique.
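If you prefer code to the dialog, the DDL equivalent is a one-liner (table and field names are examples):

CurrentDb.Execute "CREATE UNIQUE INDEX idxInspectorCheck " & _
    "ON tblChecks (InspectorID, CheckNumber)", dbFailOnError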

You can also create a unique index on a single field and construct that field's value from other data in any way you see fit. People may object to this on the grounds that it duplicates data.

But all you need to do is come up with a scheme to uniquely identify records such that when a checker returns with data you can determine whether a record in his table is an edit, an insert or a delete. Then perform that operation against your master.
 
I've been working to put together an enhanced version of the code that I used to create a series of around 40 (different office) independent Access databases.

I gave each database an Instance ID and generated a series of prefixed primary key values, i.e. Prefix 1: 1x,xxx,xxx up to Prefix 40: 40x,xxx,xxx.

Each table could have a maximum of 9,999,999 records instead of the usual +/- 2,147,483,647 available with a Long Integer field, but I considered that a reasonable trade-off to give me a theoretical 214 separate instances of the database (double that if I used negative IDs).

The ID field, together with a datestamp (updated each time the record changed), allowed me to successfully aggregate the data from all 40 Access databases together in a central location.

I also included a deleted field on each record so I could hide the record rather than physically deleting it, that way there was no danger of accidentally recreating it from a rogue datafile because the datestamp would be earlier.


The function getCounterNext() (there is also an annotated version, getCounterNext_WithComments()) and the table tblCounters do all the work.
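In outline it does something like this - a simplified sketch, not the exact code in the attachment:

Function getCounterNext(strAppName As String, strTableName As String) As Long
    ' Find this app/table's row in tblCounters, bump the counter,
    ' and return it prefixed with the Instance ID
    Dim rs As DAO.Recordset
    Set rs = CurrentDb.OpenRecordset( _
        "SELECT * FROM tblCounters WHERE AppName = '" & strAppName & _
        "' AND TableName = '" & strTableName & "'", dbOpenDynaset)
    rs.Edit
    rs!LastCounter = rs!LastCounter + 1
    rs.Update
    ' e.g. instance 40, counter 123 gives 400,000,123
    getCounterNext = (rs!InstanceID * 10000000) + rs!LastCounter
    rs.Close
End Function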


Feel free to play around and have a nose around "under the bonnet".

If tblCounters were placed in a separate file and linked to your application, the counters should be immune from any upgrades of the front or back ends.

By including an AppName column I should be able to share the central counters with any other databases I created.

It's possibly a little over-complicated but I thought I'd share it anyway.
 


Spikepl, I already tried the search route... wading through dozens of posts which do not match my problem, and, after hours, coming across one that does, only to read an answer that says "don't use calculated PKs", is a very frustrating pastime. I did, however, find an answer on a different site and am now testing it. It is simply to use 3 fields: an autonumber field that is NOT the PK, a user's unique ID number, and a field which will store a calculated number and become the PK. What I hadn't realised is that I then need to use my input form to store the calculated value in the PK field. D'oh!
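The form code I'm testing is along these lines (my field names: AutoNum is the autonumber that is NOT the PK, InspectorID is the checker's ref and CheckPK is the calculated key):

Private Sub Form_BeforeUpdate(Cancel As Integer)
    ' Only calculate the PK once, when the record is first saved
    If IsNull(Me.CheckPK) Then
        Me.CheckPK = (Me.InspectorID * 1000) + Me.AutoNum
    End If
End Sub

Access assigns the autonumber as soon as the record is dirtied, so it is already available by the time BeforeUpdate fires.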
Your other suggestion wouldn't work for me because the users don't have access to the main database. They export a CSV of the table, and a central data team appends this. It would be too messy to have them then return a file which the users would have to import.
 
Lagbolt, thanks for that! I think I have achieved something similar using my input form, although you seem to be saying I can do this directly in a table? I will try this when I get some time.
 
Nanscombe, Thank you for your idea. I am unable to work on the database for a day or two but as soon as I can I will look at the file you posted. Thanks for your time.
 
Why mess with the PKs? The issue is not the PK but something that uniquely identifies the record both locally and globally. Something along the lines of Lagbolt's suggestion should do: some Quality Inspector initials/unique identifier to identify the provenance of the record, plus the date/time of creation - they don't even need to get married in one field. Those two should be enough for the import code in the central db to determine whether an incoming record is new or an update of an existing one when reading the CSV.
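A sketch of such import code, assuming the CSV is first pulled into a staging table (all names made up):

Sub ImportChecks(strCsvFile As String)
    Dim db As DAO.Database
    Dim rsIn As DAO.Recordset, rs As DAO.Recordset
    Set db = CurrentDb
    db.Execute "DELETE FROM tblStaging", dbFailOnError   ' clear the staging table
    DoCmd.TransferText acImportDelim, , "tblStaging", strCsvFile, True
    Set rsIn = db.OpenRecordset("tblStaging", dbOpenSnapshot)
    Do Until rsIn.EOF
        ' InspectorRef plus DateTimeCreated together identify the record
        Set rs = db.OpenRecordset( _
            "SELECT * FROM tblChecks WHERE InspectorRef = '" & rsIn!InspectorRef & _
            "' AND DateTimeCreated = #" & _
            Format(rsIn!DateTimeCreated, "mm/dd/yyyy hh:nn:ss") & "#")
        If rs.EOF Then
            rs.AddNew   ' no match: a new record
        Else
            rs.Edit     ' match found: an update of an existing record
        End If
        rs!InspectorRef = rsIn!InspectorRef
        rs!DateTimeCreated = rsIn!DateTimeCreated
        rs!Result = rsIn!Result
        rs.Update
        rsIn.MoveNext
    Loop
End Sub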
 
You wrote:
the quality inspectors need to update some data on each of their quality checks on their laptops.

A quality reporting system that allows overwrite/modification of data that has already been reported strikes me as unusual.
 
I would hazard a guess that the inspectors are given background data about what they are inspecting, i.e. where, what & who (or some such thing), and they are just adding information based on the outcome of the inspection, i.e. when, actual location (if different) and result. :)

Either that, or the naughty inspectors are not adding all of the required information at the time and are being chased for it. :D

My old company used to do site visits and we would give them a copy of any data about the site and what they needed to check and then pull back any amended data produced as a result.
 
Quite. It's just that "quality inspection" together with "updating data" sort of screeches in my ears :) But maybe you're right - perhaps this is some kind of drip-wise inspection.
 
Where I worked, before I was made redundant, "inspections" were events performed six-monthly or annually where either equipment or assets were checked for condition, or 3rd-party operating procedures were checked for compliance.

We all have our little pre-conceptions.
 
I don't see why the PKs change.

If a user edits some fields on certain records, and another user edits some fields on certain records, then the issue is:

a) just updating any changes in the main database - which is not a problem
b) deciding who should take precedence if two users edit the same record - which I suspect they cannot anyway

Now, you can issue the full database to each user, as long as you have code to restrict the records each of them can see - which comes back to data design.

The real problem is what happens if users can add new records - as two different users raising a new record will both get the same new autonumber.

However, even if that happens, when you reassemble the database you can deal with new records differently from amendments to existing records - allocate new autonumbers for all the new records - and then when you reissue the database, it shouldn't matter anyway.

Bear in mind that your users should not need to see the autonumbers (although it doesn't matter if they do), and they should not be working in tables directly.

I think it is also possible that your table is not designed in the best way, i.e. if for each client you have a number of checks, then these should not really be stored in the client table - they should be in a sub-table, linked to the client, I should think.
 
Gemma/Dave,
Updating changes to the main database IS the problem, or rather it is a problem unless the user's laptop "knows" the new PK, as I don't want to keep copying the database both ways (the users are not IT-savvy and just click a button to export the data they have created/edited into a CSV, which gets appended/modified by a person allocated as the data inputter).
I get your point about a sub-table, but realistically there are probably 150 people who will be checked, and they may only have 2 or 3 checks each, and these people change on a regular basis.

Thanks

Dave
 
The "inspectors" in this case are have to tick and comment on certain crieteria being met. If the criteria are not being met, they don't get a tick in the box (literally, on the database) but they wil be tasked with putting the problem right, at which time the record will need to be amended and a tick and some comments put in place. This needs to be on the same record hence the need for an update.
 
Great - but I still don't get your concerns about the PK, as outlined in #11.
 
Are the inspectors creating any new records?

Or is it all amendment to pre-existing records?
 
