@jdraw - For the most part, yes. We wanted the DM's as a tracker for who changed what and when and they work fantastically for that. Only slight problem is the users sometimes click something by accident or forget to send it on and click it again and the DM (correctly) records that the field was updated 3 times ... but that's hard to find fault with.
We have a separate, semi-recurring issue where the data from one record completely overwrites the data in a different record (often in an entirely different table). I was hoping the DM's might show something with that, but it showed me as the person making the change, when I hadn't even opened that record, to my knowledge.
But that is a side problem. It has helped with trying to close records when not actively updating them, and I might need to change to record level-locking, but from other posts on here, I don't think I want to do that.
A few years ago I had problems with a form whose source was a table. Editing a text field in one record updated a different record. The form had been in use for over 10 years with only minor changes. It turned out that when the record count surpassed some number over 100,000, Access started having this problem. The fix was to change the source to a query (as it should have been) limiting the number of records to a more reasonable number.
Verify your forms all use query sources that limit the record count. Start with any referencing tables with more than 100,000 records.
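A minimal sketch of the idea, with hypothetical names (tblSource, DateModified): instead of binding the form straight to the table, assign a limiting query as the record source, e.g. in the form's Open event:

Code:
' Hypothetical sketch: bind the form to a limiting query
' instead of the whole table. tblSource and DateModified
' are placeholder names - use your own table/field.
Private Sub Form_Open(Cancel As Integer)
    Me.RecordSource = "SELECT * FROM tblSource " & _
                      "WHERE DateModified >= DateAdd('yyyy', -1, Date());"
End Sub

A saved query set as the form's Record Source property works just as well; the point is that the form never pulls the full 100,000+ rows.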
However, the DBEngine runs on the client. And if you edit the linked table in the frontend, you can even use frontend functions in the data macro.
At least, if that's what you want to do
Not 100% clear on what this means, but since I could use a UDF to get values, I thought maybe TempVars would work - they don't. What is interesting is that DMs can and will fail silently, without writing an error to the Application Log table like they do for other errors.
However, all in all, using DM's for audit purposes is fantastic.
I decided to do a write-up on things I learned with data macros - hopefully it helps others:
Disclaimer: I've never actually used the Allen Browne style VBA Audit Macros.
BLUF (bottom line up front): Data Macros are somewhat more versatile and generally more difficult to set up.
Things that might trip up a new user:
Like most of the federated Access Macros, you can only call functions, not subroutines, from a data macro. Also, there is no MsgBox command. I often use MsgBoxes for debugging. You COULD use them with federated macros, but you would have to wrap the call in a function, something like:
Code:
Function ShowDebugMsg() As Boolean
    ' Qualify with VBA. - a wrapper actually named MsgBox
    ' would shadow the built-in and call itself.
    VBA.MsgBox "I'm here."
    ShowDebugMsg = True
End Function
[Old] values are retained. This somewhat simplifies data macros. For example, I can have an After Delete data macro that writes all of the previous values to a record in tblAudit. With VBA Audit Macros, I think I would have to have a BeforeDeletion macro that wrote all of the values to variables and then an AfterDeletion macro that wrote all of the variables to tblAudit.
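As a sketch in the macro designer's own notation (the tblAudit field names here are hypothetical), an After Delete macro can read the deleted row's values directly via [Old]:

Code:
After Delete
    Create a Record In tblAudit
        SetField
            Name: tblAudit.ChangeType
            Value: = "Delete"
        SetField
            Name: tblAudit.Reference
            Value: = [Old].[REFERENCE]

No intermediate variables are needed; [Old] still carries the deleted record's values at this point.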
DataMacros handle most of the ways your table can be changed - i.e. Update queries, VBA code, manual changes, etc.
DataMacros are stored in the backend, but are called from the front end (in a split database). Therefore, if you have a function like GetUserName(), it must be included in EVERY FE that uses the BE table. If you also may edit the table directly via the BE, the function must ALSO be included in the BE.
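A minimal sketch of such a function, assuming the Windows login name is what you want to record:

Code:
' Must be present in every FE that links the table (and in the
' BE as well, if the table is ever edited directly there).
Public Function GetUserName() As String
    GetUserName = Environ("USERNAME")   ' Windows login name
End Function

If the function is missing from any FE (or the BE, when editing there), the data macro call fails at that point.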
The Data macro must be tied to one of the actions: After Insert, After Update, After Delete, Before Delete, and Before Change. There are three ways to accomplish this:
In jdraw's example, he calls a named data macro to gather his parameters, then he calls a different named data macro to write the parameters to his tblAudit.
In my example, I called a named data macro from the action and used the named data macro to create the record in tblAuditLog. I did not use a named data macro to write the changes and I did not use parameters.
It would also be possible to not use NAMED data macros at all and simply write the steps into the After Insert action, etc. The drawback is that it is somewhat harder to copy parts of a step when you are troubleshooting. (With NAMED data macros, you can create Macro1, Macro2, etc. and experiment with calling different ones from each step.)
Errors are written to USysApplicationLog and MAY or MAY NOT include an additional error message.
It is helpful to create DM's incrementally - i.e. record one field, test, record another field, test, etc.
Especially in a split database, creation and testing is NOT user-friendly, primarily for four reasons:
You cannot comment out steps like you can with VBA using the '. There is a comment function, but not an easy way to comment/uncomment a step.
It is possible, but somewhat difficult, to copy steps. You can click in a blank area and Ctrl-A and Ctrl-C to copy the entire macro, or you can select and copy single steps or single blocks (like an If-Then block). Also - Shift-click and Ctrl-click can be used to copy multiple steps, which you cannot easily do with VBA - Update 16-Jan-2024.
Syntax is not consistent. For example, in my data macro, I use:
If Updated ("Event 1 Complete") <Must be in quotes, even though the interface uses square brackets.>
Create record in tblAuditLog
SetField Name tblAuditLog.Reference <Must have the tablename, even though I specified the tablename in the create record step>
Value = [tblSource].[REFERENCE] - <Must have the tablename, even though I am creating the data macro in this table>
Or value = [Old].[Event 1 Complete] - <Must NOT have the source table name, even though current field values DO require this.>
The interface is extremely clunky and mouse-intensive when troubleshooting. For example, as opposed to VBA where I typically write my code and run the code with F8 to execute line-by-line, I have to:
Open the BE File.
Open the tblSource in DesignView.
Select Table Design > Create Data Macros > Create Named Macro or Table Design > Create Data Macros > Edit Named Macro.
Add the steps.
Save the macro.
Close the interface.
Save the table <The DM is technically not saved until you save the table.>
Close the BE <Otherwise I will get an error about the FE not being able to open b/c the BE is locked by me.>
Open the FE.
Test the DM.
Close the FE and open the BE if changes are required.
Repeat.
UPDATE 16-Jan-2024 - Adding two more bullets:
As @Pat Hartman and @jdraw clarified, I forgot to mention that DMs also work only with MS Access (Jet/ACE) BE files, so if you are using SQL Server or other formats, or think you may migrate the BE to these formats, you would have to remove the data macros and convert to VBA Code audit macros.
Data Macros require MUCH more co-ordination between the BE and the FE updates. For example:
If I am performing an update with VBA Code audit macros, as I understand it, I have to make two changes:
An update to the BE to add the new tblAudit. However, since this is not used, it will be dormant until the FE is updated. In addition, since it is a brand new table, it doesn't require exclusive access to the BE to add the table.
I can then work at leisure to develop and release the new FE supporting the VBA audit macros.
So two updates, neither one requiring exclusive access. Note that if users have an older version of the FE, and you don't force them to update, they can avoid your audit history.
With DM, my update plan is as follows:
I'll release a FE update with the new GetRealUserName function. This MUST be released prior to adding the DM to the BE, or the user will get errors when they update a field. I won't change anything else related to the DM in this release (such as linking to tblAudit, which doesn't exist yet).
After everyone updates, I'll update the BE to add the new DM, which will require exclusive access to the BE. Pat's DDL database and Jack's save-and-import-from-text macros will be godsends for this, and it shouldn't be much more than 5 minutes of downtime, if that. Note that tblAudit will start to be populated at this point, even though I can't easily view the changes.
I'll then release the SECOND updated FE that actually makes use of the new DM's. So three updates, one of which requires exclusive access, and might cause errors to the user.