Automatic send e-mail on table field change

LeoM

Member
Local time
Today, 08:43
Joined
Jun 22, 2023
Messages
41
I provided pseudo code
OK, that's clear, but this only tells me whether I need to run the verification process or not. My main issue is: how can I detect whether the value of one field has changed in the linked table? Data macros can only be used on local tables (that's why I thought of creating a copy). Any other suggestions? Maybe somebody has already answered, but it isn't clear to me.
Cheers
 

Josef P.

Well-known member
Local time
Today, 09:43
Joined
Feb 2, 2023
Messages
826
I would not copy the table from the external DB; instead, write the email dispatch, incl. the value of RFSU_PLANNED_DATE and a timestamp, into a log table.
As soon as the last value in the log table differs from the value in the original table, there must have been a change. => new email.

Then a batch job can check at regular intervals whether emails (= spam? ;)) should be sent.
(Spam ... because you could also just show the changes in a list.)

If a change of values in RFSU_PLANNED_DATE triggers other data operations in another database (used by the VU), that could also be used as the basis for the comparison.
 

CJ_London

Super Moderator
Staff member
Local time
Today, 08:43
Joined
Feb 19, 2013
Messages
16,612
how can I detect whether the value of one field has changed in the linked table?

assuming:

1. you have a local table called tblCheck with the fields projectPK and lastDate

2. and you have a linked table called lnkProjects which includes fields called projectPK and projectDate

you can determine where project date has changed by using the query

Code:
SELECT lnkProjects.ProjectPK, lnkProjects.ProjectDate
FROM lnkProjects LEFT JOIN tblCheck ON lnkProjects.ProjectPK = tblCheck.ProjectPK AND lnkProjects.ProjectDate=tblCheck.lastDate
WHERE tblCheck.ProjectPK is Null

This will return all projects where the date doesn't match, plus new projects that are not yet in tblCheck

Edit: added upsert

If you are going to use data macros, you can turn the above into an upsert query which will trigger the data macros. The query would look something like

Code:
UPDATE lnkProjects LEFT JOIN tblCheck ON lnkProjects.ProjectPK = tblCheck.ProjectPK AND lnkProjects.ProjectDate=tblCheck.lastDate
SET tblCheck.ProjectPK=lnkProjects.ProjectPK, tblCheck.lastDate=lnkProjects.ProjectDate

This can also be used to initially populate tblCheck

You will need to check this works as intended - where dates haven't changed, the upsert will still 'update' the date to the same value; I'm not sure whether that would trigger the update data macro or not. I don't have time to test it myself.
 
Last edited:

LeoM

Member
Local time
Today, 08:43
Joined
Jun 22, 2023
Messages
41
I would not copy the table from the external DB; instead, write the email dispatch, incl. the value of RFSU_PLANNED_DATE and a timestamp, into a log table.
As soon as the last value in the log table differs from the value in the original table, there must have been a change. => new email.
OK, this is clear, but in order to have something to compare, as you said, on the first run I would have to add the current value (RFSU_PLANNED_DATE) for all projects to the log table. Now my question is: what if a user adds a new project to the linked table (over which I have no control - I only have a view)? I would not have that new record in my log table and might lose some projects. That's why my idea was: every day, at the first logon (which I can control), I make a local copy of the linked table, which I use the next day to compare the two values (yesterday's local copy against the current linked table). If there are differences, I send an email; otherwise not. Does that make sense?
 

LeoM

Member
Local time
Today, 08:43
Joined
Jun 22, 2023
Messages
41
you can determine where project date has changed by using the query [...] This will return all projects where the date doesn't match and new projects which are not in tblCheck
Thank you - that is exactly the idea I tried to explain (step by step) in my previous messages. The only thing left to clarify was the "tblCheck" table, which I need to recreate every day as a copy of the linked table in order to have something to compare against (as you explained well). In any case, I decided to do the check on a daily basis, so if someone changes a record after I create the local copy ("tblCheck"), fine - I will catch it the next day.
Cheers
 

Josef P.

Well-known member
Local time
Today, 09:43
Joined
Feb 2, 2023
Messages
826
The principle I described would also work with the table copy.
If you use a copy of the table, you still have to check which values changed and then update those records.
So why not use a mail log table with only the necessary data fields?

For comparison (dummy code only!):
a) 1:1 Copy of table
Code:
UPDATE SnapshotTable AS S INNER JOIN OriginalTable AS O ON O.ID = S.ID
SET S.RFSU_PLANNED_DATE = O.RFSU_PLANNED_DATE
WHERE S.RFSU_PLANNED_DATE <> O.RFSU_PLANNED_DATE

+

INSERT INTO SnapshotTable (...)
SELECT ...
FROM OriginalTable AS O
LEFT JOIN SnapshotTable AS S ON S.ID = O.ID
WHERE S.ID IS NULL
=> Data macro on Update and Insert to send the mail

b) Email log table
Code:
INSERT INTO MailLogTable (..., RFSU_PLANNED_DATE)
SELECT ...
FROM OriginalTable AS O
LEFT JOIN (
    SELECT ID, Max(LogDate) AS MaxLogDate
    FROM MailLogTable
    GROUP BY ID
) AS X ON X.ID = O.ID
LEFT JOIN MailLogTable AS L ON L.ID = X.ID AND L.LogDate = X.MaxLogDate
WHERE L.ID IS NULL
   OR L.RFSU_PLANNED_DATE <> O.RFSU_PLANNED_DATE
=> Data macro on Insert
 

CJ_London

Super Moderator
Staff member
Local time
Today, 08:43
Joined
Feb 19, 2013
Messages
16,612
The only thing to understand was refer to the "tblCheck" which I need to create everyday by copy of the linked table in order to have something to compare with
Not sure if you are saying that is what you are going to do, but you do not need to create it every day. If you do, your app will bloat and you won't be able to use data macros.

Seems you have a number of responders all throwing out similar suggestions and probably confusing you, so I will drop out
 

LeoM

Member
Local time
Today, 08:43
Joined
Jun 22, 2023
Messages
41
not sure if you are saying that is what you are going to do but you do not need to create it every day. If you do, your app will bloat and you won't be able to use data macros.
Any idea why all the Data Macro buttons are disabled on the Table ribbon tab? I opened a local table just to test the Data Macro functionality, but all the buttons (Before/After Events, Named Macros, etc.) are disabled. I did a test creating a new db with one table, and there it works. It may depend on a setting of the db itself (which I didn't change, anyway).
Seems you have a number of responders all throwing out similar suggestions and probably confusing you, so I will drop out
I'm happy that different people give suggestions, because all together they may allow me to find and understand the solution. So thanks to everyone who tries to support me - I really appreciate it. If my questions make me seem confused, apologies; it is only because something that is logical and understandable to you is not the same for me.
 
Last edited:

The_Doc_Man

Immoderate Moderator
Staff member
Local time
Today, 02:43
Joined
Feb 28, 2001
Messages
27,186
Let's clarify the situation.

You do not have control over the external application, which you say is not Access. You also do not control the form through which changes can be made because THAT isn't Access either. (Or if it is, you still have no control over it.) Therefore you cannot directly get an Access data macro event or form update event to let you know that something needs to be sent. You have nothing to "hook" onto. Therefore, in regards to any direct method, you CANNOT do this. PERIOD. Forget about data macros. They aren't going to cut it for you. (Remember, we're talking "DIRECT" triggers and events in this paragraph.)

This situation allows you to only have indirect methods. You can build a "shadow table" that is (literally) a shadow copy of the real table. Your problem then becomes how often you compare the "real" table to the "shadow" table, how you detect changes, and what you do for each change. There IS a chance you can make this work but it won't be a matter of any type of event except a TIMER event behind a form in your Access database that stays open for the duration of the requirement.

There is a potential major drawback to this method I will describe in that the timing requirements of this operation will be crucial to its success. If you need essentially "real time" detection of changes, i.e. detect changes within seconds, we can stop right here because you can NEVER reach that point. NEVER EVER. Only a direct event could do that and you don't have any.

However, if "less than real time" is permitted, you still have some hope. The biggest remaining drawback is that the tighter the time requirements, the worse this is going to get.

The closest I ever came to this was to copy the real table to a shadow table as an initialization step. Thereafter run a periodic comparison of the real table and the shadow table. You have to contend with three possible changes: Add, Update, and Delete. It would be possible to do this in various ways, but it will always eventually involve a loop at some point to identify, one record at a time, what needs to be done by comparing current "real" table values to last recorded "shadow" table values. Even if it were possible to detect the results in a single query, you would still have to loop through the result-set to send the individual messages.

Here is how I might approach the problem. There can be other ways. This is MY suggestion. To detect the Add, Update, and Delete cases, you need three queries that are all similar.

First, a query that does an INNER JOIN between the real table and the shadow table, JOINed on the PK of the real and shadow tables, selecting only those records where the PKs match but the field in question differs. All you need here is the PK of the mismatching record taken from either table and the letter "U" representing an UPDATE and the value from the real table.

Second, a query that does a LEFT JOIN between the real table and the shadow table, selecting only those records where the real table has no matching shadow record. All you need here is the PK of the real table and the letter "A" representing an ADD and the real table's value.

Third, a query that does a RIGHT JOIN between the real table and the shadow table, selecting only those records where the shadow table has no matching real record. All you need here is the PK of the shadow table and the letter "D" representing DELETE and a 0 value or blank depending on the data type.

Fourth, a UNION query based on the first three queries to give you the PK of the records, the single-letter flags to show you the nature of the change, and the value now in the real table (for ADD/UPDATE) or 0 (for DELETE).

Then you can open a recordset of the UNION query to send your messages one record at a time. If you need more data than I suggested, you can make the three queries also collect the needed extra fields for you from the real or shadow tables as appropriate.

There is a massive penalty in this approach if your required timing in response to a change is extremely stringent. I was on a mainframe system when I applied the above algorithm to a non-Access situation. This mainframe was multi-processor running some pretty fast data buses and the fastest I could run that without overloading the system due to data transfers was about once every 15 minutes for a relatively short table of several hundred possible records.

You would be running this once every so often and the real table is only accessible via ODBC methods, so already not fast. Therefore, do not expect blazing performance or rapid response to changes.
 

LeoM

Member
Local time
Today, 08:43
Joined
Jun 22, 2023
Messages
41
Let's clarify the situation. [...] You would be running this once every so often and the real table is only accessible via ODBC methods, so already not fast. Therefore, do not expect blazing performance or rapid response to changes.
Thanks a lot for the detailed and clear explanation. Let's say that time is not a problem: I'm planning to do this verification once per day, and I will provide feedback based on that interval.
Also, I want to focus "only" on changes to one field for "existing" projects. I know records could be ADDed/DELETEd, but I don't care about that; those records will appear (ADD) or disappear (DELETE) the moment I re-create the shadow table.
One question about the shadow table: I suppose I create it as a copy of the main linked table at the first login of the day. Is that the correct way, or how should I manage it?
Cheers
 

ebs17

Well-known member
Local time
Today, 09:43
Joined
Feb 7, 2020
Messages
1,946
One question about shadow table
Counter questions:
- How big is the table to be checked - how many records are in it? As you have learned, data volume is an important factor for the measures that will be applied.
- Is it enough if every recipient receives the complete result set, or do individual results need to be split up among individual recipients, as @The_Doc_Man suggests, up to one message per record?
 

LeoM

Member
Local time
Today, 08:43
Joined
Jun 22, 2023
Messages
41
Counter questions:
- How big is the table to be checked, how many records are in it? As you have learned, data volumes are an important point for the measures that will be applied.
The current size is 15,000 records.
- Is it enough if every recipient receives the complete set of results, or does individual results need to be split up to individual recipients, as @The_Doc_Man suggests, up to one message per record?
I will send a single email to multiple recipients with one Excel (or CSV) file that includes three fields: Project, Old Date and New Date. I will create this file based on the verification I do (with the check query).
 

ebs17

Well-known member
Local time
Today, 09:43
Joined
Feb 7, 2020
Messages
1,946
Good. There are two tasks:
1) Since the contents of the shadow table (or the shadow table itself) have to be constantly renewed, you have the problem of database bloat: simply deleting the records or the table does not free up the used space, so the new data takes up additional space.
It would therefore make a lot of sense to keep the shadow table in its own local backend. That backend can be compacted, or deleted entirely, after the work has been completed.
The table would also be linked into the frontend.
2) When comparing 15000 records against 15000 records, you should pay attention to the best possible use of the index in order to achieve acceptable runtimes. So the fields in the shadow table should be well indexed. You can safely assume that a primary key is also indexed in the real table.
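As a rough illustration of that indexing advice (sketched here with sqlite3 for brevity; in Access you would set the index in the table designer or via DDL, and the table/index names below are invented):

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE shadow (Project_ID INTEGER, RFSU_PLANNED_DATE TEXT)")
# Index the field used for the join/comparison so that a 15000-against-15000
# comparison does not degrade into a full table scan on every probe.
cur.execute("CREATE UNIQUE INDEX idx_shadow_pk ON shadow (Project_ID)")
idx_names = [row[1] for row in cur.execute("PRAGMA index_list('shadow')")]
print(idx_names)  # ['idx_shadow_pk']
```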
One question about shadow table: i suppose to create it by copy of main linked table every first login of the database.
That depends on the general way of working. If you open and close the application several times a day but only need one evaluation per day, you may want to log such a check in a separate log table:
either just the date of execution or, if you need your own history of the changes found, the entire result set as records with the date.
 

LeoM

Member
Local time
Today, 08:43
Joined
Jun 22, 2023
Messages
41
Good. There are two tasks:
1) Since the contents of the shadow table or the shadow table itself have to be constantly renewed, you have the problem of bloating the database because simply deleting the records or the table does not free up the used memory and therefore the new data takes up new memory space.
It would therefore make a lot of sense to keep the shadow table in its own local backend. This can be compressed or deleted entirely after the measures have been completed.
The table would also be linked to the frontend.
OK, noted. I will create the shadow table in my BE database and link it in the FE; then I will clean and rebuild it (delete/append records) every day.
2) When comparing 15000 records against 15000 records, you should pay attention to the best possible use of the index in order to achieve acceptable runtimes. So the fields in the shadow table should be well indexed. You can safely assume that a primary key is also indexed in the real table.
Yes, I will add the same index I have in the main table (Project_ID).
That depends on how the general way of working is. If you open and close the application several times a day but only need one evaluation per day, you may want to log such a check in a separate log table.
Either just the date of execution, or, if you need your own historization of the changes found, the entire result set as records with the date.
Yes, the idea was to create a table with one field (Data_Execution) and update it every time the process runs. So before doing the verification, I check that the stored date is not >= today (in that case the process has already been done); when the first user logs in and the check passes, I update that field with today's date.
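That run-once-per-day gate can be sketched like this (sqlite3 stands in for the Access backend; the table and field names follow the post, everything else is illustrative):

```python
import sqlite3
from datetime import date

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE run_log (Data_Execution TEXT)")

def should_run(cur, today):
    """Run the comparison only if no execution is logged for today."""
    row = cur.execute(
        "SELECT 1 FROM run_log WHERE Data_Execution = ?", (today,)
    ).fetchone()
    return row is None

today = date.today().isoformat()
first = should_run(cur, today)                      # True: nothing logged yet
cur.execute("INSERT INTO run_log VALUES (?)", (today,))
second = should_run(cur, today)                     # False: already ran today
print(first, second)  # True False
```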

Thanks for the support and suggestions.
Cheers
 

LeoM

Member
Local time
Today, 08:43
Joined
Jun 22, 2023
Messages
41
Better in its own backend.
Are you talking about the shadow table? If yes, this is "my copy" of the linked table (I don't have access to the BE database of the linked table, so I can't add any tables there).
 

The_Doc_Man

Immoderate Moderator
Staff member
Local time
Today, 02:43
Joined
Feb 28, 2001
Messages
27,186
One question about shadow table: i suppose to create it by copy of main linked table every first login of the database. Is the correct way or how i should manage it ?

When you perform one scan cycle, i.e. one check for changes, the last act to be done after the last mail message has been sent is to REPLACE the shadow table completely with the real table. This makes this checking process incremental. AND this means you DON'T replace the shadow table when you log in because (unless I misunderstood your setup completely), keeping yesterday's table gives you continuity of coverage over things that might have been changed before your login but after your check cycle. Even if you miss a day because your system is down, you maintain full checking continuity.
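The end-of-cycle refresh step can be sketched as a delete-then-append (sqlite3 for illustration; in Access this would be a DELETE query plus an append query against the linked table, and the names here are invented):

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.executescript("""
CREATE TABLE real_tbl (pk INTEGER PRIMARY KEY, val TEXT);
CREATE TABLE shadow   (pk INTEGER PRIMARY KEY, val TEXT);
INSERT INTO real_tbl VALUES (1, 'a'), (2, 'b');
INSERT INTO shadow   VALUES (1, 'old'), (9, 'stale');
""")

# Refresh the shadow only AFTER the change e-mails for this cycle have been
# sent, so nothing detected in the cycle is lost.
cur.execute("DELETE FROM shadow")
cur.execute("INSERT INTO shadow SELECT pk, val FROM real_tbl")
con.commit()

refreshed = cur.execute("SELECT * FROM shadow ORDER BY pk").fetchall()
print(refreshed)  # [(1, 'a'), (2, 'b')]
```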

You say you don't care about 'ADD' and 'DELETE' cases. OK, that simplifies it a lot. Just think about it because at least the last time I tried something like this, adds and deletes DID count for something. I'm not saying you are wrong, though. I'm just suggesting that it is a point worth thinking about carefully before deciding. It is worth extra attention.
 

LeoM

Member
Local time
Today, 08:43
Joined
Jun 22, 2023
Messages
41
When you perform one scan cycle, i.e. one check for changes, the last act to be done after the last mail message has been sent is to REPLACE the shadow table completely with the real table. This makes this checking process incremental. AND this means you DON'T replace the shadow table when you log in because (unless I misunderstood your setup completely), keeping yesterday's table gives you continuity of coverage over things that might have been changed before your login but after your check cycle. Even if you miss a day because your system is down, you maintain full checking continuity.
Yes, absolutely - if I'm not wrong, I mentioned it in my ideal flow in another message: the replacement (update) of the shadow table will be done right after I send the email (which means there were modifications).
You say you don't care about 'ADD' and 'DELETE' cases. OK, that simplifies it a lot. Just think about it because at least the last time I tried something like this, adds and deletes DID count for something. I'm not saying you are wrong, though. I'm just suggesting that it is a point worth thinking about carefully before deciding. It is worth extra attention.
OK, noted - in any case, I can handle that in the future if required.
Thank you,
Cheers
 
