Calculate DateDiff Between 2 records in same table/query

arnelgp

..forever waiting... waiting for jellybean!
Local time
Tomorrow, 01:55
Joined
May 7, 2009
Messages
19,169
see the code of the button.
 

Attachments

  • rainfall_v02.zip
    444.7 KB · Views: 73

Eugene-LS

Registered User.
Local time
Today, 20:55
Joined
Dec 7, 2018
Messages
481
see the code of the button.
Excuse me, which button?
...
I see your changes. That works too, of course, but joining on Long fields (as in my version) is faster than joining on Date fields (which are stored as Double).
 

arnelgp

..forever waiting... waiting for jellybean!
Local time
Tomorrow, 01:55
Joined
May 7, 2009
Messages
19,169
The open event.
 

Eugene-LS

Registered User.
Local time
Today, 20:55
Joined
Dec 7, 2018
Messages
481
The open event.
Thanks, I've already seen it...
Code:
Me.RecordSource = "query1"
it does not matter too much ...
You get my idea, I hope.
(The TC is offline now; we'll see his point ...)

Thank you for pointing out my omission, the new version is below.
 

Attachments

  • rainfall_v04.zip
    335.4 KB · Views: 59
Last edited:

SHANEMAC51

Active member
Local time
Today, 20:55
Joined
Jan 28, 2022
Messages
310
I am trying to calculate the elapsed time between 2 records in the same table. I have a table which records rainfall data per day. Obviously, not all days are recorded.
Code:
insert into data( recordingDate,rainFall ) values(#10/19/2020#,1.054527)
insert into data( recordingDate,rainFall ) values(#10/22/2020#,1.282499)
insert into data( recordingDate,rainFall ) values(#10/22/2020#,5.367941)
insert into data( recordingDate,rainFall ) values(#10/29/2020#,5.44014)
insert into data( recordingDate,rainFall ) values(#11/06/2020#,0.8189356)
insert into data( recordingDate,rainFall ) values(#11/08/2020#,6.789134)
insert into data( recordingDate,rainFall ) values(#11/12/2020#,3.570231)
insert into data( recordingDate,rainFall ) values(#11/14/2020#,7.043958)
insert into data( recordingDate,rainFall ) values(#11/23/2020#,5.302125)
insert into data( recordingDate,rainFall ) values(#11/24/2020#,7.577293)
'it doesn't rain every day
'_*___**_______*________*__*_____*_*_________*_*
d1 current    d2 prev       d1 - d2    max period
24.11.2020    23.11.2020       1           0
23.11.2020    14.11.2020       9           1
14.11.2020    12.11.2020       2           9
12.11.2020    08.11.2020       4           9
08.11.2020    06.11.2020       2           9
06.11.2020    29.10.2020       8           9
29.10.2020    22.10.2020       7           9
22.10.2020    19.10.2020       3           9
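For illustration only (none of this is code from the thread): the per-date gaps in the table above can be produced with a correlated subquery that finds, for each distinct date, the latest earlier date. The sketch below runs the idea in SQLite from Python; the same query shape works in Access with `DateDiff("d", ...)` in place of `julianday`.

```python
# Hypothetical sketch, not the thread's Access code: compute the gap in
# days between each recorded date and the previous recorded date.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE data (recordingDate TEXT, rainFall REAL)")
conn.executemany("INSERT INTO data VALUES (?, ?)", [
    ("2020-10-19", 1.054527), ("2020-10-22", 1.282499),
    ("2020-10-22", 5.367941), ("2020-10-29", 5.44014),
    ("2020-11-06", 0.8189356), ("2020-11-08", 6.789134),
    ("2020-11-12", 3.570231), ("2020-11-14", 7.043958),
    ("2020-11-23", 5.302125), ("2020-11-24", 7.577293),
])

# For each distinct date, the correlated subquery finds the latest
# earlier date; julianday() turns the difference into whole days.
rows = conn.execute("""
    SELECT d1.recordingDate,
           (SELECT MAX(d2.recordingDate) FROM data d2
             WHERE d2.recordingDate < d1.recordingDate) AS prevDate,
           CAST(julianday(d1.recordingDate) -
                julianday((SELECT MAX(d2.recordingDate) FROM data d2
                            WHERE d2.recordingDate < d1.recordingDate))
                AS INTEGER) AS gapDays
      FROM (SELECT DISTINCT recordingDate FROM data) AS d1
     ORDER BY d1.recordingDate DESC
""").fetchall()

# Longest dry spell, matching the table above.
max_gap = max(g for _, _, g in rows if g is not None)
print(max_gap)  # 9 (2020-11-14 -> 2020-11-23)
```

Note the earliest date has no predecessor, so its `prevDate` and `gapDays` come back NULL; a correlated subquery like this gets slow on large tables, which is why the thread moves toward a temp-table approach.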
 

Drand

Registered User.
Local time
Tomorrow, 04:55
Joined
Jun 8, 2019
Messages
179
using temp table and a form (datasheet form, dsForm) can speed things up.
Arnelgp and All

Firstly, apologies for my tardy reply. We have experienced major flooding and power outages for the past week and have only just got back online!

I have imported my own data into the table in your latest example and all works really well. Over the 11 years of data that I have, if I eliminate zero rainfall days there are only 640 records so far and this runs really quickly.

I really appreciate everyone's input on this.

Many thanks
 

gemma-the-husky

Super Moderator
Staff member
Local time
Today, 17:55
Joined
Sep 12, 2006
Messages
15,613
Thanks for your reply.

Perhaps I didn't explain this properly. I am not looking to find elapsed time between the entry of records.

What I am looking to do is determine the number of days between the occurrence of each event. This is a table which records daily rainfall and I am trying to determine the longest period of drought during the 10 year period that data has been captured.

In other words, what is the maximum difference between each consecutive date in the table/query.

Thanks

The underlying problem is that Access has no concept of a next/previous record, so things you can easily do in Excel are not possible in Access. The reason there is no next/previous record is that what you really have is an unordered set of records, which just happen to display in a particular (or, more correctly, no particular) order. If you had a calculation based on a "previous record" and then re-sorted on a different field, your existing calculation would no longer have any meaning for the new order. For a large result set, evaluating the differences between successive records each time might be too slow a process, so it's something you need to test and evaluate.

So with a database, it's better to find a way to work with the SET of records as a whole, (if you can) and not bother about the order, or more correctly the comparison between successive records.

So one way you can do this is to take a temporary copy of the records and store that in a different table, including a record order counter. Now you could iterate the records in the temporary copy, in the required sort order, one at a time and store the time differences in that table. And now you can sort the records on the elapsed period, and easily analyse the frequency, as well as restore the original order if you need to do so.
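The temp-table idea above can be sketched outside Access (this Python sketch is mine, not the thread's VBA, and the field names are hypothetical): copy the records, number them in date order, then walk them once storing the elapsed days on each row.

```python
# Sketch of the temp-table approach: sorted copy + record-order counter,
# then one pass filling in the gap to the previous row.
from datetime import date

readings = [
    (date(2020, 10, 19), 1.05), (date(2020, 10, 22), 1.28),
    (date(2020, 10, 29), 5.44), (date(2020, 11, 6), 0.82),
]

# "Temporary table": a sorted copy with a record-order counter (seq).
temp = [
    {"seq": i, "recordingDate": d, "rainFall": r, "gapDays": None}
    for i, (d, r) in enumerate(sorted(readings), start=1)
]

# One pass in sequence order stores the elapsed days to the previous row.
for prev, cur in zip(temp, temp[1:]):
    cur["gapDays"] = (cur["recordingDate"] - prev["recordingDate"]).days

# The rows can now be re-sorted on the gap to find the longest dry spell,
# while seq preserves the original date order for restoring it later.
longest = max((row for row in temp if row["gapDays"] is not None),
              key=lambda row: row["gapDays"])
print(longest["gapDays"])  # 8
```

The single ordered pass is the point: the set is given an explicit order once, instead of asking the database for a "previous record" it doesn't have.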

You could also prepare a report, because a report is necessarily ordered, and calculations based on successive records become available there, but that probably wouldn't help you find the largest gap.

Another way is to anticipate that you might need the date and time of the previous record, so when you store a new record in the live table, you store information in the new record about the previous record. You have to bear in mind now that deleting and inserting records in the middle of your data will invalidate the stored data, but carefully controlled, it's another solution.

So you could add a new field to your table to include time of last reading, or time since last reading, and iterate all of your records to evaluate that field for all of your records. So it's a one-time process, but now each row of data holds the elapsed time between readings. It doesn't identify the previous reading, but it does hold the time. You could have another field to identify the ID of the previous reading. This would be far more efficient than trying to re-evaluate the sort order of mainly static data each time you need the information, but may not be 100% reliable because of the effect of inserts/deletions and edits.
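The "store it at insert time" idea can be sketched like this (a hypothetical illustration in Python, not the thread's Access code; field names are invented): each new record captures the ID and date of the latest existing record, so the elapsed days are computed once, when the record is saved.

```python
# Sketch: each inserted row stores the previous row's ID and the days
# elapsed since it, so no re-sorting is needed at query time.
from datetime import date

table = []  # rows: (id, recordingDate, prevId, daysSincePrev)

def insert_reading(recording_date):
    # Find the latest existing reading, if any.
    prev = max(table, key=lambda row: row[1], default=None)
    prev_id = prev[0] if prev else None
    gap = (recording_date - prev[1]).days if prev else None
    table.append((len(table) + 1, recording_date, prev_id, gap))

for d in [date(2020, 11, 6), date(2020, 11, 8), date(2020, 11, 12)]:
    insert_reading(d)

# Each row now carries its own gap; as the post warns, deleting or
# inserting records in the middle would invalidate these stored values.
print([row[3] for row in table])  # [None, 2, 4]
```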

I hope this helps.
 

Pat Hartman

Super Moderator
Staff member
Local time
Today, 13:55
Joined
Feb 19, 2002
Messages
42,971
Thanks Dave. Now we're back to what I suggested 25 posts ago. :)
 
