IIF with LIKE in an unbound text box

MajP

Is it something like this you want?

qryAnalysis

| Group | NumberOfEvents |
|-------|----------------|
| AA    | 84             |
| AB    | 7              |
| KP    | 98             |
| KP -  | 63             |
| KP +  | 35             |
If so, just follow the pattern. Copy and paste one of the queries and change the criteria and the group name. Then open the union query, where you will have to do some more cutting and pasting.
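For reference, the pattern presumably looks something like this; the table and field names (tblEvents, eventDetail) and the query names are illustrative guesses, not the actual objects in Sample.accdb. One small totals query per group:

Code:
SELECT "AA" AS [Group], Count(*) AS NumberOfEvents
FROM tblEvents
WHERE eventDetail LIKE "*AA*";

and a union query that stacks one such query per group:

Code:
SELECT * FROM qryGroupAA
UNION ALL
SELECT * FROM qryGroupAB
UNION ALL
SELECT * FROM qryGroupKP;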
 

Attachments

  • Sample.accdb (624 KB)

Slaine2000

MajP said:
"Is it something like this you want? If so, just follow the pattern."
Thanks for getting back to me. Yes, that's it, sort of. It seems to be an alternative way of doing what I have at the moment, which is cool. With the help from earlier posts, I managed to get the data by using multiple IIF() in a report based on query criteria of USER and SECTION number. The problem is that the GROUPS (search criteria AA, AB) have to be done for each user and each section, i.e. 3 queries (USER 1 + Section 1, USER 1 + Section 2, USER 1 + Section 3) and a report with 9 groups (search criteria) in unbound text boxes with IIF(), for 700 users, which is soul-destroyingly slow ... so we (i.e. me :)) were hoping that, given the constraints of very varied data in a poorly constructed DB, these steps could be reduced to something like 1 query per user, which could give the 9/10 group totals for Sections 1, 2, 3, i.e. 30 totals.

Thanks again for the help, I hope the above makes sense.

The eventdetail field is created by a user log, and I have no control over the data, but I have to extract the 9 Groups (search criteria) from very varied data.
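For what it's worth, another way to get all the group totals in a single query is a calculated column using Switch() with LIKE, instead of one query per group. This is a sketch only; the table name and patterns below are placeholders, since only you know what identifies each group in eventdetail. Switch() returns the label of the first matching expression, so overlapping patterns resolve in order:

Code:
SELECT Switch(eventDetail LIKE "*AA*", "AA",
              eventDetail LIKE "*AB*", "AB",
              eventDetail LIKE "*KP*", "KP") AS [Group],
       Count(*) AS NumberOfEvents
FROM tblEvents
GROUP BY Switch(eventDetail LIKE "*AA*", "AA",
                eventDetail LIKE "*AB*", "AB",
                eventDetail LIKE "*KP*", "KP");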
 

MajP

If I understand correctly, just modify the child queries. Something like this? If this is what you mean, then this is instantaneous.
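A sketch of what a modified child query presumably looks like; tblEvents and the field names are guesses from the thread, not confirmed against the attachment:

Code:
SELECT SectionNumber, userName, "AA" AS [Group], Count(*) AS NumberOfEvents
FROM tblEvents
WHERE eventDetail LIKE "*AA*"
GROUP BY SectionNumber, userName;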
qryAnalysis

| SectionNumber | userName | Group | NumberOfEvents |
|---|---|---|---|
| 1 | User 042 | AA | 2 |
| 1 | User 047 | AA | 1 |
| 1 | User 058 | AA | 1 |
| 1 | User 1 | AA | 14 |
| 1 | User 100 | AA | 4 |
| 1 | User 154 | AA | 4 |
| 1 | User 155 | AA | 15 |
| 1 | User 156 | AA | 2 |
| 1 | User 2 | AA | 1 |
| 1 | User 25 | AA | 3 |
| 1 | User 250 | AA | 13 |
| 1 | User 39 | AA | 7 |
| 1 | User 43 | AA | 3 |
| 2 | User 031 | AA | 1 |
| 2 | User 049 | AA | 2 |
| 2 | User 051 | AA | 2 |
| 2 | User 145 | AA | 6 |
| 2 | User 2 | AA | 2 |
| 3 | User 39 | AA | 1 |
| 1 | User 005 | AB | 1 |
| 1 | User 155 | AB | 1 |
| 1 | User 30 | AB | 1 |
| 1 | User 39 | AB | 1 |
| 1 | User 450 | AB | 1 |
| 1 | User 50 | AB | 1 |
| 3 | User 39 | AB | 1 |
| 1 | User 049 | KP - | 1 |
| 1 | User 155 | KP - | 11 |
| 1 | User 30 | KP - | 2 |
| 1 | User 39 | KP - | 1 |
| 1 | User 43 | KP - | 1 |
| 2 | User 01 | KP - | 1 |
| 2 | User 035 | KP - | 1 |
| 2 | User 059 | KP - | 1 |
| 2 | User 1 | KP - | 3 |
| 2 | User 100 | KP - | 38 |
| 2 | User 2 | KP - | 1 |
| 2 | User 250 | KP - | 1 |
| 2 | User 450 | KP - | 1 |
| 1 | User 07 | KP + | 1 |
| 1 | User 30 | KP + | 1 |
| 1 | User 39 | KP + | 1 |
| 2 | User 100 | KP + | 28 |
| 2 | User 148 | KP + | 1 |
| 2 | User 250 | KP + | 1 |
| 3 | User 151 | KP + | 1 |
| 3 | User 39 | KP + | 1 |
 

Attachments

  • Sample2.accdb (684 KB)

MajP

That can also go in a crosstab
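The crosstab is presumably something along these lines, pivoting the union query's Group column into headings (the optional IN list just pins the column order):

Code:
TRANSFORM Sum(NumberOfEvents)
SELECT userName
FROM qryAnalysis
GROUP BY userName
PIVOT [Group] IN ("AA", "AB", "KP -", "KP +");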
qryAnalysis_Crosstab

| userName | AA | AB | KP - | KP + |
|---|---|---|---|---|
| User 005 |    | 1  |    |    |
| User 01  |    |    | 1  |    |
| User 031 | 1  |    |    |    |
| User 035 |    |    | 1  |    |
| User 042 | 2  |    |    |    |
| User 047 | 1  |    |    |    |
| User 049 | 2  |    | 1  |    |
| User 051 | 2  |    |    |    |
| User 058 | 1  |    |    |    |
| User 059 |    |    | 1  |    |
| User 07  |    |    |    | 1  |
| User 1   | 14 |    | 3  |    |
| User 100 | 4  |    | 38 | 28 |
| User 145 | 6  |    |    |    |
| User 148 |    |    |    | 1  |
| User 151 |    |    |    | 1  |
| User 154 | 4  |    |    |    |
| User 155 | 15 | 1  | 11 |    |
| User 156 | 2  |    |    |    |
| User 2   | 1  |    | 1  |    |
| User 25  | 3  |    |    |    |
| User 250 | 13 |    | 1  | 1  |
| User 30  |    | 1  | 2  | 1  |
| User 39  | 1  | 1  | 1  | 1  |
| User 43  | 3  |    | 1  |    |
| User 450 |    | 1  | 1  |    |
| User 50  |    | 1  |    |    |
 

Slaine2000

MajP said:
"If I understand correctly, just modify the child queries. Something like this? If this is what you mean, then this is instantaneous."
Wow! Thanks for taking the time to help me, I really appreciate it. Ooohhhh! This might be it!! 🎯 Thanks so much. I'll give it a go on the real data set in the morning, but I think you have nailed it for me and saved me days of work. Thank you!!!
 

MajP

I did not complete your cases, but it is a really simple cut and paste to add more categories and then update the union query. This will be instantaneous for the most part. You may also need to add a calculated field so that the users sort properly.
 

Slaine2000

MajP said:
"I did not complete your cases, but it is a really simple cut and paste to add more categories and then update the union query."
You did it!! Thank you so much for the help. It works great! Thank you. Unfortunately, I can only repay with virtual beer 🍻.
 

Slaine2000

A big thank you to everyone for all the help, time, and willingness to share your expertise with a mere mortal. Greatly appreciated. Virtual beer all round 🍻. Thanks again. S
 

Slaine2000

I am sorry ... I am sure you thought you had got rid of me :). Can I use your expertise again, please?

I now have everything in one query 🥳 thanks to you guys.

Could I ask you to show me, if it's possible, how to create the "Average Time Between Negative Presses" for each section and user? I can copy and paste for Positive Presses. I have the total KP -* already. I now need to use the Timestamp field from the individual query USER 1 with SECTION 1, subtracting the First Timestamp from the Last Timestamp and dividing by KP -*, again for all 700 USERS and 3 Sections, to create a new total Average Time Between Negative Presses per Section.

This was the easier option for the time analysis. Originally "Average Key Press" was to be analysed, but that had to be done by counting KP - and KP + pairs, except the data is not even, i.e. there are unequal numbers of -'s and +'s, and some weird random KPs.

Many thanks. S
 

MajP

One thing: all datetime fields have both a date and a time, even if one or the other is zero. This can present a big problem for you. Does all the data happen on the same day, never rolling past 12:00 AM? If not, this is not doable with your current data. Here is your data, unformatted.
qryKPminusTime

| Time |
|------|
| 2/15/1900 7:12:04 PM |
| 3/2/1900 9:35:52 AM |
| 3/2/1900 9:36:13 AM |
| 3/2/1900 9:36:34 AM |
| 3/18/1900 |
| 3/18/1900 |
| 3/18/1900 |
| 3/18/1900 12:00:21 AM |
| 3/18/1900 12:00:21 AM |
| 4/2/1900 2:23:47 PM |
| 4/2/1900 2:24:08 PM |
So in the above case, 9:35 AM sorts greater than 7:12 PM. You can strip all the date information away if that makes sense, and that is only if all the data comes from the same day.
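Stripping the date part is a one-liner if that assumption holds; a sketch, assuming the query and field names shown above:

Code:
SELECT TimeValue([Time]) AS TimeOnly
FROM qryKPminusTime
ORDER BY TimeValue([Time]);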
 

MajP

At first look you have significant issues with the data, and until they are fixed I am not sure it is usable. The data cannot be sorted by time as shown, so the hope is that it can be sorted by the order it was entered. That could be done by sorting by the autonumber.

I sorted by User and then by ID, so the events should be in the order they occurred, but they are not. This is probably because of how the data was imported. I assume you stripped the date part away.


qryKPminusTime

| userName | Time |
|---|---|
| User 155 | 1/15/1900 2:24:08 PM |
| User 155 | 12:00:00 AM |
| User 155 | 2/15/1900 7:12:04 PM |
| User 155 | 7/20/1900 7:12:25 PM |
| User 155 | 2/15/1900 7:12:25 PM |
| User 155 | 1/15/1900 2:23:47 PM |
| User 155 | 1/15/1900 2:24:30 PM |
| User 155 | 3/18/1900 |
| User 155 | 3/18/1900 |
| User 155 | 4/18/1900 4:48:17 AM |
| User 155 | 4/18/1900 4:48:17 AM |
So as far as I can tell, there is no way to determine the order of the events in time, unless you say all of these events happened on the same day. In that case you can strip the date part and sort by time. If they can roll past midnight, then there is no possible solution without updated data.
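For reference, the sort described above is just the following, assuming ID is the autonumber and is carried in the query:

Code:
SELECT userName, [Time]
FROM qryKPminusTime
ORDER BY userName, ID;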
 

Slaine2000

Thanks again @MajP for all the help. I might have messed up the time a little creating the sample data set. The log is from a few years back and from several organisations, but time is represented using a UNIX timestamp. I have attached a single instance, i.e. USER 1050 for Question 1, for all keypresses. Is this Date/Time format usable?
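If these are UNIX-style millisecond timestamps (the magnitude of the sample values puts them around December 2015, which fits a log from a few years back), they can be converted to a real Access Date/Time for display; a sketch, with the table name assumed. The elapsed-time arithmetic later in the thread works on the raw numbers directly, so this step is optional:

Code:
SELECT timeStamp,
       CDate(#1/1/1970# + [timeStamp] / 86400000) AS EventTime
FROM tblEvents;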
 

Attachments

  • SampleTime.accdb (464 KB)

MajP

So you would need the elapsed time between each KP -. Then the average of the elapsed time?
Sample Q1 KP

| ID | userName | questionNumber | timeStamp | eventDetail |
|---|---|---|---|---|
| 1104911 | 1050 | 1 | 1450252356377.83 | K: -73 |
| 1104912 | 1050 | 1 | 1450252356409.03 | K: +73 |
| 1104913 | 1050 | 1 | 1450252356767.83 | K: -8 |
So for record 1104913 the elapsed time is 1450252356767.83 - 1450252356377.83?
Do you need the Avg Elapsed Negative Key Press Duration for each user or per section? It does not really matter; if you can do one, you can do both.
 

Slaine2000

MajP said:
"So you would need the elapsed time between each KP -. Then the average of the elapsed time? ... Do you need the Avg Elapsed Negative Key Press Duration for each user or per section?"
Thanks again. I am looking for each user for each section: their average time on each section. I am trying to see if time is a factor that will help improve the accuracy of a machine learning algorithm. I think this is the only way I can use or analyse time within the database.
 

MajP

See if this is what you mean. I added two more UserNames.

Here is the elapsed query between K- events
Code:
SELECT [Current].ID,
       [Current].userName,
       [Current].timeStamp,
       [Current].eventDetail,
       (SELECT Max(B.timeStamp)
        FROM   qryKMinus AS B
        WHERE  [Current].userName = B.userName
               AND B.timeStamp < [Current].timeStamp) AS Previous,
       [Current].[timeStamp] - [Previous] AS Elapsed
FROM   qryKMinus AS [Current]
ORDER  BY [Current].userName,
          [Current].timeStamp;
qryElapsed

| ID | userName | timeStamp | eventDetail | Previous | Elapsed |
|---|---|---|---|---|---|
| 1104896 | 1050 | 1450252354069.02 | K: -80 | | |
| 1104898 | 1050 | 1450252354459.02 | K: -82 | 1450252354069.02 | 390.000732421875 |
| 1104899 | 1050 | 1450252354552.62 | K: -73 | 1450252354459.02 | 93.60009765625 |
| 1104905 | 1050 | 1450252355332.63 | K: -78 | 1450252354552.62 | 780.00146484375 |
| 1104907 | 1050 | 1450252355722.63 | K: -84 | 1450252355332.63 | 390.000732421875 |
| 1104909 | 1050 | 1450252356159.43 | K: -76 | 1450252355722.63 | 436.80078125 |
.....

Then just take the Avg grouped by userName
Code:
SELECT qryElapsed.userName, Avg(qryElapsed.Elapsed) AS AvgOfElapsed
FROM qryElapsed
GROUP BY qryElapsed.userName;

qryAvg

| userName | AvgOfElapsed |
|---|---|
| 1050 | 2131.30 |
| 1051 | 892.32 |
| 1052 | 473.20 |
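If the averages are also needed per section, as asked earlier, questionNumber just has to be carried through qryElapsed (in its SELECT list, and ideally matched in the subquery's WHERE clause as well) and added to the grouping; a sketch under that assumption:

Code:
SELECT userName, questionNumber, Avg(Elapsed) AS AvgOfElapsed
FROM qryElapsed
GROUP BY userName, questionNumber;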
 

Attachments

  • SampleTime_MajP.accdb (544 KB)

Slaine2000

Thanks again, I would be pulling my hair out without your help. I think you may have given me an additional factor (Elapsed Time per KP), which is super cool, but just to double-check my logic on what you have done for me:
  1. "Elapsed Time" is the time elapsed between each keypress for USER 1050 for Section 1
  2. Then the code uses the Avg function to sum the elapsed times and divide by 51 (the number of KP -* records)
  3. Giving USER 1050 an average time between presses for Question 1 of 1727.23503417969
My thought was taking LASTTIMESTAMP (1450252440430.77) - FIRSTTIMESTAMP (1450252354069.02), divided by 51, which gives 1693.37,

a variation of 33.87.
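(For the record, the 33.87 gap is probably just 51 versus 50: the first KP - row has no Previous, so Avg only sees 50 elapsed values, and those 50 consecutive gaps sum to LAST - FIRST. (1450252440430.77 - 1450252354069.02) / 50 = 86361.75 / 50 ≈ 1727.24, matching the query, while dividing by 51 gives the 1693.37 above.)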

As I write this, your calculation gives a more comprehensive answer :). Thanks again @MajP, I seriously owe you more than virtual pints 🍻 🍻

Can these queries be added to the Union Query like before?
 

MajP

My query only includes K- keystrokes, not all keystrokes. I believe that is what you asked for. If it is all keystrokes, then modify the original query to include all of them.

To create the "Average Time Between Negative Presses" for each section and user.

In my opinion, the aggregate alone is close to meaningless. If I keep the individual elapsed times, I can find things like the median, mode, variance, and distribution of the elapsed times. These are usually far from normally distributed; likely you have lots of short events and a few long outliers. A distribution of the data is far more telling about what is happening.
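One quick way to see that distribution in Access is to bucket the elapsed times; a sketch, assuming qryElapsed from earlier and 500 ms wide buckets:

Code:
SELECT Int([Elapsed] / 500) * 500 AS BucketStartMs,
       Count(*) AS Events
FROM qryElapsed
WHERE Elapsed IS NOT NULL
GROUP BY Int([Elapsed] / 500) * 500
ORDER BY Int([Elapsed] / 500) * 500;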
 

Slaine2000

you may have given me an additional factor (Elapsed Time per KP), which is super cool

This factor is brilliant! As you say, I can do a lot more with it :). I just could not figure out how to do it, so I went for the simpler version.

As I write this, your calculation gives a more comprehensive answer :).

It sure does, and I really appreciate the time, the patience :) and the ability to break it into understandable chunks for me. Virtual pints :) 🍻 🍻

K- is perfect; the way you have shown it is great, as I understand it and can replicate it ... it's perfect!!! I will probably add K+ to the algorithm.

I'll get cracking on the full data set today, @MajP. I really cannot thank you enough; you have been a "life saver", as I was looking into a black hole trying to get this done. The more I think of it, you have saved me weeks, not days, of "hair pulling". If you ever get to my side of the pond, drop me a message so I can turn virtual pints into real ones :).

Thanks again for all your help. S
 

MajP

I have spent a lot of time here. If you step through this, your query knowledge will greatly improve. It is very well structured and understandable.
 
