IIF with LIKE in an unbound text box

Slaine2000

New member
Local time
Today, 01:26
Joined
Sep 9, 2021
Messages
24
Hiya, I am looking for some help please. I have been looking over the forum to see if I could sort this out myself. The posts have been a great help and have, I think, got me here, i.e. thinking that using an IIf with LIKE in an unbound text box might be a solution. I think I am close, but my inexperience is just not letting me spot where I am going wrong.
Background
I have a DB (Users) with 700 users; for each user there is a separate DB (Logs) of their logs. This DB has 4 fields: USER / SECTION / TIME / EVENT. At the moment I have to research the EVENT field, which is a Long Text field and contains a mix of text strings like AAA 123 or AAB 123, BB 111, CC 222, or Success or Failure. So at the moment I am doing the analysis by creating individual queries and letting the display tell me the total. Say I want to count the number of occurrences of AAA: I am using Query Design (not too familiar with SQL), so I can use the criteria LIKE "AAA*", select Run, and it works and shows me the number of occurrences. The problem is there are approx. 10 searches for each user. So for the moment I think if I use a report I can create the count totals for these 10 search criteria.
What I have so far
So I have created a report and in the Report Footer inserted a textbox called Total AAA; once I get it working I can copy and change it for AAB, AAC, BB etc. Here is what I have got so far, bearing in mind I have pieced this together from this forum (thanks!):

=Count(IIF[Event] LIKE "AAA*", 1, 0))

When I preview the report it shows the total events (so the wildcard did not work) of 1077, but the answer should be 180 AAA ###

Hopefully this makes sense, and sorry for the long-winded explanation. Thanks again. S
 

Galaxiom

Super Moderator
Staff member
Local time
Today, 12:26
Joined
Jan 20, 2009
Messages
12,849
Count() gives the same answer whether the expression returns 1 or 0, because it counts every non-Null value.
Try Sum() instead.
 

June7

AWF VIP
Local time
Yesterday, 17:26
Joined
Mar 9, 2014
Messages
5,423
=Count(IIf([Event] LIKE "AAA*", 1, Null))

or

=Sum(IIf([Event] LIKE "AAA*", 1, 0))
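To see why the two fixes behave differently, here is an illustrative sketch in Python using SQLite in place of Access (SQLite has no IIf() or * wildcard, so CASE WHEN and % stand in); the table and field names simply mirror the thread:

```python
import sqlite3

# In-memory SQLite stand-in for the Access Logs table (illustrative only).
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE Logs (Event TEXT)")
con.executemany("INSERT INTO Logs VALUES (?)",
                [("AAA 123",), ("AAB 123",), ("AAA 999",), ("Success",)])

# COUNT over a 1/0 expression counts every non-Null row -> total rows (the bug).
wrong, = con.execute(
    "SELECT COUNT(CASE WHEN Event LIKE 'AAA%' THEN 1 ELSE 0 END) FROM Logs"
).fetchone()

# COUNT over 1/Null skips the Nulls -> only the matches (the first fix).
count_null, = con.execute(
    "SELECT COUNT(CASE WHEN Event LIKE 'AAA%' THEN 1 END) FROM Logs"
).fetchone()

# SUM over 1/0 adds up the matches (the second fix).
summed, = con.execute(
    "SELECT SUM(CASE WHEN Event LIKE 'AAA%' THEN 1 ELSE 0 END) FROM Logs"
).fetchone()

print(wrong, count_null, summed)  # 4 2 2
```

Count() tallies every non-Null value, so feeding it a 1-or-0 expression counts all the rows; 1-or-Null (or Sum() of 1-or-0) counts only the matches.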
 

Slaine2000

New member
Local time
Today, 01:26
Joined
Sep 9, 2021
Messages
24
=Count(IIf([Event] LIKE "AAA*", 1, Null))

or

=Sum(IIf([Event] LIKE "AAA*", 1, 0))
Genius!! Thanks a million, and for the examples. Oh why, oh why did I not post here sooner; I have been at this on and off for the past 2 days. Thanks again
 

Pat Hartman

Super Moderator
Staff member
Local time
Yesterday, 21:26
Joined
Feb 19, 2002
Messages
42,981
If the data were properly normalized, you could create a totals query and bind it to a subreport and not have to run ten queries for each record.

Select GroupField, Count(*) As GroupCount
From YourTable
Group by GroupField
Order By GroupField;

With that query, it doesn't matter how many groups there are or what their names are. With your current schema, you have to remember to modify the report and who knows what else if a new group ever gets added.
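For readers following along, the totals query can be sketched outside Access; here is a Python/SQLite illustration with made-up group values (YourTable and GroupField are the placeholder names from the post):

```python
import sqlite3

# SQLite stand-in for a properly normalized Access table.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE YourTable (GroupField TEXT)")
con.executemany("INSERT INTO YourTable VALUES (?)",
                [("AAA",), ("AAA",), ("AAB",), ("BB",), ("BB",), ("BB",)])

# One totals query covers every group, however many there are.
rows = con.execute(
    """SELECT GroupField, COUNT(*) AS GroupCount
       FROM YourTable
       GROUP BY GroupField
       ORDER BY GroupField"""
).fetchall()

print(rows)  # [('AAA', 2), ('AAB', 1), ('BB', 3)]
```

A new group value appearing in the data simply produces a new row in the result, with no change to the query or the report.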

First normal form tells us that every column should contain one atomic value. That leads us to a structure like Prefix, FirstName, MidInit, LastName, Suffix rather than just "Name". When you mush together things that have individual uses, you are stuck with extra work to pull them apart.
 

Slaine2000

New member
Local time
Today, 01:26
Joined
Sep 9, 2021
Messages
24
If the data were properly normalized, you could create a totals query and bind it to a subreport and not have to run ten queries for each record.

Select GroupField, Count(*) As GroupCount
From YourTable
Group by GroupField
Order By GroupField;

With that query, it doesn't matter how many groups there are or what their names are. With your current schema, you have to remember to modify the report and who knows what else if a new group ever gets added.

First normal form tells us that every column should contain one atomic value. That leads us to a structure like Prefix, FirstName, MidInit, LastName, Suffix rather than just "Name". When you mush together things that have individual uses, you are stuck with extra work to pull them apart.
Thanks Pat for the detailed reply. I am afraid that would be beyond me at the mo :), both in terms of knowledge and time. Analysis of the DB (Logs) is a vital but smaller part of the overall project that I was asked to do. I knew I could do it one way or another using Access, as I am not a coder. Thanks again. S
 

cheekybuddha

AWF VIP
Local time
Today, 01:26
Joined
Jul 21, 2014
Messages
2,237
@Slaine2000

I think you need to change your avatar to something like:

[attached image]


;)
 

Slaine2000

New member
Local time
Today, 01:26
Joined
Sep 9, 2021
Messages
24
The forum has saved me hours if not days :) Thank you so much. Can I ask one further question please (being a bit lazy... or hopefully on a roll)? I now have a report showing the 10 summations, i.e. the total amounts of AA ## and AB ## etc., which is brill. Can the report reduce the iterations further?

I also need to find, for each user, the above 10 summations for each section. At the moment that means creating a query for User 1 with the additional criteria Section 1 plus the above report, then the same again for User 1 with criteria Section 2 and the same report, and then again for User 1 with a Section 3 report.

Each user accesses 3 sections (Section 1, Section 2, Section 3) and the logs are kept per section. Is there a way in the report to show the 10 summations for each section, meaning I only need to create a single query for User 1 and a single report that shows the 30 summations I need? Or should I just leave it as it is and create 3 queries for User 1 (Sections 1, 2 and 3) and 3 reports? I ask as I have 700 users. Thanks again. S
 

Galaxiom

Super Moderator
Staff member
Local time
Today, 12:26
Joined
Jan 20, 2009
Messages
12,849
Can I ask one further question please (being a bit lazy... or hopefully on a roll)? I now have a report showing the 10 summations, i.e. the total amounts of AA ## and AB ## etc., which is brill. Can the report reduce the iterations further?

Sounds like you are faced with working around badly structured data.

Pat is probably on the mark.
If the data were properly normalized, you could create a totals query and bind it to a subreport and not have to run ten queries for each record.

My answer earlier was based entirely on reading these two lines.
Count(IIF[Event] LIKE "AAA*", 1, 0))

When I preview the report it shows the total events (so the wildcard did not work) of 1077 but the answer should be 180 AAA ###
 

Slaine2000

New member
Local time
Today, 01:26
Joined
Sep 9, 2021
Messages
24
Sounds like you are faced with working around badly structured data.

Pat is probably on the mark.


My answer earlier was based entirely on reading these two lines.
Yes, you and Pat are bang on ... the data is badly structured, as the 10 (possibly more) queries are creating new information (for a machine-learning algorithm) just from the Event field. I have not looked at the Time field yet. You have already saved me hours if not days, as the DB (Logs) has 650,000 records from 700 users.

The big problem for me is I am a basic user, or just a cabbage :), so I am afraid the time spent trying to normalise the data, which would let me view the data I want more easily (in the form Pat explained), will take me longer than just manually plugging away (although it's soul-destroying). The query data are all over the Event field, so I do not know how easy parsing or separating the text field to put it into first normal form (thanks Pat) would be, i.e. individual columns of AA:, AB:, BS:, K:, COM:, Success and Failure.

Is it opening a can of worms for you :)? I greatly appreciate the help already. I would be happy to learn how to do it, if you have the time to break it into baby steps.

Thanks again
S
 

Pat Hartman

Super Moderator
Staff member
Local time
Yesterday, 21:26
Joined
Feb 19, 2002
Messages
42,981
In the long run it is always better to do it right the first time and I have the scars to prove it. You are creating kludges and every new thing you want to count will require more kludges. You were told how to create some kludges to get the first 10 counts. Use the same techniques to do the next 100 or think about normalizing the data.

Just keep in mind that each of the LIKE queries is not using an index and so requires the query to read every single row in the table. The more of these queries you run, the slower the process becomes.

You might consider doing all the IIf()s at the same time:

Select Sum(IIf(Instr(yourfield, "AAA") > 0, 1, 0)) As CountAAA, Sum(IIf(Instr(yourfield, "BBB") > 0, 1, 0)) As CountBBB, etc.

The Instr() function finds the starting position of the string you are searching for. Therefore if the result is > 0, the string is found so you want the IIf() to return 1, otherwise 0. The Sum() adds up all the 1's.
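As an illustration only, here is the same one-pass idea in Python with SQLite standing in for Access; SQLite happens to have an instr() too, while CASE WHEN replaces IIf() ("yourfield" and the AAA/BBB markers are the placeholders from the post):

```python
import sqlite3

# SQLite stand-in for the Logs table with a mixed free-text field.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE Logs (yourfield TEXT)")
con.executemany("INSERT INTO Logs VALUES (?)",
                [("AAA 123",), ("BBB 9",), ("AAA 7",), ("Success",)])

# One pass over the table produces every count at once: instr() > 0 means
# the marker was found, the CASE turns that into 1/0, and SUM adds the 1s.
count_aaa, count_bbb = con.execute(
    """SELECT SUM(CASE WHEN instr(yourfield, 'AAA') > 0 THEN 1 ELSE 0 END),
              SUM(CASE WHEN instr(yourfield, 'BBB') > 0 THEN 1 ELSE 0 END)
       FROM Logs"""
).fetchone()

print(count_aaa, count_bbb)  # 2 1
```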

The first thing to learn about Access is the catalog of functions. Look for the list ordered by type. You don't want to search an alpha list for a function whose name you don't have any idea of or even if it exists. The grouped list puts functions of the same type together so all the "date" functions are together and all the "string" functions are together, etc. which is a much easier way of finding what you need.

This will get the job done using one pass of the recordset and so might be faster than using separate queries with LIKE.

Don't consider for a second that I think this is a good solution. It is also a kludge.

If this is a one-time effort, it might be too much for you to learn about normalization to get the task done but if this is something you are going to do regularly, or expect to expand, the time to normalize won't be wasted. It is actually less of an effort than you think if the string is consistently formatted.
 

Slaine2000

New member
Local time
Today, 01:26
Joined
Sep 9, 2021
Messages
24
In the long run it is always better to do it right the first time and I have the scars to prove it. You are creating kludges and every new thing you want to count will require more kludges. You were told how to create some kludges to get the first 10 counts. Use the same techniques to do the next 100 or think about normalizing the data.

Just keep in mind that each of the like queries is not using an index and so requires the query to read every single row in the table. The more of these queries you run, the slower the process becomes.

You might consider doing all the IIf()s at the same time:

Select Sum(IIf(Instr(yourfield, "AAA") > 0, 1, 0)) As CountAAA, Sum(IIf(Instr(yourfield, "BBB") > 0, 1, 0)) As CountBBB, etc.

The Instr() function finds the starting position of the string you are searching for. Therefore if the result is > 0, the string is found so you want the IIf() to return 1, otherwise 0. The Sum() adds up all the 1's.

The first thing to learn about Access is the catalog of functions. Look for the list ordered by type. You don't want to search an alpha list for a function whose name you don't have any idea of or even if it exists. The grouped list puts functions of the same type together so all the "date" functions are together and all the "string" functions are together, etc. which is a much easier way of finding what you need.

This will get the job done using one pass of the recordset and so might be faster than using separate queries with LIKE.

Don't consider for a second that I think this is a good solution. It is also a kludge.

If this is a one-time effort, it might be too much for you to learn about normalization to get the task done but if this is something you are going to do regularly, or expect to expand, the time to normalize won't be wasted. It is actually less of an effort than you think if the string is consistently formatted.
Thank you Pat. I appreciate the advice and your taking the time for the comprehensive reply, which is extremely helpful. It is a one-time effort :), as I'll keep my mouth closed next time this comes up :).

However, you're right, I am only on user 25 :) (only 675 to go) and the boredom is killing me. So I'll try the all-IIf()s approach. Failing that, it's time to learn normalization; I suppose "Once you stop learning, you start dying". I think I have a lead time of around 2 to 3 weeks. Thank you again for all the help and advice. S
 

Pat Hartman

Super Moderator
Staff member
Local time
Yesterday, 21:26
Joined
Feb 19, 2002
Messages
42,981
You're welcome:)

Include the UserID in the query and you'll get the whole 675 done with a single query!!!!
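A hedged sketch of what that buys, again in Python with SQLite in place of Access (CASE WHEN replacing IIf(), % replacing *, and invented sample rows): grouping by user and section makes one query return a row of counts per user per section.

```python
import sqlite3

# SQLite stand-in for the Logs table, now with User and Section columns.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE Logs (User TEXT, Section INTEGER, Event TEXT)")
con.executemany("INSERT INTO Logs VALUES (?, ?, ?)", [
    ("user 1", 1, "AAA 123"), ("user 1", 1, "AAB 5"),
    ("user 1", 2, "AAA 9"),   ("user 2", 1, "AAA 4"),
])

# One query, grouped by user and section, covers everyone at once.
rows = con.execute(
    """SELECT User, Section,
              SUM(CASE WHEN Event LIKE 'AAA%' THEN 1 ELSE 0 END) AS CountAAA
       FROM Logs
       GROUP BY User, Section
       ORDER BY User, Section"""
).fetchall()

print(rows)  # [('user 1', 1, 1), ('user 1', 2, 1), ('user 2', 1, 1)]
```

More Sum(IIf(...)) columns (one per search string) extend the same query to all 10 counts without adding more passes over the table.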
 

Slaine2000

New member
Local time
Today, 01:26
Joined
Sep 9, 2021
Messages
24
I am sorry about this, but could I ask someone to help with parsing the Eventdetail field please? I am going to try to put it into first normal form as Pat recommends... however, I have no SQL or VBA experience. I can use Query Design, however :), so you will have to keep it simple please.

There are the 4 fields stated above plus an ID (ID -> User -> Section -> Time -> Eventdetail); the table holds the logs of each user. They look like below:

ID | User     | Section | Time   | Eventdetail
12 | user 1   | 1       | 123.3  | AAA 123454565.566
13 | user 2   | 2       | 123.4  | AB 12334.6
14 | user 300 | 1       | 1222.4 | Z: +8
15 | user 500 | 4       | 123.44 | Z: -256
16 | user 50  | 2       | 1245.6 | Operation Success: code 122 abx
17 | user 700 | 3       | 1276.5 | Operation failure: code 234 band ixx

So at the moment I want to analyse the Eventdetail field for the total groups of AA, AB, Z +8, Z +, Z -, Success and Failure. I want to move the entire field value AAA ### to its own field AAA, and the entire value, if it has a Z: +, to a Z: + field, in order to break Eventdetail into distinct groups so I can use the above advice. I have been looking and it seems it can be done with UPDATE queries, but I can't get it to move the value to the new field, i.e. AA, AB or Z: +, and the InStr() function is fighting with me also :). I have looked over the help, Google and here on the forum, so I am close, but I just can't see where to click to get the update query to move the value to the new field; all it does is replace it in the original. Hope I am making sense.
 

June7

AWF VIP
Local time
Yesterday, 17:26
Joined
Mar 9, 2014
Messages
5,423
You want a field for AA, a field for AB, a field for Z, etc.? That would not be normalization. It means each record will have a bunch of blank fields, as only one of these fields could have a value. That is actually de-normalization (which is what the IIf() calcs or a CROSSTAB query accomplish). Normalization would be putting "AAA" in one field and the number part in another (atomic values).

The data in Eventdetail has a variable structure, so it's hard to tell what the "atomic values" are, especially for the Success and Failure data.

The IIf() calcs in an aggregate query provided by Pat may be the best way to deal with your data, if there really are only 7 groups. A CROSSTAB query could probably also be designed to produce this same de-normalized output.

Having said all that, it is a balancing act between normalization and ease of data entry/output. "Normalize until it hurts, de-normalize until it works."
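To make the "atomic values" point concrete, here is a sketch (again SQLite via Python, purely illustrative) that splits the simpler Eventdetail values at the first space into a code part and a value part; the messier Operation lines would need extra rules:

```python
import sqlite3

# SQLite stand-in for the Logs table with a few of the sample values.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE Logs (Eventdetail TEXT)")
con.executemany("INSERT INTO Logs VALUES (?)",
                [("AAA 123454565.566",), ("AB 12334.6",), ("Z: +8",)])

# instr() finds the first space; substr() takes the text before and after it.
rows = con.execute(
    """SELECT substr(Eventdetail, 1, instr(Eventdetail, ' ') - 1) AS Code,
              substr(Eventdetail, instr(Eventdetail, ' ') + 1)    AS Value
       FROM Logs"""
).fetchall()

print(rows)  # [('AAA', '123454565.566'), ('AB', '12334.6'), ('Z:', '+8')]
```

Once the code lives in its own column, the simple GROUP BY totals query works directly, with no LIKE patterns at all.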
 

Slaine2000

New member
Local time
Today, 01:26
Joined
Sep 9, 2021
Messages
24
Thanks a million for the reply. Unfortunately, I tried to do the IIf() calc in the aggregate query provided by Pat, but I could not figure out the SQL side to get it to work; can it be done within Query Design? As you say, the Eventdetail field data is so varied it may not be possible to normalize it. The main problem is this is going to be a one-off analysis... for me anyway, but I am happy to put in the work; as Pat indicates, if new groups get added then I am back to square one :).

I have 3 solutions (Pat's help and expertise have been great); the problem is I am such a basic user that I can't get any of them to work (no SQL or VBA) and I still can't figure out first normal form :).

Solution 1 (IIf() calc in an aggregate query): can this be done from within Query Design? Is it something someone would not mind explaining to an Access newbie, and that won't take them a year? :)

Solution 2 (Normalization)
Thank you, I have never really been able to get the whole normalization thing (wash your mouth out, you say :)). The analysis part for me would, I think, mostly work without the number parts for AA and AB, and the Z: values are very specific: (Z: *), (Z: *8), (Z: +*), (Z: -*), plus Operation Success and Operation failure. But this means the InStr() function in the UPDATE queries, if you still think they are suitable, will be a monster 😨

Solution 3 (Keep crunching away)
This is a lot better than it was, thanks to the lads above, but I have to create 3 queries and 3 reports per user (Sections 1, 2, 3). It's slow going, but at least I can get it done eventually, assuming no new groups are added.
 

Pat Hartman

Super Moderator
Staff member
Local time
Yesterday, 21:26
Joined
Feb 19, 2002
Messages
42,981
If you post a db with the table and some data and a list of the 10 strings you are looking for, I'll build the query unless someone beats me to it.

I'm guessing that this is not data that is under your control. It looks like it comes from an event log. If the criteria you are searching for are just the 10, then my suggested query is probably the simplest solution. It should produce the results with one query.
 

Slaine2000

New member
Local time
Today, 01:26
Joined
Sep 9, 2021
Messages
24
Wow! Pat, thank you so much; I really do appreciate the help, the advice and the time. You are brilliant to offer. I have created a sample data set.

Criteria for each user for each section (1, 2, or 3):
Total AA ... for Section 1, 2, 3
Total AB ... for each section
Total KP ... for each section
Total KP???8 ... for each section .........this is a specific button press
Total KP: +* ..... ditto ......... All positive presses
Total KP: -* ..... ditto ......... All negative presses
Total Operation * for each section
Total *Success for each section
Total *Failure for each section

Thanks again for all your help
 

Attachments

  • Sample.accdb
    488 KB · Views: 163
