old dog, new trick - but is it efficient? LIKE Statement in VBA: 2 choices

Rx_

Nothing In Moderation
Local time
Today, 07:47
Joined
Oct 22, 2009
Messages
2,803
Would this Like statement be as efficient as using the InStr function?
A well name can have an H or D that always appears in the last 3 characters of its name (Horizontal or Directional).
There is already an existing recordset.

The function is a rule that is run on the Well Name and returns True/False to a column in a second table. Efficiency is important, as this may be run many times.

Does the Like statement run efficiently in code?
It certainly codes nicely, returning True/False based on a string comparison.


Code:
Public Function RE_R73() As Boolean  ' Well Name has H or D in last 3 chars (Well Type)
    Dim tableValue As Variant
    Dim Result     As Variant
10  On Error GoTo err_Trap
    'RStblvRE_1Seg
20  RE_R73 = False  ' false until proven true
30  RStblvRE_1Seg.MoveFirst  ' bring pointer to top
40  tableValue = RStblvRE_1Seg.Fields("well_Name")
50  tableValue = Right(tableValue, 3)
60  Result = tableValue Like "*H*" Or tableValue Like "*D*"
65  RE_R73 = Result  ' return the rule result from the function as well
    ' write result to tblvRE_1SegResult
70  RStblvRE_1SegResult.Edit
80  RStblvRE_1SegResult.Fields("RE_R73") = Result
90  RStblvRE_1SegResult.Update
100 Exit Function
err_Trap:
110 Debug.Print "RE_R73  " & Err.Description
End Function

Notice, there is an H OR D to locate, reported as True/False.
I used to use a nested If/Then with InStr (in string).
Came across this and was just wondering if it would be more efficient.
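For comparison, the nested-If InStr approach mentioned above reduces to something like this one-liner (a sketch using the same variables as the function above):

```vba
' InStr returns 0 when the substring is not found, so a compound
' test replaces the nested If/Then:
Result = (InStr(tableValue, "H") > 0) Or (InStr(tableValue, "D") > 0)
```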
 

CJ_London

Super Moderator
Staff member
Local time
Today, 14:47
Joined
Feb 19, 2013
Messages
16,601
Shame the H or D is not at the beginning, in which case Like "D*" would be the most efficient (it would use the index if one exists).

Just done a rough test filtering 250k records, and Like was about 25% faster - but it may not be comparing the same scenario.

Incidentally, not tested in VBA, but

Like *[H/D]*

will test for either an H or a D in one go, so it may be quicker
 

Galaxiom

Super Moderator
Staff member
Local time
Today, 23:47
Joined
Jan 20, 2009
Messages
12,851
Incidentally, not tested in VBA, but

Like *[H/D]*

will test for either a H or D in one go so may be quicker

That will actually test for the presence of "H", "/" or "D".

The syntax for H or D is:
Like "*[HD]*"
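A quick Immediate-window check (a minimal sketch) shows how the two patterns differ: inside brackets, every character is a literal member of the list, so "/" matches a literal slash.

```vba
' "*[H/D]*" matches any of the three characters H, / or D;
' "*[HD]*"  matches only H or D.
Sub TestLikePatterns()
    Debug.Print "A/B" Like "*[H/D]*"   ' True  - the "/" matches
    Debug.Print "A/B" Like "*[HD]*"    ' False - no H or D present
    Debug.Print "1HX" Like "*[HD]*"    ' True  - H is present
End Sub
```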
 

Galaxiom

Super Moderator
Staff member
Local time
Today, 23:47
Joined
Jan 20, 2009
Messages
12,851
BTW, if you are doing a Find or Filter on an existing recordset with the idea that an already-open recordset will be faster than opening a new one with the precise criteria, you would be mistaken.

It is faster to open the new recordset.
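The two approaches might be sketched like this (table and field names here are hypothetical; the second form lets the engine resolve the criteria up front, though a leading wildcard still prevents index use):

```vba
' Approach 1: search an already-open recordset
Dim rs As DAO.Recordset
Set rs = CurrentDb.OpenRecordset("tblWells", dbOpenDynaset)
rs.FindFirst "well_Name Like '*H*'"
Debug.Print Not rs.NoMatch

' Approach 2: open a new recordset with the criteria baked in
Dim rs2 As DAO.Recordset
Set rs2 = CurrentDb.OpenRecordset( _
    "SELECT well_Name FROM tblWells WHERE well_Name Like '*H*'", dbOpenSnapshot)
Debug.Print Not (rs2.BOF And rs2.EOF)  ' True if at least one match
```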
 

vbaInet

AWF VIP
Local time
Today, 14:47
Joined
Jan 22, 2010
Messages
26,374
That will actually test for the presence of "H", "/" or "D"

The syntax for H or D is:
Like "*[HD]*"
CJ got that from me. I confused it with Perl's "|" for Or in a character set.
 

vbaInet

AWF VIP
Local time
Today, 14:47
Joined
Jan 22, 2010
Messages
26,374
On the subject of efficiency, it would be better to run one UPDATE in SQL than to cycle through a recordset (which is what I suspect you're eventually doing).
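As a sketch of that idea, the rule from the first post could be done as a single set-based UPDATE. The table and field names come from the posted code, but the join key (ID_Well) is an assumption:

```vba
' Set-based alternative: one UPDATE instead of looping a recordset.
' Assumes tblvRE_1SegResult joins tblvRE_1Seg on ID_Well (hypothetical key).
CurrentDb.Execute _
    "UPDATE tblvRE_1SegResult AS r INNER JOIN tblvRE_1Seg AS s " & _
    "ON r.ID_Well = s.ID_Well " & _
    "SET r.RE_R73 = IIf(Right(s.well_Name, 3) Like '*[HD]*', True, False)", _
    dbFailOnError
```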
 

Rx_

Nothing In Moderation
Local time
Today, 07:47
Joined
Oct 22, 2009
Messages
2,803
Wow, I would not have expected that!
So opening a new recordset with OpenRecordset is faster than searching an already-open one?

This is the code against an open recordset:
Code:
Public Function RE_R11() As Boolean ' Rule 9 Navigator Header
    Dim Has_SHLorBHL As Boolean
    On Error GoTo err_Trap
10  Has_SHLorBHL = False  ' false until proven true
    RStblvRE_1Seg.FindFirst "SHLBHL Like ""*H*"""  ' either SHL or BHL
    Has_SHLorBHL = Not RStblvRE_1Seg.NoMatch       ' result of FindFirst
    'RStblvRE_1Seg
    RE_R11 = Has_SHLorBHL
    ' write result to tblvRE_1SegResult
    RStblvRE_1SegResult.Edit
    RStblvRE_1SegResult.Fields("RE_R11") = RE_R11
    RStblvRE_1SegResult.Update
    Exit Function
err_Trap:
    Debug.Print "RE_R11  " & Err.Description
End Function
 

vbaInet

AWF VIP
Local time
Today, 14:47
Joined
Jan 22, 2010
Messages
26,374
BTW, if you are doing a Find or Filter on an existing recordset with the idea that an already-open recordset will be faster than opening a new one with the precise criteria, you would be mistaken.

It is faster to open the new recordset.
I'm sure I've tested this before and found that FindFirst is quicker - or, if I'm not mistaken, Filter is even quicker. It's been a while since I did all this, so I don't remember for sure, but I don't see how opening a new recordset would be quicker.

I'll test this again later and report back.
 

vbaInet

AWF VIP
Local time
Today, 14:47
Joined
Jan 22, 2010
Messages
26,374
...but I don't see how opening a new recordset would be quicker.
And I suppose the Filter method is going to require a "re-opening" of the recordset, so I probably won't bother testing that.

It'll be a comparison between FindFirst and OpenRecordset.
 

Rx_

Nothing In Moderation
Local time
Today, 07:47
Joined
Oct 22, 2009
Messages
2,803
That's what I expected too - the Filter requires a new recordset.
I quickly sketched the conceptual re-design of the Rule Engine.

During prototyping, with changes almost daily, each "rule" was a discrete query (recordset) that called SQL Server to return 0 to many fields / records. The function then used business rules to return a True/False.
So, for one single entity (a well), this process filled in one field with a result. This happens about 85 times for one record.
Later, the 85 results in the result record are examined to make a Status determination.
By the way, this works perfectly and automated a complex regulatory application. The problem: it takes forever. Imagine 20,000 records x 85 functions.

Re-design:
Now that I have a grasp of what is being re-pulled over and over, a new but more complex recordset for one entity is pulled and persisted.
The functions now poll the persisted recordset rather than each one reaching out over the network to the SQL database with a query.
The reason this discussion is of interest to me is to make this run as fast as possible. It runs basically in real time for each client and automates the status.
So, in this case, milliseconds count.

Thanks for the valuable observations and feedback!
 

Attachments

  • Rule Engint RE_ Redesign A.gif (37.5 KB)

vbaInet

AWF VIP
Local time
Today, 14:47
Joined
Jan 22, 2010
Messages
26,374
Code:
    RStblvRE_1Seg.FindFirst "SHLBHL Like ""*H*"""    ' either SHL or BHL
You're better off with
Code:
FindFirst "SHLBHL LIKE '*[BS]HL*'"
... your current search returns more than just SHL or BHL. However, since the field is called SHLBHL, I suspect it will only ever contain 'SHL', 'BHL', or Null. If that is the case, you want an Is Not Null comparison instead.
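A sketch of that alternative, assuming the field really is only ever 'SHL', 'BHL' or Null:

```vba
' Testing for any non-Null value is simpler and cheaper than
' pattern matching when only those two values (or Null) can occur:
RStblvRE_1Seg.FindFirst "SHLBHL Is Not Null"
Has_SHLorBHL = Not RStblvRE_1Seg.NoMatch
```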
 

vbaInet

AWF VIP
Local time
Today, 14:47
Joined
Jan 22, 2010
Messages
26,374
As promised, here are some test cases and results.

Table - contains all fields listed below, ~100k records
Queries - based on table, showing all fields from table and with the sort stated below

Field1 - AutoNumber, Indexed
Field2 - Text, Indexed, Sorted ASC
Field3 - Number
Field4 - Date

Scenario 1: LIKE '*X*', where X is a character
Field used: Field2 (Text, Indexed, Sorted ASC - no other sort applied)
Record(s) returned: ~24k

Test Cases
  1. Filter method
  2. FindFirst method
  3. OpenRecordset method
  4. Requery method - parameter set via a Let property before the requery; the query reads it back through the matching Get property
  5. QdfParam method - parameter set on a QueryDef, and the recordset opened from that QueryDef
... at the beginning of each test a recordset is opened after which the timer starts.

Results - timer used is a high res timer with a precision of about 100 microseconds:
Code:
Filter:        1        0.0021
Filter:        2        0.0035
Filter:        3        0.002
Filter:        4        0.002
Filter:        5        0.002
Filter:        6        0.0028
Filter:        7        0.0037
Filter:        8        0.0019
Filter:        9        0.0018
Filter:        10       0.0034
Filter:        11       0.0025
Filter:        12       0.0051
Filter:        13       0.002
Filter:        14       0.0021
Filter:        15       0.002

FindFirst:     1        0.002
FindFirst:     2        0.0027
FindFirst:     3        0.0014
FindFirst:     4        0.0014
FindFirst:     5        0.0013
FindFirst:     6        0.0014
FindFirst:     7        0.0025
FindFirst:     8        0.0014
FindFirst:     9        0.0014
FindFirst:     10       0.0014
FindFirst:     11       0.0021
FindFirst:     12       0.0025
FindFirst:     13       0.0014
FindFirst:     14       0.0026
FindFirst:     15       0.0014

OpenRecordset: 1        0.3435
OpenRecordset: 2        0.3959
OpenRecordset: 3        0.397
OpenRecordset: 4        0.3179
OpenRecordset: 5        0.3528
OpenRecordset: 6        0.3156
OpenRecordset: 7        0.3086
OpenRecordset: 8        0.3291
OpenRecordset: 9        0.3152
OpenRecordset: 10       0.3264
OpenRecordset: 11       0.3356
OpenRecordset: 12       0.3291
OpenRecordset: 13       0.3335
OpenRecordset: 14       0.317
OpenRecordset: 15       0.3214

Requery:       1        0.3211
Requery:       2        0.3263
Requery:       3        0.3213
Requery:       4        0.3109
Requery:       5        0.3149
Requery:       6        0.3261
Requery:       7        0.3513
Requery:       8        0.325
Requery:       9        0.3436
Requery:       10       0.3131
Requery:       11       0.3573
Requery:       12       0.3501
Requery:       13       0.3351
Requery:       14       0.4935
Requery:       15       0.5507

QdfParam:      1        0.3729
QdfParam:      2        0.3781
QdfParam:      3        0.3186
QdfParam:      4        0.3374
QdfParam:      5        0.4385
QdfParam:      6        0.4895
QdfParam:      7        0.3
QdfParam:      8        0.3122
QdfParam:      9        0.3117
QdfParam:      10       0.3105
QdfParam:      11       0.4669
QdfParam:      12       0.3161
QdfParam:      13       0.2982
QdfParam:      14       0.301
QdfParam:      15       0.5328

Following are other scenarios using the same ~100k records with results averaged out:
Code:
Scenario 2: Equal To (=), Text field (Indexed, ASC), ~200 records returned
Filter:                 0.1363
FindFirst:              0.0301
OpenRecordset:          0.004
Requery:                0.0011
QdfParam:               0.0031


Scenario 3: Equal To (=), Number field (Indexed, ASC), 1 record returned
Filter:                 0.109
FindFirst:              0.1183
OpenRecordset:          0.0024
Requery:                0.0001
QdfParam:               0.0027


Scenario 4: Greater Than (>), Number field (Indexed, ASC), ~50000 records returned
Filter:                 0.0643
FindFirst:              0.0603
OpenRecordset:          0.1504
Requery:                0.3317
QdfParam:               0.0027

Make your own judgement.
 

Rx_

Nothing In Moderation
Local time
Today, 07:47
Joined
Oct 22, 2009
Messages
2,803
Wow, that is a real eye-opener.
I read on MSDN that an Access local table (not a linked table) is treated basically the same as the Recordset object - it stays in OS priority RAM.
Naturally, the OS virtual-memory swapping is ultimately in control of that, but the intent is to keep it basically the same as a recordset.

Per my chart, the view of SQL Server has about 100,000 records with the ID_Well (PK) in groups of 1 to 10.
Each group is brought into a local table (the 1 to 20).
Then the functions, using FindFirst or other methods, run against the column(s) of the local table to make a single True/False determination.
The True/False determination is written (updated) to a single PK result row's column. The result is 10,000 rows times 20 columns, or 200,000 discrete analyses.

Old method: the attached diagram on the previous post (top method) had each rule (function) open a recordset and run against the linked table, one column at a time.
That took about 25 minutes.

New method: the new method fetches the PK (1 to 20 records), places it in a local table, and opens one recordset that stays open (persisted) while the 20 functions run column by column for that one PK. The results are updated column by column to a local table.

Instead of 25 minutes, it now runs in 2.5 minutes.

So, making a single pull across to SQL Server reduced this process by a factor of about 10.

My timer results show the local FindFirst and local-table update process is taking 80% of the 2.5 minutes.

Your excellent analysis is a real eye-opener!
I am going to rewrite the code to use OpenRecordset with SQL text, passing in variable parameters.

It is interesting that, after being a Certified Microsoft Access Trainer since Access 97 first appeared, much of the code and courseware "code samples" we used to teach keeps changing. Of course, let's remember how expensive RAM was back then and how the OS plus hard-drive cache has changed in 15 years.

Thanks so much for your excellent analysis. It really made my day!
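The OpenRecordset-with-SQL-text rewrite might look something like this minimal sketch (the function and table names are hypothetical; ID_Well is the PK mentioned above):

```vba
' Sketch: build the SQL text with the variable baked in, then open
' exactly the records needed for one well.
Public Function OpenWellRecords(ByVal lngWellID As Long) As DAO.Recordset
    Dim strSQL As String
    strSQL = "SELECT * FROM tblvRE_1Seg WHERE ID_Well = " & lngWellID
    Set OpenWellRecords = CurrentDb.OpenRecordset(strSQL, dbOpenSnapshot)
End Function
```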
 

vbaInet

AWF VIP
Local time
Today, 14:47
Joined
Jan 22, 2010
Messages
26,374
I should have mentioned that the tests were performed on a local table in a non-split test db.
You may get different results when run against a linked table, so it's worth testing that.

Also, as an alternative, you could look into disconnected recordsets (using ADO) and see if that makes a difference.
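A disconnected recordset might look something like this sketch (assumes a reference to the Microsoft ActiveX Data Objects library; the connection string and names are placeholders):

```vba
' ADO disconnected recordset: fetch once, drop the connection,
' then search/filter entirely in memory.
Dim cn As ADODB.Connection, rs As ADODB.Recordset
Set cn = New ADODB.Connection
cn.Open "Provider=SQLOLEDB;Data Source=MyServer;..."  ' placeholder
Set rs = New ADODB.Recordset
rs.CursorLocation = adUseClient          ' client-side cursor is required
rs.Open "SELECT * FROM tblWells", cn, adOpenStatic, adLockBatchOptimistic
Set rs.ActiveConnection = Nothing        ' disconnect - data stays local
cn.Close
rs.Filter = "well_Name LIKE '*H*'"       ' in-memory filter, no round trip
Debug.Print rs.RecordCount
```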
 
