Does it not cache BE table data? Very slow when repeatedly accessing a BE table (only 5 records)

bignose2

Registered User.
Local time
Today, 20:07
Joined
May 2, 2010
Messages
219
Hi
I never really noticed this before. I know there is overhead with a BE on the local network (gigabit), but I have large tables with 10,000+ records and get minimal slowdown, really.

However, I have a BE table of only 5 records: a lookup of rates depending on a date range.

I am testing because I noticed a DLookup in a query was very slow (I know it's not the best and can be slow, but on 5 records I figured it would be OK).

So I tested with a loop of 100 iterations, just a DLookup searching through these 5 records each time.
It takes 1.8 seconds.
If I do the same test on a larger table I have (3,800 records, similar dates etc.), it also takes 1.8 seconds!
So I assume the time taken is really just accessing the BE over the network, not looping through the records.

If I make the 5-record table local, the test takes 0.06 seconds.

Thinking about it, I guess it has to re-access the BE in case the data has changed.

Any ideas/advice?

I could copy to a local table at the start of each day, as it is not going to change very often.

It just seems messy.
 

arnelgp

..forever waiting... waiting for jellybean!
Local time
Tomorrow, 04:07
Joined
May 7, 2009
Messages
19,169
google "Persistent Connection to Ms Access Backend".
 

bignose2

Registered User.
Local time
Today, 20:07
Joined
May 2, 2010
Messages
219
Hi,

I am not sure a persistent connection is the problem, although I am no expert.

Running the loop 100 times, I would think it has not disconnected; I have also run the test with the table actually open.

Public Function VatRateDate(DateX As Date)
    ' Return the VAT rate whose [incStartDate]..[incEndDate] range contains DateX
    VatRateDate = DLookup("[VatRate]", "PricesChanges", SQLDate(DateX) & " Between [incStartDate] AND [incEndDate]")
End Function

SQLDate just makes sure the date is in the right format for UK dates; I get the same result if I pass #05-01-2020# directly.

Public Function testvatrate()
    ' Time 100 consecutive DLookups against the BE table
    Dim x As Integer, t As Double
    t = Timer
    For x = 1 To 100
        VatRateDate ("1 may 20")
    Next x
    Debug.Print Timer - t
End Function

?testvatrate
1.8 seconds

It is on a gigabit wired LAN; I get 450 Mbps.
The same 1.8 seconds on a similar table with 4,000 records and much more data in each field.

The local table takes 0.05 seconds. I think I will just use that and copy across as required, but I have never had to do this before on a DB I have been developing/tinkering with for over 10 years.

PricesChanges (the DateX column is empty in every row):

PriceID | IncStartDate | IncEndDate | DateX | Toy | WHT | Cocker | Lab | Dobe | GDane | Cats | VatRate | VatRate2 | DogsOwnDisc | CatsOwnDisc | Pairing2Disc | Pairing3Disc | Pairing4Disc
2017CatchANYBefore | 04-Jan-11 | 31-Mar-18 | | 12.66 | 13.11 | 13.75 | 14.17 | 15 | 16.12 | 8.52 | 20 | 0.2 | 5 | 5 | 10 | 17.5 | 25
2018 | 01-Apr-18 | 31-Jan-19 | | 12.66 | 13.11 | 13.75 | 14.17 | 15 | 16.12 | 8.52 | 20 | 0.2 | 5 | 5 | 10 | 17.5 | 25
2019 | 01-Feb-19 | 31-Jan-20 | | 15.83 | 16.39 | 17.19 | 17.72 | 18.75 | 20.15 | 8.78 | 20 | 0.2 | 2.5 | 2.5 | 25.7 | 32 | 38
2019PreBookedPrices | 01-Feb-19 | 31-Jan-20 | | 13.04 | 13.5 | 14.17 | 14.6 | 15.45 | 16.6 | 8.78 | 20 | 0.2 | 2.5 | 2.5 | 10 | 17.5 | 25
2020 | 01-Feb-20 | 31-Jan-21 | | 16.3 | 16.88 | 17.71 | 18.25 | 19.31 | 20.75 | 9.04 | 20 | 0.2 | 2.5 | 2.5 | 25 | 32 | 38
2021 | 01-Feb-21 | 31-Jan-22 | | 16.79 | 17.39 | 18.24 | 18.8 | 19.89 | 21.37 | 9.31 | 20 | 0.2 | 2.5 | 2.5 | 25 | 32 | 38
 

Minty

AWF VIP
Local time
Today, 20:07
Joined
Jul 26, 2013
Messages
10,355
As that type of data never changes, or changes very seldom, I would make a local copy of your lookup table in the FE.
You can automatically update it when the DB opens; a sketch follows below.

Be warned, though, that joining a local table to a remote one in a query can really slow things down.
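
A minimal sketch of that refresh, assuming the BE table stays linked as PricesChanges and the local copy is a hypothetical table named PricesChangesLocal with the same structure:

Public Sub RefreshLocalRates()
    ' Empty the local copy, then repopulate it from the linked BE table.
    ' Call from a startup form or AutoExec so it runs when the DB opens.
    With CurrentDb
        .Execute "DELETE FROM PricesChangesLocal", dbFailOnError
        .Execute "INSERT INTO PricesChangesLocal SELECT * FROM PricesChanges", dbFailOnError
    End With
End Sub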
 

The_Doc_Man

Immoderate Moderator
Staff member
Local time
Today, 15:07
Joined
Feb 28, 2001
Messages
27,001
A narrow answer regarding your timing question:

The first question is whether the field on which you did the original DLookup was indexed. If so, then in neither case did you read the whole table: you read the index first, then went straight for the desired result. That is one way your times could be so close despite the disparity in table size.
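
If those fields are not indexed, indexing them may help. A sketch of adding one with DDL, assuming the same hypothetical BE path as above (the index name is arbitrary; the BE must be opened directly, since Jet DDL cannot run against a linked table):

Public Sub IndexRateDates()
    ' One-off: composite index on the range fields used in the DLookup criterion
    Dim dbBE As DAO.Database
    Set dbBE = DBEngine.OpenDatabase("\\Server\Share\Backend.accdb") ' hypothetical path
    dbBE.Execute "CREATE INDEX idxRateDates ON PricesChanges (IncStartDate, IncEndDate)", dbFailOnError
    dbBE.Close
End Sub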

The second question is what kind of BE you have: native Access, or something residing on an active SQL product like SQL Server, MySQL, Oracle, etc.? I believe I know the answer from context, but I didn't see where you told us that information.

So far as I know, Access does not cache table information because of its use of the SMBv1 protocol, which was in vogue when Access was first designed. SMB (Server Message Block) is the Windows file-sharing and printer-sharing protocol, and v1 of that protocol did not allow caching to occur. The fact that SMBv2 and v3 allow it does not matter, because Access was designed at a time when such caching was not allowed. In fact, sometimes you have to turn off the caching feature to prevent certain types of table corruption.
 

arnelgp

..forever waiting... waiting for jellybean!
Local time
Tomorrow, 04:07
Joined
May 7, 2009
Messages
19,169
You have a misconception about "persistent connection".
 

Pat Hartman

Super Moderator
Staff member
Local time
Today, 16:07
Joined
Feb 19, 2002
Messages
42,973
DLookup() and other domain functions should NEVER be used in a query or a VBA code loop. Each execution of the function initiates a separate query, so if your loop is processing 1,000 rows, it is running 1,000 queries. Eventually the overhead will kill you.

For DLookup(), use a join; an example follows below. For other domain functions, you might need to create a totals query and join to the totals query.
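
For example, the range lookup in this thread could be resolved for every row in one query rather than one DLookup per row. A sketch, assuming a hypothetical Bookings table with a BookingDate field (Access accepts a range join written as a WHERE clause):

Public Sub ListVatRates()
    ' One query resolves the rate for every booking row
    Dim rs As DAO.Recordset
    Set rs = CurrentDb.OpenRecordset( _
        "SELECT b.BookingDate, p.VatRate " & _
        "FROM Bookings AS b, PricesChanges AS p " & _
        "WHERE b.BookingDate BETWEEN p.IncStartDate AND p.IncEndDate")
    Do Until rs.EOF
        Debug.Print rs!BookingDate, rs!VatRate
        rs.MoveNext
    Loop
    rs.Close
End Sub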

I still remember one of my first Access apps from 20 years ago. I wrote it using DLookup() because that is what the samples I saw used. The loop took over two hours to run on about 190,000 records; there were actually 3 DLookup()s in the process. It was conversion code, so I was converting code values from the old set of codes to a new set for the new application. After changing the three DLookup()s to left joins, the time was reduced to less than 3 minutes.
 

Isaac

Lifelong Learner
Local time
Today, 13:07
Joined
Mar 14, 2017
Messages
8,738
Stepping back a little,

If you are really talking about a mere FIVE sets of numbers or factors (or whatever) that either never or rarely change, and if you are looking them up frequently, then maybe don't even use a table-driven method. I'm all for doing that from an app-dev, scalability, and manageability perspective, but there are always exceptions to rules.

Having a function in VBA that simply executes conditional logic on 5 sets of numbers might be better than looking them up thousands of times from a table ... VBA is faster at performing math than databases are at retrieving data :)
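
A sketch of that idea, hard-coding the date ranges from the table posted above (the Lab column is used for illustration, and the 2019PreBookedPrices row is ignored for simplicity):

Public Function LabRateFor(DateX As Date) As Double
    ' Pure VBA: no BE access at all. Date literals are US-format month/day/year.
    Select Case DateX
        Case #1/4/2011# To #3/31/2018#: LabRateFor = 14.17
        Case #4/1/2018# To #1/31/2019#: LabRateFor = 14.17
        Case #2/1/2019# To #1/31/2020#: LabRateFor = 17.72
        Case #2/1/2020# To #1/31/2021#: LabRateFor = 18.25
        Case #2/1/2021# To #1/31/2022#: LabRateFor = 18.8
        Case Else
            Err.Raise vbObjectError + 513, , "No rate defined for " & DateX
    End Select
End Function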
 
