I have split a database: the front-end on each user's PC and the back-end on a network drive. Like many others, I have seen performance suffer. I have researched the issue on various websites and found that heavy use of domain aggregate functions (DLookup, DCount, DMax, etc.) slows everything down.
I'd like to know if my alternative approach is better and more widely accepted. For example, I have a form that aggregates data for reference. It has 32 controls whose control sources are DMax/DCount/DLookup calls. I have replaced those by opening DAO recordsets on SELECT queries in VBA and then setting the control sources to something like the line below:
Code:
Me.Control1.ControlSource = "=" & RST.Fields(0)
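For context, here is a fuller sketch of that pattern. The table and field names (tblOrders, OrderID, OrderTotal) and the control names are hypothetical; the point is that one SELECT with several aggregates makes a single round trip to the back-end, instead of one round trip per domain function call:

Code:
' Hypothetical example -- replace table, field, and control names with your own.
Private Sub Form_Load()
    Dim db As DAO.Database
    Dim RST As DAO.Recordset
    Dim strSQL As String

    ' One query returns all three aggregates at once.
    strSQL = "SELECT Max(OrderID) AS MaxID, " & _
             "Count(*) AS OrderCount, " & _
             "Sum(OrderTotal) AS GrandTotal " & _
             "FROM tblOrders;"

    Set db = CurrentDb
    Set RST = db.OpenRecordset(strSQL, dbOpenSnapshot)

    ' Assign literal values so the controls perform no further lookups.
    ' (Numeric values only -- text or dates would need quote/# delimiters.)
    Me.Control1.ControlSource = "=" & RST!MaxID
    Me.Control2.ControlSource = "=" & RST!OrderCount
    Me.Control3.ControlSource = "=" & RST!GrandTotal

    RST.Close
    Set RST = Nothing
    Set db = Nothing
End Sub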
The form loads a little quicker, but I am wondering whether this is the sort of thing people do from the get-go when designing a split database.