Can "batch" lookup to Table be faster than to Array?

patwwh

Hi!

I am coding a mini-translator in Access that works across several languages.
I am thinking about the best container for the data table.
A multi-dimensional array is of course the standard programming choice.
But I am considering putting the data into a Table instead, so that I can share the same data set between VBA (via DLookup and other handy standard tools) and Queries.

The problem and question is this: reading a physical table must be much slower than reading an array. But when I use DLookup to read the same physical table repeatedly within a short period of time (a batch process), will Access be smart enough to optimize the process by keeping the table in memory? If not, how can I make it do so?

N.B. I am not using my beloved Dictionary object as the container, since it cannot show all the languages (more than two) side by side at the same time, which makes data maintenance less intuitive.
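
One way to get both the easy maintenance of a table and the speed of an in-memory lookup is to open the table once into a static DAO recordset and reuse it for the whole batch. Jet does cache data pages, but each DLookup call still sets up and runs its own query, so thousands of DLookups in a loop carry a lot of per-call overhead. Below is a minimal sketch of the cached-recordset idea, not tested against your data: the table name "tblDictionary", the assumption that "Word" is the primary key, and the language field names are all made up for illustration.

Public Function TranslateWord(ByVal strWord As String, _
                              ByVal strLangField As String) As Variant
    ' Keep the recordset alive between calls so the table is opened only once.
    Static rst As DAO.Recordset

    If rst Is Nothing Then
        Set rst = CurrentDb.OpenRecordset("tblDictionary", dbOpenTable)
        rst.Index = "PrimaryKey"          ' assumes Word is the primary key
    End If

    rst.Seek "=", strWord                 ' indexed lookup on the cached recordset
    If rst.NoMatch Then
        TranslateWord = Null              ' word not found
    Else
        TranslateWord = rst.Fields(strLangField).Value
    End If
End Function

Called in a loop, e.g. TranslateWord("hello", "FR"), this avoids the per-call overhead of DLookup while the data still lives in an ordinary, easy-to-maintain table.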
 
imho, put it in a table, much easier to maintain. You can use the ELookup function (google it).
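
For what it's worth, a plain DLookup against such a table would look like the sketch below; ELookup (Allen Browne's extended replacement) is called with similar arguments. The table and field names ("tblDictionary", "Word", "FR") are invented for the example.

Public Function LookupFR(ByVal strWord As String) As Variant
    ' Returns Null if the word is not in the table.
    LookupFR = DLookup("[FR]", "tblDictionary", _
                       "[Word] = '" & Replace(strWord, "'", "''") & "'")
End Function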
 
Thanks arnelgp. But ELookup is optimized for repeatedly looking up a (dictionary) word in each Context's string, by loading each Context's record into memory.

My need in the case of the "Translator" is the opposite: I have to look up each Content's word in a dictionary list, and it is that list which should be loaded into memory. Although I can use some technique with static variables to load the whole dictionary list into memory (i.e. turn it into an array), I cannot apply handy tools like DLookup, VLookup etc. to it. I guess the most elegant and best-performing solution is to build a "Virtual Table". Any ideas or opinions?
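
If by "Virtual Table" you mean something you can still search like a table but that lives entirely in memory, one option is a disconnected ADODB recordset that you fill once from the real table and then query with Find or Filter. The sketch below is only an illustration of that idea; it assumes a reference to Microsoft ActiveX Data Objects, and the table/field names ("tblDictionary", "Word", "FR") are made up.

Public Function LoadVirtualDictionary() As ADODB.Recordset
    Dim rstMem As New ADODB.Recordset
    Dim rsSrc As DAO.Recordset

    ' Define the in-memory structure; no database connection is kept open.
    rstMem.CursorLocation = adUseClient
    rstMem.Fields.Append "Word", adVarWChar, 100
    rstMem.Fields.Append "FR", adVarWChar, 255
    rstMem.Open

    ' Copy the physical table into the in-memory recordset once.
    Set rsSrc = CurrentDb.OpenRecordset("SELECT Word, FR FROM tblDictionary")
    Do Until rsSrc.EOF
        rstMem.AddNew
        rstMem!Word = rsSrc!Word
        rstMem!FR = rsSrc!FR
        rstMem.Update
        rsSrc.MoveNext
    Loop
    rsSrc.Close

    Set LoadVirtualDictionary = rstMem
End Function

A lookup is then rstMem.MoveFirst followed by rstMem.Find "Word = 'hello'". The trade-off is that, unlike the real table, this in-memory copy is not visible to your queries, so it only helps on the VBA side.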
 
