Brent,
Yes, I was referring to a one-time thing where you drag data out of some textfile "report".
For a large input set, are we talking about the difference in Windows "virtual memory usage"
between defining a large array and the overhead of iterative inserts? That is, with the "array"
approach?
In the past, if the data had "ugly" things like single/double quotes, that was the determining
factor in whether to use a recordset (no punctuation problems) or a simple CurrentDb.Execute ...
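For what it's worth, here's a minimal sketch of why the punctuation issue pushes you toward a recordset (the table and field names are made up for illustration):

```vba
' String-built SQL breaks on embedded apostrophes unless you double them:
Dim strName As String
strName = "O'Brien"
CurrentDb.Execute "INSERT INTO tblNames (FullName) VALUES ('" & _
    Replace(strName, "'", "''") & "')", dbFailOnError

' A DAO recordset sidesteps the quoting problem entirely:
Dim rs As DAO.Recordset
Set rs = CurrentDb.OpenRecordset("tblNames", dbOpenDynaset)
rs.AddNew
rs!FullName = strName   ' no escaping needed
rs.Update
rs.Close
```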
Yes, they were row-by-row inserts and I swore that the ADO method caused more bloat, but no ...
As a one-time thing, regardless of "bloat", just get the data in and clean up afterwards.
Maybe it would be better to "refine" the input file and use an import spec, but there are
also parent/child things to consider.
I dunno, sometimes you just gotta get the data in.
Even in SQL Server (luv it too, Leigh), I'd probably take the ADO approach.
Anyway, tasks like this don't come up too often, and either method works. It's nice to
see that we have like 8 Pats here nowadays!
btw, I've never used the "Get" and "Put" array operations. Probably wouldn't use them
unless I wanted to read a LookUp table. Any comments as to why that would be applicable?
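In case it helps the discussion, here's a sketch of what I understand Get/Put to do with arrays (the file path is made up): VBA's binary Put and Get can write and read a whole array in a single statement, which is roughly why they might suit a static lookup table.

```vba
' Sketch of Put/Get with an entire array in one shot:
Dim arrIn(1 To 100) As Long, arrOut(1 To 100) As Long
Dim f As Integer, i As Integer
For i = 1 To 100: arrIn(i) = i * i: Next

f = FreeFile
Open "C:\Temp\lookup.dat" For Binary As #f
Put #f, 1, arrIn          ' writes the whole array at once
Close #f

f = FreeFile
Open "C:\Temp\lookup.dat" For Binary As #f
Get #f, 1, arrOut         ' reads it all back into the array
Close #f
```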
Thanks,
Wayne