How to handle a large data cleaning task?

sync

I need to create a program that will regularly import a text file of over one million records into an Access table. I've been given a list of about fifty different updates to perform on the data to clean it.

I can't imagine performing all these updates in one query. However, creating fifty individual queries seems horribly inefficient from a processing perspective.

I'm accustomed to stepping sequentially through a table in FoxPro, which seems ideal to me for this type of situation. What is the best way to handle this in Access?
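
For reference, the import step itself can be scripted with TransferText; a rough sketch, where the saved import spec, table name, and file path are all placeholder names:

' Pull the delimited file into a staging table before cleanup.
' "ImportSpec", "tblStaging", and the path are placeholder names.
DoCmd.TransferText acImportDelim, "ImportSpec", "tblStaging", "C:\Data\records.txt", True

The final True tells Access the first row of the file holds field names.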
 
Can you do it with DAO or ADO? I think this is the same as the 'stepping sequentially through a table' you mentioned...
 
Are there any good books for using ADO/DAO with Access? I have a few Access books and they barely mention it, if at all.
 
Set a reference to DAO (Tools > References > Microsoft DAO 3.6 Object Library), then something like:

Dim dbs As DAO.Database
Dim rst As DAO.Recordset

Set dbs = CurrentDb
' OpenRecordset (not OpenDatabase) returns the rows to step through
Set rst = dbs.OpenRecordset("SELECT * FROM YourTable")

If Not rst.EOF Then
    rst.MoveFirst
    Do
        rst.Edit
        'Do cleanup stuff here
        rst!FieldName = "Whatever should go in this field, etc."
        rst.Update
        rst.MoveNext
    Loop Until rst.EOF
End If

rst.Close
Set rst = Nothing
Set dbs = Nothing
 
Depending on the edits, it may be more efficient to do them with update queries, even if you have to run several. Stepping row by row through a table of this size could well be slower, even though it feels like it should be more efficient. Do some timings to see which wins.
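
For example, something like this runs each cleanup rule as one bulk UPDATE (the table name and rule SQL below are made up for illustration):

Dim dbs As DAO.Database
Set dbs = CurrentDb

' Each cleanup rule becomes a single set-based UPDATE; dbFailOnError
' raises an error and rolls the statement back if any row fails.
dbs.Execute "UPDATE tblStaging SET State = 'NY' WHERE State = 'N.Y.'", dbFailOnError
dbs.Execute "UPDATE tblStaging SET Phone = Null WHERE Phone = '0000000000'", dbFailOnError
Debug.Print dbs.RecordsAffected & " rows changed by the last update"

Set dbs = Nothing

Fifty Execute calls like these still let Jet handle each rule as a set operation instead of touching a million rows one at a time.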
 
