ADO vs SQL Insert

Bigmo2u

Registered User.
Local time
Today, 00:11
Joined
Nov 29, 2005
Messages
200
I am trying to educate myself on Access and become a better designer. I have yet to find anything in the books I have bought as to which process is better or when to use ADO and when to use SQL Insert. If someone could please shed some light on this subject I would greatly appreciate it.
 

Pat Hartman

Super Moderator
Staff member
Local time
Today, 01:11
Joined
Feb 19, 2002
Messages
43,445
I use an append query when I need to add multiple records to a table and they are coming from some other table.

I use DAO or ADO when adding a single record via code.

My choice is based on the fact that in almost all cases I use stored querydefs because they are more efficient than dynamic SQL. So, unless I actually need a dynamic query, I don't use embedded SQL.
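
For example, here's a stripped-down sketch of running a saved, parameterised append query through DAO instead of concatenating a SQL string (the querydef, parameter, table and field names are just placeholders for the example):

Code:
' Assumes a saved append query "qappAddOrder" with parameters
' [prmCustID] and [prmOrderDate] already exists in the front end.
Dim db As DAO.Database
Dim qdf As DAO.QueryDef

Set db = CurrentDb()
Set qdf = db.QueryDefs("qappAddOrder")   ' compiled, saved querydef
qdf.Parameters("prmCustID") = 123
qdf.Parameters("prmOrderDate") = Date
qdf.Execute dbFailOnError                ' no SQL text is built at run time
Set qdf = Nothing

' The dynamic alternative builds (and compiles) the SQL each time:
' db.Execute "INSERT INTO tblOrders (CustID, OrderDate) " & _
'            "VALUES (123, Date())", dbFailOnError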
 

WayneRyan

AWF VIP
Local time
Today, 06:11
Joined
Nov 19, 2002
Messages
7,122
Pat,

For large amounts of "external" inserts, doesn't:

CurrentDb.Execute "Insert Into ..."

cause less "bloating" than

rst.AddNew ...

?
Wayne
 

Pat Hartman

Super Moderator
Staff member
Local time
Today, 01:11
Joined
Feb 19, 2002
Messages
43,445
The .AddNew's are used for one-off inserts where, if I used an append query, I would need to create the SQL dynamically. Creating dynamic SQL is what causes the bloat.

The .AddNew's either reference the table directly or reference a saved querydef.

The bloat caused by the rows themselves would be the same regardless of how the rows were added.
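
As a rough sketch (table and field names invented for the example), that kind of one-off insert looks like this:

Code:
Dim db As DAO.Database
Dim rst As DAO.Recordset

Set db = CurrentDb()
' Open the table (or a saved querydef) directly - no SQL string is built.
Set rst = db.OpenRecordset("tblStatus", dbOpenDynaset, dbAppendOnly)
rst.AddNew
rst!TransactionID = 456
rst!StatusCode = "NEW"
rst!StatusDate = Now()
rst.Update
rst.Close
Set rst = Nothing
Set db = Nothing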
 

WayneRyan

AWF VIP
Local time
Today, 06:11
Joined
Nov 19, 2002
Messages
7,122
Thanks for the answer, Pat.

Not entirely convinced ... might have to try a sample when I get some free time.

Wayne
 

WayneRyan

AWF VIP
Local time
Today, 06:11
Joined
Nov 19, 2002
Messages
7,122
Pat,

Allocated some time to play with this. I inserted 100,000 records using a recordset (.AddNew) and CurrentDb.Execute "Insert Into ...".

Surprisingly, there was no noticeable "bloat" difference, but the .AddNew performed about 20 times faster!
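
For anyone who wants to repeat it, the comparison was essentially along these lines (simplified here, with a placeholder table and field):

Code:
Dim db As DAO.Database
Dim rst As DAO.Recordset
Dim i As Long

Set db = CurrentDb()

' Method 1: one recordset, 100,000 row-by-row .AddNew calls
Set rst = db.OpenRecordset("tblTest", dbOpenDynaset, dbAppendOnly)
For i = 1 To 100000
    rst.AddNew
    rst!SomeValue = i
    rst.Update
Next i
rst.Close

' Method 2: 100,000 separate dynamic INSERT statements
For i = 1 To 100000
    db.Execute "INSERT INTO tblTest (SomeValue) VALUES (" & i & ")", dbFailOnError
Next i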

Wayne
 

tokoloshi

Registered User.
Local time
Today, 07:11
Joined
Jul 30, 2008
Messages
63
ADO, OLE DB, DAO

The major difference between ADO and OLE DB (SQL Insert/Update/Delete) is client vs. server processing and security.

The first thing to remember is that SQL is broken into three distinct categories of language (DDL - Data Definition Language, DCL - Data Control Language, and DML - Data Manipulation Language). The Insert/Delete/Update commands fall into the DML category.

If the back end is SQL Server or any other enterprise-class RDBMS, then the preferred method for all DML statements should be via stored procs or SQL functions.

There are many reasons to do this; some of the most important ones I have found are:

  1. You gain noticeable performance by getting the server to handle the manipulation right up against the database, where the processing is happening, instead of trying to manipulate the data from the client side.
  2. SQL Server gives you the opportunity to use "triggers". These are events very similar to "After Update" on a form, but they give you control at the database level to suspend operation from the application and perform auditing and other business-object-driven logic on the data before passing control back to the app.
  3. There are many data functions that you may wish to perform that do not require any UI, such as frequently shipping denormalised data into a separate set of reporting tables. This can be done at the back end using something like SQL Server Integration Services and can be scheduled to run at specific times of the day using the SQL Server Agent. All of the SQL commands run then are going to use OLE DB and not ADO.

I am certain that there are several other contributors who will be able to add to this list with ease, but these are the top 3 in my book for when to look at ADO vs OLE DB/SQL stored procs, etc.

When I am importing or working with bulk data, I actually rely more on the bulk tools provided by SQL Server than on Access. If there is a need to give the user UI tools for some of the bulk data manipulation, then there are clear performance gains in using a standard SQL script and running it via an SQLCMD batch file.
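
For example, such a script can be kicked off from the Access side with something along these lines (the server, database and script path are placeholders, and it assumes sqlcmd is installed and on the PATH):

Code:
Dim strCmd As String

' -S = server, -d = database, -E = Windows authentication, -i = input script
strCmd = "sqlcmd -S MYSERVER -d MyDatabase -E -i ""C:\Scripts\BulkLoad.sql"""
Shell "cmd.exe /c " & strCmd, vbHide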

I find that the big issue is whether or not the overhead of the UI needs to be considered.

... or have I missed the point ???

T
 

Pat Hartman

Super Moderator
Staff member
Local time
Today, 01:11
Joined
Feb 19, 2002
Messages
43,445
Hi Wayne,
Bloat is associated with embedded SQL. I don't use embedded SQL unless I need it to be dynamic. So for any given application, I may have one or two situations where I need to build a query on the fly to accommodate complex criteria, but that's it. In every other case, I would use a querydef with arguments.

Also, I think you missed the part of my explanation where I said I would only use .AddNew to add a single record. I would not use that method inside a code loop unless there was no way to accomplish the task with an action query. In my current application, I actually have to add a status record when the form adds a new transaction, or update the status record if the transaction's status changes, so I use the .AddNew/.Update methods for that. I also have to add an activity record each time a status changes so that we can track status changes over time, and I use .AddNew for that.
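
Stripped right down (and with made-up table and control names), that edit-or-add pattern in the form's code looks something like this:

Code:
Dim rst As DAO.Recordset

Set rst = CurrentDb.OpenRecordset("tblTransactionStatus", dbOpenDynaset)
rst.FindFirst "TransactionID = " & Me!TransactionID
If rst.NoMatch Then
    rst.AddNew                ' new transaction: create its status record
Else
    rst.Edit                  ' status changed: update the existing record
End If
rst!TransactionID = Me!TransactionID
rst!StatusCode = Me!cboStatus
rst!StatusDate = Now()
rst.Update
rst.Close
Set rst = Nothing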
 

LPurvis

AWF VIP
Local time
Today, 06:11
Joined
Jun 16, 2008
Messages
1,269
I'd probably just touch on a point (as nobody else has yet) that your question is rather ambiguous.
By that I mean "ADO vs SQL Insert" doesn't necessarily refer to two different concepts. Certainly you probably mean (and it's safe to say it has been assumed as much) "ADO Recordset vs SQL Insert" - but there's no way to be sure of this.
The point I mean to raise is that using ADO to execute the SQL Insert is a perfectly valid (and again - potentially more efficient) method of processing.

For example - if you've established an ADO connection to a server database (say SQL Server - cos I love it and don't know any other server RDBMS lol) and then append a record via a recordset, this is potentially a more direct, less bloating process than executing a dynamic SQL statement via a front end with linked tables.
But there's nothing to stop you using that same ADO connection to execute a (likely very similar) SQL statement on the server - or better yet - a Stored Proc on that server.
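
As a quick sketch of that (assuming an already-open ADODB.Connection called cnn to the SQL Server back end, and a made-up stored procedure named uspAddOrder):

Code:
Dim cmd As ADODB.Command

Set cmd = New ADODB.Command
With cmd
    Set .ActiveConnection = cnn              ' existing ADO connection to the server
    .CommandType = adCmdStoredProc
    .CommandText = "uspAddOrder"
    .Parameters.Append .CreateParameter("@CustID", adInteger, adParamInput, , 123)
    .Parameters.Append .CreateParameter("@OrderDate", adDate, adParamInput, , Date)
    .Execute , , adExecuteNoRecords          ' the insert runs entirely on the server
End With
Set cmd = Nothing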

So while it's true that the vast majority of database professionals would lean towards SQL-oriented data update requests, the means of executing those requests doesn't rule out ADO... which is why it isn't the obvious either/or comparison the question suggests.

Cheers!
 

datAdrenaline

AWF VIP
Local time
Today, 00:11
Joined
Jun 23, 2008
Messages
697
>> Allocated some time to play with this. I inserted 100,000 records using a recordset (.AddNew) and CurrentDb.Execute "Insert Into ...".

Surprisingly, there was no noticeable "bloat" difference, but the .AddNew performed about 20 times faster! <<

So .... what is the code you used? ... did you execute the INSERT statement for EACH record? ...

By the way ... Leigh's comments should be considered closely, simply because the comparisons this thread tends to be referring to can form a huge matrix, because we can insert records (via code) with ...

DAO Recordset Object
ADODB Recordset Object
The Execute method of an ADODB connection
The Execute method of a DAO Database

And each will probably have an advantage/disadvantage in a given situation and configuration.
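
For what it's worth, a bare-bones sketch of those four flavours against a hypothetical local table (tblTest with a single field, SomeValue) would look like this:

Code:
Dim db As DAO.Database
Dim rsD As DAO.Recordset
Dim cnn As ADODB.Connection
Dim rsA As ADODB.Recordset

Set db = CurrentDb()
Set cnn = CurrentProject.Connection

' 1. DAO Recordset object
Set rsD = db.OpenRecordset("tblTest", dbOpenDynaset, dbAppendOnly)
rsD.AddNew: rsD!SomeValue = 1: rsD.Update: rsD.Close

' 2. ADODB Recordset object
Set rsA = New ADODB.Recordset
rsA.Open "tblTest", cnn, adOpenKeyset, adLockOptimistic, adCmdTableDirect
rsA.AddNew: rsA!SomeValue = 2: rsA.Update: rsA.Close

' 3. Execute method of an ADODB Connection
cnn.Execute "INSERT INTO tblTest (SomeValue) VALUES (3)", , adExecuteNoRecords

' 4. Execute method of a DAO Database
db.Execute "INSERT INTO tblTest (SomeValue) VALUES (4)", dbFailOnError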
 

WayneRyan

AWF VIP
Local time
Today, 06:11
Joined
Nov 19, 2002
Messages
7,122
Brent,

Yes, I was referring to a one-time thing where you drag data out of some text file "report".

For a large input set, are we talking about the difference between Windows virtual-memory usage when defining a large array versus the overhead of iterative inserts? That is, with the "array" approach?

In the past, if the data had "ugly" things like single/double quotes, that was the determining factor in whether to use a recordset (no punctuation problems) or a simple CurrentDb.Execute ...
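
For anyone hitting that punctuation problem, the usual workarounds are either doubling up the embedded single quotes or sidestepping concatenation altogether with a parameterised querydef (a rough sketch, with made-up names):

Code:
Dim db As DAO.Database
Dim qdf As DAO.QueryDef
Dim strName As String

Set db = CurrentDb()
strName = "O'Brien"

' Option 1: escape the embedded single quote in the dynamic SQL
db.Execute "INSERT INTO tblPeople (LastName) VALUES ('" & _
           Replace(strName, "'", "''") & "')", dbFailOnError

' Option 2: a saved parameterised query - no escaping needed at all
Set qdf = db.QueryDefs("qappAddPerson")   ' append query with a [prmLastName] parameter
qdf.Parameters("prmLastName") = strName
qdf.Execute dbFailOnError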

Yes, they were row-by-row inserts and I swore that the ADO method caused more bloat, but no ...

As a one-time thing, regardless of "bloat", just get the data in and clean up afterwards.

Maybe it would be better to "refine" the input file and use an import spec, but there are also parent/child things to consider.

I dunno, sometimes you just gotta get the data in.

Even in SQL Server (luv it too, Leigh), I'd probably take the ADO approach.

Anyway, tasks like this don't come up too often, and either method works. It's nice to see that we have like 8 Pats here nowadays!

btw, I've never used the "Get" and "Put" array operations. Probably wouldn't use them unless I wanted to read a LookUp table. Any comments as to why that would be applicable?

Thanks,
Wayne
 

datAdrenaline

AWF VIP
Local time
Today, 00:11
Joined
Jun 23, 2008
Messages
697
Maybe try something like this with your importation of text data ...

Code:
INSERT INTO tblSomeTable ( <field list> )
SELECT <field list>
FROM [Text;
      FMT=Delimited;
      HDR=YES;
      CharacterSet=437;
      DATABASE=<full path without trailing slash>].[<file name and valid 'Text' extension>]

Then there is no need for IM/EX specs ... IMO, a MUCH cleaner way to bring in records from a CSV ...
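
If it helps, that statement can be fired straight from VBA too - the path and file name below are placeholders (note the # standing in for the dot in the bracketed file name):

Code:
Dim strSQL As String

strSQL = "INSERT INTO tblSomeTable (Field1, Field2) " & _
         "SELECT Field1, Field2 " & _
         "FROM [Text;FMT=Delimited;HDR=YES;CharacterSet=437;" & _
         "DATABASE=C:\Imports].[Data#csv]"
CurrentDb.Execute strSQL, dbFailOnError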
 
