Run-Time Error 2465

poohbear2012 (Registered User, joined Jan 8, 2012)
Hi

I am a self-taught beginner at VBA and I am encountering run-time error '2465' on a few of my forms, with the error occurring on the same line each time (flagged in the code below). I'm not sure how to correct it, as the reference to the table and field is correct.

Private Sub Form_Open(Cancel As Integer)
    Dim db As Database
    Dim rsDemography As Recordset
    Dim customercount As String

    Set db = CurrentDb
    Set rsDemography = db.OpenRecordset("SELECT * FROM tblDemography ORDER BY [customerNumber]", dbOpenDynaset)
    If rsDemography.RecordCount = 0 Then
        customercount = 1001
    Else
        customercount = DMax("[tbldemography].[customerNumber]", "tbldemography") + 1
    End If
    Me.[tblcustomerdetails].[CustomerNumber] = customercount   ' <-- error 2465 occurs here
End Sub

Many thanks
Trisha :confused:
 
Hi,

Unfortunately you can't address a table and its field through the Expression Service the way we can a form or report and their controls (how would that syntax know which record to update?).
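If the form's record source already includes tblCustomerDetails, the usual fix is to assign to the form's own field or control rather than to the table reference. A minimal sketch, assuming the form's record source (or a control on it) exposes CustomerNumber:

Code:
' Assumes the form is bound to tblCustomerDetails and its record
' source (or a control) contains a field named CustomerNumber:
Me!CustomerNumber = customercount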

If you want to update a particular record in a table, use an UPDATE query:

Code:
CurrentDb.Execute "UPDATE TableName SET FieldName = ValueToSet WHERE YourConditions", dbFailOnError

hth
 
Only thing I would add to that is that I'm a great believer in NOT embedding SQL that talks directly to tables in your application. It creates a dependence on your table structure within the application, making the structure harder to change later without also having to recode chunks of the application.

Your db.OpenRecordset() statement could just as easily open a saved query, rather than executing a SELECT * that is vague as to its intention and purpose, compared to

Code:
db.OpenRecordset("qrySELECT_Demography_OrderedbyCustomerID")
or a similar meaningful name.

Likewise, a parameterised query for the UPDATE can also simplify things

Code:
Dim db As DAO.Database
Dim qdf As DAO.QueryDef

Set db = CurrentDb

Set qdf = db.QueryDefs("qryUPDATE_CustomerDetails_CustomerNumber_byCustomerID")
qdf.Parameters("@Customer_number") = intCustomerCount
qdf.Parameters("@customerID") = lngCustomerID
qdf.Execute dbFailOnError
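The saved query behind that call would be an Access parameter query. A minimal sketch of its SQL, with the table, field and parameter names assumed from the example above:

Code:
PARAMETERS [@Customer_number] Long, [@customerID] Long;
UPDATE tblCustomerDetails
SET    [CustomerNumber] = [@Customer_number]
WHERE  [CustomerID] = [@customerID];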

Your tables are now separate from your application. If the structure changes, you need only update the relevant queries before having to consider another round of coding. You also don't need to spend a huge amount of time deciphering what a big block of SQL is doing in the middle of your code.

You can also easily see, by looking at the queries in Access, where you have dependencies that need to be maintained, without having to search your entire project for instances of SELECT, INSERT, UPDATE and DELETE.

It does require a bit of extra time when first implementing, and for a single-form, couple-of-tables database I might be inclined not to bother, but little Access databases that will only be around for a couple of weeks or months have a nasty habit of evolving, growing and needing more development down the line.
 
OT: forgive me for busting in - do either of you two "newbies" have an opinion on the subject of performance of saved vs. "buried-in-code" queries as discussed here? http://www.access-programmers.co.uk/forums/showthread.php?t=219528 I am curious, because that entire discussion ended a little ambiguously.

Aside from testing performance on each individual query you're running, I'm not sure you're going to get a straight answer on that. Everything stated in the other thread seems on target to me (for whatever that's worth), but I also think there are times when dynamic queries (those created in VBA) are fitting... for example, with VBA you can easily adjust things like the table names or fields being brought in. It doesn't always make sense to do so... just some of the time.

Also note that there's a big difference between a dynamic SQL statement in VBA and a Recordset. You'd generally use a recordset to perform some operation on a set of records... whenever you can do that with a single SQL statement instead (be it a VBA-executed SQL string or a saved querydef), the SQL statement will pretty much always be faster. Recordset loops are old school... in most cases SQL can do it a lot faster.

Consider the two:

Code:
Dim rs As DAO.Recordset

Set rs = CurrentDb.OpenRecordset("SELECT * FROM Table WHERE condition")
While Not rs.EOF
    rs.Edit
    rs.Fields(0) = SomeValue
    rs.Update
    rs.MoveNext
Wend
rs.Close: Set rs = Nothing
vs.
Code:
CurrentDb.Execute "UPDATE Table SET FieldName = SomeValue WHERE condition"

Both do the same thing, but the second version of that is much, much faster.

But then, take a look at these two:

Code:
A few saved querydefs to delete records from temp tables
vs.
Code:
CurrentDb.Execute "DELETE * FROM ThisTempTable"

Is it really worth it to save a separate querydef for each temp table you want to clear? IMO, no, but as tehNellie points out, consideration towards the former could be given based on long-term maintenance.
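If the temp tables really are interchangeable, a middle ground is one small helper that takes the table name. A hypothetical sketch (the procedure name is made up):

Code:
' Hypothetical helper: clears whichever temp table you name,
' instead of saving one DELETE querydef per table.
Public Sub ClearTempTable(ByVal strTable As String)
    CurrentDb.Execute "DELETE * FROM [" & strTable & "]", dbFailOnError
End Sub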

Cheers,
 
Hmm -- I don't think anyone is in doubt that muddling through the records of a recordset is slow compared to an optimised SQL query. The point was more the optimisation of the queries themselves: a saved query carries an execution plan with it, and the question left hanging in the air was whether code-SQL does the same; if so, the execution plan is presumably made on the fly, at a price. Or not.
 

I'm not quite in tune enough with the DAO engine to give any real advice on that. I've heard some say, yes, some say no, and both sides come from people who know far more of this than I do.

Myself, if I think a SQL statement is going to have loads of records, tables and relationships to crunch, I'll dedicate a querydef to it. If I've got a little handful of records in a single table to process from some VBA procedure, I'll run it straight from a string instead - there's not going to be enough of a performance difference to matter, and it keeps the object count down and the navigation pane from being cluttered.

Cheers,
 
That thread seems to be directed more at SQL Server, so from that point of view, calling a Stored Procedure or a View absolutely has performance benefits compared to firing dynamic SQL at it. It's difficult to quantify as an x% performance benefit because so much depends on what the query does, indexes, number of joins, rows, etc., and a rubbish query is a rubbish query whether it's dynamic SQL or a stored procedure.

If your dynamic SQL is being fired at the server over and over again, then the execution plan will be cached. Newbies performance-tuning queries on SQL Server fall for this all the time: they write a query, run it, tweak something, see that it just executed three times as fast, and declare themselves uber-leet SQL authors, without realising that the server simply cached the execution plan and used it again.

As an ex-DBA, dynamic SQL is the work of Satan himself. It's uncontrolled, and it's insecure on so many levels that you could write a novel on it (if you fancy a laugh, google SQL injection or "Bobby Tables"). I can't do anything about the performance of SQL I don't have control over, but I know it'll be me who gets it in the neck with "your database is slow".
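For anyone who hasn't met it, the classic injection shape is easy to sketch with concatenated SQL in VBA (the table, field and control names here are made up for illustration):

Code:
' Illustrative only: if the user types   x" OR "1"="1   into the box,
' the concatenated WHERE clause suddenly matches every row:
strSQL = "DELETE FROM tblStuff WHERE [Name] = " & Chr(34) & Me.txtName.Value & Chr(34)
' ...which becomes: DELETE FROM tblStuff WHERE [Name] = "x" OR "1"="1"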

Access is a different kettle of fish, though. Purely from a performance point of view, there's no execution plan to tune a query against to begin with, so I'd expect any performance benefit to be fairly minimal.

The complaints about dynamic SQL are still valid, but only in the context of what Access is and how it's being used, so I'm far less concerned about SQL injection in my Access databases, for example.

In my new life as a developer I hate coming across huge chunks of SQL buried in code. Jack's examples are fine to some extent, but once you end up doing

Code:
"... WHERE [field] = " _
    & Chr(34) & "something" & Me.txtbox.Value & "_somethingelse" & Chr(34) & ";"

or even worse, where people don't use Chr(34) so you end up with & """""stuff""""", then it's very hard to read compared to

Code:
Set qdf = db.QueryDefs("qryDELETE_Stuff_byaMeaningfulName")
qdf.Parameters("@MeaningfulName") = whatever
qdf.Execute dbFailOnError

There's no second-guessing what that is doing. If you really want to see what the SQL does, open the query. From my point of view, I'm either coding or I'm writing SQL; rarely both at the same time if I can help it.

I'm probably overly cautious from my SQL Server days, but I firmly believe that if you care about the data in a table, don't let anything touch the table except via [parameterised] queries. The application is not your friend; it is the friend of the user, and users want to bugger up your data and make your life harder, so give it as little access to your tables as possible :D.
 
Hrmpf... did you actually read that thread? It's very definitely pointed at Access. And Microsoft recommend refreshing the queries (by opening and saving the query again, or by Compact/Repair) if the number of records changes significantly (see the quote there), because a saved Access query apparently does have an execution plan. The question was what happens to a code-SQL in Access. If you don't know, that's fair enough :-)
 
I did read it, but I have to be honest I was trying to have a conversation and go home for the last half a page or so.

I can't see it not having an execution plan of some form or another, but I don't know for sure what's going on behind the scenes in Access. Anecdotally, I don't think it makes a massive amount of difference, based on what I've seen during testing when running SQL from code versus executing a saved query. Regardless of any query performance improvement, running a compact and repair on close is not a bad idea, IMO, once an Access database hits a couple of hundred meg or so.

My impression is that, given a choice between executing a query rapidly and not bogging down the entire workstation, the latter gets priority in any case. I've certainly sat and watched long-running queries do their thing while the idle process clocks 90% of the CPU. The problem is that Access provides precious little insight into its inner machinations for you to figure out what it's up to.
 
