Recommendation for calling Stored Procedures with Parameters

MajP

You've got your good things, and you've got mine.
Local time
Today, 05:35
Joined
May 21, 2018
Messages
8,525
I am converting my first DB to SQL Server and teaching myself to use SQL Server as a backend. To teach myself as much SQL Server as possible, I am doing everything on the server side even though the database is relatively small. It is, however, pretty complex code, since it is a self-referencing hierarchical database with some very complex coding. So I am doing as much as I can using stored procedures and pass-through queries. For parameterized SPs, how do you recommend calling them? I built a single pass-through and call it in code to change the SQL string. Seems easy, and if I move the database then it would require just one query to update the connection. I assume the other way would be to have a settings table to store the connection information, then build the query dynamically and just change the settings table if it is moved. Any recommendations or other approaches? Thanks
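To show what I mean, roughly this (query name qryPassThru and procedure name uspGetTree are just placeholders):

```
'Sketch: one saved pass-through query ("qryPassThru") whose SQL gets
'swapped in code. Its .Connect property is then the single place to
'update if the server moves. Names here are placeholders.
Public Sub RunStoredProc(ByVal RootID As Long)
    Dim qdf As DAO.QueryDef
    Set qdf = CurrentDb.QueryDefs("qryPassThru")
    qdf.SQL = "EXEC dbo.uspGetTree " & RootID
    qdf.ReturnsRecords = True
    DoCmd.OpenQuery "qryPassThru"   'view the results like any saved query
End Sub
```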
 

MajP

You've got your good things, and you've got mine.
Local time
Today, 05:35
Joined
May 21, 2018
Messages
8,525
From several searches I see the two approaches. It does seem the recommendation is just to use a pass-through. Any reason to use the ADO recordset method instead?
 

Isaac

Lifelong Learner
Local time
Today, 02:35
Joined
Mar 14, 2017
Messages
8,777
For parameterized SPs, how do you recommend calling them?

I often use VBA to build up the literal SQL string (as you are probably very used to doing), and end up with a saved query object whose underlying .SQL property has been manipulated to:

exec [dbo].[procedurename] 'value1', 'value2', 'value3'

That covers the case where all you want to do is execute the SP. The SP can of course end with a SELECT statement, which then lets you get records from that saved query object as usual, provided its ReturnsRecords property is set to True, and subject to other gotchas when using SPs to return records.
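One gotcha when building the literal string: single quotes inside your values. A tiny helper along these lines (the function name is made up) doubles them so the EXEC line stays valid:

```
'Sketch: wrap a value in quotes, escaping embedded single quotes
Public Function SqlQuote(ByVal v As String) As String
    SqlQuote = "'" & Replace(v, "'", "''") & "'"
End Function

'Usage when building the line shown above:
'strSQL = "exec [dbo].[procedurename] " & SqlQuote(val1) & ", " & SqlQuote(val2)
```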
 

NauticalGent

Ignore List Poster Boy
Local time
Today, 05:35
Joined
Apr 27, 2015
Messages
6,321
I built a single pass-through and call it in code to change the SQL string. Seems easy, and if I move the database then it would require just one query to update the connection.
With my less than 2 years experience with SQL as a backend, this is the method that worked best for me. In my case, I had to build the string anyway and changing the SQL property for a single PT query made the most sense - only ONE query name to remember.
 

NauticalGent

Ignore List Poster Boy
Local time
Today, 05:35
Joined
Apr 27, 2015
Messages
6,321
  • Editable recordsets,
  • easier parameter handling,
  • and output parameters and return values.
That bit of knowledge would have come in handy. But then I have avoided ADO like the plague. I was already intimidated using SQL and stored procedures and didn't need more "stuff" to clog up my brain.
 

MajP

You've got your good things, and you've got mine.
Local time
Today, 05:35
Joined
May 21, 2018
Messages
8,525
  • Editable recordsets,
  • easier parameter handling,
  • and output parameters and return values.
Thanks. Yeah, I forgot this whole project is really read-only, so I was not thinking about updating, but that would be important. Is ADO preferred over linked tables? I would assume it is more efficient since you are processing on the server side.
I will have to play with it more to see what is easier in the parameter handling, since it seems pretty straightforward with SPs too.
I have not played with SP output parameters, but why would ADO be more advantageous for dealing with outputs? Thanks.
I got somewhat proficient in writing ADODB back when we were told DAO was going away, so I do not have a preference either way.
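From what I have read so far, the ADO route with an output parameter would look roughly like this (the connection string and procedure name uspCountNodes are made-up placeholders):

```
'Sketch: ADODB.Command calling an SP with an output parameter.
'Connection string and procedure name are placeholders.
Dim cmd As ADODB.Command
Set cmd = New ADODB.Command
cmd.ActiveConnection = "Provider=MSOLEDBSQL;Server=MyServer;" & _
                       "Database=MyDb;Integrated Security=SSPI;"
cmd.CommandType = adCmdStoredProc
cmd.CommandText = "dbo.uspCountNodes"
cmd.Parameters.Append cmd.CreateParameter("@RootID", adInteger, adParamInput, , 1)
cmd.Parameters.Append cmd.CreateParameter("@NodeCount", adInteger, adParamOutput)
cmd.Execute , , adExecuteNoRecords
Debug.Print cmd.Parameters("@NodeCount").Value   'the output value
```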
 

WayneRyan

AWF VIP
Local time
Today, 10:35
Joined
Nov 19, 2002
Messages
7,122
MajP,

You will definitely want a server-side VIEW.
A view built on a recursive CTE is very cool for traversing your hierarchy.

The view is accessible either as a linked table or as an ADO recordset.

Google recursive CTE.
Google materialized (indexed) views if you want to index.
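The bare-bones shape of the pattern (table and column names are just examples):

```sql
-- Recursive CTE sketch: walk a self-referencing table top-down.
-- Table/column names (Node, NodeID, ParentID) are illustrative.
WITH Tree AS (
    SELECT NodeID, ParentID, NodeName, 0 AS Depth
    FROM dbo.Node
    WHERE ParentID IS NULL                     -- anchor: the root rows
    UNION ALL
    SELECT n.NodeID, n.ParentID, n.NodeName, t.Depth + 1
    FROM dbo.Node AS n
    JOIN Tree AS t ON n.ParentID = t.NodeID    -- recursive step
)
SELECT * FROM Tree ORDER BY Depth;
```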

HTH,
Wayne
 

WayneRyan

AWF VIP
Local time
Today, 10:35
Joined
Nov 19, 2002
Messages
7,122
MajP,

For parameters, look at table-valued functions.
They can use your recursive CTEs.
They look like tables, but AFAIK you can't link them in Access ... you can only reach them via pass-through/ADO, though they can be used freely inside SQL Server queries.
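Roughly this shape (all names invented for illustration):

```sql
-- Inline table-valued function sketch: a "parameterized view".
CREATE FUNCTION dbo.fnChildren (@ParentID int)
RETURNS TABLE
AS
RETURN
    SELECT NodeID, NodeName
    FROM dbo.Node
    WHERE ParentID = @ParentID;
GO
-- Composes like a table in other server-side queries:
-- SELECT * FROM dbo.fnChildren(42);
```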

Sorry, typing on phone S L O W L Y.

Wayne

P.S. Experiment in SSMS; SQL Server's querying is way ahead of MS Access in most respects.
 

Minty

AWF VIP
Local time
Today, 10:35
Joined
Jul 26, 2013
Messages
10,368
I agree with Wayne - I use stored procedures to return complex datasets sometimes, and tbh they aren't normally something you would need to update.

CTEs are very useful with recursive data, as are the built-in ROW_NUMBER and partitioning functions for returning very specific datasets.
 

MajP

You've got your good things, and you've got mine.
Local time
Today, 05:35
Joined
May 21, 2018
Messages
8,525
@WayneRyan,
Thanks. This is one reason I chose to migrate this DB: to leverage recursive CTEs. I knew these existed already, so I was planning to use them. The genesis of the DB was to display a family-tree-like sort that required traversing the tree both in depth and in breadth. The code and the queries got pretty complex, and putting them all together was hard to keep straight. I have read up on table-valued functions and am trying to understand where to use them in place of SPs. One author describes them as a parameterized view in their flexibility, and states he uses SPs for actions and table-valued functions for selecting. I have not gotten to the recursive part yet, but I will try to use some table-valued functions in place of stored procedures.
 

Pat Hartman

Super Moderator
Staff member
Local time
Today, 05:35
Joined
Feb 19, 2002
Messages
43,233
If you are going to continue to use Access as the FE, there is no upside to "doing everything" on the server. There may be some things you have to change about the application if you used old school Access practices like filters on forms. But if your forms are already bound to queries with criteria, there is probably nothing urgent that needs to change. Just delete the old links and relink to the server.

Here is code to delete the dbo prefixes:
Code:
Public Function Remove_DBO_Prefix(strPrefix As Variant)
 
Dim obj         As AccessObject
Dim dbs         As Object
Dim strName     As String
 
    Set dbs = Application.CurrentData
    'Make sure the prefix ends with an underscore, e.g. "dbo" -> "dbo_"
    If Right(strPrefix, 1) <> "_" Then
        strPrefix = strPrefix & "_"
    End If
    
    'Walk the AllTables collection and strip the prefix where found
    strName = "Begin --" & vbCrLf
    For Each obj In dbs.AllTables
        If Left(obj.Name, Len(strPrefix)) = strPrefix Then
            strName = strName & obj.Name & vbCrLf
            DoCmd.Rename Mid(obj.Name, Len(strPrefix) + 1), acTable, obj.Name
        End If
    Next obj
    
    Debug.Print strName
    MsgBox strName
End Function

I've been using Access since the early 90's. I fell in love with it when I realized that I could link to DB2 (IBM's relational database) on the mainframe just as if the tables were local Jet, and I haven't looked back. Almost all my apps use SQL Server or some other RDBMS (DB2, Oracle, Sybase, Progress, etc.). With proper client/server techniques and an understanding of how Access works with linked tables, you don't have to rebuild your apps with unbound forms and pass-through queries and stored procedures.

Over the years I've had to occasionally use views to optimize joins or pass through queries to run bulk updates and even more rarely - stored procedures for complex reports. In no case do I use unbound forms for normal edit forms. I have used unbound forms to create very complex search forms if I choose to not limit the rows returned by the query. But the forms are never updateable. The user would double click on an item and open a normal bound form to edit the selected record. My BE's frequently contain million row tables but that is irrelevant unless I bind a form to that table and expect to filter it locally with Access.
 

Isaac

Lifelong Learner
Local time
Today, 02:35
Joined
Mar 14, 2017
Messages
8,777
+1 for whoever commented on TVFs ... I had no clue how unbelievably valuable they were (from several different angles) until not long ago, when I worked for a bank SQL development team on huge call-center data with a lot of parameterized 'stuff' and a lot of collaborative SQL objects in general.

Boy, you talk about versatility - the table valued function yeah.

As far as indexing views, more power to whoever has found them beneficial, but I hammered away on those for a while and ended up finding so many 'gotchas' (just things that disqualified the view from being index-able), that I mentally abandoned it in favor of simply making sure my source Tables were indexed 'just right' and that my View SQL was as well optimized as possible. But I am not criticizing; if you find them helpful it's great.

@MajP I do the same thing too - saved PT query object, I call it qryBucket and just dump sql into it as needed.

Lastly, one of the (many) nice things about properly saving/storing/organizing/structuring stuff on SQL Server and then minimally coding the Access invocation of it (i.e. exec dbo.procname 'value', 'value') to me, is this:

Even though you might start out assuming that Access surely ought to allow you to paste ANY valid T-SQL code chunk into a pass-through query and run it ... such is not quite the case. There are a variety of tiny things (which make NO sense to me) that, if included in your T-SQL, will make a pass-through query unable to work properly - with vague error messages that will generally lead you nowhere.

But, encapsulated in a SP, most if not all of those problems seem to fade.

Overall, since SQL Server allows a veritable mountain of advantages in organization, scripting beyond ANSI SQL statements, commenting, version control, etc., I find that a compelling reason; but I appreciate Pat's perspective too - it's all a matter of balance, how much is worth the time of converting, etc.

A lot of my apps that do end up using SQL as a back end get slated, someday, for conversion by the inimitable ".Net team" ( :ROFLMAO: ), and the first thing they're going to do is put a 100% bulletproof server-side setup in place anyway.
 

Pat Hartman

Super Moderator
Staff member
Local time
Today, 05:35
Joined
Feb 19, 2002
Messages
43,233
The last two Access apps I built that the IT department decided were mission critical (and so needed to control) cost millions of dollars and years of effort to convert, even though the Web teams were starting with a working data model and code base. Each app took less than 6 months and cost the client ~$100,000 to develop in Access. Each ended up taking over two years and ~$3 MILLION to convert. In both cases, the teams elected to start by destroying the properly normalized data model, and it went downhill from there. I'm in the wrong line of business I guess :(
 

MajP

You've got your good things, and you've got mine.
Local time
Today, 05:35
Joined
May 21, 2018
Messages
8,525
@Pat Hartman
If I understand what you are saying, then in many cases simply linking the tables and using local queries is sufficient, even if the table on the backend is very large. My understanding is that Access is much more efficient with linked tables than it was years ago. Besides form filters, what else counts as "old school Access practices like filters on forms"?

My BE's frequently contain million row tables but that is irrelevant unless I bind a form to that table and expect to filter it locally with Access.
So if you have a table of that size, are you saying you would need to use a pass-through query, or would you use a local query against a linked table and bind that to the form to reduce the records returned? Thanks.

For this current project, I am purposely doing everything on the server to practice converting queries and using the power of SQL Server, but I want to understand the different ways of doing things: what is the best approach, as well as what is easier and still viable.
 

Minty

AWF VIP
Local time
Today, 10:35
Joined
Jul 26, 2013
Messages
10,368
@MajP Most of the principles of a separate backend apply, e.g. don't drag every record onto a form if you don't need to.
Where you don't need an updatable combo for drop-down options, create views on the server to supply display lists instead of joining to lookup-value tables in Access.

There are many advantages to SQL Server. I suspect if we all thought about it, the list would be very extensive.
Things that immediately spring to mind are support for complex calculated joins, and if necessary joins on different data types without complaining.
Inbuilt string aggregation, row numbering and partitioning functions are incredibly efficient.
In a stored procedure you can create indexed temporary tables, then manipulate them and join them to other temp tables and real tables, and then output a result in a really efficient fashion that lets you work in a set-based process without iterating over rows.
Merge statements allow you Insert, Edit and Delete in one process if applicable.
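As a sketch of both ideas together, something like this inside a procedure (all names are illustrative):

```sql
-- Indexed temp table as a staging area, then a set-based MERGE.
CREATE TABLE #Staging (NodeID int PRIMARY KEY, NodeName nvarchar(100));
INSERT INTO #Staging (NodeID, NodeName)
SELECT NodeID, NodeName FROM dbo.SourceNode;

-- One statement inserts, updates, and deletes to match the staging set
MERGE dbo.TargetNode AS tgt
USING #Staging AS src ON tgt.NodeID = src.NodeID
WHEN MATCHED THEN UPDATE SET tgt.NodeName = src.NodeName
WHEN NOT MATCHED THEN INSERT (NodeID, NodeName)
                      VALUES (src.NodeID, src.NodeName)
WHEN NOT MATCHED BY SOURCE THEN DELETE;
```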

We use a lot of Azure SQL backends, so moving data around has more of an overhead than on a local SQL Server. Because of that, we tend to move almost all the heavy processing to the server. We also use local tables for many of the lookup values and have routines in place to keep them updated and accurate. The cloud-based storage medium has many benefits for our clients.

I'm sure once you experiment further you will be a strong convert to the improvements available to you.
 
Last edited:

Pat Hartman

Super Moderator
Staff member
Local time
Today, 05:35
Joined
Feb 19, 2002
Messages
43,233
What I said was that all my editable forms are bound, including the ones where the tables contain millions of rows. The point is that the form is bound to a query that returns only the rows and columns needed for the immediate form. That means that I take advantage of both the ability of SQL Server to retrieve rows from large tables quickly, by being specific in what I ask for, and the Access RAD feature of bound forms.

Since I make sure not to use UDFs and other things that cause Access to ignore my where clause and request all records, my queries are essentially pass-through even though they are Access querydefs against linked tables. Access makes every effort to ensure that all queries are passed through to the server, although you can defeat this.

There is a certain amount of overhead doing things this way, because Jet/ACE do act as middlemen, but the overhead is not significant enough to make the forms appear slow and I doubt you could even measure it. We know there is overhead because pass-through queries are sent directly to the server, whereas querydefs go through the ODBC driver before being sent to the server, but the end result is the same: only the specific rows and columns are returned by the server.

When I develop apps, I always lean toward client/server methods since they work fine with Jet/ACE and should I have to convert to a different RDBMS, I have no work to do except to relink the tables and test everything carefully to make sure I haven't overlooked something like a DAO operation where an Identity column is involved and I need to use arguments that I wouldn't need if the recordset were Jet/ACE for example ---
Set rsIn = qd.OpenRecordset(dbOpenDynaset, dbSeeChanges)

Jet/ACE work with or without the arguments, SQL server needs them.

When I have to convert a database created by someone else, I search the code for DAO/ADO and fix it up as needed, then I just convert the tables and see what happens, so I get an idea of what I need to change. Most of the time, I can add one or two text boxes to the header of an edit form to select the record I want. The form's query uses criteria that reference these unbound controls, which makes the form always open "empty". If there is only one search box, I use its AfterUpdate event to requery the form. If I use multiple criteria, I add a button so the requery happens when the button is pressed. Sometimes I create search forms when the selection criteria are complex, and return minimum columns in a subform which is a snapshot and not updateable. Double-click then opens the edit form.
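As a sketch, the form's recordsource looks something like this (form, control, and table names are invented for illustration):

```sql
SELECT CustomerID, LastName, FirstName, Phone
FROM dbo_Customer
WHERE CustomerID = Forms!frmCustomerEdit!txtFindID;
```

While txtFindID is empty the criterion is Null, so the form opens with no rows; the control's AfterUpdate then requeries the form to fetch just the one record.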

One record from a million row table can be located with a rudimentary binary search in less than 20 read operations against an index. The technique is to find the midpoint of the index and then use either the upper or lower half to do the next split depending on whether the ID is higher or lower than the midpoint.

I've spent most of my career as a consultant. I don't get to decide whether to use SQL Server or Oracle. I use whatever the client wants me to use. Therefore, being able to use Access querydefs for the vast majority of my work means I can muddle through with whatever BE the client wants me to use and believe me, I've used some strange ones. This tends to happen in small companies that buy an accounting package or something specific to their industry. It uses some RDBMS and so that's what the Access apps use. And they don't even know it but should they upgrade to an ERP that uses something else, a relink of the BE is pretty much all the Access app needs. Granted every RDBMS has its own cool features but they are all implemented slightly differently or not at all with other RDBMS' so I stick to the basics. Only if I run into a roadblock or really bad performance do I move out of Access and into "native" mode. I always take advantage of Access' RAD features both to save development time as well as to add flexibility to the app. It's not like my days developing CICS transactions for huge insurance companies where they were running thousands of transactions per minute and I had to provide sub-second response time for all but a very few of the ones that I wrote.

Bottom line is use the tool the way that makes the tool most useful. If you are going to do everything by hand, you shouldn't be using Access as an FE; you should be using something with less overhead, a smaller footprint, and more flexibility. If you are going to use Access as the FE, use Access to take advantage of what Access does for you. The company will decide to ditch Access as an FE at some point, so it is a waste of time and money to "fix" things that simply are not broken. Not "fixing" things that aren't broken is very hard for most of us. I've spent years, both as a manager and as a developer, learning to keep my eye on the ball.

We aren't paid to use the latest and greatest bright shiny things. We are paid to make our employer money. We are paid to make their data as accurate as is humanly possible and to make our applications as simple and foolproof as possible. While I do agree that you need to learn some things about SQL Server to make the Access app effective, pick your battles. If the application is not appreciably slower than it was when you started, the conversion is a success. If you do it in a week, you're a HERO. If you do it in two months, they'll probably be very happy, because they won't have a clue that you could have accomplished the actual task in a week.
 

Isaac

Lifelong Learner
Local time
Today, 02:35
Joined
Mar 14, 2017
Messages
8,777
Minty brought up a list of cadillac features for sure, +1
I once learned Windowing, extra fancy partitioning, but forgot all of it due to never having a use case to practice it on.

Good reminder on the string aggregation (if you meant STRING_AGG). On this issue I admit to being an "old dog". I learned FOR XML PATH in the past, and now am so in the habit that I have never used STRING_AGG, even though I'm sure I should.
Probably neither of them provides an optimization advantage, since they are usually used in the context of a sub-optimal correlated subquery anyway (RBAR). But this reminds me ... I am using FOR XML PATH now but really must stop and get on board.
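Side by side, the two look roughly like this (table and column names are illustrative):

```sql
-- Old habit: FOR XML PATH concatenation, trimming the leading ", "
SELECT STUFF((SELECT ', ' + NodeName
              FROM dbo.Node
              FOR XML PATH('')), 1, 2, '') AS AllNames;

-- Newer (SQL Server 2017+): STRING_AGG does the same in one call
SELECT STRING_AGG(NodeName, ', ') AS AllNames
FROM dbo.Node;
```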

Just SQL Server's way of letting you freely type step after step after step - with comments and meaningless white space - is amazing, even if JUST for readability and ease of development.
You can do extremely helpful things in each step, like Minty points out with indexing a temp table.

One of my favorite recent updates is DROP ... IF EXISTS, e.g. DROP TABLE IF EXISTS #temp.

No longer do I have to never-endingly look up the two old methods of testing whether an object existed (even a session temp table), whose syntax I could never remember; I can be lazy.

Then I also think of the system views, like INFORMATION_SCHEMA.
Talk about useful! Search them for any code text in the definition of any view or SP - it makes it a lot easier to find potential impacts, what reports or processes already exist, etc. It's nearly like the ability to read your colleague developers' minds.
Unless you have a very strict environment where the DBAs have decided thou shalt not.
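For example, a search like this (the search text is a placeholder; note that sys.sql_modules holds the full definition, while INFORMATION_SCHEMA.ROUTINES truncates long ones at 4000 characters):

```sql
-- Find every object whose definition mentions some text
SELECT OBJECT_SCHEMA_NAME(object_id) AS SchemaName,
       OBJECT_NAME(object_id)        AS ObjectName
FROM sys.sql_modules
WHERE definition LIKE '%SomeTableName%';
```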

you can tell i kind of like sql server
 

oleronesoftwares

Passionate Learner
Local time
Today, 02:35
Joined
Sep 22, 2014
Messages
1,159
I am converting my first DB to SQL Server and teaching myself to use SQL Server as a backend. To teach myself as much SQL Server as possible, I am doing everything on the server side even though the database is relatively small
It's advisable to read a book or take a short course on SQL rather than using a live project to teach yourself.

After the course or book, once you have read and understood some principles, you can then work on a live project.
 

MajP

You've got your good things, and you've got mine.
Local time
Today, 05:35
Joined
May 21, 2018
Messages
8,525
@Pat Hartman,
In the scenario where you leverage linked tables and local queries, do you do anything special to handle subforms that may be based on a very large table too? If the main form is based on a local query that returns one record, are the subform records handled efficiently?
 
