Does an Access query negate the benefits of utilizing SQL

gojets1721

Full disclosure, I'm completely new to SQL so I apologize if this is a silly question.

I am using SQL Server as a back end and Access as a front end. In general, does using an Access query to filter data for a form negate some of the speed and performance of SQL Server?
 
You are wanting a blanket answer but one cannot (honestly) be provided. It all depends.

Here are a few notes, in no particular order and not necessarily related, that I've observed over the years:

  1. It is common for people to point out that simply using SQL Server as a back end is not a cure-all for performance, since extremely low-performance design is still very possible. While I agree with that statement, it's also worth pointing out that sometimes simply using SQL Server will, in fact, speed things up - but we tend to avoid saying that (I think) because we don't want people to keep bad habits and just hope the mere move to SQL Server will erase the consequences. That's fine, but in my experience it's still a fact nonetheless.
  2. It is true that if you are going to use SQL Server, you can often realize a major gain in performance by using pass-through queries, written in T-SQL (saved in Access but executed on the server). In that case, my point #1 becomes especially relevant: you often WILL see major performance gains, even if the query design is not great and even if it's exactly the same as it was in Access. Again, we encourage people to do good design - not just move to a faster processor - but there's nothing wrong, I think, with pointing it out. Take a table with 20 million records and see how fast Access can return (and render on screen) "select * from table" vs. SQL Server, and the point is made.
To my mind, beyond those two points, nothing further can be made in the form of a blanket statement.
I have done a number of projects where my "overhaul" consisted of nothing more than moving the back end to SQL Server - changing nothing at all about the FE design or queries - before the project was decommissioned or the sponsors lost interest and moved me to other things. In most of those cases, performance improved considerably, on the sole basis of the new back end. Despite it being a very bad idea to rely on that in place of performant design, it is what it is.
And, after all, most of the world's tech users behave the same way, if we're being honest. Most people, when their laptop starts giving them problems or slowness, just go for the next biggest and fastest laptop rather than recognizing that an older, slower laptop works quite well if maintained and used correctly. But we have come to accept that. I suppose some of that mentality is present in tech design, too.

90% of the "our data warehouse load jobs are taking too long" conversations at the DBA + developer table consist of promises to "buy more cores" rather than to improve the design... even in the best of shops.
 
Short answer, "no". It's an apples and oranges comparison in many ways.

Long answer, "sometimes". You can write awkward SQL in Access queries that makes it hard to retrieve data efficiently from a remote database, and that may drag down performance. On the other hand, Access is pretty smart about how it executes queries, so many times there's no noticeable difference. And in still other situations, factors other than query design impact performance even more.

As a rule of thumb, I'd venture a guess that something like this would be pretty efficient as the recordsource for a form.

SELECT Field1, Field2, Field3, Field4, Field5
FROM Table
WHERE Field1 = 1000

And moving it into a passthru to process on the server would not make a noticeable difference in most cases.
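
If you do want to try it as a pass-through for comparison, a minimal sketch of creating one from VBA might look like this - the query name, connection string, and table here are only placeholders for your own:

Code:
' Minimal sketch: create a saved pass-through query with DAO (names and connection are assumptions)
Sub CreatePassThroughExample()
    Dim db As DAO.Database
    Dim qdf As DAO.QueryDef

    Set db = CurrentDb

    ' Drop the saved query if it already exists
    On Error Resume Next
    db.QueryDefs.Delete "qryPassThruExample"
    On Error GoTo 0

    Set qdf = db.CreateQueryDef("qryPassThruExample")

    ' ODBC connect string - adjust driver, server, and database to your environment
    qdf.Connect = "ODBC;Driver={ODBC Driver 17 for SQL Server};" & _
                  "Server=MyServer;Database=MyDatabase;Trusted_Connection=Yes;"

    ' The SQL is sent to the server untouched, so it is written in T-SQL
    qdf.SQL = "SELECT Field1, Field2, Field3, Field4, Field5 " & _
              "FROM dbo.MyTable WHERE Field1 = 1000;"
    qdf.ReturnsRecords = True

    qdf.Close
End Sub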
 
Gotcha. Thanks so much. This is very helpful info. Sorry to be so vague; what prompted me to ask this was also just a vague statement that I saw which essentially said "Using Access queries with SQL negates the benefits of SQL".

Your response is much more thorough.
 
Hi. Providing a link to where you read that might help us explain any confusion. Just a thought...
 
what prompted me to ask this was also just a vague statement that I saw which essentially said "Using Access queries with SQL negates the benefits of SQL"
Funny, now that you say it like that, it inspires me to do what might appear to be an about-face (but isn't really) and mention that I do, in fact, see some (usually older) Access developers who use SQL Server as a back end but never seem to have made the jump into actual SQL Server development - and wow, what a waste that seems to me. (In fact, I see the exact same phenomenon with VBA.)
It's like getting to the edge of paradise, taking a selfie and then going back home. At least, that's how I felt after I made the jump and realized I'd been missing out on what I now consider the "99%" of the fun and opportunity.

Maybe that's all they meant.
 
I am using SQL as a backend and access as a front end. In general, does using a query to filter data for a form negate some of the speed and performance of SQL?
Access makes every attempt to "pass through" all queries. That doesn't mean you can't prevent Access from passing through your query, though, so you need to understand a little about how Access and ODBC work.

Creating a pass-through query and saving it as a querydef is of course the most efficient, since Access sends your SQL to the server as-is and skips the translation step. Otherwise, you have the overhead of the querydef being pre-processed by ODBC. That doesn't mean the query won't be handled exactly as the pass-through query would have been; it just means you have a small amount of overhead for the ODBC code to run. In exchange for this very small hit, you end up with an updateable recordset that you can bind to a form, so you can use Access much the same way you always have. (You can't use pass-through queries in subforms.)

HOWEVER, you are still responsible for making your queries as efficient as possible. The old-style Access technique, which you see everywhere in all the MS samples, is to bind a form to a table and filter the local recordset using the built-in form tools. If the table is 100 rows, it makes absolutely NO difference what you do. But the larger your table gets, the higher the price you will pay for this method. You actually want the server to do the heavy lifting, and by that we mean use criteria passed to it to select the smallest number of records and columns possible, to minimize the data transferred over the LAN. Therefore, all your forms should be bound to queries with WHERE clauses that severely limit the number of rows returned. Preferably, edit forms would retrieve only a single record to edit. Subforms are constrained by the main form criteria, so they may contain many records. In most cases, I use a couple of textboxes or combos in the header of my edit forms, so the form always opens empty because the query returns no records.
Select ... From MyTable Where fld1 = Forms!myform!fld1;
Then, when the user enters a value in the search field, the code requeries the form:
Me.Requery
This runs the query again and returns the record. If there are several search options, I use a button to do the requery, but if there is only one, I use the AfterUpdate event so the requery runs when the user tabs out of the search field.
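A minimal sketch of that AfterUpdate handler, assuming an unbound search box named txtSearch in the form header (the names are only examples):

Code:
' Form's record source (names assumed): SELECT ... FROM MyTable WHERE fld1 = Forms!myform!txtSearch
Private Sub txtSearch_AfterUpdate()
    ' Re-run the record source so the server returns only the matching record
    If Not IsNull(Me.txtSearch) Then
        Me.Requery
    End If
End Sub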

My introduction to Access was at Reader's Digest in the early 90's, where I was asked to do a project that required retrieving data from their mainframe DB2 database (IBM). What an eye-opener this was for me, the old COBOL programmer. I fell in love instantly. I could now retrieve data from the database on the IBM mainframe AND UPDATE it if I needed to. I switched immediately to becoming an Access expert, and the vast majority of my applications written since the early 90's have used SQL Server, Oracle, DB2, Sybase - you name it - as BEs. I used whatever RDBMS the client's other applications were using (DB2 is still my favorite, though). Occasionally I used Jet/ACE, but I always built my Jet/ACE apps using exactly the same techniques I used for my ODBC apps.

My apps always use bound forms. Occasionally, I use an unbound form to display search results, especially if they are very complicated. I always use linked tables and querydefs. The only time I use embedded SQL is if the SQL is dynamic; otherwise, the querydefs just take parameters. Occasionally, I use a view to speed up a join that is frequently used. The view allows SQL Server to calculate an execution plan and save it, so it cuts overhead; otherwise the execution plan must be calculated for each query sent to the server, even a pass-through (unless it is executing a stored procedure). For some bulk updates (especially deletes), I create pass-through queries because they will run outside of Access' transaction umbrella. You know how, when you run an action query, Access always tells you something like, "You are about to delete 5893 records. Do you want to continue?" That is your clue that the delete is happening inside a transaction. This adds overhead. If you use a pass-through query, it just runs and you don't get a second chance to change your mind. And finally, every once in a while I have a report that is so complex and requires so many tables and joins that a stored procedure is called for.

So, start by assuming your querydefs and bound forms will work with acceptable speed (assuming you understand how to not interfere with Access passing through your query) and then deal with slowness on a case by case basis.
 
Access developers who use SQL as a back end but never seem to have made the jump into actual SQL Server development and wow, what a waste that seems to me. ...
It's like getting to the edge of paradise, taking a selfie and then going back home.
We have several tables in SQL Server, each with more than a million records. The tables are linked (DSN-less) in an Access FE, and we use what @GPGeorge suggested: queries that return 0 rows, and then we change the WHERE clause to what we need.
I have also done several tests to see performance changes if we use pass-through queries instead, and honestly I don't see any difference.

With your experience, do you think it's better to use pass-through queries (especially for search forms)?

Thanks
 
I am a SQL Server aficionado. I love it. One of the neglected features of using Access as a front end and connecting to a SQL Server database on the back end is the programming available in SQL Server. One can execute SQL Server stored procedures on the server from Access. Most Access development includes "action queries". A tremendous performance boost will occur when action queries are converted to stored procedures. I want to provide an example. Let's say you have an action query to update customer information. It might look like this:

Code:
' Access QueryDef:
UPDATE customers SET customers.first_name = "New First Name"
WHERE (((customers.customer_id)=1));

If this QueryDef were converted to a stored procedure, the database update performance would be immensely improved. Below is the Access VBA needed to call a DML stored procedure. The call to run the stored procedure occurs within a module-level procedure called ExecuteSQL. The code uses ADO. The procedure can be called from anywhere in an Access project - no need to repeat the code. DRY principle! This functionality is many times more efficient than running an action query out of Access. Please review the stored procedure further down the page.

Code:
' Access VBA to call the stored procedure update
Private Sub btn_UpdateFirstName_Click()

    On Error GoTo Err_btn_UpdateFirstName_Click
    With Me
        ExecuteSQL "EXEC uspUpdateCustomer " & .CustomerID & ", '" & .FirstName & "';"
    End With

Exit_btn_UpdateFirstName_Click:
    Exit Sub

Err_btn_UpdateFirstName_Click:
    MsgBox "Error number " & Err.Number & ": " & Err.Description
    Resume Exit_btn_UpdateFirstName_Click
End Sub

' Module code to run stored procedures 
Public g_adoConnection As New ADODB.Connection
Public g_adoCommand As New ADODB.Command
Public g_adoRS As New ADODB.Recordset


Public Function SetConnxn() As String

    On Error GoTo HandleError

    SetConnxn = "Provider=sqloledb;Data Source=<server name.>" & _
             "Initial Catalog=<database name>;User ID=<user name.;Password=<password>;trusted_connection=no"
          
HandleExit:
    Exit Function
HandleError:
    MsgBox Err.Description
    Resume HandleExit
End Function

Public Sub ExecuteSQL(strSQL As String, Optional lngTimeout As Long = 360)

  On Error GoTo Err_ExecuteSQL

  If g_adoRS.state = adStateOpen Then g_adoRS.Close
  If g_adoConnection.state = adStateOpen Then g_adoConnection.Close

  g_adoConnection.ConnectionString = SetConnxn
  g_adoConnection.Open
  
  g_adoCommand.CommandTimeout = lngTimeout
  g_adoCommand.ActiveConnection = g_adoConnection
  g_adoCommand.CommandText = strSQL

  Set g_adoRS = g_adoCommand.Execute

  If g_adoRS.state = adStateOpen Then g_adoRS.Close
  If g_adoConnection.state = adStateOpen Then g_adoConnection.Close

Exit_ExecuteSQL:
    Exit Sub

Err_ExecuteSQL:
    MsgBox "SQL statement could not be executed :" & vbCrLf & strSQL _
           & vbCrLf & Err.Description, vbOKOnly, "Avium " & Err.Source
    Resume Exit_ExecuteSQL

End Sub


Here is the stored procedure on SQL Server. It will execute the update much faster than the QueryDef would. Please notice the update_date field gets automatically updated when the stored procedure executes.

Code:
-- SQL Server stored procedure
USE [<database name>]
GO

drop procedure if exists [dbo].[uspUpdateCustomer]
go

set ansi_nulls on
go

set quoted_identifier on
go

-- =============================================
-- Author:            <developer>
-- Create date:     <date>
-- Description:      Update single customer first name
-- =============================================
create procedure [dbo].[uspUpdateCustomer]
    @ID integer = 0
  , @first_name nvarchar(255) = ''
as
begin
    update sales.customers
    set first_name = @first_name
      , update_date = getdate()
    where customer_id = @ID;
end;
go
 
With your experience, do you think it's better to use pass-through queries (especially for search forms)?

I don't think a blanket answer makes sense here. If you are experiencing unacceptable slowness, try a pass-through. But first, check whether you are doing something that is inherently bad for performance - the solution to that is adjusting your SQL approach, and it completely transcends the Access query vs. pass-through query question.

For some bulk updates (especially deletes), I create pass-through queries because they will run outside of Access' transaction umbrella. You know how, when you run an action query, Access always tells you something like, "You are about to delete 5893 records. Do you want to continue?" That is your clue that the delete is happening inside a transaction. This adds overhead. If you use a pass-through query, it just runs and you don't get a second chance to change your mind.
A much easier way to avoid that is to use CurrentDb.Execute. Creating a pass-through just to avoid that prompt, rather than simply using .Execute, doesn't make sense to me.
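
For example, something like this (the query and table names are just placeholders) runs without the confirmation prompts you get from DoCmd.OpenQuery:

Code:
' Run a saved action query without the "You are about to delete..." prompt
' qryDeleteOldOrders is an assumed querydef name
CurrentDb.Execute "qryDeleteOldOrders", dbFailOnError

' Or raw SQL against the linked table
CurrentDb.Execute "DELETE FROM Orders WHERE OrderDate < #1/1/2020#;", dbFailOnError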

I am not sure what joins and complexity have to do with the decision to make a stored procedure. The biggest benefit of stored procedures is that there is one, over-arching, centralized place to maintain code and enforce business logic, and it makes a lot more sense to have SQL inside those objects than it does pasting raw SQL into a pass-through. It's also, of course, awful to have to use the Access IDE rather than SSMS.

I, like many SQL developers, tend to avoid Views unless there is really no other way. The problem is something that isn't evident when first starting out. At first a View seems like the "perfect toy/tool" for seemingly just about every situation. The problem comes because they are just too easy to select from as if they were tables - and as such, the natural effect as time goes by is Views inside Views inside Views. Everyone wants to leverage an existing View rather than re-do the coding, which sounds good at first blush but actually leads to a lot of unchallenged assumptions (about underlying Views that contain logic the next-level-up select-er doesn't fully digest or even want), a LOT of bad performance (Views have to fully generate themselves every time they are referenced), and business logic buried so deeply that it discourages anyone from taking the enormous time needed to investigate all the way down the 'chain' to verify that it is what they really want. Not to mention ripple effects from someone altering just one of them at any time.

TVFs have the redeeming quality of accepting parameters, at least, and I like them.

I never paste raw SQL into a pass-through query. I make a proper stored proc, then only EXEC it in the pass-through.
All T-SQL is maintained in the database, where it belongs.
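
A rough sketch of that workflow, assuming a saved pass-through named qryPT_ArchiveOrders and a proc named dbo.uspArchiveOrders (both invented for illustration) - the saved query ends up containing nothing but the EXEC call:

Code:
' Point an existing saved pass-through query at a stored procedure and run it
Dim qdf As DAO.QueryDef
Set qdf = CurrentDb.QueryDefs("qryPT_ArchiveOrders")   ' assumed pass-through querydef
qdf.SQL = "EXEC dbo.uspArchiveOrders @CutoffDate = '2020-01-01';"
qdf.ReturnsRecords = False
qdf.Execute dbFailOnError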
 
Here is the stored procedure on SQL Server. It will execute the update much faster than the QueryDef would. Please notice the update_date field gets automatically updated when the stored procedure executes.
Not sure how you can make this statement about an update to a single column of a single row. Methinks you're overselling. :)
 
Not sure how you can make this statement about an update to a single column of a single row. Methinks you're overselling:)
It's an example. If the example is expanded to many more records with many more columns, the benefits become clearer. :-) There are so many more development features in SQL Server such as common table expressions, temporary tables, functions, triggers, table variables ... the list goes on. These features give developers the ability to create powerful stored procedures that far exceed what one could do with an Access action query. Instead of action queries, Access can call those powerful stored procedures and benefit from their sophistication.

I realize I'm getting away from the OP's question because he is inquiring about the performance of select statements with where clauses. I think it's important to consider all the other benefits of moving some Access development to the database server (SQL Server). I've been guilty of writing a bunch of VBA to slice and dice data before committing it to a database. Not good. One gets a big performance boost by moving all those action queries and DML-like VBA modules to SQL Server.
 
But you are selling this to people who have no clue. There is absolutely NOTHING superior about using a stored procedure in the way you demonstrated, and your target audience doesn't know that. You've just told them stored procedures are absolutely the best way to use Access, and that would be 100% wrong. If you want to show off your expertise, at least put your examples in context. There are one or two people who might occasionally need to use a stored procedure and need to know when it is appropriate and how to do it. Maybe create a sample database with side-by-side examples of using an Access update query vs. a stored procedure to do the same thing, along with timings.
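
For anyone who wants to build that comparison, a minimal timing sketch might look like this - the querydef and procedure names are placeholders, and ExecuteSQL refers to the ADO helper posted earlier in the thread:

Code:
' Rough side-by-side timing: Access action query vs. stored procedure call (names assumed)
Sub CompareUpdateTimings()
    Dim t As Single

    t = Timer
    CurrentDb.Execute "qryUpdateCustomers", dbFailOnError   ' Access action querydef on linked tables
    Debug.Print "Access action query: " & Format(Timer - t, "0.00") & " seconds"

    t = Timer
    ExecuteSQL "EXEC dbo.uspUpdateCustomers;"               ' the same update done server-side
    Debug.Print "Stored procedure:    " & Format(Timer - t, "0.00") & " seconds"
End Sub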
 
Maybe create a sample database with side-by-side examples of using an Access update query vs. a stored procedure to do the same thing, along with timings.
I really appreciate this comment. I've seen so many comments about the benefits of using SQL Server tools (stored procedures, pass-through queries, ...), but I have never seen a decent example. As I explained above, we have a very complex database with more than 90 tables linked to SQL Server, some of them with more than a million records, but everything we do here is as fast as the blink of an eye. I think those who designed it knew their job well. The only stored procedure in use is for backing up the server.
I am really interested to see how things can be faster and more convenient than this if someone has an example.

Not that I want to say SQL Server tools aren't preferable; I just want to see them in action.

Thanks.
 
@KitaYama That has been my experience also. I can write bad queries, but then I fix them. The key is understanding what elements cause problems that make Access and SQL do extra work. Maybe it's because my very first Access application used DB2 on our IBM mainframe as the BE. This is the project that changed my world view. Why stick with the tedium of a third-generation language where I had to code EVERYTHING when I could work with a RAD tool like Access and be a lazy bum? I've written my million lines of code; I don't need the practice. So I invested the time to understand event processing and the properties and methods of forms, plus the various VBA functions, so I didn't have to reinvent the wheel. Retraining myself to move from procedural logic to event logic was a challenge even though I was already a competent programmer. I also already knew that action queries were far superior to code loops. So my mantra is: let Access do what Access does, and I do what Access cannot do for me. That primarily comes down to data validation and whatever batch processing the application requires.

I really don't mean to discourage people from experimenting. You may find that stepping away from the "Access way" gives a significant enough performance increase to justify the extra work. But to me, if it ain't broke, don't try to fix it or you may break it. :)

I find that the people who push the use of non-native Access methods either don't actually understand Access or hate it and are convinced they know better, so they never bother to understand or learn how to use linked tables and bound forms.
 
I never paste raw SQL into a pass-through query. I make a proper stored proc, then only EXEC it in the pass-through.
All T-SQL is maintained in the database, where it belongs.
I always have doubts about the case being presented when someone uses the word "proper" to justify their position. :rolleyes:

However I do understand what you are saying.

The only time I can recall having used pass-through queries is when passing arguments to TVFs. I've never considered using pass through for performance. Like Pat says, Access will pass through what it can anyway. If it can't, then the process probably does belong on the server.
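
A rough sketch of what that looks like, with the function, querydef, and combo names invented for illustration (the code is assumed to live in the calling form's module):

Code:
' Pass an argument to a table-valued function through a saved pass-through query
Dim qdf As DAO.QueryDef
Dim rs As DAO.Recordset

Set qdf = CurrentDb.QueryDefs("qryPT_CustomerOrders")   ' assumed pass-through querydef
qdf.SQL = "SELECT * FROM dbo.tvfCustomerOrders(" & Me.cboCustomerID & ");"
qdf.ReturnsRecords = True

Set rs = qdf.OpenRecordset(dbOpenSnapshot)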

I don't often write to the server from Access. If it is just a value or two, the linked table works fine. For repetitive stuff I am more likely to send commands to a stored procedure if that is possible in the situation. Editing the SQL in a pass-through seems a bit clunky.

If it involves vast amounts of data being uploaded, then the performance is going to be terrible anyway, so I would probably try reading the Access database from the server instead, using OPENROWSET. That is what I do for data I need to read from Excel.
 
I find that the people who push the use of non-native Access methods either don't actually understand Access or hate it and are convinced they know better, so they never bother to understand or learn how to use linked tables and bound forms.
That comes across as very arrogant. Is that what you "find" or what you "assume"?
 
I always have doubts about the case being presented when someone uses the word "proper" to justify their position
Point taken, but I am making no attempt to justify my position by the word proper - I'll be very direct, I used "proper" because I think that's the proper thing to do in many cases. :)
 
I am really interested to see how things can be faster and more convenient than this if someone has an example.

You can even set aside performance for the moment: just the organization, simplification, and centralization is a reason to create SQL Server objects... in some cases. I can't tell you how many times I've seen Access databases where someone created and saved a new Query object just about every time they breathed, it seemed like. That's because a relatively trivial process in Access can require a dozen action queries. I don't know anyone substantially experienced in SQL Server development who would rather manage all that crap in the overly simplistic Access object list - it's just messy and hard to keep up with. You end up with the craziest naming conventions in the world, as there is no way to document anything. Managing a lot of that stuff in the form of SQL Server artifacts is a thousand times easier. IMO people who are resistant to it are the same crowd who are resistant to VBA; it's just new and unfamiliar and they'd rather not. I completely understand that - because at ANY time, any one of us has SOMEthing that we are currently resisting adopting or learning - it's human nature, and the nature of the development career.
Speaking of RAD, there is a big difference between rapidly creating stuff and then rapidly and efficiently maintaining it. People make too much out of the "how quickly can I create something" question, and too little of the "how well can I keep it organized over the next 5 years" aspect.

For that reason alone, SQL Server shines pretty bright - and that's just the beginning.
Like piano player said, it's a question of taking advantage of the sophistication. They may have oversold and generalized it a bit much, but I think they're headed in the right direction.

@KitaYama all I can say is, if you work on incorporating SQL Server as much as seems reasonable (for your own learning sake), in a year or two you will shake your head at the idea that you ever thought of doing 'just Access'. It will open doors that will swing much wider than the Access world - and the best thing is you can keep doing both if you like.
 
I've been trying hard with no significant progress. The first book I bought was so sophisticated that I had to stop trying to follow the writer in the middle of the first chapter. The second one is not helping much more than the first.

Can you guide me to a good starting point? Any online reading material, helpful site, blog, or book is OK.
Yes, I know I can google it, but I prefer to hear it from someone experienced rather than pick something at random myself.

Thanks.
 
