Performance question generally - @Variables vs. a Join to a small reference table

Isaac
Lifelong Learner · Joined Mar 14, 2017 · Messages: 8,777
Many times in my T-SQL code I use variables, such as @DateStart and @DateEnd. The general purpose is of course to use them in a WHERE clause, such as:

where (columnname between @DateStart and @DateEnd)

Also, the query often takes long enough to run that I want to execute it in chunks while developing and testing.

But if I'm running a chunk of code from late in the query, variables like @DateStart are of course out of scope, so I'd have to re-run the whole thing.

To solve this, I often use code like this near the beginning:

DROP TABLE IF EXISTS #DateParameters;
CREATE TABLE #DateParameters (DateStart datetime, DateEnd datetime);
INSERT #DateParameters (DateStart, DateEnd) VALUES (@DateStart, @DateEnd);


... So that later in my code I can instead simply run this:

from table1
inner join #DateParameters dp on (table1.columnname between dp.DateStart and dp.DateEnd)
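
Put together, the pattern might look like the sketch below (dbo.table1 and columnname are placeholders for the real table and date column). The point of the temp table is that it survives across batches in the same session, while variables do not:

```sql
-- Batch 1: capture the parameters once.
DECLARE @DateStart datetime = '2024-01-01',
        @DateEnd   datetime = '2024-03-31';

DROP TABLE IF EXISTS #DateParameters;
CREATE TABLE #DateParameters (DateStart datetime, DateEnd datetime);
INSERT #DateParameters (DateStart, DateEnd) VALUES (@DateStart, @DateEnd);
GO

-- Batch 2, run later on its own: @DateStart is gone,
-- but #DateParameters still exists in this session.
SELECT t.*
FROM dbo.table1 t
INNER JOIN #DateParameters dp
    ON t.columnname BETWEEN dp.DateStart AND dp.DateEnd;
```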


My question is, do you think one or the other is necessarily better or worse for performance? Of course there is only one record in #DateParameters, but there are usually millions if not more in the large tables.

I already know I can "try it and test it"; however, that's easier said than done: I often run and re-run code many times in the same connection, so there may be cached query plans and/or cached data skewing the results. Even though I can test it, I'm interested in thoughts or insight nonetheless.

Thanks!
 

SQL_Hell
SQL Server DBA · Joined Dec 4, 2003 · Messages: 1,360
Personally, I don't think it will make a lot of difference. Creating the temp table and doing the join will be slightly slower, but probably only by microseconds. The key thing for performance here is whether you have a good index on the date column in the table with millions of rows. Run both queries, one with the temp table and one with variables, and look at the execution plan of each; that will give you an indication of the relative cost of the join.
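
One way to make that comparison, sketched below with the same placeholder names as the original post (dbo.table1, columnname are illustrative; #DateParameters is assumed to exist as shown earlier in the thread). Run it in SSMS with "Include Actual Execution Plan" turned on:

```sql
SET STATISTICS IO, TIME ON;

-- Form 1: variables.
DECLARE @DateStart datetime = '2024-01-01',
        @DateEnd   datetime = '2024-03-31';
SELECT COUNT(*)
FROM dbo.table1
WHERE columnname BETWEEN @DateStart AND @DateEnd;

-- Form 2: join to the single-row temp table.
SELECT COUNT(*)
FROM dbo.table1 t
INNER JOIN #DateParameters dp
    ON t.columnname BETWEEN dp.DateStart AND dp.DateEnd;

SET STATISTICS IO, TIME OFF;
-- Compare logical reads on table1 in the Messages tab; with a good
-- index on columnname, both forms should be close.
```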
 

Minty
AWF VIP · Joined Jul 26, 2013 · Messages: 10,371
I think it depends on the type of data/variables.
We produce reporting outputs for a client that stores millions of rows of pricing data for a certain group of products.

When we want to query a subset of it, we place the Product_IDs into an indexed temp table and join on it to produce the report.
This is measurably quicker than any other technique we've tried, including an IN (12, 15, 20, ...) list.

For a plain BETWEEN two dates, probably not, provided the source date columns are indexed.
However, we also produce weekly reporting, e.g. pricing data for every Friday over a three-month period. Again, putting those dates into a temp table and joining on them is vastly quicker than any other method.
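
The indexed-keys pattern described above might look like this sketch (dbo.PricingData and its columns are illustrative names, not the poster's actual schema):

```sql
DROP TABLE IF EXISTS #WantedProducts;
CREATE TABLE #WantedProducts (Product_ID int PRIMARY KEY);  -- the PK provides the index

INSERT #WantedProducts (Product_ID) VALUES (12), (15), (20);

-- Join on the keyed temp table instead of
-- WHERE p.Product_ID IN (12, 15, 20):
SELECT p.Product_ID, p.PriceDate, p.Price
FROM dbo.PricingData p
INNER JOIN #WantedProducts w
    ON w.Product_ID = p.Product_ID;
```

The same shape works for the weekly-dates case: a temp table of the wanted Friday dates, joined to the pricing table on the date column.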
 

isladogs
MVP / VIP · Joined Jan 14, 2017 · Messages: 18,235
I suggest you watch two Access Europe videos, which cover optimizing queries in some detail.

The first is my AEU presentation for queries using Access tables:

The second is by Maria Barnes and covers similar ground for SQL Server data:
 
