SQL Server 2014 Live Today - Customers Moaning

SQL_Hell

SQL Server DBA
OK, good, thanks.

So the config looks fine, all at default settings.

But the indexes are really fragmented, so we need to rebuild them. I will be back tomorrow with code to reindex; in the meantime, don't try to rebuild them yourself, as it has to be done out of hours.

Also, can you tell me what the record source of the problematic form is: a query, direct tables, or a stored procedure?

If the reindex doesn't completely fix it, then we need to look at tuning, with the ultimate goal of this form opening in under a second.
 

PiedPiper70

Registered User.
SQL_Hell

Thanks again. Don't worry about out of hours, because I'm doing all this on my own SQL Server and will do it eventually on the client's. Neither of them is in use at the moment.

As for the troublesome form, it's not the only thing that seems slow, but it's the one form they use the most and that above all has to be quick. This is how it works, leaving out a few little bits of detail:-

It is an unbound form with 12 tabs. 5 of the tabs have subforms on them and nothing else; the remaining tabs have corresponding tables, and all of the tables share the same unique ID. To open the form I have a master module that runs 7 functions, one for each table and tab; each opens a recordset with DAO, uses FindFirst to find the required record and then copies the required fields to the form.
 

SQL_Hell

SQL Server DBA
Hello again,

Read this; in fact, read anything by Brent Ozar, as his articles and advice are extremely good.

http://www.brentozar.com/archive/2013/09/index-maintenance-sql-server-rebuild-reorganize/

And download IndexOptimize.sql from Ola Hallengren

http://ola.hallengren.com/downloads.html

Once you have the script, don't create the stored procedure in your application database; create yourself a new database called SQLAdmin or something similar, then run it from there. You can use this database for other useful scripts and anything you are testing, rather than making your application database messy.
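Creating that admin database is a one-liner if you haven't done it before (SQLAdmin is just the example name, use whatever you like):

Code:
-- Create a separate admin/utility database so the application database stays clean
CREATE DATABASE SQLAdmin;
GO
-- Then open IndexOptimize.sql (or the full MaintenanceSolution.sql) in SSMS,
-- switch to this database and run it, so the stored procedures are created here
USE SQLAdmin;
GO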

Some examples of running the stored procedure are here (down the bottom):

http://ola.hallengren.com/sql-server-index-and-statistics-maintenance.html

But basically you want option B which is:

Code:
EXECUTE dbo.IndexOptimize
@Databases = 'USER_DATABASES',
@FragmentationLow = NULL,
@FragmentationMedium = 'INDEX_REORGANIZE,INDEX_REBUILD_ONLINE,INDEX_REBUILD_OFFLINE',
@FragmentationHigh = 'INDEX_REBUILD_ONLINE,INDEX_REBUILD_OFFLINE',
@FragmentationLevel1 = 5,
@FragmentationLevel2 = 30,
@UpdateStatistics = 'ALL',
@OnlyModifiedStatistics = 'Y'
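In plain English: indexes with fragmentation between 5% and 30% get reorganised (falling back to a rebuild if a reorganise isn't possible), anything above 30% gets rebuilt (online where the edition supports it, otherwise offline), indexes below 5% are left alone, and statistics are updated where they have been modified. That reading is based on Ola's documentation of those parameters, so double-check it against the page linked above.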

In your last post you mention using DAO recordsets (excuse me, but my memory of Access has almost diminished to nothing). Are these being populated by strings of SQL? If so, post them and we'll create indexes for them, but I am fairly confident the reindex will improve things drastically.
 

PiedPiper70

Registered User.
I created a new database called SQLAdmin as suggested, then downloaded the full MaintenanceSolution.sql, installed it and executed it. But now what? As a total novice I have no idea what to do next. Would you mind spelling it out, as I can't see where I can run something and select version B.

I can see a steep learning curve ahead and it feels that I'm only on the first step.
 

SQL_Hell

SQL Server DBA
OK, so what you have done is create the stored procedures, but not actually run them.

They should be in your SQLAdmin database, so the code to execute is:

Code:
Use SQLAdmin
EXECUTE dbo.IndexOptimize
@Databases = 'USER_DATABASES',
@FragmentationLow = NULL,
@FragmentationMedium = 'INDEX_REORGANIZE,INDEX_REBUILD_ONLINE,INDEX_REBUILD_OFFLINE',
@FragmentationHigh = 'INDEX_REBUILD_ONLINE,INDEX_REBUILD_OFFLINE',
@FragmentationLevel1 = 5,
@FragmentationLevel2 = 30,
@UpdateStatistics = 'ALL',
@OnlyModifiedStatistics = 'Y'
 

PiedPiper70

Registered User.
Right, got it. Easier than I realised.

So then I executed the code you sent me to check the top 20 worst-fragmented indexes, but I get zero results. The column headers are there but no rows.

Sorry, being a bit thick today, but any ideas where I'm going wrong?

PS it goes without saying - a thousand thanks for all your help
 

SQL_Hell

SQL Server DBA
Perhaps you are running the index fragmentation code in your SQLAdmin database rather than in LCSDat?
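If it helps, a typical fragmentation check looks something like the following (this is the usual DMV approach, not necessarily the exact script posted earlier), and it has to be run with the application database as the current database:

Code:
USE LCSDat;  -- the application database, not SQLAdmin
GO
-- Top 20 most fragmented indexes in the current database
SELECT TOP (20)
       OBJECT_NAME(ips.object_id)        AS table_name,
       i.name                            AS index_name,
       ips.avg_fragmentation_in_percent,
       ips.page_count
FROM   sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
JOIN   sys.indexes AS i
       ON  i.object_id = ips.object_id
       AND i.index_id  = ips.index_id
WHERE  ips.index_id > 0         -- skip heaps
  AND  ips.page_count > 100     -- ignore tiny indexes
ORDER BY ips.avg_fragmentation_in_percent DESC;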
 

PiedPiper70

Registered User.
Good news: sorted that out.
Not so good: no change to index fragmentation. I presume it's me!

Thanks for spending the time on this. I'm out for an hour or so, so I'll have a fresh look when I get back.
 

kevlray

Registered User.
I think just having SQL rebuild the indexes should clear everything up. I do not remember the command off the top of my head, but I think it was pretty simple.
 

Rx_

Nothing In Moderation
For rebuilding indexes I used DBCC DBREINDEX, and for defragmentation DBCC INDEXDEFRAG.
A reindex completely rebuilds the index, removing fragmentation at all levels.
An index defrag shuffles the leaf pages into order, removing fragmentation at the leaf level, but potentially leaving fragmentation at the other levels of the index.

Doing a defrag right after a reindex is redundant and a waste of time and server resources.
A reindex does all that a defrag does and more.

You would typically use a defrag if you have only a small window (it's often faster than a reindex) or you need the table accessible during the defrag. (On SQL 2000, an index rebuild is an offline operation.)
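For what it's worth, both of those DBCC commands are deprecated these days; from SQL Server 2005 onwards the equivalents are ALTER INDEX REBUILD and ALTER INDEX REORGANIZE (the table and index names below are just placeholders):

Code:
-- Equivalent of DBCC DBREINDEX: rebuild every index on one table
ALTER INDEX ALL ON dbo.YourTable REBUILD;

-- Equivalent of DBCC INDEXDEFRAG: reorganise (defrag) a single index
ALTER INDEX IX_YourIndex ON dbo.YourTable REORGANIZE;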



http://www.sqlservercentral.com/articles/statistics/109879/

This link might be useful for your quest.

DBCC CHECKDB

http://www.sqlservercentral.com/blogs/sqlservernotesfromthefield/2012/05/29/dbcc-checkdb/

Let me suggest that you run DBCC CHECKDB first. This is somewhat like the MS Access Compact and Repair: at least make sure the structure doesn't have a problem, and possibly fix the links if there are any.
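A basic run would be something like this (LCSDat being the application database mentioned earlier in the thread):

Code:
-- Check the whole database for corruption; NO_INFOMSGS keeps the output readable
DBCC CHECKDB ('LCSDat') WITH NO_INFOMSGS, ALL_ERRORMSGS;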
 

SQL_Hell

SQL Server DBA
@SQL Hell, is there no option for manually defragging indexes within SQL Server Management Studio? Like right-clicking on an index and drilling down through the options? Just wondering.

Hey, yeah, there is an option to rebuild an index on its own, but no GUI option to rebuild all indexes (there might be in SQL Server 2014, however). There is also the Maintenance Plan Wizard, where it is very easy to create a re-index task and schedule it as a SQL Server job.

But honestly, scripting your own admin stored procedures or using something flexible like the Ola Hallengren script is really the way to go, IMO.
 

PiedPiper70

Registered User.
SQL_Hell and others

Apologies for not responding sooner but have been away for a few days break - needed it!

I manually rebuilt the worst of the indexes, a table at a time, and that seemed to improve things, approximately halving the time to open the key form I mentioned previously. This is good, but unfortunately it wasn't the magic bullet for my mate, who is still getting around 7 seconds - far too long.

My next move is probably to look into stored procedures and how to use them from an Access front end.
 

SQL_Hell

SQL Server DBA
Thanks for the update. It is a good idea to change to stored procedures, but you won't see any performance benefit from it, unfortunately.
 

Galaxiom

Super Moderator
Staff member
It is a good idea to change to stored procedures, but you won't see any performance benefit from it, unfortunately.

Depends on how it is done. On its own that change would be unlikely to show much difference.

I don't really know why, but I found that subforms with Master/Child LinkFields inherently performed very poorly against large linked SQL Server tables.

I changed to directly loading a recordset as the Recordset property of the subform, using the OnCurrent event of the main form. This recordset was returned from a parameterised stored procedure to which I passed the value that would have been in the MasterLinkField if it had been a traditional subform structure.

The performance of the subform went from several seconds on each main form record change to almost instant.
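The SQL Server side of that is just a parameterised stored procedure. A minimal sketch (the table, column and procedure names here are only placeholders for illustration):

Code:
-- Returns the child rows for one parent record; the Access code passes in the value
-- that would otherwise have driven the Master/Child link
CREATE PROCEDURE dbo.usp_GetChildRows
    @ParentID int
AS
BEGIN
    SET NOCOUNT ON;

    SELECT ChildID, SomeField, AnotherField
    FROM   dbo.ChildTable
    WHERE  ParentID = @ParentID;
END;

The recordset returned by executing this is then assigned to the subform's Recordset property in the main form's OnCurrent event, as described above.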
 

SQL_Hell

SQL Server DBA
Galaxiom, thanks for adding that; in that case it may well improve performance.

I was referring to the common misconception that SQL Server treats stored procedures differently from ad hoc queries in terms of performance; it doesn't.
 

PiedPiper70

Registered User.
Thanks to everyone who has contributed to this thread, it's been very useful. I'm still not sure exactly what to do to make things faster, in particular the main multi-tab form, but for anyone who's interested, this is what I'm planning to try.

1. There are 17 tabs on this form. The first is a Summary tab. Then there are 7 tabs which display fields, all unbound, from 7 different tables. The remaining 9 tabs have subforms on them. The whole thing is driven by a master module. The Summary tab is populated last, from data already contained elsewhere on the form.

2. First stage - there are 7 functions which obtain the data to fill the relevant 7 tabs. I plan to create a stored proc for each of the 7 tables and alter those functions to use the SPs as the data source instead.

3. Re-work the subforms to also use stored procedures.

4. If the above gives me the speed I need, then stop there.

5. If not, then produce the Summary screen on its own (a separate challenge) and only load/requery everything else when each tab is clicked on.

I'm hoping steps 2 and 3 give me what I need, as step 5 would be tricky.

I'll report back and let you all know how I'm getting on. And I'm always happy to hear any comments even if it's "you must be mad" or "I wouldn't do it like that". Comments like "that's exactly what I would do" also welcome.

Thanks
Dave
 

Rx_

Nothing In Moderation
That is a good plan.
But, as per my experience and another recent launch, let me strongly suggest this.
Your SQL Server is a Virtual Machine.
Regardless of what you are told, make sure that the system admin completely verifies the SQL Server virtual machine's virtual memory is not being swapped to disk. This is a very, very tricky thing. Have him research it and ask about it on SQL Server Central or some other blog.

You should personally view Task Manager, specifically for the SQL Server instance, and just look at the charts.

Next, you should personally view Task Manager for your instance of MS Access, which is probably being run on a virtual server for remote terminal sessions.

In Task Manager, how much memory does it max out at? How much memory is being cached? How high do the processor peaks go, and how often are they hit?

I will bet you a can of smoked kippers this will make an impact on your speed. B.T.W. the loser has to eat the smoked kippers. ;)
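If you want a quick sanity check from inside SQL Server as well as Task Manager, these two DMVs show whether the instance itself is seeing memory pressure (just a suggestion to complement the Task Manager checks; interpreting the numbers is still the sys admin's job):

Code:
-- How much memory the SQL Server process is actually using, and whether Windows
-- has signalled that physical or virtual memory is low
SELECT physical_memory_in_use_kb,
       memory_utilization_percentage,
       process_physical_memory_low,
       process_virtual_memory_low
FROM   sys.dm_os_process_memory;

-- What the OS (or the VM) is reporting to SQL Server about total/available memory
SELECT total_physical_memory_kb,
       available_physical_memory_kb,
       system_memory_state_desc
FROM   sys.dm_os_sys_memory;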
 

keithkram

New member
I am the system administrator for the 3 servers PiedPiper70 is referring to.

All 3 SQL Server 2014 servers are virtual servers running on Windows Server 2012 hosts. All 3 are configured the same, the only difference being the host specifications.

PiedPiper70's host server has the following spec:

Single 6 Core Xeon Processor
32GB RAM
4 x 2TB SATA Drives Mirrored in Pairs

My Development Server is

Dual 4 Core Xeon Processors
96GB RAM
4 x 600GB SAS Drives Mirrored in Pairs

Customer's Server

Dual 4 Core Xeon Processors
128GB RAM
6 x 600GB SAS Drives Raid 10 for Data
2 x 256GB Solid State Drives Mirrored for OS

All Virtual Servers have

8GB RAM
4 Virtual Cores
Dynamic Virtual Drives (although we have tried Fixed ones)
Their own Virtual Network Switch on a 1GB NIC not used by the Host

The thing that does not make sense is that PiedPiper70's lowest-spec server performs the best. We have monitored performance and none of the servers is short of memory or CPU, the hard disks have no queues, and network traffic is not a problem, as we get the same performance on the client's system when there are no other users on it.

Basically, PiedPiper70's system is giving 1-second response times for opening the main form, while mine and the client's systems are giving between 4 and 8 seconds. So why are the higher-spec servers performing the worst? Is it something to do with the SAS drives, as that is the main difference?

Anyone have any ideas? The existing Access version of the system performs the same on all 3 systems.
 

PiedPiper70

Registered User.
Update from my previous message:-

I have created a series of stored procedures and have replaced the code for all the pages that display fields of data. This has made some difference to the overall speed of the form opening, but I need more.

So I'm looking into using stored procs again to provide the data for the 11 subforms. I'm trying the method suggested by CJ London, whereby I delete the record source for the subforms and only set it when each tab is clicked on. The problem is that the OnClick event of the actual tab does nothing. I've got the code to work if I click on the tab page itself, but I want to trigger it from the tab label at the top.

It should be easy, but I just don't get it. Can anyone help with that bit, please?
 
