SSMS22 Application Itself Slower Than SSMS21

My laptop generally sits at about 25-30% memory usage.
But some programs will hog resources. A local SQL Server instance is notorious for grabbing as much memory as it can.
 
I've got plenty to spare, it must be something else.



Editing objects was really slow for me when using an Azure DB. Things definitely improved, as stated in recent posts, thanks to input from the forum. However, it was still too slow to continue developing. I changed to a local SQL Server instance with SSMS 22 and Object Explorer is now near instantaneous, with the exception of Diagrams, which take around 10 secs or so on initial load.
 
Have many experienced this? I'm not talking about performance through Access or query performance; I mean the actual application, SSMS22 itself, is extremely slow. Through Access it's quick, but I'm working in SSMS most of the time converting the BE for Access.
There seem to be quite a few complaints about the newer version itself being considerably slower than the previous one. I'll stay away from ver-25 for sure during the beta period, but research indicates MS have problems with it due to fully moving to an x64 system...

I think it's to do with the OLE DB/ODBC drivers; despite the computer being less than a year old and running MS365, it did not have ODBC Driver Ver 18 and I had to install it manually. When I was installing I got a prompt about ODBC drivers not being present (despite them being present). The recommendation was to delete all OLE DB drivers, uninstall SSMS-22 and reinstall, which I have done. Performance seems better in the first few minutes of playing around, but the Diagram view is still slow.
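As an aside, you can sanity-check which ODBC drivers a Windows box reports without opening the ODBC Data Source Administrator: the list lives in the registry under HKLM\SOFTWARE\ODBC\ODBCINST.INI\ODBC Drivers, where each driver appears as a `Name = Installed` value. A toy sketch that parses a hypothetical INI-style export of that key (the function name and sample text are mine, not anything Windows provides):

```python
import configparser

def installed_odbc_drivers(ini_text):
    """Parse an INI-style export of the ODBC driver list and return
    the names of drivers marked 'Installed'."""
    cp = configparser.ConfigParser()
    cp.optionxform = str  # preserve driver-name casing
    cp.read_string(ini_text)
    section = cp["ODBC Drivers"] if cp.has_section("ODBC Drivers") else {}
    return [name for name, state in section.items()
            if state.strip().lower() == "installed"]

sample = """\
[ODBC Drivers]
SQL Server = Installed
ODBC Driver 18 for SQL Server = Installed
"""
print(installed_odbc_drivers(sample))
```

If "ODBC Driver 18 for SQL Server" is missing from the list, SSMS prompting about absent drivers would at least be consistent.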

Specs
  • SSMS22 Managed Instance Of Azure
  • RAM 32 GB (50% consistent use); pretty sure this has increased considerably since installing/using SSMS (new to it); pretty sure this is due to the mass of services now running, but not too worried about this atm
  • CPU - hardly anything ever (2.5 GHz available)
  • WiFi - nowt; circa 8Kbps/ 30Kbps rare
Diagram View
  • Add Relationship
  • 30 secs to load Table & Column Definition
  • 50 secs to write changes
  • 3 minutes to save relationship changes (sometimes hangs, requiring a restart)
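One way to tell how much of those delays is SSMS UI overhead versus server round-trip latency is to time the equivalent operation issued directly (e.g. the same ALTER TABLE ... ADD CONSTRAINT from a plain query window instead of the Diagram designer). A generic timing-harness sketch; the callable passed in is a hypothetical stand-in for whatever operation you want to measure:

```python
import time

def time_operation(label, fn):
    """Run fn once and report its wall-clock time; useful for comparing
    the same logical change issued via different front ends."""
    start = time.perf_counter()
    result = fn()
    elapsed = time.perf_counter() - start
    print(f"{label}: {elapsed:.2f}s")
    return result, elapsed

# Example with a placeholder workload:
_, secs = time_operation("placeholder", lambda: sum(range(1000)))
```

If the raw DDL is fast but the designer is slow, the bottleneck is the tool rather than the server or the network.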
Concurrence
I know it would be quicker to use a local instance during development, but I understand I'm likely to run into problems on deployment. As I'm so inexperienced, I don't think I'd be capable of identifying those issues, so I'd rather suffer slow Azure development, I think.
I just installed SSMS22 on my laptop (an older one).
It seems to run just fine (loading-wise it's the same as, or even somewhat better than, SSMS21).

To be fair, SSMS (recent editions) is perhaps the slowest loading application I have!!!

So, my current laptop?
4ghz, 6 cores, (12 threads), 24 gigs ram.

And I find that SSMS22 actually seems better than the previous 21......

If I loaded SSMS one time (say during the day), and then re-load again?

SSMS comes up in 5.8 seconds. The connection screen comes up about 4-5 seconds later.

Hence this screen recording: ssms.gif


So, if anything, I find 22 loads a bit faster than 21....

Edit:
So, I would test/try against a local running copy of SQL Server. Opening tables in design view, or opening diagrams, occurs without any real delay.

This suggests something else is at play here - perhaps domain authentication or something else......

So, as an FYI, I'm not seeing any real "new" delays by using SSMS 22. However, as the screen video above shows, SSMS in recent editions does indeed load slowly, or better stated, slower than just about anything else I run on my computer; even the big whopper Visual Studio loads faster than SSMS.....

But, you should not be seeing such delays. As noted, I would test/try against a local running edition of SQL server. Once SSMS is loaded, then general operations occur without any real delays....


R
Albert
 
Thanks for doing that Albert, really appreciated. Yes, it must've been Azure's latency on the most basic Resource Group package spec... As per nr-22, I ditched Azure for the time being as it was too slow to develop on. Locally it's much faster, which was always going to be the case for obvious reasons, but it shouldn't have been as slow as what I was getting (unusable, really).
 

First, I will preface my comments by stating clearly that I have stayed away from this one because of zero direct experience with SQL Server and Azure. I will ask what might therefore be seen as dumb questions that are hopefully at least not totally irrelevant.

1. Using Task Manager >> Performance >> Memory, look at Memory Composition and determine how much, if any, FREE memory you have. On that line, the four categories are In Use, Modified, Standby, and Free, in that order. If "Free" memory is very low, that is not a good sign. Below that you have a "Committed" statistic shown as xxx.x/yyy.y, (probably) in units of GB. The second number (yyy.y) is the total VIRTUAL RAM size on your system, counting both physical RAM and the so-called "Virtual Memory" file (a.k.a. swap file). If the first number (xxx.x) is greater than the size of your physical RAM, you are likely using part of the virtual memory file that is controlled from the Windows "Settings" section. Use of virtual memory to any degree is a bad sign, because it means you are possibly swapping. There is an old saying: "When the O/S swaps, the world stops." In more blunt terms, performance during swapping sucks, because page swap operations are always executed at extremely high priority, potentially getting in the way of other things.
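That Task Manager reading can be turned into a quick rule of thumb. A small sketch, assuming you read the xxx.x/yyy.y "Committed" numbers and the physical RAM size off the screen yourself (the function name and the interpretation threshold are mine, not anything Windows exposes):

```python
def swap_pressure(committed_gb, commit_limit_gb, physical_ram_gb):
    """Interpret Task Manager's 'Committed' xxx.x/yyy.y figures:
    a commit charge above physical RAM suggests the pagefile is
    being leaned on, i.e. swapping is plausible."""
    return {
        "likely_swapping": committed_gb > physical_ram_gb,
        "commit_headroom_gb": round(commit_limit_gb - committed_gb, 1),
    }

# 20 GB committed on a 32 GB machine with a 40 GB commit limit:
print(swap_pressure(20.0, 40.0, 32.0))
```

Commit charge is a reservation rather than proof of paging activity, so treat a True here as a prompt to look closer, not a diagnosis.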

2. Do you have a persistent connection associated with that back end? If your network connection has to go through gyrations to get to the BE and you don't have a persistent connection for it to ride, you would spend a lot of network overhead in re-verifying your connection permissions. Because you get version-differentiated results, this one is unlikely, but if the two versions differ because of a patch to the connection manager code, it could happen.
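The overhead being described can be sketched abstractly: with a persistent connection the permission check is paid once; without one, it is paid on every operation. A toy simulation (the Backend class is purely illustrative, not any real API):

```python
class Backend:
    """Toy stand-in for a networked back end that re-verifies
    permissions on every new connection."""
    def __init__(self):
        self.auth_checks = 0

    def connect(self):
        self.auth_checks += 1  # permission re-verification happens here
        return self

def run_ops(n_ops, persistent):
    """Return how many auth round trips n_ops operations cost."""
    backend = Backend()
    if persistent:
        conn = backend.connect()      # authenticate once, reuse
        for _ in range(n_ops):
            pass                      # work rides the open connection
    else:
        for _ in range(n_ops):
            conn = backend.connect()  # re-authenticate every time
    return backend.auth_checks

print(run_ops(10, persistent=True), run_ops(10, persistent=False))
```

The gap between the two numbers is the overhead that disappears when there is a connection to "ride".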
 
The issue of a persistent connection doesn't apply to, affect, or help when using SQL Server. The persistent connection works by eliminating the time for Windows to grant op-locks, switch from single-user file mode to multi-user file mode, and create the .ldb locking file. In the case of SQL Server connections, there are no files on the back end that the front end is touching or opening. It's a pure TCP/IP socket connection, and no files are involved (hence no fix or benefit occurs from having a persistent connection). So when using a SQL Server back end, a persistent connection from the Access front end doesn't help. And of course this is SSMS connecting to SQL Server, so once again the concept of a persistent connection doesn't apply, nor help....
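Since the link really is just a TCP socket (SQL Server's default port is 1433), a basic reachability probe needs nothing more than a socket connect. A minimal sketch; the host and port values are placeholders:

```python
import socket

def sql_port_reachable(host, port=1433, timeout=2.0):
    """Attempt a plain TCP connect, the same transport SSMS rides on;
    no files on the server are opened or locked at this layer."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

A True here only proves the socket is reachable; authentication and the TDS handshake still happen above it, and those are where latency to Azure tends to pile up.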
 
Albert, I wasn't worried about files... I was worried about access arbitration, which applies to ALL network-based connections to some degree, particularly when either a certificate or a trust relationship is being exploited. Re-validating either kind of relationship adds some level of extra traffic on the net.
 
Isn't this thread supposed to be about performance of SQL Server Management Studio version 21 versus SQL Server Management Studio version 22?

Why are we even talking about Access in this context?
 
Why are we even talking about Access in this context?
I would imagine Doc is referring to 'access' in a protocol context, not MS Access. But you're right that I'm not really hooked up to MS Access; this relates purely to SSMS-22 & Azure.

1. four categories are In Use, Modified, Standby, and Free - in that order.
Thanks Doc, my trial with Azure has run out, but in the interest of learning: I didn't have an identical set of params to the ones you referred to, Doc, but I don't think it would be hardware related. Granted, I'm no longer connected to Azure, but whilst running half a dozen apps (SSMS included) I got the pic below.



2. Do you have a persistent connection associated with that back end? If your network connection has to go through gyrations to get to the BE and you don't have a persistent connection for it to ride, you would spend a lot of network overhead in re-verifying your connection permissions.

Google says the default with a single SSMS window is a persistent connection, though this seems to contradict that.

I'm going to continue locally for now but will look at Azure once I have the application built locally. Performance was good when I tested pulling data with Access with Azure, but for developing the application with Azure & SSMS-22 was not a fruitful experience.
 
Last edited:
Google says the default with a single SSMS window is a persistent connection, though this seems to contradict that.

If there is a window that you keep open, that counts as a persistent connection. Operating without an open window still works, but if there is no other persistent connection, you have to go through little-a access arbitration more often than you would like. That Windows operation is related to permissions and security and usually doesn't really care if someone else is in the same folder from another machine, though someone COULD in theory put an exclusive file lock on something via a Windows action. Access can do that if you set the "Open Exclusive" option in the File >> Options >> Database section. Excel does it all the time, because Excel worksheets are Open Exclusive cases unless you use options that override the exclusivity.

Big-A Access arbitration (record-level locking) is strictly about checking the .LACCDB or .LDB file to see which data blocks are already in use; it is just looking for two users in the same block. Usually, Access doles out data in 8-block chunks = 4096 bytes, because of old memory-management considerations and the fact that the size of a physical disk block is 512 bytes (1/2 KB).
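The arithmetic behind that chunk size checks out and can be written down explicitly (the constant names are mine):

```python
DISK_BLOCK_BYTES = 512   # classic physical disk block, i.e. 1/2 KB
BLOCKS_PER_CHUNK = 8     # Access hands out data in 8-block chunks

CHUNK_BYTES = DISK_BLOCK_BYTES * BLOCKS_PER_CHUNK
print(CHUNK_BYTES)  # 4096
```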
 
Thanks Doc, that is extremely interesting & I want to learn more of that kind of thing.
 
