Help needed - tricky issue

dtdukes86

Member
Local time
Today, 11:37
Joined
Aug 14, 2021
Messages
83
Hi all, it has been a while since I last posted, but here we go. I'm after some advice if possible.

So I purchased isladogs' (such a clever and nice guy) postcode lookup code and added it to my existing database - not every part, just the very basic postcode lookup form. It works great and I'm happy with it.

The postcode lookup returns its result instantly on 1 computer (marks-pc) out of the 8 on my local network. That is to say, the time taken to collect the data from the net using isladogs' JSON code, write it to my SQL Server, read it back from the server and finally dump it in a list on a form is very small - about 1 second on this particular computer. Great stuff!!

My issue is that on the other 7 computers, the process takes 30+ seconds or more. It always works, but for some reason it just takes ages!! I found that the more results it finds for the postcode, the longer it takes. For example, if there is only 1 address it is faster (but still not fast), and if there are, say, 40 results, it takes a minute or more. It always does complete, but the wait time is bad - often it goes to 'Not Responding' and you just have to wait for it to finish going crazy, and then the results are there.

Steps taken so far:
- I've checked all the computer specs, and some are even better than the computer that runs the code within 1 second (marks-pc)
- I've made sure all systems are using SSDs and have gigabit LAN connections
- Checked all cables are Cat6
- Tested read and write speeds over the LAN between the PCs and the server

The server is using 2 SSDs in RAID (backup clone mode - I can't remember if that's RAID 0, 5, or what). The SQL Server version is 2022, I think - definitely fairly new.

Finally, I ran network tests, optimizers, etc.

I found that marks-pc can write 15 GB of data to the server in 14 minutes, whereas my PC (doms-pc) takes over 19 minutes to write the same file to the same server folder.

These two PCs, mine and Mark's, are pretty much identical and sit next to one another in the same room. In fact, mine has more RAM (12 GB to marks-pc's 8 GB). They are both 4th-gen i5s, both gigabit, both DDR3, both Win 10 64-bit on the same build, both running Office 2016 Pro Plus, and both use the same SQL driver (version 13).

It's driving me insane. I can't imagine it's a server issue or a file issue, as we all have a copy of the same version of my front end. The front end gets twice-weekly updates and revisions, so marks-pc gets a new front end often, but it always keeps the speed on the postcode lookup. I don't think it's to do with the code - it's very basic now, and marks-pc has nothing different from the other machines.

Any thoughts? If this is posted in the wrong place, I apologize in advance.
 

Minty

AWF VIP
Local time
Today, 11:37
Joined
Jul 26, 2013
Messages
10,371
Is the database file in a trusted location on both machines?
 

tvanstiphout

Active member
Local time
Today, 03:37
Joined
Jan 22, 2016
Messages
222
I would have a debugging session on a slow machine, and find out exactly what line of code is slow. Then come back here, and we can figure out next steps.
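For example, wrap each stage with a timer - a rough sketch, where GetPostcodeJson, WriteResultsToServer, frmLookup and lstResults are placeholders for whatever your routines and controls are actually called:

Code:
' Rough profiling sketch - the called routines are placeholders.
Sub ProfilePostcodeLookup(strPostcode As String)
    Dim t As Single, json As String

    t = Timer
    json = GetPostcodeJson(strPostcode)   ' stage 1: fetch from the web
    Debug.Print "Web fetch: "; Format(Timer - t, "0.00"); " s"

    t = Timer
    WriteResultsToServer json             ' stage 2: write to SQL Server
    Debug.Print "SQL write: "; Format(Timer - t, "0.00"); " s"

    t = Timer
    Forms!frmLookup!lstResults.Requery    ' stage 3: read back and display
    Debug.Print "Read/requery: "; Format(Timer - t, "0.00"); " s"
End Sub

The Debug.Print output in the Immediate window will tell you which stage is eating the 30+ seconds.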
 

Isaac

Lifelong Learner
Local time
Today, 03:37
Joined
Mar 14, 2017
Messages
8,777
What I usually do for postcodes is download a listing from the internet (they're free everywhere) and simply make a table with a DLookup.
How can it possibly be more complicated than that? This autofills city and state.
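Roughly like this - a sketch assuming the listing has been imported into a local table; tblPostcodes and the field/control names are just examples:

Code:
' AfterUpdate event of the postcode textbox on a form.
' Assumes a local table tblPostcodes(Postcode, City, State).
Private Sub txtPostcode_AfterUpdate()
    Dim strCrit As String
    strCrit = "Postcode = '" & Replace(Me.txtPostcode, "'", "''") & "'"
    Me.txtCity = DLookup("City", "tblPostcodes", strCrit)    ' Null if not found
    Me.txtState = DLookup("State", "tblPostcodes", strCrit)
End Sub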
 

CJ_London

Super Moderator
Staff member
Local time
Today, 11:37
Joined
Feb 19, 2013
Messages
16,612
I can't imagine it's a server issue
Maybe not, but it can be - a SQL Server admin can configure it to prioritise a user/group or process over others. (Don't ask me how.)

I had a client who migrated their BE from Access to SQL Server with the expectation of improved performance. In fact performance was worse (far worse), and the reason was that the app was considered by IT to be low priority, so only minimal resources were allocated. They weren't prepared to budge, so the BE was migrated back to Access.
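For what it's worth, the usual mechanism for this kind of prioritisation is SQL Server's Resource Governor (an Enterprise-edition feature, as far as I know). If you want to rule it out, here is a quick pass-through check of which workload group your session lands in - the connection string details are placeholders for your own:

Code:
' Shows the Resource Governor workload group for the current session.
' Needs VIEW SERVER STATE permission (no problem if you connect as SA).
Sub CheckWorkloadGroup()
    Dim qdf As DAO.QueryDef
    Set qdf = CurrentDb.CreateQueryDef("")   ' temporary pass-through query
    qdf.Connect = "ODBC;DRIVER={ODBC Driver 13 for SQL Server};" & _
                  "SERVER=YourServer;DATABASE=YourDb;UID=sa;PWD=YourPw;"
    qdf.SQL = "SELECT g.name FROM sys.dm_exec_sessions s " & _
              "JOIN sys.dm_resource_governor_workload_groups g " & _
              "ON s.group_id = g.group_id WHERE s.session_id = @@SPID;"
    qdf.ReturnsRecords = True
    Debug.Print "Workload group: "; qdf.OpenRecordset()(0)
End Sub

If every machine comes back in the 'default' group, Resource Governor isn't the culprit.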
 

dtdukes86

Member
Local time
Today, 11:37
Joined
Aug 14, 2021
Messages
83
What I usually do for postcodes is download a listing from the internet (they're free everywhere) and simply make a table with a DLookup.
How can it possibly be more complicated than that? This autofills city and state.
I'd be interested in a copy of the list if you could please share one? It's not something I've looked into before, but I'm willing to give it a go.
 

dtdukes86

Member
Local time
Today, 11:37
Joined
Aug 14, 2021
Messages
83
I would have a debugging session on a slow machine, and find out exactly what line of code is slow. Then come back here, and we can figure out next steps.
I did do this to some degree, and it appears the part of the process that drags its feet is writing the data to the SQL Server. Downloading it and reading it from the table is fast, but the write seems slower on every machine. I'm not sure the issue is entirely down to the write speed, but the write speed on marks-pc is faster than on mine.
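One thing I'm going to look at: if my code inserts the results one row at a time into the linked table, each row is a separate network round trip, which would also explain why more addresses means a longer wait. A rough sketch of sending the whole batch in one pass-through statement instead (made-up table/field names, placeholder connection string):

Code:
' Writes all address rows in one statement instead of row-by-row.
' SQL Server allows up to 1000 rows per VALUES list - plenty for a postcode.
Sub WriteAddressesBatch(addresses As Collection)
    Dim qdf As DAO.QueryDef, addr As Variant, sql As String

    If addresses.Count = 0 Then Exit Sub
    sql = "INSERT INTO dbo.tblPostcodeResults (AddressLine) VALUES "
    For Each addr In addresses
        sql = sql & "('" & Replace(addr, "'", "''") & "'),"
    Next
    sql = Left$(sql, Len(sql) - 1) & ";"     ' drop the trailing comma

    Set qdf = CurrentDb.CreateQueryDef("")   ' temporary pass-through query
    qdf.Connect = "ODBC;DRIVER={ODBC Driver 13 for SQL Server};" & _
                  "SERVER=YourServer;DATABASE=YourDb;UID=sa;PWD=YourPw;"
    qdf.ReturnsRecords = False
    qdf.SQL = sql
    qdf.Execute dbFailOnError
End Sub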
 

dtdukes86

Member
Local time
Today, 11:37
Joined
Aug 14, 2021
Messages
83
Maybe not, but it can be - a SQL Server admin can configure it to prioritise a user/group or process over others. (Don't ask me how.)

I had a client who migrated their BE from Access to SQL Server with the expectation of improved performance. In fact performance was worse (far worse), and the reason was that the app was considered by IT to be low priority, so only minimal resources were allocated. They weren't prepared to budge, so the BE was migrated back to Access.
Hmmm, I'm not really sure where to begin with this idea. I've never set or allowed any user any particular access; everyone just logs into the SQL Server using a data connection and the SA password, which I know isn't the most secure, but it works.
 

cheekybuddha

AWF VIP
Local time
Today, 11:37
Joined
Jul 21, 2014
Messages
2,280
I found that marks-pc can write 15 GB of data to the server in 14 minutes, whereas my PC (doms-pc) takes over 19 minutes to write the same file to the same server folder.
Check basic things too - try swapping the network cables between the two PCs and see if that affects your results.
 

dtdukes86

Member
Local time
Today, 11:37
Joined
Aug 14, 2021
Messages
83
Check basic things too - try swapping the network cables between the two PCs and see if that affects your results.
Yes, I tried this in case cable quality or port location made a difference. It didn't have any effect; the difference seems to be the computers rather than the cabling or whatnot, despite them being very similar.
 

AHeyne

Registered User.
Local time
Today, 12:37
Joined
Jan 27, 2006
Messages
92
Did you check the "SQL Server Configuration Manager" on your SQL Server?
Take a look at "SQL Server Network Configuration", choose your SQL Server instance, and activate TCP/IP.

Then restart your SQL Server instance service and try again.
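You can also confirm from each client which transport the session actually ended up on - a client silently falling back to Named Pipes instead of TCP/IP can be dramatically slower. A minimal pass-through sketch; the connection string is illustrative:

Code:
' Prints the protocol of the current session:
' 'Shared memory', 'TCP', or 'Named pipe'.
Sub CheckTransport()
    Dim qdf As DAO.QueryDef
    Set qdf = CurrentDb.CreateQueryDef("")   ' temporary pass-through query
    qdf.Connect = "ODBC;DRIVER={ODBC Driver 13 for SQL Server};" & _
                  "SERVER=YourServer;DATABASE=YourDb;UID=sa;PWD=YourPw;"
    qdf.SQL = "SELECT net_transport FROM sys.dm_exec_connections " & _
              "WHERE session_id = @@SPID;"
    qdf.ReturnsRecords = True
    Debug.Print "Transport: "; qdf.OpenRecordset()(0)
End Sub

Run it on marks-pc and on one of the slow machines and compare the results.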
 

Pat Hartman

Super Moderator
Staff member
Local time
Today, 06:37
Joined
Feb 19, 2002
Messages
43,275
They weren't prepared to budge, so the BE was migrated back to Access.
Was it possible that the forms were bound to tables or to queries with no criteria? That's what gets most people on a straight conversion. They've been using Jet/ACE techniques and bringing all the data local for filtering so every time a form opens, it brings down the entire table instead of just what the user wants to update. Makes DBAs think of Access as just sitting there with a straw sucking all the data across the LAN, constantly. And they would be right. Forms need to be bound to queries with selection criteria to limit the rows and columns retrieved. That lets the server do the heavy lifting and minimizes LAN traffic as well.
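A minimal sketch of the idea - the form, control and table names are illustrative:

Code:
' Bind the form to a filtered query at open time so SQL Server only
' returns the rows the user asked for, not the whole table.
Private Sub Form_Open(Cancel As Integer)
    Dim id As String
    id = InputBox("Customer ID to edit:")
    If Len(id) = 0 Or Not IsNumeric(id) Then
        Cancel = True
        Exit Sub
    End If
    Me.RecordSource = "SELECT * FROM dbo_tblCustomers " & _
                      "WHERE CustomerID = " & CLng(id) & ";"
End Sub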
 

GPGeorge

Grover Park George
Local time
Today, 03:37
Joined
Nov 25, 2004
Messages
1,867
Maybe not, but it can be - a SQL Server admin can configure it to prioritise a user/group or process over others. (Don't ask me how.)

I had a client who migrated their BE from Access to SQL Server with the expectation of improved performance. In fact performance was worse (far worse), and the reason was that the app was considered by IT to be low priority, so only minimal resources were allocated. They weren't prepared to budge, so the BE was migrated back to Access.
I remember one client (think sports equipment and sports clothing in the billions of dollars of annual revenue), who gave our Access team a SQL Server to use. Unfortunately, it turned out to be a VM with just enough RAM to run and just enough disk space to store the database. We learned all this one day when the log file grew so big it consumed the entire available disk space and locked us out until IT shrank it for us. I want to say it was around 25 GB at the time, but that's so pathetic it can't possibly be an accurate memory. (Pun intended).
 

isladogs

MVP / VIP
Local time
Today, 11:37
Joined
Jan 14, 2017
Messages
18,225
What I usually do for postcodes is download a listing from the internet (they're free everywhere) and simply make a table with a DLookup.
How can it possibly be more complicated than that? This autofills city and state.
Unfortunately that isn't the case in countries such as the UK, where the OP is based. Postcode data is available from the Office for National Statistics, but it needs adapting for use in Access/SQL Server. The address data for each code is proprietary to the Post Office and sold under licence to various resellers.
 

isladogs

MVP / VIP
Local time
Today, 11:37
Joined
Jan 14, 2017
Messages
18,225
@dtdukes86
Hi Dom. I've only just seen this thread after someone emailed me. If you intended to alert me, use @ before the user name.
Many thanks for the comments in post #1.
Sorry to hear you're having speed issues on several but not all machines. I very much doubt it's my code, but I'm willing to look into it if that turns out not to be the case. I'll read the entire thread to see if I can offer any other suggestions beyond those already mentioned.

Good luck
 

Isaac

Lifelong Learner
Local time
Today, 03:37
Joined
Mar 14, 2017
Messages
8,777
I remember one client (think sports equipment and sports clothing in the billions of dollars of annual revenue), who gave our Access team a SQL Server to use. Unfortunately, it turned out to be a VM with just enough RAM to run and just enough disk space to store the database. We learned all this one day when the log file grew so big it consumed the entire available disk space and locked us out until IT shrank it for us. I want to say it was around 25 GB at the time, but that's so pathetic it can't possibly be an accurate memory. (Pun intended).
Sounds like some queries that the guys in accounting run who know just enough sequel to make the lights dim
 

GPGeorge

Grover Park George
Local time
Today, 03:37
Joined
Nov 25, 2004
Messages
1,867
Sounds like some queries that the guys in accounting run who know just enough sequel to make the lights dim
Well, I was aiming at a different aspect.

I was not precise enough; the VM itself had around 25 GB available in total for the mdf and ldf files.

Once we hit that, it stopped cold. The problem was that it was a development database, so there was a lot of churn going on, growing the ldf beyond what IT ever thought we'd need. Our process involved downloading huge files from an Oracle OLTP database for analysis. One or two of those ETL processes filled up the log file rapidly.
 

dtdukes86

Member
Local time
Today, 11:37
Joined
Aug 14, 2021
Messages
83
Sounds like some queries that the guys in accounting run who know just enough sequel to make the lights dim
Hahaha. I did make the entire thing local on one version of my DB to see if that helped, and I thought it would be lightning fast. But guess what: it was just as slow using local tables :(
 

dtdukes86

Member
Local time
Today, 11:37
Joined
Aug 14, 2021
Messages
83
Did you check the "SQL Server Configuration Manager" on your SQL Server?
Take a look at "SQL Server Network Configuration", choose your SQL Server instance, and activate TCP/IP.

Then restart your SQL Server instance service and try again.
Yes, covered this, mate. Thanks though.
 
