'All-in-one' database very slow on network

Alc

Registered User.
Local time
Yesterday, 21:44
Joined
Mar 23, 2007
Messages
2,407
I have a database where all items, including tables, are stored in the same application.

When the database is copied to my desktop and run, a certain process takes between 11 and 20 seconds to complete.

When the same process is run with the database being stored on a shared network, it takes close to an hour to run.

I've tested this on three PCs now and the results are the same.

I always thought that if the tables were stored locally it would make no difference where the database was located. Clearly I was wrong, but why? What difference does the network make, in this case?
 

pbaldy

Wino Moderator
Staff member
Local time
Yesterday, 18:44
Joined
Aug 30, 2003
Messages
36,125
If you store it on the network, all the objects, data, etc. are brought over the network to be processed on your computer. Storing it on the network doesn't mean the work is being done there, and that's a pretty big difference.
 

CJ_London

Super Moderator
Staff member
Local time
Today, 02:44
Joined
Feb 19, 2013
Messages
16,612
I can believe it. If you have a slow network, there is not much you can do about that aspect of it.

However, as pbaldy says, all objects are copied across - including forms, reports, etc. This can be eliminated by splitting your db and having the front end on your PC and the back end on the network.
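
For what it's worth, once it's split the front end just points its linked tables at the back-end file, so relinking is a one-off step - something along these lines (the back-end path here is obviously a made-up example):

Code:
' Repoint every linked table in the front end at the back-end file on the network.
' Requires a reference to the DAO / Access database engine object library.
Sub RelinkTables()
    Dim db As DAO.Database
    Dim tdf As DAO.TableDef
    Const BACKEND_PATH As String = "\\server\share\MyData_be.accdb" ' placeholder path

    Set db = CurrentDb
    For Each tdf In db.TableDefs
        If Len(tdf.Connect) > 0 Then        ' only linked tables have a Connect string
            tdf.Connect = ";DATABASE=" & BACKEND_PATH
            tdf.RefreshLink
        End If
    Next tdf
End Sub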

The other thing to ensure is that your queries, forms and reports bring over as little data as possible, both in terms of the number of records and the number of fields. Simple things help, like not having a form with a whole table as its recordsource and then filtering it - instead, don't bring anything across until the search criteria are set, and apply them to the recordsource rather than via a filter.
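
Something along these lines is what I mean - the form, control and table names are just placeholders:

Code:
' Start the form empty, then build the recordsource from the search criteria,
' so only the matching rows ever come across the network.
Private Sub Form_Load()
    Me.RecordSource = "SELECT * FROM tblOrders WHERE 1 = 0"    ' returns no rows
End Sub

Private Sub cmdSearch_Click()
    Me.RecordSource = "SELECT OrderID, OrderDate, CustomerID " & _
                      "FROM tblOrders " & _
                      "WHERE CustomerID = " & Nz(Me.txtCustomerID, 0)
End Sub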
 

Alc

Registered User.
Local time
Yesterday, 21:44
Joined
Mar 23, 2007
Messages
2,407
Thanks both.

I've been brought in to improve and build on dozens of databases that mechanical engineers here have built over the years. No two have the same design and each has its own little quirks.

Where possible, I've been splitting the databases into front and back ends, but I was mainly doing it to reduce the risk of losing everything if one end became corrupt. I didn't realize it would have the added advantage of speed.

Just as an aside, pbaldy, one of the other databases here links to data warehouses in Canada and the US. The network connections vary between poor and unbearable. I've had a query running for 4 hours now and it still hasn't finished. On a 'good' day? Takes 10 minutes.
 

pbaldy

Wino Moderator
Staff member
Local time
Yesterday, 18:44
Joined
Aug 30, 2003
Messages
36,125
Ouch. Hard to work around poor connections. Could it be run at night when perhaps there are fewer people sucking up bandwidth?
 

Alc

Registered User.
Local time
Yesterday, 21:44
Joined
Mar 23, 2007
Messages
2,407
Ouch. Hard to work around poor connections. Could it be run at night when perhaps there are fewer people sucking up bandwidth?
It will be, once it's working. Unfortunately, I need to test the changes I make during the daytime.
 

CJ_London

Super Moderator
Staff member
Local time
Today, 02:44
Joined
Feb 19, 2013
Messages
16,612
I didn't realize it would have the added advantage of speed.

Only a little bit - the main thing is the amount of data. Think about a web page: it brings very little data through in terms of volume, e.g. a list of items might show just 25 records at a time, whereas in Access you may be bringing through hundreds if not thousands.
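
Something like this as a recordsource (the table and field names are just placeholders) only ever asks the back end for 25 rows:

Code:
' Pull only the first 25 rows, the way a web page pages its results.
Private Sub Form_Load()
    Me.RecordSource = "SELECT TOP 25 ItemID, ItemName " & _
                      "FROM tblItems ORDER BY ItemName"
End Sub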
 

JHB

Have been here a while
Local time
Today, 03:44
Joined
Jun 17, 2012
Messages
7,732
..
Just as an aside, pbaldy, one of the other databases here links to data warehouses in Canada and the US. The network connections vary between poor and unbearable. I've had a query running for 4 hours now and it still hasn't finished. On a 'good' day? Takes 10 minutes.
Just one idea: can't you check whether there is any new or updated data and fetch only that?
 

Alc

Registered User.
Local time
Yesterday, 21:44
Joined
Mar 23, 2007
Messages
2,407
Just one idea: can't you check whether there is any new or updated data and fetch only that?
Thanks for the response.

The reports being pulled in this case come from one table and are a count of each time a certain value appears in a certain column. For example:
Column R1 - 1, 4, 5, 4, 7, 9, 6, 7....hundreds of values
Column R2 - 2, 4, 2, 3, 7, 9, 1, 3....hundreds of values
and so on, all the way up to column R47.
Each number is a customer satisfaction score. The person running the report enters a threshold number and looks for the number of responses below, equal to, and above that threshold.
The table itself is completely overwritten once a month, so all data is new, unfortunately.
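
Roughly speaking, the count for a single column looks like this - tblScores, column R1 and the other names here are simplified stand-ins, not the real ones:

Code:
' Below / equal / above counts for one column, in a single pass over the table.
Sub CountAgainstThreshold(ByVal lngThreshold As Long)
    Dim rs As DAO.Recordset
    Dim strSQL As String

    strSQL = "SELECT " & _
             "Sum(IIf([R1] < " & lngThreshold & ", 1, 0)) AS BelowCount, " & _
             "Sum(IIf([R1] = " & lngThreshold & ", 1, 0)) AS EqualCount, " & _
             "Sum(IIf([R1] > " & lngThreshold & ", 1, 0)) AS AboveCount " & _
             "FROM tblScores"

    Set rs = CurrentDb.OpenRecordset(strSQL)
    Debug.Print rs!BelowCount, rs!EqualCount, rs!AboveCount
    rs.Close
End Sub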

Run from my C drive, I've produced a report in 11 seconds, which everyone's perfectly happy with. I've convinced the main user to just make a local copy once a month and use that.
 

kevlray

Registered User.
Local time
Yesterday, 18:44
Joined
Apr 5, 2010
Messages
1,046
FYI: here at work we can use a wired connection and/or wireless. For the meetings I go to, I normally turn on my wireless (pretty decent bandwidth). The other day I was working with a fairly small DB and it was painful doing almost anything. Then I remembered I still had the wireless on and turned it off (Access was upset - it became obvious it had been using the wireless connection). When I opened the same database again, now on the wired connection, it was noticeably more responsive.
 
