Question Separating DB: the best way?

zeeforel

Registered User.
Local time
Today, 07:33
Joined
Aug 11, 2004
Messages
25
Dear all out here,

Hope you can provide me with some tips for the following issue I am currently facing.

I have an Access DB (2010) with 6 tables of around 1.2 million rows each; the file size is 1.3 GB. The DB contains data that is also loaded (monthly) into another application, but we use the DB for ad-hoc research into exceptional things we need to understand.
Now the issue is that performance is terrible, even on our fast network and on 64-bit machines with 8 GB of memory. To give an indication: searching for a particular customer in one of the (imported, not linked) tables takes around 4-5 minutes.

Now as a solution, I am thinking of splitting the DB into at least 7 separate DBs, each based on a certain group of customers. The question is how to do this in the most efficient (least time-consuming) way.

Idea 1: in each separate DB, import the big tables and filter each one down to the relevant customer group, working with a lot of delete/append queries every month.

Idea 2: some kind of VBA routine achieving the same, but doing more in the background; cleaner and maybe also quicker?

Questions:
+ Can I import and filter at the same time, or is it always necessary to import or link the big 1.2M-row tables first and then filter the group?
+ What to think of the current performance? (By the way, a somewhat similar setup using an Oracle DB and dedicated SQL software presents the data in around 5 seconds on our network... that is the long-term solution for this DB.)
+ Any other ideas?
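On the first question: an append or make-table query with a WHERE clause does the import and the filter in a single pass. A minimal sketch of the idea, using SQLite as a stand-in for Access (the table and column names Orders/CustomerGroup are invented for illustration):

```python
import sqlite3

# Sketch of "import and filter at the same time": one make-table query
# with a WHERE clause, instead of copying all rows and deleting afterwards.
# SQLite stands in for Access; Orders/CustomerGroup are invented names.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE Orders (OrderID INTEGER PRIMARY KEY, CustomerGroup TEXT)")
db.executemany("INSERT INTO Orders VALUES (?, ?)",
               [(i, "B" if i % 7 == 0 else "A") for i in range(1, 1001)])

# The filter is applied while the rows are copied -- no separate delete step.
db.execute("CREATE TABLE GroupB_Orders AS "
           "SELECT * FROM Orders WHERE CustomerGroup = 'B'")
rows = db.execute("SELECT COUNT(*) FROM GroupB_Orders").fetchone()[0]
print(rows)
```

In Access the equivalent would be a make-table or append query whose source is the linked big table and whose criteria row filters the customer group, so the full 1.2M rows never land in the per-group DB.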

Many thanks in advance!

Kind regards,
Marcel
 

CJ_London

Super Moderator
Staff member
Local time
Today, 06:33
Joined
Feb 19, 2013
Messages
16,638
Splitting tables as you suggest is not a good idea - it sounds to me like you do not have your tables properly indexed, so I suggest you investigate this first.

Also, it's not clear how you are using the db - it sounds like you have one db (no front end/back end) which everyone is trying to use at the same time?

Also, how fast is your fast network? The individual user machine specs you quote will have little effect on the performance of the db, assuming it is located on the network.
 

zeeforel

Registered User.
Local time
Today, 07:33
Joined
Aug 11, 2004
Messages
25
Dear CJ_London,

First of all thanks very much for this quick reaction. To answer the questions / respond:

1. Why is splitting not a good idea? And how do I make sure the individual tables (no queries across tables involved!) are properly indexed? All tables have a primary key, being some unique customer id or order id etc.

2. It does not happen that more than one person is using the DB.

3. I think the network is fast (big organisation), but maybe I should test (but how?). The DB is indeed stored on a shared drive.

Thanks.
 

spikepl

Eledittingent Beliped
Local time
Today, 07:33
Joined
Nov 3, 2010
Messages
6,142
Google database indexes.

Or think of the phonebook - it is in fact indexed. If all the names were in a random sequence, you'd have to scan, on average, half the phonebook to get the name and number you are looking for. But that is not necessary, because the thing is indexed. The same applies to the DB fields of interest!
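The phonebook analogy can be made concrete: a sorted list stands in for an indexed column, and a binary search over it needs only about log2(n) comparisons instead of a scan of half the entries. A small sketch (the name values are made up):

```python
import bisect

# The phonebook analogy: a sorted list stands in for an indexed column.
names = [f"name{i:07d}" for i in range(1_000_000)]  # already in sorted order
target = "name0654321"

def linear_find(seq, value):
    # Unindexed lookup: examines half the entries on average.
    for i, v in enumerate(seq):
        if v == value:
            return i
    return -1

def indexed_find(seq, value):
    # Indexed lookup: binary search over sorted data (~20 steps for 1M rows).
    i = bisect.bisect_left(seq, value)
    return i if i < len(seq) and seq[i] == value else -1

print(linear_find(names, target), indexed_find(names, target))
```

For a million rows that is roughly 500,000 comparisons versus about 20, which is the same order of difference as the 5 minutes versus seconds seen in the thread.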
 

zeeforel

Registered User.
Local time
Today, 07:33
Joined
Aug 11, 2004
Messages
25
spikepl, you got me there! Maybe that is a very good idea. Will come back on this. Thanks.
 

CJ_London

Super Moderator
Staff member
Local time
Today, 06:33
Joined
Feb 19, 2013
Messages
16,638
How do I make sure the individual table (no queries across tables involved!) are properly indexed?
If your users typically search for a customer based on their name/address/phone number/account number, then the relevant fields need to be indexed.
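To see the effect an index has on how a query is executed, here is a sketch using SQLite as a stand-in for Access/JET (the Customers table and CustomerName field are invented names):

```python
import sqlite3

# Sketch of what "properly indexed" means, using SQLite as a stand-in
# for Access. Table and field names (Customers, CustomerName) are made up.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE Customers "
            "(CustomerID INTEGER PRIMARY KEY, CustomerName TEXT)")
con.executemany("INSERT INTO Customers VALUES (?, ?)",
                [(i, f"Customer {i}") for i in range(10_000)])

def plan(sql):
    # Ask the engine how it intends to execute the query.
    return " ".join(row[-1] for row in con.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT * FROM Customers WHERE CustomerName = 'Customer 4242'"
before = plan(query)   # without an index: a full table scan

con.execute("CREATE INDEX idxCustomerName ON Customers (CustomerName)")
after = plan(query)    # with the index: a direct search via idxCustomerName

print(before)
print(after)
```

Access accepts similar DDL (a query in SQL view running `CREATE INDEX idxCustomerName ON Customers (CustomerName);`), or you can simply set the field's Indexed property to Yes in table design view.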
 

rangersedge

Registered User.
Local time
Today, 00:33
Joined
Jun 13, 2014
Messages
82
If your DB is that large then it wouldn't hurt to separate the tables and then have one "linked" DB that reads from the tables. I have even been experimenting with using MySQL as the back end and using Access to link to the data. My Access DB doesn't store any info and is made up of just linked tables and forms. This cuts the load time by a ton, since MySQL has no practical size limit and is free.
 

zeeforel

Registered User.
Local time
Today, 07:33
Joined
Aug 11, 2004
Messages
25
Just to come back on the issue: the solution of indexing the most-searched-on field was perfect. Search time improved from 5 minutes to 15 seconds. Incredible!! Thanks again.
 
