Dear all out here,
Hope you can provide me with some tips for the following issue I am currently facing.
I have an Access DB (2010) with 6 tables, each with around 1.2 million rows; the file size is 1.3 GB. The DB contains data that is also loaded (monthly) into another application, but we use the DB for regular research into exceptional things we need to understand.
Now the issue is that performance is terrible, even on our fast network and 64-bit / 8 GB memory machines. To give an indication: searching for a particular customer in one of the (imported, not linked) tables takes around 4-5 minutes.
As a solution, I am thinking of splitting the DB into at least 7 separate DBs, each based on a certain group of customers. The question is how to do this in the most efficient (least time-consuming) way.
Idea 1: per separate DB, import the big tables and filter each table down to the relevant customer group, working with a lot of delete/append queries every month.
Idea 2: some kind of VBA routine achieving the same, but doing more in the background; cleaner and maybe also quicker? (A rough sketch of what I have in mind follows below.)
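To make these ideas a bit more concrete, here is a rough sketch of the kind of VBA I have in mind; the table and column names (tblCustomers, CustomerGroup) and the target file path are just placeholders, not the real names in my DB. A SELECT ... INTO with an IN clause should write the filtered rows straight into a separate .accdb, so the filtering happens during the copy rather than after a full import:

Sub ExportCustomerGroup(strGroup As String, strTargetDb As String)
    ' Copy one big table into a per-group database, filtered on the fly.
    ' tblCustomers, CustomerGroup and the target path are placeholder names.
    Dim db As DAO.Database
    Dim strSQL As String

    Set db = CurrentDb

    ' SELECT ... INTO with an IN clause creates the table directly in the
    ' target .accdb, so there is no separate import-then-filter step.
    strSQL = "SELECT * INTO tblCustomers IN '" & strTargetDb & "' " & _
             "FROM tblCustomers " & _
             "WHERE CustomerGroup = '" & strGroup & "'"
    db.Execute strSQL, dbFailOnError

    Set db = Nothing
End Sub

' Example call (paths are made up):
' ExportCustomerGroup "GroupA", "C:\Data\Customers_GroupA.accdb"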
Questions:
+ can I import and filter at the same time, or is it always necessary to import (or link) the big 1.2m-row tables first and then filter out the group?
+ what should I think of the current performance? (By the way, a somewhat similar setup using an Oracle DB and dedicated SQL software presents the data in around 5 seconds on our network... that is the long-term solution for this DB.)
+ any other ideas?
Many thanks in advance!
Kind regards,
Marcel