Call Center data

AngeloWolfe

New member
Joined
Jun 22, 2011
Messages
6
Greetings everyone! Have some interesting food for thought and wanted to see if you all had some suggestions to this problem.

I have a database I'm having to rebuild that is going to house last year's, this year's, and ongoing years of data from recorded phone time. When filtered down month-to-month and by the required PhoneIDs, we are looking at around 50k-300k records of data at a time. Needless to say, the amount of data is greater than what an Access 2007 database can host (it caps out at 2 GB).

This data is being pulled down from a SQL server, so I first assumed pass-through queries or parameters would solve the issue; but it can never be that simple! It seems that the data does not STAY on the server for very long, which means I have to house this immense amount of data somehow and query it.

So, I'm wondering: Would it be applicable to take the data and house it in multiple Excel 2007 worksheets, and link the said workbook to the database? Or, is there a much simpler way to house this data that I'm overlooking?

Thank you SO MUCH in advance! :cool:
 
My second suggestion is that if you need to use multiple files to house data, use multiple Access databases: create a master database which links to each of those source databases, then create a union query to bring all the data together in one object. Since the master database just links to the other databases, their data isn't counted against its 2 GB limit.
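As a rough sketch of that union query (the table names are made up for illustration), assuming the master database has linked tables pointing at one archive .accdb per year:

```sql
-- tblCalls2010 and tblCalls2011 are assumed to be linked tables,
-- each pointing at a table in a separate archive database file.
SELECT PhoneID, CallStart, DurationSec FROM tblCalls2010
UNION ALL
SELECT PhoneID, CallStart, DurationSec FROM tblCalls2011;
```

UNION ALL (rather than plain UNION) skips the duplicate-elimination step, which matters at 300k rows per slice.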

However, my first suggestion would be to make sure the database has the proper structure. I've fixed a lot of databases that were over 1.5 GB when I first met them and ended up half that size once they were properly structured.

I don't mean just proper normalization, although that counts for a lot. I mean selecting the proper field type for each field. Don't use a Double where a Small Int will do. If a text field comes down with a maximum of 15 characters, limit it in Access to 15 characters, not 255. Don't store redundant or calculated data, like age when you already have a date of birth. The key is to look at every field and ask whether it's necessary and, if so, what the minimum data type is that will accommodate all the values it will hold.
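To make that concrete, here's a hypothetical call-log table in Access DDL with right-sized types (all names and lengths are illustrative, not from the original post):

```sql
-- Each column uses the smallest type that fits the real data.
CREATE TABLE tblCallLog (
    CallID      AUTOINCREMENT PRIMARY KEY,
    PhoneID     LONG,          -- integer key, not a 255-char text field
    CallStart   DATETIME,
    DurationSec LONG           -- whole seconds; no Double needed
    -- age deliberately omitted: derivable from a date of birth elsewhere
);
```

Multiplied across hundreds of thousands of rows, trimming each field this way is where those "half the size" savings come from.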
 
My first option would be to see if the data can stay on SQL Server. That gives you the most powerful tools for querying large amounts of data. If they are archiving data out of a production database, perhaps they can first copy it to an archive database that can serve as your source.
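That copy step might look something like the following T-SQL sketch (the database, table, and column names are assumptions):

```sql
-- Copies one year's rows from production into an archive database on
-- the same server, which Access can then link to or query directly.
INSERT INTO ArchiveDB.dbo.CallLog (PhoneID, CallStart, DurationSec)
SELECT PhoneID, CallStart, DurationSec
FROM   ProdDB.dbo.CallLog
WHERE  CallStart >= '20100101' AND CallStart < '20110101';
```

The archive database then holds the history even after production purges it, so nothing has to be crammed into the 2 GB Access file.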
 
"Would it be applicable to take the data and house it in multiple Excel 2007 worksheets, and link the said workbook to the database?"

If you were going to go to that much trouble, I don't see why you would not just house the data in multiple archival Access databases. However, I agree with the others that for optimum performance you need to examine normalization and data-size tailoring closely. If you can find a way to make a SQL Server or MySQL database do what you want and retain the amount of data you need, that would also be good.

Another factor that is almost mandatory when working with large historical data sets as you describe is to determine how often you will need to refer to the older data - and how often you will need to refer to that data at the same time as you need newer data too. That will help define the point of diminishing returns for keeping data fully on-line and quickly available.
 
