Access and PivotTables

jazztie

I'm currently using Excel and PivotTables to extract and analyze vast amounts of data (up to 25,000 records with many variables). The problem now is that the size of the Excel file is starting to exceed its limit (130 MB).

So, I would like to use Access instead (is this a good idea?). So, my question is:

- How do I use PivotTables in Access?

Any help is greatly appreciated!

Jazz
 
If you are considering using Access instead of Excel, you'll need to realize that Access is a database, not an improved spreadsheet.

The first step is to map out your business process(es) and create the necessary tables to hold your data (search for 'normalization' on this forum).
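For instance, a flat spreadsheet that repeats customer details on every row might be split into two related tables. A minimal sketch in Access (Jet) SQL, with purely hypothetical table and field names (each statement would be run separately in Access's SQL view):

    CREATE TABLE Customers (
        CustomerID COUNTER PRIMARY KEY,
        CustomerName TEXT(50)
    );

    CREATE TABLE Orders (
        OrderID COUNTER PRIMARY KEY,
        CustomerID LONG,
        OrderDate DATETIME,
        Amount CURRENCY,
        CONSTRAINT fkCustomer FOREIGN KEY (CustomerID)
            REFERENCES Customers (CustomerID)
    );

Customer details are then stored once, and each order row just points at them.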

Once you have set up all your tables, relationships, queries, forms, etc., there is a feature in Access that can analyze your data through an Excel spreadsheet (Tools > Office Links > Analyze with Excel).

If you need help configuring your database, do a search on this forum for 'design database' or something similar.

Good luck.
 
So, I need to normalize the data (I believe most of the data in the Excel sheets is already normalized) and call on the 'Analyze with Excel' function in Access.

So, basically I'm calling on the PivotTables in Excel to analyze my data? Wouldn't that give me the same file-size problems as my current solution, since all the data will still be analyzed and reported in Excel?

sincerely,
Jazz
 
Normalizing data in Access can't be compared with normalizing data in your Excel spreadsheets. As stated before, you need to set up tables, relationships, queries, etc. to enter data into your database.

If you output the data through database queries into the Excel PivotTable, you can refine your analysis, since the query holds only the data to be analyzed. The size of this data will be substantially smaller than that of the original spreadsheet.
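As a rough illustration (table and field names made up), an aggregating query like this collapses thousands of raw rows into one summary row per product and month before anything reaches Excel:

    SELECT Product, OrderMonth,
           Sum(Quantity) AS TotalQty,
           Sum(Amount) AS TotalAmount
    FROM tblOrders
    GROUP BY Product, OrderMonth;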

Btw, take a look at this thread, which explains a good part of normalization:
http://www.access-programmers.co.uk/forums/showthread.php?t=62403
 
Thanks for the information, Rak!

After googling Normalization, I now realise what it is. I've built complex databases before, so that wouldn't be a big problem. (Just a challenge...)

----
I've lost my little pils...
 
My pleasure Jazztie.

Jazz and pils go hand in hand....? :D
 
I believe some of your analysis can be handled by what are called CrossTab queries. This is a separate help topic, so you should be able to look it up and decide whether it fits your situation. There is also a CrossTab Wizard among the many Query Wizards, so they are pretty easy to generate.
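In SQL terms, an Access crosstab is the TRANSFORM ... PIVOT construct. A minimal sketch, assuming a hypothetical Sales table with Region, SaleMonth, and Amount fields:

    TRANSFORM Sum(Amount) AS TotalAmount
    SELECT Region
    FROM Sales
    GROUP BY Region
    PIVOT SaleMonth;

This gives one row per region and one column per month, much like a PivotTable layout.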

Also, don't forget that in Access, you can do MULTI-STAGE data reduction, because you can run a query against another query. (Obviously, don't do this through too many layers or things start to get a bit sluggish.) But it is possible to summarize things one way, then summarize the summary in a different way, by designing layered queries.
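For example (query and field names hypothetical), a first query saved as qryMonthlyTotals could summarize raw orders by region and month:

    SELECT Region, OrderMonth, Sum(Amount) AS MonthTotal
    FROM tblOrders
    GROUP BY Region, OrderMonth;

A second query can then treat that saved query as if it were a table and summarize the summary a different way:

    SELECT Region, Avg(MonthTotal) AS AvgMonthly
    FROM qryMonthlyTotals
    GROUP BY Region;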

For what it is worth, I do performance analysis (graphically, sometimes; sometimes not) based on files created by a large-scale computer. That is, I analyze the computer's performance as a System Admin function. When I look at a month of data, it is not uncommon for me to muck about with 90,000 raw records. Access doesn't seem to mind this on a GHz-class machine. I once had to do a year of software license usage analysis - about 300,000 records - to show that a particular new program had caused a strain on our resources. Try doing 0.3 Megarecords in Excel! Oh, I could have done the analysis on the mainframe - but it doesn't have a graphic output device suitable for performance plots. The graphs made all the difference in the study we were performing.
 
