Question: Comparing data across two databases

twistedsymphony

Here is my problem.

I've been tasked with migrating a system currently built on Access to SQL Server due to the application hitting Access' size limitations.

Upon testing the application on the development machine, I found that we are getting different results than on the production machine.

There is a slight difference somewhere in the database between the two systems.

There are over 200 tables and the databases are close to 2GB in size each... worse yet most of the tables are poorly designed and do not have primary keys.

I need a painless and effective way to compare the data between the two databases, table by table and record for record, and determine where the data differs.

------------------
I've tried a half dozen software packages, but they either crash, compare only the structure rather than the data, or can't compare data between tables that lack a primary key (which makes them useless to me).

Writing comparison queries one by one would take me months to get through the whole database.

Any suggestions for applications or scripts that can provide this kind of comparison functionality would be very useful.
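For what it's worth, one approach that doesn't depend on primary keys is to treat every table as a multiset of rows and compare the multisets from each side. Below is a rough Python sketch using pyodbc; the connection strings, driver names, and the assumption that both copies expose the same table names are placeholders you'd need to adjust, and pulling whole tables into memory may be slow on databases this size.

Code:
import pyodbc
from collections import Counter

# Example connection strings only -- adjust drivers, paths and credentials
# for whichever two copies of the data you need to compare.
ACCESS_CONN = r"DRIVER={Microsoft Access Driver (*.mdb, *.accdb)};DBQ=C:\data\schedule.mdb"
SQLSRV_CONN = "DRIVER={ODBC Driver 17 for SQL Server};SERVER=devbox;DATABASE=schedule;Trusted_Connection=yes"

def table_multiset(conn, table):
    # Treat the table as a multiset of rows so it can be compared
    # record for record even without a primary key.
    cur = conn.cursor()
    cur.execute("SELECT * FROM [" + table + "]")
    # Stringify values so minor type differences (Decimal vs float, etc.)
    # do not show up as false mismatches.
    return Counter(tuple("" if v is None else str(v) for v in row) for row in cur)

acc = pyodbc.connect(ACCESS_CONN)
sql = pyodbc.connect(SQLSRV_CONN)

# Table list taken from the Access side; assumes the same names exist on the other side.
tables = [r.table_name for r in acc.cursor().tables(tableType="TABLE")]

for t in tables:
    a, s = table_multiset(acc, t), table_multiset(sql, t)
    if a != s:
        print(t, sum((a - s).values()), "rows only in copy A,",
              sum((s - a).values()), "rows only in copy B")

If memory becomes a problem, the same idea works table by table with an ORDER BY over every column and a streaming comparison, or by hashing each normalised row instead of keeping it.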
 
There are over 200 tables and the databases are close to 2GB in size each... worse yet most of the tables are poorly designed and do not have primary keys.
I'd start by looking at why your db is so big. If it can be honed down to a manageable size, would that negate the need to transfer it?

I have a split db which uses about 130 tbls but is only about 30Mb. Some of the tbls have 100-plus flds and 10-20,000 records.

A few pts to consider:

Is the Access db split? Does it have a large number of memo flds? How many flds and records are in the tbls? If you go into tbl design and indexes, are there a stack of unwanted indexes created by the system?
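If it helps, those questions can be answered across a couple of hundred tables by walking the TableDefs from a script rather than opening each one in design view. A minimal sketch, assuming pywin32 is installed and using a placeholder path (older Access installs may need "DAO.DBEngine.36" instead of ".120"):

Code:
import win32com.client

DB_PATH = r"C:\data\schedule_be.mdb"   # placeholder back-end path

engine = win32com.client.Dispatch("DAO.DBEngine.120")
db = engine.OpenDatabase(DB_PATH)

for td in db.TableDefs:
    # Skip the hidden MSys* system tables.
    if td.Name.startswith("MSys"):
        continue
    print(td.Name, "fields:", td.Fields.Count,
          "records:", td.RecordCount, "indexes:", td.Indexes.Count)

db.Close()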

One thing that really blows out an Access db is the use of logos. One system I worked on had a single logo (a 180 KB .jpg file), but embedding it (rather than linking it) in Access blew the db out by 37 MB.

Have you tried Compact and Repair? Have you tried to Import all the objects into a new "blank" db?
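Compact and Repair can also be run unattended, which may matter here if the process is scripted. A rough sketch via the DAO engine, with placeholder paths (note that CompactDatabase writes to a new file; it does not compact in place):

Code:
import os
import win32com.client

SRC = r"C:\data\schedule.mdb"             # placeholder source path
DST = r"C:\data\schedule_compacted.mdb"   # destination must not already exist

engine = win32com.client.Dispatch("DAO.DBEngine.120")
engine.CompactDatabase(SRC, DST)

print("compacted size:", os.path.getsize(DST), "bytes")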
 
The database can't be slimmed down, nor can the tables be changed; it supports a 3rd-party application, and I have no control over how it manages the data.

The reason it's so massive is that it's for a scheduling program that manages worker time and tool usage down to the minute for hundreds of workers, thousands of tools, and tens of thousands of products, and performs forecasting operations months in advance. The reason we want to migrate to SQL Server is that they want to push the horizon out further, and Access simply can't handle that volume of data. The actual data is really very simple; there is just a massive amount of it.
 
To get you to a more manageable situation whilst a more permanent fix is completed, split your back end up into several back ends. The rule is to look at which tables change every day, every week, seldom, or not at all, and place them into their respective back ends so that the size of each is manageable. Then, in your front end, relink the tables from the associated back ends. The front end does not care what the name of the mdb is, as long as it exists and the tables exist in the mdb. Your queries / forms / reports / etc. in your front end will be oblivious to the changes, and you should also see better performance.
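The relinking after a split can also be scripted rather than done table by table in the linked table manager; a rough sketch, again assuming pywin32 and using hypothetical paths and table names:

Code:
import win32com.client

FRONT_END = r"C:\data\schedule_fe.mdb"    # placeholder front end
NEW_BACKEND = r"C:\data\daily_be.mdb"     # placeholder new back end
DAILY_TABLES = {"Schedule", "ToolUsage"}  # tables moved to the daily back end

engine = win32com.client.Dispatch("DAO.DBEngine.120")
db = engine.OpenDatabase(FRONT_END)

for td in db.TableDefs:
    # Linked tables carry a ";DATABASE=..." connect string.
    if td.Name in DAILY_TABLES and td.Connect:
        td.Connect = ";DATABASE=" + NEW_BACKEND
        td.RefreshLink()
        print("relinked", td.Name)

db.Close()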

David
 
I don't really understand how what you're proposing applies to my situation.

The tables get regenerated twice daily with new data, save a small handful of reference tables which rarely change and have already been ruled out.

There is no "front end" really since the process is completely scripted and runs unmanned.
 
