manually replicate actions of table analyser

jackjsmith88

New member
Hi all,

Whenever I've created databases, it's always been from scratch, so with no existing data to manipulate I can easily organise one that has yet to be populated.

But now I'm creating one that pulls reports out of SAP and imports them into Access, so that I can build various form-based features and reports with charts etc.

I imported this year's wad of data, which is laid out like so:

Call it RawData

Account Number
Account Name
ProjectID
ManualPriceAdjustment
BillingDocumentNumber
DeliveryNoteNumber
SalesDocumentNumber
CostPrice
NetPrice
Tax
MarginalPercentage
MarginNet

and a few more. The Table Analyser correctly works out which data needs to go in which table, but then it proposes a shedload of corrections and wants to overwrite all my data with weird replacements.

So what I want to do is split the data myself by creating one-to-many relationships across several tables, so that each table gets the expand button to display a subdatasheet with the corresponding info.

I started with SalesorderNumber.

So I wanted to create a table that auto-populates its field from the data in the other table, links the two together, and keeps auto-populating from whatever comes in via RawData.

I suspect I'm going about this the wrong way. I even toyed with the idea of an update query, but I don't really know what I'm doing there.

If anyone can give me a gentle nudge in the right direction, it would be greatly appreciated. :)

BR

Jack
 

CJ_London

Super Moderator
Staff member
You would use multiple append queries, one for each table. Depending on your requirements, you may also need update and delete queries.

Assuming you are appending to tables with an autonumber PK (i.e. one not coming from the import data), you would first append to the primary table(s). Subsequent appends then need to join the import data to this/these table(s) on some unique value(s) in the raw data in order to populate the FK in the secondary tables.

From your data, I would have thought you would want to start with account number as your primary table - one account can have many sales documents.

First append query - with DISTINCT so each account is appended only once, however many raw rows it appears on:

Code:
INSERT INTO tblAccounts (AccountNumber, AccountName)
SELECT DISTINCT AccountNumber, AccountName
FROM RawData

A second one might be:
Code:
INSERT INTO tblProjects (ProjectID, AccountFK)
SELECT DISTINCT RawData.ProjectID, tblAccounts.AccountPK
FROM RawData INNER JOIN tblAccounts ON RawData.AccountNumber = tblAccounts.AccountNumber
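
Following the same pattern, each subsequent append joins back to a parent table to pick up its autonumber PK. Purely as a sketch (tblSalesDocs and its field names are invented for illustration), a third query could be:
Code:
INSERT INTO tblSalesDocs (SalesDocumentNumber, AccountFK)
SELECT DISTINCT RawData.SalesDocumentNumber, tblAccounts.AccountPK
FROM RawData INNER JOIN tblAccounts ON RawData.AccountNumber = tblAccounts.AccountNumber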
 

isladogs

MVP / VIP
Agree with CJL.
If this is a one-off operation, just run the queries or SQL statements in turn on a copy of your database (in case of errors).
If it's going to be done repeatedly over time, create a procedure to run each query/SQL statement in turn with no user intervention needed - see the sketch below.
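
For example, a minimal VBA sketch, assuming the append queries have been saved under hypothetical names like qryAppendAccounts:
Code:
Public Sub RunImportQueries()
    'Run each saved append query in order: parent tables first, then children.
    'dbFailOnError raises a trappable error instead of silently dropping rows.
    Dim db As DAO.Database
    Set db = CurrentDb
    db.Execute "qryAppendAccounts", dbFailOnError
    db.Execute "qryAppendProjects", dbFailOnError
    db.Execute "qryAppendSalesDocs", dbFailOnError
End Sub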

You might find some useful tips in the Synchronise Data article on my website.
 

The_Doc_Man

Immoderate Moderator
Staff member
Having not worked with SAP before, I have to ask a dumb question. Can you get the schema from SAP for the data it tracks? That might go a long way towards helping you to analyze this wad of data. Reverse-engineering something from a report doesn't always help you clearly design things if there is more than one pathway leading from point A to point B.

If you are doing this on someone's request, you can perhaps use that someone to persuade the managers of SAP to provide the schema or to boost your permissions enough to give you read-only access to that level of data.
 

Pat Hartman

Super Moderator
Staff member
I import data from other applications frequently, and unless I need to keep it for some reason, I generally leave it as is. If you are worried about size due to all the duplicate data, that might be a reason to go to the effort of normalizing, but maintaining RI is irrelevant here, and since you will be replacing the data frequently, automation would be critical if you decide you want to normalize.

When importing data like this, I always use a template database that my main database links to. That way I don't have to worry about bloat. Each time I start a new import, I delete the current, data-filled template database, copy a fresh, empty one from the master folder, and import into that one.
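
The swap itself only takes a couple of lines of VBA - a rough sketch with made-up paths, assuming the template file isn't open at the time:
Code:
Public Sub RefreshImportTemplate()
    Const strMaster As String = "\\server\Masters\ImportTemplate.accdb"
    Const strLocal As String = "C:\Imports\ImportTemplate.accdb"
    'Discard the used, bloated copy and pull down a fresh empty template
    If Dir(strLocal) <> "" Then Kill strLocal
    FileCopy strMaster, strLocal
End Sub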
 
