Hello,
I am creating a database and application to automatically produce invoices for transportation contractors. It will analyze tables for specific metrics to dynamically generate bills based on mileage, time, in-service status, rates associated with each contractor, etc.
I have a SQL back end connected to an Access front end with linked tables.
The core of the dynamic data, the trip metric data (trip name, mileage, time...), should be updated daily and timestamped whenever changes are made. This way there is an audit trail of the trip data, showing all historical changes for each record. Currently the primary key in the trip table is tripName, but since each update will repeat that value on a new row, I will switch to a composite key of tripName and dateChanged.
For example:
tripName   mileage   duration   dateChanged
CR101A     1.0       61         9/25/2015
CR101A     0.9       55         9/22/2015
CR101B     1.0       67         9/25/2015
CR101A     0.9       62         9/17/2015
So, this table would originally start out with only one row per unique tripName identifier (with its timestamp), but as time progresses and changes are made to those trips, a new row would be added for each changed trip with a new timestamp.
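For reference, this is roughly the table definition I am picturing on the SQL side (table name, column names, and data types are placeholders); I could run it from Access as a pass-through query, or just create the table directly on the server:

Sub CreateTripTable()
    ' Sketch only: creates the trip table with the composite key
    ' (tripName, dateChanged). Names and data types are placeholders.
    Dim db As DAO.Database
    Dim qdf As DAO.QueryDef

    Set db = CurrentDb
    Set qdf = db.CreateQueryDef("")                              ' temporary, unsaved query
    qdf.Connect = "ODBC;DSN=TripBilling;Trusted_Connection=Yes;" ' placeholder ODBC connection string
    qdf.ReturnsRecords = False
    qdf.SQL = "CREATE TABLE dbo.tblTrip (" & _
              " tripName    VARCHAR(20)  NOT NULL," & _
              " mileage     DECIMAL(6,2) NOT NULL," & _
              " duration    INT          NOT NULL," & _
              " dateChanged DATETIME     NOT NULL," & _
              " CONSTRAINT PK_tblTrip PRIMARY KEY (tripName, dateChanged))"
    qdf.Execute dbFailOnError
End Sub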
The difficulty here is that 'changes' to the trip table will not happen via user input; I would like them to happen automatically via an update routine. There will be a trip table (an Excel file) on an FTP server (I'll start locally for now) which will hold the 'current' trip table on any given day. I want my program to fetch that table and update the trip table in my SQL database automatically by working out what has changed (either new trips or changed values in the columns of pre-existing trips) and adding a new row with a timestamp.
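For the fetch step, I am picturing something like this to pull the day's sheet into a local staging table before comparing it against the trip table (the file path and table name are placeholders; pulling the file down from the FTP server would come later):

Sub ImportDailyTripSheet()
    ' Sketch only: clears the staging table and imports today's Excel sheet into it.
    Const STAGING_TABLE As String = "tblTripStaging"              ' local Access staging table (placeholder name)
    Const TRIP_FILE As String = "C:\TripFeeds\CurrentTrips.xlsx"  ' placeholder path; later fetched from the FTP server

    CurrentDb.Execute "DELETE FROM " & STAGING_TABLE, dbFailOnError
    DoCmd.TransferSpreadsheet acImport, acSpreadsheetTypeExcel12Xml, _
                              STAGING_TABLE, TRIP_FILE, True      ' True = first row holds field names
End Sub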
What would be the best way to accomplish this? Can I fetch the external data from the server into a recordset, cycle through it, check the corresponding values in my local table, and create a new row whenever I notice a change or a new tripName?
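Roughly the kind of loop I am picturing (table and field names are placeholders; tblTripStaging would hold the imported sheet and dbo_tblTrip is the linked trip table):

Sub AppendChangedTrips()
    ' Sketch only: for each staged trip, compare against the latest stored row
    ' and append a new timestamped row when the trip is new or its values changed.
    Dim db As DAO.Database
    Dim rsNew As DAO.Recordset
    Dim rsOld As DAO.Recordset
    Dim changed As Boolean

    Set db = CurrentDb
    Set rsNew = db.OpenRecordset("SELECT tripName, mileage, duration FROM tblTripStaging")

    Do While Not rsNew.EOF
        ' Latest stored version of this trip, if any.
        Set rsOld = db.OpenRecordset( _
            "SELECT TOP 1 mileage, duration FROM dbo_tblTrip " & _
            "WHERE tripName = '" & rsNew!tripName & "' " & _
            "ORDER BY dateChanged DESC")

        If rsOld.EOF Then
            changed = True                                        ' brand-new tripName
        Else
            changed = (rsOld!mileage <> rsNew!mileage) Or (rsOld!duration <> rsNew!duration)
        End If

        If changed Then
            ' In real code a parameterised QueryDef would be safer than string concatenation.
            db.Execute "INSERT INTO dbo_tblTrip (tripName, mileage, duration, dateChanged) " & _
                       "VALUES ('" & rsNew!tripName & "', " & rsNew!mileage & ", " & _
                       rsNew!duration & ", Now())", dbFailOnError
        End If

        rsOld.Close
        rsNew.MoveNext
    Loop

    rsNew.Close
End Sub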
Let's say there are no more than 2,500 trip identifiers to start, with changes happening not _that_ often, so I do not believe scalability will be a huge problem here. I could also schedule this task to run overnight eventually.
Any thoughts?
Thanks,
K