wheeledgoat
Registered User · Joined Feb 12, 2019 · Messages: 16
I've got a "visit tracking" db developed for our medical office. Each area has its own front-end interface that connects to the back end on the server, and the Front Desk, MOAs, RNs, MDs, and Pharmacists all use it to alert each other to where the patient is and where they need to go next.
We add somewhere between 100 and 200 visits a day. Initially I had it delete the records from the backend as visits were completed, but now there's interest in running reports on the data.
I read a few articles on backups/archiving and came away with the idea that until you get into the tens of thousands of records, performance won't be appreciably impacted.
Still, I'd love to hear your opinion on the best approach, since the frontend interfaces will only ever use that day's visits. Records more than a day old will only be used for reporting. Obviously, I'd like to keep it as snappy as possible.
1. Leave all records in the backend tables?
2. Archive & purge backend tables daily?
3. Archive after x number of records have accumulated?
Thanks!!
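For what it's worth, option 2 is just two statements: append the old rows to an archive table, then delete them from the live table. Here's a minimal sketch using Python's sqlite3 as a stand-in backend; the table and column names (Visits, VisitsArchive, visit_date) are made up for illustration, not from my actual schema:

```python
import sqlite3
from datetime import date, timedelta

# Stand-in backend: a live table and an archive table with the same shape.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Visits (id INTEGER, visit_date TEXT)")
conn.execute("CREATE TABLE VisitsArchive (id INTEGER, visit_date TEXT)")

today = date.today().isoformat()
yesterday = (date.today() - timedelta(days=1)).isoformat()
conn.executemany(
    "INSERT INTO Visits VALUES (?, ?)",
    [(1, yesterday), (2, yesterday), (3, today)],
)

# Archive & purge: move everything older than today into the archive.
# ISO-format dates compare correctly as strings.
conn.execute(
    "INSERT INTO VisitsArchive SELECT * FROM Visits WHERE visit_date < ?",
    (today,),
)
conn.execute("DELETE FROM Visits WHERE visit_date < ?", (today,))
conn.commit()

live = conn.execute("SELECT COUNT(*) FROM Visits").fetchone()[0]
archived = conn.execute("SELECT COUNT(*) FROM VisitsArchive").fetchone()[0]
print(live, archived)  # 1 2
```

The same idea would be an append query followed by a delete query in Access, run once a day; the frontends then only ever scan the small live table, and reports hit the archive.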