To archive, or not?

wheeledgoat

Registered User.
Local time
Today, 13:50
Joined
Feb 12, 2019
Messages
16
I've got a "visit tracking" db developed for our medical office. Each area has its own front-end interface that connects to the back end on the server, and the Front Desk, MOAs, RNs, MDs, and Pharmacists all use it to alert each other to where the patient is and where they need to go.

We add somewhere between 100 and 200 visits a day. Initially I was having it delete the records off the backend as the visits were completed, but there's interest in running reports on the data.

I read a few articles on backups/archiving and came away with the idea that until you get into the tens of thousands of records, performance won't be appreciably impacted.

Still, I'd love to hear your opinion on the best approach, since the frontend interfaces will only ever use that day's visits. Records >1 day old will be used only to run reports on. Obviously, would like to keep it as snappy as possible.

1. Leave all records in the backend tables?
2. Archive & purge backend tables daily?
3. Archive after x number of records have accumulated?
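To make option 1 concrete, here's a rough sketch of the "leave everything in place and filter" approach. This uses Python's sqlite3 as a stand-in for the Access back end, and the table and field names (`visits`, `visit_date`) are made up for illustration:

```python
import sqlite3
from datetime import date

# Illustrative sketch only: sqlite3 stands in for the Access back end,
# and the table/field names (visits, visit_date) are hypothetical.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE visits (id INTEGER PRIMARY KEY, visit_date TEXT, patient TEXT)")
con.execute("CREATE INDEX idx_visit_date ON visits (visit_date)")

today = date.today().isoformat()
con.executemany(
    "INSERT INTO visits (visit_date, patient) VALUES (?, ?)",
    [(today, "A"), (today, "B"), ("2019-01-01", "old")],
)

# Option 1: keep everything and let the front end filter on the indexed date,
# so old rows never reach the day-to-day screens.
todays_visits = con.execute(
    "SELECT patient FROM visits WHERE visit_date = ?", (today,)
).fetchall()
print(len(todays_visits))  # 2
```

With the date field indexed, the front end only ever touches today's rows regardless of how many historical rows accumulate underneath.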

Thanks!!
 

theDBguy

I’m here to help
Staff member
Local time
Today, 13:50
Joined
Oct 29, 2018
Messages
21,454
Hi. Assuming you're using Access, it can handle millions of records. As for speed, it depends on how you're manipulating those records. My vote is to leave the data in the BE.
 

June7

AWF VIP
Local time
Today, 12:50
Joined
Mar 9, 2014
Messages
5,466
Vote for 1. Unless the db exceeds Access's 2GB limit. In which case, consider migrating to a larger-capacity db like MySQL or SQL Server Express. Both are free and I think both have a 10GB limit.
 

Micron

AWF VIP
Local time
Today, 16:50
Joined
Oct 20, 2018
Messages
3,478
Speed can be affected by a whole raft of variables: PC processor, concurrent processes running on it, network architecture, server capability, server load, indexing of tables, efficiency of queries, excessive calculations in queries, etc. etc. Keep records until things get slow or space becomes an issue. Slowness will be annoying but things will still work; running out of space suddenly could prove more damaging. Suggest you keep an eye on db size and compact/repair only when it gets large enough that performance degrades, AND only after backing up first. Have the db send you an alert when it gets too large if you'd rather c/r manually.
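A size check like the one suggested above could be sketched as follows. This is a hypothetical helper, not anything built into Access; the path and the 1.5 GB threshold are assumptions (Access .accdb files are capped at 2 GB):

```python
import os

# Hypothetical helper: warn when the back-end file approaches a size
# threshold, well before the 2 GB Access file-size cap. The default
# threshold of 1.5 GB is an assumption, not a recommendation.
WARN_BYTES = int(1.5 * 1024**3)

def backend_needs_attention(path: str, warn_bytes: int = WARN_BYTES) -> bool:
    """Return True when the back-end file has grown past the warning size."""
    return os.path.getsize(path) >= warn_bytes
```

Something like this could run at front-end startup and pop a reminder (or send an email) when it returns True, prompting a backup and a compact/repair.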
 

isladogs

MVP / VIP
Local time
Today, 21:50
Joined
Jan 14, 2017
Messages
18,209
Agree with previous replies.
Keep the records and filter queries as appropriate to select the data you need.
You could also use a Boolean field called Active which is true by default but can be set to false to deactivate records.
However you do it, it is far better to have data readily available when needed rather than restoring from old backups.
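The Active-flag idea above could look something like this, again using sqlite3 as a stand-in for the Access back end (the table and field names are assumptions):

```python
import sqlite3

# Sketch of a soft-delete "Active" flag: rows are deactivated, never
# deleted, so reports can still reach them. All names are illustrative.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE visits (
    id INTEGER PRIMARY KEY,
    patient TEXT,
    active INTEGER NOT NULL DEFAULT 1)""")  # true by default

con.executemany("INSERT INTO visits (patient) VALUES (?)", [("A",), ("B",)])

# Front-end queries only ever look at active rows...
active_rows = con.execute("SELECT patient FROM visits WHERE active = 1").fetchall()

# ...and end-of-day processing deactivates rather than deletes.
con.execute("UPDATE visits SET active = 0 WHERE patient = ?", ("A",))
still_active = con.execute("SELECT patient FROM visits WHERE active = 1").fetchall()
```

Reports then simply drop the `active = 1` filter, so nothing ever has to be restored from a backup.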
 

wheeledgoat

Registered User.
Local time
Today, 13:50
Joined
Feb 12, 2019
Messages
16
Sure, understood. I was fishing for a reply along the lines of "in my experience, things can start to lag around x number of records for a setup like that."

Appreciate your responses though! Have a great one.
 

Mark_

Longboard on the internet
Local time
Today, 13:50
Joined
Sep 12, 2017
Messages
2,111
It is very difficult to say "After X records" because each back end is running on different hardware with different data structures in the DB being accessed by a highly variable number of users.

For myself, since you're dealing with 200 records a day, that's only 73,000 in a year. Ten years from now I hope you're over a million. By that time you'll probably have upgraded to Access 2024 and realized they've expanded what Access can do.

For now, you really don't need to worry about archiving unless you notice something, as all of the others have advised.
 

isladogs

MVP / VIP
Local time
Today, 21:50
Joined
Jan 14, 2017
Messages
18,209
If you have a large number of records and no indexes, searches will take longer.
If you have millions of records there will be a noticeable delay.
A good guide is when there is a delay showing the total number of records in a table.

But you can significantly speed up searches by indexing frequently searched fields. If that is done, millions of records will be searched as fast as a small number of records.

There is a downside to indexing: update queries take longer, as the indexes also need to be updated.
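The effect described above can be demonstrated with a quick sketch. sqlite3 stands in for the Access back end here, and the table and field names are made up; the same principle (index turns a full-table scan into a seek) applies in Access:

```python
import sqlite3
import time

# Rough illustration: an index on a frequently searched field turns a
# full-table scan into a seek. Names below are hypothetical.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE visits (id INTEGER PRIMARY KEY, mrn TEXT)")
con.executemany("INSERT INTO visits (mrn) VALUES (?)",
                ((f"MRN{i:07d}",) for i in range(200_000)))

def lookup():
    return con.execute(
        "SELECT id FROM visits WHERE mrn = ?", ("MRN0199999",)
    ).fetchone()

t0 = time.perf_counter(); unindexed = lookup(); scan_s = time.perf_counter() - t0

# The downside: every insert/update must now maintain this index too.
con.execute("CREATE INDEX idx_mrn ON visits (mrn)")

t0 = time.perf_counter(); indexed = lookup(); seek_s = time.perf_counter() - t0
assert unindexed == indexed  # same row, found much faster the second time
```

On most machines the indexed lookup is orders of magnitude faster than the scan, even at only 200,000 rows.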
 

The_Doc_Man

Immoderate Moderator
Staff member
Local time
Today, 15:50
Joined
Feb 28, 2001
Messages
27,146
When I worked in the Navy Enterprise Data Center, we had this same question. We didn't leave it to chance. We asked the bosses who were asking for these reports. This is what we asked them:

"How far back do you want these reports to go, understanding that there is a cost of keeping records for a longer time. The cost is bigger files that are slower to respond and require more maintenance. What do you want us to do?"

Their answer led us to a policy decision that we would retain records for not less than six months, but not from time of issuance. Instead, we archived based on action closure. Once a particular action had been completed for six months plus the time until the next "monthly report cycle" it was subject to archiving.

We ended up with several hundred thousand records (1200-1500 computers x 30-40 actions every two weeks) retained for > 6 months - and Access handled it in native mode with no major issues. Our archiving involved a Yes/No field in the action history table that allowed us to mark when a record was eligible for archiving and at the next sweep, we would first export the marked records in a specifically formatted report as an external file. After that, we would delete the records. And of course, we did this at a time when we could run a compact & repair immediately after that deletion step.
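The mark / export / delete / compact sweep described above could be sketched like this. sqlite3 stands in for Access, `VACUUM` stands in for compact & repair, and every table, field, and file name here is a made-up placeholder:

```python
import csv
import sqlite3

# Sketch of a mark-then-sweep archive, with sqlite3 standing in for
# Access and VACUUM for compact & repair. All names are hypothetical.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE action_history (
    id INTEGER PRIMARY KEY, detail TEXT, archive_me INTEGER DEFAULT 0)""")
con.executemany(
    "INSERT INTO action_history (detail, archive_me) VALUES (?, ?)",
    [("closed > 6 months", 1), ("still active", 0)])

def archive_sweep(con, out_path):
    # 1. Export the rows previously marked eligible to an external file...
    rows = con.execute(
        "SELECT id, detail FROM action_history WHERE archive_me = 1").fetchall()
    with open(out_path, "w", newline="") as f:
        csv.writer(f).writerows(rows)
    # 2. ...then delete them, commit, and compact immediately afterwards.
    con.execute("DELETE FROM action_history WHERE archive_me = 1")
    con.commit()
    con.execute("VACUUM")  # stand-in for Access compact & repair
    return len(rows)
```

The key design point is the ordering: export first, delete second, compact last, so a failed export never costs you data.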

The point is, you choose to either archive or discard for a reason. You choose WHEN to do either, also for a reason. You choose the nature of that archive for a reason. Often those reasons come from your boss or higher. So ask the right questions and THEN decide what is required.
 

Cronk

Registered User.
Local time
Tomorrow, 06:50
Joined
Jul 4, 2013
Messages
2,771
"In my experience", I've had one system where I archived records, not because of the number of records but because of the size of the back end. (I did employ indexes on every field that required record finding or sorting.)

It was never expected (nor did it happen in many years) that archived data would be included in a report. However, there was a possibility that archived data might need to be searched for a specific record. Accordingly, I included functionality so that a user could change their FE links to an archived BE.

My suggestion is not to archive unless you absolutely need to, and even then look at other options (SQL Server), which was not an option in my cited case.
 

gemma-the-husky

Super Moderator
Staff member
Local time
Today, 21:50
Joined
Sep 12, 2006
Messages
15,641
If you have queries/filters that can only be resolved by checking every row in the appointments table, and you can't find a way to index the approach you are taking, then the search/filter will take longer as the number of records grows, since every record will need to be checked.

Otherwise, the indexing should help Access find the most efficient way to retrieve your data. Sometimes it doesn't do that the "right" way, and you may have to try building the query another way. You could investigate "query plans" for more information.
 

shadow9449

Registered User.
Local time
Today, 16:50
Joined
Mar 5, 2004
Messages
1,037
Along the same lines as what others have said, my experience is that the performance will depend much more on good table structure (proper normalization and indexing) and efficient retrieval of records (good form design and efficient queries) than the number of records. Obviously this rule of thumb has its limitations but you may be surprised at how well Access performs with a lot of data if you've got these covered.

There are many threads on the forum about these topics.
 
