You can do this with a series of queries lumped under a macro, plus a couple of manual operations.
First, before you write ANY queries, decide how you want to handle this problem. By this I mean: do you want to store your history data in another database? An external Excel file? Some other table in your main database? An external text file? Your choice. BUT ... whatever you choose, writing to it has to be something that a query or macro action can do, including VBA code triggered by a macro action. So this is wide open.
Once you have that question answered, the next question is how you will decide what to dump. For example: all data for parcels delivered over 1 month ago? 2 months? 2 weeks? Some other criterion? Whatever you choose, it has to be something that is testable in a query. Since a query can call VBA code in public functions, even here you have a world of choices to make. However, the result has to be clear-cut. To archive or NOT to archive, that is the question...
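For instance (just a sketch; the field name [DeliveredDate] and the one-month cutoff are stand-ins for whatever your real rule turns out to be), a public function in a standard module gives the query a single clear-cut True/False answer per record:

Public Function ShouldArchive(DeliveredDate As Variant) As Boolean
    ' Hypothetical rule: archive anything delivered more than one month ago.
    ' Nulls (not yet delivered) are never archived.
    If IsNull(DeliveredDate) Then
        ShouldArchive = False
    Else
        ShouldArchive = (DeliveredDate < DateAdd("m", -1, Date))
    End If
End Function

A query criterion of ShouldArchive([DeliveredDate]) = True then does the deciding for you, however hairy the rule behind it gets.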
Next, check your table relationships. Are there dependencies on data in the table to be archived such that referential integrity would gripe? Do other tables depend on your main table? I'm not talking about cases where your main table depends on other data, only cases where the "one" side of a one-to-many or one-to-one relationship resides in the main table. In each such case, you will have to provide a means to archive the corresponding data from the dependent table, too!
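If you want a quick inventory of those dependencies (a sketch only; "Parcels" is just the example main-table name used in this post), DAO will list them for you from the saved relationships:

Public Sub ListDependents()
    ' Walk the saved relationships; Table is the "one" side,
    ' ForeignTable is the dependent "many" side.
    Dim rel As DAO.Relation
    For Each rel In CurrentDb.Relations
        If rel.Table = "Parcels" Then
            Debug.Print rel.ForeignTable & " depends on Parcels"
        End If
    Next rel
End Sub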
I am going to suggest that you add a Yes/No field to each table from which this archive function must occur. Call it [ArchiveIt] or some similar name that suits you. This will allow you to build multiple queries to sweep your tables ahead of time based on the old "divide and conquer" method of problem solving.
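If you would rather add that flag with code than open each table in design view, a one-liner of DDL per table does it (a sketch; YESNO is the Access DDL type name for a Yes/No field):

Public Sub AddArchiveFlag(ByVal TableName As String)
    ' Adds the Yes/No archive flag; run once per table to be archived from.
    CurrentDb.Execute "ALTER TABLE [" & TableName & "] ADD COLUMN ArchiveIt YESNO;", dbFailOnError
End Sub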
OK, here are the queries you need to write.
For each table subject to archiving, you need an Update query to selectively set the [ArchiveIt] flag. So this means at least one such Update query for the main table. You also need queries that set the archive flags in dependent tables. They can take advantage of the relationship to the main table rather than applying more complex tests, i.e. "If my parent record is going to be archived, I should be archived, too!"
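To make that concrete (the names Parcels, ParcelDetails, ParcelID and the one-month rule are only illustrations), the SQL behind those two Update queries might look like this; paste it into a query's SQL view or run it from VBA, whichever you prefer:

Public Sub MarkForArchive()
    With CurrentDb
        ' Main table: your chosen rule, expressed as a WHERE clause.
        .Execute "UPDATE Parcels SET ArchiveIt = True " & _
                 "WHERE DeliveredDate < DateAdd('m', -1, Date());", dbFailOnError
        ' Dependent table: no separate test, just follow the relationship.
        .Execute "UPDATE ParcelDetails INNER JOIN Parcels " & _
                 "ON ParcelDetails.ParcelID = Parcels.ParcelID " & _
                 "SET ParcelDetails.ArchiveIt = True " & _
                 "WHERE Parcels.ArchiveIt = True;", dbFailOnError
    End With
End Sub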
Now write a set of Append queries, one for each table to be archived. Make these queries write the records to your chosen archiving destination or to a staging table. Personally, I prefer the staging method. Remember "divide and conquer"? The staging method allows me to re-run some parts by hand if there is a failure. Name each staging table after the table it stages: if the main table is called [Parcels], the archive version is [ArcParcels], and so on. One staging table and one Append query for each table to be archived.
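As a sketch (same example names as above; the ArcXxxx staging tables are assumed to be empty structure-copies of the originals), each Append query is just an INSERT...SELECT of the flagged rows:

Public Sub CopyToStaging()
    With CurrentDb
        ' One append per table to be archived.
        .Execute "INSERT INTO ArcParcels SELECT * FROM Parcels " & _
                 "WHERE ArchiveIt = True;", dbFailOnError
        .Execute "INSERT INTO ArcParcelDetails SELECT * FROM ParcelDetails " & _
                 "WHERE ArchiveIt = True;", dbFailOnError
    End With
End Sub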
Now write a set of Delete queries, one for each table that contains items marked for archiving. In your production tables, delete only the marked items. You need another set of Delete queries for your ArcXxxx staging tables; these, however, do NOT need to test for the archiving flag.
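In SQL terms (still the example names; note the dependent rows go first so referential integrity never gripes):

Public Sub DeleteArchivedFromProduction()
    With CurrentDb
        ' Production tables: only the flagged rows, dependents first.
        .Execute "DELETE FROM ParcelDetails WHERE ArchiveIt = True;", dbFailOnError
        .Execute "DELETE FROM Parcels WHERE ArchiveIt = True;", dbFailOnError
    End With
End Sub

Public Sub ClearStaging()
    With CurrentDb
        ' Staging tables: wipe everything, no flag test needed.
        .Execute "DELETE FROM ArcParcelDetails;", dbFailOnError
        .Execute "DELETE FROM ArcParcels;", dbFailOnError
    End With
End Sub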
You need something to implement the final archiving step using the staging tables as a data source. If you really want the data totally out of your database, then you need to decide whether you will store it in another database, a file, a spreadsheet, or something else, and then identify how you will accomplish that operation. This could be anything from an Append to an external Access table to running VBA code that writes to a specific file.
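One possibility, if the destination is another Access database (the path and file name here are purely hypothetical, and the target tables must already exist there with matching structure), is an append that uses the IN clause to point at the external file:

Public Sub PushToArchiveDb()
    With CurrentDb
        ' Append staged rows to the matching tables in the archive file.
        .Execute "INSERT INTO Parcels IN 'C:\Archive\ParcelHistory.accdb' " & _
                 "SELECT * FROM ArcParcels;", dbFailOnError
        .Execute "INSERT INTO ParcelDetails IN 'C:\Archive\ParcelHistory.accdb' " & _
                 "SELECT * FROM ArcParcelDetails;", dbFailOnError
    End With
End Sub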
Now the macro. I am going to assume you used the staging table approach. Put in anything you need in the way of shutting off system messages, echoing stuff on the status line, or whatever else you think you need, then run these steps (a VBA sketch of the same sequence appears after the list, if you ever prefer code to a macro)...
1. Erase the staging tables completely.
2. For the main table, run your query that marks records for archiving.
3. If any dependent tables exist, run the queries that mark records for archiving.
4. Run the Append queries that copy only the marked-for-archive items. Run them all in any order.
5. Run the step that implements the migration of the staging table data to your external location. Do it for all tables including dependent ones, if any.
6. Run the dependent table erasures, if any dependent tables exist.
7. Run the main table erasure.
8. Re-erase the staging tables.
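Here is that same sequence sketched as VBA (every query name below is a placeholder; substitute whatever you actually called your saved queries, and swap step 5 for a call to your own export code if you went the VBA route for the external copy):

Public Sub RunArchiveCycle()
    On Error GoTo Fail
    DoCmd.SetWarnings False                    ' shut off the system messages
    DoCmd.OpenQuery "qryClearArcParcelDetails" ' 1. empty the staging tables
    DoCmd.OpenQuery "qryClearArcParcels"
    DoCmd.OpenQuery "qryMarkParcels"           ' 2. flag main-table records
    DoCmd.OpenQuery "qryMarkParcelDetails"     ' 3. flag dependent records
    DoCmd.OpenQuery "qryAppendArcParcels"      ' 4. copy flagged rows to staging
    DoCmd.OpenQuery "qryAppendArcParcelDetails"
    DoCmd.OpenQuery "qryExportArchive"         ' 5. migrate staging data outward
    DoCmd.OpenQuery "qryDeleteParcelDetails"   ' 6. erase dependents first
    DoCmd.OpenQuery "qryDeleteParcels"         ' 7. then the main table
    DoCmd.OpenQuery "qryClearArcParcelDetails" ' 8. empty the staging tables again
    DoCmd.OpenQuery "qryClearArcParcels"
Fail:
    DoCmd.SetWarnings True                     ' always turn messages back on
    If Err.Number <> 0 Then MsgBox "Archive run failed: " & Err.Description
End Sub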
Before you run the macro, make a backup copy of your database. After you run the macro, a Compact & Repair is a VERY good idea. This minimizes the size of the database, which will positively affect your query speed. Also, if the whole shootin' match succeeds, delete the backup copy of the pre-compact database and make a new backup copy of the compacted database.
As you can tell from this rather lengthy procedure, you have SEVERAL opportunities to recover from failures at nearly any stage of this operation. To my way of thinking, ANY massive data operation such as this must be built with data reliability and recoverability in mind.