If you learn a little about Excel Application Objects and File System Objects, you can do what I did. Here is the overview.
You give people the empty spreadsheet with the right column headings and other niceties such as alternate row coloration (or alternate column coloration as desired). If you want a "belt and suspenders" approach, you can also rename the desired worksheet from Sheet1 to something else like "Daily" or some other prosaic name.
For each office, you tell them the naming convention you would like - for instance, WesternEasthaven_02222017.xlsx - assuming only one file per day.
Then you build a form with some code under it. Build a button on that form to launch a "gathering" sequence.
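A minimal sketch of what sits under that button, assuming a routine named GatherBranchFiles and a button named cmdGather (both names, and the folder path, are just placeholders):

    Private Sub cmdGather_Click()
        ' Launch the gathering sequence against the dedicated inbound folder
        GatherBranchFiles "C:\BranchImports\"
    End Sub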
The sequence starts by using a File System Object to scan a dedicated folder for candidate spreadsheets of a fixed file type (.xlsx is good enough). FSO has no built-in wildcard search, so you either walk the folder's Files collection and filter on the extension, or pair it with VBA's Dir function for a true wildcard match. The File System Object is detailed on this forum in many places.
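Here is a rough sketch of that scan, again with made-up names (GatherBranchFiles, ProcessWorkbook) - ProcessWorkbook stands in for the steps described below:

    Private Sub GatherBranchFiles(ByVal folderPath As String)
        Dim fso As Object, fil As Object
        Set fso = CreateObject("Scripting.FileSystemObject")
        ' Walk the dedicated folder and hand every .xlsx candidate to the processing code
        For Each fil In fso.GetFolder(folderPath).Files
            If LCase(fso.GetExtensionName(fil.Name)) = "xlsx" Then
                ProcessWorkbook fil.Path
            End If
        Next fil
    End Sub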
If you find a file, the name itself can be significant, since you asked people to use a specific name format. Use the File System Object to pick apart the file specification and get the bare file name, then use the Split function (with the underscore as a delimiter, perhaps) to separate the branch name from the date string, and the Mid function to pull month, day, and year out of that string. You can then check your records to see whether this is a double-dip or a new file; DLookup or DCount can tell you that if you keep automated records of receiving each file. (See later for the other half of this idea.)
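A sketch of that parsing and the duplicate check, assuming filePath holds the full path handed over by the scan, and a log table named tblImportLog with Branch and ReportDate fields (all of those names are illustrative):

    Dim fso As Object, parts() As String
    Dim branch As String, dateText As String, rptDate As Date

    Set fso = CreateObject("Scripting.FileSystemObject")
    parts = Split(fso.GetBaseName(filePath), "_")   ' "WesternEasthaven_02222017"
    branch = parts(0)                               ' "WesternEasthaven"
    dateText = parts(1)                             ' "02222017" (MMDDYYYY)
    rptDate = DateSerial(Mid(dateText, 5, 4), Mid(dateText, 1, 2), Mid(dateText, 3, 2))

    If DCount("*", "tblImportLog", "Branch = '" & branch & "' AND ReportDate = #" & _
              Format(rptDate, "mm\/dd\/yyyy") & "#") > 0 Then
        ' Already received this branch/date - a double-dip, so skip or flag it
    End If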
For each file your sequencer finds, open an Excel Application Object. There are many articles on this forum showing how. Open the workbook (file). Activate the worksheet. You can then look at the collection of Row objects: ActiveSheet.Rows(1) is the header row, and each column in that row is a member of its .Cells collection. So you can write code to verify that each cell in Rows(1) contains the valid column header name - and you can test that the next couple of cells beyond the last expected header are empty. You can also give your template a worksheet name other than Sheet1 and test for that, too.
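A sketch of the open-and-verify step, using late binding; the worksheet name "Daily" echoes the template idea above, and the expected header names are only examples:

    Dim xl As Object, wb As Object, ws As Object
    Dim expected As Variant, i As Long, headersOK As Boolean

    expected = Array("Branch", "TxnDate", "Amount")   ' whatever your template really uses
    Set xl = CreateObject("Excel.Application")
    Set wb = xl.Workbooks.Open(filePath)
    Set ws = wb.Worksheets("Daily")                   ' errors out if the office renamed the sheet

    headersOK = True
    For i = 0 To UBound(expected)
        If ws.Rows(1).Cells(1, i + 1).Value <> expected(i) Then headersOK = False
    Next i
    ' The cell just past the last expected header should be empty
    If Len(ws.Rows(1).Cells(1, UBound(expected) + 2).Value & "") > 0 Then headersOK = False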
If the workbook passes the test, you have a couple of options from here.
First and simplest is to close the Excel App object without saving anything and import the spreadsheet.
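In Access, the one-call way to do that import is DoCmd.TransferSpreadsheet; a sketch, with the staging table name as a placeholder:

    ' Release the workbook first so the file is not locked
    wb.Close SaveChanges:=False
    xl.Quit
    Set ws = Nothing: Set wb = Nothing: Set xl = Nothing

    ' Import the sheet wholesale; True = first row holds field names
    DoCmd.TransferSpreadsheet acImport, acSpreadsheetTypeExcel12Xml, _
        "tblDailyStaging", filePath, True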
The other option, since the Excel file is already open, is to open a recordset and extract the data one field at a time in a loop, storing complete records as you go until you hit a blank row or whatever other marker you use to indicate "end of data." Done this way, you can pre-check each row before you store it and thus assure proper formatting and content.
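If you go that second route (keeping the workbook open), here is a sketch using a DAO recordset; the table and field names are again just examples:

    Dim rs As DAO.Recordset, r As Long
    Set rs = CurrentDb.OpenRecordset("tblDaily", dbOpenDynaset)

    r = 2                                              ' row 1 is the header row
    Do While Len(ws.Cells(r, 1).Value & "") > 0        ' blank first cell marks end of data
        ' Pre-check the row here (types, ranges, lookups) before storing it
        rs.AddNew
        rs!Branch = ws.Cells(r, 1).Value
        rs!TxnDate = ws.Cells(r, 2).Value
        rs!Amount = ws.Cells(r, 3).Value
        rs.Update
        r = r + 1
    Loop
    rs.Close
    Set rs = Nothing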
Whether you import directly to a temp table or to the final table is your call. There are merits to both approaches - simplicity versus precision.
The last step, once you are done with the file you just processed and it is closed, is to either delete it (fso.DeleteFile), copy it (fso.CopyFile), or move it (fso.MoveFile) to an archive area - but get it out of the folder where you processed it. (Or you can rename it using VBA's Name ... As statement.) Anything to prevent your scan from seeing the same file a second time.
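A sketch of that cleanup, assuming fso and filePath are still in scope from the scan and using an example archive path; whichever call you use, do it after the workbook has been closed:

    ' Move the processed file out of the inbound folder (the archive folder must already exist)
    fso.MoveFile filePath, "C:\BranchImports\Archive\" & fso.GetFileName(filePath)

    ' ...or delete it, or rename it in place with the Name statement:
    ' Name filePath As filePath & ".done"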
As noted above, you can also use the file's name to update some table to remember "I got data from WesternEasthaven on 02/22/2017." That way, you can avoid entering the same data twice (which would probably lead to key uniqueness violations).
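And the matching log entry, reusing the branch and rptDate values parsed earlier and the same illustrative tblImportLog table:

    ' Record the receipt so the DCount/DLookup check can catch duplicates next time
    CurrentDb.Execute "INSERT INTO tblImportLog (Branch, ReportDate, ReceivedOn) " & _
        "VALUES ('" & branch & "', #" & Format(rptDate, "mm\/dd\/yyyy") & "#, Now())", dbFailOnError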
If you do it this way, then mechanically you take the spreadsheets from your branch offices and dump them in your "import" folder. Open your database, open the import form, click the import button. When it is done, your import folder has nothing left to import and the files have been handled. If files arrive late, you just repeat the process - because of the "self-cleaning" nature of this method, nothing gets picked up twice.
Error handling will be up to you. Rules on frequency of reporting will be up to you. But this method as outlined above will automate the process of importing data for you.