Basic backend table modification questions ...

Local time
Today, 05:07
Joined
Feb 28, 2023
Messages
628
I should know the answers to these, but I wanted to confirm them.

  • I want to add two new tables to the back end of my database. Usually, I would get exclusive access to the BE in order to do this. Is this a firm requirement, or can other users have the back end open, given that the new tables can't be accessed yet because they aren't linked to the FE?
  • I know if I want to add or modify fields of a linked BE table, I need exclusive access to the BE - or at least I've always done it that way.
  • Some of my tables are stored in the backend and then the data in the table is copied into local tables in the FE on startup. I'm assuming that if I ever wanted to add/modify fields to these tables, that is when I would have to have exclusive access to the BE, update the tables in the BE, update the FE, and then distribute and have everyone update to the new FE at the same time the updated BE was available, correct?
  • Adding tables to the FE is simple, except nobody else can use them until the new FE is released?
Thanks in advance!
 

Pat Hartman

Super Moderator
Staff member
Local time
Today, 05:07
Joined
Feb 19, 2002
Messages
43,275
It is poor practice to modify the BE during normal work hours. Do your BE maintenance after hours or on the weekend. For emergencies, you can shut down the users but don't make a normal practice of it.

  • Technically, adding a new table shouldn't disturb any existing code or queries and so would not be considered dangerous per se.
  • Modifying a table is a different issue. Don't even think about doing this while users are active in the BE.
  • For tables that are duplicated in the FE, you need to distribute a new FE as well as modify the BE to make changes. Again, you REALLY do not want to be doing this during normal work hours. Stay late, come in early, do it on the weekend. Do not get in the way of normal business time. For extra safety, I rename the BE when I am doing maintenance just in case some overzealous user snuck in on a weekend to work and I didn't expect it. Then when the changes are made and the new FE is in place, I reset the BE file name as the last step.
 

MarkK

bit cruncher
Local time
Today, 02:07
Joined
Mar 17, 2004
Messages
8,181
Things I have done that have worked while the BE had connected users...
- Export a table from the FE to the BE using the Export popup option from the nav pane

With a DAO.Database reference to the BE...
- Execute a SQL CREATE TABLE command.
- Execute a SQL ALTER TABLE command for a table that has no locks (no one is using it at the time).

I have successfully run this code numerous times where dbs is an object reference to the BE. It raises an error if a user has a lock on the table.
Code:
Public Sub AddColumn(TableName As String, ColumnName As String, DataType As String)
    ' dbs is a class-level DAO.Database reference opened on the back end.
    If Me.TableExists(TableName) Then
        On Error GoTo handler
        dbs.Execute "ALTER TABLE " & TableName & " ADD COLUMN " & ColumnName & " " & DataType
        On Error GoTo 0
    Else
        Err.Raise 5, "cTableDef.AddColumn()", "Table not found"
    End If
    Exit Sub
handler:
    Err.Raise Err.Number, "cTableDef.AddColumn()", Err.Description
End Sub
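If you want to try the same thing without the class, a standalone sketch might look like this. The UNC path, table, and column names are placeholders, not anything from this thread; the point is simply that the BE is opened directly and the DDL runs against it.
Code:
Public Sub AddColumnToBE()
    ' Minimal sketch: run DDL directly against the back end with DAO.
    ' The path, table, and column names below are placeholders.
    Dim dbs As DAO.Database
    ' Second argument False = open shared, not exclusive.
    Set dbs = DBEngine.OpenDatabase("\\Server\Share\MyData_be.accdb", False)
    ' Raises a runtime error if another user holds a lock on the table.
    dbs.Execute "ALTER TABLE tblOrders ADD COLUMN DateShipped DATETIME", dbFailOnError
    dbs.Close
    Set dbs = Nothing
End Sub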
 

ebs17

Well-known member
Local time
Today, 11:07
Joined
Feb 7, 2020
Messages
1,946
Actually, you want local tables. Remember that you can link multiple backends into one frontend, so local tables are best stored in an additional local backend. That way, neither the backend with the master data nor the frontend is disturbed.

Such an additional backend can be distributed with the frontend or created completely using code.
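A minimal sketch of the created-in-code option, assuming a side-end file next to the FE. The file, table, and field names are placeholders, not anything from this thread.
Code:
Public Sub CreateSideEnd()
    ' Sketch: build a local side-end next to the front end and link one table.
    Dim strPath As String
    Dim dbSide As DAO.Database

    strPath = CurrentProject.Path & "\SideEnd.accdb"   ' placeholder file name

    If Dir(strPath) = "" Then
        ' dbVersion120 creates an ACCDB-format file; dbLangGeneral sets the collation.
        Set dbSide = DBEngine.CreateDatabase(strPath, dbLangGeneral, dbVersion120)
        dbSide.Execute "CREATE TABLE tblLocalSettings " & _
            "(SettingName TEXT(50) CONSTRAINT pkSetting PRIMARY KEY, SettingValue TEXT(255))"
        dbSide.Close
    End If

    ' Link the new table into the front end so forms and queries can use it.
    DoCmd.TransferDatabase acLink, "Microsoft Access", strPath, acTable, _
        "tblLocalSettings", "tblLocalSettings"
End Sub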
 
Local time
Today, 05:07
Joined
Feb 28, 2023
Messages
628
Wow - Great answers!!!

@Pat Hartman - Thanks for confirming and explaining. Contrary to previous impressions, I very rarely interrupt the normal work flow. There are ways (for us, at least) to avoid doing so without working odd shifts - for example:
  • As you confirmed, adding a table can be done without kicking everyone out of the back end, so can be done during working hours.
  • Modifying a linked table - Two scenarios:
    • If I am adding a NEW field: This doesn't take long. I'll need maybe 5 to 10 minutes exclusive access to the back end. Typically, the rest of the team leaves for the day earlier than I do. If necessary, the database typically isn't critical enough that someone can't spare 5 minutes for me to work on it, or I can do it at a different time. Adding the new field doesn't affect the deployed front end - it doesn't know the field is there, so I can modify the development front end without impacting anyone else.
    • If I am deleting a field from the table, or modifying a field in the table and those fields are used in the front end, I have to release simultaneously, as I would for a duplicated table below:
  • Adding/Deleting/Modifying a field in a duplicated table does require coordination on the release. What I typically do here is:
    • Make a COPY of the backend and make my changes to the COPY. (And probably make a log in Word or similar of exactly what in the backend was changed to what.)
    • Change my development FE to link to the COPY of the BE. Modify the development FE as needed.
    • Tricky part: I need exclusive access to the BE (5 to 10 minutes) to make my changes, then I need to change the development FE to point to the REAL BE, then I need to release the development FE as the production FE. That needs maybe 30 minutes when the database will be down for everyone. Typically Friday afternoons work well. Everyone else is getting ready to leave and the new version will be in place Monday. (And if I find anything minor that needs to be changed, I can work on it over the weekend, if needed.)
@MarkK - I wasn't aware of the Export pop-up. Seems useful. If nothing else, I could develop my tables as local tables in the development FE. No impact on anyone else if I want to add fields or adjust structure. Then, when I'm satisfied everything works properly, I would export the table to the BE and link to it in the FE.

@ebs17 - I hadn't considered local tables. I think this is sometimes referred to as a side-end. Sounds promising, but I'm unsure about the logistics. For information - we have three types of tables in the database:
  • Strictly linked tables: These are in the backend and would stay in the backend. This is the day-to-day real-time data in the database.
  • Tables only in the FE: There are not many of these. There is one table that contains the version date of the FE, and one table for monitor (display screen) information.
  • Tables in the backend that get the data loaded into the FE on startup. These are things like network paths, E-mail addresses, etc. We load them locally for faster access. They aren't updated often. The structure rarely changes. The data is fluid enough that I would want everyone to have it updated on startup, but not volatile enough that real-time updates are required (although that would be nice). These seem like candidates for side-end (local) tables - SE. With them as a side-end, I could update my local version of the SE and my development FE without disturbing operations and just release together when needed.
Questions:
  • Where would the SE (local tables) be located? The BE is in a central location on the network. The FE is either on the user's desktop or (for Citrix) on the mapped U:\ Drive. I'm assuming I would place it in the same folder as the FE.
  • If the files are local - how are updates managed? Not necessarily structure changes, but just updates to any of the records? Would I need to have a version number for the SE and update it every time I changed a record? (And have an SE version number field in the BE that gets compared at startup?) In a perfect world, I could just send out an E-mail that I updated the SE and go grab the latest version and copy it, but that didn't work well with the FE, so now we have a similar process that stores a version number in the FE and compares it to the version number in the backend (sketched below). (Or I guess I could have a script that downloads the SE to the local folder - similar to how I update the duplicated table data, but it would be done outside of the FE.)
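For reference, the startup comparison mentioned above is usually nothing more than this; the table and field names are placeholders, not the actual ones in use here.
Code:
Public Function FrontEndIsCurrent() As Boolean
    ' Sketch: compare a version stamp held locally with one held in the BE.
    Dim strLocal As String, strRequired As String
    strLocal = Nz(DLookup("VersionNumber", "tblLocalVersion"), "")      ' one-row local FE table
    strRequired = Nz(DLookup("RequiredVersion", "tblVersion_BE"), "")   ' one-row linked BE table
    FrontEndIsCurrent = (strLocal = strRequired)
End Function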
 

ebs17

Well-known member
Local time
Today, 11:07
Joined
Feb 7, 2020
Messages
1,946
With them as a side-end, I could update my local version of the SE and my development FE without disturbing operations and just release together when needed.
You described one of the advantages well.

I'm assuming I would place it in the same folder as the FE.
Yes, in the same folder or in a subfolder, but with a fixed relative path to the FE for easy table relinking.
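A minimal relinking sketch along those lines; the side-end file name and the test on the Connect string are assumptions, not part of the answer above.
Code:
Public Sub RelinkSideEnd()
    ' Sketch: point all side-end links at a file that sits next to the FE.
    Dim dbs As DAO.Database
    Dim tdf As DAO.TableDef
    Dim strSE As String

    strSE = CurrentProject.Path & "\SideEnd.accdb"   ' placeholder file name
    Set dbs = CurrentDb

    For Each tdf In dbs.TableDefs
        ' Only touch linked tables whose connect string points at the side-end file.
        If Len(tdf.Connect) > 0 Then
            If InStr(1, tdf.Connect, "SideEnd.accdb", vbTextCompare) > 0 Then
                tdf.Connect = ";DATABASE=" & strSE
                tdf.RefreshLink
            End If
        End If
    Next tdf
End Sub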

how are updates managed?
You don't need version numbers just for data. Simply use timestamp fields in the records to record creation and changes.
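A hedged sketch of what those timestamp fields could look like. The BE path, table, and field names are placeholders, and the modified stamp still has to be set by the application (for example in a form's BeforeUpdate event or a data macro), since the engine won't update it automatically.
Code:
Public Sub AddTimestampFields()
    ' Sketch: add created/modified stamps to a BE table (path and names are placeholders).
    ' DDL has to run against the BE file itself, not against a linked table.
    Dim dbs As DAO.Database
    Set dbs = DBEngine.OpenDatabase("\\Server\Share\MyData_be.accdb", False)
    dbs.Execute "ALTER TABLE tblLookup ADD COLUMN DateCreated DATETIME", dbFailOnError
    dbs.Execute "ALTER TABLE tblLookup ADD COLUMN DateModified DATETIME", dbFailOnError
    ' Default for the creation stamp, set via DAO rather than DDL.
    dbs.TableDefs("tblLookup").Fields("DateCreated").DefaultValue = "Now()"
    dbs.Close
End Sub
In the data-entry form, something like Me.DateModified = Now() in the BeforeUpdate event keeps the change stamp current.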
 

ebs17

Well-known member
Local time
Today, 11:07
Joined
Feb 7, 2020
Messages
1,946
When developing I use a copy of the backend. Data does not have to be complete and up-to-date, but it does have to be sufficient to show typical problems. You can also play around wildly with a copy.
Since development takes time, users can now continue working with the original backend without being affected.

To transfer the update:
The front end is simply replaced.

Structural changes to the backend are made during development using a compiled code script, not by hand (see: Structural changes to the database).
The script is cumulative, in the form "first check, then act" for each measure, which means it can be executed as often as you like.
For me, the script is also documentation of exactly what is being carried out, and it keeps problems caused by others from being attributed to me.
The script can then be executed in a very short time and without errors (because it has been checked) when actually updating the working environment. After all, you need exclusive access here; the users have to be logged out and take a break.
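A minimal sketch of one such "check first, then act" measure, assuming dbs is a DAO.Database opened on the (copy or live) back end; the helper and the names are illustrative only, not the actual script described above.
Code:
Public Sub Measure_AddDateShipped(dbs As DAO.Database)
    ' Idempotent measure: only add the column if it is not already there,
    ' so the whole script can be run as often as needed.
    If Not FieldExists(dbs, "tblOrders", "DateShipped") Then
        dbs.Execute "ALTER TABLE tblOrders ADD COLUMN DateShipped DATETIME", dbFailOnError
    End If
End Sub

Private Function FieldExists(dbs As DAO.Database, TableName As String, FieldName As String) As Boolean
    Dim fld As DAO.Field
    On Error Resume Next
    Set fld = dbs.TableDefs(TableName).Fields(FieldName)
    FieldExists = (Err.Number = 0)
    On Error GoTo 0
End Function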
Nothing else happens to the data in the original backend. If in doubt, I don't even need to see this user data (trade secrets, data protection, etc.).
 

Pat Hartman

Super Moderator
Staff member
Local time
Today, 05:07
Joined
Feb 19, 2002
Messages
43,275
The tables that contain information that pertains to running the FE - the version table, any menu tables, etc. - would change only when the FE is changed and never by anything the user does.

The locally stored remote data. This is where the side-table concept works well. RI isn't relevant for these tables and these tables would never be joined in a query to linked tables. In my apps, I sometimes download my combo list recordsource data. It rarely changes and eliminates the need to transfer the data every time a combo is opened. You might also have data from a different application that, for whatever reason, the PTB refuse to allow you to link to. To handle these, the FE makes a new side-BE in the same LOCAL directory that the FE runs from when it opens and then imports all the tables you want to store locally. As long as the file and table names and the side-BE location remain static, the links won't break, so once you have done this you never need to relink. If you use the create method, you also need to run DDL queries after the data is imported to add the necessary indexes.
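A sketch of that create-and-import approach under stated assumptions: the file, table, and index names are placeholders, and the routine is assumed to run at FE startup before any side-BE table has been opened.
Code:
Public Sub BuildSideBE()
    ' Sketch: build a fresh side-BE next to the FE, copy in a lookup table, add an index.
    Dim strSE As String
    Dim dbSide As DAO.Database

    strSE = CurrentProject.Path & "\Side_BE.accdb"

    ' Start clean each time the FE opens (fails if a side-BE table is already open).
    If Dir(strSE) <> "" Then Kill strSE
    Set dbSide = DBEngine.CreateDatabase(strSE, dbLangGeneral, dbVersion120)
    dbSide.Close

    ' Copy the data from the linked BE table into the side-BE (make-table query).
    CurrentDb.Execute "SELECT * INTO tblComboList IN '" & strSE & "' FROM tblComboList", dbFailOnError

    ' A make-table query does not carry indexes, so add them with DDL afterwards.
    Set dbSide = DBEngine.OpenDatabase(strSE)
    dbSide.Execute "CREATE INDEX idxComboKey ON tblComboList (ComboKey)", dbFailOnError
    dbSide.Close
End Sub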

The second method of using a side-BE is to create a template with the tables and indexes all defined but empty. Then to start the process, your code copies the template from the server to the local directory. It then creates links if they don't exist and runs append queries to transfer the data. Neither method is better or more difficult; they are pretty equal in the work it takes to implement them.
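And a sketch of the template method; the template name, server path, and table are placeholders, not anything from this thread.
Code:
Public Sub RefreshSideBEFromTemplate()
    ' Sketch: copy a pre-built empty template from the server, then fill it.
    Dim strTemplate As String
    Dim strSE As String

    strTemplate = "\\Server\Share\Side_BE_Template.accdb"   ' placeholder path
    strSE = CurrentProject.Path & "\Side_BE.accdb"

    ' Overwrite the local copy with the empty template (tables and indexes already defined).
    FileCopy strTemplate, strSE

    ' Append the current data from the linked BE table into the local copy.
    CurrentDb.Execute "INSERT INTO tblComboList IN '" & strSE & "' SELECT * FROM tblComboList", dbFailOnError
End Sub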

You should not be downloading data into the side-BE that changes or that the user can change.
 
Local time
Today, 05:07
Joined
Feb 28, 2023
Messages
628
I was following along and I got lost in the details somewhere. Here's a (hypothetical, obfuscated) example:
*** Current Method ***
Tammy and Tom work in my group and use the database.

In the BE, I have tbl_Email with two fields: fldGroup and fldEAdress

Record1 of tbl_Email contains QA and joe.smith@company.com

Usually, we will still want to send to joe.smith, but tbl_Email is copied locally from the BE to the FE on startup to provide faster access.

Joe Smith retires and Suzy Brown takes over for him.

I update Record1 to change joe.smith@company.com to suzy.brown@company.com. (And I typically E-mail Tammy and Tom to let them know.)

Tom does not have the database open. When he needs to send a report to QA, he opens the database, it downloads the updated tbl_Email, and he sends the report to Suzy.

Tammy does have the database open. When she needs to send a report, she will try to send it to Joe, get "Undeliverable", and will either let me know or close and re-open the database, after which it will work. (That part is less than ideal, but the data doesn't change often enough for it to be an issue.)
*** Side End Method ***
Tammy and Tom have the FE and their OWN copy of the SE on their computers. The SE has tbl_Email included. The FE no longer loads tbl_Email, but instead links to the tbl_Email in the side end.

I update Joe Smith to Suzy Brown.

How does that get communicated/distributed to Tammy and Tom? I'm not sure timestamps work ...

Typically, without version tracking, I send an E-mail to Tammy and Tom that they need to download the new copy of the SE. Tammy is diligent and does so. Tom, not so much. So I get an E-mail from Tom: Hey, my email to Joe bounced back. I have to remember that I updated the SE file. I can't ask him about file dates b/c typically Access updates the file date whenever the file is opened, whether or not the data is changed. I have to ask Tom if he updated the SE File. He says "Probably not. I vaguely remember an E-mail about that, but I was knee-deep in this other project and didn't get around to it."

How do I ensure that the SE is up-to-date? (Or does the database just download the new or current version on startup, in which case I'm not saving much (as I see it) over having it populate the local tables from the BE.)
 

Pat Hartman

Super Moderator
Staff member
Local time
Today, 05:07
Joined
Feb 19, 2002
Messages
43,275
Like @ebs17 I always use a copy of the production BE when testing. It means that I don't have to worry about messing up production data or interfering with operations. I also keep a log of changes as DDL queries. When I'm ready to move the new version of the FE to production, I take the BE off line and then run the DDL queries against it.

I also have a tool called SQL Examiner Suite. Sadly, it doesn't include Jet or ACE as a database but I can upload my BE to SQL Server and then run the tools in the suite that compare the old version of the BE to the new version and it creates all the DDL for me. Most of my BE's are SQL Server anyway so rather than keeping the log as you have to do if you only work with Access, I can wait until the end and let the tool figure out what I changed.

I have some old files that will show how this method works. It is a little more formal than you need but my clients were remote and in the case of the sql version, their DBA would need to run the script. If they had the Jet BE, then I would connect online with an Admin and send them the database and then run it and check everything. Later versions are more complex because they do more than just check the database name. They check the internal version and include proprietary info so I can send a new one.

You can open the .sql file with notepad. This is the actual script generated by the tool. It does the same thing as the Access version but to an SQL BE rather than an ACE BE.

You won't be able to run the database but you will be able to examine the DDL to see that it is pretty simple.
 

Attachments

  • DEA_Ver36toVer37.zip (221.1 KB)
  • DEA_AuditVer36_DEA_Audit36NEW_20091015.zip (725 bytes)

ebs17

Well-known member
Local time
Today, 11:07
Joined
Feb 7, 2020
Messages
1,946
How do I ensure that the SE is up-to-date?
There should only be data there that relates solely to this one user and his work.

The information that is important across users must be in the BE. To have such data available locally, you could load it when you start the FE and store it, for example (a sketch follows the list):
- in TempVars
- in the text boxes of an invisible form
- in specially created properties
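A minimal sketch of the TempVars option; the table and field names follow the hypothetical tbl_Email example earlier in the thread and are not part of the answer above.
Code:
Public Sub LoadSharedValues()
    ' Sketch: cache a cross-user value from a linked BE table at FE startup.
    TempVars.Add "QAEmail", Nz(DLookup("fldEAdress", "tbl_Email", "fldGroup = 'QA'"), "")
    ' Later, read it anywhere as TempVars!QAEmail.
End Sub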
 
Local time
Today, 05:07
Joined
Feb 28, 2023
Messages
628
@ebs17 - Thanks, I misunderstood. In that case, the duplicated tables are not really advantageous and I'm probably best off continuing with loading the local tables on startup.
 

Pat Hartman

Super Moderator
Staff member
Local time
Today, 05:07
Joined
Feb 19, 2002
Messages
43,275
How does that get communicated/distributed to Tammy and Tom? I'm not sure timestamps work ...
As I said, I would NOT download data that is likely to change.

If you want to do this, you can handle the situation by adding an additional table to the BE and the side-BE. You will need to use a data macro in the BE to handle this. The data macro needs to update the email table row in the log with a date/time whenever a change is made to the email table. Each download table needs a separate row in this table and its own data macro. Hopefully this is possible with the data macro. I've never used one. Then you need to update the date/time you did the download in the side-db table. Now, any time you use one of the downloaded tables, you need to compare the datetime in the BE table with the download datetime. If the datetime of the download is earlier than the last update datetime, tell the user to close and reopen or give them a way to run the code again which would be more user friendly.
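A hedged sketch of that comparison; the log tables, field names, and the assumption that both logs are linked into the FE are illustrative only, not Pat's actual design.
Code:
Public Function DownloadIsStale(TableName As String) As Boolean
    ' Sketch: compare the BE's last-change stamp with the local download stamp.
    ' tblChangeLog (in the BE, maintained by the data macro) and tblDownloadLog
    ' (in the side-BE, written when the download runs) are placeholder names.
    Dim dtChanged As Date
    Dim dtDownloaded As Date

    dtChanged = Nz(DLookup("LastChanged", "tblChangeLog", _
        "TableName = '" & TableName & "'"), #1/1/1900#)
    dtDownloaded = Nz(DLookup("LastDownloaded", "tblDownloadLog", _
        "TableName = '" & TableName & "'"), #1/1/1900#)

    DownloadIsStale = (dtDownloaded < dtChanged)
End Function
If the function returns True, the FE can either tell the user to close and reopen or simply re-run the download routine.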

Again, the side-db is unique for each user and is located on their C: drive.
 

gemma-the-husky

Super Moderator
Staff member
Local time
Today, 10:07
Joined
Sep 12, 2006
Messages
15,656
You might be able to add a new table or edit an existing table if it's not in use. You won't be able to set RI against an existing table that's in use, and you won't be able to do a C&R. I would only do stuff on a DBS that's in use if there was absolutely no alternative.
 
Local time
Today, 05:07
Joined
Feb 28, 2023
Messages
628
Thank you all. I re-thought my use of imported linked tables and dropped them back from about 10 tables to only 2. The database loads more rapidly, but interim lookups take longer. But the tables get real-time updates and I can modify the back-end structure (with exclusive access) without immediately having to release an updated FE.
 

Pat Hartman

Super Moderator
Staff member
Local time
Today, 05:07
Joined
Feb 19, 2002
Messages
43,275
But the tables get real-time updates and I can modify the back-end structure (with exclusive access) without immediately having to release an updated FE.
You apparently have never worked in a real production environment. The most rigid was Sikorsky (they make Marine One - the President's helicopter) but all military contractors would be similar. The fewest stages was three - unit testing, system testing, production. When I wanted to change an application whether it involved just the FE or just the BE or both, it had to go through EIGHT staging libraries and be tested in each before the final move to production. This was obviously overkill for my Access applications but rules are rules. This was part of their service level agreement with the government and so I had to live by them too.

Making changes to the production database directly without testing first is poor practice. Doing it while users are in the database is a death wish. So far you've been lucky. I get that the world is different when YOU are the IT department and you work for a small business rather than Sikorsky at the other end of the scale, but the company's data is their lifeblood and you have a fiduciary responsibility to them to keep it safe and accurate. Testing before applying changes isn't really too much to ask.
 

gemma-the-husky

Super Moderator
Staff member
Local time
Today, 10:07
Joined
Sep 12, 2006
Messages
15,656
@Pat Hartman
And yet the skilled designer, with a thorough knowledge of the development platform, and a deep understanding of the project environment can create stellar applications in a fraction of the time that the much more expensive system-based design team can produce relative rubbish. It may not meet the client's real requirements, or it may force clients to have to produce a design specification. Especially when the client often doesn't really understand what they actually want/need, or maybe doesn't realise how they could get something far better than what they have asked for.
 
Local time
Today, 05:07
Joined
Feb 28, 2023
Messages
628
@Pat Hartman - You would be surprised who I work for. I'm not looking to divulge it here, but we own Sikorsky now.

That said ... yes, requirements are different - even for government contractors for a large production database as opposed to an internal database that 8 employees use to track statuses.

I never said I didn't plan to do testing. In fact, one of the primary advantages of going back to linked tables is that if I want to add a new field to a backend table, I can do so and then do my development platform testing without having to release a new FE and then find out in testing that I need to release yet another new FE.

I am fortunate that most of the errors are things that I introduced and I can work around (usually) if needed.

For example - I just recently added validation to some of the data fields, but I forgot some of the potential conditions. If needed, I can just make the updates in my development FE, and/or "I" have direct access to the tables and can make the changes there - which bypasses the validation.

Point taken that if I'm not available someone else has to figure it out or they need to come up with a workaround - which is not good practice.
 

Pat Hartman

Super Moderator
Staff member
Local time
Today, 05:07
Joined
Feb 19, 2002
Messages
43,275
And yet the skilled designer, with a thorough knowledge of the development platform, and a deep understanding of the project environment can create stellar applications in a fraction of the time that the much more expensive system-based design team can produce relative rubbish. It may not meet the client's real requirements, or it may force clients to have to produce a design specification. Especially when the client often doesn't really understand what they actually want/need, or maybe doesn't realise how they could get something far better than what they have asked for.
I wasn't suggesting that you need to use the "Sikorsky" method. Only that there is a rational middle ground that minimizes the risk of releasing bad updates to the general population. For major changes, I use the system test libraries so that several users can test the new features to ensure that they are what was requested and work correctly. For minor changes, I trust myself to do the testing. The POINT is that you need at least some separation between your development and production environments and experience has convinced me that playing fast and loose with this concept is dangerous.

PS when the wind is right (or wrong actually:)), I can occasionally hear the helicopters being tested at Sikorsky if I'm out on my patio or in the pool. They never fly over my house though, they tend to stick to the river. Right now all I can hear is the leaf blower outside my office window as the gardeners do their final leaf blow of the season. This one is really efficient, he's standing on my back stairs and really cleaning the bush outside my window. He must be new. No one has ever been that thorough before.
 
Local time
Today, 05:07
Joined
Feb 28, 2023
Messages
628
The POINT is that you need at least some separation between your development and production environments and experience has convinced me that playing fast and loose with this concept is dangerous.
Concur - you need a reasonable solution. For me:
  • For major changes, or anything where I know I will want to create new records and not have somebody asking what this record is about, I work on a COPY of the BE with the DEVELOPMENT FE. The two risks here:
    • Obviously, the major risk is that I forget to change the linked BE from the copy back to the main BE before releasing the database, or don't realize I am using the copy BE; users then "think" they are changing the production data when they are really changing the copy data, and they have to go back and redo in the production BE anything they changed.
    • I have to log pretty carefully what I've changed, so that I don't make changes to the COPY BE that work fine and then get errors when I link to the production BE b/c it doesn't have the same changes.
  • For minor changes, I'll just update the development FE with the production BE and use the development FE normally as part of the informal testing. Two risks here:
    • If I release too quickly, I haven't tested each condition, and someone says "Hey, I get an error when I'm doing this." And I have to figure out why the error occurs and fix it and release another update.
    • If I release too slowly, I tend to forget what changes haven't been released yet. So I tend to think "Don't worry about that, the FE already checks for and corrects that!". No - the DEVELOPMENT FE checks for and corrects it, the PRODUCTION FE ignores it.
 
