When to use more sophisticated methods than forced update at startup

zeroaccess · Active member · Joined Jan 30, 2020 · 671 messages
So I'm playing around with various ways to check versions on startup and update front ends, but my FE is nearly complete and is < 5 MB. Maybe it's easier to just do it the lazy way via batch file. User count is < 20. I do not use any local tables at this time. All users run the same version of Access and are on the same network.

Some benefits of this approach: the database only runs if the back end is available, and every user gets a "fresh copy" each time (even though using an .accde already mitigates most issues).
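A minimal sketch of that forced-update launcher, in Python for illustration (the paths are hypothetical, and a real deployment would more likely be a batch file or VBScript):

```python
import shutil
from pathlib import Path

# Hypothetical paths - adjust for your environment.
SHARE_FE = Path(r"\\server\apps\MyApp.accde")    # master copy on the network share
LOCAL_FE = Path.home() / "MyApp" / "MyApp.accde"

def refresh_front_end(share=SHARE_FE, local=LOCAL_FE):
    """Force-update: only proceed if the network share is reachable,
    and always overwrite the local front end with a fresh copy."""
    if not share.exists():
        raise FileNotFoundError("Back-end share unavailable - not launching.")
    local.parent.mkdir(parents=True, exist_ok=True)
    shutil.copy2(share, local)   # fresh copy on every launch
    return local

# The launcher would then open the local copy,
# e.g. os.startfile(LOCAL_FE) on Windows.
```

This captures both benefits mentioned above: the copy fails fast when the share (and hence the back end) is unreachable, and each session starts from a pristine front end.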

At what point do you develop more sophisticated version checking routines? Is there a size threshold you go by to save network bandwidth? Number of users? Any other considerations?
 

CJ_London · Super Moderator · Staff member · Joined Feb 19, 2013 · 16,605 messages
Not at all sure what you mean - what has user count or network bandwidth got to do with it?

At what point do you develop more sophisticated version checking routines?
My basis is one or more users other than myself.

The basis being that I have two versions - development (mine only) and testing (for the testers). As soon as it goes to production, there may be one user or there may be 500. Typically under testing there will be somewhere between 1 and 5 testers, sometimes more, depending on the app.
 

zeroaccess
not at all sure what you mean. what has user count or network bandwidth got to do with it?
If the file you are copying is large, multiply that by how many users are going to log on and download the file. I could imagine a point where you may want to mitigate that depending on the size of your file and the size of your user base. I'm not sure how mine stacks up, but I don't think I'm nearing any tipping point. I'm just curious if others have a threshold or rules of thumb they go by, or other reasons for checking versions rather than doing the forced copy.
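As a rough back-of-the-envelope for that multiplication (hypothetical numbers, matching the sizes mentioned in this thread):

```python
# Aggregate network load for a forced front-end copy after an update.
fe_size_mb = 5    # size of the front end
users = 20        # users who will each download it once
total_mb = fe_size_mb * users
print(f"One update moves roughly {total_mb} MB across the network in total.")
```

Even at the larger end discussed later in the thread (a 140 MB front end, 200+ users), this stays trivial for a wired LAN, which is why most developers never hit a tipping point.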
 

CJ_London
Still doesn't make sense to me - a file copy is far more efficient than interrogating a db for data, and updates to front ends should occur rarely after testing, so my answer remains the same. If you are saying 500 users all log on at exactly the same time and there is an upgrade which requires all users to copy a new version of the front end - how do you protect against that? The answer is: for me it has never happened. The largest userbase I have supported was a tad under 1,000 users (all sales people wanting information about their commissions) and it was not a problem. We didn't monitor when they all logged in, but the worst that would have happened is that it took a minute or so longer to download the updated front end.

So I guess I can't answer your question.
 

The_Doc_Man · Immoderate Moderator · Staff member · Joined Feb 28, 2001 · 27,165 messages
Only once did I have to do this, on the Navy machines for my most recent project. Navy regulations said I was not allowed to run a batch job on someone else's machine at app launch so I had to include a table of versions. I killed two birds with one stone. My version table in the BE was the history of my updates. That is, when I changed to a new version, I added a row and assured that the most recent version was always knowable. This meant I also had data available that could be used to make a report of the changes made to the DB.

At launch, each FE compared a version string defined in my opening form against the table of versions. When I changed the FE, that string would be updated to reflect the new version number. There was a "mandatory update" flag, and if you had an older version than the most recent mandatory update, you were told to get the new version; the launch would fail until you did. If, on the other hand, your version was more recent than the most recent mandatory update, you were allowed to run - but if you were running anything older than the latest version, you were given a one-time (per session) pop-up reminder that you were running a version that didn't have the latest fixes in it.
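The gating rules described above can be sketched like this (a Python illustration of the decision logic, not the original VBA; the table layout and version format are assumptions):

```python
from dataclasses import dataclass

@dataclass
class Release:
    version: str      # e.g. "2.03"
    mandatory: bool   # True if every FE must be at least this version

def parse(v):
    """Turn '2.03' into a comparable tuple (2, 3)."""
    return tuple(int(p) for p in v.split("."))

def check_version(local_version, history):
    """history: all rows from the BE version table.
    Returns 'block', 'warn', or 'ok' per the rules described above."""
    latest = max(history, key=lambda r: parse(r.version))
    mandatory = [r for r in history if r.mandatory]
    if mandatory:
        newest_mandatory = max(mandatory, key=lambda r: parse(r.version))
        if parse(local_version) < parse(newest_mandatory.version):
            return "block"   # launch fails until the FE is updated
    if parse(local_version) < parse(latest.version):
        return "warn"        # one-time reminder: newer fixes exist
    return "ok"
```

Because the history table is append-only, it doubles as a change log, which is the "two birds with one stone" point above.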

I would have used the auto-updater if I could have because I was aware of it. But the Navy regs said "no."
 

zeroaccess
CJ_London said: "still doesn't make sense to me - a file copy is more efficient by far than interrogating a db for data ..."
Actually, that does answer part of the question. Basically, you're saying the number of users is not a major consideration on modern networks, although you didn't mention whether this file was large or small.

I'm a hardware/network person so I do think about these things.
 

zeroaccess
The_Doc_Man said: "Only once did I have to do this, on the Navy machines for my most recent project ..."
Interesting. I know of other databases on the network using the "batch file as the shortcut" method, so I know I'm not running afoul of any rules & regs.
 

HiTechCoach · Well-known member · Joined Mar 6, 2006 · 4,357 messages
zeroaccess,

I understand your concerns as a hardware/network person. They are valid network concerns.

With databases, the number one priority is protecting the data. To do this, all the front ends MUST be kept updated to the currently deployed version.

That said, we still have to keep in mind the impact on the network when deploying updates - which is why updates are normally not done very often.
 

CJ_London
although you didn't mention if this file was large or small.
It was some time ago, so I can't give you an exact figure, but the file would have been around 3 MB - the size of my final development app.

With regards to "modern": the initial app was distributed around 13 years ago and was used for nearly 4 years, until my clients purchased a web-based solution. That fell over after two years and they reinstated my app, with mods, for another two years until they came under new ownership, at which point the commission scheme was migrated to yet another web-based app in use by the new owners. Six months after that I lost touch; my contacts had either moved on or been made redundant.

Yes, I deal mostly with developing software solutions, primarily 'stitching together' enterprise systems that cannot talk to each other effectively, but always in the context of the OS and hardware environment, coupled with the client's connectivity requirements. I estimate that over the last 20 years or so I have saved my clients around half a billion dollars. Unfortunately I don't get paid a percentage. :sneaky:
 

isladogs · MVP / VIP · Joined Jan 14, 2017 · 18,212 messages
Agree with previous answers. No matter how large or small the app, I have a development copy and it is tested thoroughly before release.
I limit the number of updates by bundling several new or improved features together wherever possible, to minimise how often clients need to perform such updates.
All updates are then made available via my website, and client system admins are automatically alerted to them. This allows the client to download the updated version at a time convenient to them. The client system admins then install the update - which may include a SQL script to modify the BE structure, additional network files, etc. (all automated) - before placing it in a shared network location. At that point, the update is automatically downloaded to end users' workstations (typically 200+ per client) when they click the app's desktop shortcut. I use Windows APIs for copying from the network. This means that, even with my largest FE, which is around 140 MB, the file transfer only takes a few seconds and the process appears seamless.
 

zeroaccess
isladogs said: "Agree with previous answers ... the file transfer only takes a few seconds and the process appears seamless"
I think I have read about your process before - it sounds elegant. However, in your example I would be the system admin, as I work on the team.

This means I'm both developing and putting the updated files in place, so no one would be bothered by updates except me. I guess all that's left, then, is the load time to copy a 4-5 MB .accde, which is minimal. I'm starting to think simpler is better in my case.
 
