Seriously considering an idle timer

fat controller

Slightly round the bend..
Local time
Today, 21:07
Joined
Apr 14, 2011
Messages
758
I've been having some problems with data going missing from one particular table, and I am firmly of the mind that a momentary drop in network connection/quality/speed is the root cause. Our network has had significant problems over the past year, to the point that often we couldn't even open an Excel spreadsheet over the network. It is now significantly better, but still not great.

My database has a couple of back end files, and it is one in particular that is giving a problem, and I twigged this morning that this particular back-end file is open pretty much all the time, 24/7, 364 days a year.

So, I have found the subform that was keeping it open and made some changes whereby the user will now have to open the form as a pop-up rather than have it displayed as a sub-form all the time.

Now, whilst looking at this, there is one form that is probably the most used throughout the business, and it too is often open 24/7. Currently, neither it nor the table it is connected to has suffered any problems, but I am thinking about preventing problems now, before they happen.

Often, employees will sit with this form open continually, whether it is in use or not - hence, I am concerned that if our network has a little wobble the data is then vulnerable, so I am considering an idle timer based on the code contained here.

If possible, I would like to modify that to add an action whereby, on reaching the timer limit (and/or if the user does not acknowledge a message box to reset the timer), it closes all forms with the exception of one (the dashboard, which is unbound).
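Something along the lines of the sketch below is what I have in mind - very rough, and the form name ("frmDashboard"), the interval and the 30-minute limit are just placeholders rather than anything from the actual code:

Code:
' Very rough sketch only - assumes a hidden monitoring form with its
' TimerInterval set (60000 = check once per minute).
Private mlngIdleMinutes As Long

Private Sub Form_Timer()
    mlngIdleMinutes = mlngIdleMinutes + 1
    If mlngIdleMinutes >= 30 Then
        Dim i As Integer
        ' Walk the Forms collection backwards, as closing forms changes it
        For i = Forms.Count - 1 To 0 Step -1
            If Forms(i).Name <> "frmDashboard" And Forms(i).Name <> Me.Name Then
                DoCmd.Close acForm, Forms(i).Name, acSaveNo
            End If
        Next i
        mlngIdleMinutes = 0
    End If
End Sub

' Called from the monitored forms' activity events (KeyPress, MouseMove, etc.)
' to reset the count, e.g. Forms("frmIdleMonitor").ResetIdle
Public Sub ResetIdle()
    mlngIdleMinutes = 0
End Sub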

Are there any pitfalls I need to be aware of? As always, your knowledge and wisdom are gratefully received.
 

LPurvis

AWF VIP
Local time
Today, 21:07
Joined
Jun 16, 2008
Messages
1,269
Hi there.

The description of:
>> I've been having some problems with data going missing from one particular table

and
>> the cause is a momentary drop in network connection/quality/speed
are slightly incongruous to begin with.
Corruption is one thing, but rows which have simply vanished - that'd be non-standard. (Unless you're automatically, routinely compacting the database - but I don't see how that could be if it's open 24/7.)
The risk from an interrupted connection comes when writing to the database.

I'd consider auditing the users' actions first. Specifically, logging deletions (by any of a variety of means).
If you're still concerned about the never-closing form (and the associated connection and potential lock on the data page), you could then consider timing people out, yes.
Or even going unbound with the form(s). (I'm not one to throw unbound forms out as a solution willy-nilly, but where applicable they have their place.)
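On the logging-deletions front, for instance, one of the simpler means is the bound form's Delete event - purely a sketch, with made-up table and field names:

Code:
' Sketch only: log each deletion attempted through the form.
' tblAuditLog / tblVehicles / VehicleID are illustrative names.
Private Sub Form_Delete(Cancel As Integer)
    CurrentDb.Execute _
        "INSERT INTO tblAuditLog (ActionType, TableName, RecordID, WhoBy, WhenAt) " & _
        "VALUES ('DELETE', 'tblVehicles', " & Nz(Me!VehicleID, 0) & ", '" & _
        Environ("USERNAME") & "', Now())", dbFailOnError
End Sub

That only catches deletions made via the form, of course - not anything done directly against the table.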

An onscreen prompt and closing forms (as many as you choose) is absolutely your choice as the developer. (Run it past the powers that be first, surely - how adoring of their users are they?)

But I'd check the other considerations first.

Cheers
 

fat controller

Slightly round the bend..
Local time
Today, 21:07
Joined
Apr 14, 2011
Messages
758
I have an audit trail running on numerous forms throughout, which records all deletions and edits as it goes; I have had a good root through this, and cannot see any deletions on the table concerned.

Our network has been so flaky that there have been instances where it has corrupted (#Deleted) entries whilst records were being written, which is partially how I have arrived at the conclusion I have.

Having built an entirely new back end only last week, and then imported the data from an Excel spreadsheet (having exported it from the old back end first), I am really perplexed as to what is causing the problem.

Nothing has changed with this aspect of the database since 2013, and it is only in the past two weeks that I am having an issue. :confused:

The unbound form is essentially my version of a switchboard, and it has worked without issue forever more, so I don't anticipate issues there. With reference to the powers that be, well I am the powers that be :D - however, if I am going to create more problems or risks then I won't go ahead.

Is there anything else that would cause records to be deleted (and they are deleted, no compacting done)?
 

LPurvis

AWF VIP
Local time
Today, 21:07
Joined
Jun 16, 2008
Messages
1,269
Well, it's just that there's quite a leap between corrupt rows (#Deleted as you've seen) and vanished rows (unless a compact had been performed).

I don't think I'm being at all biased when I say that I'd have moved to SQL Server a long while ago.
If your network is at all intermittent, then you really need a more robust RDBMS behind it.
At the very least then, you would have the ability to better audit what's going on (catch actual record insertions and deletions).
Depending upon the version of Access you're running, you could try to implement data macros to emulate something similar to completely satisfy yourself that nothing untoward is going on.

But a disconnected implementation would also give you that level of control (as you'll decide when the rows are written back to the database - and can spy on them then).

SQL Server or an unbound, disconnected implementation - either should give you the control you need.
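As a rough illustration of the disconnected idea (every object and control name below is made up for the example):

Code:
' Read the row once, hold it in unbound controls, and only touch the back end
' again at save time - the write-back is brief and entirely under your control.
Private Sub LoadVehicle(lngID As Long)
    Dim rs As DAO.Recordset
    Set rs = CurrentDb.OpenRecordset( _
        "SELECT VehicleID, Registration FROM tblVehicles " & _
        "WHERE VehicleID = " & lngID, dbOpenSnapshot)
    If Not rs.EOF Then
        Me!txtVehicleID = rs!VehicleID
        Me!txtRegistration = rs!Registration
    End If
    rs.Close
End Sub

Private Sub cmdSave_Click()
    ' A natural place to audit/spy on exactly what is being written back
    CurrentDb.Execute "UPDATE tblVehicles SET Registration = '" & _
        Replace(Me!txtRegistration, "'", "''") & _
        "' WHERE VehicleID = " & Me!txtVehicleID, dbFailOnError
End Sub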

Cheers
 

fat controller

Slightly round the bend..
Local time
Today, 21:07
Joined
Apr 14, 2011
Messages
758
Thank you :)

How much of a mission is it to migrate to an SQL back end?
 

LPurvis

AWF VIP
Local time
Today, 21:07
Joined
Jun 16, 2008
Messages
1,269
Funnily enough, it's exactly the same length as a piece of string. :p

It can vary from a couple of hours work to a fair old hike. It depends upon the application and the code you've used in it. (Some crazy people, ahem, make part of their living out of such conversions :-s.)
If you're not familiar with working in a SQL database, it can be a bit of a learning curve. (But it's well worth the time IMO.)

Cheers
 

The_Doc_Man

Immoderate Moderator
Staff member
Local time
Today, 15:07
Joined
Feb 28, 2001
Messages
27,340
Here is a stop-gap measure / examination to consider.

If the form in question and/or its underlying queries are set to Optimistic locking, then you do not lock the tables or have them open unless/until you actually do something. This does NOT block the case where your link to your BE has vanished, but it DOES help to prevent a network link failure from hosing your database to tears.

Don't get me wrong, now... if you are actually unlucky enough to be doing something when your network link drops out of sight, nothing will prevent that from causing an inconsistent database. HOWEVER, if your analysis is correct and most of your people are resting on their laurels (or sitting on their thumbs), then optimistic locking would cause there to be nothing inconsistent in your BE most of the time. This solution is just playing the odds by narrowing the window of opportunity for the network to hose you; it isn't a permanent fix. But it might buy you some time.
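For reference, the locking choice is the form's Record Locks property rather than code as such, but you can check or change it from VBA if you want to see what you currently have (the form name below is just an example):

Code:
' 0 = No Locks (optimistic), 1 = All Records, 2 = Edited Record (pessimistic)
Public Sub SetOptimisticLocking()
    DoCmd.OpenForm "frmVehicles", acDesign
    Debug.Print Forms("frmVehicles").RecordLocks    ' current setting
    Forms("frmVehicles").RecordLocks = 0            ' "No Locks" = optimistic
    DoCmd.Close acForm, "frmVehicles", acSaveYes    ' persist the design change
End Sub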
 

LPurvis

AWF VIP
Local time
Today, 21:07
Joined
Jun 16, 2008
Messages
1,269
Hi there. Long time.

That's more or less true of Pessimistic locking too, though. (I'd be surprised if the OP was using Pessimistic, as it, of course, has to be specifically chosen, i.e. it's against the norm.)
By that I mean, a user has to have begun editing the record for it to have locked.
Yes, it's then locked for longer, but it still requires that initial edit. And even then, locking should only affect contention for that data page, rather than the writing back as a single action.
Now it might be that they do begin editing and then sit there indefinitely, and that Pessimistic locking is employed. In that case, a save check on the Timer event of the form might prove fruitful.
I'd just be surprised is all. :)
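By a save check on the Timer event I mean something as simple as the sketch below (the interval and behaviour are assumptions, not the OP's actual setup):

Code:
' TimerInterval set to e.g. 300000 (five minutes)
Private Sub Form_Timer()
    If Me.Dirty Then
        Me.Dirty = False   ' commits any half-finished edit left sitting there
    End If
End Sub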

Cheers
 

fat controller

Slightly round the bend..
Local time
Today, 21:07
Joined
Apr 14, 2011
Messages
758
Thanks all :)

I have been on site all weekend, and I am now moving toward the notion that someone, somewhere is mucking about in one of two ways - either they have found a way to edit/delete records without it showing on the audit table (goodness knows how they are doing that though!), OR they are tampering with the back-end file directly (which lives in a hidden folder on a shared drive). I suspect that the latter is the case.

I left the site on Saturday night at 9pm, having been there for 16 hours, and all was well when I left. By the time I got back in yesterday morning at 8am, two records had been created and deleted (no audit trail) and one partially created with only one field populated. The really odd thing was that the vehicle ID that appeared was one that we have not had at this particular site for over 18 months.

So, my next move is to password protect the back-end files, and that will make it that bit harder (and hopefully stop) whoever is doing this.

I have never heard of Pessimistic Locking (or Optimistic for that matter), so it is nigh on impossible that I have used something like that, unless I have done so by pure chance.

EDIT - since leaving last night and returning this morning, nothing has been changed. As of this morning, I have also implemented an update whereby the switchboard screen inserts 'who and when' into a table when users log in and log out (in addition to counting log-ins and noting the last log-in date/time, which has always been the case).
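The logging itself is nothing fancier than something along these lines (the table and field names here are illustrative rather than my exact ones):

Code:
' Called from the switchboard when the user logs in and again when they log out
Public Sub LogUsage(strAction As String)   ' "LOGIN" or "LOGOUT"
    CurrentDb.Execute _
        "INSERT INTO tblUsageLog (UserName, MachineName, ActionType, ActionAt) " & _
        "VALUES ('" & Environ("USERNAME") & "', '" & Environ("COMPUTERNAME") & _
        "', '" & strAction & "', Now())", dbFailOnError
End Sub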
 

LPurvis

AWF VIP
Local time
Today, 21:07
Joined
Jun 16, 2008
Messages
1,269
Hi

Well, as mentioned, given the original scenario it's no surprise that deliberate meddling is a possibility.
Securing the BE is one thing. (Your individual applications surely have to link to it still... are you relinking them with password included?)

However, if you're using an ACCDB, I'd suggest using some Data Macro snooping.
It'd be partly what you'd want to do anyway if you moved up to SQL Server.
But they're simple to set up, and would give you some audit. (Record creation and record deletion operations in particular?)

Of course, locking the BE off should prevent snooping - and hence the need to hide the Audit table would be less important, but I'd consider hiding it anyway. ;-)

Cheers
 

gemma-the-husky

Super Moderator
Staff member
Local time
Today, 21:07
Joined
Sep 12, 2006
Messages
15,719
passwording the backend ....

note that if users can see the linked tables in the front end (or can even import the tables from the front end into a separate dbs - you actually import the link, not the data) then they can probably see the connect string, which contains the password to the back end in plaintext.
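for example, anyone who can get at the VBA/immediate window of a front end holding the links can do something like this:

Code:
' Lists every linked table's connect string - if the link was saved with the
' password, it appears here in plain text (e.g. ...;PWD=secret;DATABASE=...)
Public Sub ShowLinkConnectStrings()
    Dim tdf As DAO.TableDef
    For Each tdf In CurrentDb.TableDefs
        If Len(tdf.Connect) > 0 Then    ' linked tables only
            Debug.Print tdf.Name, tdf.Connect
        End If
    Next tdf
End Sub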

not sure how easy it is to prevent all this from a determined snooper.


are you sure the partially edited records ARE entries directly entered into the backend, and not just some partial entry made in the normal way?

one way of checking is to add some fields for user name and date, populated in the before update event of the form that writes new records - you may have these anyway. (Maybe chuck in some non-printing character, like a tab or a hard space, at the end of the user name.) If someone adds data directly to a table, these fields will be blank. Obviously they could manually add the data, but they probably won't realise there is a non-printing character at the end, and then you will know that the entries were made manually.
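as a sketch (field names made up):

Code:
' Stamp new rows from the form, tacking a non-printing tab onto the user name
' so that rows typed directly into the table are easy to spot later.
Private Sub Form_BeforeUpdate(Cancel As Integer)
    If Me.NewRecord Then
        Me!EnteredBy = Environ("USERNAME") & vbTab
        Me!EnteredOn = Now()
    End If
End Sub

' rows added outside the form then stand out, e.g.
' SELECT * FROM tblVehicles
' WHERE EnteredBy Is Null OR Right(EnteredBy, 1) <> Chr(9);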
 

The_Doc_Man

Immoderate Moderator
Staff member
Local time
Today, 15:07
Joined
Feb 28, 2001
Messages
27,340
>> two records had been created and deleted (no audit trail) and one partially created with only one field populated.

May I ask how you know (without audit entries being present) that two records had been created and deleted? If you are going by the AutoNumber field, that could be someone who started to enter a record but then aborted the process with nothing stored. If that happens, would you have written an audit record? On my system, I have a [COMMIT] button and a [CANCEL] button. I don't audit until you commit your changes. If you have a similar capability no matter how you manage it, then when do you audit?

The "partially created" record case has also cropped up on my system, but after a lot of research, I attributed that one to sloppiness on my part in allowing someone to navigate away from a record before committing it, due to an error that triggered a Trap event and did not correctly handle my logic flow.
 

fat controller

Slightly round the bend..
Local time
Today, 21:07
Joined
Apr 14, 2011
Messages
758
Oddly, since I made some changes the other day, the problem has stopped....

I was going by the autonumber to see which records had been created/deleted, so yes, it is possible that it was an abandoned entry; however, whilst this table supplies data that is in use constantly (and I do mean constantly), the data contained therein is very seldom altered. The form that is used to edit/create records does not allow for abandoned entries as such - the users are forced to follow the procedure through, and at the end of that they only have the option to save. In fact, no user has the rights to delete from this particular table - they can put a record out of use (archive it) by putting a date in a field, but the data never goes away. There are only 218 records in this table, 160 of which are currently 'live'.

None of the tables are visible to users via the front end, and the back-end files are in a hidden folder squirreled away on the shared drive, so it would have to be someone fairly determined.

Now, one of the changes I made the other day was a very quick routine that records the exact time that people log in and out of the front end, and inserts that into a table. I did, accidentally on purpose, let a couple of the more 'gobby' members of staff see the information it was recording...... maybe word has gotten around that I am watching? Who knows.

I will still probably password-protect the back-end files just for that added layer of protection - they are only ever accessed directly by me, and for users the tables are all linked (so I can add the password when I link them).
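The relinking will be something along the lines of the sketch below - the path and password are obviously placeholders, and from what I have read the password may only be stored in the link if the tables are (re)created with the dbAttachSavePWD attribute, so treat this as a starting point rather than gospel:

Code:
' Point every existing link at the password-protected back end.
Public Sub RelinkWithPassword()
    Const strBE As String = "\\server\hidden\Backend.accdb"   ' placeholder
    Const strPwd As String = "MyBackEndPassword"              ' placeholder
    Dim tdf As DAO.TableDef
    For Each tdf In CurrentDb.TableDefs
        If Len(tdf.Connect) > 0 Then
            tdf.Connect = "MS Access;PWD=" & strPwd & ";DATABASE=" & strBE
            tdf.RefreshLink
        End If
    Next tdf
End Sub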
 

The_Doc_Man

Immoderate Moderator
Staff member
Local time
Today, 15:07
Joined
Feb 28, 2001
Messages
27,340
Just remember that if you use AutoNumber (whether sequential or random), there is a non-zero probability that merely opening the record might start a storage process that consumes one "tick" of the AutoNumber field, particularly if there is a default value associated with the table in question. If you then don't store that, the moving hand has already written and now moves on, to misquote Omar Khayyam.
 
