[Solved] Performance hit after back-end encryption

zeroaccess (Active member)
I added a password to my back end and there is quite a performance penalty while loading forms, etc. And this is with only 1 user for testing (me).

I removed the encryption as the performance was unacceptable.

Is there a better way to protect the back end? Everything is already hidden: special keys, the navigation pane, menus, and toolbars are all disabled. You just get a blank window when you open it, unless you know about the bypass key (if you can even find it). Network path access is already restricted to the team using the program.
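For reference, the bypass-key lockdown is done by setting the database's AllowBypassKey property via DAO. A minimal sketch (run once from a trusted admin copy; pass True to re-enable it for development):

Code:
Public Sub SetBypassKey(ByVal blnAllow As Boolean)
    ' Disable (False) or re-enable (True) the Shift bypass key.
    Dim db As DAO.Database
    Set db = CurrentDb
    On Error Resume Next
    db.Properties("AllowBypassKey") = blnAllow
    If Err.Number = 3270 Then   ' 3270 = property not found, so create it
        ' Final True marks the property DDL so casual users can't change it.
        db.Properties.Append db.CreateProperty("AllowBypassKey", dbBoolean, blnAllow, True)
    End If
    On Error GoTo 0
End Sub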
 

zeroaccess (Active member)
If this is the wrong forum, feel free to move it to General. I guess when I think of back-end databases, I think of tables.
 

The_Doc_Man (Immoderate Moderator, Staff member)
This becomes an issue with data-at-rest vs. data-in-motion. Which do you fear more? A hacker breaking into your back-end file on the server or a hacker tapping your network to watch your files fly past a "sniffer"?

If you can restrict user visibility via firewalls and Windows-level file permissions, that isolation might be enough. If you are worried about network traffic being captured, consider changing the connection to a VPN, which is automatically encrypted but only in motion. It is clear-text on either end.

I certainly cannot tell you what to do because that will depend on your priorities, which I obviously cannot know. But I can tell you that the U.S. Navy didn't encrypt back-end files for non-SECRET databases. Even when a DB was SENSITIVE BUT UNCLASSIFIED (SBU), they relied on firewalls and restrictive permissions with group identifiers to limit data visibility.

They also had special VPNs for their SECRET and higher networks, and these VPNs were never seen from the general Internet. But these VPN routers were speedy little hardware "brokers" that managed encryption and decryption very fast. I never had to actually manage one so I don't know the specific hardware technology, but they WERE available from commercial vendor sources.
 

isladogs (MVP / VIP)
Encrypting the back end should not have a significant effect on performance.
Make sure you have a persistent connection to the back end. That should improve things a lot.

Also, try to avoid opening large datasets when you open a form. Pull only the data you actually require.
There are many other factors to consider in obtaining the best possible performance.
See the two links in the first post of this thread: Performance tips to speed up your Access database
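One simple way to hold that connection in code (if you prefer not to use a hidden form) is to keep a DAO.Database object open for the life of the app. A sketch, with a placeholder path and password:

Code:
' Module-level handle; while it stays open, the connection to the
' back end is held and queries avoid the re-open cost.
Public dbPersist As DAO.Database

Public Sub OpenPersistentConnection()
    ' Path and password are illustrative placeholders.
    Set dbPersist = DBEngine.OpenDatabase( _
        "\\server\share\Backend.accdb", False, False, ";pwd=BackEndPassword")
End Sub

Public Sub ClosePersistentConnection()
    If Not dbPersist Is Nothing Then
        dbPersist.Close
        Set dbPersist = Nothing
    End If
End Sub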
 

zeroaccess (Active member)
The_Doc_Man said:
This becomes an issue with data-at-rest vs. data-in-motion. Which do you fear more? ...
Thanks - this would be data-at-rest. This is a U.S. Government department WAN with 1,000-1,500 users. The directory the database resides in is limited to 30 or so people via Active Directory. So the concern is someone in that group stumbling upon the folder.
 

zeroaccess (Active member)
isladogs said:
Encrypting the back end should not have a significant effect on performance. Make sure you have a persistent connection to the back end. ...
Thanks - I am interested in this persistent connection technique. I will have to do some reading on that.

Regarding datasets, my main form opens from a list using a WHERE condition on the selected ID, so it loads only that one record. For a new record, the form opens WHERE 1=0, a trick I learned from the Northwind sample. Everything is very fast, but only when the back end is not encrypted.
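In code, those two patterns look roughly like this (the form and control names here are made up; mine differ):

Code:
Private Sub lstRecords_DblClick(Cancel As Integer)
    ' Open the detail form filtered to the single record picked in the list.
    DoCmd.OpenForm "frmDetail", acNormal, , "ID = " & Me.lstRecords
End Sub

Private Sub cmdNew_Click()
    ' Northwind's "1=0" trick: a criterion that is never true loads zero
    ' existing rows, so the form opens ready for a new record.
    DoCmd.OpenForm "frmDetail", acNormal, , "1=0"
End Sub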
 

isladogs (MVP / VIP)
To create a persistent connection, create a table with minimal or no data.
For example, I have a table tblKickout with one Boolean field, Kickout, and one record, which I use to manage closing the app for essential maintenance.

When the app is opened, a form bound to that table is opened hidden and remains open until the app is closed.
It serves a dual purpose: it maintains a persistent connection to the BE, and a timer event checks the value of that field every 30 s and starts a forced closedown if Kickout = True.

Using a persistent connection of that type should largely eliminate the issue you describe.
If not, there is another cause to identify and solve.

Those links should provide enough info to solve the issue.
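A minimal sketch of that form's code-behind (the form name frmKickout and the exact timer code are illustrative):

Code:
' Behind a form bound to tblKickout, opened hidden at startup with
' DoCmd.OpenForm "frmKickout", WindowMode:=acHidden

Private Sub Form_Load()
    Me.TimerInterval = 30000   ' check the flag every 30 seconds
End Sub

Private Sub Form_Timer()
    Me.Requery                 ' pick up a change made in the back end
    If Me.Kickout = True Then
        MsgBox "Closing for essential maintenance.", vbExclamation
        DoCmd.Quit acQuitSaveNone
    End If
End Sub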
 

zeroaccess (Active member)
isladogs said:
To create a persistent connection, create a table with minimal or no data. ...
So just having a bound form open is all you need? That should already be happening in my database, since a user's time is spent either in the data entry form or in the record list. Yet in my testing, clicking around, everything had a loading delay. I'm probably missing something.

I suppose I could make a tblHome and set it as the record source of my frmHome just to be sure. That form normally doesn't have a record source, but it is not a screen someone sits on for more than a few seconds anyway. The forms one opens from Home open either as a tabbed document without the tabs showing, perfectly overlaying Home (with the Access ribbon hidden), or as a popup. So Home always stays open, and navigating back to it is really just switching back to that "tab", though you would never know. There should be bound forms open 99.9% of the time.

I have been considering multiple back ends. The "many" table in the one-to-many relationship will grow by ~100,000 records per year. We won't run into size issues for quite some time, but it doesn't hurt to design for the future. If I did that, I'd need to ensure a persistent connection to each database.
 

isladogs (MVP / VIP)
zeroaccess said:
So just having a bound form open is all you need? ...

That's not the same. You need the same table open throughout in order to maintain a persistent connection to the BE.
Swapping from a form to a record list means the connection is dropped and re-opened.
 

zeroaccess (Active member)
isladogs said:
That's not the same. You need the same table open throughout ...
Same table, got it. I will experiment with binding a table to frmHome, as that opens on startup and remains open. Thanks!

Then if I do split the "many" table into its own db, I'll have to think about a hidden form for that.

Or would binding frmHome to a query that includes both tables do the trick?
 

isladogs (MVP / VIP)
Suggest you use the keep it simple, stupid (KISS) mantra.
Use a table with one field and one record, or an empty table, to keep the overheads to a minimum.
Remember that each open connection reduces the number of available connections.
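If it helps, a one-off routine to create such a minimal table could look something like this (a sketch using Access DDL; run it inside the back-end file itself):

Code:
Public Sub CreateKickoutTable()
    ' One Boolean field, one record - the minimum needed for the
    ' persistent-connection form.
    CurrentDb.Execute "CREATE TABLE tblKickout (Kickout YESNO);", dbFailOnError
    CurrentDb.Execute "INSERT INTO tblKickout (Kickout) VALUES (False);", dbFailOnError
End Sub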
 

zeroaccess (Active member)
isladogs said:
Suggest you use the keep it simple, stupid (KISS) mantra. ...
You know, I was just preaching KISS the other day to my team. I like it.

I thought that using the same table(s) that are already open would reduce overhead, but I will take your suggestion. If I split into two back ends, I will use a query that includes a dummy table from each. Do you see any issues with that?
 

isladogs (MVP / VIP)
It sounds OK. Try it and see.
However, multiple back ends can create other issues.
I would recommend you consider moving to a SQL Server BE or similar to manage capacity issues.
Doing so will also increase both stability and security.
 

zeroaccess (Active member)
isladogs said:
It sounds OK. Try it and see. However, multiple back ends can create other issues. ...
Yeah, it is a thought. We certainly wouldn't need it for capacity for a long time (10+ years?), but the prospect of reducing network drive reads/writes is appealing. We'll see how testing goes in the next week or two. Thank you for all your help.
 

The_Doc_Man (Immoderate Moderator, Staff member)
While having an active SQL server on the back end would reduce network drive reads and writes, you can also get improvement by ensuring that the tables are indexed properly for the fields that most often drive your searches. When Access is confronted with a WHERE clause whose criteria include values from an indexed field, it will attempt to use the index to identify the records and bring over only those. You still have to bring in the whole index, but at least you won't have to bring in the whole table too.
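For example, adding an index to a column that regularly appears in WHERE clauses can be done with a bit of DDL (the table and field names below are hypothetical):

Code:
Public Sub AddSearchIndex()
    ' Index a frequently searched column; run against the back end.
    CurrentDb.Execute "CREATE INDEX idxEntryDate ON tblEntries (EntryDate);", dbFailOnError
End Sub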
 

zeroaccess (Active member)
Today I created a tblConnection as the record source for frmConnection, which opens hidden after clicking login. It's blank, and you never know it opened behind frmHome. I may play around with the name(s) in the future.

The database is definitely performing differently now. My record list and popup data entry form are loading faster.

Tomorrow I will attempt encryption again and note the effect on performance with this persistent form in place.

I have also looked at indexing. The majority of my queries open a popup form WHERE ID equals the value of a combo or text box, or similar, so that is already taken care of since primary keys are indexed. I also went ahead and enabled indexing on date fields and a couple of other fields used as query criteria. I haven't gone all out, because I don't want to take it too far, but so far things are running well.
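In case it helps anyone later, the open-hidden call for that kind of connection form is a one-liner (the login button name here is made up):

Code:
Private Sub cmdLogin_Click()
    ' After a successful login, open the connection form hidden; it stays
    ' open, holding the back-end connection, until the app closes.
    DoCmd.OpenForm "frmConnection", WindowMode:=acHidden
    DoCmd.OpenForm "frmHome"
End Sub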
 

The_Doc_Man (Immoderate Moderator, Staff member)
Of course, there is such a thing as too much of a good thing. Indexes make retrieval better but make updates, deletes, and appends worse. So it IS just a big balancing act.
 

zeroaccess (Active member)
The_Doc_Man said:
Of course, there is such a thing as too much of a good thing. ...
Yeah. What about the effect on network drive I/O? Does everyone holding a persistent connection with this method hit the drive harder?
 

The_Doc_Man (Immoderate Moderator, Staff member)
Persistent connections are never zero-cost, but if you aren't running queries through that connection, the only significant cost is a connection "keep-alive", which for standard TCP/IP fires about once every 30 seconds. Barely a blip compared to normal Access traffic.
 

zeroaccess (Active member)
zeroaccess said:
Today I created a tblConnection as the record source for frmConnection... Tomorrow I will attempt encryption again and note the effect on performance with this persistent form in place.
I am pleased to report that the above mitigated the performance hit from encryption.

I made a copy of the front and back ends:

One front end was connected to an encrypted back end.
One was left unencrypted.

Going back and forth between the two versions, I could not tell a meaningful difference in loading times. If there is a difference, it is very small. Neat trick. The next step is to test any compounding effects of adding real users into the mix.
 
