front and back end synchronization

Dear reader,

In my db I import quite a lot of data. It grew too fast, so I had to split it into a back end and a front end, so I could compact and repair the back end after each import without bothering the user. That works well.

However, the data then gets processed, and I use multiple statements to delete temp tables, create temp tables, append data to existing tables, etc. Strange errors occur, like "3107 Record(s) cannot be added; no permission...". It boils down to the front end not knowing that the previous statement created a new table in the back end, and so on. Even refreshing the back end's tables doesn't help.

The .StillExecuting property doesn't work in 2007 either. How can I ensure Access has completed the SQL task in the back end and passed the result on to the front end? At the moment I use a Sleep function, but that can't be the final solution, since I never know on which system the back end will be run.
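For what it's worth, one common alternative to DoCmd.OpenQuery is to run action queries through DAO's Execute method, which blocks until the statement finishes, and then ask Jet to flush its cache with DBEngine.Idle. A minimal sketch; the query name is just a placeholder:

```vba
' Sketch: run an action query synchronously instead of via DoCmd.OpenQuery.
' "qryAppendImport" is a placeholder name for one of your append queries.
Public Sub RunActionQuerySynchronously()
    Dim db As DAO.Database
    Set db = CurrentDb

    ' Execute blocks until the statement completes; dbFailOnError raises
    ' a trappable error instead of failing silently.
    db.Execute "qryAppendImport", dbFailOnError

    ' Give Jet a chance to release locks and refresh its cache
    ' before the next step touches the new data.
    DBEngine.Idle dbRefreshCache

    Debug.Print db.RecordsAffected & " record(s) affected."
End Sub
```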
 
You should not be compacting a database that is currently available to other users.

http://office.microsoft.com/en-au/access/HP051874491033.aspx

"If you are compacting a shared Microsoft Access database (multiuser (shared) database: A database that permits more than one user to access and modify the same set of data at the same time.) that is located on a server or shared folder, make sure that no one else has it open."

You are lucky if you have not already had a disaster. You may have missing records without realising it.

The permission problem is probably something to do with not being able to get exclusive open rights while others are still connected.

You should not be running your data import from in the back end. This should be done from one of the front end databases. Temporary tables should be created in this front end.

The back end should only store data.
 
Don't delete the tables, just delete the data and import by appending.
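That "delete the data, import by appending" approach could look something like this; "tblStaging" and "tblRawImport" are invented names for the example:

```vba
' Sketch: empty a permanent staging table and refill it by appending,
' instead of dropping and recreating the table on every import.
' "tblStaging" and "tblRawImport" are placeholder table names.
Public Sub ReloadStagingTable()
    Dim db As DAO.Database
    Set db = CurrentDb

    ' Clear out last run's rows; the table itself stays in place,
    ' so the front end's link to it never goes stale.
    db.Execute "DELETE FROM tblStaging", dbFailOnError

    ' Refill by appending the freshly imported data.
    db.Execute "INSERT INTO tblStaging SELECT * FROM tblRawImport", dbFailOnError
End Sub
```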
 
Thanks Galaxiom and boblarson,

I first check whether everybody has logged out before compacting.

Not deleting the tables is not an option. I want to be sure all the data provided is imported. After the import I check whether all the columns I expect actually exist; if not, I can give the user a message controlled by me, instead of some user-unfriendly Access error.
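That column check can be done by walking the TableDef's Fields collection; a sketch, with placeholder table and column names:

```vba
' Sketch: verify that an imported table contains an expected column,
' so we can show a friendly message instead of a raw Access error.
Public Function ColumnExists(ByVal tableName As String, _
                             ByVal columnName As String) As Boolean
    Dim fld As DAO.Field
    ColumnExists = False
    For Each fld In CurrentDb.TableDefs(tableName).Fields
        If StrComp(fld.Name, columnName, vbTextCompare) = 0 Then
            ColumnExists = True
            Exit Function
        End If
    Next fld
End Function

' Usage (names are examples):
' If Not ColumnExists("tblImport", "CustomerID") Then
'     MsgBox "The import file is missing the CustomerID column."
' End If
```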

Do you have any other suggestions on my synchronization problems?

Browsing the internet, a lot of people seem to work with Sleep in Access. I guess it is the same problem: Access isn't finished with the data processing, so the code has to wait before executing the next line. I would, however, think a method must be available to check whether Access is done with its data processing.

Kind regards,

Bas Botman
 
Keep the creation and destruction of temporary tables in the front end - append to the real table in the back end when you've done all the checks and are happy it's good data.

Edited to add: Oops - yes, in other words, what Galaxiom already said. If your application needs to routinely create new tables in the back end, you're doing it wrong.
 
I often see users who use temporary tables during imports keeping them as tables native to the front end (not linked). This bloats the front end at a faster rate than one would normally expect.

The preferred solution is to create a local temporary import-dump mdb in the same folder as the front end. Link the dump tables to the front end and import as usual. This prevents the unnecessary bloating of the front end and is totally separate from the front end.
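A sketch of that side-database idea, assuming DAO with the Jet engine; the file and table names are invented, and real code would first remove any existing link with the same name:

```vba
' Sketch: create a throwaway side database next to the front end and
' link a staging table from it into the front end for import work.
' File and table names are invented for the example.
Public Sub CreateImportDump()
    Dim dumpPath As String
    Dim dumpDb As DAO.Database
    Dim db As DAO.Database
    Dim tdf As DAO.TableDef

    dumpPath = CurrentProject.Path & "\ImportDump.mdb"

    ' Start fresh each time: remove any previous dump file.
    If Dir(dumpPath) <> "" Then Kill dumpPath
    Set dumpDb = DBEngine.CreateDatabase(dumpPath, dbLangGeneral)

    ' Create a staging table inside the dump database.
    dumpDb.Execute "CREATE TABLE tblDump (ID LONG, Payload TEXT(255))", dbFailOnError
    dumpDb.Close

    ' Link the staging table into the front end
    ' (drop any existing "tblDump" link first in production code).
    Set db = CurrentDb
    Set tdf = db.CreateTableDef("tblDump")
    tdf.Connect = ";DATABASE=" & dumpPath
    tdf.SourceTableName = "tblDump"
    db.TableDefs.Append tdf
End Sub
```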

David
 

Thanks atomic. That really is the problem. At first I only had a front end, but after importing to a temp table it became too slow, so I had to compact and repair before continuing; that's why I split my app into a front and back end. Now I'm getting the synch problems. Do you see an alternative?
 
Thanks DCrake.

Actually, I will still have the problem of synching between two dbs. The amount of data is too large, I suppose, so the temp (or back end) database is still processing, or hasn't released its tables/queries to the front end, before I reach the next step in the code, and I get all kinds of DAO error messages in the range 3108 to 3112.

Do you have another idea on how to keep the dbs in synch?
 
So what problem are you actually encountering now? Is it a data issue? A coding issue? What?

David
 

Data is processed in the back end using DoCmd.OpenQuery; it runs well, no problems, data OK.

But in the next step of the code, I use the processed data (calling a query or table based on it) and the code gives me e.g. "3112 Record(s) cannot be read; no read permission on 'x'" or "3107 Record(s) cannot be added; no insert permission on 'x'." By the way, I did refresh the QueryDefs and/or TableDefs of the back end.

Stepping through the code in the debugger apparently gives the front end enough time to release object 'x', since manually stepping through the code (after a Resume) never produces the problem. So I started adding delays to the code, using a Sleep function and incrementing the sleep time in an error trap. That works quite well, but I have to check the entire codebase, everywhere I do some data processing in the back end.

Besides, it doesn't feel right; I can't believe this is the solution. I did find .StillExecuting, but it doesn't work in 2007... How can I see in code whether the back end is ready and has passed the info to the front end? The last part is critical as well, because even after a dbBackend.TableDefs.Refresh the front end might still give an error.
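The sleep-and-retry pattern described above can at least be centralised in one wrapper instead of scattered through the code. A sketch, assuming a 32-bit Sleep declaration from kernel32 (matching Access 2007), treating errors 3107-3112 as the transient lock errors mentioned:

```vba
' Sketch: retry an action query with increasing delays when Jet
' reports one of the transient permission/lock errors (3107..3112).
Private Declare Sub Sleep Lib "kernel32" (ByVal ms As Long)

Public Sub ExecuteWithRetry(ByVal sql As String)
    Dim attempt As Long, errNum As Long
    For attempt = 1 To 5
        On Error Resume Next
        CurrentDb.Execute sql, dbFailOnError
        errNum = Err.Number
        On Error GoTo 0
        If errNum = 0 Then Exit Sub
        If errNum < 3107 Or errNum > 3112 Then Err.Raise errNum  ' not a lock error
        DBEngine.Idle dbRefreshCache   ' ask Jet to flush its cache
        Sleep attempt * 250&           ' back off: 250 ms, 500 ms, ...
    Next attempt
    Err.Raise vbObjectError + 513, , "Still locked after retries: " & sql
End Sub
```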
 

I like DCrake's suggestion of using a separate db local to the front end, but if this also is causing you problems due to the volume of data being piped between the two, how about:

Use a disposable front end.

If you have any tables containing local user configuration settings etc. in your front end, split *them* out into a separate DB, local to the front end.

Then compact and repair the front end, and store a copy of it somewhere.

Then do all your temporary stuff right in the front end, and when the front end gets too bloated, throw it away and replace with a fresh copy.
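The "throw it away and replace" step might look something like this at startup; the paths, size limit, and launcher mechanism are all placeholders (a running mdb cannot overwrite itself, so the actual copy has to happen from outside Access):

```vba
' Sketch: detect a bloated front end and trigger replacement with a
' pristine master copy. Paths are invented for the example.
Public Sub RefreshFrontEndIfBloated()
    Const SIZE_LIMIT As Long = 50& * 1024& * 1024&   ' e.g. 50 MB

    If FileLen(CurrentProject.FullName) > SIZE_LIMIT Then
        ' A running mdb cannot overwrite itself, so a small launcher
        ' (batch file or a second mdb) would do the copy after quitting:
        '   copy /Y "\\server\share\FrontEndMaster.mdb" "C:\Apps\FrontEnd.mdb"
        MsgBox "The front end has grown past the limit; it will be replaced " & _
               "with a fresh copy on next start."
        Application.Quit
    End If
End Sub
```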
 

In the end I skipped the whole concept of a back-end and front-end database!
I just monitor the size of the database and ask the user to log out and compact it, giving the user a form with all the users logged on, so he can ask his colleagues to log out as well.

Not the most elegant solution, but Access isn't either.
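For reference, the "who is still logged on" part can be read from Jet's user roster via ADO (the schema GUID comes from Microsoft KB 198755); a sketch, with the back-end path as a parameter:

```vba
' Sketch: list machines connected to a Jet database via the
' provider-specific "UserRoster" schema (Microsoft KB 198755).
' Uses late-bound ADO, so no extra reference is needed.
Public Sub ShowConnectedUsers(ByVal dbPath As String)
    Dim cn As Object, rs As Object
    Set cn = CreateObject("ADODB.Connection")
    cn.Open "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" & dbPath

    ' -1 = adSchemaProviderSpecific; the GUID selects the user roster.
    Set rs = cn.OpenSchema(-1, , "{947bb102-5d43-11d1-bdbf-00c04fb92675}")
    Do While Not rs.EOF
        Debug.Print rs.Fields("COMPUTER_NAME").Value
        rs.MoveNext
    Loop
    cn.Close
End Sub
```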
 

I hope you are ready for the corruption that will be occurring when you run a single database file with multiple users. Make sure backups are kept regularly, because you are playing Russian roulette with a non-split database and multiple users on a network.
 
