Problem with linked table

setis

Dear all,

I have a DB with a linked Excel spreadsheet. It is used to auto-populate different fields on a form.

The database is split, and when more than one user is working on the same form, we get the error:

"The MS Access database engine cannot open or write to the file G:\myfile.xlsx. It is already opened exclusively by another user, or you need permission to view and write its data."

Could anybody help me to solve this?

Edit: When I try to set the Excel document as "shared", I get the error in Excel: "This workbook cannot be shared because it contains Excel tables or XML maps. To share this workbook, convert the tables to ranges and remove the XML maps."


Thanks!
 
Do not use letter-assigned drive paths like G:\myfile.xlsx.
(now you see why)
Instead, use the full UNC path: \\server\folder\folder\myFile.xlsx
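
If you want to repoint the existing link without recreating it, something like this air-code sketch would do it (the table name tblExcelData is a placeholder for your own linked table, and the connect string assumes a normal .xlsx link with a header row):

Code:
Sub RelinkExcelToUNC()
    Dim tdf As DAO.TableDef
    Set tdf = CurrentDb.TableDefs("tblExcelData")  ' your linked table's name here
    ' Swap the mapped-drive path for the UNC path in the connect string
    tdf.Connect = "Excel 12.0 Xml;HDR=YES;" & _
                  "DATABASE=\\server\folder\folder\myFile.xlsx"
    tdf.RefreshLink
End Sub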
 
You cannot directly write to a linked Excel file. You need Excel automation to do it.
 

Thanks, I'm not trying to write to it. It is used on a read-only basis to auto-populate a couple of fields in a form.
 
Access does not play well with others and it will not share custody of any linked files.

You might have to create a separate database where you can temporarily import the spreadsheet so Access only has it open for a minimum amount of time to reduce contention.
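
As a rough sketch (untested; the table name and path are placeholders), the import itself is a single TransferSpreadsheet call, so the file is only held open while it runs:

Code:
Sub ImportSpreadsheet()
    ' Pull the sheet into a local table; Access only touches the
    ' .xlsx for the duration of this call
    DoCmd.TransferSpreadsheet acImport, acSpreadsheetTypeExcel12Xml, _
        "tblImport", "\\server\folder\folder\myFile.xlsx", True  ' True = first row has headers
End Sub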
 
I would like to ask for a suggestion on which is the most appropriate way to do this.

I have a dynamic Excel sheet that is now approx. 36,000 rows.
I need to auto-populate some fields in a form using this sheet.

Would the right way to do this be to import the sheet into a table on another Access DB and link it to this one?

I am afraid that if I import the table into my database, it would make it too heavy.

What about updating this table in Access? What is the appropriate way to update such a large table?
 
36,000 rows is not a big table.
Importing it locally will not hurt the size of your db greatly at all.

If it is a repetitive task, make sure you delete the records from your import table rather than deleting and re-creating the table each time, as re-creating it will cause database bloat unless you can compact and repair periodically.
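
Something along these lines (air code; tblImport and the path are placeholders):

Code:
Sub RefreshImport()
    ' Empty the table rather than dropping and re-creating it
    CurrentDb.Execute "DELETE FROM tblImport", dbFailOnError
    ' TransferSpreadsheet appends when the target table already exists
    DoCmd.TransferSpreadsheet acImport, acSpreadsheetTypeExcel12Xml, _
        "tblImport", "\\server\folder\folder\myFile.xlsx", True
End Sub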
 
Would the right way to do this be to import the sheet into a table on another Access DB and link it to this one?
I believe that is what I suggested. But how you do this depends on the situation, which is why I didn't go into detail.

How frequently do you need to refresh the spreadsheet data?
Will multiple people be using the same refreshed data or will each attempt to do his own import?
Do you need the refresh to be automatic or will the users control it?
 

* I would have to refresh it probably every day or every other day. It would be around 70 new rows every day.
* The idea is that multiple users will be using the same refreshed data, so it will have to be imported into the back end if I'm moving away from the linked table.
* If the refresh happens automatically, great; if it's user-activated, that's fine too.

Thanks for your suggestions!
 

When you import the new data to the BE, check whether the datatypes are correctly read by Access.
If there is ANY possibility of confusion, I suggest importing to a 'buffer' table first, then using code to convert that data to the correct datatypes in the final table, e.g. CStr, CLng, CDate.
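
For example (air code; the field names and the CLng/CDate/CStr choices are placeholders for your actual columns and types):

Code:
Sub BufferToFinal()
    ' Force each column to the intended datatype on the way from
    ' the buffer table into the final table; the conversion
    ' functions work inside Access SQL
    CurrentDb.Execute _
        "INSERT INTO tblFinal (CaseNumber, CaseDate, Notes) " & _
        "SELECT CLng([CaseNumber]), CDate([CaseDate]), CStr([Notes]) " & _
        "FROM tblBuffer", dbFailOnError
End Sub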
 
36,000 rows is not a big table.
Importing it locally will not hurt the size of your db greatly at all.

If it is a repetitive task, make sure you delete the records from your import table rather than deleting and re-creating the table each time, as re-creating it will cause database bloat unless you can compact and repair periodically.

Sorry for reopening this again, but I need to be sure that I understand correctly.

What you are suggesting is that after importing the table from Excel (Tb1), when needing an update, I create a new updated table (Tb2) and append the records to the first one (Tb1). After that, I should delete all the records from Tb2. Would that be the right way to do it?
 
It depends on whether you can link to the updated information with the right data types.
If you can link to it, then you don't need to import it to a staging table; you can simply append from the linked sheet.

If not, then yes, the second table and the delete will be required.
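
In that case the append is a one-liner (sketch only; xlSheet stands in for your linked table's name and assumes its columns line up with Tb1's):

Code:
Sub AppendFromLinkedSheet()
    ' Append straight from the linked sheet, no staging table
    CurrentDb.Execute "INSERT INTO Tb1 SELECT * FROM xlSheet", dbFailOnError
End Sub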
 
How can I make sure that when I do the query append, I add only non-existing records?
I have a "CaseNumber" field that should be unique.
 
Use a LEFT JOIN and set the case number criteria to Is Null in the target table.
 
Use an unmatched query - check the wizard for this

Link the source & destination tables using a LEFT JOIN
Set the linked field criteria for the destination table to Is Null
Change from select to append query

EDIT - Hi Minty - I'm not going to challenge you to a duel as you're quicker on the draw!
 

I did the unmatched query and it finds the new records. Perfect!

The results of the query are perfect and no columns are duplicated

When I change from select to append and I try to run it, I get the error "Duplicate output destination 'CaseNumber'"

In the query details I have:

Field: CaseNumber
Table: Table1
Append To: CaseNumber (this is in Table2)

and at the end:
Field: CaseNumber
Table: Table2
Append To: CaseNumber
Criteria: Is Null

What is causing the error?
 
Clear the field name from the Append To line in the field with criteria Is Null (Table 2)
 

When I do this, the "Append To" entry and the "Is Null" criteria disappear as well, and it attempts to add all the records, not only the new ones.

Edit: It seems to work when I create a query from the unmatched query and append this one. Thanks a lot for the help!
 
You've misunderstood what I meant.
Keep the Is Null criteria.
Just blank the Append To entry for that field:

Code:
Field: CaseNumber
Table: Table1
Append To: CaseNumber (this is in Table2)

and at the end:
Field: CaseNumber
Table: Table2
Append To: 'leave this box blank
Criteria: Is Null

Code:
INSERT INTO Table2 ( CaseNumber )
SELECT Table1.CaseNumber
FROM Table1 LEFT JOIN Table2 ON Table1.CaseNumber = Table2.CaseNumber
WHERE (((Table2.CaseNumber) Is Null));



Not sure what you meant with your Edit line:
Edit: It seems to work when I create a query from the unmatched query and append this one.
 

