Is there any limit on the number of records?

setis

I have an Excel file linked to my database to auto-populate several fields.
This Excel file is dynamic and gets auto-updated from our Navision system.

The Excel file has around 31,500 rows, and we have found that the visibility of ALL records is intermittent; I could see some records that my colleague could not, and when I closed and reopened the database, the same row wasn't visible to me anymore.

Does anyone have experience managing a large number of records like this, and is there a more stable way to access all these records coming from Navision (MS CRM) than a linked dynamic Excel sheet?

Thanks in advance
 
Make sure your PC has LOTS of memory.
 

Thanks a lot for this
 
Sounds like you are linking to Excel from Access. The fact that the Excel originated as an export from Navision (whatever that is) is irrelevant.

If people are using TransferSpreadsheet to link to or import data from Excel, make sure they are selecting the correct Excel version. acSpreadsheetTypeExcel12Xml is the version you need to select if you are importing .xlsx files.

EDIT -- I forgot to mention that using the correct SpreadsheetType will give you all the rows in the spreadsheet. If you use an older type, you will be limited to whatever rowcount that version of Excel supported.
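For anyone following along, a minimal sketch of that call in VBA - the file path and table name are placeholders for this example, not the actual Navision export:

Code:
' Link (or import) an .xlsx workbook using the Excel 2007+ format constant.
' Swap acLink for acImport if you want a local copy instead of a linked table.
Sub LinkNavisionSheet()
    DoCmd.TransferSpreadsheet _
        TransferType:=acLink, _
        SpreadsheetType:=acSpreadsheetTypeExcel12Xml, _
        TableName:="tblNavisionCases", _
        FileName:="C:\Data\NavisionExport.xlsx", _
        HasFieldNames:=True
End Sub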


I've looked at the TransferSpreadsheet method and I'm willing to try that.
The thing is that this big table with >32K rows is now a linked table, and the weird thing is that when I open the Excel file itself, all the records are there, but when I open the linked table from Access, a bunch of them are missing.

Would it be better to transfer the spreadsheet to a table and then append the new records when the original one gets updated?
 
From the links I provided, I believe you can link directly to the original tables/server.

No need to involve Excel from what I can see.
 

Yes, it would be better to import the Excel file to an Access table, particularly if there are several users and a shared back-end db.
Performance should be better; plus, if needed, the table will be editable in Access, whereas the Excel file cannot be edited from Access.

You could set up a routine to run the import automatically at times of your choice, e.g. overnight.
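As a rough illustration of that kind of routine (table name and file path are just examples, assuming the workbook's first row holds the field names):

Code:
' Full refresh: clear the local table, then re-import the workbook.
' tblCases and the path are placeholder names for this sketch.
Sub RefreshCasesFromExcel()
    CurrentDb.Execute "DELETE FROM tblCases", dbFailOnError
    DoCmd.TransferSpreadsheet acImport, acSpreadsheetTypeExcel12Xml, _
        "tblCases", "C:\Data\NavisionExport.xlsx", True
End Sub

A routine like that can be run from a button or macro, or scheduled as discussed below.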
 

Thanks a lot for your answer. Could you please point me to an information source for setting this up? This would be great!
 
The problem that you will be facing when importing the Excel file to a table is identifying which records already exist in the imported table when it's time to re-import again, since the workbook is dynamic - there may be deletions and additions of rows in the workbook.
 

Yes, that's why linking the dynamic Excel sheet sounded like the right thing to do here.
I can imagine that setting up a process to append new records to an Access table from a spreadsheet that is continuously updating can be a nightmare.
 
I reference an Excel workbook in one of my databases, and the workbook is then read-only to Excel users. Fortunately that is only me, but it's something to be aware of.

 

Out at the moment and answering on my phone. If you create a routine to import the Excel data to an Access table, I'll reply again this evening telling you how to automate it using Task Scheduler.


 

No, it doesn't have to be a nightmare.
You need 3 queries, or the equivalent in VBA (rough sketch at the end of this post):
1. Append new records where these don't already exist.
2. Update existing records where changes have occurred.
3. Delete existing records which aren't in the Excel import, OR mark them as inactive if preferred.

If you want me to explain how to do this automatically using Task Scheduler, let me know.
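To make that concrete, here is a rough sketch of the three queries run from VBA, assuming the Excel data has already been imported into a local buffer table called tblExcelImport, the destination table is tblCases, and CaseNumber is the unique matching field - all of these names are placeholders:

Code:
' Sketch only: synchronise tblCases with the freshly imported tblExcelImport,
' matching on a unique CaseNumber field. Adjust table/field names to suit.
Sub SyncCasesFromImport()
    With CurrentDb
        ' 1. Append new records that don't exist yet (unmatched on CaseNumber)
        .Execute "INSERT INTO tblCases (CaseNumber, CustomerName, Status) " & _
                 "SELECT i.CaseNumber, i.CustomerName, i.Status " & _
                 "FROM tblExcelImport AS i LEFT JOIN tblCases AS c " & _
                 "ON i.CaseNumber = c.CaseNumber " & _
                 "WHERE c.CaseNumber IS NULL", dbFailOnError
        ' 2. Update existing records with the latest values
        .Execute "UPDATE tblCases AS c INNER JOIN tblExcelImport AS i " & _
                 "ON c.CaseNumber = i.CaseNumber " & _
                 "SET c.CustomerName = i.CustomerName, c.Status = i.Status", dbFailOnError
        ' 3. Delete records no longer present in the import
        '    (or run an UPDATE to flag them as inactive instead)
        .Execute "DELETE FROM tblCases WHERE CaseNumber NOT IN " & _
                 "(SELECT CaseNumber FROM tblExcelImport)", dbFailOnError
    End With
End Sub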
 
Please tell us: how will you manage duplication, deletion and insertion of rows in Excel? A PK across all fields in the table will not work if identical rows are entered in two rows of the Excel sheet.
 
Basically you need a unique field in the Excel table so you aren't relying on an autonumber PK field.

For example, with my schools databases, I run this process overnight, every night. I use the unique pupil number (UPN) as the reference field.

I extract 30 CSV files remotely from the schools management system database and import each of these to temporary 'buffer' tables so I can modify them where necessary.
I then run procedures to do the append/update/delete on each file.

The whole process is done at a set time overnight using Task Scheduler.
For simplicity, I have a separate utility database to run this process.
It is linked to both the CSV files (or Excel files in this case) and the destination db.
When it runs, it first locks the main database to prevent possible data conflicts, and when complete it unlocks it again.
It also creates a log file of each part of the process and emails that to a designated email address when complete.
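For reference, the Task Scheduler side is typically just a daily task that starts Access with the /x command-line switch to run a macro in the utility database. The paths and macro name below are examples only; the macro just needs to run the import code and then quit Access:

Code:
Task Scheduler > Create Task > Actions > New:
  Program/script:  "C:\Program Files\Microsoft Office\root\Office16\MSACCESS.EXE"
  Add arguments:   "C:\ImportUtility\ImportUtility.accdb" /x mcrNightlyImport
Trigger: Daily, e.g. 02:00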
 

I would actually be more than grateful if you could explain to me how to do this. There are no PKs in the dynamic spreadsheet, since it is only used as a lookup to bring a "Case Number" field onto a form and auto-populate some other fields on the form.

Thanks so much!
 
Before throwing myself into Ridder's solution, I tried to do a workaround with my linked dynamic spreadsheet.

One problem that I had with Microsoft CRM was that I was only allowed to export 10,000 records at a time. What I had to do was export 6 months at a time from the selected date and paste the results into one spreadsheet. The last bit was a dynamic spreadsheet exporting the cases created on or after 25/10/2017.
Then I pasted all the results from the static worksheet into the dynamic one, and this actually works great in Excel. The spreadsheet refreshes correctly and the new cases appear there.

Linking this spreadsheet in Access is problematic because I can see all records except those created on or after 25/10/2017. I still don't get why, since I can see them in Excel.

Using the information here (second option) https://dynamicscrmpros.com/export-limitation-microsoft-dynamics-crm-microsoft-excel/ I could bypass the 10,000-record export limit and create a dynamic spreadsheet with all 32K records, BUT linking this one in Access shows me 0 records. It shows the records in the count below, but the rows are empty.
There must be some kind of problem linking to dynamic spreadsheets like this.
 
OK, here's a summary of how I do this - it's been in place for well over 10 years and runs perfectly 'unattended', except for the rare occasion where the network connection is lost.

As I said before...

Basically you need a unique field in the Excel table so you aren't relying on an autonumber PK field.

For example, with my schools databases, I run this process overnight, every night. I use the unique pupil number (UPN) as the reference field.

I extract 30 CSV files remotely from the schools management system database and import each of these to temporary 'buffer' tables so I can modify them where necessary.
I then run procedures to do the append/update/delete on each file.

This is done to update staff, student, timetable and other whole-school data remotely. It will take a while, depending on the size of the school.

For a large school of 1500+ students, the individual CSV files can be very large. For example, the attendance mark files are split into year groups to make them more manageable - each CSV file is over 10 MB and approx. 150,000 rows.

Attached is a zip file containing the following items:
1. How SDA Link works - that's the utility used to update the main database (SDA).
2. How to use SDALink with Task Scheduler.
3. Two random example procedures as text files:
- AppendStudentExclusions - imports the CSV file, saves to the 'buffer table' ImportedStudentExclusions and does some data manipulation
- UpdateTimetable - transfers the data already saved to the ImportedTimetable buffer table and saves it to the final table
In most procedures I use SQL code or recordsets, but for simplicity I've chosen two procedures that use queries so it's easier to understand. Obviously there are lots of items specific to my purposes that you can ignore in these examples.
4. Example of the log file produced when each procedure runs - this is emailed automatically when the process is complete.

Hopefully the idea will be clear enough in conjunction with my earlier comments.

To reiterate, you MUST have a unique field on each record of your Excel spreadsheet so you can determine whether:
a) the record has already been imported (in which case update it)
b) it's a new record (in which case append it using an unmatched query)
c) an existing record is not in the latest spreadsheet (in which case delete it or archive it)

Hopefully you can adapt this to your needs.
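For completeness, here is a generic outline (not the attached SDALink code) of how the nightly routine in the utility database might hang together - the table names, paths and the SyncCasesFromImport call refer back to the placeholder sketches earlier in the thread:

Code:
' Generic outline only: refresh the buffer table from the workbook, run the
' append/update/delete routine, and write a simple log of the outcome.
Sub RunNightlyImport()
    Dim logFile As Integer
    logFile = FreeFile
    Open "C:\ImportUtility\ImportLog.txt" For Append As #logFile
    Print #logFile, Now & "  Import started"

    On Error GoTo Fail
    ' Refresh the buffer table from the Excel workbook
    CurrentDb.Execute "DELETE FROM tblExcelImport", dbFailOnError
    DoCmd.TransferSpreadsheet acImport, acSpreadsheetTypeExcel12Xml, _
        "tblExcelImport", "C:\Data\NavisionExport.xlsx", True

    ' Append / update / delete against the destination table
    SyncCasesFromImport

    Print #logFile, Now & "  Import completed OK"
    Close #logFile
    Exit Sub

Fail:
    Print #logFile, Now & "  FAILED: " & Err.Description
    Close #logFile
End Sub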
 
