sharepoint forum?

Isaac

Lifelong Learner
Local time
Today, 01:25
Joined
Mar 14, 2017
Messages
8,922
So where does Access come into play in there? Just curious...
You can use VBA to do everything I mentioned
 

theDBguy

I’m here to help
Staff member
Local time
Today, 01:25
Joined
Oct 29, 2018
Messages
21,581
You can use VBA to do everything I mentioned
Where I am lost is why even use VBA? If you map a drive to the Doc Library, then why can't the user simply open that folder and then open the files in it? I'm not sure the OP explained the purpose of using Access yet.
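For what it's worth, once the library is mapped, opening a document is a one-liner from VBA anyway. A minimal sketch, assuming the Doc Library is mapped to S: (the drive letter, folder, and file name here are hypothetical):

Code:
Public Sub OpenLibraryFile()
    ' S:\ is assumed to be the drive mapped to the SharePoint Doc Library;
    ' the folder and file names are placeholders.
    Application.FollowHyperlink "S:\Shared Documents\Report.xlsx"
End Sub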
 

tmaccabe

New member
Local time
Today, 01:25
Joined
Oct 13, 2023
Messages
26
Beware of jumping on the Dataverse bandwagon for Access. All I'm saying is make sure you know what you're getting into. In my experience, DV tables are not as performant as, oh, say, Azure SQL or SQL Server, or even SharePoint lists.

They do offer one advantage. When used with PowerApps, they can be configured to support offline syncing. That is, if you lose your internet connection, PowerApps caches data on the device running the app pending a reconnect to the internet. I don't think that's supported yet when linked to Access, though.

Just be aware of what you are getting into before committing to Dataverse as your backend.
Speaking of performance, what is it like to query a SharePoint list of 100,000 rows? I guess it's fine as long as your cache is up to date? I imagine loading the cache is the long part. How does this differ when linking to DV? Is caching the same? Are queries processed locally, or can you do pass-through queries?

I'd love to see a matrix or chart showing what to expect from these two options given various scenarios. I would like to know the functional (truly usable) limits of integrating SharePoint and Access, and strategies for pushing those limits. I imagine one would be to limit how often you make table/list design changes (so that frequent re-linking can be avoided).
 

GPGeorge

George Hepworth
Local time
Today, 01:25
Joined
Nov 25, 2004
Messages
2,042
Speaking of performance, what is it like to query a SharePoint list of 100,000 rows? I guess it's fine as long as your cache is up to date? I imagine loading the cache is the long part. How does this differ when linking to DV? Is caching the same? Are queries processed locally, or can you do pass-through queries?

I'd love to see a matrix or chart showing what to expect from these two options given various scenarios. I would like to know the functional (truly usable) limits of integrating SharePoint and Access, and strategies for pushing those limits. I imagine one would be to limit how often you make table/list design changes (so that frequent re-linking can be avoided).
I don't know anyone who has ever put 100,000 records in a SharePoint list. Maybe someone has, though.

I don't know the internals of caching in either environment.

There is no such thing as a pass-through query to SharePoint; it's not a database and lacks such tools.
 

NauticalGent

Ignore List Poster Boy
Local time
Today, 04:25
Joined
Apr 27, 2015
Messages
6,412
I don't know anyone who has ever put 100,000 records in a SharePoint list. Maybe someone has, though.
I have personally used Lists as large as 15k rows. Depending on what is being done, it can really bog down. I was, however, able to mitigate this somewhat by using Views, but eventually the performance got so terrible we went to SQL Server and life got real good, real fast.
 

GPGeorge

George Hepworth
Local time
Today, 01:25
Joined
Nov 25, 2004
Messages
2,042
I have personally used Lists as large as 15k rows. Depending on what is being done, it can really bog down. I was, however, able to mitigate this somewhat by using Views, but eventually the performance got so terrible we went to SQL Server and life got real good, real fast.
Back in the days when I was gainfully employed, I worked at a company that used linked SharePoint lists in one application. IIRC, the Office version deployed was either 2007 or 2010, probably 2007. One person had to manage a process with up to 25K records in the SP list. Whatever she was doing with those records required looping through them in chunks of 2500 or 5000 at a time to avoid locking up the computer. Although I'm sure performance is better now than it was then, I would not expect it to be all that much better. Her desk was about 15 feet from mine. That's as close as I ever got to working with SP lists, and it convinced me I didn't want to do it. But, again, maybe it's better these days.
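To make that idea concrete, here's a rough sketch of that kind of chunked loop using DAO against a linked list. The table and field names (tblSPItems, Status) are made up, and the real per-record work would replace the placeholder edit:

Code:
Public Sub ProcessInChunks()
    ' Sketch of chunked processing over a linked SharePoint list.
    ' Yielding every CHUNK_SIZE records keeps Access responsive
    ' instead of locking up the machine for the whole run.
    Const CHUNK_SIZE As Long = 2500
    Dim db As DAO.Database
    Dim rs As DAO.Recordset
    Dim n As Long

    Set db = CurrentDb
    Set rs = db.OpenRecordset("SELECT ID, Status FROM tblSPItems", dbOpenDynaset)

    Do Until rs.EOF
        rs.Edit
        rs!Status = "Processed"   ' placeholder for the real per-record work
        rs.Update
        n = n + 1
        If n Mod CHUNK_SIZE = 0 Then DoEvents   ' yield to the UI between chunks
        rs.MoveNext
    Loop

    rs.Close
    Set rs = Nothing
    Set db = Nothing
End Sub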
 

tmaccabe

New member
Local time
Today, 01:25
Joined
Oct 13, 2023
Messages
26
I'm working with a couple of lists now: one has 10k rows, another 14k. Once you're in and the cache refreshes, it's smooth sailing unless you are doing an update query impacting many rows. Bound form operations work just fine. Granted, I don't have more than a couple dozen users.

It's important to remember that updating many rows means not only running a long query against SharePoint, but also that those same updates must be downloaded into each user's cache.

I've learned to version not only my front-end, but also my back-end. I use a version table to determine whether there have been design changes to the lists and call CurrentDb.TableDefs("tbl_name").RefreshLink only as needed. And I only force people to get a new front-end when the current code/query/form/report changes are critical. I expect these practices to reduce delays for users and reduce inadvertent loss of their updates as design changes occur.
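A rough sketch of that version check, assuming a linked tblVersion list with a single SchemaVersion row (the table name, field name, and version constant are all hypothetical):

Code:
Public Sub RefreshLinksIfNeeded()
    ' Compare the schema version this front-end was built against with
    ' the version recorded in the back-end, and refresh links only when
    ' the back-end design has actually changed.
    Const LOCAL_SCHEMA_VERSION As Long = 7   ' version this FE expects
    Dim serverVersion As Long
    Dim tdf As DAO.TableDef

    serverVersion = Nz(DLookup("SchemaVersion", "tblVersion"), 0)

    If serverVersion > LOCAL_SCHEMA_VERSION Then
        For Each tdf In CurrentDb.TableDefs
            If Len(tdf.Connect) > 0 Then   ' linked tables only
                tdf.RefreshLink             ' pick up the new list design
            End If
        Next tdf
    End If
End Sub

Refreshing only on a version bump avoids paying the relink delay at every startup.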
 
