toddbailey
Registered User.
Hi All,
Is there a way to emulate a stored procedure in an Excel-to-Access link?
Allow me to expand on this:
I have an Excel sheet with VBA code that does a lot of data processing and generates a lot of network traffic because of the volume of data involved. Once the datasets are ready, half a dozen worksheet tabs are populated.
Processing time can run from several minutes to an hour.
Once the data is prepared, loading it into Excel takes only a minute or two.
I need to rethink the data prep stage.
If I were using SQL Server or any other industrial-strength database engine, I'd put the processing logic on the database server and leave the front end to retrieve only the final datasets for display.
I was wondering if I could follow a similar design with Access: have Excel call a series of VBA modules, perhaps passing parameters directly or storing them in a table, and have those modules execute on the server that houses the Access database file. Or is Access just a file, with all the processing done on the machine that opens it?
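For what it's worth, this is roughly the kind of call I was picturing from the Excel side; the UNC path and the RunDataPrep procedure name are just placeholders. My understanding is that even this still executes on the machine running Excel, since Jet/ACE is a file-based engine rather than a server process.

Sub RunPrepOnMaster()
    ' Excel VBA: open the Access file by automation and run a public procedure stored in it.
    ' The path and procedure name below are made up for illustration.
    Dim accApp As Object                      ' late-bound Access.Application
    Set accApp = CreateObject("Access.Application")
    accApp.OpenCurrentDatabase "\\server\share\Master.accdb"
    accApp.Run "RunDataPrep", 24              ' optional parameter, e.g. hours of history
    accApp.CloseCurrentDatabase
    accApp.Quit
    Set accApp = Nothing
End Sub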
I'm looking for ways to minimize network traffic and all the issues related to it.
Ideas so far:
1. Copy the entire Access file off the network and store it on the local machine, so that all database I/O is done locally.
The problem with this method is that the master database file is over 500 MB.
2. A local database containing only the required tables. When the user hits the Go button, the code would query the local database, and if it is more than x hours out of date, the tables would be re-imported from the master (sketched below).
While this idea makes the most sense to me, I'm not sure there is a way to have Excel do a table import into Access other than a DELETE * FROM table (or DROP TABLE) followed by a series of SELECT INTO statements.
There are about two dozen tables containing a few hundred thousand records, so a lot of uncompressed data would be brought in. Yes, I can use WHERE clauses to filter the data, but then there is the added processing cost of the filtering.
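To make idea 2 concrete, this is roughly what I mean by re-importing one table from the master into a local cache file. It's just the drop-and-SELECT INTO approach described above, driven from Excel with DAO; the paths and table name are placeholders, and it assumes the Access Database Engine (ACE) is installed on the machine.

Sub RefreshLocalTable()
    ' Excel VBA sketch: rebuild one cached table from the master file (paths/names are placeholders).
    Dim db As Object                      ' late-bound DAO.Database
    Set db = CreateObject("DAO.DBEngine.120").OpenDatabase("C:\Data\LocalCache.accdb")
    On Error Resume Next                  ' the cached table may not exist yet
    db.Execute "DROP TABLE tblOrders"
    On Error GoTo 0
    ' Pull a fresh copy straight from the master file; 128 = dbFailOnError
    db.Execute "SELECT * INTO tblOrders FROM tblOrders IN '\\server\share\Master.accdb'", 128
    db.Close
    Set db = Nothing
End Sub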
So the questions are:
Can I run VBA modules from machine A against a database on server B, and where does the code actually execute?
Or do I need the database locally to eliminate network traffic?
The easy answer is to move the database to SQL Server and use stored procedures. That way I could make a simple ADO call to execute the stored procedure with a few parameters and avoid hitting the user's machine with unnecessary processing load and traffic. Once the data is prepared, an email could automatically be sent to the user, who could then open Excel and import only the datasets needed.
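For comparison, the ADO call against SQL Server that I have in mind would look something like this; the server, database, procedure, and parameter names are all invented.

Sub RunServerSideProc()
    ' Excel VBA sketch: late-bound ADO call to a hypothetical SQL Server stored procedure.
    Dim cn As Object, cmd As Object
    Set cn = CreateObject("ADODB.Connection")
    cn.Open "Provider=SQLOLEDB;Data Source=MyServer;Initial Catalog=MyDb;Integrated Security=SSPI;"
    Set cmd = CreateObject("ADODB.Command")
    With cmd
        Set .ActiveConnection = cn
        .CommandType = 4                              ' adCmdStoredProc
        .CommandText = "usp_PrepareDataSets"          ' made-up procedure name
        .Parameters.Append .CreateParameter("@HoursBack", 3, 1, , 24)  ' adInteger, adParamInput
        .CommandTimeout = 3600                        ' allow for a long-running prep job
        .Execute
    End With
    cn.Close
    Set cn = Nothing
End Sub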
Thanks