Database entries are too slow

asingh
Joined Jul 14, 2005
I have made an Excel worksheet [for around 18 users] which sends data to a single database Excel sheet using ADO. The problem is that after around 7000 records have been saved to the database sheet, database entries [users trying to send data to the database sheet] take around 7-8 seconds per record. Is there any way to speed this up?

thanks
asingh
 
Sending data from Excel to Excel, or from Excel to Access?

If sending data to another spreadsheet, try changing the Excel application calculation method from xlCalculationAutomatic to xlCalculationManual while processing. When you are finished, turn the calculation back to xlCalculationAutomatic. Also turn the Excel Application ScreenUpdating to False while you are processing the files.
 
I am sending data from Excel to Excel on a shared drive. The database sheet is a closed workbook, with around 17 users sending data to that one sheet using SQL over ADO. I am not using either of the commands mentioned (xlCalculationAutomatic or xlCalculationManual) in my code...!
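For reference, a minimal sketch of the kind of ADO connection to a closed workbook being described here; the UNC path, sheet name, and column are made-up placeholders, not the actual files from this thread:

```vba
' Sketch (assumed setup): ADO against a closed workbook via the Jet OLEDB provider.
' The path, sheet name, and value below are placeholders.
Dim cnn As Object
Set cnn = CreateObject("ADODB.Connection")
cnn.Open "Provider=Microsoft.Jet.OLEDB.4.0;" & _
         "Data Source=\\Server\Share\MasterData.xls;" & _
         "Extended Properties=""Excel 8.0;HDR=Yes"""
' [userdata$] addresses the worksheet named userdata as if it were a table
cnn.Execute "INSERT INTO [userdata$] ([PrimaryKey]) VALUES ('X001')"
cnn.Close
Set cnn = Nothing
```

Note that with this kind of connection the workbook is never opened in the Excel UI, which is why the Calculation and ScreenUpdating settings have no effect on it.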
 
If you are sending data from Excel to Excel, you have the ability to access the Excel application model.

If you turn off the Screen Updating and Auto Calculation of Excel, your processing should occur faster.

I'm not entirely sure how you are using ADO if you are passing data from Excel to Excel. There doesn't appear to be a Microsoft Access object involved. Am I misunderstanding something? :confused:
 
I sense doubt from you. ;)

Consider the following...

Code:
Public Sub Test()
On Error GoTo Err_Test

    Dim xlApp As New Excel.Application

    With xlApp
        .Workbooks.Open Filename:="C:\Temp\MasterSummary.xls", _
            AddToMRU:=False
        .ScreenUpdating = False
        .Calculation = xlCalculationManual
        '-----------------------------------
        ' Gather all the data from the subordinate workbooks
        ' Update the Master Summary file
        '-----------------------------------
        .Calculation = xlCalculationAutomatic
        .ActiveWorkbook.Save
        .ScreenUpdating = True
    End With

Exit_Test:
    'Just because I'm worried that the routine might fail unexpectedly,
    'I would call a routine here.  If xlApp is not Nothing, I would reset the
    '.Calculation = xlCalculationAutomatic
    Exit Sub

Err_Test:
    'Error handling code here
    'Call ErrHandler("Test routine", Err.Number, Err.Description)
    Resume Exit_Test

End Sub
 
I tried what FlyerMike suggested above, but it is still taking just as long...! I set the master database sheet as a new Excel application and then set ScreenUpdating = False and Calculation = xlCalculationManual. The method that I am using to insert data into the master sheet is...

Set rstworkbook = CreateObject("ADODB.Recordset")
rstworkbook.Open strSQL, stringConnect
Set rstworkbook = Nothing

where strSQL is the query prepared for the insert, which looks like:

strSQL = "INSERT INTO [userdata$] ([PrimaryKey],[Date],[Processor],[Policy_No],[Worktype],[Status],[Credits],[CreditsTotal],[UpldTime],[UserId]) " & " VALUES ('" & xPrimaryKey & "','" & xDate & "','" & xName & "','" & xPolNmbr & "','" & xWrktype & "','" & xStatus & "','" & xCrdts & "','" & CrdtsTtl & "','" & UpldTime & "','" & UsrId & "') "


and other suggestions to speed this up.....
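As an aside (a sketch, not code from this thread): an INSERT is an action query and does not need a Recordset at all. ADO's Connection.Execute runs it directly, avoiding the cost of building a cursor on every insert. strSQL and stringConnect below are the same variables used above:

```vba
' Sketch: run the INSERT via Connection.Execute instead of Recordset.Open.
' strSQL and stringConnect are the variables already shown in this post.
Dim cnn As Object
Set cnn = CreateObject("ADODB.Connection")
cnn.Open stringConnect
cnn.Execute strSQL, , 1 Or 128   ' adCmdText Or adExecuteNoRecords
cnn.Close
Set cnn = Nothing
```

Keeping the Connection open across a batch of inserts, rather than opening and closing it per record, is usually where the real saving is.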
 
Are you calling the Insert query for each line in each workbook? I am assuming the Summary workbook is closed after each insert? This might be where the time lag exists.

I would try to break this task into two parts.

1) Gathering the information from the 18 (?) Excel workbooks. I usually create a UDT array and store the data (temporarily) in it.
2) Dumping the data from the array into the summary workbook.

I use Debug.Print statements throughout my code to determine how long each phase takes.

Gathering data from each of the workbooks is going to be an arduous task.

The dumping of data into the summary workbook should benefit from the Calc and ScreenUpdate modifications discussed earlier.
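The Debug.Print timing can be as simple as the following sketch; GatherData and DumpData here are placeholder names for whatever your two phases end up being called:

```vba
' Sketch: timing the two phases with Timer (seconds since midnight).
' GatherData and DumpData are placeholder routine names.
Dim t As Single
t = Timer
'Call GatherData(...)   ' phase 1: read the subordinate workbooks into the array
Debug.Print "Gather phase: "; Timer - t; " seconds"
t = Timer
'Call DumpData(...)     ' phase 2: write the array to the summary workbook
Debug.Print "Dump phase:   "; Timer - t; " seconds"
```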
 
Yes, each workbook [each of the 18 users has their own workbook] inserts values into the database sheet using INSERT SQL. I am closing the database sheet after each individual insert has been successfully executed, using "Set rstworkbook = Nothing". What I noticed: when the database sheet has little data in it [fewer than 2000 records], the INSERT SQL does not take long; once the number of records reaches around 7000+, each individual insert takes approx 8-10 seconds, which is too long. How does a UDT array work...?? I could try that. But then users [18 of them] will lose the independence of updating the master sheet with their data whenever they wish...?

thanks..
 
Part of the problem is that you are using a lightbulb to hammer in a nail. In other words, you are using Excel in a way for which it was not designed. The "database" should be a database, not an Excel sheet. You should be using Access, or some other database, to hold the data. An Excel frontend would then be more feasible.
 
The summary "data base sheet" is an Excel Workbook, correct? It has the look, taste, and feel of an MS Access table.

In my simple make-believe world ;) , my users have Excel files with 3 columns of data.

Col A: This
Col B: That
Col C: The Other Thing

Col A is a unique identifier for a record. It appears once, and only once in the summary worksheet. If a user attempts to modify this column, Igor from IT pays them a visit. :mad:

I create a User Defined Data Type in one of my modules:

Code:
Public Type TGIF
    sThis As String
    sThat As String
    sTheOtherThing As String
End Type
When a user is in one of their spreadsheets, and wants to pass their data to Igor's Summary sheet, their routine looks something like this...

Code:
Public Sub PassMyHumbleData()
On Error GoTo Err_PassMyHumbleData

    Dim MyData() As TGIF
    Dim lIndex As Long

    Call GatherData(MyData(), lIndex)
    Call PassDataToSummary(MyData(), lIndex)
    'We can cross this bridge later -- after you have gathered the data
    '  you open the summary workbook exclusively..or wait
    '  then find the key line and pass the data from the array to the workbook

Exit_PassMyHumbleData:
    Exit Sub

Err_PassMyHumbleData:
    'Call ErrHandler("PassMyHumbleData routine", Err.Number, Err.Description)
    Resume Exit_PassMyHumbleData

End Sub

Public Sub GatherData(ByRef MyData() As TGIF, ByRef lIndex As Long)
On Error GoTo Err_GatherData

    Dim sRange As String
    Dim i As Long

    Range("A1").Select
    sRange = ActiveCell.CurrentRegion.Address
    Range(sRange).Sort Key1:=Range("A2"), _
        Order1:=xlAscending, Header:=xlYes, _
        OrderCustom:=1, MatchCase:=False, Orientation:=xlTopToBottom
    i = 1
    While Not IsEmpty(ActiveCell.Offset(i, 0).Value)
        lIndex = lIndex + 1
        ReDim Preserve MyData(lIndex)
        With MyData(lIndex)
            .sThis = ActiveCell.Offset(i, 0).Value
            .sThat = ActiveCell.Offset(i, 1).Value
            .sTheOtherThing = ActiveCell.Offset(i, 2).Value
        End With
        i = i + 1
    Wend

Exit_GatherData:
    Exit Sub

Err_GatherData:
    'Call ErrHandler("GatherData routine", Err.Number, Err.Description)
    Resume Exit_GatherData

End Sub
 
Yes, the database sheet is an Excel sheet, which is closed and lying on the shared network drive. The Excel sheet name is the table name, and the column names in the Excel sheet are the column names, just like an MS Access or SQL database. So in the SQL command strSQL = "INSERT INTO [userdata$] ([PrimaryKey]", userdata$ is the sheet name and PrimaryKey is a column name in that sheet. And to answer BOBLARSON: I don't have access to the MS Access platform, so I have to do it through Excel, and I thought replicating a database design in Excel would work...
 
I think the repetitive opening, writing to, and saving the ever-increasing Excel file is the time killer.

Open the summary file once, traverse the array of passed data, write to the appropriate rows. Save the file. Take the rest of the day off.

The problems are:

1) While I am writing to the file, the other users have to wait. Too bad for them.
2) If I'm waiting to write to the file, too bad for me.
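One way to soften the waiting problem is to test whether the summary file is already in use before trying to open it, and retry a few times before giving up. A sketch of the usual VBA lock test (the function name is made up):

```vba
' Sketch: crude lock test. Attempting an exclusive Open on a file that
' another user has open raises error 70 (Permission denied).
Public Function FileIsLocked(ByVal sPath As String) As Boolean
    Dim f As Integer
    On Error Resume Next
    f = FreeFile
    Open sPath For Binary Access Read Write Lock Read Write As #f
    Close #f
    FileIsLocked = (Err.Number <> 0)
    On Error GoTo 0
End Function
```

The caller can loop on this with a short Application.Wait between attempts, so "too bad for me" becomes "wait a few seconds and try again."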
 
One thing you may explore is the use of Microsoft Desktop Engine, which is basically SQL Server "Lite." It has the same size restrictions as Access, but you might be able to go that route since it is free. If you don't know SQL Server, then there would be a definite learning curve. But, as mentioned by FlyerMike,

the repetitive opening, writing to, and saving the ever-increasing Excel file is the time killer

Definitely having so many people working on one spreadsheet and adding that much data will be problematic. That's why I mentioned the fact that Excel is not built for such a task. The fact is, that it can do what you want, but not completely the way you want or with the same speed that you would have if you changed to a relational database.

Good luck with it all.
 
Is there any other way that those 18 people can transfer their data [using their individual sheets] to that one single database sheet, even if it means that they do it at a scheduled time [so only one person is updating the sheet at a single instance], but that is not slow...???
 
Another approach is that the master summary sheet would call the 18 subordinate sheets for updates.

The only burden (as far as read-write conflicts) is with the summary file. I would open the subordinate files as read-only, grab their data, and move on to the next file.
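A sketch of that pull approach, run from the master summary workbook itself; the file names and the copy step are placeholders:

```vba
' Sketch: the summary workbook pulls from the subordinate files read-only.
' The UNC paths below are placeholders, not the actual user files.
Public Sub PullFromSubordinates()
    Dim wb As Workbook
    Dim vFiles As Variant
    Dim i As Long

    vFiles = Array("\\Server\Share\User01.xls", "\\Server\Share\User02.xls")
    Application.ScreenUpdating = False
    Application.Calculation = xlCalculationManual
    For i = LBound(vFiles) To UBound(vFiles)
        Set wb = Workbooks.Open(Filename:=vFiles(i), ReadOnly:=True)
        ' ...copy the rows you need into the summary sheet here...
        wb.Close SaveChanges:=False
    Next i
    Application.Calculation = xlCalculationAutomatic
    Application.ScreenUpdating = True
End Sub
```

Since the subordinate files are opened read-only, the users can keep working in them while the pull runs; only the summary workbook ever needs exclusive access.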
 
