TinkerMan
Dooh!
I'm suffering from a bit of a performance drag, all of it due to workarounds for earlier problems. So as well as asking for a way to increase the performance, I want to appeal to your knowledge and creativity to see if there is an alternative solution to this.
Tables
Orders (ID, ItemID, CustID, date, TYPE and no_ordered)
Delivery (ID, ItemID, CustID, date and no_delivered)
1.
The TYPE in Orders is the first speedbump, as there are 2 different types of orders (Normal and Extra). As a result (and also by customer specification), when the Delivery form is presented it is important to show how many of each order type was ordered. In addition, it is beneficial to have only one line per item. Therefore I create a crosstab to put the two type columns next to each other. This then causes the recordset to be non-updatable. I also need columns where the user enters how many items were actually delivered, as well as a 'Delivered' checkbox to tick. I tried making the updatable fields unbound and updating the data with event code, but all rows get the same value if you enter something into an unbound field.

So, is there a way to have updatable and non-updatable columns in the same resultset?
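For reference, the crosstab that puts the two type columns side by side is shaped roughly like this (a sketch only; the field and parameter names are approximate, and the TYPE values 0 and 1 match the column names my code reads back):

```sql
-- Sketch of the crosstab: one row per item/customer, one column per order type.
-- Names are illustrative; the real query is parameterised on the date.
PARAMETERS [Dato] DateTime;
TRANSFORM Sum(Orders.no_ordered)
SELECT Orders.ItemID, Orders.CustID
FROM Orders
WHERE Orders.[date] = [Dato]
GROUP BY Orders.ItemID, Orders.CustID
PIVOT Orders.TYPE;
```

As far as I can tell, a crosstab (TRANSFORM ... PIVOT) result is always read-only in Access, which is exactly the problem.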
2.
As a workaround for no. 1, I added two columns to the Delivery table (*sigh*), one for each order type. When the Delivery form is opened, I run code that syncs the order numbers into the Delivery table. The Delivery form therefore only needs to read from the Delivery table, making the recordset updatable!
This "synchronisation" (read: inconsistency and duplication, which I very much dislike) is split into 3 parts:
I.
First, an append query is executed that adds a delivery row with its customer and item ID and the specified date, if one does not already exist. This runs very quickly, so no problem here.
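For illustration, the append query is shaped roughly like this (an unmatched LEFT JOIN keeps out rows that already exist; the names are approximate):

```sql
-- Add a Delivery row for each (ItemID, CustID) ordered on the given date
-- that does not already have one. Names are approximate, not the real query.
PARAMETERS [Dato] DateTime;
INSERT INTO Delivery (ItemID, CustID, [date])
SELECT DISTINCT o.ItemID, o.CustID, o.[date]
FROM Orders AS o LEFT JOIN Delivery AS d
  ON (o.ItemID = d.ItemID) AND (o.CustID = d.CustID) AND (o.[date] = d.[date])
WHERE o.[date] = [Dato] AND d.ID Is Null;
```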
II.
The second part collects the order numbers that need to be put into the Delivery table. This sum-and-crosstab query is also very quick.
III.
The third part loops over the result of part II and runs an UPDATE for each Delivery row. On average there are about 300-400 orders per day, and it takes around 12-18 seconds to run, which I think is not very good. I wrapped the updates in a transaction, starting before the first update and committing after the last, but performance did not improve significantly.
Here is the code for the second and third part:
Code:
Function updateDelivery(Dato As Date)
    Dim dbs As DAO.Database
    Dim rst As DAO.Recordset
    Dim qd As DAO.QueryDef
    Dim start_Exec As Date, stop_exec As Date

    Set dbs = CurrentDb
    Set qd = dbs.QueryDefs![qDelivery-Order_sync_data]
    qd.Parameters![Dato] = Dato

    ' Time opening the crosstab recordset (part II).
    start_Exec = Now
    Set rst = qd.OpenRecordset
    stop_exec = Now
    MsgBox DateDiff("s", start_Exec, stop_exec)

    ' Time the update loop (part III).
    start_Exec = Now
    On Error GoTo handle_err
    If rst.RecordCount <> 0 Then
        ' Wrap all per-row updates in one transaction on the default workspace.
        Dim wrkDefault As DAO.Workspace
        Set wrkDefault = DBEngine.Workspaces(0)
        wrkDefault.BeginTrans

        Dim qd2 As DAO.QueryDef
        Dim sql As String
        rst.MoveFirst
        While Not rst.EOF
            Echo True, rst.AbsolutePosition
            ' Crosstab columns "0" and "1" hold the Normal and Extra totals.
            sql = "UPDATE delivery SET [delivery-ordered_normal] = " & IIf(IsNull(rst.Fields("0")), "Null", rst.Fields("0")) & _
                  ", [delivery-ordered_extra] = " & IIf(IsNull(rst.Fields("1")), "Null", rst.Fields("1")) & _
                  " WHERE [delivery-id] = " & rst.Fields("Delivery-ID") & ";"
            Set qd2 = dbs.CreateQueryDef("", sql)
            qd2.Execute
            rst.MoveNext
        Wend
        wrkDefault.CommitTrans
    End If
    stop_exec = Now
    MsgBox DateDiff("s", start_Exec, stop_exec)
    Exit Function

handle_err:
    On Error Resume Next   ' in case the error fired before BeginTrans
    wrkDefault.Rollback
    MsgBox "Error during update of Delivery: " & Err.Description
End Function
After changing the code to run the statement through a QueryDef, performance increased somewhat, but not enough.
Is there a way to dramatically increase the performance here?
3.
When I do INSERTs, I can use sub-SELECTs to find what I need to INSERT. I tried to find out whether the same approach is possible with an UPDATE, but without success. Is this possible? I imagine such an UPDATE would be a lot more efficient than my manual row-by-row approach.
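To show what I mean, here is the kind of thing that works for me as an INSERT, next to the shape of UPDATE I was hoping for but could not get Jet to accept (both are sketches with assumed names):

```sql
-- This pattern works as an INSERT: a sub-SELECT supplies the rows.
INSERT INTO Delivery (ItemID, CustID, [date])
SELECT ItemID, CustID, [date] FROM Orders WHERE [date] = [Dato];

-- What I was hoping for, but could not get to run:
-- UPDATE Delivery SET [delivery-ordered_normal] =
--   (SELECT Sum(no_ordered) FROM Orders
--    WHERE Orders.ItemID = Delivery.ItemID
--      AND Orders.CustID = Delivery.CustID
--      AND Orders.[date] = Delivery.[date]
--      AND Orders.TYPE = 0);
```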
Thanks again for your help
