Function to loop through records looking for duplicates

Kryst51

Singin' in the Hou. Rain
Joined: Jun 29, 2009
Messages: 1,898
I have been asked to create a report for our purchasing department, and I need to total up weights for it, subject to some criteria. My options are to use either the actual material weight OR the weight we debited our vendor, which can differ from each other for various reasons. If possible I want to use the debit weight; however, some items have more than one debit line that uses the same weight factor, so I get duplicates and the weight is doubled in my query. I was hoping to come up with a function that would look at each item and, if there is more than one debit line for it (or it uses a fixed amount), use the tag weight for that item instead of the debit weight; otherwise it would use the debit weight.
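One hedged sketch of that per-item decision, purely as an illustration: assume the tag-weight totals are saved as a query named TagWeights (a name invented here) with columns [Item ID] and Weight, that Debits.Quantity holds the debited weight, and that [Item ID] is numeric so it can be concatenated into a DCount criteria string. DCount counts each item's qualifying debit lines; when there is more than one, the tag weight is used instead, and DISTINCT collapses the resulting identical rows.

```sql
-- Sketch only: TagWeights is an assumed saved query ([Item ID], Weight),
-- and Quantity is assumed to hold the debit weight.
SELECT DISTINCT d.[Item ID],
       IIf(DCount("*", "Debits",
                  "[Item ID]=" & d.[Item ID] &
                  " AND ([Amount Description]=1 OR [Amount Description]=14)") > 1,
           tw.Weight,    -- more than one debit line: fall back to tag weight
           d.Quantity)   -- single debit line: use the debit weight
           AS ReportWeight
FROM Debits AS d INNER JOIN TagWeights AS tw
     ON d.[Item ID] = tw.[Item ID]
WHERE d.[Amount Description] = 1 OR d.[Amount Description] = 14;
```

DISTINCT matters here: an item with two debit lines would otherwise appear twice with the same tag weight and double the total again.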

I am sure my title is too vague; I just don't know how else to describe this. I think I need to do some kind of looping through the recordset, or maybe a record count, but I'm not sure.

Here is my current query to find the debit weight for each item (some Items bring back duplicates due to two amounts being entered with the same weight factor):
Code:
SELECT Debits.[Item ID], Debits.[Amount Description], Debits.Quantity, [Amount Description-Debits].[Amount Description]
FROM Debits INNER JOIN [Amount Description-Debits] ON Debits.[Amount Description] = [Amount Description-Debits].[Amount Description ID]
WHERE (((Debits.[Amount Description])=1 Or (Debits.[Amount Description])=14));

Here is my query that finds the total tag weight for each Item:
Code:
SELECT Items.[Item ID], Sum(IIf([Tag Information]![Return Tag Weight]=0,[Tag Information]![Ship Tag Weight],[Tag Information]![Return Tag Weight])) AS Weight
FROM Items INNER JOIN [Tag Information] ON Items.[Item ID] = [Tag Information].[Item ID]
GROUP BY Items.[Item ID];
 

Kryst51
Oh, the woes of making things harder than they should be... I just used a find-duplicates query, and used that to filter my data. :D YAY!
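For anyone landing here later, a find-duplicates query of the kind the Access query wizard builds is roughly this sketch (using the column names from the first query above): it returns only the items that have more than one qualifying debit line, which can then be used as a filter.

```sql
-- Sketch of a duplicate finder: items with more than one qualifying debit line.
SELECT Debits.[Item ID], Count(*) AS LineCount
FROM Debits
WHERE Debits.[Amount Description] = 1 OR Debits.[Amount Description] = 14
GROUP BY Debits.[Item ID]
HAVING Count(*) > 1;
```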
 
