Union Query issues

dmccleld · New member · Joined Sep 26, 2011 · Messages: 1
Hey Everyone,

I have a question about a query I created for a project-tracking database. Our company needs to report information by location for any location impacted by a change. The table in question, created before I became the database administrator, has over 150 fields (PROJ_TABLE), 60 of which are location fields because the company has over 60 locations. Each of those is a Yes/No check box field (LOC_CHECKBOX_1, LOC_CHECKBOX_2, ..., LOC_CHECKBOX_60). I was asked to create a report that returns the outstanding projects for an impacted location, based on a user-entered location name.

My first attempt at this produced a very slow report. I first created a query that checks each field (LOC_CHECKBOX_1, LOC_CHECKBOX_2, ..., LOC_CHECKBOX_60) for a Yes value and outputs the location name as a string. Because of the size of the SQL, I had to split this into two union queries (UNION_1 & UNION_2), each returning two fields (PROJ_NUMB & LOC_STRING) and filtering out null strings, and then use a third union query (MASTER_UNION) to merge the first two. Finally, I built a report (LOC_REPORT) with a parameter, LOC_PARAM, that filters MASTER_UNION for LOC_STRING values similar to LOC_PARAM; the report also pulls more fields from the original table (PROJ_TABLE) via the linked PROJ_NUMB field.
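For reference, each branch of a union like this typically looks roughly as follows (field names are from the post; the literal location names are placeholders, since the actual names aren't given):

```sql
-- One SELECT branch per location check box; MASTER_UNION amounts to
-- ~60 of these combined. "Location 1" etc. are placeholder names.
SELECT PROJ_NUMB, "Location 1" AS LOC_STRING
FROM PROJ_TABLE
WHERE LOC_CHECKBOX_1 = True

UNION ALL

SELECT PROJ_NUMB, "Location 2" AS LOC_STRING
FROM PROJ_TABLE
WHERE LOC_CHECKBOX_2 = True;
-- ... repeated through LOC_CHECKBOX_60
```

Two incremental improvements, if the table can't be restructured right away: filter inside each branch's WHERE clause rather than returning every row and discarding nulls afterward, and use UNION ALL instead of UNION so Access skips the duplicate-elimination step. The table is still scanned once per branch, though, which is the underlying cost.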

This takes a very long time, since running the union in essence causes everything to be run over 60 times, after which I discard most of the information anyway.

First, is there a better way to use queries to return this information? Second, is it possible to prompt the user for a parameter similar to LOC_PARAM, find the field whose name matches that input, and then search that field for Yes values? If you have more questions, please feel free to ask.
 
If it were me, I'd scrap that table design, design a normalized structure, move the data into it, and redo the report. Then you'd have a Locations table that was properly indexed, and you could much more easily write a faster-performing report. Almost certainly the other 90 columns in the Project table contain denormalized data as well; refactoring those will yield additional performance benefits.
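A minimal sketch of that normalized structure and the resulting report query (table and field names beyond PROJ_TABLE and PROJ_NUMB are illustrative, not from the original database; each statement would be run as a separate query in Access):

```sql
-- Locations lookup table: one row per location instead of 60 columns.
CREATE TABLE LOCATIONS (
    LOC_ID   COUNTER PRIMARY KEY,
    LOC_NAME TEXT(100) NOT NULL
);

-- Junction table: one row per (project, impacted location) pair,
-- replacing the 60 check-box columns in PROJ_TABLE.
CREATE TABLE PROJ_LOCATIONS (
    PROJ_NUMB LONG NOT NULL REFERENCES PROJ_TABLE (PROJ_NUMB),
    LOC_ID    LONG NOT NULL REFERENCES LOCATIONS (LOC_ID),
    PRIMARY KEY (PROJ_NUMB, LOC_ID)
);

-- Report query: projects for the user-entered location, found with
-- two indexed joins instead of a 60-branch union.
PARAMETERS [Enter location] TEXT(100);
SELECT P.PROJ_NUMB, L.LOC_NAME
FROM (PROJ_TABLE AS P
      INNER JOIN PROJ_LOCATIONS AS PL ON P.PROJ_NUMB = PL.PROJ_NUMB)
      INNER JOIN LOCATIONS AS L ON PL.LOC_ID = L.LOC_ID
WHERE L.LOC_NAME LIKE "*" & [Enter location] & "*";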
 
