I have a db that takes in a lot of 'dirty' imports from different systems, plus a pile of queries and reference tables that comb through the data, clean it up, fill in missing data, run calculations, and transform it into a unified format.
Some of the 'final' select queries run quite slowly. For example, one brand that needs a lot more cleanup takes twice as long as another brand with a shorter chain of queries behind it.
Is there a way to programmatically run a query, split it into its component parts (down the query chain), and measure the time each part takes? I've done as much optimisation as I can in the areas I can think of, and unfortunately it's simply not practical to run and check each part manually.
Let me know your thoughts. Cheers.
Edit: even just cycling through every single select query in the db, running and timing each one, could help; I can set it off and leave it for however many hours it takes. Hmmm...
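Roughly what I'm imagining, as a sketch: loop over the saved queries, force each one to run to completion, and log the elapsed time. This is Python with pyodbc; the DSN and query names are placeholders for whatever the real chain looks like, and it assumes each step in the chain is a saved query/view I can SELECT from:

```python
import time
import pyodbc  # assumes an ODBC connection to the db; the DSN below is a placeholder

conn = pyodbc.connect("DSN=MyImportsDB")
cursor = conn.cursor()

# Placeholder names -- in practice I'd pull the real list of saved
# queries/views from the db's metadata instead of hardcoding it.
query_names = ["qryCleanBrandA", "qryFillMissing", "qryCalculations", "qryFinalUnified"]

timings = []
for name in query_names:
    start = time.perf_counter()
    cursor.execute(f"SELECT * FROM [{name}]")  # treat each saved query like a table
    cursor.fetchall()                          # force the full result set to materialise
    timings.append((name, time.perf_counter() - start))

# Print slowest first so the worst offenders stand out.
for name, elapsed in sorted(timings, key=lambda t: t[1], reverse=True):
    print(f"{name}: {elapsed:.2f}s")
```

The slowest steps at the top of the printout would then tell me where to focus the optimisation effort.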