Search results

  1. My Database went from 120 GB to 150 GB after Normalizing? Is there a Compact command for SQL Server?

    Hi BlueSpruce, Thanks for checking in. I am primarily trying to analyze the quality of the data at this point. I decided to create a new database that only has records from 2024 and up while I am analyzing and creating reports. For Totals queries I find it's better to use Stored Procedures...
  2. My Database went from 120 GB to 150 GB after Normalizing? Is there a Compact command for SQL Server?

    I was advised by someone to go with varchar to reduce the size of the DB. After reading the StackOverflow article I'll stick with nvarchar. I had BULK_LOGGED Recovery on; I thought that was the right choice vs. Full Recovery. I didn't know about SIMPLE Recovery until today. Thanks BlueSpruce.
  3. My Database went from 120 GB to 150 GB after Normalizing? Is there a Compact command for SQL Server?

    I shrunk the log file with the code below. The database is now 43 GB, which I can live with. Thank you George for recommending that I look into the log file. Can I turn off logging altogether? It slows things down and I don't need it while massaging reporting data. Thank you. -- 1) Switch to...
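
    (The script is truncated in the preview. The usual sequence is roughly the sketch below; the logical log-file name UMA_log is an assumption, and sys.database_files shows the real one.)

    USE UMA;

    -- 1) Switch to SIMPLE recovery so the inactive log can be discarded
    ALTER DATABASE UMA SET RECOVERY SIMPLE;

    -- 2) Shrink the log file to a target size in MB.
    --    UMA_log is a guessed logical name; verify it first with
    --    SELECT name, type_desc FROM sys.database_files;
    DBCC SHRINKFILE (UMA_log, 1024);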
  4. My Database went from 120 GB to 150 GB after Normalizing? Is there a Compact command for SQL Server?

    Hi George, The Log file is 112.82 GB, making up more than 2/3 of the database size of 155 GB. Recommendations? I actually don't need Logging at all. I make my own backups and this is a read-only reporting database. Thank you
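
    (For context, the data/log split can be confirmed with a query along these lines; size is stored in 8 KB pages, so dividing by 128 converts it to MB.)

    USE UMA;  -- database name taken from the thread

    -- Per-file size and actual space used, in MB
    SELECT name,
           type_desc,
           size / 128 AS size_mb,
           FILEPROPERTY(name, 'SpaceUsed') / 128 AS used_mb
    FROM sys.database_files;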
  5. My Database went from 120 GB to 150 GB after Normalizing? Is there a Compact command for SQL Server?

    Thank you for your responses. I ran DBCC SHRINKDATABASE (UMA); prior to posting. It didn't reduce the database. I just re-ran it... it ran for 30 minutes and again didn't reduce the database. See the screenshot below. I might have to re-import the entire dataset into a new Database and...
  6. My Database went from 120 GB to 150 GB after Normalizing? Is there a Compact command for SQL Server?

    Hello MS Access Community, I normalized tblManifestHeaders by removing the Generator Address fields and creating a tblGenerators lookup table. I repeated the same for Receiver Address. I also changed all fields from nvarchar(250) to varchar(250) in tblManifestHeaders because this is a US...
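
    (The post is truncated, but the normalization described maps to roughly this pattern. The specific address columns are placeholders, since the post only mentions "the Generator Address fields".)

    -- Build the lookup table from the distinct generator values
    SELECT DISTINCT GeneratorName, GeneratorAddress, GeneratorCity, GeneratorState
    INTO   tblGenerators
    FROM   tblManifestHeaders;

    ALTER TABLE tblGenerators
        ADD GeneratorID int IDENTITY(1,1) CONSTRAINT PK_tblGenerators PRIMARY KEY;

    -- Add the foreign-key column, then backfill it by matching the old values
    ALTER TABLE tblManifestHeaders ADD GeneratorID int;
    GO

    UPDATE h
    SET    h.GeneratorID = g.GeneratorID
    FROM   tblManifestHeaders AS h
    JOIN   tblGenerators      AS g
      ON   g.GeneratorName    = h.GeneratorName
     AND   g.GeneratorAddress = h.GeneratorAddress
     AND   g.GeneratorCity    = h.GeneratorCity
     AND   g.GeneratorState   = h.GeneratorState;
    -- (finally, drop the old address columns from tblManifestHeaders;
    --  NULLs in any matched column would need separate handling)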
  7. Advice on how to improve query efficiency when Filtering on nvarchar(4000)

    Great catch, Tom. I missed that index. The query now runs in under 3 seconds.
  8. Advice on how to improve query efficiency when Filtering on nvarchar(4000)

    Good news! Normalization took under 18 minutes. I believe RECOVERY BULK_LOGGED made the difference in speed. The normalized table has 100,158,139 records. The LINQ C# query below took under 4 seconds, which is decent. Thank you for everyone's advice! return await dbContext.ManifestLines...
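
    (The LINQ call is cut off. In SQL terms the normalized filter presumably has the shape below; tblManifestLineWasteCodes and its columns are hypothetical names for the new child table.)

    -- With one row per (line, code), the filter becomes an index seek
    SELECT l.*
    FROM   tblManifestLines AS l
    WHERE  EXISTS (SELECT 1
                   FROM   tblManifestLineWasteCodes AS c
                   WHERE  c.ManifestLineID = l.ID
                     AND  c.WasteCode = 'D001');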
  9. Advice on how to improve query efficiency when Filtering on nvarchar(4000)

    I am going to assess the results of the normalization, but I will check this out. I often encounter Codes that are comma-separated in a single field. Thank you for providing an alternative suggestion. This is why I made the original post... to get different approaches on how to solve this.
  10. Advice on how to improve query efficiency when Filtering on nvarchar(4000)

    The field wasn't being indexed and it still took 20 minutes. I learned about ALTER DATABASE UMA SET RECOVERY BULK_LOGGED; I am about to Normalize the Federal Waste Codes. I hope this takes under 3 hours. -- 1. Switch to BULK_LOGGED recovery ALTER DATABASE UMA SET RECOVERY BULK_LOGGED; -- 2...
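
    (The script is cut off in the preview. Splitting a comma-separated column into a child table typically looks something like this sketch; STRING_SPLIT needs SQL Server 2016+, SELECT ... INTO is minimally logged under BULK_LOGGED, and every name beyond UMA and FEDERAL_WASTE_CODES is an assumption.)

    -- 1. Switch to BULK_LOGGED so the bulk load is minimally logged
    ALTER DATABASE UMA SET RECOVERY BULK_LOGGED;

    -- 2. One row per (manifest line, waste code)
    SELECT l.ID                  AS ManifestLineID,
           LTRIM(RTRIM(s.value)) AS WasteCode
    INTO   tblManifestLineWasteCodes
    FROM   tblManifestLines AS l
    CROSS APPLY STRING_SPLIT(l.FEDERAL_WASTE_CODES, ',') AS s
    WHERE  LTRIM(RTRIM(s.value)) <> '';

    -- 3. Index the new table so filters on a code can seek
    CREATE INDEX IX_WasteCode
        ON tblManifestLineWasteCodes (WasteCode, ManifestLineID);

    -- 4. Switch back to the usual recovery model afterwards
    ALTER DATABASE UMA SET RECOVERY FULL;  -- or SIMPLE, whichever is in use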
  11. Advice on how to improve query efficiency when Filtering on nvarchar(4000)

    It took ~20 minutes to run a simple Update Query? Is this normal for 31 million records? I've never worked with Tables over 100K records. New experience for me.
  12. Advice on how to improve query efficiency when Filtering on nvarchar(4000)

    Index an nvarchar(4000) field?
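
    (Worth noting on this question: a nonclustered index key is capped at 1700 bytes in SQL Server 2016+, while nvarchar(4000) can hold up to 8000 bytes, so the statement below is created with only a warning; the table name is assumed from the rest of the thread.)

    -- Succeeds with a warning about the 1700-byte key limit; any later
    -- insert/update where FEDERAL_WASTE_CODES exceeds 850 characters
    -- (1700 bytes as nvarchar) would then fail.
    CREATE INDEX IX_FederalWasteCodes
        ON tblManifestLines (FEDERAL_WASTE_CODES);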
  13. Advice on how to improve query efficiency when Filtering on nvarchar(4000)

    Yes, the Join fields are both Indexed. I get good performance when I filter on any other field.
  14. Advice on how to improve query efficiency when Filtering on nvarchar(4000)

    Correct. The user will be filtering this at Runtime. I will be building the Where Clause with the criteria they provide. I guess I will normalize unless any other options are presented. I was hoping it wouldn't come to this. :) Thank you.
  15. How to automate importing 31 .csv files into SQL Server?

    Looking for advice on a new post I just created. https://www.access-programmers.co.uk/forums/threads/advice-on-how-to-improve-query-efficiency-when-filtering-on-nvarchar-4000.334643/ Thank you.
  16. Advice on how to improve query efficiency when Filtering on nvarchar(4000)

    Hello, I have a table with 31 million records. So far most queries return within 2-4 seconds but there is one important query that is taking 14 seconds. There is a field called FEDERAL_WASTE_CODES and 75% of the time it contains <= 3 waste codes (D001, F002, F003) and 95% of the time it...
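
    (The question is truncated, but matching one code inside a delimited string forces a leading-wildcard LIKE, roughly as below, which cannot use an index seek and so scans all 31 million rows; the table name is an assumption.)

    -- The leading '%' defeats any index seek: every row's
    -- FEDERAL_WASTE_CODES value has to be examined.
    SELECT *
    FROM   tblManifestLines
    WHERE  FEDERAL_WASTE_CODES LIKE '%D001%';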
  17. How to automate importing 31 .csv files into SQL Server?

    I downloaded today's manifest data. 35,636 Manifests were added in a week, amounting to 176,932,466 kg of waste. I will likely do the update weekly. Thank you for all the input and advice.
  18. How to automate importing 31 .csv files into SQL Server?

    Yes, ManifestNum and LineNum are the Natural Key for the Details table. My understanding is that the Primary Key is always the Clustered Index? I always use ID as the primary key. I will be using the ID in all the table joins and table relationships.
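
    (On the clustered-index point: a primary key is clustered only by default, and both keys can coexist, along these lines; the column types are guesses.)

    CREATE TABLE tblManifestLines
    (
        ID          int IDENTITY(1,1) NOT NULL,
        ManifestNum varchar(20)       NOT NULL,
        LineNum     int               NOT NULL,
        -- ...remaining columns...
        -- surrogate key as the clustered primary key, used for joins
        CONSTRAINT PK_tblManifestLines PRIMARY KEY CLUSTERED (ID),
        -- natural key still enforced, just not clustered
        CONSTRAINT UQ_ManifestNum_LineNum UNIQUE NONCLUSTERED (ManifestNum, LineNum)
    );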
  19. How to automate importing 31 .csv files into SQL Server?

    Yes, I am doing that, except I always use ID for the Clustered index. I did find some records that have ManifestNum == null; I have to remove them. I am surprised SQL Server successfully indexed this.
  20. How to automate importing 31 .csv files into SQL Server?

    There are not too many duplicates. I think there were around 10 cases like the one below.
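
    (Presumably a duplicate check along these lines surfaced those cases, using the natural key mentioned earlier in the thread.)

    -- List natural-key values that appear more than once
    SELECT ManifestNum, LineNum, COUNT(*) AS copies
    FROM   tblManifestLines
    GROUP BY ManifestNum, LineNum
    HAVING COUNT(*) > 1;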