Recent content by ErikRP

  1.

    Determining Year Currency

    Thanks to everyone for their collective assistance. I haven't got even 5 minutes to give this a try today but I'll report back on my progress. Thanks again!
  2.

    Determining Year Currency

    I'm hoping someone can help me figure out a problem I need to solve. At our company, plans are reviewed every 3 years after a plan is created, i.e. a plan created in 2011 would be reviewed in 2014. Most plans were created in 2010, 2011, or 2012; however, new plans can be created, albeit...
  3.

    Help with DLookup (Type Mismatch)

    That's exactly what I needed - thanks missinglinq! :) I'll be sure to keep this little sample handy in future! Thanks too Brian for your assistance - it was reassuring to know I was close!
  4.

    Help with DLookup (Type Mismatch)

    Thanks for the input Brian, but if I run two separate DLookups, doesn't that create the situation where I am checking to see if there is already a First Name of "John", and then I am checking (separately) to see if there is already a Last Name of "Smith"? I don't care if there is already a...
  5.

    Help with DLookup (Type Mismatch)

    I'm trying to figure out a problem I'm having with DLookup, and I think I've hit the limit of my abilities. I've created a form (frmMain) that is used to enter student records. What I want to do is check whether a student already exists in the database. Rather than wait until the whole...
  6.

    Joining on "Hard Return" Values - Possible?

    I need to join two tables, TableA and TableB. These were originally Excel spreadsheets which are being converted to Access for some data massaging, and then going back to Excel. TableA has fields: Stock_Item, Price, Description, etc. With sample data: Q00001, 10.00, Blue Ball; Q00002, 15.00...
  7.

    Querying to get Access Statistics

    Thanks! I'll have a look and see what I can find.
  8.

    Querying to get Access Statistics

    This is either very simple or very complex, I haven't figured out which yet. I need to know the number of tables in my database and how many records each table contains. Ordinarily I would just count the number of tables, then open each one up to get the number of...
  9.

    Impact of # of Fields/Records on DB Size

    Not sure if this is a Table question, a General question or what exactly. I'm working with a lot of records (millions of them). Assuming each record/field contains identical data, I'm wondering which would result in a larger overall DB size: - 10 million records, 4 fields each - 30 million...
  10.

    Please Help With Concatenation Query

    THAT'S PERFECT!!! :) Thanks so much - that produced exactly the kind of data I was needing! I appreciate that so much!!!
  11.

    Please Help With Concatenation Query

    The table name is Analysis. All of the fields are text strings. (I simplified my example but the data in each field is alphanumeric.) Thanks EMP!
  12.

    Please Help With Concatenation Query

    I have a table that I need to query that looks something like this:

    ID | Type | Product ID
    123 | Health | 323
    123 | Health | 424
    123 | Dental | 424
    124 | Health | 323
    125 | Dental | 323

    What I need to see is something like this: ID Type Product ID 123...
  13.

    Compare several hundred rows against other rows

    I must be missing something or doing something wrong - I'm not getting any results back. Here's what I have: SELECT Plan, Item, Price, Description, Colour, Value, Warn1, Warn2, Count(Warn2) AS Expr1 FROM Main GROUP BY Plan, Item, Price, Description, Colour, Value, Warn1, Warn2 HAVING...
  14.

    Compare several hundred rows against other rows

    Not sure if what I want to do is possible, or at least possible the way things are set up. I have a massive table - c. 6 million rows. It contains data along these lines: Plan#, Item, Price, Description, Colour, Value, Location, etc. The primary key would be Plan# + Item. Each Plan# has...
  15.

    Cleaning Up "Grouped" Data

    To clarify, the data wasn't in a CSV file; it was imported via fixed-width ASCII import. I didn't lose any data - it never existed in the report I was working with. The data I'm using is from a mainframe dataset. The report was formatted in what I suspect was a "Group By"...
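Several of the threads above (the concatenation query in particular) revolve around collapsing multiple rows into one delimited field per group. Access SQL has no built-in aggregate for this - it is usually done with a small VBA helper that loops over a DAO recordset - but the shape of the result can be sketched with SQLite's GROUP_CONCAT. This is a minimal sketch, not Access syntax; the table name (Analysis) and the sample ID / Type / Product ID values come from the excerpts above, while the ProductID column name is an assumed spelling:

```python
# Sketch of a "concatenation query": collapse the Product IDs for each
# (ID, Type) pair into one comma-separated field. SQLite's GROUP_CONCAT
# stands in for the VBA helper function Access would need; all fields
# are text strings, matching the thread's description of the table.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Analysis (ID TEXT, Type TEXT, ProductID TEXT)")
conn.executemany(
    "INSERT INTO Analysis VALUES (?, ?, ?)",
    [
        ("123", "Health", "323"),
        ("123", "Health", "424"),
        ("123", "Dental", "424"),
        ("124", "Health", "323"),
        ("125", "Dental", "323"),
    ],
)

# One row per (ID, Type), with the Product IDs joined into a single field.
rows = conn.execute(
    """
    SELECT ID, Type, GROUP_CONCAT(ProductID, ', ') AS Products
    FROM Analysis
    GROUP BY ID, Type
    ORDER BY ID, Type
    """
).fetchall()

for row in rows:
    print(row)
```

The five input rows collapse to four output rows, with ID 123 / Health carrying both of its Product IDs in one field. In Access the same grouping would be written with a user-defined concatenation function in place of GROUP_CONCAT.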