Looking for Best Plan of Attack

cfp76

Registered User.
Hello -
I have a .csv file that I need to create a report from, but I'm not 100% sure how I'm going to go about it.

Here's the problem: the .csv is not something I have control over; the layout is standardized, but it's not optimal as far as my skill set is concerned.

Cells A1 through A9 contain the filter information (date, type, ranges, line type, etc.). The header row is row 10 (about 25 headers), and the data is under that. Somewhat like this:

criteria
criteria
criteria
criteria

header1, header2, header3, etc
data, data, data, etc
data, data, data, etc

What do you think would be the best plan? Should I just take the .csv into Excel and deal with the headers that way, or should I write a query in Access that pulls the actual data out from under the criteria? If I do it in Access, is there a way to just have it "refresh"? The end result is going to be an Excel spreadsheet (the most user-friendly thing I could do for the person).

I'm just kinda standing here looking around lost right now and am hoping to get some insight from people on the forum.
Thanks!
 

gemma-the-husky

Super Moderator
Staff member
A csv is far, far easier to manage than a spreadsheet. The main thing is that you have a standard format.

Import with a file spec based on the data rows.

Find a way to get rid of the rows you don't need - i.e. in those rows some of the columns will be null, non-numeric, and so on.

Validate the data, if you need to.

Finally, establish a series of queries to compare the import with previous data, to add new data, and to update existing data. Within this analysis you will most likely decide to split the single table into a normalised format.

It takes a bit of work to set up, but once done, the whole import will be automated.
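
A rough VBA sketch of that kind of automated import, assuming a saved import spec called "MyCsvSpec", a staging table tblImportRaw and a couple of placeholder field names - none of those names come from the thread, they are just for illustration:

Code:
' Sketch only: pull the csv into a staging table with a saved import spec,
' then clear out the criteria/header rows before any compare/append queries run.
Public Sub ImportCsvReport()
    Const csvPath As String = "C:\Data\report.csv"   ' placeholder path

    ' Start each run from an empty staging table
    CurrentDb.Execute "DELETE FROM tblImportRaw", dbFailOnError

    ' Import every row using the saved import specification
    ' (last argument False because row 1 of the file is criteria, not headers)
    DoCmd.TransferText acImportDelim, "MyCsvSpec", "tblImportRaw", csvPath, False

    ' Drop the rows that are not data: in those rows most columns arrive empty
    CurrentDb.Execute _
        "DELETE FROM tblImportRaw WHERE [Field2] IS NULL AND [Field3] IS NULL", _
        dbFailOnError

    ' From here, append/update queries can move the rows into normalised tables,
    ' and a TransferSpreadsheet export can push the final query back out to Excel.
End Sub

The import spec itself only has to be created once, through the Advanced... button in the text import wizard, and saved under whatever name the code uses.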
 

cfp76

Registered User.
Excellent - thank you for the start. I've never done anything with file specs on import when it comes to eliminating rows (I can do columns! heh).

It will be a challenge, but any direction is appreciated.
Thanks!
 

Pat Hartman

Super Moderator
Staff member
The .csv format only works if all rows are identical in format, like a table. The first row may contain column headings, but it has the same number of columns as all subsequent rows. If the row format is not consistent, you need to use OLE automation to populate individual cells in Excel.
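
For a picture of that route, here is a minimal late-bound sketch of driving Excel from Access VBA and writing individual cells; the output path and cell contents are placeholders only:

Code:
' Sketch of the OLE automation route: open Excel, write cells one by one, save.
Public Sub PushReportToExcel()
    Dim xl As Object, wb As Object, ws As Object

    Set xl = CreateObject("Excel.Application")   ' late binding, no reference needed
    Set wb = xl.Workbooks.Add
    Set ws = wb.Worksheets(1)

    ws.Range("A1").Value = "Report criteria"     ' placeholder content
    ws.Range("A2").Value = Now()
    ' ...loop a recordset here and write ws.Cells(r, c).Value for each data row...

    wb.SaveAs "C:\Data\report_out.xlsx"          ' placeholder path
    wb.Close SaveChanges:=False
    xl.Quit
End Sub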
 

gemma-the-husky

Super Moderator
Staff member
It's worth seeing if you can import the csv, I think.

Using a copy of the real file, delete the surplus rows until you get to the data rows proper, then design a file spec based on those.

Now, when you import the full file with the spec, the additional rows will also come in, but some columns will be blank. You can delete or ignore these; there will be a way to identify the rows you don't need.
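
As a hedged example of that clean-up step, assuming one of the imported columns (called [Qty] here purely for illustration) should always hold a number in a genuine data row:

Code:
' Remove the criteria and header rows that came along with the full import:
' in those rows the quantity column is either empty or not a number.
Public Sub DeleteSurplusRows()
    DoCmd.SetWarnings False
    DoCmd.RunSQL "DELETE FROM tblImportRaw " & _
                 "WHERE [Qty] IS NULL OR Not IsNumeric([Qty])"
    DoCmd.SetWarnings True
End Sub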
 

daievans

Registered User.
I've been having lots of fun and help from around here with importing text files of late.

Insofar as unwanted data is concerned, I got around that by importing the complete file and then writing a query to delete the records I did not need. My initial field in the recordset designated the type of data that record held.

As someone has already noted, I think a csv file will be easier to sort out, as each row of the text file will have the same structure.
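
A small sketch of that approach, assuming the first imported field came in as something like [RecType] and the data rows carry a "D" flag - both names are invented here, not taken from the actual file:

Code:
' Keep only the records whose leading field marks them as data rows.
Public Sub DeleteNonDataRecords()
    CurrentDb.Execute _
        "DELETE FROM tblImportRaw WHERE [RecType] IS NULL OR [RecType] <> 'D'", _
        dbFailOnError
End Sub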
 
