Importing portions of text files

Kila

I am trying to write a macro that will import a text file. We download reports from another incompatible program to text files that consistently have the same number of irrelevant leading lines at the top. I want the program to ignore the first four lines (or records, if they were successfully imported) and only import the lines of data after that.

What would be the easiest way? I am designing a database to store the data, and I want it to be very easy for non-access users to use. (i.e., push a button, file is imported, end of story) Thanks in advance for any help you can offer.
 
Help?

Still looking for any suggestions anyone can offer. Thanks!
 
Help?

Anyone...Anyone?
 
We use a program called Monarch to do this. It's probably possible in Access, but I don't know how.

You might have been better off in the Modules & VBA forum, although the guys & gals here tend to read everything.
 
Sounds interesting. Does this Monarch clean up the data before it goes into Access? Is it easy to use? Can I use it with Access somewhat invisibly? I am looking to make a button that the end user can push to import the data without having to think or know much about what is actually taking place to minimize confusion & errors. Can this Monarch be behind the Macro/VBA & work invisibly in this way?

Regarding the other forum, yes, after this post was up for a few days, it occurred to me that that might have been a better forum than this one, so I posted it there because I was worried that those folks would not see it. I was not aware that they do read everything.

Thanks for your help.
 
There are a number of ways you can go about this problem. Depending on what the data looks like, you can use primary keys or validation rules on your fields so that the leading records are not imported into your table.

You could also write a macro in Word that deletes the leading 4 lines, and then run the macro from Access.
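
For instance, something along these lines gives the general shape (untested; it's late-bound, the file path is only an example, and it just does the deleting through automation directly rather than calling a stored Word macro):

PHP:
Public Sub StripHeaderWithWord()
    ' Sketch only: open the text file in Word, delete the first
    ' four paragraphs (lines), save it back as plain text.
    Dim objWord As Object
    Dim objDoc As Object
    Dim i As Integer

    Set objWord = CreateObject("Word.Application")
    objWord.DisplayAlerts = False
    Set objDoc = objWord.Documents.Open("C:\Reports\report.txt")   ' example path only

    For i = 1 To 4
        objDoc.Paragraphs(1).Range.Delete
    Next i

    objDoc.SaveAs "C:\Reports\report.txt", 2    ' 2 = wdFormatText
    objDoc.Close False
    objWord.Quit
    Set objDoc = Nothing
    Set objWord = Nothing
End Sub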

Like I said, there are other ways. Give some info on what the data looks like from the beginning and how you want it stored in Access. The clearer the question, the more answers you will get, I think.

And by the way, it is not a good idea to post the same question twice.


Fuga.
 
I agree with Fuga... we need more info on the structure of the file.
 
Sorry about the confusion with the double-post. It occurred to me later that I had posted it in the wrong place, and I was worried that the right people would not see it. I was also trying not to be TOO wordy. Thank you for responding. Here is some sample junk data that is similar to the kind of stuff I need to import. This data is "captured" from another program in a text file. As you can see, it has leading irrelevant lines.

I plan to make the RequestNumber field unique, so that if the user accidentally imports the data twice, duplicate records will not be imported, only new ones.

Here is the sample data...Please note that the forum keeps deleting the extra tabs and spaces, but the thing is delimited by tabs in reality:

SPECIAL REP ORT 1
Date Of
Request Number Location of Event Issue Codes
Section Involved: Clinic 1
123.04098 "FEB 2,2004" OPT AREA X0A Referral
123.90404 "FEB 3,2004" GENERAL X0B Referral
123.04036 "FEB 3,2004" PC 1 X0B Referral
123.04047 "FEB 3,2004" PC 1 X0A Referral
123.90405 "FEB 3,2004" PC 2 X0B Application
123.04095 "FEB 2,2004" PC 1 X0A Referral
123.90403 "FEB 3,2004" PC 3 X0B Applicationl
123.04056 "FEB 3,2004" PC 4 X0B Referral
123.90401 "FEB 2,2004" PC 5 X0A Referral


On a side note, another problem I am having is that, junk records aside, the date is not importing at all. I think the quotes are confusing it, even though the date format of the input mask looks like Feb 2, 2004 (without the quotes). I need to find a way to strip them off as well.

Thank you so much for any help you can offer.
 
Is the file name the same, and is it in the same location, every time you import?

Note: use [ php] TEXT [/php ] tags to keep formatting. If you can do that now for us, that would be great.
:)
 
The file name CAN be the same every time if that is what I train the end users to do, so Yes to that. Here is the data again...

PHP:
SPECIAL REP	ORT			1
	Date Of			
Contact Number		Location of Event	Issue	Codes
Section Involved:  Clinic 1				
123.04098	"FEB  2,2004"	OPT AREA	X0A	Referral 
123.90404	"FEB  3,2004"	GENERAL		X0B	Referral 
123.04036	"FEB  3,2004"	PC 1		X0B	Referral
123.04047	"FEB  3,2004"	PC 1		X0A	Referral
123.90405	"FEB  3,2004"	PC 2		X0B	Application
123.04095	"FEB  2,2004"	PC 1		X0A	Referral 
123.90403	"FEB  3,2004"	PC 3		X0B	Applicationl
123.04056	"FEB  3,2004"	PC 4		X0B	Referral 
123.90401	"FEB  2,2004"	PC 5		I0A	Referral
 
so this
PHP:
SPECIAL REP    ORT            1 
    Date Of             
Contact Number        Location of Event    Issue    Codes 
Section Involved:  Clinic 1

is what you want to strip out?
 
Normally, because I am rooted in ASP, I would use the FSO (FileSystemObject) to perform a text stream read into an array based on the tab delimiter. Then I would have to find out how many tabs there are before I get to the core data. Then I would loop from X to UBound(textstreamdata) and join the separated text back together (where X is the number of tabs to start from). Then I'd save the file. From there (for Access) I would use the acImport function.

There are still quite a few things about Access that I haven't "discovered" yet compared with how I would normally do things from a web standpoint. So unless someone else has a better solution, I recommend researching how to use the FileSystemObject to get the data you need.
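
In Access VBA terms, the idea might look something like this (untested sketch; the file paths, the table name and the saved import spec name are just placeholders):

PHP:
Public Sub ImportSkippingHeader()
    ' Sketch of the FSO idea: copy everything after the first four
    ' lines to a new file, then import that. File paths, the table
    ' name and the saved import spec name are only placeholders.
    Dim fso As Object
    Dim tsIn As Object
    Dim tsOut As Object
    Dim i As Integer

    Set fso = CreateObject("Scripting.FileSystemObject")
    Set tsIn = fso.OpenTextFile("C:\Reports\report.txt", 1)             ' 1 = ForReading
    Set tsOut = fso.CreateTextFile("C:\Reports\report_clean.txt", True) ' overwrite

    ' Skip the four junk header lines
    For i = 1 To 4
        If Not tsIn.AtEndOfStream Then tsIn.SkipLine
    Next i

    ' Copy the real data across
    Do While Not tsIn.AtEndOfStream
        tsOut.WriteLine tsIn.ReadLine
    Loop

    tsIn.Close
    tsOut.Close

    ' "ReportImportSpec" would be a saved tab-delimited import specification
    DoCmd.TransferText acImportDelim, "ReportImportSpec", "tblContacts", _
        "C:\Reports\report_clean.txt", False
End Sub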
 
What will happen if you design your table so that the first field is a double, and then import the file?

Fuga.
 
One way to do it

The only problem you've got is that all the text fields aren't in quotes. However, if there are only a few possibilities for the third-last field, then you can do an InStr for them and concatenate quotes around them. You can also concatenate quotes around the fourth and fifth fields. Then you can use the code behind the form to suck the data into the tables. Don't forget to change the tabs (Chr(9)) to commas; that way you can use the default import specs for Access.
The date field is the only one that causes problems, but there is a function in a module to change those dates to the mm/dd/yyyy format that Access requires for import.
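
To give the flavour of it, a rough helper along these lines (untested; the field positions are assumed from your sample) wraps the unquoted text fields and swaps the tabs for commas, one line at a time:

PHP:
Public Function CleanLine(ByVal strIn As String) As String
    ' Sketch only: split a report line on tabs, drop the padding tabs,
    ' wrap any unquoted text field in quotes, rejoin with commas so
    ' the default comma-delimited import spec can read it.
    Dim varFld As Variant
    Dim varOut() As String
    Dim i As Integer
    Dim n As Integer

    If Len(Trim(strIn)) = 0 Then Exit Function

    varFld = Split(strIn, Chr(9))
    ReDim varOut(UBound(varFld))

    For i = 0 To UBound(varFld)
        If Len(Trim(varFld(i))) > 0 Then
            If n >= 2 And Left(Trim(varFld(i)), 1) <> """" Then
                varOut(n) = """" & Trim(varFld(i)) & """"   ' text fields
            Else
                varOut(n) = Trim(varFld(i))                 ' number & already-quoted date
            End If
            n = n + 1
        End If
    Next i

    If n = 0 Then Exit Function       ' nothing but padding on this line
    ReDim Preserve varOut(n - 1)
    CleanLine = Join(varOut, ",")
End Function

You would run each data line through that (after skipping the four header lines) before writing the cleaned file out for import.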
 
Thanks, but what about this error message...?

Thank you so much for this! I am working with it now. When I run it, this line:

Dy = Trim(Mid(TDate, Comma - 2, 2))

Gives the error message

Run time error '5'...Invalid procedure call or argument

Any suggestions on how to get around this?
 
What did you mean by this? Is this something I need to do to the data before import?

Don't forget to change the tabs (Chr(9)) to commas; that way you can use the default import specs for Access.
 
Kila said:
What did you mean by this? Is this something I need to do to the data before import?

It is far easier to import the data if it is cleaned first. You don't have to change the tabs to commas; Access can accept tabs as delimiters. It's just that in putting the text file example on the board, the tabs were lost when I pasted it to a text file. I'll change the text file I have and see if I can fix it up here. The reason for the error is that what I sent you was only a general idea of how to go about it. What I put in the db assumed the defaults for Access: it expects "" around all text and a comma as the delimiter. If you want to email me a bigger chunk of the text file, I'll see what I can do after work this evening.
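
For the error itself: run-time error 5 on that Mid() line is what you get when InStr doesn't find the comma (it returns 0, so the start position goes negative). A more defensive version of that sort of date routine would look roughly like this (sketch only, not the code in the attachment):

PHP:
Public Function FixDate(ByVal strIn As String) As String
    ' Sketch only (not the attached code): turns "FEB  2,2004" -
    ' with or without the quotes - into 2/2/2004, and hands back
    ' the input unchanged instead of erroring when it doesn't
    ' look like one of those dates.
    Dim strClean As String
    Dim lngComma As Long
    Dim strDay As String
    Dim strYear As String
    Dim intMonth As Integer

    strClean = Trim(Replace(strIn, """", ""))
    lngComma = InStr(strClean, ",")
    If lngComma < 5 Then            ' no comma where we expect it
        FixDate = strIn
        Exit Function
    End If

    intMonth = (InStr("JANFEBMARAPRMAYJUNJULAUGSEPOCTNOVDEC", _
                      UCase$(Left$(strClean, 3))) + 2) \ 3
    If intMonth = 0 Then            ' month abbreviation not recognised
        FixDate = strIn
        Exit Function
    End If

    strDay = Trim(Mid$(strClean, 4, lngComma - 4))
    strYear = Trim(Mid$(strClean, lngComma + 1))

    FixDate = intMonth & "/" & strDay & "/" & strYear
End Function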
 
That's it...but I've got bigger fish to fry.

Actually, that's all I have. I don't have rights to the program that spit it out. Another user forwarded the data to me (although I DID tweak it a bit before posting it to keep it confidential). If you want, I can generate more data by cutting & pasting the lines & giving each a unique ID#.

However, EEEEK! I have a larger problem before I can proceed with this one.... check out this link. Maybe you will have other suggestions.

http://www.access-programmers.co.uk/forums/showthread.php?t=63648

Thanks for your help!
 
This Is What I Would Do..........

Complicated but...

Import the data into a 2-field table...call it TBL_IMPORTS. The first field, named ID for example, should be an AutoNumber field. The second field, named say DATA, should be a text field set to the maximum 255 characters for safety.

Create a query called QRY_MINIMUM which outputs the lowest AutoNumber generated by your imported text.

Now create an append query called QRY_APPEND IMPORTED DATA and within that query show your QRY_MINIMUM and the table TBL_IMPORTS, but do not create a relationship between the two. Add criteria stating that the ID number exceeds the QRY_MINIMUM figure by at least 4, i.e. [TBL_IMPORTS]![ID] > ([QRY_MINIMUM]![MINOFID] + 3), and thus you will insert into your table only the records which are not in the first four lines!
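
If it helps, the same thing can be written as a single append query with the QRY_MINIMUM step folded into a subquery; something like this (the destination table and field names are placeholders, and you would still need to split DATA into its real columns afterwards):

PHP:
Public Sub AppendImportedData()
    ' Sketch of the same idea in SQL: append everything from
    ' TBL_IMPORTS except the four lowest-ID rows (the header lines).
    ' tblContacts / RawData are placeholder names.
    Dim strSQL As String

    strSQL = "INSERT INTO tblContacts ( RawData ) " & _
             "SELECT T.[DATA] " & _
             "FROM TBL_IMPORTS AS T " & _
             "WHERE T.ID > (SELECT MIN(ID) + 3 FROM TBL_IMPORTS);"

    CurrentDb.Execute strSQL, dbFailOnError
End Sub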

Let me know how you get on...regards,

NEIL
 