Puzzle Challenge - How to collapse records

cyberman55

Registered User.
Local time
Today, 16:35
Joined
Sep 22, 2012
Messages
83
This is an oddball. I have to import 255 fields (1 record) from an Excel spreadsheet. So, naturally, I used the DoCmd.TransferSpreadsheet command. I've used it many times in the past, but never for more than 20 fields or so. With this many fields it failed, usually around column 117, and intermittently beyond that, so the incoming data was spotty and incomplete. I could not find any documentation of this kind of bug. The errors table listed "Row Truncation" for all the problem fields, yet none of the data had more than 60 characters, and I am dumping the data into a table with every field defined as Short Text.

So, I decided to see if I could bring in the data in smaller chunks of 52-78 columns at a time. And it worked! This proved to me that there was nothing wrong with the incoming data.

But running this code:

Code:
strEndRange = "CopyHere!A1:BZ2"


DoCmd.TransferSpreadsheet acImport, acSpreadsheetTypeExcel12, "CopyHere", strWBNameAndPath, -1, strEndRange


strEndRange = "CopyHere!CA1:DZ2"


DoCmd.TransferSpreadsheet acImport, acSpreadsheetTypeExcel12, "CopyHere", strWBNameAndPath, -1, strEndRange


strEndRange = "CopyHere!EA1:FZ2"


DoCmd.TransferSpreadsheet acImport, acSpreadsheetTypeExcel12, "CopyHere", strWBNameAndPath, -1, strEndRange


strEndRange = "CopyHere!GA1:ITZ2"


DoCmd.TransferSpreadsheet acImport, acSpreadsheetTypeExcel12, "CopyHere", strWBNameAndPath, -1, strEndRange

I ended up with four records in table CopyHere, each with no values except in its own imported range.

So, the challenge (at least for me) is: how do I collapse all the data into a single record? I can't see an easy way to do it.

Just to clarify something that may be confusing: the tab in the workbook I'm importing is named "CopyHere" and the table I'm importing into is also named "CopyHere" (and yes, I know, bad practice).
 

MajP

You've got your good things, and you've got mine.
Local time
Today, 16:35
Joined
May 21, 2018
Messages
8,527
I have to import 255 fields (1 record) from an Excel spreadsheet.
Why? There is no way that is usable data. It is unlikely that any entity can be described by 255 attributes. Why not correctly normalize the data first? You cannot reasonably use a 255-field table.
However, if you had to, you should have put a common column in each range. That could simply be a row number (1 to the number of rows). Then link on that.
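For example, if each chunk were imported into its own staging table (Chunk1 through Chunk4 here, each with a RowNum key column; all names purely illustrative), the link is a single join query:

Code:
SELECT c1.*, c2.*, c3.*, c4.*
FROM ((Chunk1 AS c1
    INNER JOIN Chunk2 AS c2 ON c1.RowNum = c2.RowNum)
    INNER JOIN Chunk3 AS c3 ON c1.RowNum = c3.RowNum)
    INNER JOIN Chunk4 AS c4 ON c1.RowNum = c4.RowNum;

Note the nested parentheses: Access SQL requires them when you chain more than one join.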
 

The_Doc_Man

Immoderate Moderator
Staff member
Local time
Today, 15:35
Joined
Feb 28, 2001
Messages
27,175
Well, here's a thought for you. The maximum record size (not counting long-text data types, a.k.a. memo fields, and BLOBs, a.k.a. embedded or attached items) is 4096 bytes. But if you have 255 Short Text fields you are looking at over 7,500 bytes: you said no individual field was more than 60 characters, so I took half that (30) as an average, and 255 × 30 = 7,650 bytes.

Access will never handle a "standard" record longer than about 4K bytes. Besides, I also agree with MajP that at least some part of that data set is not normalized. If you can normalize the data, you can probably compress the livin' heck out of it, because normalization often removes not just a few bytes but whole fields. It is for cases such as this one that we usually tout the merits of normalization.
 

plog

Banishment Pending
Local time
Today, 15:35
Joined
May 11, 2011
Messages
11,646
I'd break your 255 columns into 5 spreadsheets of 51 columns each. Then in each one I would insert a column before all your real data (as a new column A) and make that column hold the row number (A1=1, A2=A1+1, A3=A2+1, etc.), so each row has its own unique number that matches the corresponding row in the other 4 spreadsheets. That way you have a way to keep all the rows of data together even though they are spread across 5 spreadsheets.

Once imported, you can link them via that row-number column and have all your data back together. However, once in Access, that data shouldn't all be brought back together in one table/query. There is no way a properly normalized database has a table with 255 fields. You might as well stay with Excel if that's how you are going to set up your tables.
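A rough sketch of what the five imports might look like in VBA (worksheet and table names are assumptions, and strWBNameAndPath is the path variable from the earlier post; untested):

Code:
Dim i As Integer
Dim strRange As String
' Each worksheet Part1..Part5 holds the row number in column A plus 51 data columns (A:AZ = 52 columns)
For i = 1 To 5
    strRange = "Part" & i & "!A1:AZ2"
    DoCmd.TransferSpreadsheet acImport, acSpreadsheetTypeExcel12, _
        "tblPart" & i, strWBNameAndPath, True, strRange
Next i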
 

Uncle Gizmo

Nifty Access Guy
Staff member
Local time
Today, 21:35
Joined
Jul 9, 2003
Messages
16,280
In 2019 I was messing around with importing large datasets in excess of 255 columns, the maximum number of fields in MS Access. I was using the Microsoft Access import wizard to import the data, and I had some problems.

I changed the import type from Short Text to Long Text (as suggested by Richard, "The_Doc_Man"), in other words from a text field to a memo field, talking in old Access terms. It imported the data OK; however, it did throw up a very strange message.

The error was:
Method execute Temp IMEX spec of object _WizHook failed

If you want to go over the 255-field limit, then you will need to be a bit creative!

The first thing I would suggest is that you convert your data to comma-separated values (CSV) in a text file. I recommend this because practically every system has a way of exporting its data as comma-separated value files. With CSV, any VBA code you write will work with any system. You just have the extra step of converting to CSV first...
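For instance, a bare-bones sketch of converting a workbook to CSV from VBA via Excel automation (untested; the paths are placeholders):

Code:
Dim xl As Object, wb As Object
Set xl = CreateObject("Excel.Application")
Set wb = xl.Workbooks.Open("C:\Temp\MyWideData.xlsx")  ' placeholder path
wb.SaveAs "C:\Temp\MyWideData.csv", 6                  ' 6 = xlCSV
wb.Close SaveChanges:=False
xl.Quit
Set wb = Nothing: Set xl = Nothing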

The other thing you should know is that I didn't finish my project! I was trying to make a generic system for carrying out this type of import. I built a Google spreadsheet with 1122 columns and exported that to CSV. Then, using VBA in MS Access, I imported this in chunks, placing each chunk in its own table. In theory I could import as many columns as necessary. The first problem I encountered was that it wouldn't import more than about 80 columns at a time, whatever I did; if I went over 80 columns, something would go wrong. Not a problem: I just made my destination tables 80 columns or less....
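The basic idea, stripped right down (a sketch only: a real CSV parser has to handle quoted commas, and the file name and the chunk size of 80 are assumptions from my notes):

Code:
Dim intFile As Integer
Dim strLine As String
Dim varFields As Variant
Dim i As Long
intFile = FreeFile
Open "C:\Temp\MyWideData.csv" For Input As #intFile
Do While Not EOF(intFile)
    Line Input #intFile, strLine
    varFields = Split(strLine, ",")   ' naive split; embedded commas would break this
    ' Walk the row in chunks of up to 80 fields
    For i = 0 To UBound(varFields) Step 80
        ' ...append fields i to i+79 of this row to table "tblChunk" & (i \ 80) + 1 here...
    Next i
Loop
Close #intFile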

The other problem is that you are limited in the number of characters you can import; I think it's around 2000. I've got some private notes on this process. I say "private" because I wouldn't want anybody to read them, as they are not put together that well; they are for my own use. I'll post a few extracts for you, along with some links. Also, Richard ("The_Doc_Man"), a member of this forum, was very helpful in working out what was going wrong. Richard knows a lot about how the fundamental Windows file systems work.


Links:-
Another way of Reading a Text File

Another:-

This was of interest because it captures the first character of the string and checks to see if it is a line feed… Note the guy commented that he wasn't sure!


I do have a nearly completed project for importing wide CSV files automatically, but I don't know when I can finish it.
 

Uncle Gizmo

Nifty Access Guy
Staff member
Local time
Today, 21:35
Joined
Jul 9, 2003
Messages
16,280

Importing large CSV files into MS Access - Nifty Access


I did a quick demo of what I've got so far...
 

cyberman55

Registered User.
Local time
Today, 16:35
Joined
Sep 22, 2012
Messages
83
Thanks for all your responses. Perhaps I didn't give you a good background/overview. The client, a law firm, sends out a set of forms for their clients to fill out relating to estate planning. They get back a .csv file that's 1271 fields wide. Ultimately, the data gets parsed into a main table and at least 10 related tables. Currently one of their admins spends hours manually entering the data into Access. So, to start, I had them pick the 255 highest-priority fields. I think the suggestion to import the blocks of data into separate temp tables is a good one. Then, since the data has to be parsed into different tables anyway, I can do it directly from each of these temp tables. I was stuck on bringing all the data into one master table first, but since the data has to be appended to many different tables, well, why bother?
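Each parse step then becomes an ordinary append query run against a temp table, something like this (table and field names invented here just to show the shape):

Code:
Dim db As DAO.Database
Set db = CurrentDb
' One append per target table, pulling from the matching temp table
db.Execute "INSERT INTO tblBeneficiary (ClientID, BenName, BenRelation) " & _
           "SELECT t.ClientID, t.Ben1Name, t.Ben1Relation FROM tblTemp2 AS t;", dbFailOnError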
 

The_Doc_Man

Immoderate Moderator
Staff member
Local time
Today, 15:35
Joined
Feb 28, 2001
Messages
27,175
Sometimes I sound like a broken record on this subject, but "divide and conquer" worked over 2000 years ago for Julius Caesar when he divided Gaul (France) into three parts and conquered each part in turn. Divide that beast up into related chunks. Just out of curiosity, how many records are we looking at?

I'll also suggest that an ounce of prevention is worth a pound of cure. Lay out a strategy ahead of time for how you are going to load what you are getting from that monster of a spreadsheet. When I was working for the U.S. Navy as a systems administrator, our yearly security reviews were done using monster spreadsheets: not as many columns as yours, but quite a few that wouldn't have fit within the 255-field limit if we had wanted to convert them to Access. They were ALWAYS tough to manage, so you have my sympathy.

Another reason for wanting to break it up is that right now you have "wide" data, but Access works best when you have "tall" data, i.e. not that many fields but a lot of records. So breaking up the beastly records into smaller chunks will probably be a good strategy. It will be worth your while to REALLY look at that set of fields so that you can optimize the import process. The less you have to update and rearrange tables, the less maintenance work you will need, because doing a lot of updating with that much data quickly leads to database bloat. Your DB back end will require frequent Compact & Repair operations, and when you do that, you will find that your DB shrinks significantly. That size change is the bloat, and it is something to be avoided, or at least recognized and controlled.
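If you ever want to do that compacting from code rather than from the menu, DAO has a one-line call for it. A minimal sketch, with placeholder paths (the destination file must not already exist, and you swap the files afterwards):

Code:
' Compact the back end into a new file
DBEngine.CompactDatabase "C:\Data\Estate_BE.accdb", "C:\Data\Estate_BE_compacted.accdb"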

The link below lists the sizing limits for Access. It says the maximum size for a single record is 4,000 bytes, excluding certain cases. Since you are using Short Text fields, none of the exclusions appear to apply to what you are doing.

 

pekajo

Registered User.
Local time
Tomorrow, 06:35
Joined
Jul 25, 2011
Messages
133
Hi,
Just a suggestion: pivot the one record into a single column in Excel and import that. You end up with one field holding all the record's (column) data, and you can start from there; a rough sketch is below.
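Something like this in Excel VBA (untested; sheet name taken from the thread, and IU is column 255):

Code:
' Copy the wide row and paste it transposed onto a new sheet as one column
Dim wsIn As Worksheet, wsOut As Worksheet
Set wsIn = Worksheets("CopyHere")
Set wsOut = Worksheets.Add
wsIn.Range("A1:IU1").Copy
wsOut.Range("A1").PasteSpecial Transpose:=True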
Peter
 

Pat Hartman

Super Moderator
Staff member
Local time
Today, 16:35
Joined
Feb 19, 2002
Messages
43,266
How does the file get created initially? Surely the client isn't creating that monster .csv file themselves. Maybe you want to look at fixing the problem at the source. You can create a spreadsheet that collects the data in a very organized manner, where only the data-entry fields are unlocked. You can create separate tabs for the items that recur so they can be processed as lists.
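Something along these lines locks everything except the entry cells (Excel VBA; sheet name, range, and password are illustrative):

Code:
' Unlock only the data-entry cells, then protect the sheet
With Worksheets("Intake")
    .Cells.Locked = True
    .Range("B2:B50").Locked = False   ' the data-entry cells
    .Protect Password:="secret"       ' placeholder password
End With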
 
