File too large

gblack

I thought I posted this last night, but I can't find it even searching on myself... so I guess it never took... if you actually see this elsewhere on the site, my apologies...

I am attempting to create a metrics analysis table from another table. What I would like to do is copy the structure (only) from table 1 into a new table, change all the fields in the new table to text (except for an ID field, which would be an AutoNumber), then run a separate group-by query against each column, counting the values in each group (i.e. the first query would have two fields: the grouped column and the count for that column).

Once I have these values I would like to concatenate them (with the count in parens) and then push these values back into the new table under the appropriate column.

My code does this. I basically loop through a recordset for each column/field that groups and counts its values, and then Edit the new table with the concatenated data.
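Not my exact code, but a minimal DAO sketch of that loop (tblSource and tblMetrics are placeholder names, and the metrics table is assumed to already exist, pre-seeded with its blank rows):

Code:
Sub BuildColumnMetrics()
    ' Rough sketch only: for each column of the source table, run a
    ' group-by/count query and write "value (count)" into the matching
    ' column of the (pre-seeded) metrics table.
    Dim db As DAO.Database
    Dim fld As DAO.Field
    Dim rsGroup As DAO.Recordset
    Dim rsOut As DAO.Recordset
    Dim sql As String

    Set db = CurrentDb

    For Each fld In db.TableDefs("tblSource").Fields
        sql = "SELECT [" & fld.Name & "] AS GrpVal, Count(*) AS GrpCount " & _
              "FROM tblSource GROUP BY [" & fld.Name & "] ORDER BY Count(*) DESC;"
        Set rsGroup = db.OpenRecordset(sql, dbOpenSnapshot)
        Set rsOut = db.OpenRecordset("tblMetrics", dbOpenDynaset)

        Do While Not rsGroup.EOF And Not rsOut.EOF
            rsOut.Edit
            ' Trim to the 85-character cap on the metrics table's Text fields.
            rsOut.Fields(fld.Name).Value = Left$(Nz(rsGroup!GrpVal, "NULL") & _
                " (" & Format(rsGroup!GrpCount, "#,##0") & ")", 85)
            rsOut.Update
            rsOut.MoveNext
            rsGroup.MoveNext
        Loop

        rsGroup.Close
        rsOut.Close
    Next fld
End Sub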

My first table is 170 fields and 38K records. The issue is that it's too much for Access to handle and it blows up (on field 123), telling me the file is too large. The file does explode to 1 GB. Then I can shrink it back down to 67 MB by running a compact and repair... and then run the process for the rest of the fields in that table. When I compact again I get about 80 MB.

So now I have two tables, both with an ID field... so I try to link them together (via a make-table query) and meld them into one table... but it keeps running into that "File Too Large" issue.

How can I have two tables in a database file with a combined size of 80 MB, yet when they're linked together they're too large for the database file? Does it have something to do with having all text fields?

I looked up the limits of MS Access and the field count doesn't appear to be an issue, since it's nowhere near 255... So what's the problem here?

Thanks,
Gary
 
I don't know where the error comes from, but I do know that your table is far from normalized.
Even if Microsoft allows 255 fields, I can't imagine an object with... 170 (important) properties.
So I'm sure that your table can be (and SHOULD be) split into more tables, each keeping data for only one object.
 
...My first table is 170 fields and 38K records. The issue is that it's too much for Access to handle and it blows up (on field 123), telling me the file is too large.

A Table that needs 170 Fields would be extremely unusual. It sounds like it may be the layout for a Spreadsheet, but without any more information, it is difficult to assist you further. If you could provide a Table Layout, or a stripped down version of the Database with a few records of test data, I suspect that we would have a greater ability to assist you.

-- Rookie
 
Sorry, one quick but important correction to my post... It's telling me "Record is too large" when I try to combine the tables, and "File too large" when I try to run the process for the entire table...

Getting to the responses:
Normalized or not, the tables are what they are. These are Oracle tables used in a MAXIMO system to house procurement data for the US Navy... There is nothing I can do about the table structure (as I am just a peon contractor with no real rights to that stuff). But there is something I can do about the data within. So I am putting together some tools to do some analysis.

Basically I've completed this task... with a few minor issues (i.e. some of the larger tables have to be chopped in half, column-wise, in order to push the grouped column data and counts). I've limited the field sizes to a character (i.e. Text) limit of 85, and the record limit of the metrics table is 500 (figuring that anything grouped into 500+ distinct values is likely going to be mostly unique singletons and thus inconsequential to the overall evaluation).

So my data in the metric table would look like this for the first 3 records:
AUTO_ID  ASSET_ID       STATUS           COLOR          SERNUM
1        198267364 (1)  ACTIVE (37,123)  RED (20,545)   NULL (38,376)
2        198267365 (1)  DISPOSED (347)   BLUE (14,987)  XVS129867 (1)
3        198267366 (1)  ITB (72)         GREEN (2,987)  GDG82930 (1) ...

This layout allows us to easily scan the fields and find anomalies and other issues with the data we're putting into the system... at some point maybe the folks who deal with the actual structure will see this and realize they don't need the vast majority of the fields the table is using... dunno...

Anyway, my question still remains... Why, when I try to put the two table halves together, does it tell me: Record is too Large?

Thanks,
Gary
 
There is nothing I can do about the table structure (as I am just a peon contractor with no real rights to that stuff). But there is something I can do about the data within
Indeed. And this "something" can be to "split" that table by using queries, then use those queries instead of that very large table.

Anyway, my question still remains... Why.....
Maybe someone knows the exact answer from theory. I have serious doubts that anyone knows the answer from practice.
 
...It's telling me "Record is too large" when I try to combine the tables, and "File too large" when I try to run the process for the entire table... Anyway, my question still remains... Why, when I try to put the two table halves together, does it tell me: Record is too Large?

More than likely, when you combine the table with itself in the query, your field count is doubled, i.e. your original 170 multiplied by 2 is > 255.

...Resulting in a too many fields type error.

As for the 170 fields x 37K records - when you dump the entire thing into a recordset, there are a few things that can be happening: you're hitting your RAM cap, you're hitting the 2 GB MS Access database file-size cap, or the query is simply hanging for some reason...

Converting all fields to text will result in a substantially bloated file, especially during the conversion process. Try moving the tables to separate databases and linking them. Keep an eye on your file properties while the automation is occurring, and consider doing the automation piecemeal if you can't run the whole table in one pass.
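One way to set up that kind of link in code, as a rough sketch (the path and table names here are placeholders, not taken from your setup):

Code:
Sub LinkExternalTable()
    ' Link a table that lives in a separate back-end file so the heavy
    ' data stays out of the working database. Placeholder path/names.
    DoCmd.TransferDatabase TransferType:=acLink, _
        DatabaseType:="Microsoft Access", _
        DatabaseName:="C:\Data\SourceData.accdb", _
        ObjectType:=acTable, _
        Source:="tblSource", _
        Destination:="tblSource_Linked"
End Sub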
 
Splitting the table is a must in order to get my end result. I said I did this from the get-go. But why I must split it is the question.

I pulled this from the internet:
Number of enforced relationships: 32 per table minus the number of indexes that are on the table for fields or combinations of fields that are not involved in relationships
Number of tables in a query: 32
Number of fields in a recordset: 255
Recordset size: 1 gigabyte
Sort limit: 255 characters in one or more fields
Number of levels of nested queries: 50
Number of characters in a cell in the query design grid: 1,024
Number of characters for a parameter in a parameter query: 255
Number of ANDs in a WHERE or HAVING clause: 99
Number of characters in an SQL statement: approximately 64,000

MS Access allows up to 255 columns in any given table, and that's a lot! The table I am pulling in only has 170. Now, I am scrubbing the datatypes and making them all Text(85)... but whether they're Text(85) or Text(255) seems to have no bearing on when MS Access crashes.

The fact that I can have one table that's 100 fields and another table that's 70 fields, and my database is only 80 MB, confuses me as to why I can't link the two tables together without getting a "Record too large" error... because these tables are relatively small (with a 500-record cap).

Hmm... maybe there's a record with an extra quote or something, causing this issue...

Hmm... maybe if I run a LEN(whole record) it'll show me some crazy huge field or something... Imma go do that and see
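Something along these lines should do it - a rough DAO sketch that totals the Len() of every field in each row (tblMetrics and AUTO_ID are just the example names from above). Worth keeping in mind: if I remember the Jet/ACE spec correctly, the total characters in a single record, excluding Memo and OLE fields, is capped at roughly 2,000 (about 4,000 with Unicode compression in newer versions), so 170 Text(85) columns could blow past the per-record limit even though 170 is well under the 255-field limit.

Code:
Sub ShowRecordLengths()
    ' Rough check: print the total number of characters actually stored
    ' in each row, to spot any suspiciously long records.
    Dim db As DAO.Database
    Dim rs As DAO.Recordset
    Dim fld As DAO.Field
    Dim total As Long

    Set db = CurrentDb
    Set rs = db.OpenRecordset("tblMetrics", dbOpenSnapshot)

    Do While Not rs.EOF
        total = 0
        For Each fld In rs.Fields
            total = total + Len(Nz(fld.Value, ""))
        Next fld
        Debug.Print rs!AUTO_ID, total
        rs.MoveNext
    Loop
    rs.Close
End Sub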
 
Splitting the table is a must in order to get my end result. I said I did this from the get-go. But why I must split it is the question... The fact that I can have one table that's 100 fields and another table that's 70 fields, and my database is only 80 MB, confuses me as to why I can't link the two tables together without getting a "Record too large" error...

FYI: Splitting a table and splitting the entire database are very different; I am uncertain which option you chose.

You should have no trouble joining 2 tables (1 with 100 fields and 1 with 70 fields) in a query, by Primary Key.
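For example, a sketch of that kind of join written as a make-table query in code. The names are stand-ins (tblMetricsA, tblMetricsB, tblMetricsCombined), and the second half's key is assumed to be called AUTO_ID2 so the output doesn't end up with two columns of the same name:

Code:
Sub CombineHalves()
    ' Join the two half-tables on the shared key and write the result
    ' out as a single table in one pass.
    Dim sql As String
    sql = "SELECT A.*, B.* INTO tblMetricsCombined " & _
          "FROM tblMetricsA AS A INNER JOIN tblMetricsB AS B " & _
          "ON A.AUTO_ID = B.AUTO_ID2;"
    CurrentDb.Execute sql, dbFailOnError
End Sub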
 
Yep I understand... I am not splitting any database.

I am basically chopping the table in half (meaning half of the columns are in one table (with a PK) and half in the other (with a PK)), because when I attempt to run my process Access blows up to 1 GB and says "File too large".

So I run two processes, and after I repair and compact, the database (with both tables) is only 80 MB...

However, when I try to connect them together via a make-table query, MS Access breaks and says "Record too large".

That's what's baffling me.
 
...when I try to connect them together via a make-table query, MS Access breaks and says "Record too large". That's what's baffling me.

The repair/compact size of a database isn't relevant. The second you try to manipulate tables or data in the database, the file size will jump. You can watch your file size balloon in real time - just open file properties and keep an eye on it as you run your functions. If you ever see the size pass the 1.5 GB mark, start to panic.

I'm not certain if your "record too large" error is your field count, which you indicated caused some trouble, or actual database size, or if you're running out of memory because you're trying to cache a potentially 2+ GB recordset.

At this point it's really just theory-crafting. Moving the table chunks to different databases and pointing the make-table query to a third database will drastically slow the process down, but it may solve size issues if actual database size is the error factor. Good luck!
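A hedged sketch of that third-database variant, using the IN clause that Access SQL allows on the destination of a make-table query (the path is invented, the target .accdb has to exist already, and the table/key names are the same stand-ins as the earlier sketch):

Code:
Sub CombineIntoThirdDatabase()
    ' Write the combined table into a separate, pre-created database file
    ' so the working database doesn't have to absorb the growth.
    Dim sql As String
    sql = "SELECT A.*, B.* INTO tblMetricsCombined IN 'C:\Data\MetricsOut.accdb' " & _
          "FROM tblMetricsA AS A INNER JOIN tblMetricsB AS B " & _
          "ON A.AUTO_ID = B.AUTO_ID2;"
    CurrentDb.Execute sql, dbFailOnError
End Sub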
 
I saw this on the web:
http://support.microsoft.com/kb/111304 It basically says to change some text fields to memo fields.

I saw another thread where a guy was having the same issue. They told him the same fix (i.e. change text to memo), but he said that when he changed all of the text to memo, Access gave him a "too many fields" error...
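For what it's worth, a rough sketch of how that Text-to-Memo conversion could be scripted rather than done by hand in design view (tblMetrics and the AUTO_ID exclusion are placeholders):

Code:
Sub ConvertTextFieldsToMemo()
    ' Collect the Text columns first, then ALTER them, so the table
    ' definition isn't changing while we're still iterating over it.
    Dim db As DAO.Database
    Dim fld As DAO.Field
    Dim names As New Collection
    Dim v As Variant

    Set db = CurrentDb
    For Each fld In db.TableDefs("tblMetrics").Fields
        If fld.Type = dbText And fld.Name <> "AUTO_ID" Then names.Add fld.Name
    Next fld

    For Each v In names
        db.Execute "ALTER TABLE tblMetrics ALTER COLUMN [" & v & "] MEMO;", dbFailOnError
    Next v
End Sub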

So I changed over half to memo and it helped a bit, but I still wasn't able to get through all the fields before crashing...

The strange thing was that the first record now has gibberish in it for some of the MEMO fields, and in the case of EMIEC (see attached screen shot) it didn't pick up the count in parens. Doesn't look like a real fix.

So I don't know... I guess I leave it alone and in pieces. I am just doing analysis work on the data, so I guess it doesn't hurt to have the bigger tables split... but still, it's pretty annoying! :banghead:
 

Attachments

  • Capture.PNG (27.8 KB)
