Database with large table is crashing

jal

I'm using C#.Net code to populate an .MDB file (with data pulled from text files). This is my first experience with a large table. After populating half a million records, my C#.Net program, called "SearchText", closes with the message, "SearchText has encountered a problem and has to close. We are sorry for the inconvenience."

It's hard to debug at work because I don't have C#.Net installed here (i.e. no Visual Studio). So I'll have to do it at home - but what if I can't reproduce the problem?
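One thing I can try in the meantime is logging any unhandled exception to a text file, so the real error text shows up on the work machine even without Visual Studio. A rough sketch (the log file name and program structure are just placeholders, not my actual code):

using System;
using System.IO;

static class Program
{
    static void Main()
    {
        // Write any unhandled exception to a log file so the real error text
        // is visible even on a machine with no Visual Studio installed.
        AppDomain.CurrentDomain.UnhandledException +=
            delegate(object sender, UnhandledExceptionEventArgs e)
            {
                File.AppendAllText("SearchText_error.log",
                    DateTime.Now + ": " + e.ExceptionObject + Environment.NewLine);
            };

        // ... normal file parsing / database loading would go here ...
    }
}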

So anyway, if anyone knows of any "gotchas" with large tables that might be causing this problem, please let me know.

The data still seems good - I can open the tables with Access 2007 here at work. The tables are simple (no memo fields), there are only about six of them, and none of them have more than 10 columns. The table with the half-million records only has about four columns.

After the last crash I compacted it down to 45 MB so it's not very large.

PS...Then again I guess there is a data problem. The Autonumber column of the large table doesn't seem to be populating (weird) or at least isn't displaying in Access 2007. It's not the primary key, though.
 
Don't know if it's the cause, but remember there is a 2 gig size limit on Access databases.
 
Don't know if it's the cause, but remember there is a 2 gig size limit on Access databases.
Right, but it only grew to 190 MB and, after the crash, it compressed (i.e. compacted) down to 45 MB.
 
Anyone know why the Autonumber column isn't populating?
 
What is the code to actually populate the tables?
 
What is the code to actually populate the tables?

I have a sub which adds parameters automatically to a cmd object (which has proven very successful in other projects). So I create the cmd object like this:

// Parameterize is my own helper that builds the command and adds the parameters for me.
OleDbCommand cmdINSERT = (OleDbCommand)oQuery.Parameterize("INSERT INTO Accounts (FileID, Account) VALUES (@FileID, @Account)");
cmdINSERT.Parameters["@FileID"].Value = intFileID;
cmdINSERT.Parameters["@Account"].Value = strAccountNumber;
cmdINSERT.ExecuteNonQuery();


The primary key is the two columns FileID and Account. There is a third column called AutoNum (which isn't populating - why?) which is clearly set to AutoNumber. And a fourth column (boolean) with a default value set to false.

I have heard that @-signs can throw Access off, so I may have to try question marks, but I've had a lot of success with @-signs to date (even in Access).
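For the record, my understanding is that the Jet OLE DB provider treats parameters as positional anyway - the names are basically ignored and only the order matters - so the question-mark version would look roughly like this (assuming conn is the already-open OleDbConnection):

// Positional-parameter form for the Jet OLE DB provider.
// The names passed to AddWithValue are ignored by the provider;
// only the order has to match the order of the ? placeholders.
OleDbCommand cmd = new OleDbCommand(
    "INSERT INTO Accounts (FileID, Account) VALUES (?, ?)", conn);
cmd.Parameters.AddWithValue("@FileID", intFileID);         // first ?
cmd.Parameters.AddWithValue("@Account", strAccountNumber); // second ?
cmd.ExecuteNonQuery();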
 
I went into SQL view and typed out an INSERT INTO command (inserted one record) and here again, the record inserted, but the Autonumber column didn't populate. (I did not use parameters in this manual insert). So even if the @-params are causing problems, they are probably not the cause of this autonumber problem.

Maybe my DB is corrupt? But how can this be if it is brand new, and with such simple tables? There are no forms, queries, macros, or other objects - just a few simple tables.

NONE of the tables are populating the autonumber column.
 
Oh god, I'm a complete idiot. Those columns are set to "Number", not AutoNumber - can't read.
 
Oh god, I'm a complete idiot. Those columns are set to "Number", not AutoNumber - can't read.

That would do it :D. No worries, I think we've all done something like that at times. And me, many times. :)
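For what it's worth, if you ever create those tables from code rather than in the Access UI, the Jet DDL type for a real AutoNumber is COUNTER. A rough sketch (AutoNum, FileID and Account are your names, the boolean column name and field types are just guesses, conn is an open OleDbConnection):

// Sketch only: Jet DDL for a table with a genuine AutoNumber column.
// COUNTER (a.k.a. AUTOINCREMENT) is the AutoNumber type, LONG is a Long Integer,
// TEXT(50) a short text field and YESNO a boolean.
string ddl =
    "CREATE TABLE Accounts (" +
    " AutoNum COUNTER," +
    " FileID LONG," +
    " Account TEXT(50)," +
    " Archived YESNO," +
    " CONSTRAINT PK_Accounts PRIMARY KEY (FileID, Account))";

OleDbCommand cmdDDL = new OleDbCommand(ddl, conn);
cmdDDL.ExecuteNonQuery();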
 
Now I just have to figure out why it has been crashing after a half-million records.
 
I'm expecting it to be a bad record causing the problem - maybe I'm inserting a null somewhere, or a value too large for the field. At the moment I've restarted the code (on a different batch of data) to see how many records get populated before it crashes again - I'm wondering if it will crash on the exact same number of records.
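To narrow it down, I'm also thinking of wrapping the insert in a try/catch so the first bad record gets logged instead of killing the whole program. Something like this, reusing the same command object from before (sketch only):

try
{
    cmdINSERT.Parameters["@FileID"].Value = intFileID;
    cmdINSERT.Parameters["@Account"].Value = strAccountNumber;
    cmdINSERT.ExecuteNonQuery();
}
catch (OleDbException ex)
{
    // Log the values that caused the failure so the bad record can be found,
    // then rethrow (or just skip the record) instead of dying with a generic crash dialog.
    File.AppendAllText("insert_failures.log",
        "FileID=" + intFileID + ", Account=" + strAccountNumber +
        " : " + ex.Message + Environment.NewLine);
    throw;
}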
 
Make sure that any number fields are set to accept the highest number possible. You might be getting to the max of some field size (see here for more).
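For example, a Number field left at the Integer size tops out at 32,767 (Byte at 255, Long Integer at 2,147,483,647). If you want a belt-and-braces check on the C# side as well, something like this would do - the cutoff depends on whatever field size you actually chose:

// Guard against values larger than the Access field size can hold.
// Access "Integer" is 16-bit (max 32,767); "Long Integer" is 32-bit.
const int jetIntegerMax = 32767;

if (intFileID > jetIntegerMax)
{
    // Either widen the column to Long Integer in Access,
    // or log and skip this record before the insert fails.
    Console.WriteLine("FileID " + intFileID + " exceeds the Integer field size.");
}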
 
Yes, it must be something like that - the second batch of data is going way past the half-million mark, so the first batch must have had a bad record. I'll probably figure this out eventually.
 
I don't think the problem was with Access. The logic is set up to parse small text files (50 KB in size) and load the data into Access. What I didn't know is that, in one case, the IT dept zipped hundreds of them into one large zip file (120 MB). So when my logic tries to load that file, it probably crashes due to a memory overflow. That's my theory, anyway.

So why didn't I just filter for files with .TXT extensions? Our IT dept wasn't wise enough to use .TXT extensions. In many cases they use the extension to form part of the filename, viz. SomeFile.anypossibleName.

So if that's the problem, I can definitely find a workaround.
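The workaround I have in mind is just to check the file size before parsing and skip anything that clearly isn't one of the small text files, roughly like this (the 1 MB cutoff and sourceFolder are just placeholders; uses System.IO):

// Skip anything far bigger than the ~50 KB text files, e.g. the 120 MB zip
// the IT dept dropped into the same folder. The cutoff is an assumption.
const long maxExpectedSize = 1024 * 1024; // 1 MB

foreach (string path in Directory.GetFiles(sourceFolder))
{
    if (new FileInfo(path).Length > maxExpectedSize)
        continue; // too big to be one of the small text files

    // ... parse the file and insert its records as before ...
}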

If Windows had just given me a legible message such as "Memory overflow", I would have figured this out sooner.
 
