Problems importing file into Access 2019

namliam

The Mailman - AWF VIP
Local time
Today, 05:54
Joined
Aug 11, 2003
Messages
11,680
Oh almighty forum, forgive me for I have sinned.

It has been 3 years 1 month 24 days since my last post *give or take a week*

My google-fu has failed me for some time now trying to find a more permanent fix for this, and after some googling sent me here, I find myself back at my old home forum!

I have been busy doing all things non-Access, however I have returned to my favorite tool to automate some stuff that badly needed automating…
This involves processing a bunch of files, importing the ;-separated CSVs into a database, running some queries and doing stuff based off that… Rinse and repeat every minute.

It runs like a charm for hours on end, until at some point it corrupts, throws error 3310 and won't work anymore.
Compact and repair seems to fix the problem for the next few hours, then it crashes again. I have done similar stuff in older versions of Access without fail and have always preferred using non-front-line versions to avoid stupid errors like this…

Now, forced to use 2019, I find myself stuck. Does anyone have a proper fix for this? I currently have the database close every hour, then compact and repair it and start fresh, which seems to work for now...
but what an ugly workaround that is :(
 

isladogs

CID VIP
Local time
Today, 04:54
Joined
Jan 14, 2017
Messages
14,018
Welcome back.
Repeatedly importing tables eventually leads to instability and then corruption. The best solution in my experience is to decompile the database at intervals. However, that may not work in your case.
 

theDBguy

I’m here to help
Staff member
Local time
Yesterday, 20:54
Joined
Oct 29, 2018
Messages
13,210
Hi. If you're using temporary tables, then I might suggest using a separate BE for them.
 

Mark_

Longboard on the internet
Local time
Yesterday, 20:54
Joined
Sep 12, 2017
Messages
2,111
Rinse and repeat every minute.

This may give you your answer. If you do this every minute, I'd suggest putting this code into its own little FE that gets started every minute, rather than having it internal to the code that says "do this every minute". This would allow you to copy over a fresh FE each time, do your import, then delete the used FE each time.

If you can do this in a memory drive or SSD, there should be no noticeable difference in speed.
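In rough VBA, that launcher pattern could look something like this. The paths, file names, and the assumption that the work FE runs its import on startup and quits itself are all placeholders for illustration, not anything from this thread:

```vba
' Rough launcher sketch - TEMPLATE_FE and WORK_FE are made-up paths.
Const TEMPLATE_FE As String = "C:\Apps\ImportFE_Master.accdb"
Const WORK_FE As String = "R:\Temp\ImportFE_Work.accdb"   ' RAM drive / SSD

Public Sub RunFreshImport()
    On Error Resume Next
    Kill WORK_FE                   ' throw away the used copy from last run
    On Error GoTo 0
    FileCopy TEMPLATE_FE, WORK_FE  ' start from a pristine front-end
    ' Open the fresh copy; its startup code does the import and quits.
    Shell """" & SysCmd(acSysCmdAccessDir) & "msaccess.exe"" """ & _
          WORK_FE & """", vbMinimizedNoFocus
End Sub
```

The scheduler (or the calling code) would fire RunFreshImport each minute, so no single FE ever accumulates enough bloat to corrupt.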
 

namliam

The Mailman - AWF VIP
Local time
Today, 05:54
Joined
Aug 11, 2003
Messages
11,680
Hi. If you're using temporary tables, then I might suggest using a separate BE for them.

Doing that of course... running into the 2 gig limit is not an issue at all.
Also compacting the BE on a regular basis.

This may give you your answer. If you do this every minute, I'd suggest putting this code into its own little FE that gets started every minute, rather than having it internal to the code that says "do this every minute". This would allow you to copy over a fresh FE each time, do your import, then delete the used FE each time.
Not really efficient; I omitted the fact that authentication is needed.
Authenticating works, but it's slow as hell (mostly at the other end). As long as the connections remain open I don't have to re-authenticate.
Closing the database closes the connections, which means re-authentication, which would put a serious performance drag on this process...

Though this gives me an idea: perhaps use 2 databases which run for 2 hours or so. A runs 2 hours, then starts B and quits.
B compacts A, or copies a fresh database in place, then runs 2 hours, then starts A and quits.

I'd need to reconfigure some tables that need to be shared, and I would only need to authenticate once every 2 hours (or 3 or 4, whichever turns out to work reliably), but that may just work... Thanks for thinking along.

Annoying little thing though, this hocus pocus should not be needed :(
 

The_Doc_Man

Immoderate Moderator, Former MVP, Retired SysAdmin
Staff member
Local time
Yesterday, 22:54
Joined
Feb 28, 2001
Messages
18,351
After a quick search, I find that you are not the first person to report this problem. For other readers, error 3310 is "property not supported for external data sources or for databases created with a previous version of JET." I found a few references to it from 2016, which surely isn't related to Ac2019.

https://social.msdn.microsoft.com/F...or-for-databases-created-with?forum=accessdev

https://stackoverflow.com/questions/35732725/access-vba-mass-text-file-import-run-time-error-3310

This next one applies to Ac2000 and claims a workaround.

https://bytes.com/topic/access/answers/565885-trasfertext-error-3310-a

The next one comes from our own forum from a couple of years ago (again, clearly not related to Ac2019) and offers a potential source of the problem: DECIMAL field type.

https://www.access-programmers.co.uk/forums/showthread.php?t=294525&page=2
 

namliam

The Mailman - AWF VIP
Local time
Today, 05:54
Joined
Aug 11, 2003
Messages
11,680
I am importing only integers and varchar/short text fields, so decimal doesn't seem related.

The workaround seems export-related; however, writing my own IMPORT is an option. Using the built-in text import is a lazy solution, isn't it?

Think I ran across the DoEvents solution someplace, tried that and failed :(

So what I am left with is either writing my own import routine, or making my earlier chain of: start A, run x time, close A; start B, run x time, close B; start A. Rinse and repeat.
 

namliam

The Mailman - AWF VIP
Local time
Today, 05:54
Joined
Aug 11, 2003
Messages
11,680
Just reporting back, still in the (slow) process of recoding to create my own import routine instead of using DoCmd.TransferText.

The reduced use of TransferText already adds to the stability of the app :)
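For anyone following along, a hand-rolled import loop along those lines might look like this. tblStaging and the two field names are invented for illustration; the only thing taken from the thread is the ;-separator and the integer/short-text columns:

```vba
' Minimal sketch of a TransferText replacement for ;-separated files.
Public Sub ImportCsvFile(ByVal strFile As String)
    Dim intFile As Integer
    Dim strLine As String
    Dim varParts As Variant
    Dim rs As DAO.Recordset

    Set rs = CurrentDb.OpenRecordset("tblStaging", dbOpenDynaset)
    intFile = FreeFile
    Open strFile For Input As #intFile
    Do While Not EOF(intFile)
        Line Input #intFile, strLine
        If Len(Trim$(strLine)) > 0 Then
            varParts = Split(strLine, ";")   ' files are ;-separated
            rs.AddNew
            rs!SomeNumber = CLng(varParts(0))
            rs!SomeText = varParts(1)
            rs.Update
        End If
    Loop
    Close #intFile
    rs.Close
    Set rs = Nothing
End Sub
```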
 

CJ_London

Super Moderator
Staff member
Local time
Today, 04:54
Joined
Feb 19, 2013
Messages
12,585
Perhaps you can just use a query rather than TransferText. The basic select query would be

SELECT *
FROM [TEXT;DATABASE=C:\pathtofile].filename.csv;

note pathtofile does not have an ending \

From this you can convert it to an insert/update/upsert query for validation purposes. If joining to another table, I usually alias the whole shebang, e.g.

SELECT *
FROM (SELECT *
FROM [TEXT;DATABASE=C:\pathtofile].filename.csv) AS txt LEFT JOIN sometable ON txt.ID=sometable.PK

otherwise you will get an error along the lines of "'filename.csv' is not a valid name"
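To append straight into a table, the same TEXT source can be wrapped in an INSERT and run from VBA; tblStaging is an invented table name, and the path/filename are the placeholders from the query above:

```vba
' Hypothetical append query built on the TEXT source above.
Dim strSQL As String
strSQL = "INSERT INTO tblStaging " & _
         "SELECT * FROM [TEXT;DATABASE=C:\pathtofile].filename.csv;"
CurrentDb.Execute strSQL, dbFailOnError
```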
 

namliam

The Mailman - AWF VIP
Local time
Today, 05:54
Joined
Aug 11, 2003
Messages
11,680
Interesting, I have never seen this approach. How does the database know how to split the file and what the column names are? My files are without headers unfortunately (not by my choice)
 

CJ_London

Super Moderator
Staff member
Local time
Today, 04:54
Joined
Feb 19, 2013
Messages
12,585
it doesn't - you tell it

SELECT Fld1, Fld2, .... FROM (SELECT * FROM [TEXT;....) AS txt

If the data does not have headers, you add HDR=No within the square brackets (don't forget the semicolon)

[TEXT;DATABASE=C:\pathtofile;HDR=No]

If headers are not specified, Access automatically names them F1, F2, F3 etc.

Suggest you try it and see. Start with a basic select query using the query builder and add complexity as required
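Putting those pieces together for a headerless file, a sketch would be something like this (the alias names are just examples, not anything from the thread):

```sql
SELECT F1 AS SomeID, F2 AS SomeText, F3 AS SomeAmount
FROM [TEXT;DATABASE=C:\pathtofile;HDR=No].filename.csv;
```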
 

namliam

The Mailman - AWF VIP
Local time
Today, 05:54
Joined
Aug 11, 2003
Messages
11,680
So you add stuff that you normally would use in the import specification?
Which would basically make it use the same logic and ultimately run into the same problem of crashing after x hours of processing? :(
 

CJ_London

Super Moderator
Staff member
Local time
Today, 04:54
Joined
Feb 19, 2013
Messages
12,585
No idea what logic you are using; this approach has never failed me, although most of my apps import on at most an hourly basis, not every minute. I don't use an import specification. If necessary you can use the CDate/CLng functions to convert the values to the correct datatype.

It sounds more likely that your repeating every minute is causing your issue, perhaps through memory leakage. Or, as reported in other threads on this subject, the issue is with TransferText.

The benefit of using the SQL is that you can destroy the recordset after each iteration - or keep using the same one if appropriate.
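That open/use/destroy cycle might look like this; the field names assume a headerless file (F1, F2) and the Debug.Print line is just a placeholder for whatever processing is done per row:

```vba
' Sketch: open the text source, walk it, then tear the recordset down.
Dim rs As DAO.Recordset
Set rs = CurrentDb.OpenRecordset( _
    "SELECT * FROM [TEXT;DATABASE=C:\pathtofile;HDR=No].filename.csv;")
Do While Not rs.EOF
    Debug.Print CLng(rs!F1), rs!F2   ' convert/validate as needed
    rs.MoveNext
Loop
rs.Close
Set rs = Nothing   ' destroy it after each iteration
```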
 

isladogs

CID VIP
Local time
Today, 04:54
Joined
Jan 14, 2017
Messages
14,018
So you add stuff that you normally would use in the import specification?
Which would basically make it use the same logic and ultimately run into the same problem of crashing after x hours of processing? :(

See my original comment in post #2
 

namliam

The Mailman - AWF VIP
Local time
Today, 05:54
Joined
Aug 11, 2003
Messages
11,680
Isladogs, I am keeping your option in reserve while trying to "keep it clean". I mean, if it works 1,000, 2,000, 5,000, 10,000 times, why should it fail after 100,000 times processing a file?
And 100,000 is not a fixed number; it indeed seems rather random as to when exactly it crashes.

Though a fallback would be a situation like I posted in #5:
X is my "fresh" clean database.
Start database A.
After an hour (or so) copy X to B.
Start database B, close A.
After an hour (or so) copy X to A.
Start database A, close B.
Rinse and repeat.

Though that seems like a shitty workaround for something that should simply work.

Another thought has been to try and rebuild this in SQL Server; the tables and queries of course are not the issue... The issue is building the connections and stuff to fetch the files, and actually getting permission to use SQL Server and have a DB server connect to the outside world. The thought of the red tape I have to cut through makes me shiver!
Never mind trying to find out if something like that is even possible, simple and plain, in SQL Server.

Also, given the amount of data being processed, it really feels like SQL Server would be overkill.
We're talking about 2k records or so total per iteration; really not all that much.
 

isladogs

CID VIP
Local time
Today, 04:54
Joined
Jan 14, 2017
Messages
14,018
Sorry, but that seems illogical to me.
Bearing in mind that decompiling only takes a few seconds and, in a somewhat similar situation, prevented crashing from being an issue, why not at least try it before making other changes that will take time and effort to implement?
 

namliam

The Mailman - AWF VIP
Local time
Today, 05:54
Joined
Aug 11, 2003
Messages
11,680
Just to make sure, where you write decompile I am reading Compact & Repair...

Same thing or different?
 

isladogs

CID VIP
Local time
Today, 04:54
Joined
Jan 14, 2017
Messages
14,018
Totally different. Answering on my phone so I can't give a link, but there's an excellent article about it on the FMS site. Or do a forum search on the topic, as I've supplied the link several times before.


 

namliam

The Mailman - AWF VIP
Local time
Today, 05:54
Joined
Aug 11, 2003
Messages
11,680
Did some googling, found some references; indeed it's different.
However, how do I automagically decompile my currently running database?
 

isladogs

CID VIP
Local time
Today, 04:54
Joined
Jan 14, 2017
Messages
14,018
You can’t. You need to open Access using the /decompile switch.

As I use it regularly, I have a desktop shortcut to do this. I also have a right-click context menu for Access databases in Explorer, allowing me to choose from options to compact/decompile/open exclusive/open read only etc.
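For reference, a decompile shortcut target looks something like this; the msaccess.exe path varies by Office version and bitness, and the database path is of course a placeholder:

```
"C:\Program Files\Microsoft Office\root\Office16\MSACCESS.EXE" /decompile "C:\Apps\YourDatabase.accdb"
```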
 
