Fast 5th Generation SSD Transfer Rate

Thread starter: Deleted Bruce 182381 (Guest)
I just replaced my laptop's main storage with a 4TB, 5th Generation SSD and I'm amazed at the transfer speed.
It copied 6.5 gigs from my external backup to the new SSD in less than 20 seconds!

[Image: FastCopy.png - screenshot of the copy speed]
 
I just returned the external drive I purchased. It's made in China, and
what's inside is basically a small pen drive. The drive is 4TB.
I needed it to back up about 90GB of files.
It copied alright, but the problem is that all of the copied files
are unreadable or not in the expected format.
 
NVMe drives are even faster - even on my laptop, without any RAID setup.

I often see this:

[Screenshot: copy dialog showing transfer speed in MB/s]


And Windows is "ready" for the future: as you hit about 990 MB per second,
the display jumps over to GB/s, such as this:

[Screenshot: copy dialog showing transfer speed in GB/s]


Imagine that - 1 GB per second!!!!.....
And my setup is by no means a fast computer - the above is on my laptop.......

We really do live in a remarkable world in terms of memory, CPU, and disk drive speeds.....

And, no moving parts anymore!!!

R
Albert
 
My 4TB SSD is a 5th Gen NVMe that's capable of sequential reads/writes up to 13.8GB/sec and random reads/writes up to 2.3M IOPS. However, the bottlenecks I have are the slower USB 3.0 and SD card ports on my older laptops. I need USB4 and Thunderbolt 4 over USB-C to leverage 40 Gbps speeds.
No doubt - you had noted this was an external type of drive....(and still VERY fast!).

All in all?
Well, over the years I always seemed to have hardware that was lagging behind the software. And by the time I would get a new computer, all of the software had bloated so much that I was back to square one!!

Now? Well, it seems hardware for the most part has zoomed ahead of software.

We are in many ways spoiled today. And applications with legacy roots like MS-Access?
Well, Access sure loads fast these days - sure don't have to wait much anymore.....

I remember when we had 40 MB drives, or even my first 1 GB drive. Yet today, we copy around such files without even batting an eye...

It's a wonderful life in regard to hardware.......


R
Albert
 
Despite all the advances in hardware, one thing that annoys me is when I get a "not responding" Windows message. The whole system slows to a crawl and you really can't do anything else until that process (or processes) completes or aborts. Makes me wonder how context switching is being handled. In Linux, you can have a long-running process that gets queued into the background and everything else runs normally. There are never any "not responding" messages.
I suppose it depends on the GUI + the process.......

Linux land is often working without a GUI, and when the GUI "does" try to do something, then VERY often it's another separate process being kicked off.

And Linux does not have a single GUI manager like Windows. So, there's no real "nanny" that can care or even know that some process has hung.
It's really more of a GUI thing than it is that some process went sideways.

I have this for both web based code, and even from Access:
We attach a PDF to a row of data (in Access, or from the web - web uploading of file(s)).
And when you do this, I run code to create a thumbnail from that PDF (the first page of the PDF). This of course is done with a .net class I created (and both Access and my web site use the same .dll (class) from .net).

That .net class uses GhostScript to render and create the PDF thumbnail (and it is saved in a row of the database).

Hence, say this:
[Screenshot: PDF thumbnail shown in the application]


Ok, then question:
How come, if the GhostScript process goes sideways or craps out, does MS-Access not freeze?
Answer:
Because I kick off a separate process! (not a new thread!).

So, in many cases, it comes down to how the software was written.

Since in Linux land, developers come from a development model that "encourages" one to launch a separate process, often via a shell() and pipe.

You do get a true/real separate process (that can have threads). And that process is NOT attached to the GUI in any way - the GUI system has no knowledge of it. It's actually that lack of knowledge that results in the lack of a "please wait" in the GUI.
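For illustration, here is a minimal VBA sketch of that "kick off a separate process" idea - the GhostScript path, file names, and options below are assumptions for the example, not my actual .net class:

Code:
Dim sCmd As String

' Hypothetical example: render page 1 of a PDF to a PNG thumbnail.
' The executable path and file names are placeholders - adjust for your setup.
sCmd = """C:\Program Files\gs\bin\gswin64c.exe""" & _
       " -dNOPAUSE -dBATCH -sDEVICE=png16m -r72" & _
       " -dFirstPage=1 -dLastPage=1" & _
       " -sOutputFile=""C:\Temp\thumb.png""" & _
       " ""C:\Temp\invoice.pdf"""

' Shell() returns immediately - GhostScript runs as its own Windows process,
' so the Access UI never blocks waiting for it to finish.
Shell sCmd, vbHide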

So, often it's a question of how such software is written. I mean, often when some copy, program, or whatever goes sideways on Windows? Well, that hangs up the given "thing" or "program", with the GUI waiting for a response. And to be fair, for the most part, the rest of the computer does not hang.

To be fair, some of the GUIs for Linux are now adding things like "force quit", and some are even graying out the window with a terminate option. However, there's no real standard on how to do this in Linux, and there's no centralized GUI watchdog like Windows has. So, as a result, you don't see a GUI dialog nagging of any kind. The process might have failed, but the user will not know!

So, yes, there is a different approach here - the GUI in Windows was raised as a young child expecting the OS to talk back and tell the GUI that something went wrong - it thus expects message(s) back from a process. Like so much of our industry?

History matters....

So, for advanced users? They probably don't care if some process has hung in Linux land.
For Windows users? Hmm, hard call, but it's probably a good idea that the window grays out, or there's some "nag" about the process going sideways and whether you want to wait some more.....

I have 6 cores and 12 threads on my Windows computer. I find if, say, I'm trying to open a folder - say a non-existing one on the network - and it hangs?

I don't wait, and simply launch another window/copy of File Explorer (Windows key + E).
That new copy of File Explorer? It's not hung, and I just try again....

R
Albert
 
The latest-generation drives are really very fast.
Until a few years ago, the disk was the bottleneck, but fortunately that's no longer the case.
On this laptop, which is nothing special, I divided the main drive into two 500 GB partitions.
Copying large files (a 40 GB virtual machine) from drive C: to drive D: runs at a sustained 2 gigabytes/sec.
 
Some time ago, one of our members - I believe @amorosik - inquired on the topic of fork processes. We can certainly ask (a) if he succeeded and (b) how it turned out. I believe he was more interested in networking issues, but he still might have some insights.

As to an Access FE running against a native BE table, your problem isn't Access. It is the child process in which ACE runs. The GUI and FE code run in the main process, but anything that touches the DB has to have a separate process. I.e. queries don't run against the BE file. They run against ACE talking to the BE file.

I'm thinking that because of database infrastructure file locks, ACE linearizes all queries. (But it IS a black box to us, so I AM guessing a bit.) If we are talking about native BE files, I don't think they CAN go multi-thread. I also DO NOT believe that the same limitation exists for active SQL backends like SQL Server, ORACLE, or others in that class of database handler. I'm pretty sure they can go multi-thread if configured correctly.
 
Well, I have some Access apps that do some really long running DLOOKUP's and Queries against native Access tables with millions of rows and not only do I see the "Not Responding" message, but it also throws the entire box into limbo where I can't even see the cursor echo outside of the sleeping window. Since we're not sure about what's really going on inside the black box, perhaps someone like @Albert D. Kallal, or others, might be able to shed some light on this topic?
Well, first of all, if you do a lot of stuff, then the Access UI will lock up. There ARE some threads, but don't confuse a "threading" model with that of multiple-processor tasks (the terms task and thread often get jumbled up here).

>So then in Access, how do you fork/exec a new detached process for executing a long running query against native backend tables?

Well, first keep in mind that the JET (now called ACE) data engine runs in-process. That means it runs on the current process's thread. You can think of calling ACE data engine code JUST LIKE calling a subroutine: you have to wait until that sub call returns back to you!

Remember, we often ran, say, Windows XP with only ONE processor, and multiple programs seemed to be running, right?
And Access does actually have somewhat of a threaded model (not a multi-processor model).

So, you can see this when a form loads and you have some controls whose values are the result of, say, DLookups() or some VBA functions. The form loads, displays, and THEN the display starts updating those controls AFTER the data is displayed. This is the UI thread that Access has.
These "threads" are in fact queues of events built up over time.

Hence:
Me.Recalc - triggers the calculation queue - but not the UI (yet)
Me.Repaint - triggers the UI queue to display current control states/status

DoEvents - triggers all of the above pending event queues

And then we have ones for data, such as Me.Refresh.

So, keep in mind that EVEN if you ran Access on a different OS - such as Linux under WINE - it's going to behave the same. It's not due to the OS; it's due to the fact that Access is not firing off and triggering separate processes.

So, when you call a simple VBA sub? You have to wait for the sub to finish, and return, and your VBA can then continue.

Same goes for an update query - it really IS a sub call to some code (the JET/ACE data engine), and once again, your code has to wait until that update query is done and that "sub" returns back to you!

I mean, have you used a programming language and framework that supports threading?

So, could we introduce code that fires off another process (and thread)?
Sure, we can. The only problem is the archaeological history of Access: VBA does not have async programming features built in - it was written LONG BEFORE multi-core CPUs appeared.

So, can you, say, call a long-running stored procedure on SQL Server and not have to wait in VBA?
Yes, you sure can! -- and of course that SQL Server data engine is its own whole server and CPU -- right?
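For example, here is a minimal sketch using ADO's asynchronous execute option - the connection string and stored procedure name are assumptions for illustration:

Code:
Dim cn As Object   ' ADODB.Connection (late bound - no reference required)

Set cn = CreateObject("ADODB.Connection")
cn.Open "Provider=SQLOLEDB;Data Source=MyServer;" & _
        "Initial Catalog=MyDb;Integrated Security=SSPI;"

' adExecuteNoRecords (&H80) + adAsyncExecute (&H10).
' The call returns right away - SQL Server keeps working on its own,
' so the Access UI does not freeze while the procedure runs.
cn.Execute "EXEC dbo.usp_LongRunningUpdate", , &H80 Or &H10

' Note: keep cn alive (e.g. as a module-level variable) until the
' work completes, or the connection may be torn down early.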

However, can we do the same in Access?

Yes, once again, we could - but we would have to add some extra moving parts here.

Simple way:

Well, create a new front end, link the tables, and then use the Shell command from your current instance of Access (Shell() will start a whole new separate process). You could even just launch the extra front end and run the query against linked tables - it would be a WHOLE new copy of Access, and thus you get a WHOLE NEW Windows task.
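A minimal sketch of that approach - the Access path, the worker front end, and the macro name are assumptions for the example:

Code:
Dim sCmd As String

' Launch a second, fully separate copy of Access that opens a "worker"
' front end and runs a startup macro (/x switch) holding the long query.
sCmd = """C:\Program Files\Microsoft Office\root\Office16\MSACCESS.EXE""" & _
       " ""C:\MyApp\WorkerFE.accdb"" /x mRunLongUpdates"

' Shell() returns immediately - the worker runs as its own Windows process,
' so this copy of Access stays responsive.
Shell sCmd, vbMinimizedNoFocus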

Another way? Well, call some .net code from Access. That .net code can then start a new task/thread and run the query, but the sub call you make would return instantly. Of course, then you don't have any freeze-ups, but you also don't know when it is done, or if it failed (hey, that's just like Linux now - no nags, but no knowledge of whether what you wanted to do finished or froze up!). As you can see, such a case is now less than ideal, right?

So, if you want that kind of cake (no freeze-ups), then you are ALSO faced with having to test/check whether the process in question finished or has hung up. But then again, those Linux folks know how to use the command line to test/check for those running tasks, and kill them, or check if they finished, right? And now YOU THE USER are faced with managing those stray tasks on your computer - but then again, that's what Linux folks use and like, right? (So, you use the jobs shell command in Linux to query and see what's going on.)

Of course, you NEVER want to use DLookups() in a query - each row, for each DLookup(), will trigger and require a WHOLE NEW query. And 9 times out of 10, you can replace the DLookup with a left join. You can also use a sub-query, but then we're back to performance issues.

So, once again, this is not really a Windows thing at all. It "mostly" comes down to the fact that the data engine is NOT a separate Windows process - but then again, who always wants to set up a whole new processing system like SQL Server just to edit some data in a table?

Now, in some cases, this concept of moving the data processing OUT of Access and pushing such processing (SQL updates etc.) to some other process? Sure, and that's called SQL Server - and it's an option we have for Access.

Now, I could consider posting a small .net routine that you call from Access, and it would run such update queries in another process. However, keep in mind this would not help for returning data in most cases, but it would, I suppose, be of use for update queries that take a long time.

But then again, if you have, say, 5 update queries to run, each taking a long time? Well, if you adopt asynchronous programming, does it matter if the queries run out of order? (You see, all of a sudden the introduction of asynchronous operations becomes FAR MORE complex, since if you send 5 queries to that system and don't want to wait, you can't necessarily be sure which query will finish, or even run, first! And if you THEN decide that you need/want to wait for each query, then you're kind of back to square one, and you are waiting for the update query to finish again, right?)

R
Albert
 
You are correct, but the REAL issue is in fact the data engine.
So, think of an old 8-bit game, or even a PC game back in the DOS days.
You had multiple things moving on the screen, you could use your arrow keys, and "everything" seemed to keep updating. No UI lockups, right?

Access has this so-called event loop for the UI part. As noted, that's why you can see a screen load, show data, AND THEN multiple pending things on the screen update - and the UI is not frozen. So, you can for example place a text box and a timer, and have that text box increment (+1) via the timer. But during this time, you could ALSO be typing in another text box, and both the counter text box and the text box you're typing into would work and update fine, with no lockups. This is the type of threading model used by Access.

So, with such an event model, you don't necessarily have a locking-up UI - no more than that 8-bit video game with multiple things occurring on the screen, all seemingly at the same time.

Where things really fail is of course calls to the data engine. That engine is NOT part of the Access processing loop - if it were, then in theory you could build a UI without lockups, and do so with a single processor.

So it's not ONLY that Access runs on a single processing thread - with the so-called UI loop that Access has, you will, as noted, "often" see the screen update, and update even AFTER the record displays. And since JET/ACE is not really part of Access, but is a part shipped with Access? Then that data engine and its operations are just like calling a subroutine - your code has to wait.

So, Access does have provisions to prevent the UI from locking up - it's when you call external library code, be it some C++, some other routine, or in this case the data engine code, that you have to wait. However, those frequent screen updates occur without a UI lockup, since Access is able to process a "queue" of events such as .Repaint, .Recalc, .Refresh and a few more - they can and do occur without a UI lockup (or at least without the appearance of one), just like how those 8-bit games worked: multiple things seemingly occurring on the screen at the same time.

And in MANY cases, you can write VBA code to respect that event loop in Access. Take this simple form and code:

[Animation: evLoop.gif - the counter text box updating while the loop runs]


Note how the counter box updates in above.

Code behind:
Code:
Option Compare Database
Option Explicit

' Sleep is a Windows API call - this Declare is required for the code to compile.
#If VBA7 Then
    Private Declare PtrSafe Sub Sleep Lib "kernel32" (ByVal dwMilliseconds As Long)
#Else
    Private Declare Sub Sleep Lib "kernel32" (ByVal dwMilliseconds As Long)
#End If

Private Sub cmdNoEvents_Click()

    Dim i       As Integer

    txtMsg = "Working..."
    txtCount = ""
    DoEvents   ' trigger pending UI updates queue

    For i = 1 To 10
        txtCount = i
        Sleep 1000

    Next i

    txtMsg = "Done"

End Sub


Private Sub cmdWithEvents_Click()

    Dim i       As Integer

    txtMsg = "Working..."
    txtCount = ""
    DoEvents   ' trigger pending UI updates queue

    For i = 1 To 10

        txtCount = i
        DoEvents    ' trigger pending UI updates queue
        Sleep 1000

    Next i

    txtMsg = "Done"

End Sub

So, as the above shows, EVEN YOU can write non-locking code! If you press the first button above, the UI will not update.
If you press the 2nd button, the counter updates on the form in "real time".....

So, if the JET/ACE data engine were set up to allow some DoEvents every so often, then UI locking could be avoided....

I have attached the sample database (zipped) that shows the 2 buttons in Access - the 2nd button having been written in a way that allows the Access UI not to freeze up. Heck, while the code is running, you can right-click the form and select design mode!!!!


R
Albert
 


might be able to shed some light on this topic?

Sure. For "Not Responding" the most common issue is a long-running loop in event code. Remember that VBA routines cannot interrupt other event routines. I'd bet dollars to donuts that there is some kind of internal timer in the GUI context that decides if you've been in event context too long. So if you have a long-running loop in your event code, that is your bug-a-boo. The fix is to have some part of the loop count iterations and, once every 1000 or 2000 iterations, toss in a DoEvents. That lets everything else play event catch-up. But it also allows the GUI to handle ITS business and reset those internal not-responding timers.

Once you are in event code mode, you can't leave it until you issue instructions that end or exit the event. Calling another routine doesn't go back through the GUI interface. Therefore, use of domain aggregates - which are functions that run queries - doesn't change your context in the long run. But DoEvents is the exceptional case that marks your place and lets you step aside for other events, then come back to your situation where you left it.
 
I put my long running DLOOKUP's inside DoEvents, and the UI still displays "Not Responding"
Right, as I explained, calling ACE/JET code means the code waits - just like if you called a VBA subroutine. It's blocking, and Access will simply have to wait until that DLookup is done - it's no different than having to wait for a VBA routine to complete. DoEvents() will thus not help.

However, it is a "common" Access myth that DLookups() are slow - they are not any slower than, say, using VBA code and a recordset to do the same thing.
It's not that DLookup() is slow; it tends to be WHEN and WHERE it is used that is slow!
So for example, you NEVER want to use a DLookup() inside an Access SQL query - there are alternatives.

However, if you're talking about binding DLookup() to controls on a form? Well, in fact you do get some threading there. If you have multiple DLookups() on that form, then in between each DLookup() Access can, using its threaded model, update the screen, even if all the DLookups() are not yet done. However, for EACH DLookup(), Access MUST and WILL wait, and during that wait time Access can't do anything else except wait....

And often, in place of 3 or 4 DLookups() on that form? Well, often you want several columns from the same data row. In that case, I suggest writing a custom routine and using that to get the data, since with one data "row" pull you can then use/get multiple columns from that row. With multiple DLookups(), you are re-executing a full SQL query against the database for each DLookup() call. And if those DLookups() all resolve to the same (one) database row? Then in place of 5 DLookups(), you can build a function and do only ONE database pull.
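A minimal sketch of that one-pull idea - the table, field, and control names are hypothetical:

Code:
Private Sub LoadClientTotals(ByVal lngClientID As Long)

    Dim rs As DAO.Recordset

    ' ONE query pulls the whole row - this replaces several DLookup() calls,
    ' each of which would otherwise run its own query against the database.
    Set rs = CurrentDb.OpenRecordset( _
        "SELECT LoanTotal, InterestTotal, FeeTotal " & _
        "FROM tblClientTotals WHERE ClientID = " & lngClientID, dbOpenSnapshot)

    If Not rs.EOF Then
        Me.txtLoans = rs!LoanTotal
        Me.txtInterest = rs!InterestTotal
        Me.txtFees = rs!FeeTotal
    End If

    rs.Close

End Sub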

So, a pull of data from the database (DLookup(), a VBA recordset, or even a SQL query)? They are all blocking, and DoEvents() can't help. Access will wait, and the UI will be frozen during that time. If you adopted SQL Server, then you could do many of these operations without blocking (freezing the UI).

However, if you pull into a recordset and use a "loop" to process the data, say to sum the data in that loop? Then a DoEvents, say every 1000 records, would in fact allow the UI to update before that processing is complete....
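For example, a sketch of such a loop - the table and field names are assumptions:

Code:
Dim rs As DAO.Recordset
Dim curTotal As Currency
Dim lngRows As Long

Set rs = CurrentDb.OpenRecordset("SELECT Amount FROM tblSales", dbOpenSnapshot)

Do While Not rs.EOF
    curTotal = curTotal + Nz(rs!Amount, 0)
    lngRows = lngRows + 1

    ' Every 1000 rows, let Access flush its pending UI event queue
    ' so the window keeps painting and "Not Responding" never shows up.
    If lngRows Mod 1000 = 0 Then DoEvents

    rs.MoveNext
Loop

rs.Close
Me.txtTotal = curTotal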
 
I am firing the DLOOKUPs in the On Open event of an unbound form. The form doesn't display any results until all 42 DLOOKUPs complete. In the meantime, I get "Not Responding" and the popup modal grays out.

Ah, very well then!

Ok, so you do have code in which a whole bunch of DLookups() are triggered by VBA then?

If yes, then I suggest several things:

First up, move such code to the On Load event. On Open is too soon - the form display (and the display thread) has NOT YET been started by Access.

Keep in mind that the On Open event can view bound data values (you don't have any), but it can't modify them.....

And the Open event of a form has a Cancel parameter - if you set it = True, then the form will never display.
(And DoEvents will NEVER help to display things!!!)


So, for "general" setup of a form? That code needs to be placed in the on load event. Save that event for ONLY to verify things, and PREVENT the form from ever loading and that form can't and will not display anything until the on open event is 100% done.

So, in fact, if you have a bunch of VBA dlookups()? Move that code to the on-load event.
In fact, you might not even need to place do events.

So, first, try the on-load event - it's better, since the form will have at least displayed.

Next, if each dlookup() does not display, then of course place a doevents between each one.

eg:
Me.txtLoans = DLookup(...)
DoEvents
Me.txtInterest = DLookup(...)
DoEvents
....more DLookups() + DoEvents follow

So, for sure move this code to the On Load event. And if each result does not start displaying, then try the above DoEvents between each DLookup().

So MUCH of the issue here is having used the On Open event - as I pointed out, don't use that event for form "setup" code, and as I noted you can't even modify bound controls there - they are read only, and updating is not even allowed. And not only can't you update bound controls, the display "thread" will not yet have started - DoEvents can't and will not help in that On Open event.

So,
On Open event - test whether the form has valid data - if you set Cancel = True, the form will NEVER display.

On Load event - general form setup code goes here...

R
Albert
 
I am firing the DLOOKUPs in the On Open event of an unbound form. The form doesn't display any results until all 42 DLOOKUPs complete. In the meantime, I get "Not Responding" and the popup modal grays out.

Frank, didn't you go through all this already a few years ago back on UA?

Why fire 42 queries when you are fetching identical data in six time periods?

You can do the same with a maximum of 7 queries and probably fewer.

If you want help speeding this up, you should provide the tables, queries, fields, indices and criteria, and info about whether the BE is Access or an RDBMS.
 
the issue here is how to prevent the Windows "Not Responding" when running the current code, or any other code that takes a long time to complete. Has anyone else's Access app experienced that situation?

In my genealogy DB, it happened all of the time. Still happens now and then. I'm parsing my way through data for roughly 2250 people and the "Not Responding" kicks in. What I did was add a progress bar that updated every so often and made it do a screen refresh. Apparently that break in the action helped Windows know that Access was legitimately busy, and therefore it complained less often. I had to add code so that it would do that only every 100 persons or so. It was a balancing act: too many updates became expensive in time, badly slowing down the desired process by wasting time on video cosmetics; too few updates let the "not responding" blurb pop up anyway.
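For illustration, one common way to do that kind of throttled update is Access's built-in status-bar progress meter - a rough sketch of the pattern, not the actual genealogy code (the counts come from the description above):

Code:
Dim lngPerson As Long
Dim lngTotal As Long

lngTotal = 2250
SysCmd acSysCmdInitMeter, "Parsing people...", lngTotal

For lngPerson = 1 To lngTotal
    ' ... parse this person's data here ...

    ' Only touch the screen every 100 people - too-frequent updates waste
    ' time on video cosmetics, too few let "Not Responding" pop up.
    If lngPerson Mod 100 = 0 Then
        SysCmd acSysCmdUpdateMeter, lngPerson
        DoEvents
    End If
Next lngPerson

SysCmd acSysCmdRemoveMeter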
 
Yes, give my suggestion a try - it should then update each DLookup() as it completes.


Funny how one "little" tidbit of information (the On Open event) gives rise to a solution here.
I'm actually quite confident this will work for you....

You do not need a Repaint - the DoEvents should suffice here (it will allow the paint event to complete anyway).

Do post back and share how this worked!
R
Albert
 
Fair enough!! - and I VERY much like that you admit that what you have is less than ideal.....

However, that's life! - You certainly can "improve things" - always a great idea!

Regardless of the above, placing a DoEvents between each DLookup will fix the freezing, and the controls will thus "update" as the DLookups complete. The "key", or "Rosetta stone", here?

Moving this to On Load - the code does not have to change - and it will most likely fix the "not responding" issue(s).

However, as you note, you'll also want to give that code and its "boatload" of DLookups() some love and care!

Do share how this turns out - I'm sure it's going to work out rather well!!!

R
Albert
 
is there any way to suppress those popup's?
Not really, but on the other hand? If you remove the "poor" design decisions made, then 9 times out of 10 you won't experience any "not responding" messages.
In this case, the developer gave no thought as to why and when to use On Open vs On Load. A basic mistake, and for me? I would have instantly cringed that On Open was being used here! It's not the right event.

I mean, when you buy a car, nothing in the owner's manual tells you to drive your car backwards down the freeway in reverse gear! That's going to cause problems - no matter how good (or bad) the car is!!

Had the developer realized that On Open has a "purpose", and that setup of controls and code does NOT belong in that event?
Then the freeze-up and the "not responding" probably would not have occurred here.

So, ultimately to prevent and not have those "not responding" issues?
Why of course simply adopt sound designs.....

In this case? The wrong approach was used - On Open is simply the wrong event, and EVEN adding DoEvents will not fix this issue.

A simple cut + paste, moving that code into the On Load event, will fix this example. Likely some DoEvents may well be required, but regardless, the correct design choice in the first place would have prevented this freeze-up....

It really doesn't matter what dev tools and stack you use - bad decisions are, well, bad decisions.

Just last week, I had a page (one that I designed - so, I take full ownership of having made a bad design decision).

In this page, I display images from a database (a PDF preview thumbnail) - not a large image.

However, since the images are "inline" base64 images (no URL links to the image), and each image is in fact an asp.net "image button"?

Well, that means when users click on the button, the page is posted back to the server. And since those images can't be cached by the browser, all of those images are posted back to the server on button clicks. This was a "poor" design on my part. During testing, I only had about 5 or maybe 10 images tops. But some clients used that page with 50, or even 100, images!!

Well, that pretty much causes a browser freeze-up --- and customers get a VERY long browser spinner if they have a less than ideal internet speed....

Now, I was going to change (and in fact had started changing) the button code from a post-back of the page to a web method call (changing the button behavior to use ajax). But as I started writing this code to fix the issue?

I realised I didn't really have to change the code - what I had to do was tell asp.net NOT to preserve the image setting for that button (automatic view state!). By simply turning off viewstate for the button, the delay and the "long waiting browser spinner" were all but eliminated....

In other words, this freeze-up-like behavior was my fault, and the result of a poor design decision......

It doesn't matter if you're using Access or doing web design - the problems in most cases are nearly always the fault of the developer! (That's me!!!!!)

All you have to do is fix a few of the poor designs, and for the most part it's actually quite difficult to "kill" Access and freeze it up!

Access can actually tolerate quite a few bad design choices. Hence, in this example, use of On Open was the issue. Sure, you inherited some "less than ideal" DLookups(), but at the end of the day, even with those not-so-great DLookups(), the freeze-up would not have occurred, or could have been eliminated by adding some DoEvents - with the existing code not even changed. The wrong event was being used.

We often see the same in an Access form when After Update is used where Before Update should have been used....

Access applications don't, and shouldn't, freeze up very often - and when they do, in most cases it means the developer(s) did something that should not have been done.....

So, no, there's no magic fix - but in this case, and in my web example? The problem was the developer's bad choices, and not the tools being used....

R
Albert
 
