Modular Design Theory

Thales750

Formerly Jsanders
Local time
Today, 17:08
Joined
Dec 20, 2007
Messages
1,702
Databases with well-developed core functions are useful to many businesses. However, addressing the differences between businesses is imperative, and costly. So for these problems we can either create custom systems, or use modules. Not "code modules", but actual components that are customer-specific and integrate with the core program.

The question for discussion is when is it better to open a new database, and when is it preferable to replace the forms and queries in the core database?

Thank you.
 

vba_php

Forum Troll
Local time
Today, 16:08
Joined
Oct 6, 2019
Messages
2,884
Not "code modules", but actual components that are customer-specific and integrate with the core program.

The question for discussion is when is it better to open a new database, and when is it preferable to replace the forms and queries in the core database?
don't think i'm following this. care to expand on it in detail?
 

GinaWhipp

AWF VIP
Local time
Today, 17:08
Joined
Jun 21, 2011
Messages
5,673
Hmm, well I would think some (if not most) developers have something like a Model Database...

Having one cuts the cost, as you only need to incorporate the differences. (And in my case this would also include Code Modules.) All that said, my Model Database has grown, as every time you take on a *new* industry you find yet another *core* table that you never knew you needed.

When it's time to create a new database I simply copy the Model Database and go on from there. I have even done this for rewrites. It's quicker than trying to fix everything in the existing file.
 

The_Doc_Man

Happy Retired Curmudgeon
Local time
Today, 16:08
Joined
Feb 28, 2001
Messages
17,010
Thales750,

The question you asked is actually at the core of at least one fairly well-known business - PeopleSoft. They have a data layout and optional modules for business management including payroll, benefits, taxes, insurance... The problem comes about when you want to do something that isn't part of their "standard" offering. Then you have to pay through the nose to change things to conform to YOUR business model.

In answer to your direct question, you must do a lot of research on just how closely this "commercial off-the-shelf" (COTS) product matches what you actually do, and how much of your business model you are willing to change to begin using the COTS product. The closer the match, the more likely it is that you can make it work. But when there is a big divergence between your model and the model used by the COTS product, that way lies pain, monetary disaster, and (if it is a big enough "fail") a touch of ignominy.

This business-model/COTS-model mismatch once led to a billion-dollar screw-up. You can look up DIMHRS as the case study. I watched it happen.

In brief, the U.S. Congress tried to achieve savings by converting all services to a single personnel management system, the Defense Integrated Military Human Resource System, which would have been powered by PeopleSoft. The problem was that the major military branches each had their own systems and were not willing to change the way their internal business models worked. (Yes, the military thought of themselves as having a business model.) That discord meant that they couldn't reach common ground, and the project fell apart over squabbling about how to manage military personnel.

It started with PeopleSoft coming in to look over each major branch's HR system. The PeopleSoft "modules" were oriented to the commercial HR world, but the military world has radically different ways of doing personnel business. (Not the least of which is that if you try to quit during wartime, it is called "desertion" and they shoot you.) But even in peacetime, there were RADICAL differences between military and civilian HR.

For the Navy Reserve, PeopleSoft identified over 3500 such differences. EVERY MAJOR MODULE of PeopleSoft had to be modified. But worse, every module had to be modified DIFFERENTLY for all of the services. (We didn't have the Space Force yet, so only five services, and the Coast Guard used something similar to the Navy's system - though not identical.) Eventually, the evaluation was made that the Navy would only be able to use 19% of the code extant in the PeopleSoft modules. The next competitor was an ORACLE package because ORACLE hadn't yet bought out PeopleSoft. They were an 18% match. The contract was awarded to PeopleSoft on a 1% difference in usable code.

The problems that abounded included such things as the database engines getting swamped because PeopleSoft did EVERYTHING by Triggers - so the database servers went "compute bound" trying to resolve the stack of triggers that applied to each record. The best hardware we could get could only manage a 36-hour run-time on the daily processing cycle, and there were no shortcuts available such as skipping a step, because with the military, you do things in the right order or they don't get done. (It's a military oversight sort of thing.) Eventually it became clear that DIMHRS was not viable and Congress pulled the plug.

Here are some articles on the subject:



 

Thales750

Formerly Jsanders
Local time
Today, 17:08
Joined
Dec 20, 2007
Messages
1,702
Doc, thanks for that story. Gina, of course it would include code modules and everything else needed to be plug and play.
vba_php, Maybe less detail is needed.

The question is: When you are assembling a new system from components, how would you decide when to use a second database front end, instead of adding the object to the core database?
 

CJ_London

Super Moderator
Staff member
Local time
Today, 22:08
Joined
Feb 19, 2013
Messages
12,128
My own view is that most 'off the shelf' systems will meet 80% of the requirements; the rest is customisation. With your own modules, since they are usually created initially for a specific client, they will meet 100% of requirements for that client and perhaps 80% for new clients. It is not just about data storage, but also layout and terminology/naming conventions (e.g. postcode/zipcode).

If you can break your module/table(s) down to a small enough 'component', then that component will probably be usable across all applications with some built-in configuration functionality - e.g. an address table/form/report. So show/hide labels, show labels to left/top/right, make the first line bold, include a gap between name and address, make the postcode right-justified, etc.
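That kind of configurable component can be sketched in a few lines. The illustration below is deliberately language-neutral Python rather than Access/VBA (where the same idea would live in a class module reading a local config table), and every name in it (`AddressConfig`, `render_address`) is invented for the example:

```python
from dataclasses import dataclass

@dataclass
class AddressConfig:
    """Per-application presentation options for a shared address component."""
    show_labels: bool = True           # show/hide field labels
    bold_first_line: bool = False      # "bold" stands in as uppercase in plain text
    gap_after_name: bool = False       # blank line between name and address
    postcode_label: str = "Postcode"   # terminology varies: postcode vs zipcode

def render_address(name, lines, postcode, cfg=AddressConfig()):
    """Render one address block according to the caller's configuration."""
    out = []
    first = name.upper() if cfg.bold_first_line else name
    out.append(f"Name: {first}" if cfg.show_labels else first)
    if cfg.gap_after_name:
        out.append("")
    out.extend(lines)
    pc = f"{cfg.postcode_label}: {postcode}" if cfg.show_labels else postcode
    out.append(pc)
    return "\n".join(out)
```

The point is that the component's behaviour is driven entirely by a small options record, so the same form/report code ships to every client and only the configuration differs.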

I frequently get involved with providing solutions between enterprise-scale systems that have different standard schemas - sticking with the address theme, one has 4 lines for the address, another has 6 or 2. One has an address table that can be linked to customer or supplier, another has separate tables for each, whilst a third might include a primary address in the customer table.

An accounting system has different address requirements to a marketing or contact management system. Another might have the addresses in another system that is designated as 'data master'.

So you could incorporate all these variations into one module, just to handle addresses - but is it worth it?
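One cheap way to cope with the 4-line/6-line/2-line variation is to normalise every source into one canonical shape at the module boundary, rather than teaching the module about every schema. A minimal sketch in Python (the function name is invented for illustration):

```python
def to_canonical(lines, n=4):
    """Coerce a variable-length address into exactly n lines:
    pad short addresses with blanks, fold overflow into the last line."""
    lines = [l.strip() for l in lines if l and l.strip()]  # drop empty lines
    if len(lines) <= n:
        return lines + [""] * (n - len(lines))
    return lines[:n - 1] + [", ".join(lines[n - 1:])]
```

Each external system then needs only one small adapter to and from the canonical form, instead of the address module growing a branch per schema.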

Having said that I do have a few modules which are 'off the shelf' and customisable, but the range of customisation is relatively limited - e.g. a search function you can associate with any form, a multivalue continuous form which mimics a multivalue control, another which is a continuous form but has some of the functionality of a datasheet (user can hide, resize and move columns) etc. None of them really care about the underlying data, just about presentation and functionality.

When you are assembling a new system from components, how would you decide when to use a second database front end, instead of adding the object to the core database?
Think you need to be clearer about what you mean - are you suggesting adding a financial system to a CRM system, both front end and back end?
 

Pat Hartman

Super Moderator
Staff member
Local time
Today, 17:08
Joined
Feb 19, 2002
Messages
29,001
If you are working in Access, you have different issues. Even though Access is a great multi-user app, it is not suited to multi-user development, which is one of the reasons it is rarely used for large projects. Even if you have a source control tool, it is still awkward to have multiple people working on the same app at the same time. So that constraint should be your ultimate guide. When developing a large, multi-function app for a client, I look at three things.
1. What processes share tables?
2. Who uses what processes?
3. Is any process performed out of the mainstream?

Regarding question 1. When tables are used by different processes, do all processes need to update the table or does one group remain in charge of the data and everyone else just references it?

One of the concepts I learned early on in my programming career and which is not talked about in the current literature is coupling and cohesion. Coupling is how tightly separate "modules" are connected. When you call procedureA are you passing a dozen arguments or two? Are the connections all visible or are there hidden ones? Each "module" should be a "black box". Its only connection to the outside world should be arguments passed and returned via the call itself. It is this deviation from the "black box" principle that causes many unintended bugs. You change one procedure and a completely unrelated one breaks. Cohesion is the internal coherence of a procedure. Would you use the same procedure to print payroll checks as you do to send marketing emails to your customers just because they both happen twice a month on the same day?
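The black-box principle can be shown in a few lines. A sketch in Python with invented names; the first version is coupled to hidden state, the second exposes every connection as an argument:

```python
# Tight coupling: the procedure depends on hidden shared state.
# Changing TAX_RATE anywhere silently changes every caller's result -
# exactly the "unrelated procedure breaks" failure described above.
TAX_RATE = 0.07

def net_price_coupled(gross):
    return round(gross * (1 + TAX_RATE), 2)

# Black box: the only connections are the arguments passed in and the
# value returned, so the procedure can be tested, changed, and reused
# in isolation.
def net_price(gross, tax_rate):
    return round(gross * (1 + tax_rate), 2)
```

Cohesion pulls the same direction: `net_price` does exactly one thing, so printing payroll checks and sending marketing emails would stay separate procedures even if they happen to run on the same day.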

In general, I use a single FE. I occasionally create additional FE's for special purpose tasks. For one app for example, I had a nightly process that exchanged files with a different company. Since people didn't interact with this, I made it a separate app.
 
Last edited:

GinaWhipp

AWF VIP
Local time
Today, 17:08
Joined
Jun 21, 2011
Messages
5,673
Doc, thanks for that story. Gina, of course it would include code modules and everything else needed to be plug and play.
vba_php, Maybe less detail is needed.

The question is: When you are assembling a new system from components, how would you decide when to use a second database front end, instead of adding the object to the core database?
Hmm, only once have I ever created a second Frontend and I still regret it... as in will never do that again. Separate Frontends become a nuisance to update and generally not worth the trouble in my opinion. Are you thinking of a specific scenario (because I'd be interested in which case you would think this would be needed) or just asking?
 

Thales750

Formerly Jsanders
Local time
Today, 17:08
Joined
Dec 20, 2007
Messages
1,702
Hmm, only once have I ever created a second Frontend and I still regret it... as in will never do that again. Separate Frontends become a nuisance to update and generally not worth the trouble in my opinion. Are you thinking of a specific scenario (because I'd be interested in which case you would think this would be needed) or just asking?
Not really planning it. But as Pat said I can see a situation where the user may not overlap.

This is what got me going on this. Now that the chemical plant software has been converted to SQL, with only a few minor changes left to be made, other opportunities for it are becoming available - mainly small custom manufacturing. But their marketing and order taking are completely different.

The other system I have, for managing complex processes and projects, has been evolving for the last 4 years. But once again, the sales order format is completely limited to the type of projects it was designed for. I've ported all of Clear Lake's data over to that backend, and it is much better for managing projects than the old one. But, and it's a big one, I still use the old system (linked to the new BE) because the invoicing is completely different. That got me thinking.
 

Thales750

Formerly Jsanders
Local time
Today, 17:08
Joined
Dec 20, 2007
Messages
1,702
CL,
Same server, different frontends.

When is it the right thing to do?
 

GinaWhipp

AWF VIP
Local time
Today, 17:08
Joined
Jun 21, 2011
Messages
5,673
Okay, but Pat's scenario has a separate file no one interacts with. So, not seeing that as multiple Frontends.

I would not *mix* the two. I would take what I could from the one and create another for the one where Orders, Marketing, etc. are in play. (If I am understanding what you are saying.) Yes, there will be overlap, but the two are radically different and I would not want to mix/combine them into one Backend.
 

Pat Hartman

Super Moderator
Staff member
Local time
Today, 17:08
Joined
Feb 19, 2002
Messages
29,001
The deciding factor in spinning off a separate FE to handle the batch process was primarily the need to take on a second developer. It is very difficult for two people to be developing in the same FE, and I needed to decrease the time to delivery. This was an ideal situation, since once the application was operational no one ever actually had to use the FE for the batch process unless there was some problem. Once the development was done, I considered importing the objects so there would be only one FE, but didn't feel that anything would be gained.

Back in my former life as a Mainframe project manager I gave my manager a plan that divided the development effort into 9 major pieces that I could distribute to several developers who could work in parallel. The manager's less than intelligent question - "Could you develop it faster if you broke it into five pieces?"

The bottom line: when you're developing in Access, you don't want people to have to switch FE's during their normal work process, so you would rarely split an app unless it was just too big physically for one database, and I've never run into that. If you need to split the FE to allow multiple people to develop in parallel, you can do it by creating a management FE with the main menus and common stuff and having the menus open separate FE's for less commonly used functions. It is hard to make this seamless, but you can make it smooth enough not to be jarring to the users.
 
Last edited:

The_Doc_Man

Happy Retired Curmudgeon
Local time
Today, 16:08
Joined
Feb 28, 2001
Messages
17,010
An IT-savvy economist named Barry W. Boehm wrote several treatises on project management from the economist's point of view, sometime in the late 1960s/early 1970s. Pat, your manager's question actually makes sense in light of what Boehm found.

If you have a project that can be divided up into 9 pieces, depending on how closely related they are, and based on "milestone" issues, it MIGHT be possible to get faster results with fewer people. It all depends on how "cleanly" the parts can be divided and how many meetings you can avoid in order to coordinate interfaces and testing - because such meetings are work killers. The cleaner the split for the parts, the more likely that you can do the maximum division. But it is not a black/white case - it involves shades of gray.

The "more cleanly divisible" case usually means that you can gain some economy by dividing up tasks once the overall specification is complete and the task specifications are complete, pending their interaction with reality. So in my case 35 years ago, when I had a major multi-person development project, we had 6 major tasks and could divide the labor, but the parts were unequal in size. I had to let other groups use my team members at times because of milestone dependencies that left them idle for relatively short periods. From an efficiency viewpoint, true divisibility usually means some people have down time because (from queueing theory) the "equal-sized tasks, equal-ability processors" case is easy to solve but doesn't occur often in nature.
 

Pat Hartman

Super Moderator
Staff member
Local time
Today, 17:08
Joined
Feb 19, 2002
Messages
29,001
I wasn't using 9 people, I was using two, and I do agree that the more people you add to an already late project, the later it will become. Throwing people at a mess just increases the need for meetings and CYA. The point was similar to normalization: once you properly normalize a schema (or in this case a functional design), combining several tables (processes) into one just makes a mess. It does not increase efficiency. The development times were not equal, so the best person got the hardest task (oh yeah, we're not supposed to say that some people are better at some things than others, and we don't want the people who are not as good to feel bad), and in their slow periods they each tested the other's completed work and made sure that it matched the documentation. Programmers are such optimists that they really can't be trusted to test their own creations, except for very basic unit testing.
 
