newbie - basic questions about design

louis-m

Hi everyone,
As a network engineer, IT administrator, etc., people expect you to know absolutely everything about PCs. So here I am...

A company whose network I look after is using Excel spreadsheets on a weekly basis to collect orders, wages, running costs, etc. from their remote sites (currently about 20 sites).
The spreadsheets are accessed via Terminal Services; at the end of the week a script moves them away (so the remote site no longer has access to the old records) and replaces them with a new sheet. So there are lots of spreadsheets floating about.
You can imagine how hard it is to pull up past records, compare them, etc.

Now, I'm new to databases and have ordered a book to gen up on the subject. The sites should only be able to edit the last 7 days of records, but should be able to see the last year's. I can see how a query and subforms could achieve this.
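For example (a sketch only - it assumes the data-entry form is bound to a table with an OrderDate field), the form's Current event could lock anything older than a week while leaving newer records editable:

Private Sub Form_Current()
    ' allow edits only on new records or records from the last 7 days
    If Me.NewRecord Or IsNull(Me!OrderDate) Then
        Me.AllowEdits = True
    ElseIf Me!OrderDate >= Date - 7 Then
        Me.AllowEdits = True
    Else
        Me.AllowEdits = False    ' older records become view-only
    End If
End Sub

The query behind the form could still return the last year's records (e.g. anything with OrderDate >= Date() - 365), so the sites can see the history without being able to change it.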

Each site shouldn't be able to see the other sites' records, and the head office should be able to run reports on each site or on the data as a whole.

I can narrow down the times at which everybody is accessing the data. The sites may grow to 50+, but the head office is unlikely to have more than 5 people accessing the database at the same time.

My question is this (Access 2007 - I can't see any user-level security):
Am I better off creating an individual database for each site (all exactly the same apart from name, titles, etc.) that sits in its own folder accessed via Terminal Services, and then letting the head office access each separate database via linked tables?
My thinking behind that approach was security (sites can't see each other's data), resilience (if one database gets corrupted, the others are still workable) and speed (only a few users accessing each database).
The obvious downside is the number of files, although that's nothing compared to the Excel files at the moment.
Any ideas or help would be appreciated.
Thanks,
Louis
 
Consider these points:

Split the database between code and data. I would consider having just one data file rather than several (i.e. one for each site - but see below), with each user having a separate frontend.
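As a rough sketch of what the split looks like in practice (the backend path below is just an assumed example), the frontend holds linked tables pointing at the single data file, and a routine like this relinks them if the data file ever moves:

Public Sub RelinkTables()
    Dim db As DAO.Database
    Dim tdf As DAO.TableDef
    ' assumed backend location - point this at wherever the data file lives
    Const BACKEND_PATH As String = "\\server\data\orders_be.accdb"
    Set db = CurrentDb
    For Each tdf In db.TableDefs
        If Len(tdf.Connect) > 0 Then    ' only touch linked (attached) tables
            tdf.Connect = ";DATABASE=" & BACKEND_PATH
            tdf.RefreshLink
        End If
    Next tdf
End Sub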

Access performance is unlikely to be acceptable over a network, but running on a TS will be fine. You need to consider how the TS can save data spreadsheets, produce reports, etc. in a way that lets your users get at them - you may not want them to have full access to the TS.

All versions of Access have built-in security that enables you to require a login and then limit the ability to open/process certain forms, etc.

However, this doesn't extend to restricting access to limited content within those tables. To do that you will need to look at some conditional code to determine which records are retrieved by each user.
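As one possible sketch (the tblUserSites lookup table and the field names here are assumptions, not anything built in), a small function can map the Windows login to a site, and the site forms can then be based on a query that filters on it:

Public Function CurrentSiteID() As Long
    ' look up which site the current Windows login belongs to
    ' (assumes a tblUserSites table with LoginName and SiteID fields)
    CurrentSiteID = Nz(DLookup("SiteID", "tblUserSites", _
        "LoginName = '" & Environ("USERNAME") & "'"), 0)
End Function

The site data-entry forms would then be bound to something along the lines of SELECT * FROM tblOrders WHERE SiteID = CurrentSiteID(), so each office only ever retrieves its own records.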

Your alternative of separate databases per regional office is workable, but it makes it harder to get a consolidated picture of all the data.

I would have thought corruption is very unlikely to be an issue - it sounds like you will be updating the databases on a regular but infrequent basis, and therefore even a catastrophic failure would be recoverable from backups. In any event, Access databases are very reliable, in the same way that your spreadsheets don't really get corrupted.
 
Hi, thanks for the prompt reply. The head office will always have between 3 and 5 users accessing the full data set. The remote offices only access the data for, say, 15 minutes per day, mostly at different times, but you could end up with 30 offices accessing it at once for a few minutes every now and again. I was always under the impression that Access would fall over at this level of users.
I am now leaning towards a frontend/backend split, but I can't for the life of me see how to restrict each user to a specific table (there are no users/groups in 2007 like there are in 2003).
And how do you go about stopping the frontends from being tampered with if you just supply a form to the users?
 
