Modifying their structure is generally frowned upon.
Replication is more than likely out, and I'd throw Sync out before that (based on real-life, high-transaction tests with the Sync Framework).
If 3-4 hours is your acceptable data latency, log shipping will probably be your best bet here for a read-only copy. But how much change is happening in the log? Find that out by monitoring it, so you know how quickly, and how much, you need to push across.
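As a rough sketch of that monitoring step: if you sample your transaction-log backup sizes over a day (for example from `msdb.dbo.backupset`), you can estimate the churn you'd have to ship per hour. The numbers below are hypothetical, purely to show the arithmetic.

```python
from datetime import datetime

# Hypothetical (timestamp, backup_size_mb) pairs for hourly log backups,
# as you might pull from msdb.dbo.backupset on the primary.
log_backups = [
    (datetime(2013, 5, 1, 9, 0), 120.0),
    (datetime(2013, 5, 1, 10, 0), 340.0),
    (datetime(2013, 5, 1, 11, 0), 95.0),
    (datetime(2013, 5, 1, 12, 0), 210.0),
]

def mb_per_hour(backups):
    """Average log churn across the sampled window."""
    hours = (backups[-1][0] - backups[0][0]).total_seconds() / 3600
    # The first backup marks the start of the window; sum the rest.
    total_mb = sum(size for _, size in backups[1:])
    return total_mb / hours

print(mb_per_hour(log_backups))  # → 215.0
```

If that per-hour figure is small relative to your link to the secondary, log shipping on a 3-4 hour restore cadence is comfortable; if not, you know before you build it.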
If you can't go to mirroring or upgrade to 2012 Enterprise, that is out, although it would be a sound strategy if you can move to Enterprise.
SSIS isn't meant to just dump data over, but it can do it. You're looking at far too much in terms of lookup transformations, though, and the task would be expensive in time and resources. Although, like I said, it can do it.
Really, the choices will narrow distinctly once you answer a few questions:
- Acceptable data latency
- Amount of data change in a given minute, hour, and day
- Connectivity to the secondary
- Read requirements on the secondary instance
- Up-time of the secondary instance
- Alterations to existing schema and all objects
- Security
Yes, there are significant benefits to data normalization if you are willing to do the work to achieve and maintain it. The two fundamental benefits of normalization are:
- Data Integrity
- Query Flexibility
The simple approach to normalization is to create a table for each person, place, thing, concept, or event. By doing this, you have each characteristic of each person, place, thing, concept, or event of interest to you in one and only one place in the database.

The benefit of data integrity is realized when you insert and update data. Because you have normalized, you don't have redundant copies of the same characteristic scattered across the database, each with a different value, that you have to remember to update and keep in sync. Second, you have only one place in the database to program checks when inserting or updating data to make sure the data is valid. Since your data is coming in from many Excel spreadsheets, normalizing your data gives you the opportunity to implement data integrity checks on the load, to make sure your analysis is based on accurate data.

The benefit of query flexibility is realized when you want to read and analyze the data. Because you have normalized the data, you can connect up the tables in a flexible way based on the question you want answered, including only what you need to answer that specific question. Second, this enables the database to return the answers to your questions much faster than if it had to scan through all the data, including the data not relevant to your question, in your un-normalized tables.
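To make the query-flexibility point concrete, here is a minimal sketch using SQLite as a stand-in for Access/SQL Server (the SQL is the same idea in all three). The table and column names are invented for illustration: one table per "thing" (customers, orders), and a new question becomes just a new join.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# One table per "thing": each fact stored in one and only one place.
cur.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT, city TEXT)")
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, "
            "customer_id INTEGER REFERENCES customer(id), amount REAL)")
cur.executemany("INSERT INTO customer VALUES (?, ?, ?)",
                [(1, "Acme", "Boston"), (2, "Globex", "Chicago")])
cur.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(1, 1, 50.0), (2, 1, 75.0), (3, 2, 20.0)])

# A question nobody anticipated at design time: sales by city.
# Because the data is normalized, it's just a join and a GROUP BY.
cur.execute("""
    SELECT c.city, SUM(o.amount)
    FROM customer c JOIN orders o ON o.customer_id = c.id
    GROUP BY c.city ORDER BY c.city
""")
print(cur.fetchall())  # → [('Boston', 125.0), ('Chicago', 20.0)]
```

Had the city been copied onto every order row instead, every report would depend on those copies staying in sync, and a customer moving cities would mean updating many rows.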
Access is a simplified DBMS, but it does include a basic SQL processor that allows you to write queries and thus take advantage of the benefits of normalized data. If you are eventually going to move to SQL Server, which is a full-featured DBMS, then normalizing your data now will ease the transition and let you take advantage of the full capabilities of SQL Server and its very rich implementation of SQL.
As I mentioned at the beginning, to achieve these benefits you must be willing to do the up-front programming to translate the data coming in from your various Excel dumps and map the rows and columns in those spreadsheets to your normalized tables. This is not a trivial exercise, but it is doable with Access programming. One approach is to create tables that replicate the data as it appears in the source and load the data into them. These are known as stage tables. Once you have the un-normalized data in Access tables, you can then more easily write Access code using SQL to extract the data from those stage tables, normalize it, identify data quality issues (say, the same characteristic in two different Excel dumps that should have the same value but does not), and load it into your normalized tables. This is the common method for normalizing data coming from an un-normalized source, and is widely used in subject-area-based data warehouses.
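A hedged sketch of that stage-table flow, again using SQLite in place of Access, with hypothetical spreadsheet columns: load the dump as-is, run a data-quality check in SQL, then load only the clean rows into the normalized table.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Stage table: mirrors the Excel dump exactly, un-normalized, no constraints.
cur.execute("CREATE TABLE stage_dump (emp_name TEXT, dept TEXT, dept_location TEXT)")
cur.executemany("INSERT INTO stage_dump VALUES (?, ?, ?)", [
    ("Ann", "Sales", "Boston"),
    ("Bob", "Sales", "Boston"),
    ("Cid", "Sales", "Chicago"),   # conflicting location for Sales: a quality issue
    ("Dee", "HR",    "Boston"),
])

# Data-quality check: the same department reported with different locations.
cur.execute("""
    SELECT dept FROM stage_dump
    GROUP BY dept HAVING COUNT(DISTINCT dept_location) > 1
""")
conflicts = [row[0] for row in cur.fetchall()]
print(conflicts)  # → ['Sales'] — report this back to the source owners

# Load only the unambiguous departments into the normalized table.
cur.execute("CREATE TABLE department (name TEXT PRIMARY KEY, location TEXT)")
cur.execute("""
    INSERT INTO department
    SELECT dept, MIN(dept_location) FROM stage_dump
    WHERE dept NOT IN (SELECT dept FROM stage_dump
                       GROUP BY dept HAVING COUNT(DISTINCT dept_location) > 1)
    GROUP BY dept
""")
```

The same pattern scales to as many dumps as you have: one stage table per source layout, one set of SQL checks, one normalized target.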
You will find this additional level of effort to be well worth it once you have high-quality, normalized data in your Access database. Your report consumers will see that you are a real data professional when you show them examples where data quality was poor and you discovered that fact so it could be corrected in the sources. Likewise, when they ask for a new report that analyzes the data in a very different way, you can rapidly create it using SQL to combine the data in the normalized tables in a way that wasn't originally anticipated. They will be very impressed that you can do this quickly and easily!
I hope this helps explain why normalization would be of benefit to you.
Yes.
When you "upsize", Access creates "linked tables".
The queries and forms on top of these don't know the difference.
Edit:
You'll need to deploy a new mdb/accdb to each client, along with a DSN (Control Panel, Administrative Tools, "Data Sources (ODBC)").