Every textbook I've ever seen gives you one or more algorithms you can use to determine every possible candidate key for R{ABCDEFG}. (It can be a fairly laborious and time-consuming job.) Based on the FDs you provided, I think R{ABCDEFG} has only one candidate key, {AC}.
Loosely speaking, for a relation to be in 3NF, it must be in 2NF and have no transitive dependencies. For a relation to be in 2NF, it must be in 1NF and have no partial key dependencies.
- The only candidate key here is {AC}.
- From A → DE we know that {D, E} depends on A, which is only part of the candidate key.
Therefore, R{ABCDEFG} is not in 2NF (or 3NF) based on the FDs you provided.
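The partial-dependency test above can be mechanized with an attribute-closure function. This is only a sketch: of the FDs in the question, only A → DE is quoted here, so the `fds` list below is deliberately incomplete and serves just to exhibit the partial dependency on the candidate key {A, C}.

```python
def closure(attrs, fds):
    """Compute X+, the set of all attributes determined by attrs under fds.
    fds is a list of (lhs, rhs) pairs, each a set of attribute names."""
    result = set(attrs)
    changed = True
    while changed:
        changed = False
        for lhs, rhs in fds:
            if lhs <= result and not rhs <= result:
                result |= rhs
                changed = True
    return result

# Only A -> DE is stated explicitly above; the full F is not reproduced here.
fds = [({"A"}, {"D", "E"})]

# {A} is a proper subset of the candidate key {A, C}, yet it determines D and E:
print(sorted(closure({"A"}, fds)))  # ['A', 'D', 'E'] -> partial dependency, not 2NF
```

Since a proper subset of the key determines non-prime attributes, the 2NF condition fails immediately.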
When you have a relation R with a set of functional dependencies F and a decomposition of R into R1, ..., Rn, you must distinguish two different concepts:
The closure of the set of functional dependencies F. This closure, called F+, is the set of all the dependencies derivable from F by applying, as long as they can be applied, a set of rules called “Armstrong’s axioms”. This set can be very large (exponential in the number of dependencies in F), so in general it is not calculated. What can be calculated, instead, is a cover of it, which is a (usually small) set of dependencies from which F+ can be derived (and which is therefore equivalent to F). This computation is useful, for instance, when you normalize a relation.
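In practice one never materializes F+: to decide whether some X → Y belongs to F+, it is enough to check Y ⊆ X+, where X+ is the attribute closure of X under F. A minimal sketch, using the two dependencies from the first example later in this answer:

```python
def closure(attrs, fds):
    """Compute X+, the attribute closure of attrs under fds.
    fds is a list of (lhs, rhs) pairs, each a set of attribute names."""
    result = set(attrs)
    changed = True
    while changed:
        changed = False
        for lhs, rhs in fds:
            if lhs <= result and not rhs <= result:
                result |= rhs
                changed = True
    return result

def in_closure(fds, lhs, rhs):
    """Membership test: X -> Y is in F+ iff Y is a subset of X+.
    This avoids ever enumerating the (possibly exponential) F+."""
    return rhs <= closure(lhs, fds)

# F from the first example below: {D, St -> Ex ; Ex -> D}
F = [({"D", "St"}, {"Ex"}), ({"Ex"}, {"D"})]
print(in_closure(F, {"Ex", "St"}, {"D"}))  # True: Ex -> D already gives D
print(in_closure(F, {"D"}, {"Ex"}))       # False: D alone determines nothing new
```

The membership test runs in polynomial time, which is why covers and closures of attribute sets, rather than F+ itself, are what normalization algorithms actually compute.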
The projection of the set of dependencies F over the decomposition of R. By definition, this projection is, for each Ri, the set of all the dependencies of F+ whose attributes are all contained in Ri (written πRi(F)). But, since we do not calculate F+, as said above, we cannot calculate this projection either. What can be done in practice, in simple cases, is to check whether some dependency in F has all of its attributes contained in Ri; if so, we are sure that this dependency holds in Ri. However, if in this way you do not find a dependency in one of the decomposed relations, that is no guarantee that the dependency is lost, since it could still hold as a consequence of other dependencies.
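The "simple cases" check described above is purely syntactic: keep the FDs of F whose attributes all fall inside Ri. A sketch, using the first example's FDs and assuming (from the discussion below) that the decomposition is R1 = (St, Ex) and R2 = (Ex, D):

```python
def syntactic_projection(fds, ri):
    """FDs of F (not of F+) whose attributes all fall inside Ri.
    This is only a sufficient check: a dependency absent from the result
    may still hold in Ri as a consequence of the other dependencies."""
    return [(lhs, rhs) for lhs, rhs in fds if lhs | rhs <= set(ri)]

F = [({"D", "St"}, {"Ex"}), ({"Ex"}, {"D"})]
R1 = {"St", "Ex"}   # assumed from the discussion of the first example
R2 = {"Ex", "D"}

print(syntactic_projection(F, R1))  # []  -- no FD of F fits inside R1
print(syntactic_projection(F, R2))  # only Ex -> D fits inside R2
```

As the answer warns, an empty result for R1 does not by itself prove that a dependency is lost; the dependency-preservation test discussed further below settles that question.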
In your first example, you have two dependencies in F for the relation R(St, Ex, D):
D, St → Ex
Ex → D
So, considering R2 = (Ex, D), we can see immediately that the dependency Ex → D holds in R2, since all its attributes are contained in R2, while the first dependency can hold in neither R1 nor R2, since it has three attributes. Actually, in relation R1 you have neither the first nor the second dependency (only trivial dependencies from F+ are present in R1, like St → St).
In your second example, instead, the first decomposed relation contains all the attributes of the first dependency, so that dependency trivially holds in it, while the second relation contains all the attributes of the second dependency, so again you can be sure that this dependency holds in the second relation.
Finally, note that, while it is not computationally feasible to calculate πRi(F), there is an efficient polynomial algorithm to check whether a certain decomposition preserves the dependencies, in other words, whether ∪πRi(F) is a cover of F+. You can see more details in this answer.
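That polynomial test is usually credited to Ullman: for each X → Y in F, grow a set Z starting from X by repeatedly projecting attribute closures onto each Ri; the dependency is preserved iff Y ends up inside Z. A minimal sketch, applied to the first example:

```python
def closure(attrs, fds):
    """Attribute closure of attrs under fds ((lhs, rhs) pairs of sets)."""
    result = set(attrs)
    changed = True
    while changed:
        changed = False
        for lhs, rhs in fds:
            if lhs <= result and not rhs <= result:
                result |= rhs
                changed = True
    return result

def preserves_dependencies(fds, decomposition):
    """Polynomial test that the union of the projections covers F,
    without ever computing any projection pi_Ri(F) explicitly."""
    for lhs, rhs in fds:
        z = set(lhs)
        changed = True
        while changed:
            changed = False
            for ri in decomposition:
                gained = closure(z & ri, fds) & ri
                if not gained <= z:
                    z |= gained
                    changed = True
        if not rhs <= z:
            return False  # this FD cannot be enforced within the fragments
    return True

F = [({"D", "St"}, {"Ex"}), ({"Ex"}, {"D"})]
print(preserves_dependencies(F, [{"St", "Ex"}, {"Ex", "D"}]))  # False
```

The `False` confirms what the syntactic inspection suggested: D, St → Ex is genuinely lost in the first decomposition, not merely hidden.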
In general, there are different canonical covers of a set of functional dependencies, and a canonical cover is called minimal if it has fewer dependencies than any equivalent cover.
So, for instance, in your example the cover:
is minimal, because it has 3 dependencies and it is not possible to find an equivalent cover with fewer dependencies.
There are several definitions of cover and algorithms to compute them. For instance, Chapter 5 of The Theory of Relational Databases (Maier, D., Computer Science Press, 1983) describes several kinds of cover (nonredundant, canonical, minimal, optimal, annular) and different algorithms to find them, but as far as I know nobody has given a formal algorithm to find all the possible minimal covers.
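The simplest of those notions, a nonredundant cover, can be sketched with a closure-based elimination: drop each FD that is implied by the remaining ones. Note this yields a nonredundant cover, not necessarily a minimal one, and the result depends on the order in which the FDs are examined; the example F below is invented purely for illustration.

```python
def closure(attrs, fds):
    """Attribute closure of attrs under fds ((lhs, rhs) pairs of sets)."""
    result = set(attrs)
    changed = True
    while changed:
        changed = False
        for lhs, rhs in fds:
            if lhs <= result and not rhs <= result:
                result |= rhs
                changed = True
    return result

def nonredundant_cover(fds):
    """Remove every FD implied by the others. Order-dependent, so this
    is a nonredundant cover in Maier's sense, not necessarily minimal."""
    cover = list(fds)
    for fd in list(cover):
        rest = [g for g in cover if g is not fd]
        lhs, rhs = fd
        if rhs <= closure(lhs, rest):
            cover = rest
    return cover

# Hypothetical example: A -> C follows from A -> B and B -> C by transitivity.
F = [({"A"}, {"B"}), ({"B"}, {"C"}), ({"A"}, {"C"})]
print(nonredundant_cover(F))  # the redundant A -> C is dropped
```

Finding a truly minimal cover additionally requires minimizing left-hand sides and, per the point above, there may be several distinct minimal covers for the same F.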