Sensitivity Equations

Below are descriptions of each of the utility-free sensitivity measures that Netica calculates.  First are some notes for interpreting the descriptions.

Definition:  In the definitions, "belief" means posterior probability (i.e. conditioned on all findings currently entered). In the names of the various measures, "real" refers to the expected real value of continuous nodes, or of discrete nodes that have a state number associated with each state; "expected value" means the expectation of a quantity.

   Range:  The minimum and maximum values that this measure can take on.

   Compare:   A quantity which is useful to compare the value of this measure against (perhaps to express this measure as a percentage).

Equation:   Note that all the conditionals should include all findings already entered into the network, so P(q) is really P(q|E), P(q|f) is really P(q|f,E), etc.

 Notation:

Q  is the query variable

F  is the varying variable

q  is a state of the query variable

f  is a state of the varying variable

Xq  is the state number corresponding to state q

SUM~q  means the sum over all states q of Q.  It applies to the whole expression following.

MIN~q, MAX~q  are similar to SUM~q

E(Q)   is the expected real value of Q before any new findings

E(Q|f) is the expected real value of Q after new finding f for node F

V(Q)   is the variance of the real value of Q before any new findings

H(Q)   is the entropy of Q before any new findings

RMS   is "root mean square", which is the square root of the average of the values squared.
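
As a concrete illustration of this notation, the short Python/NumPy sketch below builds a small, made-up joint distribution P(q,f) (it is not Netica output) and computes P(q), P(f), P(q|f), E(Q), V(Q) and H(Q) from it. The array names and the state numbers Xq are hypothetical; the same style of table is reused in the sketches that follow.

import numpy as np

# Hypothetical joint distribution P(q,f) over a 3-state query node Q and a
# 2-state varying node F (rows are states q, columns are states f).
P_qf = np.array([[0.30, 0.10],
                 [0.15, 0.15],
                 [0.05, 0.25]])

P_q = P_qf.sum(axis=1)                 # P(q)   = SUM~f P(q,f)
P_f = P_qf.sum(axis=0)                 # P(f)   = SUM~q P(q,f)
P_q_given_f = P_qf / P_f               # P(q|f) = P(q,f) / P(f), one column per f

Xq = np.array([0.0, 1.0, 2.0])         # hypothetical state numbers of Q
E_Q = (P_q * Xq).sum()                 # E(Q) = SUM~q P(q) Xq
V_Q = (P_q * (Xq - E_Q) ** 2).sum()    # V(Q) = SUM~q P(q) [Xq - E(Q)]^2
H_Q = -(P_q * np.log2(P_q)).sum()      # H(Q), entropy of Q in bits

print(P_q, P_f, E_Q, V_Q, H_Q)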

Entropy Reduction (Mutual Information)

Definition:

The expected reduction in entropy of Q due to a finding at F; this is the mutual information between Q and F (measured in bits).

Range:

[0, H(Q)]     0 if Q is independent of F

Reference:

Pearl88, p. 321.  He has the sign of I(T,X) backwards.
Var mapping: T->Q, X->F, I(T,X)->I

Compare:

H(Q)

Equation:

I = H(Q) - H(Q|F)
= SUM~q SUM~f P(q,f) log (P(q,f) / [P(q) P(f)])

Note that the log is base 2, which is traditional for entropy and mutual information, so that the units of the results will be "bits".
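
The following sketch evaluates both forms of the equation on the same style of hypothetical joint table used above (Python/NumPy; the numbers are made up, not Netica output) and confirms that they give the same value in bits.

import numpy as np

P_qf = np.array([[0.30, 0.10],
                 [0.15, 0.15],
                 [0.05, 0.25]])        # hypothetical P(q,f); rows q, columns f
P_q = P_qf.sum(axis=1)
P_f = P_qf.sum(axis=0)

# I = SUM~q SUM~f P(q,f) log2( P(q,f) / [P(q) P(f)] )
I = (P_qf * np.log2(P_qf / np.outer(P_q, P_f))).sum()

# Equivalent form I = H(Q) - H(Q|F)
H_Q = -(P_q * np.log2(P_q)).sum()
H_Q_given_F = -(P_qf * np.log2(P_qf / P_f)).sum()   # -SUM~q SUM~f P(q,f) log2 P(q|f)
print(I, H_Q - H_Q_given_F)            # the two forms agree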

Variance Reduction of Real

Definition:

The expected reduction in the variance of the real value of Q due to a finding at F.  This turns out to be the square of the RMS Change of Real.

Requires:

Node Q is continuous, or has state numbers defined.

Range:

[0, V(Q)]     0 if Q is independent of F

Reference:

Pearl88, p. 323.  What he calls C(T|X) is actually C(T|X) - C(T).
Var mapping: T->Q, X->F, C->V, t->q and Xq

Compare:

V(Q)

Equation:

Vr = V(Q) - V(Q|F) = Vm
V(Q|F) = SUM~f P(f) V(Q|f)
V(Q) = SUM~q P(q) [Xq - E(Q)] ^ 2
V(Q|f) = SUM~q P(q|f) [Xq - E(Q|f)] ^ 2
E(Q) = SUM~q P(q) Xq
E(Q|f) = SUM~q P(q|f) Xq
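
Below is a sketch of this measure on the hypothetical joint table introduced earlier, reading V(Q|F) as the expectation over f of V(Q|f), i.e. SUM~f P(f) V(Q|f); the probabilities and state numbers Xq are made up for illustration.

import numpy as np

P_qf = np.array([[0.30, 0.10],
                 [0.15, 0.15],
                 [0.05, 0.25]])        # hypothetical P(q,f); rows q, columns f
Xq = np.array([0.0, 1.0, 2.0])         # hypothetical state numbers of Q
P_q = P_qf.sum(axis=1)
P_f = P_qf.sum(axis=0)
P_q_given_f = P_qf / P_f

E_Q = (P_q * Xq).sum()                                  # E(Q)
V_Q = (P_q * (Xq - E_Q) ** 2).sum()                     # V(Q)
E_Q_f = (P_q_given_f * Xq[:, None]).sum(axis=0)         # E(Q|f), one value per f
V_Q_f = (P_q_given_f * (Xq[:, None] - E_Q_f) ** 2).sum(axis=0)   # V(Q|f)

Vr = V_Q - (P_f * V_Q_f).sum()         # Vr = V(Q) - SUM~f P(f) V(Q|f)
print(Vr)
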
Minimum Belief

Definition:

Minimum belief that each state q of Q can take due to a finding at F.  
This provides a value for each state.

Range:

[0, P(q)]     P(q) if Q is independent of F

Compare:

P(q)

Equation:

Pmin(q) = MIN~f P(q|f)

Maximum Belief

Definition:

Maximum belief that each state q of Q can take due to a finding at F.  
This provides a value for each state.

Range:

[P(q), 1]     P(q) if Q is independent of F

Compare:

P(q)

Equation:

Pmax(q) = MAX~f P(q|f)
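
A single sketch covers both the Minimum Belief and Maximum Belief measures, taking the per-state minimum and maximum of P(q|f) over the states f of the hypothetical joint table (the numbers are made up, not Netica output).

import numpy as np

P_qf = np.array([[0.30, 0.10],
                 [0.15, 0.15],
                 [0.05, 0.25]])        # hypothetical P(q,f); rows q, columns f
P_f = P_qf.sum(axis=0)
P_q_given_f = P_qf / P_f               # column f holds the beliefs of Q given F = f

Pmin = P_q_given_f.min(axis=1)         # Pmin(q) = MIN~f P(q|f), one value per state q
Pmax = P_q_given_f.max(axis=1)         # Pmax(q) = MAX~f P(q|f)
print(Pmin, Pmax)
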
RMS Change of Belief

Definition:

The square root of the expected change squared of the belief of state q of Q, due to a finding at F.  
This provides a value for each state.  
This is the standard deviation of P(q|f) about P(q) due to a finding at F, with the finding at F distributed by P(f).

Reference:

Spiegelhalter89 & Neapolitan90, p. 394.  They call the square of this quantity simply "variance".

Range:

[0, 1]     0 if Q is independent of F

Compare:

P(q)

Equation:

sp(q) = sqrt (Vp(q))
Vp(q) = SUM~f P(f) [P(q|f) - P(q)] ^ 2
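
A sketch of this equation on the hypothetical joint table used above (made-up numbers); it returns one value sp(q) per state of Q.

import numpy as np

P_qf = np.array([[0.30, 0.10],
                 [0.15, 0.15],
                 [0.05, 0.25]])        # hypothetical P(q,f); rows q, columns f
P_q = P_qf.sum(axis=1)
P_f = P_qf.sum(axis=0)
P_q_given_f = P_qf / P_f

Vp = (P_f * (P_q_given_f - P_q[:, None]) ** 2).sum(axis=1)   # Vp(q), one per state q
sp = np.sqrt(Vp)                       # sp(q) = sqrt(Vp(q)), RMS change of belief
print(sp)
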
"Variance" of Node Belief (named "Quadratic Score" in older versions of Netica)

Definition:

The expected change squared of the beliefs of Q, taken over all of its states, due to a finding at F.

Reference:

Spiegelhalter89 & Neapolitan90, p. 394.  They call this "variance" (for them it comes out the same as Vp(q) because they just use 2-state nodes).

Range:

[0, 1]      0 if Q is independent of F

Equation:

s2 = SUM~f SUM~q P(q,f) [P(q|f) - P(q)] ^ 2
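
A sketch of this measure on the hypothetical joint table; unlike Vp(q), it sums over the states of Q as well as F and so yields a single number (the table is made up for illustration).

import numpy as np

P_qf = np.array([[0.30, 0.10],
                 [0.15, 0.15],
                 [0.05, 0.25]])        # hypothetical P(q,f); rows q, columns f
P_q = P_qf.sum(axis=1)
P_f = P_qf.sum(axis=0)
P_q_given_f = P_qf / P_f

s2 = (P_qf * (P_q_given_f - P_q[:, None]) ** 2).sum()   # single number over all q, f
print(s2)
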
Minimum Real

Definition:

The lowest that the expected real value of Q could go to, due to a finding at F.

Requires:

Node Q is continuous, or has state numbers defined.

Range:

(-infinity, E(Q)]      E(Q) if Q is independent of F

Compare:

E(Q) = SUM~q P(q) Xq

Equation:

mmin = MIN~f E(Q|f)

Maximum Real

Definition:

The highest that the expected real value of Q could go to, due to a finding at F.

Requires:

Node Q is continuous, or has state numbers defined.

Range:

[E(Q), infinity)      E(Q) if Q is independent of F

Compare:

E(Q)

Equation:

mmax = MAX~f E(Q|f)
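
A single sketch covers both the Minimum Real and Maximum Real measures, taking the minimum and maximum of E(Q|f) over the states f of the hypothetical joint table (probabilities and state numbers Xq are made up).

import numpy as np

P_qf = np.array([[0.30, 0.10],
                 [0.15, 0.15],
                 [0.05, 0.25]])        # hypothetical P(q,f); rows q, columns f
Xq = np.array([0.0, 1.0, 2.0])         # hypothetical state numbers of Q
P_f = P_qf.sum(axis=0)
P_q_given_f = P_qf / P_f

E_Q_f = (P_q_given_f * Xq[:, None]).sum(axis=0)   # E(Q|f), one value per f
mmin, mmax = E_Q_f.min(), E_Q_f.max()             # MIN~f E(Q|f), MAX~f E(Q|f)
print(mmin, mmax)
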
RMS Change of Real

Definition:

The square root of the expected change squared in the expected real value of Q, due to a finding at F.  This turns out to be the same as the square root of the variance reduction of expected value.

Requires:

Node Q is continuous, or has state numbers defined.

Range:

[0, sqrt(V(Q))]     0 if Q is independent of F

Compare:

E(Q)  and maybe V(Q)

Equation:

sm = sqrt (Vm)
Vm = SUM~f P(f) [E(Q|f) - E(Q)] ^ 2 = Vr
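
A sketch of this equation on the hypothetical joint table (made-up numbers and state values); squaring the result reproduces the Variance Reduction of Real from the earlier sketch, since Vm = Vr.

import numpy as np

P_qf = np.array([[0.30, 0.10],
                 [0.15, 0.15],
                 [0.05, 0.25]])        # hypothetical P(q,f); rows q, columns f
Xq = np.array([0.0, 1.0, 2.0])         # hypothetical state numbers of Q
P_q = P_qf.sum(axis=1)
P_f = P_qf.sum(axis=0)
P_q_given_f = P_qf / P_f

E_Q = (P_q * Xq).sum()
E_Q_f = (P_q_given_f * Xq[:, None]).sum(axis=0)

Vm = (P_f * (E_Q_f - E_Q) ** 2).sum()  # Vm = SUM~f P(f) [E(Q|f) - E(Q)]^2
sm = np.sqrt(Vm)                       # RMS change of the expected real value
print(sm, Vm)                          # Vm equals Vr from the Variance Reduction sketch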