Unbounded Knowledge Acquisition Based Upon
Mutual Information in Dependent Questions
Tony C. Smith & Chris van de Molen
Department of Computer Science, University of Waikato
tcs@cs.waikato.ac.nz
AI 2010
Outline
Motivation/background
Representations of knowledge
Truth table
Attribute vs. entity
              Fruitbat  Eagle  Tiger  Rock  …
Is alive?     T         T      T      F
Flies?        T         T      F      F
Lays eggs?    F         T      F      F
…
Multivalued truth table
Attribute vs. entity
              Fruitbat  Eagle  Tiger  Rock  …
Is alive?     0.90      0.96   0.88   0.01
Flies?        0.85      0.97   0.04   0.22
Lays eggs?    0.20      0.91   0.00   0.01
…
Cell probabilities map onto graded answers: YES, NO, SOMETIMES, USUALLY, MAYBE, SELDOM, RARELY, etc.
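A minimal sketch of the multivalued truth table above, assuming a nested-dict representation and arbitrary (illustrative, not the authors') probability thresholds for the graded answer words:

```python
# Multivalued truth table: P(entity has attribute), values from the slide.
knowledge = {
    "Is alive?":  {"Fruitbat": 0.90, "Eagle": 0.96, "Tiger": 0.88, "Rock": 0.01},
    "Flies?":     {"Fruitbat": 0.85, "Eagle": 0.97, "Tiger": 0.04, "Rock": 0.22},
    "Lays eggs?": {"Fruitbat": 0.20, "Eagle": 0.91, "Tiger": 0.00, "Rock": 0.01},
}

def answer(attribute, entity):
    """Map a cell probability onto a graded answer word.

    The cut-off points below are illustrative assumptions only.
    """
    p = knowledge[attribute][entity]
    if p >= 0.85:
        return "YES"
    if p >= 0.60:
        return "USUALLY"
    if p >= 0.40:
        return "MAYBE"
    if p >= 0.15:
        return "SOMETIMES"
    if p >= 0.05:
        return "SELDOM"
    return "NO"

print(answer("Flies?", "Eagle"))  # -> YES
print(answer("Flies?", "Tiger"))  # -> NO
```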
Mutual Information
The information (i.e. uncertainty) in an event ω with probability pω, expressed in bits:
I(ω) = -log2 pω
Entropy is the average information content over all events:
H = -Σω pω log2 pω
Mutual information is the amount of information two events share:
MI(X,Y) = I(X) + I(Y) - I(X,Y)
where the joint information is I(X,Y) = I(X) + I(Y|X), so
MI(X,Y) = I(X) + I(Y) - [I(X) + I(Y|X)] = I(Y) - I(Y|X)
Mutual information example
Given:
P(A) = 1/32   P(B) = 1/64   P(B|A) = 1/2
Then:
I(A) = 5   I(B) = 6   I(B|A) = 1
I(A,B) = I(A) + I(B|A) = 6
MI(A,B) = 5 + 6 - (5 + 1) = 5 bits
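The worked example above can be checked numerically; this sketch uses only the definition I(ω) = -log2 pω and the joint decomposition I(A,B) = I(A) + I(B|A):

```python
import math

def info(p):
    """Information content in bits: I = -log2 p."""
    return -math.log2(p)

p_a = 1 / 32        # P(A)
p_b = 1 / 64        # P(B)
p_b_given_a = 1 / 2  # P(B|A)

i_a = info(p_a)                   # I(A) = 5 bits
i_b = info(p_b)                   # I(B) = 6 bits
i_ab = i_a + info(p_b_given_a)    # joint I(A,B) = I(A) + I(B|A) = 6 bits
mi = i_a + i_b - i_ab             # MI(A,B) = 5 + 6 - 6 = 5 bits

print(i_a, i_b, i_ab, mi)  # -> 5.0 6.0 6.0 5.0
```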