

Grammatical inference: techniques and algorithms

Colin de la Higuera


Acknowledgements

• Laurent Miclet, Tim Oates, Jose Oncina, Rafael Carrasco, Paco Casacuberta, Pedro Cruz, Rémi Eyraud, Philippe Ezequel, Henning Fernau, Jean-Christophe Janodet, Thierry Murgue, Frédéric Tantini, Franck Thollard, Enrique Vidal,...

• … and a lot of other people to whom I am grateful


Outline

1 An introductory example
2 About grammatical inference
3 Some specificities of the task
4 Some techniques and algorithms
5 Open issues and questions


1 How do we learn languages?

A very simple example


The problem:

• You are in an unknown city and have to eat.

• You therefore visit a number of selected restaurants.

• Your goal is to build a model of the city (a map).


The data

• Up Down Right Left Left → Restaurant

• Down Down Right → Not a restaurant

• Left Down → Restaurant


Hopefully something like this:

[Figure: a small automaton over the moves u, d, l, r whose states are labelled R (restaurant) or N (not a restaurant).]


[Figure: a larger automaton (the map) with states labelled R or N and transitions labelled u and d.]


Further arguments (1)

• How did we get hold of the data?
– Random walks
– Following someone:
• someone knowledgeable
• someone trying to lose us
• someone on a diet
– Exploring


Further arguments (2)

• Can we not have better information (for example the names of the restaurants)?

• But then we may only have information about the routes to restaurants (not to the "non-restaurants")…


Further arguments (3)

What if instead of getting the information "Elimo" or "restaurant", I get the information "good meal" or "7/10"?

Reinforcement learning: POMDPs (partially observable Markov decision processes)


Further arguments (4)

• Where is my algorithm to learn these things?

• Should I perhaps consider several algorithms for the different types of data?


Further arguments (5)

• What can I say about the result?

• What can I say about the algorithm?


Further arguments (6)

• What if I want something richer than an automaton?
– A context-free grammar
– A transducer
– A tree automaton…


Further arguments (7)

• Why do I want something as rich as an automaton?

• What about:
– A simple pattern?
– Some SVM obtained from features over the strings?
– A neural network that would tell me, with high probability, whether some path will bring me to a restaurant or not?


Our goal/idea

• The ancient Greeks: a whole is more than the sum of its parts

• Gestalt theory: a whole is different from the sum of its parts


Better said

• There are cases where the data cannot be analyzed by considering it piece by piece

• There are cases where intelligibility of the pattern is important


What do people know about formal language theory?

[Scale: Nothing … Lots]


A small reminder on formal language theory

• Chomsky hierarchy
• Pros and cons of grammars


A crash course in Formal language theory

• Symbols
• Strings
• Languages
• Chomsky hierarchy
• Stochastic languages


Symbols are taken from some alphabet Σ.

Strings are sequences of symbols from Σ.


Languages are sets of strings over Σ:

languages are subsets of Σ*.


Special languages

• Are recognised by finite state automata

• Are generated by grammars


DFA: Deterministic Finite State Automaton

[Figure: a three-state DFA over the alphabet {a, b}.]


[Figure: the same DFA accepting the string abab: abab ∈ L.]
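To make the membership claim concrete, here is a minimal runnable sketch (mine, not from the deck) of the linear-time test a DFA provides. The target is hypothetical: the parity language {w : |w|_a and |w|_b both even}, which reappears on a later slide; the DFA is encoded directly as a transition function.

# Membership in a DFA: one transition per symbol, so O(|w|) time.
# Hypothetical 4-state DFA for {w : |w|_a and |w|_b both even}.
START = (0, 0)  # state = (parity of a's, parity of b's)

def delta(q, sym):
    pa, pb = q
    return (pa ^ 1, pb) if sym == "a" else (pa, pb ^ 1)

def accepts(w):
    q = START
    for sym in w:
        q = delta(q, sym)
    return q == START  # accept iff both parities are even again

print(accepts("abab"))  # True: two a's and two b's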


What is a context-free grammar?

A 4-tuple (Σ, S, V, P) such that:
– Σ is the alphabet;
– V is a finite set of non-terminals;
– S ∈ V is the start symbol;
– P ⊆ V × (V ∪ Σ)* is a finite set of rules.


Example of a grammar

The Dyck1 grammar
– (Σ, S, V, P)
– Σ = {a, b}
– V = {S}
– P = {S → aSbS, S → λ}


Derivations and derivation trees

S ⇒ aSbS ⇒ aaSbSbS ⇒ aabSbS ⇒ aabbS ⇒ aabb

[Figure: the corresponding derivation tree of aabb.]
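As a small companion to the grammar, a sketch (an assumption of mine, not from the slides) of a membership test for the Dyck1 language generated by S → aSbS, S → λ: read a as an opening bracket and b as a closing one.

# w is in Dyck1 iff no prefix closes more than it opens and the counts match.
def in_dyck1(w):
    depth = 0
    for sym in w:
        depth += 1 if sym == "a" else -1
        if depth < 0:        # a b with no matching a
            return False
    return depth == 0        # every a is eventually closed

for w in ("aabb", "abab", "abba"):
    print(w, in_dyck1(w))    # True, True, False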


Chomsky Hierarchy

• Level 0: no restriction
• Level 1: context-sensitive
• Level 2: context-free
• Level 3: regular


Chomsky Hierarchy

• Level 0: whatever Turing machines can do

• Level 1: context-sensitive
– {aⁿbⁿcⁿ : n ∈ ℕ}
– {aⁿbᵐcⁿdᵐ : n, m ∈ ℕ}
– {uu : u ∈ Σ*}

• Level 2: context-free
– {aⁿbⁿ : n ∈ ℕ}
– brackets

• Level 3: regular
– Regular expressions (GREP)


The membership problem

• Level 0: undecidable
• Level 1: decidable
• Level 2: polynomial
• Level 3: linear


The equivalence problem

• Level 0: undecidable
• Level 1: undecidable
• Level 2: undecidable
• Level 3: polynomial, but only when the representation is a DFA


PFA: Probabilistic Finite (state) Automaton

[Figure: a four-state PFA over {a, b} with fractional transition and halting probabilities (1/4, 1/3, 1/2, …).]


DPFA: Deterministic Probabilistic Finite (state) Automaton

[Figure: a three-state DPFA over {a, b} with probabilities 0.1, 0.3, 0.65, 0.35, 0.9, 0.7, 0.3, 0.7.]


What is nice about grammars?

• Compact representation
• Recursion
• They say how a string belongs, not just whether it belongs
• Graphical representations (automata, parse trees)


What is not so nice about grammars?

• Even the easiest class (level 3) contains SAT, Boolean functions, parity functions…

• Noise is very harmful: think about adding edit noise to the language {w : |w|_a ≡ 0 (mod 2) and |w|_b ≡ 0 (mod 2)}


2 Specificities of grammatical inference

Grammatical inference consists (roughly) in finding the (or a) grammar or automaton that has produced a given set of strings (sequences, trees, terms, graphs).


The field

[Figure: grammatical inference at the intersection of inductive inference, pattern recognition and machine learning, with links to computational linguistics, computational biology and web technologies.]


The data

• Strings, trees, terms, graphs
• Structural objects
• Basically the same information gap as between tables/arrays and structured data types in programming


Alternatives to grammatical inference

• Two steps:
– Extract features from the strings
– Use a very good method over ℝⁿ


Examples of strings

A string in Gaelic and its translation to English:

• Tha thu cho duaichnidh ri èarr àirde de a’ coisich deas damh

• You are as ugly as the north end of a southward traveling ox


>A BAC=41M14 LIBRARY=CITB_978_SKBAAGCTTATTCAATAGTTTATTAAACAGCTTCTTAAATAGGATATAAGGCAGTGCCATGTAGTGGATAAAAGTAATAATCATTATAATATTAAGAACTAATACATACTGAACACTTTCAATGGCACTTTACATGCACGGTCCCTTTAATCCTGAAAAAATGCTATTGCCATCTTTATTTCAGAGACCAGGGTGCTAAGGCTTGAGAGTGAAGCCACTTTCCCCAAGCTCACACAGCAAAGACACGGGGACACCAGGACTCCATCTACTGCAGGTTGTCTGACTGGGAACCCCCATGCACCTGGCAGGTGACAGAAATAGGAGGCATGTGCTGGGTTTGGAAGAGACACCTGGTGGGAGAGGGCCCTGTGGAGCCAGATGGGGCTGAAAACAAATGTTGAATGCAAGAAAAGTCGAGTTCCAGGGGCATTACATGCAGCAGGATATGCTTTTTAGAAAAAGTCCAAAAACACTAAACTTCAACAATATGTTCTTTTGGCTTGCATTTGTGTATAACCGTAATTAAAAAGCAAGGGGACAACACACAGTAGATTCAGGATAGGGGTCCCCTCTAGAAAGAAGGAGAAGGGGCAGGAGACAGGATGGGGAGGAGCACATAAGTAGATGTAAATTGCTGCTAATTTTTCTAGTCCTTGGTTTGAATGATAGGTTCATCAAGGGTCCATTACAAAAACATGTGTTAAGTTTTTTAAAAATATAATAAAGGAGCCAGGTGTAGTTTGTCTTGAACCACAGTTATGAAAAAAATTCCAACTTTGTGCATCCAAGGACCAGATTTTTTTTAAAATAAAGGATAAAAGGAATAAGAAATGAACAGCCAAGTATTCACTATCAAATTTGAGGAATAATAGCCTGGCCAACATGGTGAAACTCCATCTCTACTAAAAATACAAAAATTAGCCAGGTGTGGTGGCTCATGCCTGTAGTCCCAGCTACTTGCGAGGCTGAGGCAGGCTGAGAATCTCTTGAACCCAGGAAGTAGAGGTTGCAGTAGGCCAAGATGGCGCCACTGCACTCCAGCCTGGGTGACAGAGCAAGACCCTATGTCCAAAAAAAAAAAAAAAAAAAAGGAAAAGAAAAAGAAAGAAAACAGTGTATATATAGTATATAGCTGAAGCTCCCTGTGTACCCATCCCCAATTCCATTTCCCTTTTTTGTCCCAGAGAACACCCCATTCCTGACTAGTGTTTTATGTTCCTTTGCTTCTCTTTTTAAAAACTTCAATGCACACATATGCATCCATGAACAACAGATAGTGGTTTTTGCATGACCTGAAACATTAATGAAATTGTATGATTCTAT


<book> <part> <chapter> <sect1/> <sect1> <orderedlist numeration="arabic"> <listitem/> <f:fragbody/> </orderedlist> </sect1> </chapter> </part> </book>


<?xml version="1.0"?><?xml-stylesheet href="carmen.xsl" type="text/xsl"?><?cocoon-process type="xslt"?><!DOCTYPE pagina [<!ELEMENT pagina (titulus?, poema)><!ELEMENT titulus (#PCDATA)><!ELEMENT auctor (praenomen, cognomen, nomen)><!ELEMENT praenomen (#PCDATA)><!ELEMENT nomen (#PCDATA)><!ELEMENT cognomen (#PCDATA)><!ELEMENT poema (versus+)><!ELEMENT versus (#PCDATA)>]><pagina><titulus>Catullus II</titulus><auctor><praenomen>Gaius</praenomen><nomen>Valerius</nomen><cognomen>Catullus</cognomen></auctor>


A logic program learned by GIFT

color_blind(Arg1) :- start(Arg1,X), p11(Arg1,X).
start(X,X).
p11(Arg1,P) :- mother(M,P), p4(Arg1,M).
p4(Arg1,X) :- woman(X), father(F,X), p3(Arg1,F).
p4(Arg1,X) :- woman(X), mother(M,X), p4(Arg1,M).
p3(Arg1,X) :- man(X), color_blind(X).


3 Hardness of the task

– One thing is to build algorithms; another is to be able to state that they work.

– Some questions:
– Does this algorithm work?
– Do I have enough learning data?
– Do I need some extra bias?
– Is this algorithm better than the other one?
– Is this problem easier than the other one?


Alternatives to answer these questions:

– Use well-admitted benchmarks
– Build your own benchmarks
– Solve a real problem
– Prove things


Use well-admitted benchmarks

• yes: allows comparison

• no: many parameters

• problem: difficult to do better (also, in GI, there are not that many benchmarks around!)


Build your own benchmarks

• yes: allows one to progress

• no: one plays against oneself

• problem: one invents the benchmark where one is best!


Solve a real problem

• yes: it is the final goal

• no: we don't always know why things work

• problem: how much pre-processing?


Theory

• Because you may want to be able to say something more than "it seems to work in practice".


Identification in the limit

[Figure: the identification-in-the-limit setting. A class of languages 𝓛 is represented by a class of grammars 𝓖; a presentation Pres of a language L ∈ 𝓛 is fed to a learner. The naming function yields satisfies: f(ℕ) = g(ℕ) ⇒ yields(f) = yields(g), and the learner must produce a grammar whose language is yields(f).]


Given the successive elements f₁ f₂ … fₙ … of a presentation f, the learner outputs hypotheses h₁ h₂ … hₙ …

𝓛 is identifiable in the limit in terms of 𝓖 from Pres iff ∀L ∈ 𝓛, ∀f ∈ Pres(L), ∃i such that ∀n ≥ i: hₙ = hᵢ and L(hᵢ) = L.


He did not want to compose another Quixote (which would be easy) but the Quixote itself. Needless to say, he never contemplated a mechanical transcription of the original; he did not propose to copy it. His admirable ambition was to produce a number of pages which would coincide, word for word and line for line, with those of Miguel de Cervantes.

[…]

"My undertaking is not essentially difficult," I read in another part of the letter. "It would be enough to be immortal to carry it out."

Jorge Luis Borges (1899-1986), "Pierre Menard, autor del Quijote" (El jardín de senderos que se bifurcan), Ficciones


4 Algorithmic ideas


The space of GI problems

• Type of input (strings)
• Presentation of input (batch)
• Hypothesis space (subset of the regular grammars)
• Success criteria (identification in the limit)


Types of input

Strings: the cat hates the dog

Structural examples: [Figure: a parse tree over the words the, cat, hates, the, dog.]

Graphs: [Figure: a positive (+) and a negative (-) graph example.]


Types of input - oracles

• Membership queries
– Is string s in the target language?

• Equivalence queries
– Is my hypothesis correct?
– If not, provide a counter-example

• Subset queries
– Is the language of my hypothesis a subset of the target language?


Presentation of input

• Arbitrary order
• Shortest to longest
• All positive and negative examples up to some length
• Sampled according to some probability distribution


Presentation of input

• Text presentation
– A presentation of all strings in the target language

• Complete presentation (informant)
– A presentation of all strings over the alphabet of the target language, labelled as + or -


Hypothesis space

• Regular grammars
– A welter of subclasses

• Context-free grammars
– Fewer subclasses

• Hyper-edge replacement graph grammars


Success criteria

• Identification in the limit
– Text or informant presentation
– After each example, the learner guesses a language
– At some point, the guess is correct and never changes

• PAC learning


Theorems due to Gold

• The good news
– Any recursively enumerable class of languages can be learned in the limit from an informant (Gold, 1967)

• The bad news
– A language class is superfinite if it includes all finite languages and at least one infinite language
– No superfinite class of languages can be learned in the limit from a text (Gold, 1967)
– That includes the regular and the context-free languages


A picture

[Figure: a two-axis picture, from little information to a lot of information and from poor languages to rich languages, placing: subclasses of regular languages, from positive data; DFA, from positive and negative data; DFA, from queries; context-free languages, from positive data; mildly context-sensitive languages, from queries.]


Algorithms: RPNI · K-Reversible · GRIDS · SEQUITUR · L*


4.1 RPNI

• Regular Positive and Negative Grammatical Inference

Identifying regular languages in polynomial time

José Oncina & Pedro García, 1992


• It is a state-merging algorithm;

• It identifies any regular language in the limit;

• It works in polynomial time;

• It admits polynomial characteristic sets.


The algorithm

function rmerge(A, p, q)
  A = merge(A, p, q)
  while ∃r, ∃a ∈ Σ, ∃p, q ∈ δA(r, a) with p ≠ q do
    rmerge(A, p, q)


A = PTA(X+); Fr = {δ(q₀, a) : a ∈ Σ}; K = {q₀}

while Fr ≠ ∅ do
  choose q from Fr
  if ∃p ∈ K : L(rmerge(A, p, q)) ∩ X- = ∅
    then A = rmerge(A, p, q)
    else K = K ∪ {q}
  Fr = {δ(q, a) : q ∈ K, a ∈ Σ} \ K

(an executable sketch follows)
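A compact executable sketch of the above, under some assumptions of my own: states are named by their PTA prefixes, merges are kept in a union-find map, and fold plays the role of rmerge (merging for determinization). This follows the slides' outline, but the helper names are hypothetical.

def find(par, x):
    while par[x] != x:
        x = par[x]
    return x

def rpni(X_pos, X_neg):
    # PTA: the states are the prefixes of X+, in length-lex order.
    states = sorted({w[:i] for w in X_pos for i in range(len(w) + 1)},
                    key=lambda u: (len(u), u))
    pta = {(u[:-1], u[-1]): u for u in states if u}

    def fold(par, p, q):
        # Merge q's class into p's, then keep merging until deterministic.
        p, q = find(par, p), find(par, q)
        if p == q:
            return
        if (len(q), q) < (len(p), p):   # keep the length-lex smaller state
            p, q = q, p
        par[q] = p
        while True:
            succ = {}
            for (r, a), s in pta.items():
                key, tgt = (find(par, r), a), find(par, s)
                if key in succ and succ[key] != tgt:
                    fold(par, succ[key], tgt)   # nondeterminism found
                    break
                succ[key] = tgt
            else:
                return

    def rejects_negatives(par):
        delta = {(find(par, r), a): find(par, s) for (r, a), s in pta.items()}
        finals = {find(par, w) for w in X_pos}
        for w in X_neg:
            q = find(par, "")
            for a in w:
                q = delta.get((q, a))
                if q is None:
                    break
            if q in finals:
                return False
        return True

    parent = {s: s for s in states}
    red = [""]
    for q in states[1:]:                 # length-lex exploration, as in RPNI
        if find(parent, q) != q:
            continue                     # already merged into an earlier state
        for p in red:
            trial = dict(parent)
            fold(trial, p, q)
            if rejects_negatives(trial):
                parent = trial           # the merge is compatible with X-
                break
        else:
            red.append(q)                # no safe merge: promote q
    delta = {(find(parent, r), a): find(parent, s) for (r, a), s in pta.items()}
    return delta, {find(parent, w) for w in X_pos}

On the sample used in the trace that follows, this sketch ends with the same three-state DFA as the slides:

delta, finals = rpni({"", "aaa", "aaba", "ababa", "bb", "bbaaa"},
                     {"aa", "ab", "aaaa", "ba"})
print(len({q for q, _ in delta} | set(delta.values())))  # 3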


X+ = {λ, aaa, aaba, ababa, bb, bbaaa}
X- = {aa, ab, aaaa, ba}

[Figure: PTA(X+), with states numbered 1 to 15 in length-lex order of their prefixes.]


Try to merge 2 and 1

[Figure: the PTA, states 1 to 15; X- = {aa, ab, aaaa, ba}.]


Needs more merging for determinization

[Figure: after merging, states {1,2}, 3, 4, …, 15; X- = {aa, ab, aaaa, ba}.]


But now string aaaa is accepted, so the merge must be rejected

[Figure: after determinization, states {1,2,4,7}, {3,5,8}, 6, {9,11}, 10, 12, 13, 14, 15; X- = {aa, ab, aaaa, ba}.]


Try to merge 3 and 1

[Figure: the PTA again, states 1 to 15; X- = {aa, ab, aaaa, ba}.]


Requires to merge 6 with {1,3}

[Figure: states {1,3}, 2, 4, 5, …, 15; X- = {aa, ab, aaaa, ba}.]


And now to merge 2 with 10

[Figure: states {1,3,6}, 2, 4, 5, 7, …, 15; X- = {aa, ab, aaaa, ba}.]


And now to merge 4 with 13

[Figure: states {1,3,6}, {2,10}, 4, 5, 7, 8, 9, 11, …, 15; X- = {aa, ab, aaaa, ba}.]


And finally to merge 7 with 15

[Figure: states {1,3,6}, {2,10}, {4,13}, 5, 7, 8, 9, 11, 12, 14, 15; X- = {aa, ab, aaaa, ba}.]


No counter example is accepted so the merges are kept

[Figure: states {1,3,6}, {2,10}, {4,13}, {7,15}, 5, 8, 9, 11, 12, 14; X- = {aa, ab, aaaa, ba}.]


Next possible merge to be checked is {4,13} with {1,3,6}

[Figure: states {1,3,6}, {2,10}, {4,13}, {7,15}, 5, 8, 9, 11, 12, 14; X- = {aa, ab, aaaa, ba}.]


More merging for determinization is needed.

[Figure: states {1,3,4,6,13}, {2,10}, {7,15}, 5, 8, 9, 11, 12, 14; X- = {aa, ab, aaaa, ba}.]


[Figure: states {1,3,4,6,8,13}, {2,7,10,11,15}, 5, 9, 12, 14; X- = {aa, ab, aaaa, ba}.]

But now aa is accepted.


So we try {4,13} with {2,10}

[Figure: states {1,3,6}, {2,10}, {4,13}, {7,15}, 5, 8, 9, 11, 12, 14; X- = {aa, ab, aaaa, ba}.]


After determinizing, negative string aa is again accepted

[Figure: states {1,3,6}, {2,4,7,10,13,15}, {5,8}, {9,11}, 12, 14; X- = {aa, ab, aaaa, ba}.]


So we try 5 with {1,3,6}

[Figure: states {1,3,6}, {2,10}, {4,13}, {7,15}, 5, 8, 9, 11, 12, 14; X- = {aa, ab, aaaa, ba}.]


But again we accept ab

[Figure: states {1,3,5,6,12}, {2,9,10,14}, {4,13}, {7,15}, 8, 11; X- = {aa, ab, aaaa, ba}.]


So we try 5 with {2,10}

[Figure: states {1,3,6}, {2,10}, {4,13}, {7,15}, 5, 8, 9, 11, 12, 14; X- = {aa, ab, aaaa, ba}.]


Which is OK. So next possible merge is {7,15} with {1,3,6}

[Figure: states {1,3,6}, {2,5,10}, {4,9,13}, {7,15}, {8,12}, {11,14}; X- = {aa, ab, aaaa, ba}.]


Which is OK. Now try to merge {8,12} with {1,3,6,7,15}

[Figure: states {1,3,6,7,15}, {2,5,10}, {4,9,13}, {8,12}, {11,14}; X- = {aa, ab, aaaa, ba}.]


And ab is accepted

[Figure: states {1,3,6,7,8,12,15}, {2,5,10,11,14}, {4,9,13}; X- = {aa, ab, aaaa, ba}.]


Now try to merge {8,12} with {4,9,13}

[Figure: states {1,3,6,7,15}, {2,5,10}, {4,9,13}, {8,12}, {11,14}; X- = {aa, ab, aaaa, ba}.]


This is OK and no more merge is possible so the algorithm halts.

[Figure: the final three-state DFA, states {1,3,6,7,11,14,15}, {2,5,10}, {4,8,9,12,13}; X- = {aa, ab, aaaa, ba}.]


Definitions

• Let ≤ be the length-lex ordering over Σ*

• Let Pref(L) be the set of all prefixes of strings in some language L.


Short prefixes

Sp(L) = {u ∈ Pref(L) : δ(q₀, u) = δ(q₀, v) ⇒ u ≤ v}

• There is one short prefix per useful state

[Figure: a three-state DFA (states 0, 1, 2) over {a, b}.] Sp(L) = {λ, a}


Kernel-sets

• N(L) = {ua ∈ Pref(L) : u ∈ Sp(L)} ∪ {λ}
• There is an element in the kernel-set for each useful transition

[Figure: the same DFA.] N(L) = {λ, a, b, ab}


A characteristic sample

• A sample is characteristic (for RPNI) if:
– ∀x ∈ Sp(L), ∃u : xu ∈ X+
– ∀x ∈ Sp(L), ∀y ∈ N(L) : δ(q₀, x) ≠ δ(q₀, y) ⇒ ∃z ∈ Σ* : (xz ∈ X+ ∧ yz ∈ X-) ∨ (xz ∈ X- ∧ yz ∈ X+)


About characteristic samples

• If you add more strings to a characteristic sample it still is characteristic;

• There can be many different characteristic samples;

• Change the ordering (or the exploring function in RPNI) and the characteristic sample will change.


Conclusion

• RPNI identifies any regular language in the limit;

• RPNI works in polynomial time: its complexity is in O(‖X+‖³ · ‖X-‖);

• There are many significant variants of RPNI;

• RPNI can be extended to other classes of grammars.


Open problems

• RPNI’s complexity is not a tight upper bound. Find the correct complexity.

• The definition of the characteristic set is not tight either. Find a better definition.


Algorithms: RPNI · K-Reversible · GRIDS · SEQUITUR · L*


4.2 The k-reversible languages

• The class was proposed by Angluin (1982).

• The class is identifiable in the limit from text.

• The class is composed of the regular languages that can be accepted by a DFA whose reversal is deterministic with a look-ahead of k.


Let A = (Σ, Q, δ, I, F) be an NFA. We denote by Aᵀ = (Σ, Q, δᵀ, F, I) the reversal automaton, with:

δᵀ(q, a) = {q' ∈ Q : q ∈ δ(q', a)}


[Figure: a five-state automaton A over {a, b} and its reversal Aᵀ.]


Some definitions

• u is a k-successor of q if |u| = k and δ(q, u) ≠ ∅.

• u is a k-predecessor of q if |u| = k and δᵀ(q, uᵀ) ≠ ∅.

• λ is a 0-successor and a 0-predecessor of any state.


[Figure: the automaton A, states 0 to 4.]

• aa is a 2-successor of 0 and 1, but not of 3.
• a is a 1-successor of 3.
• aa is a 2-predecessor of 3, but not of 1.


An NFA is deterministic with look-ahead k iff ∀q, q' ∈ Q such that q ≠ q' and (q, q' ∈ I or q, q' ∈ δ(q'', a)):

(u is a k-successor of q) ∧ (v is a k-successor of q') ⇒ u ≠ v


Prohibited:

[Figure: two distinct states 1 and 2, both reached on the same symbol a, sharing a common k-successor u, |u| = k.]


Example

This automaton is not deterministic with look-ahead 1, but it is deterministic with look-ahead 2.

[Figure: the five-state automaton A.]


K-reversible automata

• A is k-reversible if A is deterministic and Aᵀ is deterministic with look-ahead k.

• Example: [Figure: a three-state automaton that is deterministic, and whose reversal is deterministic with look-ahead 1.]


Violation of k-reversibility

• Two states q, q' violate the k-reversibility condition iff
– they violate the deterministic condition: q, q' ∈ δ(q'', a); or
– they violate the look-ahead condition:
• q, q' ∈ F and ∃u ∈ Σᵏ : u is a k-predecessor of both;
• ∃u ∈ Σᵏ : δ(q, a) = δ(q', a) ≠ ∅ and u is a k-predecessor of both q and q'.


Learning k-reversible automata

• Key idea: the order in which the merges are performed does not matter!

• Just merge states that do not comply with the conditions for k-reversibility.


K-RL Algorithm (k-RL)

Data: k, X a sample of a k-reversible language L
A = PTA(X)
while ∃q, q' violating k-reversibility do
  A = merge(A, q, q')

(a sketch for the case k = 0 follows)
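As an executable illustration, here is a sketch of the merging loop for the special case k = 0 (Angluin's zero-reversible learner); the general-k look-ahead test is omitted and the helper names are my own. For k = 0 the violators are simply: two final states, two a-successors of the same state, or two a-predecessors of the same state.

def find(par, x):
    while par[x] != x:
        x = par[x]
    return x

def zero_reversible(X):
    states = sorted({w[:i] for w in X for i in range(len(w) + 1)},
                    key=lambda u: (len(u), u))
    pta = {(u[:-1], u[-1]): u for u in states if u}   # PTA(X)
    par = {s: s for s in states}

    def union(p, q):
        p, q = find(par, p), find(par, q)
        if p == q:
            return False
        par[q] = p
        return True

    finals = sorted(X)
    for w in finals[1:]:          # one final state: A^T must have one start
        union(finals[0], w)
    changed = True
    while changed:                # merge until no violator remains
        changed = False
        fwd, bwd = {}, {}
        for (r, a), s in pta.items():
            r, s = find(par, r), find(par, s)
            if fwd.setdefault((r, a), s) != s:    # A nondeterministic
                changed |= union(fwd[(r, a)], s)
            if bwd.setdefault((s, a), r) != r:    # A^T nondeterministic
                changed |= union(bwd[(s, a)], r)
    delta = {(find(par, r), a): find(par, s) for (r, a), s in pta.items()}
    return delta, {find(par, w) for w in X}

On the sample of the next slide, X = {a, aa, abba, abbbba}, this k = 0 variant collapses the PTA to a two-state automaton recognizing (a + bb)*; the slides run the algorithm with k = 2 instead, which generalizes less.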


Let X={a, aa, abba, abbbba}

[Figure: PTA(X), with states λ, a, aa, ab, abb, abba, abbb, abbbb, abbbba.]

k = 2. Violators, for u = ba.


Let X={a, aa, abba, abbbba}

[Figure: the automaton after the first merge.]

k = 2. Violators, for u = bb.


Let X={a, aa, abba, abbbba}

[Figure: the automaton after the second merge; no violators remain.]

k = 2


Properties (1)

• ∀k ≥ 0, ∀X, k-RL(X) is a k-reversible language.

• L(k-RL(X)) is the smallest k-reversible language that contains X.

• The class of k-reversible languages is identifiable in the limit from text.


Properties (2)

• A regular language L is k-reversible iff:

(u₁v)⁻¹L ∩ (u₂v)⁻¹L ≠ ∅ and |v| = k ⇒ (u₁v)⁻¹L = (u₂v)⁻¹L

(if the residuals of two strings sharing a suffix of length k intersect, then the strings are Nerode-equivalent)


Properties (3)

• L(k-RL(X)) ⊇ L((k+1)-RL(X))

• L(k-TSS(X)) ⊇ L((k-1)-RL(X))


Properties (4)

The time complexity is O(k·‖X‖³).

The space complexity is O(‖X‖).

The algorithm is not incremental.


Properties (5): polynomial aspects

• Polynomial characteristic sets
• Polynomial update time
• But not necessarily a polynomial number of mind changes


Extensions

• Sakakibara built an extension for context-free grammars whose tree language is k-reversible

• Marion & Besombes propose an extension to tree languages.

• Different authors propose to learn these automata and then estimate the probabilities as an alternative to learning stochastic automata.


Exercises

• Construct a language L that is not k-reversible for any k ≥ 0.

• Prove that the class of k-reversible languages is not in TxtEx.

• Run k-RL on X = {aa, aba, abb, abaaba, baaba} for k = 0, 1, 2, 3.


Solution (idea)

• Lₖ = {aⁱ : i ≤ k}

• Then for each k, Lₖ is k-reversible but not (k-1)-reversible.

• And ∪ₖ Lₖ = a*

• So there is an accumulation point…


Algorithms: RPNI · K-Reversible · GRIDS · SEQUITUR · L*


4.4 Active learning: learning DFA from membership and equivalence queries: the L* algorithm


The classes C and H

• sets of examples
• representations of these sets
• the computation of L(x) (and h(x)) must take place in time polynomial in the size of x


Correct learning

A class C is identifiable with a polynomial number of queries of type T if there exists an algorithm that:

1) for every L ∈ C, identifies L with a polynomial number of queries of type T;

2) does each update in time polynomial in the size of the target and in the total size of the counter-examples xᵢ seen so far.


Algorithm L*

• Angluin's papers
• Some talks by Rivest
• Kearns and Vazirani
• Balcázar, Díaz, Gavaldà & Watanabe


Some references

• Learning regular sets from queries and counter-examples, D. Angluin, Information and Computation, 75, 87-106, 1987.

• Queries and concept learning, D. Angluin, Machine Learning, 2, 319-342, 1988.

• Negative results for equivalence queries, D. Angluin, Machine Learning, 5, 121-150, 1990.


The Minimal Adequate Teacher

• You are allowed:
– strong equivalence queries;
– membership queries.


General idea of L*

• find a consistent table (representing a DFA);
• submit it as an equivalence query;
• use the counter-example to update the table;
• submit membership queries to make the table complete;
• iterate (a toy sketch follows).
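Below is a toy but runnable sketch of this loop, with assumptions flagged: the teacher is simulated (membership is a hand-written oracle, and the equivalence query is approximated by exhaustively comparing hypothesis and target up to a bounded length), and the consistency step of the later slides is skipped because this particular run never produces an inconsistent table.

from itertools import product

SIGMA = "ab"

def member(w):
    # Membership oracle; hypothetical target: an even number of a's.
    return w.count("a") % 2 == 0

def lstar(member, max_len=6):
    S, E = [""], [""]

    def row(u):
        return tuple(member(u + e) for e in E)

    while True:
        # Closure: every one-symbol extension must match some row of S.
        while True:
            rows = {row(s) for s in S}
            ext = next((s + a for s in S for a in SIGMA
                        if row(s + a) not in rows), None)
            if ext is None:
                break
            S.append(ext)
        # Build the hypothesis DFA from the closed table.
        reps = {}
        for s in S:
            reps.setdefault(row(s), s)
        start = reps[row("")]
        delta = {(reps[row(s)], a): reps[row(s + a)]
                 for s in S for a in SIGMA}
        finals = {r for r in reps.values() if member(r)}

        def hyp(w):
            q = start
            for a in w:
                q = delta[(q, a)]
            return q in finals

        # "Equivalence query", approximated by testing all short strings.
        cex = next((w for n in range(max_len + 1)
                    for w in ("".join(t) for t in product(SIGMA, repeat=n))
                    if hyp(w) != member(w)), None)
        if cex is None:
            return delta, start, finals
        for i in range(len(cex) + 1):   # add all prefixes of the
            if cex[:i] not in S:        # counter-example to S
                S.append(cex[:i])

delta, start, finals = lstar(member)
print(len({q for q, _ in delta}))  # 2 states for the even-number-of-a's target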


An observation table

[Table: rows are prefixes (λ, a, ab, aab, …), columns are experiments, and each entry is 0 or 1.]


[Table: the same observation table. The rows of the upper part are the states (S) or test set; the rows of the lower part are the transitions (T); the columns are the experiments (E).]


Meaning

The entry at row u and column e is 1 iff δ(q₀, u·e) ∈ F, i.e. iff ue ∈ L. For instance, δ(q₀, λ·λ) ∈ F ⇔ λ ∈ L.


Similarly, the entry at row ab and column a tells whether δ(q₀, ab·a) ∈ F, i.e. whether aba ∈ L.


Equivalent prefixes

These two rows are equal, hence δ(q₀, λ) = δ(q₀, ab).

[Table: the rows of λ and ab coincide.]


Building a DFA from a table

[Table and figure: the observation table and the DFA built from it.]


[Table and figure: the same DFA, with its transitions on a and b drawn from the table.]


Some rules

• The set S is prefix-closed.
• The set E is suffix-closed.
• T = S·Σ \ S.


An incomplete table

[Table: an observation table with missing entries.]


Good idea

We can complete the table by making membership queries...

Membership query: for a row u and a column v, ask whether uv ∈ L.


A table is closed if any row of T corresponds to some row in S.

[Table: a table that is not closed.]


And a table that is not closed

[Table and figure: one row of T (marked ?) corresponds to no row of S, so no transition can be drawn for it.]


What do we do when we have a table that is not closed?

• Let s be the row (of T) that does not appear in S.

• Add s to S, and ∀a ∈ Σ add sa to T.


An inconsistent table

[Table: the rows of a and b are equal, but they behave differently afterwards.]

Are a and b equivalent?


A table is consistent if every pair of equal rows of S remains equal after appending any symbol:

row(s₁) = row(s₂) ⇒ ∀a ∈ Σ, row(s₁a) = row(s₂a)


What do we do when we have an inconsistent table?

Let a be such that row(s₁) = row(s₂) but row(s₁a) ≠ row(s₂a).

• If row(s₁a) ≠ row(s₂a), they differ for some experiment e.

• Then add the experiment ae to the table.


What do we do when we have a closed and consistent table ?

• We build the corresponding DFA

• We make an equivalence query!!!


What do we do if we get a counter-example?

• Let u be this counter-example

• ∀w ∈ Pref(u): add w to S, and ∀a ∈ Σ such that wa ∉ Pref(u), add wa to T


Run of the algorithm

[Table: the initial table, with S = {λ} and E = {λ}; all entries are 1. The table is now closed and consistent.]

[Figure: the one-state hypothesis, with loops on a and b.]


An equivalence query is made!

[Figure: the one-state hypothesis.]

Counter-example baa is returned.


[Table: after adding the prefixes of baa to S; the table is not consistent, because two equal rows disagree after appending a symbol.]


[Table: after adding the experiment a, the table is closed and consistent.]

[Figure: the resulting three-state hypothesis, with states corresponding to the rows λ, ba and baa.]


Proof of the algorithm

Sketch only.

Understanding the proof is important for further algorithms; Balcázar et al. is a good place for that.


Termination / Correctness

• For every regular language there is a unique minimal DFA that recognizes it.

• Given a closed and consistent table, one can generate a consistent DFA.

• A DFA consistent with a table has at least as many states as different rows in S.

• If the algorithm has built a table with n different rows in S, then it is the target.


Finiteness

• Each closure failure adds one different row to S.

• Each inconsistency failure adds one experiment, which also creates a new row in S.

• Each counterexample adds one different row to S.


Polynomial

• |E| ≤ n
• at most n-1 equivalence queries
• at most n(n-1)·m membership queries, where m is the length of the longest counter-example returned by the oracle


Conclusion

• With an MAT you can learn DFA
– but also a variety of other classes of grammars;
– it is difficult to see how powerful an MAT really is;
– probably as powerful as PAC learning.
– It is easy to find a class and a set of queries, and to provide an algorithm that learns with them;
– it is more difficult for this to be meaningful.

• Discussion: why are these queries meaningful?


Algorithms: RPNI · K-Reversible · GRIDS · SEQUITUR · L*


4.5 SEQUITUR

(http://sequence.rutgers.edu/sequitur/) (Nevill-Manning & Witten, 97)

Idea: construct a context-free grammar from a very long string w, such that L(G) = {w}
– No generalization
– Linear time (more or less)
– Good compression rates


Principle

Two invariants of the grammar with respect to the string (a sketch follows):
– each rule has to be used at least twice;
– no pair of adjacent symbols (digram) appears twice.
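A simplified, offline rendering of these two constraints (my own sketch, not Nevill-Manning & Witten's linear-time algorithm): repeatedly pick a digram that occurs twice somewhere in the grammar and replace it by a fresh non-terminal, until digram uniqueness holds. The rule-utility constraint (inlining rules used only once) is omitted for brevity.

from collections import Counter

def digrams(body):
    return [tuple(body[i:i + 2]) for i in range(len(body) - 1)]

def infer_grammar(w):
    rules = {"S": list(w)}                      # each rule body is a symbol list
    names = iter("ABCDEFGHIJKLMNOPQRTUVWXYZ")   # fresh non-terminal names
    while True:
        counts = Counter()
        for body in rules.values():
            counts.update(digrams(body))
        repeated = [dg for dg, c in counts.items() if c >= 2]
        if not repeated:
            return rules                        # digram uniqueness holds
        dg, name = repeated[0], next(names)
        rules[name] = list(dg)
        for lhs, body in list(rules.items()):
            if lhs == name:
                continue
            out, i = [], 0
            while i < len(body):                # replace non-overlapping matches
                if tuple(body[i:i + 2]) == dg:
                    out.append(name)
                    i += 2
                else:
                    out.append(body[i])
                    i += 1
            rules[lhs] = out

for lhs, body in infer_grammar("abcabdabcabd").items():
    print(lhs, "->", "".join(body))

On abcabdabcabd (the string of a later slide), this prints a hierarchy of rules ending in S -> DD, with no digram repeated anywhere in the grammar.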


Examples

S → abcdbc becomes S → aAdA, A → bc

S → aabaaab becomes S → AbAab, A → aa (or S → AaA, A → aab)


abcabdabcabd


In the beginning, God created the heavens and the earth.

And the earth was without form, and void; and darkness was upon the face of the deep. And the Spirit of God moved upon the face of the waters.

And God said, Let there be light: and there was light.

And God saw the light, that it was good: and God divided the light from the darkness.

And God called the light Day, and the darkness he called Night. And the evening and the morning were the first day.

And God said, Let there be a firmament in the midst of the waters, and let it divide the waters from the waters.

And God made the firmament, and divided the waters which were under the firmament from the waters which were above the firmament: and it was so.

And God called the firmament Heaven. And the evening and the morning were the second day.


Sequitur options

• appending a symbol to rule S;
• using an existing rule;
• creating a new rule;
• deleting a rule.


Results

On text:
– SEQUITUR: 2.82 bpc
– compress: 3.46 bpc
– gzip: 3.25 bpc
– PPMC: 2.52 bpc


Algorithms: RPNI · K-Reversible · GRIDS · SEQUITUR · L*


4.6 Using a simplicity bias

(Langley & Stromsten, 00), based on the algorithm GRIDS (Wolff, 82)

Main characteristics:
– MDL principle;
– not characterizable;
– not tested on large benchmarks.


Two learning operators

Creation of non-terminals and rules:

NP → ART ADJ NOUN
NP → ART ADJ ADJ NOUN

becomes

NP → ART AP1
NP → ART ADJ AP1
AP1 → ADJ NOUN


Merging two non-terminals:

NP → ART AP1
NP → ART AP2
AP1 → ADJ NOUN
AP2 → ADJ AP1

becomes

NP → ART AP1
AP1 → ADJ NOUN
AP1 → ADJ AP1


• Scoring function, by the MDL principle: minimize ‖G‖ + Σ_{w∈T} d(w), the size of the grammar plus the cost of deriving the training strings.

• Algorithm (see the sketch below):
– find the best merge that improves the current grammar;
– if no such merge exists, find the best creation;
– halt when there is no improvement.
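A sketch of the score under my reading of the garbled formula: the grammar size is the total number of symbols in the rules, and d(w) is the derivation cost of the training string w (supplied by whatever parser the system uses). Note how a creation alone can increase the grammar term; GRIDS accepts it only when the total score improves.

def grammar_size(rules):
    # rules: list of (lhs, rhs) pairs, e.g. ("NP", ["ART", "AP1"])
    return sum(1 + len(rhs) for lhs, rhs in rules)

def mdl_score(rules, derivation_lengths):
    # Smaller is better: grammar cost plus the cost of encoding the data.
    return grammar_size(rules) + sum(derivation_lengths)

before = [("NP", ["ART", "ADJ", "NOUN"]),
          ("NP", ["ART", "ADJ", "ADJ", "NOUN"])]
after = [("NP", ["ART", "AP1"]),
         ("NP", ["ART", "ADJ", "AP1"]),
         ("AP1", ["ADJ", "NOUN"])]
print(grammar_size(before), grammar_size(after))  # 9 10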


Results

• On subsets of English grammars (15 rules, 8 non-terminals, 9 terminals): 120 sentences to converge

• On (ab)*: all (15) strings of length ≤ 30

• On Dyck1: all (65) strings of length ≤ 12


Algorithms: RPNI · K-Reversible · GRIDS · SEQUITUR · L*


5 Open questions and conclusions

• dealing with noise
• classes of languages that adequately mix the Chomsky hierarchy with edit-distance compactness
• stochastic context-free grammars
• polynomial learning from text
• learning POMDPs
• fast algorithms


ERNESTO SÁBATO, EL TÚNEL 

I sensed that I had fallen into a trap and tried to flee. I made an enormous effort, but it was too late: my body no longer obeyed me. I resigned myself to witnessing what was going to happen, as if it were an event foreign to my own person. The man began to transform me into a bird, a bird of human size. He began with my feet: I saw how they slowly turned into something like a rooster's feet. Then the transformation of the whole body followed, upward, the way water rises in a pond. My only hope now lay in my friends, who inexplicably had not arrived. When they finally came, something happened that horrified me: they did not notice my transformation. They treated me as always, which proved that they saw me as always. Thinking that the magician was deluding them so that they would see me as a normal person, I decided to report what he had done to me. Although my intention was to describe the phenomenon calmly, so as not to worsen the situation by irritating the magician with too violent a reaction (which might lead him to do something even worse), I began to shout out the whole story. Then I observed two astonishing facts: the sentence I wanted to utter came out as a harsh bird's shriek, a desperate and strange shriek, perhaps because of what remained human in it; and, what was infinitely worse, my friends did not hear this shriek, just as they had not seen my great bird's body; on the contrary, they seemed to hear my usual voice saying the usual things, for at no moment did they show the slightest astonishment. I fell silent, terrified. The master of the house then looked at me with a sarcastic gleam in his eyes, almost imperceptible and in any case noticed only by me. Then I understood that nobody, ever, would know that I had been transformed into a bird. I was lost forever, and the secret would go with me to the grave.