Left recursion and left factoring removal technique.

Left Recursion

  • Left Recursion: a production is left-recursive if the leftmost symbol on its right-hand side is the same as the non-terminal on its left-hand side.
  • For example, expr → expr + term. If one were to code this production in a recursive-descent parser, the parser would go into an infinite loop, as sketched below.
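
Below is a tiny sketch of this behaviour (my own illustration, not from the original answer): a naive recursive-descent procedure for expr → expr + term calls itself before consuming any input, so it never terminates.

    # Naive recursive-descent procedure for the left-recursive production
    # expr -> expr + term. The very first action is another call to expr(),
    # with no token consumed, so the recursion never bottoms out.
    def expr():
        expr()                   # left-recursive call made before reading any input
        # match('+'); term()     # never reached

    try:
        expr()
    except RecursionError:
        print("expr -> expr + term: the parser descends forever")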

Elimination of Left Recursion

We eliminate left-recursion in three steps.

  • eliminate ε-productions (so that ε can no longer be derived)
  • eliminate cycles (A ⇒+ A)
  • eliminate left-recursion

Algorithm

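Shown below is a minimal Python sketch of the usual textbook procedure (order the non-terminals, substitute earlier non-terminals into leading positions, then remove direct left recursion). The grammar representation, the helper name eliminate_direct, and the "'" naming convention for new non-terminals are assumptions made for this illustration.

    # Grammar: dict mapping a non-terminal to a list of production bodies;
    # each body is a list of symbols, and the empty list stands for ε.

    def eliminate_direct(grammar, a):
        """Replace A -> A α | β with A -> β A' and A' -> α A' | ε."""
        recursive, others = [], []
        for body in grammar[a]:
            (recursive if body and body[0] == a else others).append(body)
        if not recursive:
            return                                   # no direct left recursion
        a_new = a + "'"
        grammar[a] = [body + [a_new] for body in others]
        grammar[a_new] = [body[1:] + [a_new] for body in recursive] + [[]]  # [] is ε

    def eliminate_left_recursion(grammar, order):
        """Assumes the grammar has no cycles and no ε-productions."""
        for i, ai in enumerate(order):
            for aj in order[:i]:
                expanded = []
                for body in grammar[ai]:
                    if body and body[0] == aj:       # Ai -> Aj γ : substitute Aj's bodies
                        expanded += [gj + body[1:] for gj in grammar[aj]]
                    else:
                        expanded.append(body)
                grammar[ai] = expanded
            eliminate_direct(grammar, ai)            # then remove direct recursion

    g = {"E": [["E", "+", "T"], ["T"]], "T": [["id"]]}
    eliminate_left_recursion(g, ["E", "T"])
    print(g)      # E -> T E',  E' -> + T E' | ε,  T -> id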

Step 1

Direct Recursion

For each rule which contains a left-recursive option,

A --> A α | β

introduce a new nonterminal A' and rewrite the rule as

A --> β A'

A' --> α A' | ε

  • Thus the production:

    E --> E + T | T

    is left-recursive with "E" playing the role of A, "+ T" playing the role of α, and "T" playing the role of β.

  • Introducing the new nonterminal E', the production can be replaced by:

    E --> T E'

    E' --> + T E' | ε

  • Of course, there may be more than one left-recursive part on the right-hand side. The general rule is to replace

    A --> A α1 | A α2 | ... | A αm | β1 | β2 | ... | βn

    by

    A --> β1 A' | β2 A' | ... | βn A'

    A' --> α1 A' | α2 A' | ... | αm A' | ε

  • Note that this may change the structure of the resulting parse: for the expression grammar above, the original grammar is left-associative, while the non-left-recursive one is right-associative. A recursive-descent sketch for the transformed grammar follows this list.
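
The following is a minimal recursive-descent sketch for the transformed grammar E --> T E', E' --> + T E' | ε. Purely to keep the example self-contained, T is restricted here to a single digit; that restriction, and the token handling, are assumptions of this sketch rather than part of the original answer.

    def parse(tokens):
        pos = 0

        def peek():
            return tokens[pos] if pos < len(tokens) else None

        def term():                      # T -> digit (simplifying assumption)
            nonlocal pos
            if peek() is None or not peek().isdigit():
                raise SyntaxError(f"expected a digit at position {pos}")
            pos += 1

        def expr():                      # E -> T E'
            term()
            expr_rest()

        def expr_rest():                 # E' -> + T E' | ε
            nonlocal pos
            if peek() == "+":            # look-ahead '+' selects  + T E'
                pos += 1
                term()
                expr_rest()
            # otherwise take the ε-alternative and simply return

        expr()
        if pos != len(tokens):
            raise SyntaxError("unexpected trailing input")
        return True

    print(parse(list("1+2+3")))          # True: each call consumes input, so parsing terminates
    # With the original E -> E + T, expr() would call expr() again before
    # consuming any token, and the descent would never terminate.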

Step 2

Indirect Recursion

  • Step one describes a rule to eliminate direct left recursion from a production. To eliminate left-recursion from an entire grammar may be more difficult because of indirect left-recursion.
  • For example

    A --> B x y | x

    B --> C D

    C --> A | c

    D --> d

is indirectly recursive because

A ==> B x y ==> C D x y ==> A D x y.

That is, A ==> ... ==> A α, where α is D x y.

The algorithm eliminates left recursion entirely. It contains a "call" to a procedure that eliminates direct left recursion (as described in Step 1).
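
As a worked illustration (assuming the non-terminals are processed in the order A, B, C, D), the trace below shows how the substitutions expose, and then remove, the left recursion hidden in this grammar.

    A and B are processed first and contain no direct left recursion, so they are unchanged.

    For C, substitute the bodies of the earlier non-terminals into its leading positions:

    C --> A | c
    C --> B x y | x | c        (substituting A --> B x y | x)
    C --> C D x y | x | c      (substituting B --> C D)

    C is now directly left-recursive; eliminating it as in Step 1 gives:

    C --> x C' | c C'
    C' --> D x y C' | ε

    The resulting grammar has no left recursion:

    A --> B x y | x
    B --> C D
    C --> x C' | c C'
    C' --> D x y C' | ε
    D --> d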

Left Factoring

Left factoring is another useful grammar transformation used in parsing. It consists of "factoring out" prefixes that are common to two or more productions of the same non-terminal.

For example, going from:

A -> α β | α γ

to:

A -> α A'

A' -> β | γ
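
Below is a minimal Python sketch of one such left-factoring step. The representation of productions, the helper common_prefix, and the "'" naming for the new non-terminal are assumptions made for this illustration; the sketch factors out the longest prefix shared by all alternatives of one non-terminal.

    def common_prefix(bodies):
        """Longest prefix of symbols shared by every production body."""
        prefix = []
        for symbols in zip(*bodies):         # walk the bodies column by column
            if len(set(symbols)) != 1:       # stop at the first disagreement
                break
            prefix.append(symbols[0])
        return prefix

    def left_factor(grammar, a):
        """Rewrite A -> α β | α γ as A -> α A' and A' -> β | γ."""
        bodies = grammar[a]
        prefix = common_prefix(bodies)
        if len(bodies) < 2 or not prefix:
            return                           # nothing to factor out
        a_new = a + "'"
        grammar[a] = [prefix + [a_new]]
        grammar[a_new] = [body[len(prefix):] for body in bodies]   # [] stands for ε

    g = {"A": [["a", "b1"], ["a", "b2"], ["a", "b3"]]}
    left_factor(g, "A")
    print(g)     # A -> a A',  A' -> b1 | b2 | b3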

Left factoring example:

Let the given grammar be: A --> ab1 | ab2 | ab3

1. We can see that every production shares a common prefix, so whichever production we choose, we cannot be sure that we will not need to backtrack.
2. The grammar is non-deterministic: we cannot pick a production and be assured that it will lead to the desired string through the correct parse tree. If we rewrite the grammar so that it is deterministic, yet still able to derive every string the original could, without backtracking, it becomes:

  • A --> aA', A' --> b1 | b2 | b3. Now if we are asked to build the parse tree for the string ab2, we do not need backtracking: once we reach A', we can always choose the correct production, and so we generate the correct parse tree.
  • Left factoring is required to eliminate the non-determinism of a grammar. Suppose a grammar S -> abS | aSb.
  • Here, both alternatives for S begin by deriving the same terminal a (two alternative choices for S), which makes the grammar non-deterministic. We can rewrite the productions to defer the decision for S as:

    S -> aS'

    S' -> bS | Sb

    Thus, S' can be expanded to either bS or Sb, and the choice can be made after the common a has been consumed.

    Difference between Left Factoring and Left Recursion

    1. Left recursion: when one or more productions can be reached from themselves with no tokens consumed in-between.

    2. Left factoring: a transformation that factors out a prefix common to two or more alternatives of the same non-terminal, so that the choice between them can be deferred until the distinguishing input is seen.

    Left factoring is removing the common left factor that appears in two productions of the same non-terminal. It is done to avoid backtracking by the parser. Suppose the parser has a one-token look-ahead; consider this example:

    A -> qB | qC

    where A, B, C are non-terminals and q is a string of terminals. In this case, the parser cannot tell which of the two productions to choose and might have to backtrack. After left factoring, the grammar is converted to:

    A -> qD

    D -> B | C

    In this case, a parser with a look-ahead will always choose the right production (see the parser sketch at the end of this answer).

    Left recursion is the case in which the left-most symbol in a production of a non-terminal is that non-terminal itself (direct left recursion), or rewrites to that non-terminal again through other non-terminal definitions (indirect left recursion).

    Consider these examples -

    (1) A -> Aq (direct)

    (2) A -> Bq, B -> Ar (indirect)

    Left recursion has to be removed if the parser performs top-down parsing.
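
To make the look-ahead argument concrete, here is a small predictive-parser sketch for the factored grammar A -> qD, D -> B | C. The answer leaves B and C abstract, so purely for illustration this sketch assumes B -> b and C -> c; with one token of look-ahead, D commits to the right alternative and never backtracks.

    def parse(tokens):
        pos = 0

        def peek():
            return tokens[pos] if pos < len(tokens) else None

        def match(expected):
            nonlocal pos
            if peek() != expected:
                raise SyntaxError(f"expected {expected!r}, got {peek()!r}")
            pos += 1

        def A():                 # A -> q D
            match("q")
            D()

        def D():                 # D -> B | C, chosen by one token of look-ahead
            if peek() == "b":
                B()
            elif peek() == "c":
                C()
            else:
                raise SyntaxError("expected 'b' or 'c' after 'q'")

        def B():                 # assumed for illustration: B -> b
            match("b")

        def C():                 # assumed for illustration: C -> c
            match("c")

        A()
        return pos == len(tokens)

    print(parse(["q", "c"]))     # True: D chooses C immediately, with no backtracking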
