Explain the decision tree algorithm by using the example of an agent which needs to make a decision about "Whether to wait for a table" in a restaurant.
1 Answer

Decision tree induction is one of the simplest and yet most successful forms of machine learning. We first describe the representation—the hypothesis space—and then show how to learn a good hypothesis.

A decision tree represents a function that takes as input a vector of attribute values and returns a “decision”—a single output value. The input and output values can be discrete or continuous.

For now we will concentrate on problems where the inputs have discrete values and the output has exactly two possible values; this is Boolean classification, where each example input will be classified as true (a positive example) or false (a negative example).

A decision tree reaches its decision by performing a sequence of tests. Each internal node in the tree corresponds to a test of the value of one of the input attributes, Ai, and the branches from the node are labeled with the possible values of the attribute, Ai =vik.

Each leaf node in the tree specifies a value to be returned by the function. The decision tree representation is natural for humans; indeed, many “How To” manuals (e.g., for car repair) are written entirely as a single decision tree stretching over hundreds of pages.

As an example, we will build a decision tree to decide whether to wait for a table at a restaurant. The aim here is to learn a definition for the goal predicate Will Wait.

First we list the attributes that we will consider as part of the input:


Alternate: whether there is a suitable alternative restaurant nearby.

Bar: whether the restaurant has a comfortable bar area to wait in.

Fri/Sat: true on Fridays and Saturdays.

Hungry: whether we are hungry.

Patrons: how many people are in the restaurant (values are None, Some, and Full).

Price: the restaurant’s price range ($, $$, $$$).

Raining: whether it is raining outside.

Reservation: whether we made a reservation.

Type: the kind of restaurant (French, Italian, Thai, or burger).

WaitEstimate: the wait estimated by the host (0–10 minutes, 10–30, 30–60, or >60).

Note that every attribute has a small set of possible values; the value of WaitEstimate, for example, is not an integer but one of the four discrete ranges 0–10, 10–30, 30–60, or >60. The decision tree usually used by one of us (SR) for this domain is shown in Figure 18.2.
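To make the input representation concrete, one training example (a single restaurant situation) can be written as a mapping from attribute names to values. This is only an illustrative sketch; the attribute names follow the list above, and the particular values chosen here are hypothetical:

```python
# A hypothetical example for the WillWait problem: a mapping from each of the
# ten input attributes (listed above) to one of its allowed discrete values.
example = {
    "Alternate": True,
    "Bar": False,
    "Fri/Sat": False,
    "Hungry": True,
    "Patrons": "Full",          # None, Some, or Full
    "Price": "$",               # $, $$, or $$$
    "Raining": False,
    "Reservation": False,
    "Type": "Thai",             # French, Italian, Thai, or burger
    "WaitEstimate": "0-10",     # 0-10, 10-30, 30-60, or >60
}

# Every attribute has a small, fixed set of possible values.
print(len(example))  # 10 input attributes
```

The goal predicate WillWait would be the Boolean label attached to each such example (positive or negative).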

Notice that the tree ignores the Price and Type attributes. Examples are processed by the tree starting at the root and following the appropriate branch until a leaf is reached. For instance, an example with Patrons = Full and WaitEstimate = 0–10 will be classified as positive (i.e., yes, we will wait for a table).
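This root-to-leaf traversal can be sketched in a few lines of Python. The dict-based node format below (attribute name paired with a branch table, Boolean leaves) is an assumption for illustration, and only the Patrons and WaitEstimate tests are shown; the full tree in Figure 18.2 tests further attributes (such as Hungry and Alternate) under the Full branch, which are simplified to fixed leaves here:

```python
# Simplified sketch of the WillWait tree: an internal node is a tuple
# (attribute, {value: subtree}); a leaf is a plain Boolean decision.
will_wait_tree = ("Patrons", {
    "None": False,
    "Some": True,
    "Full": ("WaitEstimate", {
        ">60": False,
        "30-60": True,   # simplified: the real tree tests more attributes here
        "10-30": True,   # simplified
        "0-10": True,
    }),
})

def classify(tree, example):
    """Start at the root and follow branches until a Boolean leaf is reached."""
    while not isinstance(tree, bool):
        attribute, branches = tree
        tree = branches[example[attribute]]
    return tree

# Patrons = Full and WaitEstimate = 0-10 is classified as positive:
print(classify(will_wait_tree, {"Patrons": "Full", "WaitEstimate": "0-10"}))  # True
```

Note that the traversal only ever looks at the attributes the tree actually tests, which is why Price and Type never influence the decision.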

[Figure 18.2: A decision tree for deciding whether to wait for a table. Here T = True, F = False.]

Entropy = $-\sum_{i} P(v_i)\log_2 P(v_i)$

where:

i ranges over the possible values of the output,

$v_i$ = one possible value of the decision (e.g., yes/no),

$P(v_i)$ = the proportion of examples that have value $v_i$.
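The entropy of a distribution (the standard definition uses log base 2 and a leading minus sign, so the result is measured in bits) can be computed directly; a minimal sketch:

```python
import math

def entropy(probabilities):
    """Entropy in bits: -sum p * log2(p), skipping zero-probability terms."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# An even yes/no split is maximally uncertain: 1 bit.
print(entropy([0.5, 0.5]))  # 1.0
# A pure node (all examples positive) has zero entropy.
print(entropy([1.0]))       # 0.0
```

Decision tree learning uses this quantity to pick the attribute whose test yields the largest reduction in entropy (the information gain) at each node.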
