
5. Consider the following dataset, where the decision attribute is restaurant:

    mealPreference   gender   drinkPreference   restaurant
    hamburger        M        coke              mcdonalds
    fish             M        pepsi             burgerking
    chicken                   coke              mcdonalds
    hamburger                 coke              mcdonalds
    chicken                   pepsi             wendys
    fish             F        coke              burgerking
    chicken          M        pepsi             burgerking
    chicken          F        coke              wendys
    fish                      coke              mcdonalds
    hamburger                 coke              mcdonalds

If we want to build a decision tree for determining restaurant, we must decide which of the three non-decision attributes (mealPreference, gender, or drinkPreference) to use as the root of the tree.

a. Set up the equation to compute what in lecture we called entropyBeforeSplit for restaurant. You do not have to actually solve (i.e., calculate) the terms in the equation; just set up the equation with the appropriate values. (2 pts.)

b. Set up the equation to compute entropy for mealPreference when its value is chicken. That is, a tree with mealPreference at the root would have three branches (one for hamburger, one for chicken, and one for fish), requiring us to compute entropyHamburger, entropyChicken, and entropyFish; here we only want you to set up the equation to compute entropyChicken. You do not have to actually solve (i.e., calculate) the terms in the equation; just set it up using the appropriate values. (2 pts.)

c. Suppose that instead of considering mealPreference to be the root of this decision tree, we had instead considered drinkPreference. Set up the equation to compute information gain for drinkPreference given the variables specified below. (2 pts.)

    entropy before any split:
    entropy for drinkPreference = pepsi:
    entropy for drinkPreference = coke:
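As a check on parts (a)-(c), the three quantities can be computed directly from the class counts in the table above. This is a minimal sketch, assuming the row counts as reconstructed; the helper names `entropy` and `gain_drink` are illustrative, not terms defined in lecture:

```python
from math import log2

def entropy(counts):
    """Shannon entropy (in bits) of a class distribution given raw counts."""
    total = sum(counts)
    return -sum(c / total * log2(c / total) for c in counts if c > 0)

# (a) entropyBeforeSplit: restaurant counts over all 10 rows
#     mcdonalds = 5, burgerking = 3, wendys = 2
entropy_before_split = entropy([5, 3, 2])

# (b) entropyChicken: restaurant counts over the 4 chicken rows
#     mcdonalds = 1, wendys = 2, burgerking = 1
entropy_chicken = entropy([1, 2, 1])  # 1.5 bits exactly

# (c) information gain for drinkPreference: before-split entropy minus the
#     weighted average of the entropies of the coke and pepsi branches
#     coke rows (7):  mcdonalds = 5, burgerking = 1, wendys = 1
#     pepsi rows (3): burgerking = 2, wendys = 1
entropy_coke = entropy([5, 1, 1])
entropy_pepsi = entropy([2, 1])
gain_drink = entropy_before_split - (7/10 * entropy_coke + 3/10 * entropy_pepsi)
```

The same `entropy` helper serves all three parts because each is just the entropy of a restaurant distribution, restricted to a different subset of rows.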