Information Technology (Semester 7)
Total marks: 80
Total time: 3 Hours
INSTRUCTIONS
(1) Question 1 is compulsory.
(2) Attempt any three from the remaining questions.
(3) Draw neat diagrams wherever necessary.
1.a.
Explain k-fold cross-validation with an example.
(5 marks)
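As a concrete illustration of what k-fold cross-validation does, here is a minimal sketch in plain Python (the sequential fold assignment and the 10-sample / 5-fold sizes are illustrative choices, not part of the question):

```python
# k-fold cross-validation: split the data into k folds; each fold serves
# once as the validation set while the remaining k-1 folds form the
# training set; the k validation scores are then averaged.

def k_fold_splits(n_samples, k):
    """Yield (train_indices, val_indices) pairs for k sequential folds."""
    fold_sizes = [n_samples // k + (1 if i < n_samples % k else 0)
                  for i in range(k)]
    start = 0
    for size in fold_sizes:
        val = list(range(start, start + size))
        train = list(range(0, start)) + list(range(start + size, n_samples))
        yield train, val
        start += size

# Example: 10 samples, 5 folds -> each validation fold holds 2 samples.
splits = list(k_fold_splits(10, 5))
```

In practice the data are shuffled before folding; the sequential split above only shows the partitioning logic.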
1.b.
Write a short note on the Vapnik–Chervonenkis (VC) dimension.
(5 marks)
Or
2.a.
Explain two methods for reducing dimensionality.
(5 marks)
2.b.
Write a short note on the Gram matrix, with an example.
(5 marks)
3.a.
What are margins in support vector machines? Also explain the soft margin.
(5 marks)
3.b.
Explain the bias–variance dilemma.
(5 marks)
Or
4.a.
Explain predictive and descriptive tasks.
(5 marks)
4.b.
Explain the perceptron training algorithm for linear classification.
(5 marks)
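A minimal sketch of the perceptron training rule in Python (the AND-style toy data, learning rate, and epoch limit are illustrative assumptions, not part of the question):

```python
# Perceptron training rule: for each misclassified sample x with label
# y in {-1, +1}, update w <- w + eta * y * x (the bias is folded in as
# an extra constant input). Repeat until no mistakes or max epochs.

def train_perceptron(X, y, eta=1.0, epochs=100):
    # X: list of feature lists; y: labels in {-1, +1}
    w = [0.0] * (len(X[0]) + 1)          # last weight is the bias
    for _ in range(epochs):
        mistakes = 0
        for xi, yi in zip(X, y):
            xi_b = xi + [1.0]            # append constant bias input
            activation = sum(wj * xj for wj, xj in zip(w, xi_b))
            pred = 1 if activation >= 0 else -1
            if pred != yi:               # update only on mistakes
                w = [wj + eta * yi * xj for wj, xj in zip(w, xi_b)]
                mistakes += 1
        if mistakes == 0:                # converged: data separated
            break
    return w

# Linearly separable toy data (AND-like): class +1 iff both inputs are 1.
X = [[0, 0], [0, 1], [1, 0], [1, 1]]
y = [-1, -1, -1, 1]
w = train_perceptron(X, y)
```

Because the toy data are linearly separable, the perceptron convergence theorem guarantees the loop terminates with a separating weight vector.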
5.a.
Consider the following five data points:
(0,3), (3,3), (3,0), (-2,-4), and (-4,-2)
Clusters are formed as follows:
Case 1:
A. First two points together in one cluster.
B. Remaining three in another cluster.
Case 2:
A. First three points together in one cluster.
B. Remaining two in another cluster.
Find out:
i) Within-cluster scatter for both cases.
ii) Between-cluster scatter for both cases.
iii) Comment on which clustering produces tighter clusters whose centroids are farther apart.
(12 marks)
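The quantities asked for can be checked numerically. Below is a sketch in Python, assuming within-cluster scatter means the sum of squared Euclidean distances of each point to its cluster centroid, and between-cluster scatter means the squared distance between the two centroids (one common convention; the question does not fix one):

```python
# Scatter computations for the two clusterings of the five data points.

def centroid(pts):
    n = len(pts)
    return (sum(p[0] for p in pts) / n, sum(p[1] for p in pts) / n)

def within_scatter(pts):
    # Sum of squared distances of points to their cluster centroid.
    c = centroid(pts)
    return sum((p[0] - c[0]) ** 2 + (p[1] - c[1]) ** 2 for p in pts)

def between_scatter(pts_a, pts_b):
    # Squared distance between the two cluster centroids.
    ca, cb = centroid(pts_a), centroid(pts_b)
    return (ca[0] - cb[0]) ** 2 + (ca[1] - cb[1]) ** 2

points = [(0, 3), (3, 3), (3, 0), (-2, -4), (-4, -2)]

# Case 1: first two points vs. remaining three.
case1 = (within_scatter(points[:2]) + within_scatter(points[2:]),
         between_scatter(points[:2], points[2:]))
# Case 2: first three points vs. remaining two.
case2 = (within_scatter(points[:3]) + within_scatter(points[3:]),
         between_scatter(points[:3], points[3:]))
```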
5.b.
Define and explain the following terms.
i) Minority Class.
ii) Gini Index.
iii) Entropy.
(6 marks)
Or
6.a.
Find all association rules in the following database with minimum support = 2 and minimum confidence = 65%.
(10 marks)
6.b.
Consider the following splits over four features:
Length = [3, 4, 5]: [2+, 0-] [1+, 3-] [2+, 2-]
Gills = [Yes, No]: [0+, 4-] [5+, 1-]
Beak = [Yes, No]: [5+, 3-] [0+, 2-]
Teeth = [Many, Few]: [3+, 4-] [2+, 1-]
Find the total weighted entropy and Gini index for all features.
(8 marks)
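A numerical sketch of the computation in Python, assuming "weighted" means each child's entropy or Gini index weighted by the fraction of the 10 samples reaching it (the counts below are the (positive, negative) pairs from the question):

```python
import math

# Weighted impurity of a split: each child's impurity weighted by the
# fraction of samples reaching it.

def entropy(p, n):
    total = p + n
    e = 0.0
    for c in (p, n):
        if c:                          # 0 * log(0) is taken as 0
            frac = c / total
            e -= frac * math.log2(frac)
    return e

def gini(p, n):
    # Two-class Gini index: 2*p*q = 1 - p^2 - q^2.
    total = p + n
    return 2 * (p / total) * (n / total)

def weighted(children, impurity):
    total = sum(p + n for p, n in children)
    return sum((p + n) / total * impurity(p, n) for p, n in children)

splits = {
    "Length": [(2, 0), (1, 3), (2, 2)],
    "Gills":  [(0, 4), (5, 1)],
    "Beak":   [(5, 3), (0, 2)],
    "Teeth":  [(3, 4), (2, 1)],
}
weighted_entropy = {f: weighted(c, entropy) for f, c in splits.items()}
weighted_gini = {f: weighted(c, gini) for f, c in splits.items()}
```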
7.a.
Define Bayes' rule and solve the following example.
Example: In a city, 5% of people have cancer and 10% of people are smokers. Among people with cancer, 20% are smokers. Find the probability that a smoker has cancer.
(8 marks)
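A sketch of the computation in Python, under the reading (an assumption) that "20% of people with cancer" refers to smokers among cancer patients, i.e. P(smoker | cancer) = 0.20:

```python
# Bayes' rule: P(C|S) = P(S|C) * P(C) / P(S).
p_cancer = 0.05                 # P(C): 5% of the city has cancer
p_smoker = 0.10                 # P(S): 10% of the city smokes
p_smoker_given_cancer = 0.20    # P(S|C): assumed reading of the question

p_cancer_given_smoker = p_smoker_given_cancer * p_cancer / p_smoker
```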
7.b.
Define the following:
i) Bernoulli distribution
ii) Binomial distribution
iii) MAP decision rule
iv) Maximum likelihood function
(8 marks)
Or
8.a.
For the given dataset, apply the Naïve Bayes algorithm and predict the outcome for car = {Red, Domestic, SUV}.
(8 marks)
8.b.
Write a short note on Gaussian mixture models (GMM).
(8 marks)
9.a.
Write a short note on feed-forward neural networks.
(8 marks)
9.b.
Write a short note on ensemble learning.
(8 marks)
Or
10.a.
Why do we use a non-linearity function? State and explain three types of neurons that add non-linearity to their computations.
(8 marks)
10.b.
Write a short note on reinforcement learning.
(8 marks)