MACHINE LEARNING - IARE


MACHINE LEARNING

VIII Semester: CSE / IT

Course Code: ACS014
Category: Core
Hours / Week: L 3, T -, P -
Credits: 3
Maximum Marks: CIA 30, SEE 70, Total 100

Contact Classes: 45 | Tutorial Classes: Nil | Practical Classes: Nil | Total Classes: 45

I. COURSE OVERVIEW:

The main emphasis of this course is to give systems the ability to learn and improve from experience automatically, without being explicitly programmed. The course covers the fundamental concepts needed to build, train, and make predictions with data models using machine learning (ML) algorithms. It provides a clear understanding of supervised learning through decision trees and advanced techniques such as neural networks, Naïve Bayes, and the k-nearest neighbor algorithm, together with an introduction to unsupervised and reinforcement learning. Machine learning has transformed fields such as medicine, healthcare, manufacturing, and banking, among many others.

II. OBJECTIVES:

The course should enable the students to:

I The fundamental concepts, issues, and challenges of machine learning associated with data and model selection.

II The supervised learning methods, such as decision trees, the Naïve Bayes classifier, and k-nearest neighbor learning, for building data models, along with the basics of unsupervised learning methods.

III The knowledge needed to make predictions or decisions on real-world problems without human intervention.

III. COURSE OUTCOMES:

After successful completion of the course, students should be able to:

CO 1 Demonstrate machine learning concepts with decision trees in data classification for smart and automated applications. (Understand)

CO 2 Make use of support vector machines and multilayer perceptrons to control the learning rate in high-dimensionality data classification. (Apply)

CO 3 Select probabilistic classifiers with Naïve Bayes and graphical models for temporal data classification. (Remember)

CO 4 Outline evolutionary algorithms to solve optimization problems in a stochastic manner in machine learning. (Remember)

CO 5 Utilize data clustering algorithms to perform cluster analysis on large categorical datasets in real-life data mining applications. (Apply)

CO 6 Identify appropriate machine learning techniques and a suitable computing environment for real-time applications. (Remember)

IV. SYLLABUS:

UNIT-I TYPES OF MACHINE LEARNING Classes: 09

Concept learning: introduction, version spaces and the candidate elimination algorithm; Learning with trees: constructing decision trees, CART, classification example.
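As a concrete companion to this unit, the sketch below builds a small decision tree classifier. It assumes scikit-learn and uses the Iris dataset with an entropy splitting criterion purely as illustrative choices; the syllabus itself does not prescribe a library, dataset, or hyperparameters.

# Minimal decision-tree sketch (illustrative only; not part of the syllabus).
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# Entropy-based splits give ID3-style behavior; "gini" would correspond to CART.
tree = DecisionTreeClassifier(criterion="entropy", max_depth=3)
tree.fit(X_train, y_train)
print("Test accuracy:", tree.score(X_test, y_test))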

UNIT-II LINEAR DISCRIMINANTS Classes: 09

The Multi-Layer Perceptron (MLP): going forwards, going backwards, the MLP in practice, deriving back-propagation; Support Vector Machines: optimal separation, kernels.
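To make "going forwards" and "deriving back-propagation" concrete, here is a minimal NumPy sketch of a one-hidden-layer MLP trained on XOR. The layer sizes, learning rate, and epoch count are arbitrary illustrative choices, not part of the syllabus.

# Minimal MLP forward/backward pass sketch (illustrative assumptions throughout).
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(size=(2, 4))   # input -> hidden weights
b1 = np.zeros((1, 4))
W2 = rng.normal(size=(4, 1))   # hidden -> output weights
b2 = np.zeros((1, 1))
lr = 0.5

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(5000):
    # going forwards: compute activations layer by layer
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # going backwards: propagate the error derivative through each layer
    d_out = (out - y) * out * (1 - out)   # squared-error gradient at the output
    d_h = (d_out @ W2.T) * h * (1 - h)    # gradient at the hidden layer

    # gradient-descent weight updates
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0, keepdims=True)

h = sigmoid(X @ W1 + b1)
print(sigmoid(h @ W2 + b2).round(3))   # predictions should approach [0, 1, 1, 0]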


UNIT-III BASIC STATISTICS Classes: 09

Averages, variance and covariance, the Gaussian, the bias-variance tradeoff; Bayesian learning: introduction, Bayes theorem, the Bayes optimal classifier, the naïve Bayes classifier.
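A small worked example of the naïve Bayes decision rule may help: the class posterior is proportional to the class prior times the product of per-feature likelihoods, assuming features are conditionally independent given the class. The priors and word likelihoods below are invented numbers for illustration only.

# Naïve Bayes decision rule sketch with made-up probabilities.
import math

priors = {"spam": 0.4, "ham": 0.6}
likelihoods = {                      # P(word | class) for two observed words
    "spam": {"free": 0.30, "meeting": 0.02},
    "ham":  {"free": 0.01, "meeting": 0.20},
}
observed = ["free", "meeting"]

scores = {c: priors[c] * math.prod(likelihoods[c][w] for w in observed)
          for c in priors}
total = sum(scores.values())
posteriors = {c: s / total for c, s in scores.items()}
print(posteriors)   # normalized P(class | observed words)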

Graphical models: Bayesian networks, approximate inference, making Bayesian networks, hidden Markov models, the forward algorithm.
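The forward algorithm mentioned above can be summarized in a few lines: it computes the probability of an observation sequence by recursively summing over hidden-state paths. The two-state transition, emission, and initial probabilities below are made up for illustration.

# HMM forward algorithm sketch with an invented two-state model.
import numpy as np

A = np.array([[0.7, 0.3],    # state-transition probabilities P(s_t | s_{t-1})
              [0.4, 0.6]])
B = np.array([[0.9, 0.1],    # emission probabilities P(o_t | s_t), two symbols
              [0.2, 0.8]])
pi = np.array([0.6, 0.4])    # initial state distribution

obs = [0, 1, 1, 0]           # an observation sequence (symbol indices)

alpha = pi * B[:, obs[0]]    # alpha_1(i) = pi_i * b_i(o_1)
for o in obs[1:]:
    # alpha_t(j) = [ sum_i alpha_{t-1}(i) * a_ij ] * b_j(o_t)
    alpha = (alpha @ A) * B[:, o]

print("P(observation sequence) =", alpha.sum())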

UNIT-IV EVOLUTIONARY LEARNING Classes: 09

Genetic algorithms, genetic operators; Genetic programming; Ensemble learning: boosting, bagging; Dimensionality reduction: linear discriminant analysis, principal component analysis.
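As an illustration of the genetic operators (selection, crossover, mutation), the sketch below runs a tiny genetic algorithm on the OneMax problem, i.e. maximizing the number of 1-bits in a binary string. Population size, mutation rate, and generation count are arbitrary choices for the example.

# Tiny genetic algorithm sketch on OneMax (illustrative parameters).
import random

random.seed(0)
GENOME_LEN, POP_SIZE, GENERATIONS, MUT_RATE = 20, 30, 40, 0.02

def fitness(genome):
    return sum(genome)                       # OneMax: count the 1s

def tournament(pop):
    a, b = random.sample(pop, 2)             # binary tournament selection
    return a if fitness(a) >= fitness(b) else b

def crossover(p1, p2):
    point = random.randrange(1, GENOME_LEN)  # single-point crossover
    return p1[:point] + p2[point:]

def mutate(genome):
    return [1 - g if random.random() < MUT_RATE else g for g in genome]

population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
              for _ in range(POP_SIZE)]

for _ in range(GENERATIONS):
    population = [mutate(crossover(tournament(population), tournament(population)))
                  for _ in range(POP_SIZE)]

best = max(population, key=fitness)
print("best fitness:", fitness(best), "of", GENOME_LEN)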

UNIT-V CLUSTERING Classes: 09

Similarity and distance measures, outliers, hierarchical methods, partitional algorithms, clustering large databases, clustering with categorical attributes, comparison.
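For a concrete partitional-clustering example, the sketch below runs k-means on synthetic 2-D data and flags points far (by Euclidean distance) from their assigned centroid as possible outliers. It assumes scikit-learn; the cluster count and data parameters are illustrative, not prescribed by the syllabus.

# k-means clustering and a simple distance-based outlier check (illustrative).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

X, _ = make_blobs(n_samples=300, centers=3, cluster_std=0.8, random_state=0)

km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
print("cluster sizes:", np.bincount(km.labels_))
print("centroids:\n", km.cluster_centers_)

# Points unusually far from their centroid are candidate outliers.
dists = np.linalg.norm(X - km.cluster_centers_[km.labels_], axis=1)
print("possible outliers:", int(np.sum(dists > dists.mean() + 3 * dists.std())))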

Text Books:

1. Tom M. Mitchell, "Machine Learning", McGraw Hill, 1st Edition, 2013.

2. Stephen Marsland, "Machine Learning - An Algorithmic Perspective", CRC Press, 1st Edition, 2009.

Reference Books:

1. Margaret H Dunham, "Data Mining", Pearson Education, 2nd Edition, 2006.

2. Galit Shmueli, Nitin R Patel, Peter C Bruce, "Data Mining for Business Intelligence", John Wiley and Sons, 2nd Edition, 2007.

3. Rajjal Shinghal, "Pattern Recognition and Machine Learning", Springer-Verlag, New York, 1st Edition, 2006.

Web References:

1. http://www.udemy.com/MachineLearning/Online_Course

2. https://en.wikipedia.org/wiki/Machine_learning

E-Text Books:

1. http://www.e-booksdirectory.com/details.php?ebook=1118

2. http://www.otexts.org/sfml

Course Home Page:
