

5.1 Performance evaluation

5.1.5 Applying Twenty-Two Attributes

Table (5.32) shows the execution time for each classifier, which reflects the computational cost of building the classification models using nineteen attributes.

Classifier Execution Time (Secs)

ANFIS 12.67

Decision tree 5.45

SVM 5.64

Naïve Bayes 8.14

MLP 13.86

Table 5.32 Execution time for the nineteen-attribute classifiers

As shown in Table (5.32), the execution time of the ANFIS classifier has increased when using nineteen attributes. The MLP classifier, as usual, has the highest execution time at 13.86 s, which is considered a long time for building this model, followed by ANFIS with 12.67 s. The execution times of all classifiers have also increased. Decision tree has the lowest execution time at 5.45 s, which confirms its superior time efficiency over the other classifiers as the number of attributes increases. SVM comes in second place with 5.64 s, and Naïve Bayes third with 8.14 s.
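For context, execution times like those in Table (5.32) can be obtained by timing the model-building step. The sketch below illustrates the idea in Python with a scikit-learn Decision tree on synthetic data; it is only an illustration and does not reproduce the tools or dataset used in this work.

```python
import time

from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in data: 700 records with 19 attributes (shape chosen only to mirror the setup above).
X, y = make_classification(n_samples=700, n_features=19, random_state=0)

clf = DecisionTreeClassifier(random_state=0)
start = time.perf_counter()
clf.fit(X, y)                                  # the model-building step being timed
elapsed = time.perf_counter() - start
print(f"Execution time for building the model: {elapsed:.2f} s")
```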

3- Age.

4- Gender.

5- GPA.

6- Social strata.

7- Math Skills.

8- Time to work.

9- Type of degree.

10- Educational degree.

11- English skills.

12- Completion period.

13- Interpersonal skills.

14- High school grade.

15- Talent.

16- Programming skills.

17- Tech certificate.

18- Experience.

19- Team work skills.

20- Status.

21- Major.

22- No of application.
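As a rough illustration only (not the preprocessing actually used in this work), the sketch below shows how attributes such as those listed above could be declared for model building with scikit-learn; the column names and the split into categorical and numeric attributes are assumptions made for the example.

```python
from sklearn.compose import ColumnTransformer
from sklearn.preprocessing import OneHotEncoder

# Illustrative column names only; they do not come from the actual dataset.
categorical = ["gender", "social_strata", "type_of_degree", "educational_degree",
               "tech_certificate", "status", "major"]
numeric = ["age", "gpa", "math_skills", "time_to_work", "english_skills",
           "completion_period", "interpersonal_skills", "high_school_grade",
           "talent", "programming_skills", "experience", "team_work_skills",
           "no_of_application"]

# One-hot encode the categorical attributes and pass the numeric ones through unchanged.
preprocess = ColumnTransformer(
    transformers=[("categorical", OneHotEncoder(handle_unknown="ignore"), categorical)],
    remainder="passthrough",
)
```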

Class            Employed = yes   Employed = no   Total
Employed = yes   290              10              300
Employed = no    5                395             400
Total            295              405             700

Table 5.33 Confusion matrix of the ANFIS classifier with twenty-two attributes

Accuracy for ANFIS classifier = 640 / 700 = 91 %
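For clarity, the sketch below shows how an accuracy value is obtained from a confusion matrix of this form; the matrix passed in the example is hypothetical and is not taken from the tables in this section.

```python
def accuracy_from_confusion(matrix):
    """matrix[i][j] = number of records of class i assigned to class j (layout as in the tables above)."""
    correct = sum(matrix[i][i] for i in range(len(matrix)))
    total = sum(sum(row) for row in matrix)
    return correct / total

# Hypothetical 2x2 confusion matrix, for illustration only.
print(accuracy_from_confusion([[45, 5], [10, 40]]))  # 85 / 100 = 0.85
```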

Class            Employed = yes   Employed = no   Total
Employed = yes   295              5               300
Employed = no    2                398             400
Total            297              403             700

Table 5.34 Confusion matrix of the Decision Tree classifier with twenty-two attributes

Accuracy for Decision Tree classifier = 650 / 700 = 92 %

Class            Employed = yes   Employed = no   Total
Employed = yes   255              45              300
Employed = no    36               364             400
Total            291              409             700

Table 5.35 Confusion matrix of the SVM classifier with twenty-two attributes

Accuracy for SVM classifier = 574 / 700 = 82 %

Class            Employed = yes   Employed = no   Total
Employed = yes   237              83              300
Employed = no    32               358             400
Total            267              421             700

Table 5.36 Confusion matrix of the Naïve Bayes classifier with twenty-two attributes

Accuracy for Naïve Bayes classifier = 545 / 700 = 77 %

Class            Employed = yes   Employed = no   Total
Employed = yes   278              22              300
Employed = no    21               379             400
Total            299              408             700

Table 5.37 Confusion matrix of the MLP classifier with twenty-two attributes

Accuracy for MLP classifier = 615 / 700 = 87 %

As shown in the tables above, an important change has occurred: the Decision tree classifier beats both the ANFIS and MLP classifiers with an accuracy of 92% when applying the twenty-two attributes, including the "No of application" attribute. The ANFIS classifier comes in second place with an accuracy of 91%. The accuracy of the MLP classifier was not affected and remains at 87%. Fourth place goes to the SVM classifier with an accuracy of 82%, and Naïve Bayes has the lowest accuracy at 77%. As shown in Figure (5.17), the Decision tree and ANFIS classifiers achieved the highest accuracy among the classifiers.

Figure 5.17 Efficiency comparison of the classifiers with twenty-two attributes

Table (5.38) shows the Recall, False-Positive rate, Precision, and F-score values obtained when applying the classification models with twenty-two attributes. The values in the following table are given for both classes, "Employed" and "Not-employed".

Classifier      Class label    Recall (%)   False-Positive rate (%)   Precision (%)   F-score (%)
ANFIS           Employed       91.2         1.1                       89.6            91.3
ANFIS           Not-employed   90.9         1.8                       89.3            90.1
Decision tree   Employed       92.4         0.9                       92.3            92.6
Decision tree   Not-employed   91.7         1.0                       90.1            91.3
SVM             Employed       82.5         3.9                       81.6            82.5
SVM             Not-employed   81.7         3.8                       82.5            81.4
Naïve Bayes     Employed       78.5         4.1                       77.7            77.5
Naïve Bayes     Not-employed   77.4         4.2                       76.1            76.5
MLP             Employed       88.6         1.7                       88.1            88.6
MLP             Not-employed   87.8         2.0                       87.9            87.3

Table 5.38 Detailed accuracy by class for each classifier when applying twenty-two attributes

As shown in Table (5.38), after building the prediction classifiers using twenty-two attributes, the Decision tree classifier has obtained the highest values of Recall, Precision, and F-score, and the lowest False-Positive rate. The accuracy has improved after adding the "No_of_application" attribute to the selected attribute list.
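The per-class measures reported in Table (5.38) follow the standard definitions. The sketch below shows one way to compute them from the counts of a binary confusion matrix; the variable names and the example counts are assumptions made for illustration.

```python
def per_class_measures(tp, fn, fp, tn):
    """tp/fn: records of the class of interest classified correctly/incorrectly;
    fp/tn: records of the other class classified as the class of interest/correctly."""
    recall = tp / (tp + fn)                          # also the true-positive rate
    precision = tp / (tp + fp)
    fp_rate = fp / (fp + tn)                         # false-positive rate
    f_score = 2 * precision * recall / (precision + recall)
    return {"recall": recall, "fp_rate": fp_rate, "precision": precision, "f_score": f_score}

# Hypothetical counts, for illustration only.
print(per_class_measures(tp=280, fn=20, fp=15, tn=385))
```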

Figure 5.18 F-score of the employed and not-employed classes for each classifier when applying twenty-two attributes

As shown in Figure (5.18), the prediction performance has improved after the "No of application" attribute was added to the selected attribute list, which indicates that this attribute carries a good weight.
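The weight of an attribute can be checked with a simple with/without comparison of the kind described above. The sketch below illustrates the idea on synthetic data with a scikit-learn Decision tree standing in for the classifiers used here; it is not the actual experiment reported in this section.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in data: 700 records with 22 attributes.
X, y = make_classification(n_samples=700, n_features=22, n_informative=8, random_state=0)

clf = DecisionTreeClassifier(random_state=0)
acc_with = cross_val_score(clf, X, y, cv=5).mean()             # all attributes
acc_without = cross_val_score(clf, X[:, :-1], y, cv=5).mean()  # same model without the last attribute
print(f"Accuracy with the attribute:    {acc_with:.3f}")
print(f"Accuracy without the attribute: {acc_without:.3f}")
```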


Figure 5.19 False-Positive rate of the employed and not-employed classes for each classifier with twenty-two attributes

As shown in Figure (5.19), the False-Positive rate values of the employed class are lower than those of the not-employed class for all classifiers. This again shows the excellent prediction performance for the "employed" class when applying twenty-two attributes.

Table (5.39) shows the RMSE and Kappa statistic values, which measure the efficiency of the prediction classifiers when using twenty-two attributes, including the "No of application" attribute.

Classifier RMSE Kappa statistic

ANFIS 0.1746 0.9176

Decision tree 0.1638 0.9235

SVM 0.2538 0.8264

Naïve Bayes 0.3125 0.7848

MLP 0.1813 0.8843

Table 5.39 RMSE and Kappa statistic values for each classifier when applying twenty-two attributes
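For reference, the sketch below shows one common way to compute the two measures in Table (5.39) for a binary problem. Data-mining tools often derive RMSE from predicted class probabilities rather than hard labels, so this is a simplified illustration; the variable names and example inputs are assumptions.

```python
import math

def rmse(targets, predictions):
    """Root mean squared error between 0/1 class indicators and predicted probabilities."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(targets, predictions)) / len(targets))

def kappa(tp, fn, fp, tn):
    """Cohen's Kappa statistic from binary confusion-matrix counts."""
    n = tp + fn + fp + tn
    p_observed = (tp + tn) / n                                              # observed agreement
    p_expected = ((tp + fn) * (tp + fp) + (fp + tn) * (fn + tn)) / (n * n)  # chance agreement
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical values, for illustration only.
print(rmse([1, 0, 1, 1], [0.9, 0.2, 0.7, 0.95]))
print(kappa(tp=280, fn=20, fp=15, tn=385))
```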

As shown in Table (5.39), the Kappa and RMSE values have improved after adding the "No of application" attribute. Figure (5.20) shows an efficiency comparison of the classifiers according to their RMSE and Kappa statistic values when applying twenty-two attributes. The table above also shows that the Decision tree has beaten the ANFIS classifier for the first time.

Figure 5.20 An efficiency comparison of the classifiers according to RMSE and Kappa statistic values when applying twenty-two attributes

As shown in Figure (5.20), the efficiency of all classifiers has improved, with the Decision tree classifier performing best.

Table (5.40) shows the execution time for each classifier, which reflects the computational cost of building the classification models using twenty-two attributes.

Classifier Execution Time (Secs)

ANFIS 17.48

Decision tree 7.16

SVM 7.59

Naïve Bayes 10.48

MLP 18.74

Table 5.40 Execution time for the twenty-two-attribute classifiers

As shown in Table (5.40), the execution time of the ANFIS classifier has increased significantly when using twenty-two attributes. The MLP classifier, as usual, has the highest execution time at 18.74 s, which is considered a very long time for building this model, followed by ANFIS with 17.48 s, which is also long. The execution times of all classifiers have increased. Decision tree again has the lowest execution time at 7.16 s, which confirms its superior time efficiency over the other classifiers as the number of attributes increases. SVM comes in second place with 7.59 s, and Naïve Bayes third with 10.48 s.