Dimensionality Reduction Using Multi-Layer Feature Selection Combining
the Linear SVM Weight and ReliefF Techniques
Wipawan Buathong 1
Abstract

This research presents a method for reducing the dimensionality of data through feature selection based on multi-layer ranking, using the Linear SVM Weight technique combined with ReliefF, for classifying data with a Support Vector Machine classifier, tested on the SRBCT and usps datasets. The results show that the proposed multi-layer reduction method is more efficient than single-layer feature selection using Linear SVM Weight or ReliefF alone. On the SRBCT dataset the multi-layer method achieved 100 percent classification accuracy while retaining only 8 of the original 2,308 attributes, and on the usps dataset it achieved 95.76 percent accuracy at 55 of the original 256 attributes.
Abstract
This research investigated how high-dimensional data could be downsized using a multi-layer feature selection method. Combining Linear SVM Weight with ReliefF was the proposed technique for data classification, with Support Vector Machines (SVMs) as the classifier. Two datasets, SRBCT and usps, were used for the experiment. We discovered that the proposed technique was more efficient than using either Linear SVM Weight or ReliefF alone for dimensionality reduction. The data could be downsized from 2,308 to 8 attributes while the accuracy rate reached 100 percent on SRBCT. The experimental result on SRBCT was also consistent with that on usps, in which the data could be downsized from 256 to 55 attributes with an accuracy of 95.76 percent.

1 Lecturer, Department of Information Technology, Faculty of Science and Technology, Phuket Rajabhat University

Volume 11, Issue 2, July – December 2015
Keywords: Dimensionality Reduction, Multi-Layer Feature Selection, ReliefF, Linear SVM Weight
Introduction
The problem of high data dimensionality is commonly encountered when preparing data for data mining. Dimensionality reduction has therefore become a standard process in the data preparation step (Tan P. N., Steinbach, M., and Vipin, K., 2005): it makes the data smaller while losing as few of its important characteristics, and as little of the resulting accuracy, as possible, since each attribute does not contribute equally to grouping the data. A good selection technique picks out the attributes that are important and that can represent the larger dataset; when the data are very large, reducing their dimensionality becomes necessary. Dimensionality reduction is generally carried out in one of two ways: (1) reducing the characteristics that describe the data, keeping only the principal attributes, or features, which is called feature selection; and (2) reducing the number of data records, keeping only records that can represent the larger group, for which the popular techniques are sampling and clustering (Mohd Afizi Mohd Shukran, Omar Zakaria, Noorhaniza Wahid, Ahmad Mujahid Ahmad Zaidi).
Several researchers have studied dimensionality reduction through feature selection. Siri, Na Wichian, and Meesad (2010) compared dimensionality reduction using Correlation-Based Feature Selection (CBFS), Gain Ratio, and Information Gain, feeding the selected features to a Support Vector Machine (SVM) classifier; they found that Gain Ratio and Information Gain performed better than CBFS. Similarly, Sriurai, Meesad, and Haruechaiyasak studied document categorization using a topic model combined with feature selection, applying Gain Ratio and Chi Squared to choose suitable features for categorizing documents with SVM and decision-tree classifiers; their results showed that feature selection with Information Gain together with an SVM classifier gave the best performance. Abdolhossein Sarrafzadeh, Habibollah Agh Atabay, Mir Mosen Pedram, and Jamshid Shanbehzadeh (2012) used ReliefF feature selection to reduce the dimensionality of content-based image retrieval data and found that images could be retrieved faster and more accurately. Xin Jin, Rongyan Li, Xian Shen, and Rongfang Bie (2007) studied web mining with a Hidden Naïve Bayes classifier combined with ReliefF feature selection and found that classification accuracy increased.

To date, however, the dimensionality-reduction methods used in these studies consider feature selection only once: a single selection layer applying a single technique, or in other words a single criterion. This can cause important dimensions to be passed over and never selected.
The objective of this research is therefore to present dimensionality reduction through ranked feature selection with ReliefF, with Linear SVM Weight, and with Linear SVM Weight combined with ReliefF, comparing them by classifying the reduced data with a Support Vector Machine. The combined technique addresses the risk that important dimensions are discarded when selection is performed in a single layer using a single criterion: the approach proposed here adds selection layers, applying several reduction techniques in sequence so that the choice of dimensions is examined more thoroughly.
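The layered idea described above can be sketched as follows. This is a minimal sketch, not the authors' implementation; the scoring functions passed in (`layer1_scores`, `layer2_scores`) are hypothetical stand-ins for the Linear SVM Weight and ReliefF scorers described later in the paper.

```python
def multilayer_select(data, labels, layer1_scores, layer2_scores,
                      keep1=100, keep2=10):
    """Two-layer feature selection: rank all attributes with the first
    scorer, keep the top `keep1`, then re-rank the survivors with the
    second scorer and keep the top `keep2`."""
    n_attr = len(data[0])
    s1 = layer1_scores(data, labels)            # layer 1, e.g. |linear-SVM weights|
    layer1 = sorted(range(n_attr), key=lambda j: s1[j], reverse=True)[:keep1]
    reduced = [[row[j] for j in layer1] for row in data]
    s2 = layer2_scores(reduced, labels)         # layer 2, e.g. ReliefF weights
    order = sorted(range(len(layer1)), key=lambda j: s2[j], reverse=True)
    return [layer1[j] for j in order[:keep2]]   # indices into the original attributes
```

Because the second scorer only ever sees the survivors of the first layer, each extra layer refines, rather than replaces, the earlier selection.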
Related Work and Theory

Support Vector Machine
A Support Vector Machine (Hsu, C.-W., Chang, C.-C., & Lin, C.-J., 2010) is a classifier based on finding the widest separating boundary. It solves two-class classification problems by finding a decision boundary that splits the data into two groups with a straight line, choosing the best separator by learning from the statistics of the data. It works by finding the decision hyperplane whose distance to the data, the margin, is as large as possible (maximum margin). To separate training data, the data are mapped from the input space into a feature space (Figure 1), and a similarity function called a kernel function is defined on that feature space. The objective is to minimize the classification error (minimize error) while at the same time maximizing the separating margin (maximized margin); this differs from methods such as Artificial Neural Networks (ANN), which aim only to minimize the classification error. This property makes SVMs well suited to data with many dimensions.

When the groups of data cannot be separated by a linear SVM, because instances of the different groups lie scattered among one another, a tool is needed to re-arrange the data in a new space, called a higher-dimension space; separating groups of data with a hyperplane in many dimensions is more effective than the usual approach, and this mapping is achieved with a kernel function.

SVMs are therefore a technique for building machine-learning models that classify data, and they have become popular because they give high classification accuracy, as seen in many studies (Huang, C.-L., Chen, M.-C., & Wang, C.-J., 2007; Xuehua, L., & Lan, S., 2008; Sweilam, N. H., Tharwat, A. A., & Abdel Moniem, N. K., 2010; Chen, H.-L., Yang, B., Liu, J., & Liu, D.-Y., 2011).

Figure 1: Separating two groups of data with a linear SVM
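As a small illustration of the kernel idea, the following computes the Gaussian (RBF) kernel, the kernel also used in the experiments later in this paper. The `gamma` value here is an arbitrary example, not a value taken from the paper.

```python
import math

def rbf_kernel(x, z, gamma=0.5):
    """Gaussian (RBF) kernel: k(x, z) = exp(-gamma * ||x - z||^2).
    Gives the inner product of x and z in the implicit higher-dimension
    feature space without ever constructing that space explicitly."""
    sq_dist = sum((xi - zi) ** 2 for xi, zi in zip(x, z))
    return math.exp(-gamma * sq_dist)

print(rbf_kernel([1.0, 2.0], [1.0, 2.0]))   # identical points: 1.0
print(rbf_kernel([0.0, 0.0], [3.0, 4.0]))   # distant points: close to 0
```

The kernel trick is exactly this shortcut: the SVM only ever needs such inner products, so the high-dimensional mapping never has to be computed.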
Linear SVM Weight

Linear SVM Weight is a feature-selection method that can be applied directly together with a classifier (Guyon, I., Weston, J., Barnhill, S., & Vapnik, V., 2002). Its algorithm is as follows.
Figure 2: The Linear SVM Weight algorithm

Algorithm 1: Feature Ranking Based on Linear SVM Weights
Input: Training sets (xi, yi), i = 1, ..., m.
Output: Sorted feature ranking list.
1. Use grid search to find the best parameter C.
2. Train an L2-loss linear SVM model using the best C.
3. Sort the features according to the absolute values of the weights in the model.
From Figure 2, the algorithm can be read as follows. Its input is the training data and its output is the ranked list of features. The steps are: 1) search a grid of candidate values to find the best penalty parameter C; 2) train a model on the data with an L2-loss linear SVM using the best C from step 1; and 3) rank the features by the absolute values of the weights obtained (Yin-Wen Chang, Chih-Jen Lin, 2008).
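Step 3 of Algorithm 1 can be sketched as follows; the weight vector used here is a hypothetical stand-in for the output of step 2 (the trained L2-loss linear SVM).

```python
def rank_features_by_weight(weights):
    """Step 3 of Algorithm 1: sort feature indices by the absolute
    value of their linear-SVM weights, largest first."""
    return sorted(range(len(weights)),
                  key=lambda j: abs(weights[j]), reverse=True)

# Hypothetical weight vector of a trained L2-loss linear SVM (step 2).
w = [0.1, -2.3, 0.7, 1.5, -0.2]
print(rank_features_by_weight(w))  # [1, 3, 2, 4, 0]
```

The sign of a weight only indicates which class the feature pushes toward; it is the magnitude that measures how much the feature influences the decision boundary, which is why the ranking uses absolute values.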
ReliefF

ReliefF was developed from Relief, which has two limitations: it can rank attributes only for two-class data, and it handles only discrete (nominal) and numeric attributes. ReliefF, by contrast, is a feature-selection method that can rank attributes for data with more than two classes, works with noisy data, and is robust to incomplete and inaccurate data (A. Sarrafzadeh, H. A. Atabay, M. M. Pedram, and J. Shanbehzadeh, 2012). The ReliefF algorithm, introduced by Kononenko, Igor and Edvard Simec, weights attributes by examining the instances that lie near a randomly sampled instance; a k-nearest-neighbour search is used to find the k nearest instances of the same class and, for each other class, the k nearest instances of that class. The algorithm is as follows.
Figure 3: The ReliefF algorithm

Algorithm ReliefF
Input: for each training instance a vector of attribute values and the class value
Output: the vector W of estimations of the qualities of attributes
1. set all weights W[A] := 0.0;
2. for i := 1 to m do begin
3.   randomly select an instance Ri;
4.   find k nearest hits Hj;
5.   for each class C ≠ class(Ri) do
6.     from class C find k nearest misses Mj(C);
7.   for A := 1 to a do
8.     W[A] := W[A] − Σ(j=1..k) diff(A, Ri, Hj) / (m·k)
                   + Σ(C ≠ class(Ri)) [ P(C) / (1 − P(class(Ri))) · Σ(j=1..k) diff(A, Ri, Mj(C)) ] / (m·k);
9. end;
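The algorithm above can be sketched in Python as follows. This is a minimal sketch for numeric attributes only, with an exhaustive nearest-neighbour search in place of an indexed K-NN, not an optimized or official implementation.

```python
import random

def diff(a, x, y, lo, hi):
    """Normalised difference between instances x and y on numeric attribute a."""
    span = hi[a] - lo[a]
    return abs(x[a] - y[a]) / span if span else 0.0

def relieff(data, labels, m=20, k=3, seed=0):
    """Minimal ReliefF for numeric attributes.
    Returns W, one quality estimate per attribute (higher = more relevant)."""
    rng = random.Random(seed)
    n, n_attr = len(data), len(data[0])
    classes = sorted(set(labels))
    prior = {c: labels.count(c) / n for c in classes}   # class priors P(C)
    lo = [min(row[a] for row in data) for a in range(n_attr)]
    hi = [max(row[a] for row in data) for a in range(n_attr)]
    W = [0.0] * n_attr
    for _ in range(m):
        i = rng.randrange(n)                  # line 3: random instance Ri
        R, cR = data[i], labels[i]

        def nearest(cls):                     # k nearest neighbours of class cls
            cand = [j for j in range(n) if labels[j] == cls and j != i]
            cand.sort(key=lambda j: sum((data[j][a] - R[a]) ** 2
                                        for a in range(n_attr)))
            return cand[:k]

        hits = nearest(cR)                    # line 4: nearest hits Hj
        for c in classes:                     # lines 5-6: nearest misses Mj(C)
            if c == cR:
                continue
            w_c = prior[c] / (1.0 - prior[cR])
            for a in range(n_attr):           # line 8: reward separation of classes
                for j in nearest(c):
                    W[a] += w_c * diff(a, R, data[j], lo, hi) / (m * k)
        for a in range(n_attr):               # line 8: penalise spread within a class
            for j in hits:
                W[a] -= diff(a, R, data[j], lo, hi) / (m * k)
    return W
```

An attribute that differs little between same-class neighbours but strongly between classes accumulates a high weight, which is what makes the scores usable for ranking and selection.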
Research Methodology

The study was carried out in three steps, each of which is described in detail below.

Figure 4: Steps of the research procedure

Dataset selection

This study used two datasets from http://orange.biolab.si/datasets.php: SRBCT, which has 4 classes, and usps, which has 10 classes. The datasets were chosen to have no missing values and more than 100 attributes, so that the data are appropriate for dimensionality reduction, as shown in Table 1.
(Figure 4 depicts the pipeline: the SRBCT and usps datasets are reduced with ReliefF, Linear SVM Weight, or Linear SVM Weight combined with ReliefF, then classified with an SVM and evaluated.)
Table 1: Datasets used in the experiment

Dataset                               Attributes    Classes
SRBCT (lymphoma cancer data)          2,308         4
usps (handwritten digit data)         256           10
Tools and techniques used for dimensionality reduction

The Orange Canvas software, version 2.72, was used to reduce the dimensionality of the data with the feature-selection methods ReliefF, Linear SVM Weight, and Linear SVM Weight combined with ReliefF.

The classification technique was the Support Vector Machine (SVM) with an RBF kernel function; the reduced data were used to build the machine-learning model.
For the Linear SVM Weight combined with ReliefF technique, the reduction is layered: the system first selects features with Linear SVM Weight, and in the next layer applies the ReliefF technique.

Measurement and evaluation
Measurement and evaluation covers comparing the performance of the models built for each dimensionality-reduction method. This research used 10-fold cross-validation to test the performance of the learning model: the dataset is split into k equal parts, k − 1 parts are used to build the model, and the remaining part is used to test its accuracy; this is repeated until every part has been used for testing, and the accuracy or error of the rounds is then averaged to give the overall estimate of learning performance. Model performance was evaluated with precision, recall, F-measure, and overall accuracy; the per-round values are averaged to give the model's mean scores, per Equations 1 to 4.
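The splitting scheme behind the cross-validation described above can be sketched as follows (k = 10 in the paper's experiments; the round-robin assignment of indices to folds is an implementation choice of this sketch):

```python
def k_fold_indices(n, k=10):
    """Yield (train, test) index lists: each of the k folds serves once
    as the test set while the other k - 1 folds form the training set."""
    folds = [list(range(n))[i::k] for i in range(k)]  # round-robin partition
    for t in range(k):
        test = folds[t]
        train = [j for f in range(k) if f != t for j in folds[f]]
        yield train, test
```

Every instance appears in exactly one test fold, so averaging the per-round scores uses each instance for testing exactly once.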
Precision = (Precision(1) + ... + Precision(N)) / N                  (1)

Recall = (Recall(1) + ... + Recall(N)) / N                           (2)

F-measure = 2 x (Recall x Precision) / (Recall + Precision)          (3)

Accuracy = (TP + TN) / All Data                                      (4)

where N is the number of classes,
TP is the count of true positives, and TN is the count of true negatives.
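Equations 1 to 4 can be sketched directly; the accuracy function below assumes the binary case, where TP and TN have their usual meaning.

```python
def macro_average(per_class):
    """Equations 1 and 2: average per-class precision or recall over N classes."""
    return sum(per_class) / len(per_class)

def f_measure(precision, recall):
    """Equation 3: harmonic mean of precision and recall."""
    return 2 * (recall * precision) / (recall + precision)

def accuracy(tp, tn, total):
    """Equation 4 (binary case): fraction of all instances classified correctly."""
    return (tp + tn) / total

print(accuracy(45, 50, 100))   # 0.95
print(f_measure(0.8, 0.8))     # ~ 0.8
```

Under 10-fold cross-validation each of these is computed per round and the ten values are averaged, exactly as the text above describes.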
Results

In this research the results of dimensionality reduction are reported for reductions that retain at most 60 attributes, following several earlier studies which found that the most effective range for reduction lies between 20 and 60 attributes. In the experiment on the SRBCT dataset, multi-layer feature selection, that is, Linear SVM Weight combined with ReliefF, gave the highest classification accuracy of 100 percent at 8 attributes, whereas single-layer selection with Linear SVM Weight first reached 100 percent accuracy at 11 attributes, and single-layer selection with ReliefF reached its best accuracy of 98.89 percent at 31 attributes. On the usps dataset the multi-layer selection likewise gave better classification accuracy than single-layer selection, reaching 95.76 percent at 55 of the full 256 attributes, as shown in Figures 5 and 6.
Figure 5: Feature selection results on the SRBCT dataset

Figure 6: Feature selection results on the usps dataset
Conclusion

From the experimental results of reducing the dimensionality of datasets with many classes, it can be concluded that multi-layer feature selection using the Linear SVM Weight technique combined with ReliefF gives better classification accuracy than single-layer selection using Linear SVM Weight or ReliefF alone. The multi-layer selection also handles large datasets well, since its computation time is modest, and it can be applied to data in medicine, geographic information, and other fields with very large data volumes.

References
Siri, P., Na Wichian, S., & Meesad, P. (2010). Classification of leukemia using ranking methods combined with the support vector machine technique. KKU Research Journal, 10-17. (in Thai)
Sriurai, W., Meesad, P., & Haruechaiyasak, C. Preparing features based on a topic model for document categorization. In The 5th National Conference on Computing and Information Technology. Bangkok. (in Thai)
Abdolhossein Sarrafzadeh, Habibollah Agh Atabay, Mir Mosen Pedram, & Jamshid Shanbehzadeh. (2012). ReliefF Based Feature Selection in Content-Based Image Retrieval. Proceedings of the International MultiConference of Engineers and Computer Scientists. Hong Kong.
Chen, H.-L., Yang, B., Liu, J., & Liu, D.-Y. (2011). A support vector machine classifier with rough set-based feature selection for breast cancer diagnosis. Expert Systems with Applications, 9014-9022.
Guyon, I., Weston, J., Barnhill, S., & Vapnik, V. (2002). Gene selection for cancer classification using support vector machines. Machine Learning, 389-422.
Hsu, C.-W., Chang, C.-C., & Lin, C.-J. (2010). A practical guide to support vector classification. Taiwan: Department of Computer Science and Information Engineering, National Taiwan University.
Huang, C.-L., Chen, M.-C., & Wang, C.-J. (2007). Credit scoring with a data mining approach based on support vector machines. Expert Systems with Applications, 847-856.
Kononenko, I., & Simec, E. (1995). Induction of decision trees using RELIEFF. Mathematical and Statistical Methods in Artificial Intelligence, 363-375.
Mohd Afizi Mohd Shukran, Omar Zakaria, Noorhaniza Wahid, & Ahmad Mujahid Ahmad Zaidi. A Classification Method for Data Mining Using SVM-Weight and Euclidean Distance. Australian Journal of Basic and Applied Sciences, 2053-2059.
Sweilam, N. H., Tharwat, A. A., & Abdel Moniem, N. K. (2010). Support vector machine for diagnosis cancer disease: A comparative study. Egyptian Informatics Journal, 81-92.
Tan, P.-N., Steinbach, M., & Vipin, K. (2005). Introduction to Data Mining. United States of America: Addison Wesley.
Xin Jin, Rongyan Li, Xian Shen, & Rongfang Bie. (2007). Automatic Web Pages Categorization with ReliefF and Hidden Naïve Bayes. ACM (Association for Computing Machinery), 617-621.
Xuehua, L., & Lan, S. (2008). Fuzzy Theory Based Support Vector Machine Classifier. Fuzzy Systems and Knowledge Discovery.
Yin-Wen Chang, & Chih-Jen Lin. (2008). Feature Ranking Using Linear SVM. Workshop and Conference Proceedings (pp. 53-64).