This article shows how to compute the information gain used for feature selection in Python; it may serve as a useful reference for readers working on the topic.

The implementation, written in Python with NumPy, computes the information gain of each feature and handles both continuous attributes and binary discrete attributes in the same feature matrix.
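As a quick reminder of the underlying formula: the information gain of a threshold split is the entropy of the labels minus the weighted entropy of the two subsets the threshold produces. Below is a minimal sketch of that computation for one hand-picked split; the entropy helper and the variable names are illustrative only and are not part of the class that follows.

import math

def entropy(labels):
    # H = -sum(p * ln(p)) over the distinct labels (natural log, as in the class below)
    n = len(labels)
    return -sum((labels.count(c) / n) * math.log(labels.count(c) / n) for c in set(labels))

y = [0, 0, 1]
lower, higher = [1], [0, 0]   # labels falling on each side of one candidate threshold
weighted = len(lower) / len(y) * entropy(lower) + len(higher) / len(y) * entropy(higher)
gain = entropy(y) - weighted  # information gain of this particular split
print(gain)                   # the split is pure here, so the gain equals entropy(y)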
Code:
import numpy as np
import math


class IG():
    def __init__(self, X, y):
        X = np.array(X)
        n_feature = np.shape(X)[1]
        n_y = len(y)

        # Entropy of the label vector y (natural log, which matches the output below)
        orig_H = 0
        for i in set(y):
            orig_H += -(y.count(i) / n_y) * math.log(y.count(i) / n_y)

        condi_H_list = []
        for i in range(n_feature):
            feature = X[:, i]
            # Candidate thresholds: midpoints of adjacent values in the sorted feature column
            sorted_feature = sorted(feature)
            threshold = [(sorted_feature[inde - 1] + sorted_feature[inde]) / 2
                         for inde in range(len(feature)) if inde != 0]
            thre_set = set(threshold)
            # Drop degenerate thresholds equal to the minimum or maximum feature value
            if float(max(feature)) in thre_set:
                thre_set.remove(float(max(feature)))
            if min(feature) in thre_set:
                thre_set.remove(min(feature))

            pre_H = 0
            for thre in thre_set:
                # Split the labels by the threshold (strict comparisons on both sides)
                lower = [y[s] for s in range(len(feature)) if feature[s] < thre]
                higher = [y[s] for s in range(len(feature)) if feature[s] > thre]
                # Entropy of each side of the split
                H_l = 0
                for l in set(lower):
                    H_l += -(lower.count(l) / len(lower)) * math.log(lower.count(l) / len(lower))
                H_h = 0
                for h in set(higher):
                    H_h += -(higher.count(h) / len(higher)) * math.log(higher.count(h) / len(higher))
                # Weighted conditional entropy, then the information gain for this threshold
                temp_condi_H = len(lower) / n_y * H_l + len(higher) / n_y * H_h
                condi_H = orig_H - temp_condi_H
                # Keep the best gain over all candidate thresholds for this feature
                pre_H = max(pre_H, condi_H)
            condi_H_list.append(pre_H)
        self.IG = condi_H_list

    def getIG(self):
        return self.IG


if __name__ == "__main__":
    X = [[1, 0, 0, 1],
         [0, 1, 1, 1],
         [0, 0, 1, 0]]
    y = [0, 0, 1]
    print(IG(X, y).getIG())
The output is:
[0.17441604792151594, 0.17441604792151594, 0.17441604792151594, 0.6365141682948128]
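The fourth feature separates the two label classes perfectly, so its gain equals the full label entropy of y (about 0.6365 with the natural log used above); each of the other three columns leaves one mixed subset after its best split, which gives the smaller value. Since the class is also meant to handle continuous attributes, here is a small usage sketch with a made-up continuous column; the data below is purely illustrative and not part of the original example.

# One continuous column and one binary column (values are made up for illustration)
X2 = [[2.5, 1],
      [0.3, 0],
      [1.8, 1],
      [0.9, 0]]
y2 = [1, 0, 1, 0]
print(IG(X2, y2).getIG())  # one gain per column, each taken at its best threshold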