function [theta] = LR(D)
% Logistic regression trained by gradient ascent.
% D: data matrix whose first column holds the class labels and whose
% remaining columns hold the feature values.
C = D(:,1)';                % 1 x m row vector of labels
X = D(:,2:end);             % m x n feature matrix
alpha = 1e-5;               % learning rate
theta_new = 1e-3 .* ones(1, size(X,2));
for count = 1:100000
    % Ascent step on the log-likelihood; this form assumes labels in {0,1}
    theta_new = theta_new + alpha * (C - sigmoid(X*theta_new')') * X;
    % Monitored negative log-likelihood; note this form assumes
    % labels in {-1,+1}
    llr = sum(LLR((X*theta_new') .* C'))
end
theta = theta_new
end
function a = LLR(z)
% Per-sample negative log-likelihood, log(1 + exp(-z)),
% for a margin z = y .* (x*theta') with y in {-1,+1}
a = log(1.0 + exp(-z));
end
function a = sigmoid(z)
% Logistic function
a = 1.0 ./ (1.0 + exp(-z));
end
My question is: the log-likelihood (llr) first decreases and then starts to rise. Is this a property of the gradient ascent algorithm, or a problem in my code?
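One thing worth checking (an observation, not a confirmed diagnosis): the update rule `(C - sigmoid(X*theta_new')')*X` assumes labels in {0,1}, while the monitored sum `log(1 + exp(-z.*C))` equals the negative log-likelihood only when the labels are in {-1,+1}. Mixing the two conventions makes the monitored value meaningless. A minimal Python check (with hypothetical scores and labels) that the two forms agree once labels are mapped consistently:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical scores z = x*theta' and {0,1} labels
z = np.array([1.5, -0.3, 0.8, -2.0])
c01 = np.array([1.0, 0.0, 1.0, 0.0])

# Negative log-likelihood written for {0,1} labels
nll_01 = -np.sum(c01 * np.log(sigmoid(z))
                 + (1 - c01) * np.log(1 - sigmoid(z)))

# Same quantity via the {-1,+1} form log(1 + exp(-y*z)), y = 2c - 1
y = 2 * c01 - 1
nll_pm = np.sum(np.log(1.0 + np.exp(-y * z)))

print(np.isclose(nll_01, nll_pm))  # → True
```

If the labels in `D(:,1)` are 0/1, multiplying the scores by `C'` zeroes out every class-0 term, so the monitored `llr` is not the quantity gradient ascent is actually improving.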
Are the labels in D(:,1) 0/1 or -1/1? Could you show us the weights (theta_new), the gradient, and llr every 10 iterations? The model may be overfitting, since you run 100000 iterations, which is a lot. You should exit once the gradient drops below some threshold (e.g. 1e-4). – michaeltang
You could try regularization. With L2 regularization, grad = (C - sigmoid(X*theta_new')')*X + theta_new – michaeltang
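The two suggestions above (stop when the gradient is small, and add an L2 penalty) can be sketched together. This is a hedged illustration in Python/NumPy rather than MATLAB, on synthetic data; the values of `alpha`, `lam`, and `tol` are hypothetical, and the L2 term is subtracted here because, when maximizing the penalized log-likelihood, the penalty -λ/2·||θ||² contributes -λθ to the gradient:

```python
import numpy as np

def sigmoid(z):
    # Clip to avoid overflow in exp for large |z|
    return 1.0 / (1.0 + np.exp(-np.clip(z, -500, 500)))

def logistic_regression(X, C, alpha=1e-3, lam=0.1, tol=1e-4, max_iter=100000):
    """Gradient ascent on the L2-penalized log-likelihood.

    Assumes labels C are in {0, 1}. Stops early once the gradient
    norm falls below tol, instead of always running max_iter steps.
    """
    theta = np.full(X.shape[1], 1e-3)
    for _ in range(max_iter):
        # Penalized gradient: (C - sigma(X theta)) X - lam * theta
        grad = (C - sigmoid(X @ theta)) @ X - lam * theta
        theta = theta + alpha * grad
        if np.linalg.norm(grad) < tol:
            break
    return theta

# Toy demo on well-separated synthetic data (hypothetical example)
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 1, (50, 2)), rng.normal(2, 1, (50, 2))])
C = np.concatenate([np.zeros(50), np.ones(50)])
theta = logistic_regression(X, C)
pred = (sigmoid(X @ theta) > 0.5).astype(float)
print((pred == C).mean())  # training accuracy
```

With the penalty, the weights stop growing without bound on separable data, and the gradient-norm test gives a principled exit condition in place of a fixed 100000-iteration loop.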