Machine Learning: Softmax Classifier (No Hidden Layer)
This program implements a Softmax classifier with no hidden layer. The model is a single linear score function followed by a softmax over the class scores:
$f = Wx + b$

$y_i = \frac{e^{f_i}}{\sum_j e^{f_j}}$
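The backward pass in the code below relies on the standard gradient of the softmax cross-entropy loss with L2 regularization. As a brief sketch (with $p$ the softmax probabilities, $y$ the one-hot labels, $N$ the batch size, and $\lambda$ the regularization strength, matching probs, batch_y, batchsize, and reg in the code):

$L = -\frac{1}{N}\sum_{n}\log p_{n,c_n} + \frac{\lambda}{2}\sum_{i,j}W_{ij}^2, \qquad \frac{\partial L}{\partial f_i} = p_i - y_i$

This identity is why the code can set dscores = probs - batch_y and obtain the weight gradient with a single matrix product.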
%% Softmax classifier (no hidden layer)
function Out = Softmax_Classifier(train_x, train_y, opts)

% learning parameters
step_size = opts.step_size;   % gradient-descent learning rate
reg       = opts.reg;         % L2 regularization strength
batchsize = opts.batchsize;
numepochs = opts.numepochs;
K         = opts.class;       % number of classes

% initialize the parameters
D = size(train_x, 2);         % input dimension
W = 0.01 * randn(D, K);
b = zeros(1, K);
loss(1:numepochs) = 0;

num_examples = size(train_x, 1);
numbatches   = num_examples / batchsize;

for epoch = 1:numepochs
    kk = randperm(num_examples);   % shuffle the training set each epoch
    loss(epoch) = 0;
    for bat = 1:numbatches
        batch_x = train_x(kk((bat-1)*batchsize+1 : bat*batchsize), :);
        batch_y = train_y(kk((bat-1)*batchsize+1 : bat*batchsize), :);

        % forward pass: linear scores, then softmax probabilities
        cc = repmat(b, batchsize, 1);
        scores = batch_x * W + cc;
        exp_scores = exp(scores);
        dd = repmat(sum(exp_scores, 2), 1, K);
        probs = exp_scores ./ dd;

        % cross-entropy data loss plus L2 regularization loss
        correct_logprobs = -log(sum(probs .* batch_y, 2));
        data_loss = sum(correct_logprobs) / batchsize;
        reg_loss  = 0.5 * reg * sum(sum(W .* W));
        loss(epoch) = loss(epoch) + data_loss + reg_loss;

        % backward pass: d(loss)/d(scores) = probs - one-hot labels
        dscores = (probs - batch_y) / batchsize;
        dW = batch_x' * dscores + reg * W;   % include regularization gradient
        db = sum(dscores);

        % vanilla gradient-descent update
        W = W - step_size * dW;
        b = b - step_size * db;
    end
    loss(epoch) = loss(epoch) / numbatches;
    if mod(epoch, 10) == 0
        fprintf('epoch: %d, training loss is %f\n', epoch, loss(epoch));
    end
end

Out.W = W;
Out.b = b;
Out.loss = loss;
end

(Code reposted from: https://www.cnblogs.com/mtcnn/p/9412464.html)
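As a usage reference, here is a minimal sketch that trains the classifier on random two-class toy data. The toy data and the accuracy check are illustrative assumptions; the opts fields match exactly those read by the function above:

% minimal usage sketch (toy data is hypothetical)
N = 200; D = 5; K = 2;
train_x = randn(N, D);                            % random inputs
labels  = randi(K, N, 1);                         % random class labels in 1..K
train_y = full(sparse((1:N)', labels, 1, N, K));  % one-hot encode the labels

opts.step_size = 0.1;    % gradient-descent learning rate
opts.reg       = 1e-3;   % L2 regularization strength
opts.batchsize = 20;     % must divide the number of examples evenly
opts.numepochs = 100;
opts.class     = K;

model = Softmax_Classifier(train_x, train_y, opts);

% predict by taking the class with the highest linear score
scores = train_x * model.W + repmat(model.b, N, 1);
[~, pred] = max(scores, [], 2);
training_accuracy = mean(pred == labels)

Note that numbatches = num_examples / batchsize is used directly as a loop bound, so batchsize must divide the number of training examples.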