ex1 Notes

This exercise is really just simple linear regression, so I won't restate the concepts here. I just want to share and record the line of thought behind each part; if you follow the steps in the provided PDF, you can basically complete it.

warmUpExercise.m

function A = warmUpExercise()
%WARMUPEXERCISE Example function in octave
%   A = WARMUPEXERCISE() is an example function that returns the 5x5 identity matrix

A = [];

% ============= YOUR CODE HERE ==============
% Instructions: Return the 5x5 identity matrix 
%               In octave, we return values by defining which variables
%               represent the return values (at the top of the file)
%               and then set them accordingly. 

A = eye(5);

% ===========================================

end

computeCost.m

This part computes the cost function, which the course writes as J(θ); the formula is restated just below.
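For reference, with hypothesis h_θ(x) = θᵀx and m training examples, the cost defined in the course is:

J(θ) = (1/(2m)) · Σᵢ₌₁ᵐ (h_θ(x⁽ⁱ⁾) − y⁽ⁱ⁾)²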

function J = computeCost(X, y, theta)
%COMPUTECOST Compute cost for linear regression
%   J = COMPUTECOST(X, y, theta) computes the cost of using theta as the
%   parameter for linear regression to fit the data points in X and y

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly 
J = 0;

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta
%               You should set J to the cost.

h = X * theta;                 % hypothesis: m x 1 vector of predictions
J = sum((h - y).^2) / (2*m);   % half the mean of the squared errors

% =========================================================================

end
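A minimal sketch of calling it from the Octave prompt. The data here is made up purely for illustration (in the assignment itself, ex1.m loads the real data and prepends the column of ones):

X = [ones(3,1), [1; 2; 3]];   % toy design matrix: intercept column plus one feature
y = [2; 4; 6];                % toy targets (y = 2x exactly)
theta = [0; 0];               % all-zero parameters
J = computeCost(X, y, theta)  % (4 + 16 + 36) / (2*3), so J is about 9.33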

gradientDescent.m

This implements gradient descent; the update term is just the derivative (gradient) of J(θ).
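In vectorized form, each pass of the loop below performs the simultaneous update

θ := θ − (α/m) · Xᵀ(Xθ − y)

which is just the lecture's θⱼ := θⱼ − (α/m) · Σᵢ₌₁ᵐ (h_θ(x⁽ⁱ⁾) − y⁽ⁱ⁾) · xⱼ⁽ⁱ⁾ written for all j at once.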

function [theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters)
%GRADIENTDESCENT Performs gradient descent to learn theta
%   theta = GRADIENTDESCENT(X, y, theta, alpha, num_iters) updates theta by 
%   taking num_iters gradient steps with learning rate alpha

% Initialize some useful values
m = length(y); % number of training examples
J_history = zeros(num_iters, 1);

for iter = 1:num_iters

    % ====================== YOUR CODE HERE ======================
    % Instructions: Perform a single gradient step on the parameter vector
    %               theta. 
    %
    % Hint: While debugging, it can be useful to print out the values
    %       of the cost function (computeCost) and gradient here.
    %

    h = X * theta;                                % predictions for the current theta
    theta = theta - (alpha/m) .* (X' * (h - y));  % simultaneous update of all parameters

    % ============================================================

    % Save the cost J in every iteration    
    J_history(iter) = computeCost(X, y, theta);

end

end
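To tie the pieces together, here is a minimal end-to-end sketch on the same made-up toy data; alpha and num_iters are arbitrary illustrative choices, not the values ex1.m uses:

X = [ones(3,1), [1; 2; 3]];   % toy data generated from y = 2x
y = [2; 4; 6];
theta = zeros(2, 1);          % start from the origin
alpha = 0.1;                  % learning rate (illustrative choice)
num_iters = 1000;             % iteration count (illustrative choice)

[theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters);
theta                         % should approach [0; 2]
J_history(end)                % final cost, should be close to 0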

