ex1 Optional (ungraded) exercise
Compared with ex1, the optional assignment adds one more feature, so it becomes a multivariate linear regression problem.
There is also a normal equation solution, which is not hard either.
featureNormalize.m
Feature Normalization: my personal understanding is that it keeps the values of X within a fairly small range.
In practice, you compute the mean u of each feature in X (with the mean function) and its standard deviation s (with the std function),
and the result is X_norm = (X - u) / s.
function [X_norm, mu, sigma] = featureNormalize(X)
%FEATURENORMALIZE Normalizes the features in X
%   FEATURENORMALIZE(X) returns a normalized version of X where
%   the mean value of each feature is 0 and the standard deviation
%   is 1. This is often a good preprocessing step to do when
%   working with learning algorithms.

% You need to set these values correctly
X_norm = X;
mu = zeros(1, size(X, 2));
sigma = zeros(1, size(X, 2));

% ====================== YOUR CODE HERE ======================
% Instructions: First, for each feature dimension, compute the mean
%               of the feature and subtract it from the dataset,
%               storing the mean value in mu. Next, compute the
%               standard deviation of each feature and divide
%               each feature by its standard deviation, storing
%               the standard deviation in sigma.
%
%               Note that X is a matrix where each column is a
%               feature and each row is an example. You need
%               to perform the normalization separately for
%               each feature.
%
% Hint: You might find the 'mean' and 'std' functions useful.
%
mu = mean(X);                 % 1 x n row vector of column means
sigma = std(X);               % 1 x n row vector of column standard deviations
X_norm = (X - mu) ./ sigma;   % broadcasting normalizes each column
% ============================================================
end
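As a quick sanity check, here is a small sketch (the feature values are made up for illustration) of what featureNormalize returns:

% Hypothetical data: 3 examples, 2 features (values invented)
X = [2104 3; 1600 3; 2400 4];
[X_norm, mu, sigma] = featureNormalize(X);
% mu and sigma are 1x2 row vectors of column means and standard deviations;
% every column of X_norm now has mean 0 and standard deviation 1
disp(mean(X_norm))   % approximately [0 0]
disp(std(X_norm))    % approximately [1 1]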
computeCostMulti.m
function J = computeCostMulti(X, y, theta)
%COMPUTECOSTMULTI Compute cost for linear regression with multiple variables
%   J = COMPUTECOSTMULTI(X, y, theta) computes the cost of using theta as the
%   parameter for linear regression to fit the data points in X and y

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly
J = 0;

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta
%               You should set J to the cost.
J = (X*theta - y)' * (X*theta - y) / (2*m);
% =========================================================================
end
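To see that the vectorized expression really is the sum of squared errors, here is a toy check (the numbers are made up; a perfect fit should give zero cost):

% Toy data: 3 examples with an intercept column; theta = [0; 1] fits exactly
X = [1 1; 1 2; 1 3];
y = [1; 2; 3];
theta = [0; 1];
J = computeCostMulti(X, y, theta);                % returns 0
J_loop = sum((X*theta - y).^2) / (2*length(y));   % unvectorized equivalent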
gradientDescentMulti.m
function [theta, J_history] = gradientDescentMulti(X, y, theta, alpha, num_iters)
%GRADIENTDESCENTMULTI Performs gradient descent to learn theta
%   theta = GRADIENTDESCENTMULTI(x, y, theta, alpha, num_iters) updates theta by
%   taking num_iters gradient steps with learning rate alpha

% Initialize some useful values
m = length(y); % number of training examples
J_history = zeros(num_iters, 1);

for iter = 1:num_iters

    % ====================== YOUR CODE HERE ======================
    % Instructions: Perform a single gradient step on the parameter vector
    %               theta.
    %
    % Hint: While debugging, it can be useful to print out the values
    %       of the cost function (computeCostMulti) and gradient here.
    %
    theta = theta - (alpha/m) .* (X' * (X*theta - y));
    % ============================================================

    % Save the cost J in every iteration
    J_history(iter) = computeCostMulti(X, y, theta);

end

end
normalEqn.m
The normal equation solves for θ directly through matrix operations, θ = (XᵀX)⁻¹Xᵀy, so both the formula and the code fit in a single line.
function [theta] = normalEqn(X, y)
%NORMALEQN Computes the closed-form solution to linear regression
%   NORMALEQN(X,y) computes the closed-form solution to linear
%   regression using the normal equations.

theta = zeros(size(X, 2), 1);

% ====================== YOUR CODE HERE ======================
% Instructions: Complete the code to compute the closed form solution
%               to linear regression and put the result in theta.
%
% ---------------------- Sample Solution ----------------------
theta = inv(X'*X) * X' * y;   % pinv(X'*X) is safer if X'*X is near-singular
% -------------------------------------------------------------
% ============================================================
end
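A nice property of the normal equation is that it needs no feature scaling, no learning rate, and no iterations; a minimal sketch of calling it on the raw data (assuming the course's ex1data2.txt file, whose columns are size, bedrooms, and price):

% Solve directly on the un-normalized data
data = load('ex1data2.txt');
X = data(:, 1:2);
y = data(:, 3);
X = [ones(size(X, 1), 1), X];   % prepend the intercept column
theta = normalEqn(X, y);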
Finally, in ex1_multi.m there are two lines you must write yourself (they do not affect the submission score):
the price-prediction formulas in Part 2 and Part 3, as shown below.
% Estimate the price of a 1650 sq-ft, 3 br house
% ====================== YOUR CODE HERE ======================
% Part 2: theta was learned on normalized features, so the query
% point must be normalized with the same mu and sigma first
price = [1, (1650 - mu(1))/sigma(1), (3 - mu(2))/sigma(2)] * theta;
% ============================================================
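In Part 3, theta comes from the normal equation fitted on the raw, un-normalized data, so the prediction needs no feature scaling:

% Part 3: trained on raw features, so the query can be used directly
price = [1, 1650, 3] * theta;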
