ML: Week 2
25 Mar 2014

### Multivariate linear regression
#### Improve Gradient Descent
- Feature Scaling
	- Make sure features are on a similar scale
	- Get every feature into approximately the range -1 ≤ xᵢ ≤ 1
- Mean Normalization
	- Make features have approximately zero mean (excluding x₀ = 1)
	- xᵢ := (xᵢ - μᵢ) / sᵢ, where:
		- μᵢ is the average value of xᵢ in the training set
		- sᵢ is the range of xᵢ (max - min), or its standard deviation
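The scaling steps above can be sketched as follows. The course uses Octave, but the idea is the same in NumPy; the feature matrix `X` here (house sizes and bedroom counts) is a made-up toy example:

```python
import numpy as np

def mean_normalize(X):
    """Scale each feature (column) to zero mean, divided by its range."""
    mu = X.mean(axis=0)                 # mu_i: average value of each feature
    s = X.max(axis=0) - X.min(axis=0)   # s_i: range (max - min); std() also works
    return (X - mu) / s, mu, s

# Toy data: house size (feet^2) and number of bedrooms, very different scales
X = np.array([[2104.0, 3.0],
              [1600.0, 3.0],
              [2400.0, 4.0],
              [1416.0, 2.0]])
X_norm, mu, s = mean_normalize(X)
print(X_norm.mean(axis=0))  # ~[0, 0]: each feature now has zero mean
```

After scaling, every value lies within [-1, 1], so gradient descent takes similarly sized steps along each feature dimension. Remember to keep `mu` and `s` around: new inputs at prediction time must be normalized with the same values.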
Playing around with the learning rate
- Plot J(θ) as a function of the number of iterations
- If α is too small: slow convergence
- If α is too large: J(θ) may not decrease on every iteration, or may not converge at all (it can also converge slowly)
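A minimal gradient-descent loop that records J(θ) at every iteration, so the convergence plot described above can be drawn. The value of α and the toy data are my own choices, not from the notes:

```python
import numpy as np

def gradient_descent(X, y, alpha=0.1, iters=200):
    """Batch gradient descent for linear regression, tracking the cost J(theta)."""
    m, n = X.shape
    theta = np.zeros(n)
    J_history = []
    for _ in range(iters):
        error = X @ theta - y
        J_history.append((error @ error) / (2 * m))  # J(theta) at the current theta
        theta -= (alpha / m) * (X.T @ error)         # simultaneous update of all theta_j
    return theta, J_history

# Toy problem: y = 1 + 2*x, with an intercept column x_0 = 1
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([1.0, 3.0, 5.0, 7.0])
theta, J_history = gradient_descent(X, y)
print(theta)          # approaches [1, 2]
print(J_history[-1])  # J(theta) decreases toward 0 across iterations
```

Plotting `J_history` against the iteration index is exactly the diagnostic above: a curve that flattens out means convergence; a curve that rises or oscillates means α is too large.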
#### Normal Equation
- Solve for θ analytically
- θ = (XᵀX)⁻¹ Xᵀ y
Doesn't work well if n (the number of features) is too big (in the millions):
taking the inverse of an n×n matrix is approximately on the order of n³ operations
- Suitable for linear regression (doesn't generalize to more complex learning algorithms)
- Use `pinv` in Octave
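The normal equation above can be checked directly on a small example. `numpy.linalg.pinv` plays the role of Octave's `pinv`; the data is the same made-up noiseless toy set:

```python
import numpy as np

# Toy data: y = 1 + 2*x, with an intercept column x_0 = 1
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([1.0, 3.0, 5.0, 7.0])

# theta = (X^T X)^(-1) X^T y, computed with the pseudo-inverse,
# which also behaves sensibly when X^T X is non-invertible
theta = np.linalg.pinv(X.T @ X) @ X.T @ y
print(theta)  # recovers [1, 2] since the data is noiseless
```

No learning rate and no iterations: one solve gives the exact least-squares θ, which is why the normal equation is attractive when n is small.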
#### Invertibility
Why XᵀX can be non-invertible
- Redundant features (linearly dependent)
	- e.g. x₁ = size in feet², x₂ = size in m²
- Too many features (e.g. m ≤ n)
	- Delete some features
	- Use regularization
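A quick illustration of the redundant-features case: size in m² is just a constant multiple of size in feet², so the two columns are linearly dependent and XᵀX becomes singular. The pseudo-inverse still returns a usable least-squares fit; the data here is my own toy example:

```python
import numpy as np

size_ft2 = np.array([2104.0, 1600.0, 2400.0, 1416.0])
size_m2 = size_ft2 * 0.0929          # linearly dependent on size_ft2
X = np.column_stack([np.ones(4), size_ft2, size_m2])
y = np.array([400.0, 330.0, 369.0, 232.0])

A = X.T @ X
print(np.linalg.matrix_rank(A))      # rank 2 < 3: A is non-invertible

# inv(A) would be numerically meaningless here, but the pseudo-inverse
# still yields a least-squares theta with the same predictions as lstsq
theta = np.linalg.pinv(A) @ X.T @ y
theta_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.allclose(X @ theta, X @ theta_lstsq))
```

θ itself is not unique in this situation (any weight can be shifted between the two dependent columns), which is why deleting the redundant feature or regularizing is the cleaner fix.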