Fix inconsistency between loss and gradient in linear regression (use… #25

Open

yz847zzz wants to merge 1 commit into alirezadir:main from yz847zzz:fix-linear-regression-loss

Conversation

@yz847zzz

Fix: Inconsistency between loss function and gradient in linear regression example

Hi, thanks for this great repository!

I noticed a small inconsistency in the linear regression implementation.

Issue

The current loss function uses:

np.linalg.norm(E, 2)

This is the L2 norm of the residual, i.e. the square root of the sum of squared errors. The gradient used in the update, however, is derived from the sum of squared errors itself, so the reported loss and the gradient do not describe the same objective.

Suggested Fix:

np.sum(E**2)

This makes the loss consistent with the objective the gradient minimizes.
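A minimal sketch of the mismatch, assuming the repository's setup where `E = X @ w - y` is the residual and the gradient is taken with respect to the sum of squared errors (the variable names here are illustrative, not taken from the repo):

```python
import numpy as np

# Hypothetical data mirroring the linear regression example.
rng = np.random.default_rng(0)
X = rng.normal(size=(10, 3))
y = rng.normal(size=10)
w = rng.normal(size=3)
E = X @ w - y  # residual vector

# Loss as reported before the fix: the L2 norm, sqrt(sum(E**2)).
norm_loss = np.linalg.norm(E, 2)

# Loss after the fix: the sum of squared errors.
sse_loss = np.sum(E**2)

# Analytic gradient of the sum-of-squares objective w.r.t. w.
grad = 2 * X.T @ E

# Finite-difference check: the gradient matches sse_loss, not norm_loss.
eps = 1e-6
num_grad = np.array([
    (np.sum((X @ (w + eps * np.eye(3)[i]) - y) ** 2) - sse_loss) / eps
    for i in range(3)
])
```

The finite-difference gradient agrees with `2 * X.T @ E` for the sum-of-squares loss; differentiating `np.linalg.norm(E, 2)` instead would introduce an extra `1 / (2 * sqrt(...))` factor, which is the inconsistency this PR removes.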
