Soohyun’s Machine-learning
Univariate Linear Regression
This part is from: https://www.toptal.com/machine-learning/machine-learning-theory-an-introductory-primer
"A computer program is said to learn from experience E with respect to some task T and some performance measure P, if its performance on T, as measured by P, improves with experience E." - Tom Mitchell, Carnegie Mellon University -
So if you want your program to predict, for example, traffic patterns at a busy intersection (task T), you can run it through a machine learning algorithm with data about past traffic patterns (experience E) and, if it has successfully "learned", it will then do better at predicting future traffic patterns (performance measure P).
The highly complex nature of many real-world problems, though, often means that inventing specialized algorithms that will solve them perfectly every time is impractical, if not impossible. Many real-world prediction and classification problems fall into this category, and they are excellent targets for an ML project; in fact, ML has been applied to such problems with great success.
ML solves problems that cannot be solved by numerical means alone.
Among the different types of ML tasks, a crucial distinction is drawn between supervised and unsupervised learning. We will primarily focus on supervised learning here, but the end of the article includes a brief discussion of unsupervised learning with some links for those who are interested in pursuing the topic further.
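The supervised/unsupervised distinction can be made concrete with a toy example. The sketch below is my own illustration, not from the article: the same six 1-D points are first grouped without labels (a single k-means-style assignment step), then used with labels to learn a decision threshold.

```python
# Illustrative sketch of supervised vs. unsupervised learning on 1-D data.
# All names and data here are hypothetical, chosen only for the demo.
points = [1.0, 1.2, 0.9, 8.0, 8.3, 7.9]

# Unsupervised: no labels are given. Assign each point to the nearer of
# two guessed centers (one k-means-style assignment step).
c_lo, c_hi = min(points), max(points)
clusters = [0 if abs(p - c_lo) < abs(p - c_hi) else 1 for p in points]

# Supervised: the same inputs now come with class labels, and we learn a
# decision threshold halfway between the two classes.
labels = [0, 0, 0, 1, 1, 1]
threshold = (max(p for p, l in zip(points, labels) if l == 0)
             + min(p for p, l in zip(points, labels) if l == 1)) / 2


def predict(p):
    """Classify a new point against the learned threshold."""
    return int(p > threshold)
```

The unsupervised step recovers the two groups from the data's shape alone; the supervised step additionally learns a rule (`predict`) it can apply to unseen inputs.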
This part is from: http://kr.mathworks.com/help/symbolic/mupad_ug/univariate-linear-regression.html
Univariate Linear Regression
Regression is the process of fitting models to data. Linear regression assumes that the relationship between the dependent variable y_i and the independent variable x_i is linear:

y_i = a + b·x_i

Here a is the offset and b is the slope of the linear relationship. For linear regression of a data sample with one independent variable, the quadratic deviation

χ² = Σ_i w_i·(a + b·x_i − y_i)²

is minimized with positive weights w_i. By default, all weights are equal to 1.
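With all weights equal to 1, the minimization above has the familiar closed-form least-squares solution. A minimal sketch in Python (my own, not from the MathWorks page; the function name `fit_line` is hypothetical):

```python
# Ordinary least squares for y_i ≈ a + b*x_i, i.e. the chi^2 above
# with all weights w_i = 1.
def fit_line(xs, ys):
    n = len(xs)
    mx = sum(xs) / n          # mean of x
    my = sum(ys) / n          # mean of y
    # Closed-form solution of the normal equations:
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))   # slope
    a = my - b * mx                          # offset
    # Quadratic deviation of the fitted line:
    chi2 = sum((a + b * x - y) ** 2 for x, y in zip(xs, ys))
    return a, b, chi2
```

For example, `fit_line([1, 2, 3, 4], [3, 5, 7, 9])` returns offset 1, slope 2, and χ² = 0, since those points lie exactly on y = 1 + 2x.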
Besides the offset a and the slope b of a fitted linear model, stats::linReg also returns the value of the quadratic deviation χ². For example, fit the linear model to the following data:
```
x := [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
y := [11, 13, 15, 17, 19, 21, 23, 25, 27, 29]
stats::linReg(x, y)

[[9, 2], 0]
```

The result `[[9, 2], 0]` means offset a = 9, slope b = 2, and χ² = 0: the data lie exactly on the line y = 9 + 2x.
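The same fit can be reproduced outside MuPAD. A sketch using NumPy (an assumption on my part; the MathWorks page uses only MuPAD):

```python
import numpy as np

# Same data as in the MuPAD example above.
x = np.arange(1, 11)                 # 1, 2, ..., 10
y = 9 + 2 * x                        # 11, 13, ..., 29

# np.polyfit returns coefficients with the highest degree first,
# so for degree 1 we get (slope, offset).
b, a = np.polyfit(x, y, 1)

# Quadratic deviation of the fitted line (zero for an exact fit):
chi2 = float(np.sum((a + b * x - y) ** 2))
```

This recovers the same offset 9, slope 2, and χ² = 0 that `stats::linReg` reports.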