In statistics, **regression analysis** includes any techniques for modelling and analyzing several variables, when the focus is on the relationship between a dependent variable and one or more independent variables. More specifically, regression analysis helps us understand how the typical value of the dependent variable changes when any one of the independent variables is varied, while the other independent variables are held fixed. Most commonly, regression analysis estimates the conditional expectation of the dependent variable given the independent variables — that is, the average value of the dependent variable when the independent variables are held fixed. Less commonly, the focus is on a quantile, or other location parameter of the conditional distribution of the dependent variable given the independent variables. In all cases, the estimation target is a function of the independent variables called the regression function. In regression analysis, it is also of interest to characterize the variation of the dependent variable around the regression function, which can be described by a probability distribution.

**Multiple Correlation** is a linear relationship among more than two variables. It is measured by the coefficient of multiple determination, denoted R^{2}, which is a measure of the fit of a linear regression. A regression's R^{2} falls somewhere between zero and one (assuming a constant term has been included in the regression); a higher value indicates a stronger relationship among the variables, with a value of one indicating that all data points fall exactly on the fitted hyperplane in multidimensional space and a value of zero indicating no relationship at all between the independent variables collectively and the dependent variable.
Unlike the coefficient of determination in a regression involving just two variables, the coefficient of multiple determination is not computationally commutative: a regression of y on x and z will in general have a different R^{2} than will a regression of z on x and y. For example, suppose that in a particular sample the variable z is uncorrelated with both x and y, while x and y are linearly related to each other. Then a regression of z on y and x will yield an R^{2} of zero, while a regression of y on x and z will yield a positive R^{2}.
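This asymmetry can be checked numerically. The sketch below (assuming NumPy is available; the data are synthetic, generated to match the scenario described above) fits each regression by ordinary least squares and compares the two R^{2} values:

```python
import numpy as np

def r_squared(dep, *predictors):
    """Fit an OLS regression of dep on the predictors (plus a constant)
    and return the coefficient of multiple determination R^2."""
    X = np.column_stack([np.ones(len(dep))] + list(predictors))
    beta, *_ = np.linalg.lstsq(X, dep, rcond=None)
    resid = dep - X @ beta
    ss_res = np.sum(resid ** 2)
    ss_tot = np.sum((dep - dep.mean()) ** 2)
    return 1 - ss_res / ss_tot

rng = np.random.default_rng(0)
x = rng.normal(size=200)
y = 2 * x + rng.normal(scale=0.1, size=200)   # y strongly related to x
z = rng.normal(size=200)                      # z unrelated to x and y

print(r_squared(y, x, z))  # close to 1: x explains y almost perfectly
print(r_squared(z, x, y))  # close to 0: neither x nor y explains z
```

The two calls swap which variable is dependent, yet produce very different R^{2} values, as the text predicts.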

The term correlation refers to the relationship between two or more variables. If a change in one variable produces a change in another variable, the variables are said to be correlated.

There are basically three types of correlation, namely:

- Positive correlation
- Negative correlation
- Zero correlation.

**Positive correlation:**

If the values of two variables change in the same direction, i.e. if an increase (or decrease) in one variable results in a corresponding increase (or decrease) in the other, the correlation between them is said to be positive.

**Examples of positive correlation:**

- The heights and weights of the individuals
- The income and expenditure
- Experience and salary

**Negative correlation:**

If the values of two variables change in opposite directions, i.e. if an increase (or decrease) in one variable results in a corresponding decrease (or increase) in the other, the correlation between them is said to be negative.

**Examples of negative correlation:**

- The price of a commodity and the quantity demanded
- The speed of a vehicle and the time taken to cover a fixed distance
- Altitude and air temperature

**Zero correlation:**

If there is no relationship between the two variables, the correlation between them is said to be zero.
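The three types can be distinguished numerically by the sign of the correlation coefficient. A minimal sketch (the data below are invented for illustration; NumPy is assumed to be available):

```python
import numpy as np

hours = np.array([1, 2, 3, 4, 5], dtype=float)       # e.g. hours of practice
score = np.array([52, 58, 65, 71, 78], dtype=float)  # rises with hours
price = np.array([90, 80, 72, 61, 50], dtype=float)  # falls as hours rise
noise = np.array([1, -2, 3, -2, 1], dtype=float)     # unrelated to hours

# np.corrcoef returns the 2x2 correlation matrix; [0, 1] is r for the pair
print(np.corrcoef(hours, score)[0, 1])  # positive correlation (near +1)
print(np.corrcoef(hours, price)[0, 1])  # negative correlation (near -1)
print(np.corrcoef(hours, noise)[0, 1])  # zero correlation (near 0)
```

A positive r means the variables move together, a negative r means they move in opposite directions, and an r near zero means no linear relationship.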

**Formula:** Correlation coefficient:

Correlation(r) = (NΣXY - (ΣX)(ΣY)) / SQRT((NΣX^{2} - (ΣX)^{2})(NΣY^{2} - (ΣY)^{2}))

where

N = Number of values or elements

X = First Score

Y = Second Score

ΣXY = Sum of the product of First and Second Scores

ΣX = Sum of First Scores

ΣY = Sum of Second Scores

ΣX^{2} = Sum of squares of First Scores

ΣY^{2} = Sum of squares of Second Scores
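The raw-score formula r = (NΣXY - (ΣX)(ΣY)) / SQRT((NΣX^{2} - (ΣX)^{2})(NΣY^{2} - (ΣY)^{2})) can be computed term by term. A minimal Python sketch (the height/weight figures are invented for illustration):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient from the raw-score formula:
    r = (N*SXY - SX*SY) / sqrt((N*SX2 - SX^2) * (N*SY2 - SY^2))"""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)                       # ΣX, ΣY
    sxy = sum(x * y for x, y in zip(xs, ys))        # ΣXY
    sx2 = sum(x * x for x in xs)                    # ΣX^2
    sy2 = sum(y * y for y in ys)                    # ΣY^2
    num = n * sxy - sx * sy
    den = math.sqrt((n * sx2 - sx ** 2) * (n * sy2 - sy ** 2))
    return num / den

heights = [160, 165, 170, 175, 180]
weights = [55, 60, 66, 70, 76]
print(pearson_r(heights, weights))  # close to +1: strong positive correlation
```

A perfectly linear pair, such as pearson_r([1, 2, 3], [2, 4, 6]), yields exactly 1.0.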

**Regression Definition:**

A regression is a statistical analysis that assesses and models the relationship between two variables.

**Regression Formula:**

Regression Equation(y) = a + bx

Slope(b) = (NΣXY - (ΣX)(ΣY)) / (NΣX^{2} - (ΣX)^{2})

Intercept(a) = (ΣY - b(ΣX)) / N

where

x and y are the variables.

b = the slope of the regression line

a = the intercept point of the line and the y axis.

N = Number of values or elements

X = First Score

Y = Second Score

ΣXY = Sum of the product of first and Second Scores

ΣX = Sum of First Scores

ΣY = Sum of Second Scores

ΣX^{2} = Sum of squares of First Scores
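The slope and intercept formulas above translate directly into code. A sketch using hypothetical experience-versus-salary data (salary in thousands):

```python
def linear_regression(xs, ys):
    """Slope b and intercept a from the raw-score formulas:
    b = (N*SXY - SX*SY) / (N*SX2 - SX^2),  a = (SY - b*SX) / N"""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)                    # ΣX, ΣY
    sxy = sum(x * y for x, y in zip(xs, ys))     # ΣXY
    sx2 = sum(x * x for x in xs)                 # ΣX^2
    b = (n * sxy - sx * sy) / (n * sx2 - sx ** 2)
    a = (sy - b * sx) / n
    return a, b

xs = [1, 2, 3, 4, 5]            # years of experience
ys = [30, 35, 40, 45, 50]       # salary, perfectly linear in experience
a, b = linear_regression(xs, ys)
print(f"y = {a} + {b}x")  # prints "y = 25.0 + 5.0x"
```

Because the sample data lie exactly on a line, the fitted equation reproduces it exactly; with real data the line is the best least-squares fit.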

In case you face any problem or have any query please email us at :-info@homeworkassignmenthelp.com

Our tutors start working only after the payment is made, to ensure that we work only with serious clients and that our solution meets the required standard.

- Send us your assignment or problem through email.
- Specify the required format, such as Word, Excel, Notepad, or PDF.
- Give us the deadline by which you need the assignment completed, along with your time zone (for example, EST, Australian GMT, etc.).
- Send any documents related to your assignment which can help our tutors provide better work, and any example or format you want the solutions to be in.
- Our tutors will review the assignment sent by you, and if all the required information is there we will send you the price quoted by our tutor along with the time needed to solve the assignment.
- You can pay us through PayPal or credit card.
- After receiving the payment, our tutors start working on your assignment.
- Finally, we deliver the solutions and get feedback from you regarding our work.
