Worked Examples - Ch. 5

Scatterplots and the correlation coefficient.

Example:

Suppose you were studying the educational level of husbands and wives (measured in number of years of education). You have randomly selected 10 couples and have obtained the data in the following table:

Husband   Wife
  12       14
  16       16
  16       14
  18       16
  20       16
  17       18
  23       18
  14       12
  12       16
  16       20

[Scatterplot for this data: wife's education (y) plotted against husband's education (x).]

To help us judge the degree of linear relationship between the two variables, we need to compute the correlation coefficient.

The correlation coefficient may be computed by hand, on a TI calculator, or in Excel:

Excel:
In the Function Wizard, under the Statistical category, the function CORREL computes the correlation coefficient.
TI (82 or 83 models):
Choose the STAT menu, then Edit, and enter the data.
To compute the regression line and the correlation coefficient:
STAT menu, then CALC, then LinReg(ax+b).
By Hand: The easiest formula for hand computation is

\[
r\, =\frac{n\sum xy-(\sum x)\cdot (\sum y)}{\sqrt{\left( n\sum x^{2}-\left( \sum x\right) ^{2}\right) \cdot \left( n\sum y^{2}-\left( \sum y\right) ^{2}\right) }}\]
where n is the number of ordered pairs,

 \( \sum x \) is the sum of the \( x \)-column,
 \( \sum y \) is the sum of the \( y \)-column,
 \( \sum x^2 \) is the sum of the \( x^2 \)-column,
 \( \sum y^2 \) is the sum of the \( y^2 \)-column,
 \( \sum xy \) is the sum of the \( xy \)-column.
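Aside: software other than Excel or a TI calculator will also produce r directly. For instance, here is a minimal Python sketch using NumPy (an illustration only, not part of the course tools; the variable names are ours) applied to the couples data. The by-hand computation continues below.

    # Illustration only: computing r with NumPy for the couples data above.
    import numpy as np

    husband = [12, 16, 16, 18, 20, 17, 23, 14, 12, 16]   # x values from the table
    wife    = [14, 16, 14, 16, 16, 18, 18, 12, 16, 20]   # y values from the table

    # np.corrcoef returns the 2x2 correlation matrix; either off-diagonal entry is r.
    r = np.corrcoef(husband, wife)[0, 1]
    print(round(r, 2))   # 0.45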
The following table can be helpful in computing the sums:

x (husband)   y (wife)   x²           y²           x·y
12            14         12² = 144    14² = 196    12·14 = 168
16            16         16² = 256    16² = 256    16·16 = 256
16            14         16² = 256    14² = 196    16·14 = 224
18            16         18² = 324    16² = 256    18·16 = 288
20            16         20² = 400    16² = 256    20·16 = 320
17            18         17² = 289    18² = 324    17·18 = 306
23            18         23² = 529    18² = 324    23·18 = 414
14            12         14² = 196    12² = 144    14·12 = 168
12            16         12² = 144    16² = 256    12·16 = 192
16            20         16² = 256    20² = 400    16·20 = 320


x=164



y=160



x2=2794



y2=2608



xy=2656

The correlation coefficient is

\[
r=\frac{(10)(2656)-(164)(160)}{\sqrt{\left( (10)(2794)-(164)^{2}\right) \cdot \left( (10)(2608)-(160)^{2}\right) }}=0.45\]

The coefficient of determination is \( r^{2}=(0.45)^{2}\approx 0.20 \).

(Note: The coefficient of determination gives the percent of the variation that is explained by the model.)
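
A minimal Python sketch of the hand formula (an illustration only; the data and sums are those tabulated above) reproduces both r and r²:

    # Illustration: the hand formula for r applied to the couples data.
    from math import sqrt

    x = [12, 16, 16, 18, 20, 17, 23, 14, 12, 16]    # husband's years of education
    y = [14, 16, 14, 16, 16, 18, 18, 12, 16, 20]    # wife's years of education
    n = len(x)

    sum_x  = sum(x)                                  # 164
    sum_y  = sum(y)                                  # 160
    sum_x2 = sum(xi**2 for xi in x)                  # 2794
    sum_y2 = sum(yi**2 for yi in y)                  # 2608
    sum_xy = sum(xi*yi for xi, yi in zip(x, y))      # 2656

    r = (n*sum_xy - sum_x*sum_y) / sqrt((n*sum_x2 - sum_x**2) * (n*sum_y2 - sum_y**2))
    print(round(r, 2), round(r**2, 2))               # prints 0.45 0.2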

The correlation coefficient can be classified as weak/moderate/strong and positive/negative using the following guide from your text:

r = 0                                    no linear relationship
0 < r < 0.5   or   -0.5 < r < 0          weak (or low) linear relationship
0.5 ≤ r < 0.8   or   -0.8 < r ≤ -0.5     moderate linear relationship
0.8 ≤ r < 1   or   -1 < r ≤ -0.8         strong (or high) linear relationship
r = 1   or   r = -1                      perfect (or exact) linear relationship

For our example, r = 0.45 indicates a weak (low) positive linear relationship.
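
For illustration only (this helper is not from the text), the guide can be expressed as a small Python function that classifies a coefficient by its absolute value:

    def classify(r):
        """Classify r using the textbook guide above (thresholds on |r|)."""
        a = abs(r)
        if a == 0:
            return "no linear relationship"
        elif a < 0.5:
            return "weak (low)"
        elif a < 0.8:
            return "moderate"
        elif a < 1:
            return "strong (high)"
        else:
            return "perfect (exact)"

    print(classify(0.45))   # weak (low)
    print(classify(-0.83))  # strong (high)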

Regression Lines & Making Predictions

Suppose that we have a husband with 10 years of education. We would like to predict the wife's education based on our data table.

We can do this by fitting a line through the data points. This line is called a regression line, and we can use it to make a prediction. If there is a strong linear relationship, the prediction should be quite accurate.

(Note: In research, a problem occurs when subjects drop out of a study. As a result, data sets may be incomplete. The regression line can be used to "fill in" the incomplete information.)

Using the TI calculator, we can fit several types of regression curves: linear, quadratic, cubic, quartic, logarithmic, exponential, and power. By hand, only linear regression is feasible.

The easiest formulas to use by hand are as follows:

Slope:


\[ a=\frac{n\sum xy-\sum x\sum y}{n\sum x^{2}-(\sum x)^{2}}\]

y-intercept:


\[
b=\frac{1}{n}\left( \sum y-a\sum x\right) \]

Regression line:


\[
y=ax+b\]
The regression line for the above data:


\[
a=\frac{n\sum xy-\sum x\sum y}{n\sum x^{2}-(\sum x)^{2}}=\frac{10\cdot 2656-164\cdot 160}{10\cdot 2794-(164)^{2}}=0.3065\]

\[
b=\frac{1}{n}\left( \sum y-a\sum x\right) =\frac{1}{10}(160-0.3065\cdot 164)=10.97\]
\[
y=0.3065x+10.97\]
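As a check on this arithmetic, here is a minimal Python sketch of the by-hand slope and intercept formulas (an illustration only; same data as above):

    # Illustration: by-hand slope and intercept formulas for the couples data.
    x = [12, 16, 16, 18, 20, 17, 23, 14, 12, 16]    # husband's years of education
    y = [14, 16, 14, 16, 16, 18, 18, 12, 16, 20]    # wife's years of education
    n = len(x)

    sum_x, sum_y = sum(x), sum(y)
    sum_x2 = sum(xi**2 for xi in x)
    sum_xy = sum(xi*yi for xi, yi in zip(x, y))

    a = (n*sum_xy - sum_x*sum_y) / (n*sum_x2 - sum_x**2)   # slope
    b = (sum_y - a*sum_x) / n                               # y-intercept
    print(round(a, 4), round(b, 2))                         # prints 0.3065 10.97
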
We can use the regression line to make a prediction. For example, if the husband had 10 years of education (x = 10), then we would predict that the wife had 0.3065·10 + 10.97 ≈ 14 years of education. The symbol \( \hat{y} \) ("y hat") is commonly used for predicted values; thus, here \( \hat{y}=14 \).
Since for this example \( r^{2}=0.20 \), the model explains only 20% of the variation in the wives' education levels. It would be risky to predict a wife's education on the basis of her husband's education alone.

Error Analysis

We can get a feel for the error by comparing the actual values \( y \) to the values \( \hat{y} \) predicted by the regression line:

\[
\text{error}=y-\hat{y}\]

(i.e., the actual value minus the predicted value). Since the sum of the errors is always 0, \( \sum (y-\hat{y}) \) is not helpful in measuring the error. Instead we use the sum of the squares of the errors:

\[
SSE=\sum \left( y-\hat{y}\right) ^{2}\]

The SSE for this data set is:

x    y    Predicted ŷ   Error: y - ŷ   Squared Error: (y - ŷ)²
12   14   14.651        -0.651         0.4238
16   16   15.874         0.126         0.01588
16   14   15.874        -1.874         3.5119
18   16   16.487        -0.487         0.23717
20   16   17.100        -1.100         1.21
17   18   16.181         1.819         3.3088
23   18   18.020        -0.020         0.0004
14   12   15.261        -3.261         10.634
12   16   14.648         1.352         1.8279
16   20   15.874         4.126         17.024

SSE = 38.2
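
As a check on the table, a minimal Python sketch (an illustration only, using the rounded slope and intercept from above) reproduces the SSE:

    # Illustration: predicted values, errors, and SSE for the couples data.
    x = [12, 16, 16, 18, 20, 17, 23, 14, 12, 16]
    y = [14, 16, 14, 16, 16, 18, 18, 12, 16, 20]
    a, b = 0.3065, 10.97                               # rounded regression coefficients

    y_hat  = [a*xi + b for xi in x]                    # predicted values
    errors = [yi - yh for yi, yh in zip(y, y_hat)]     # actual minus predicted
    sse    = sum(e**2 for e in errors)

    print(round(sum(errors), 2))   # essentially 0 (only rounding residue remains)
    print(round(sse, 1))           # 38.2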