h2(#description). Description
This template will return the correlation matrix of supplied numerical variables.
h3(#introduction). Introduction
"Correlation":http://en.wikipedia.org/wiki/Correlation_and_dependence is one of the most commonly used statistical tool. With the help of that we can get information about a possible "linear relation":http://en.wikipedia.org/wiki/Linear_independence between two variables. According to the definition of the correlation, one can call it also as the standardized "covariance":http://en.wikipedia.org/wiki/Covariance.
The maximum possible value of the correlation (the so-called "correlation coefficient":http://en.wikipedia.org/wiki/Correlation_coefficient) is 1 and the minimum is -1. In the first case there is a perfect positive linear relationship between the two variables, and in the second a perfect negative one, though perfect relationships, especially in the social sciences, are quite rare. If two variables are independent of each other, the correlation between them is 0, but a correlation coefficient of 0 only implies "linear independence":http://en.wikipedia.org/wiki/Correlation_and_dependence#Correlation_and_linearity, not independence in general.
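As a minimal illustration in R (the language this report was generated with), the relationship between correlation and covariance can be checked directly; the vectors @x@ and @y@ below are made-up placeholders, not data from this report:

bc. x <- c(1, 2, 3, 4, 5)        # two hypothetical numeric vectors
y <- c(2, 1, 4, 3, 5)
cor(x, y)                    # Pearson correlation coefficient
cov(x, y) / (sd(x) * sd(y))  # the standardized covariance: same value
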
Because extreme values occur seldom, there are rules of thumb for interpreting the coefficients, as in other fields of statistics (a short R sketch follows this list):
* we call two variables highly correlated if the absolute value of the correlation coefficient between them is higher than 0.7, and
* we call them uncorrelated if it is smaller than 0.2.
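A minimal sketch of applying these cut-offs in R, assuming a vector of coefficients named @r@ (the values are placeholders):

bc. r <- c(0.85, -0.15, 0.45, -0.72, 0.05)   # hypothetical coefficients
# classify each coefficient by its absolute value
cut(abs(r),
    breaks = c(0, 0.2, 0.7, 1),
    labels = c("uncorrelated", "moderate", "highly correlated"),
    include.lowest = TRUE)
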
Please note that "correlation has nothing to do with causal models":http://en.wikipedia.org/wiki/Correlation_does_not_imply_causation, as it only shows association, not effects.
h3(#variable-description). Variable description
_2_ variables with _709_ cases provided.
There are no highly correlated (r < -0.7 or r > 0.7) variables.
There are no uncorrelated (-0.2 < r < 0.2) variables.
h3(#correlation-matrix). Correlation matrix
|_. |_. age |_. edu |
|_. age | | 0.2185 *** |
|_. edu | 0.2185 *** | |
The stars represent the "significance levels":http://en.wikipedia.org/wiki/Statistical_significance of the bivariate correlation coefficients: one star for a "p value":http://en.wikipedia.org/wiki/P-value below @0.05@, two stars for one below @0.01@ and three for one below @0.001@.
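For a single pair of variables, both the coefficient and its p value can be obtained with @cor.test@ in base R; the star coding described above can then be reproduced roughly as follows (@x@ and @y@ are placeholder vectors, not data from this report):

bc. x <- rnorm(100)
y <- x + rnorm(100)
ct <- cor.test(x, y)   # Pearson correlation test
ct$estimate            # the correlation coefficient
ct$p.value             # the corresponding p value
# map the p value to the star notation used above
stars <- if (ct$p.value < 0.001) "***" else
  if (ct$p.value < 0.01) "**" else
  if (ct$p.value < 0.05) "*" else ""
stars
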
The plot shows each correlation in two forms: below the "diagonal":http://en.wikipedia.org/wiki/Main_diagonal as a scatterplot, and above it as the coefficient(s).
"!plots/Correlation-1.png(Scatterplot matrix)!":plots/Correlation-1-hires.png
h2(#description-1). Description
This template will return the correlation matrix of supplied numerical variables.
h3(#introduction-1). Introduction
"Correlation":http://en.wikipedia.org/wiki/Correlation_and_dependence is one of the most commonly used statistical tool. With the help of that we can get information about a possible "linear relation":http://en.wikipedia.org/wiki/Linear_independence between two variables. According to the definition of the correlation, one can call it also as the standardized "covariance":http://en.wikipedia.org/wiki/Covariance.
The maximum possible value of the correlation (the so-called "correlation coefficient":http://en.wikipedia.org/wiki/Correlation_coefficient) is 1 and the minimum is -1. In the first case there is a perfect positive linear relationship between the two variables, and in the second a perfect negative one, though perfect relationships, especially in the social sciences, are quite rare. If two variables are independent of each other, the correlation between them is 0, but a correlation coefficient of 0 only implies "linear independence":http://en.wikipedia.org/wiki/Correlation_and_dependence#Correlation_and_linearity, not independence in general.
Because extreme values occur seldom, there are rules of thumb for interpreting the coefficients, as in other fields of statistics:
* we call two variables highly correlated if the absolute value of the correlation coefficient between them is higher than 0.7, and
* we call them uncorrelated if it is smaller than 0.2.
Please note that "correlation has nothing to do with causal models":http://en.wikipedia.org/wiki/Correlation_does_not_imply_causation, as it only shows association, not effects.
h3(#variable-description-1). Variable description
_3_ variables with _709_ cases provided.
The highest correlation coefficient (_0.2273_) is between _edu_ and _age_, and the lowest (_-0.03377_) is between _leisure_ and _age_. It seems that the strongest association (r=_0.2273_) is between _edu_ and _age_; a short R sketch of how such pairs can be located follows the list below.
There are no highly correlated (r < -0.7 or r > 0.7) variables.
Uncorrelated (-0.2 < r < 0.2) variables:
* _leisure_ and _age_ (-0.03)
* _leisure_ and _edu_ (0.17)
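A minimal R sketch of how such pairs can be located from a correlation matrix; the data frame @df@ and its columns are placeholders standing in for the variables of this report:

bc. df <- data.frame(age = rnorm(100), edu = rnorm(100), leisure = rnorm(100))
cm <- cor(df)                          # correlation matrix
cm[upper.tri(cm, diag = TRUE)] <- NA   # keep each pair once, drop the diagonal
which(cm == max(cm, na.rm = TRUE), arr.ind = TRUE)   # pair with the highest r
which(cm == min(cm, na.rm = TRUE), arr.ind = TRUE)   # pair with the lowest r
which(abs(cm) < 0.2, arr.ind = TRUE)                 # pairs deemed uncorrelated
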
h3(#correlation-matrix-1). Correlation matrix
|_. |_. age |_. edu |_. leisure |
|_. age | | 0.2273 *** | -0.0338 |
|_. edu | 0.2273 *** | | 0.1732 *** |
|_. leisure | -0.0338 | 0.1732 *** | |
The stars represent the "significance levels":http://en.wikipedia.org/wiki/Statistical_significance of the bivariate correlation coefficients: one star for a "p value":http://en.wikipedia.org/wiki/P-value below @0.05@, two stars for one below @0.01@ and three for one below @0.001@.
The plot shows each correlation in two forms: below the "diagonal":http://en.wikipedia.org/wiki/Main_diagonal as a scatterplot, and above it as the coefficient(s); a rough R sketch of this layout follows the plot.
"!plots/Correlation-2.png(Scatterplot matrix)!":plots/Correlation-2-hires.png
h2(#description-2). Description
This template will return the correlation matrix of supplied numerical variables.
h3(#introduction-2). Introduction
"Correlation":http://en.wikipedia.org/wiki/Correlation_and_dependence is one of the most commonly used statistical tool. With the help of that we can get information about a possible "linear relation":http://en.wikipedia.org/wiki/Linear_independence between two variables. According to the definition of the correlation, one can call it also as the standardized "covariance":http://en.wikipedia.org/wiki/Covariance.
The maximum possible value of the correlation (the so-called "correlation coefficient":http://en.wikipedia.org/wiki/Correlation_coefficient) is 1 and the minimum is -1. In the first case there is a perfect positive linear relationship between the two variables, and in the second a perfect negative one, though perfect relationships, especially in the social sciences, are quite rare. If two variables are independent of each other, the correlation between them is 0, but a correlation coefficient of 0 only implies "linear independence":http://en.wikipedia.org/wiki/Correlation_and_dependence#Correlation_and_linearity, not independence in general.
Because extreme values occur seldom, there are rules of thumb for interpreting the coefficients, as in other fields of statistics:
* we call two variables highly correlated if the absolute value of the correlation coefficient between them is higher than 0.7, and
* we call them uncorrelated if it is smaller than 0.2.
Please note that "correlation has nothing to do with causal models":http://en.wikipedia.org/wiki/Correlation_does_not_imply_causation, as it only shows association, not effects.
h3(#variable-description-2). Variable description
_11_ variables with _32_ cases provided.
The highest correlation coefficient (_0.902_) is between _disp_ and _cyl_, and the lowest (_-0.8677_) is between _wt_ and _mpg_. It seems that the strongest association (r=_0.902_) is between _disp_ and _cyl_.
Highly correlated (r < -0.7 or r > 0.7) variables:
* _cyl_ and _mpg_ (-0.85)
* _disp_ and _mpg_ (-0.85)
* _hp_ and _mpg_ (-0.78)
* _wt_ and _mpg_ (-0.87)
* _disp_ and _cyl_ (0.9)
* _hp_ and _cyl_ (0.83)
* _wt_ and _cyl_ (0.78)
* _vs_ and _cyl_ (-0.81)
* _hp_ and _disp_ (0.79)
* _drat_ and _disp_ (-0.71)
* _wt_ and _disp_ (0.89)
* _vs_ and _disp_ (-0.71)
* _qsec_ and _hp_ (-0.71)
* _vs_ and _hp_ (-0.72)
* _carb_ and _hp_ (0.75)
* _wt_ and _drat_ (-0.71)
* _am_ and _drat_ (0.71)
* _vs_ and _qsec_ (0.74)
* _gear_ and _am_ (0.79)
Uncorrelated (-0.2 < r < 0.2) variables (see the R sketch after this list):
* _gear_ and _hp_ (-0.13)
* _qsec_ and _drat_ (0.09)
* _carb_ and _drat_ (-0.09)
* _qsec_ and _wt_ (-0.17)
* _am_ and _vs_ (0.17)
* _carb_ and _am_ (0.06)
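The variable names and coefficients in this section match those of R's built-in @mtcars@ data set (32 cars, 11 variables), so the summaries above and the matrix below can be reproduced with a few lines of base R; this is a sketch, not the template's own code:

bc. cm <- cor(mtcars)          # 11 x 11 matrix of pairwise Pearson correlations
round(cm, 4)                   # the matrix reported below, up to formatting
lt <- cm
lt[upper.tri(lt, diag = TRUE)] <- NA          # each pair once, no diagonal
idx <- which(abs(lt) > 0.7, arr.ind = TRUE)   # the highly correlated pairs
data.frame(var1 = rownames(lt)[idx[, "row"]],
           var2 = colnames(lt)[idx[, "col"]],
           r    = round(lt[idx], 2))
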
h3(#correlation-matrix-2). Correlation matrix
|_. |_. mpg |_. cyl |_. disp |_. hp |_. drat |_. wt |_. qsec |_. vs |_. am |_. gear |_. carb |
|_. mpg | | -0.8522 *** | -0.8476 *** | -0.7762 *** | 0.6812 *** | -0.8677 *** | 0.4187 * | 0.6640 *** | 0.5998 *** | 0.4803 ** | -0.5509 ** |
|_. cyl | -0.8522 *** | | 0.9020 *** | 0.8324 *** | -0.6999 *** | 0.7825 *** | -0.5912 *** | -0.8108 *** | -0.5226 ** | -0.4927 ** | 0.5270 ** |
|_. disp | -0.8476 *** | 0.9020 *** | | 0.7909 *** | -0.7102 *** | 0.8880 *** | -0.4337 * | -0.7104 *** | -0.5912 *** | -0.5556 *** | 0.3950 * |
|_. hp | -0.7762 *** | 0.8324 *** | 0.7909 *** | | -0.4488 ** | 0.6587 *** | -0.7082 *** | -0.7231 *** | -0.2432 | -0.1257 | 0.7498 *** |
|_. drat | 0.6812 *** | -0.6999 *** | -0.7102 *** | -0.4488 ** | | -0.7124 *** | 0.0912 | 0.4403 * | 0.7127 *** | 0.6996 *** | -0.0908 |
|_. wt | -0.8677 *** | 0.7825 *** | 0.8880 *** | 0.6587 *** | -0.7124 *** | | -0.1747 | -0.5549 *** | -0.6925 *** | -0.5833 *** | 0.4276 * |
|_. qsec | 0.4187 * | -0.5912 *** | -0.4337 * | -0.7082 *** | 0.0912 | -0.1747 | | 0.7445 *** | -0.2299 | -0.2127 | -0.6562 *** |
|_. vs | 0.6640 *** | -0.8108 *** | -0.7104 *** | -0.7231 *** | 0.4403 * | -0.5549 *** | 0.7445 *** | | 0.1683 | 0.2060 | -0.5696 *** |
|_. am | 0.5998 *** | -0.5226 ** | -0.5912 *** | -0.2432 | 0.7127 *** | -0.6925 *** | -0.2299 | 0.1683 | | 0.7941 *** | 0.0575 |
|_. gear | 0.4803 ** | -0.4927 ** | -0.5556 *** | -0.1257 | 0.6996 *** | -0.5833 *** | -0.2127 | 0.2060 | 0.7941 *** | | 0.2741 |
|_. carb | -0.5509 ** | 0.5270 ** | 0.3950 * | 0.7498 *** | -0.0908 | 0.4276 * | -0.6562 *** | -0.5696 *** | 0.0575 | 0.2741 | |
The stars represent the "significance levels":http://en.wikipedia.org/wiki/Statistical_significance of the bivariate correlation coefficients: one star for a "p value":http://en.wikipedia.org/wiki/P-value below @0.05@, two stars for one below @0.01@ and three for one below @0.001@.
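To obtain p values for every pair at once, the same test can be looped over all variable pairs; a rough base-R sketch using @mtcars@:

bc. vars <- names(mtcars)
p <- matrix(NA_real_, length(vars), length(vars), dimnames = list(vars, vars))
for (i in seq_along(vars)) {
  for (j in seq_along(vars)) {
    if (i != j) p[i, j] <- cor.test(mtcars[[i]], mtcars[[j]])$p.value
  }
}
# star coding as described above; the diagonal stays NA
ifelse(p < 0.001, "***", ifelse(p < 0.01, "**", ifelse(p < 0.05, "*", "")))
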
The plot shows each correlation in two forms: below the "diagonal":http://en.wikipedia.org/wiki/Main_diagonal as a scatterplot, and above it as the coefficient(s).
"!plots/Correlation-3.png(Scatterplot matrix)!":plots/Correlation-3-hires.png
This report was generated with "R":http://www.r-project.org/ (3.0.1) and "rapport":https://rapporter.github.io/rapport/ (0.51) in _4.769_ sec on the x86_64-unknown-linux-gnu platform.
!images/logo.png!