Principal Component Analysis: Anxiety in Students


Statistical Test

The data analysis employs SPSS to perform principal component analysis (PCA). Specifically, PCA aims to reduce the set of variables to a few principal components that explain most of the variation in SPSS anxiety among students. According to Field (2013), PCA eases the exploration of data by establishing patterns, linear relationships, and the influence of each variable. In essence, PCA eliminates redundant variables and reduces the dimensionality of the data (Jackson, 2015). For a robust PCA, the data ought to meet the assumptions of multiple continuous variables, linearity, sampling adequacy, and the absence of significant outliers. Elliott and Woodward (2015) elucidate that PCA applies an orthogonal transformation to convert highly correlated raw variables into principal components with low correlation coefficients. Thus, PCA enables the identification of the principal components with the greatest influence on the variation in the data.
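To make the orthogonal transformation concrete, the sketch below performs PCA by eigendecomposition of a correlation matrix in Python with NumPy. It is a minimal illustration, not the study's SPSS procedure, and it runs on simulated data because the questionnaire responses are not reproduced here; only the shapes (2571 students, 23 items) mirror the study.

```python
import numpy as np

rng = np.random.default_rng(42)
X = rng.normal(size=(2571, 23))           # stand-in for 2571 students x 23 items
X[:, 1] = 0.8 * X[:, 0] + 0.2 * X[:, 1]   # induce correlation between two items

R = np.corrcoef(X, rowvar=False)          # 23 x 23 correlation matrix
eigvals, eigvecs = np.linalg.eigh(R)      # eigendecomposition of a symmetric matrix
order = np.argsort(eigvals)[::-1]         # sort components by explained variance
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

Z = (X - X.mean(axis=0)) / X.std(axis=0)  # standardize before projecting
scores = Z @ eigvecs                      # orthogonal transformation: uncorrelated component scores
print(eigvals[:4])                        # leading eigenvalues = variance per component
```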

The Basis of Data

The data emanate from a questionnaire developed to collect data on SPSS anxiety among 2571 students. The questionnaire comprises 23 closed-ended questions answered on a five-point Likert scale indicating the strength of agreement, from 1 (strongly disagree) to 5 (strongly agree) (Field, 2013). Since students experience anxiety in the course of learning SPSS, the questionnaire aims to measure the extent of that anxiety. Through interviews with students with and without anxiety, the study formulated 23 questions capable of measuring SPSS anxiety. Examination of the data indicates that they meet the assumptions of PCA: there are 23 continuous-scale variables, the variables have linear relationships, the data contain no significant outliers, and the sample size exceeds 300 (N = 2571). The questionnaire indicates the extent of anxiety in each student learning SPSS and reveals the forms of SPSS anxiety existing among students. In essence, the study uses PCA to reveal the principal components that explain most of the SPSS anxiety among students.
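As a hedged illustration of this initial screen, the Python sketch below checks the completeness and sample-size assumptions. The file name saq.csv is a hypothetical export of the 23 Likert items; the original data accompany Field (2013).

```python
import pandas as pd

df = pd.read_csv("saq.csv")                        # expected: 2571 rows x 23 Likert items

print(df.shape[0] > 300)                           # sample-size rule of thumb from the text
print(df.isna().sum().sum() == 0)                  # no missing responses
print(df.apply(lambda s: s.between(1, 5).all()))   # every answer on the 5-point scale
```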

Research Questions

  • What are the principal components that account for the variation in SPSS anxiety among students?
  • What are the dominant themes in the extracted components?

Hypotheses

  • H0: The 23 questions in the questionnaire are not statistically significant variables in explaining the variation in SPSS anxiety among students.
  • H1: The 23 questions in the questionnaire are statistically significant variables in explaining the variation in SPSS anxiety among students.

Statistical Outcomes

Descriptive Statistics

The exploration of the data using descriptive statistics indicates that there are no missing data because all 2571 students answered every Likert item in the questionnaire. Denis (2016) asserts that descriptive statistics form the basis of data analysis, for they reveal the patterns and trends in the data. The descriptive statistics also show that students lean toward disagreement or neutrality on 19 of the Likert statements, whose mean scores range from 1.62 to 2.92, as demonstrated in Table 1. Students tend to agree with the remaining four statements, whose mean scores range from 3.16 to 3.62.

Table 1.

Descriptive Statistics
| Item | Mean | Std. Deviation | Analysis N |
| --- | --- | --- | --- |
| Statistics makes me cry | 2.37 | .828 | 2571 |
| My friends will think I’m stupid for not being able to cope with SPSS | 1.62 | .851 | 2571 |
| Standard deviations excite me | 2.59 | 1.075 | 2571 |
| I dream that Pearson is attacking me with correlation coefficients | 2.79 | .949 | 2571 |
| I don’t understand statistics | 2.72 | .965 | 2571 |
| I have little experience of computers | 2.23 | 1.122 | 2571 |
| All computers hate me | 2.92 | 1.102 | 2571 |
| I have never been good at mathematics | 2.24 | .873 | 2571 |
| My friends are better at statistics than me | 2.85 | 1.263 | 2571 |
| Computers are useful only for playing games | 2.28 | .877 | 2571 |
| I did badly at mathematics at school | 2.26 | .881 | 2571 |
| People try to tell you that SPSS makes statistics easier to understand but it doesn’t | 3.16 | .916 | 2571 |
| I worry that I will cause irreparable damage because of my incompetence with computers | 2.45 | .949 | 2571 |
| Computers have minds of their own and deliberately go wrong whenever I use them | 2.88 | .999 | 2571 |
| Computers are out to get me | 2.77 | 1.009 | 2571 |
| I weep openly at the mention of central tendency | 2.88 | .916 | 2571 |
| I slip into a coma whenever I see an equation | 2.47 | .884 | 2571 |
| SPSS always crashes when I try to use it | 2.57 | 1.053 | 2571 |
| Everybody looks at me when I use SPSS | 2.29 | 1.101 | 2571 |
| I can’t sleep for thoughts of eigenvectors | 3.62 | 1.036 | 2571 |
| I wake up under my duvet thinking that I am trapped under a normal distribution | 3.17 | .985 | 2571 |
| My friends are better at SPSS than I am | 2.89 | 1.041 | 2571 |
| If I’m good at statistics my friends will think I’m a nerd | 3.43 | 1.044 | 2571 |
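The descriptive statistics in Table 1 can be reproduced outside SPSS; the sketch below does so in Python with pandas, again assuming the hypothetical saq.csv export.

```python
import pandas as pd

df = pd.read_csv("saq.csv")                  # hypothetical export of the 23 items
desc = df.agg(["mean", "std", "count"]).T    # one row per item, as in Table 1
desc.columns = ["Mean", "Std. Deviation", "Analysis N"]
print(desc.round(3))
```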

Assumption Tests

The study employed the correlation matrix to test the assumption of the absence of multicollinearity. Scrutiny of the correlation matrix shows that the 23 Likert items have both negative and positive relationships (Appendix A), ranging from very weak to moderate in strength. Further scrutiny of the significance values shows that all Likert items have statistically significant relationships with one or more other Likert items (p < 0.001). According to Field (2013), Likert items with correlation coefficients greater than 0.9 exhibit multicollinearity. Since all correlation coefficients are less than 0.6, no variables exhibit multicollinearity, and thus no Likert items were eliminated.
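A minimal Python version of this multicollinearity screen, assuming the hypothetical saq.csv export, flags any item pair above Field's 0.9 cutoff and reports the determinant of the correlation matrix for comparison with Appendix A.

```python
import numpy as np
import pandas as pd

df = pd.read_csv("saq.csv")                        # hypothetical export of the 23 items
R = df.corr()
mask = np.triu(np.abs(R.to_numpy()) > 0.9, k=1)    # upper triangle, diagonal excluded
pairs = [(R.index[i], R.columns[j]) for i, j in zip(*np.where(mask))]
print(pairs or "no pair exceeds |r| = 0.9")        # Field's multicollinearity cutoff
print("determinant of R:", np.linalg.det(R))       # compare with Appendix A (.001)
```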

Regarding the assumption of sampling adequacy, the KMO statistic (0.93) indicates that the sample size is adequate for principal component analysis (Table 2). The KMO statistic is considerably greater than the minimum threshold of 0.5 and falls in the range that Field (2013) describes as marvelous. The anti-image matrices (Appendix B) indicate that the individual KMO (MSA) values on the diagonal of the anti-image correlation matrix are all greater than 0.5, which means all Likert items are viable for PCA. As illustrated in Table 2, Bartlett’s test of sphericity is statistically significant (p < 0.001), and thus the null hypothesis that the correlation matrix is an identity matrix is rejected.

Table 2.

KMO and Bartlett’s Test
| Test | Statistic | Value |
| --- | --- | --- |
| Kaiser-Meyer-Olkin Measure of Sampling Adequacy | | .930 |
| Bartlett’s Test of Sphericity | Approx. Chi-Square | 19334.492 |
| | df | 253 |
| | Sig. | .000 |
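Outside SPSS, the same two diagnostics can be computed with the Python factor_analyzer package; the sketch below is a stand-in on the hypothetical saq.csv export, not the study's SPSS output.

```python
import pandas as pd
from factor_analyzer.factor_analyzer import calculate_bartlett_sphericity, calculate_kmo

df = pd.read_csv("saq.csv")                                # hypothetical export of the 23 items
chi2, p = calculate_bartlett_sphericity(df)
kmo_per_item, kmo_total = calculate_kmo(df)

print(f"Bartlett chi-square = {chi2:.3f}, p = {p:.3g}")    # cf. 19334.492, p < .001
print(f"overall KMO = {kmo_total:.3f}")                    # cf. .930
print("items with MSA < .5:", (kmo_per_item < 0.5).sum()) # per-item adequacy check
```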

Factor Extraction

From Table 3, it is apparent that four of the 23 components have eigenvalues greater than one, while the remaining 19 components have eigenvalues less than one.

Table 3.

Total Variance Explained
| Component | Initial Total | Initial % of Variance | Initial Cumulative % | Extraction Total | Extraction % of Variance | Extraction Cumulative % | Rotation Total | Rotation % of Variance | Rotation Cumulative % |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 1 | 7.290 | 31.696 | 31.696 | 7.290 | 31.696 | 31.696 | 3.730 | 16.219 | 16.219 |
| 2 | 1.739 | 7.560 | 39.256 | 1.739 | 7.560 | 39.256 | 3.340 | 14.523 | 30.742 |
| 3 | 1.317 | 5.725 | 44.981 | 1.317 | 5.725 | 44.981 | 2.553 | 11.099 | 41.841 |
| 4 | 1.227 | 5.336 | 50.317 | 1.227 | 5.336 | 50.317 | 1.950 | 8.476 | 50.317 |
| 5 | .988 | 4.295 | 54.612 | | | | | | |
| 6 | .895 | 3.893 | 58.504 | | | | | | |
| 7 | .806 | 3.502 | 62.007 | | | | | | |
| 8 | .783 | 3.404 | 65.410 | | | | | | |
| 9 | .751 | 3.265 | 68.676 | | | | | | |
| 10 | .717 | 3.117 | 71.793 | | | | | | |
| 11 | .684 | 2.972 | 74.765 | | | | | | |
| 12 | .670 | 2.911 | 77.676 | | | | | | |
| 13 | .612 | 2.661 | 80.337 | | | | | | |
| 14 | .578 | 2.512 | 82.849 | | | | | | |
| 15 | .549 | 2.388 | 85.236 | | | | | | |
| 16 | .523 | 2.275 | 87.511 | | | | | | |
| 17 | .508 | 2.210 | 89.721 | | | | | | |
| 18 | .456 | 1.982 | 91.704 | | | | | | |
| 19 | .424 | 1.843 | 93.546 | | | | | | |
| 20 | .408 | 1.773 | 95.319 | | | | | | |
| 21 | .379 | 1.650 | 96.969 | | | | | | |
| 22 | .364 | 1.583 | 98.552 | | | | | | |
| 23 | .333 | 1.448 | 100.000 | | | | | | |

Extraction Method: Principal Component Analysis. (The Extraction and Rotation columns report sums of squared loadings.)

Before extraction, components 1, 2, 3, and 4 had eigenvalues of 7.290, 1.739, 1.317, and 1.227, which accounted for 31.696%, 7.560%, 5.725%, and 5.336% of the variance in that order. After extraction, the four components retained the same eigenvalues and explained the same proportions of variance. After rotation, however, components 1, 2, 3, and 4 had eigenvalues of 3.730, 3.340, 2.553, and 1.950, which accounted for 16.219%, 14.523%, 11.099%, and 8.476% of the variance correspondingly. Overall, the four components cumulatively accounted for 50.317% of the variation in SPSS anxiety among students.
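Kaiser's retention rule and the variance accounting in Table 3 can be checked with a brief Python sketch, again on the hypothetical saq.csv export:

```python
import numpy as np
import pandas as pd

df = pd.read_csv("saq.csv")                      # hypothetical export of the 23 items
eigvals = np.linalg.eigvalsh(df.corr())[::-1]    # eigenvalues in descending order
retain = int((eigvals > 1).sum())                # Kaiser's criterion: 4 in the text
pct = 100 * eigvals / eigvals.sum()              # % of variance per component
print(retain)
print(pct[:4].round(3), pct[:4].sum().round(3))  # cf. 50.317% cumulative
```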

The table in Appendix C shows that the extraction communalities of the items range from 0.335 to 0.739. The mean communality is 0.5032 (11.573/23), which is below the average of 0.7 that Field (2013) suggests makes Kaiser’s criterion reliable on its own; hence the retention decision is also checked against the scree plot. The component matrix (Appendix D) shows the distribution of the items’ loadings across the four selected components. According to Field (2013), loadings greater than 0.4 are substantively important, while loadings less than 0.4 are suppressed.
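The extraction communalities follow directly from the unrotated loadings: for each item, the sum of its squared loadings across the four retained components. A hedged Python sketch, assuming the hypothetical saq.csv export:

```python
import numpy as np
import pandas as pd

df = pd.read_csv("saq.csv")                    # hypothetical export of the 23 items
vals, vecs = np.linalg.eigh(df.corr())
vals, vecs = vals[::-1], vecs[:, ::-1]         # descending order
loadings = vecs[:, :4] * np.sqrt(vals[:4])     # unrotated loadings for 4 components
communalities = (loadings ** 2).sum(axis=1)    # cf. Appendix C, Extraction column
print(communalities.round(3), communalities.mean().round(4))  # mean cf. 0.5032
```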

The scree plot demonstrates that the first four components explain most of the variation in the data. According to McCormick, Salcedo, Peck, Wheeler, and Verlen (2017), the elbow of the scree plot provides a cutoff beyond which additional components contribute little further to the explained variation. Thus, the scree plot confirms the extraction of the four components in PCA.

Figure 1: Scree plot showing the distribution of eigenvalues across components.
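A scree plot equivalent to Figure 1 can be drawn in Python with matplotlib; this is a minimal sketch on the hypothetical saq.csv export, with the Kaiser cutoff marked for reference.

```python
import matplotlib.pyplot as plt
import numpy as np
import pandas as pd

df = pd.read_csv("saq.csv")                    # hypothetical export of the 23 items
eigvals = np.linalg.eigvalsh(df.corr())[::-1]  # eigenvalues in descending order

plt.plot(range(1, len(eigvals) + 1), eigvals, marker="o")
plt.axhline(1.0, linestyle="--")               # Kaiser cutoff for reference
plt.xlabel("Component number")
plt.ylabel("Eigenvalue")
plt.title("Scree plot")
plt.show()
```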

The analysis of the loadings in the rotated component matrix (Appendix E) shows the themes of anxiety within each component. Pallant (2016) holds that the threshold for suppressing loading coefficients determines the emergence of these themes. The type of anxiety that loaded onto the first component relates to the fear of computers, while the form of anxiety that loaded onto the second component relates to the fear of statistics; counting the item that cross-loads onto both, eight questions loaded onto each of the first and second components. The questions that loaded onto the third component relate to the fear of mathematics, whereas the questions that loaded onto the fourth component relate to the fear of social perception.
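The rotated solution in Appendix E uses varimax rotation; a hedged Python equivalent with factor_analyzer's Rotator is sketched below, suppressing loadings below 0.4 as in the reported output.

```python
import numpy as np
import pandas as pd
from factor_analyzer.rotator import Rotator

df = pd.read_csv("saq.csv")                    # hypothetical export of the 23 items
vals, vecs = np.linalg.eigh(df.corr())
vals, vecs = vals[::-1], vecs[:, ::-1]
loadings = vecs[:, :4] * np.sqrt(vals[:4])     # unrotated 4-component loadings

rotated = Rotator(method="varimax").fit_transform(loadings)
table = pd.DataFrame(rotated, index=df.columns)
print(table.where(table.abs() >= 0.4).round(3))  # blanks = suppressed loadings, as in Appendix E
```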

Interpretation

PCA was performed to determine which of the 23 Likert items considerably explain SPSS anxiety among students. The analysis showed that the data met the assumptions of sampling adequacy (KMO = 0.93), the absence of multicollinearity (r < 0.6), and a statistically significant Bartlett’s test of sphericity. Factor extraction established four components with eigenvalues greater than one. Moreover, the four components collectively accounted for 50.317% of the variation in SPSS anxiety among students. The scree plot confirms the extraction of the four components because the major inflection point occurs at the fourth component. The analysis of how each question loaded onto the different components revealed varied themes. Eight questions that loaded onto the first component relate to the fear of computers, while another eight questions that loaded onto the second component relate to the fear of statistics. Whereas three questions that loaded onto the third component relate to the fear of mathematics, the remaining five questions that loaded onto the fourth component relate to the fear of social perception. Thus, in answering the research questions, PCA shows that the fears of computers, statistics, mathematics, and social perception are the principal factors in the questionnaire that influence the occurrence of SPSS anxiety among students. Moreover, the analysis rejects the null hypothesis, for it demonstrates that the 23 questions in the questionnaire are statistically significant variables in explaining the variation in SPSS anxiety among students.

References

Denis, D. (2016). Applied univariate, bivariate, and multivariate statistics. Hoboken, NJ: Wiley.

Elliott, A. C., & Woodward, W. A. (2015). IBM SPSS by example: A practical guide to statistical data. Thousand Oaks, CA: SAGE Publications.

Field, A. (2013). Discovering statistics using IBM SPSS statistics (4th ed.). Los Angeles, CA: SAGE Publications.

Jackson, S. J. (2015). Research methods and statistics: A critical thinking approach (5th ed.). Belmont, CA: Cengage Learning.

McCormick, K., Salcedo, J., Peck, J., Wheeler, A., & Verlen, J. (2017). SPSS statistics for data analysis and visualization. Indianapolis, IN: Wiley.

Pallant, J. (2016). SPSS survival manual: A step by step guide to data analysis using IBM SPSS. Sydney, Australia: Allen & Unwin.

Appendices

Appendix A: Correlation Matrix

Correlation Matrix
Q_01 Q_02 Q_03 Q_04 Q_05 Q_06 Q_07 Q_08 Q_09 Q_10 Q_11 Q_12 Q_13 Q_14 Q_15 Q_16 Q_17 Q_18 Q_19 Q_20 Q_21 Q_22 Q_23
Correlation Q_01 1.000 -.099 -.337 .436 .402 .217 .305 .331 -.092 .214 .357 .345 .355 .338 .246 .499 .371 .347 -.189 .214 .329 -.104 -.004
Q_02 -.099 1.000 .318 -.112 -.119 -.074 -.159 -.050 .315 -.084 -.144 -.195 -.143 -.165 -.165 -.168 -.087 -.164 .203 -.202 -.205 .231 .100
Q_03 -.337 .318 1.000 -.380 -.310 -.227 -.382 -.259 .300 -.193 -.351 -.410 -.318 -.371 -.312 -.419 -.327 -.375 .342 -.325 -.417 .204 .150
Q_04 .436 -.112 -.380 1.000 .401 .278 .409 .349 -.125 .216 .369 .442 .344 .351 .334 .416 .383 .382 -.186 .243 .410 -.098 -.034
Q_05 .402 -.119 -.310 .401 1.000 .257 .339 .269 -.096 .258 .298 .347 .302 .315 .261 .395 .310 .322 -.165 .200 .335 -.133 -.042
Q_06 .217 -.074 -.227 .278 .257 1.000 .514 .223 -.113 .322 .328 .313 .466 .402 .360 .244 .282 .513 -.167 .101 .272 -.165 -.069
Q_07 .305 -.159 -.382 .409 .339 .514 1.000 .297 -.128 .284 .345 .423 .442 .441 .391 .389 .391 .501 -.269 .221 .483 -.168 -.070
Q_08 .331 -.050 -.259 .349 .269 .223 .297 1.000 .016 .159 .629 .252 .314 .281 .300 .321 .590 .280 -.159 .175 .296 -.079 -.050
Q_09 -.092 .315 .300 -.125 -.096 -.113 -.128 .016 1.000 -.134 -.116 -.167 -.167 -.122 -.187 -.189 -.037 -.150 .249 -.159 -.136 .257 .171
Q_10 .214 -.084 -.193 .216 .258 .322 .284 .159 -.134 1.000 .271 .246 .302 .255 .295 .291 .218 .293 -.127 .084 .193 -.131 -.062
Q_11 .357 -.144 -.351 .369 .298 .328 .345 .629 -.116 .271 1.000 .335 .423 .325 .365 .369 .587 .373 -.200 .255 .346 -.162 -.086
Q_12 .345 -.195 -.410 .442 .347 .313 .423 .252 -.167 .246 .335 1.000 .489 .433 .332 .408 .333 .493 -.267 .298 .441 -.167 -.046
Q_13 .355 -.143 -.318 .344 .302 .466 .442 .314 -.167 .302 .423 .489 1.000 .450 .342 .358 .408 .533 -.227 .204 .374 -.195 -.053
Q_14 .338 -.165 -.371 .351 .315 .402 .441 .281 -.122 .255 .325 .433 .450 1.000 .380 .418 .354 .498 -.254 .226 .399 -.170 -.048
Q_15 .246 -.165 -.312 .334 .261 .360 .391 .300 -.187 .295 .365 .332 .342 .380 1.000 .454 .373 .343 -.210 .206 .300 -.168 -.062
Q_16 .499 -.168 -.419 .416 .395 .244 .389 .321 -.189 .291 .369 .408 .358 .418 .454 1.000 .410 .422 -.267 .265 .421 -.156 -.082
Q_17 .371 -.087 -.327 .383 .310 .282 .391 .590 -.037 .218 .587 .333 .408 .354 .373 .410 1.000 .376 -.163 .205 .363 -.126 -.092
Q_18 .347 -.164 -.375 .382 .322 .513 .501 .280 -.150 .293 .373 .493 .533 .498 .343 .422 .376 1.000 -.257 .235 .430 -.160 -.080
Q_19 -.189 .203 .342 -.186 -.165 -.167 -.269 -.159 .249 -.127 -.200 -.267 -.227 -.254 -.210 -.267 -.163 -.257 1.000 -.249 -.275 .234 .122
Q_20 .214 -.202 -.325 .243 .200 .101 .221 .175 -.159 .084 .255 .298 .204 .226 .206 .265 .205 .235 -.249 1.000 .468 -.100 -.035
Q_21 .329 -.205 -.417 .410 .335 .272 .483 .296 -.136 .193 .346 .441 .374 .399 .300 .421 .363 .430 -.275 .468 1.000 -.129 -.068
Q_22 -.104 .231 .204 -.098 -.133 -.165 -.168 -.079 .257 -.131 -.162 -.167 -.195 -.170 -.168 -.156 -.126 -.160 .234 -.100 -.129 1.000 .230
Q_23 -.004 .100 .150 -.034 -.042 -.069 -.070 -.050 .171 -.062 -.086 -.046 -.053 -.048 -.062 -.082 -.092 -.080 .122 -.035 -.068 .230 1.000
Sig. (1-tailed) Q_01 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .410
Q_02 .000 .000 .000 .000 .000 .000 .006 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000
Q_03 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000
Q_04 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .043
Q_05 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .017
Q_06 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000
Q_07 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000
Q_08 .000 .006 .000 .000 .000 .000 .000 .213 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .005
Q_09 .000 .000 .000 .000 .000 .000 .000 .213 .000 .000 .000 .000 .000 .000 .000 .031 .000 .000 .000 .000 .000 .000
Q_10 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .001
Q_11 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000
Q_12 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .009
Q_13 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .004
Q_14 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .007
Q_15 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .001
Q_16 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000
Q_17 .000 .000 .000 .000 .000 .000 .000 .000 .031 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000
Q_18 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000
Q_19 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000
Q_20 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .039
Q_21 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000
Q_22 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000 .000
Q_23 .410 .000 .000 .043 .017 .000 .000 .005 .000 .001 .000 .009 .004 .007 .001 .000 .000 .000 .000 .039 .000 .000
a. Determinant = .001

Appendix B: Anti-Image Matrices

Anti-image Matrices
Q_01 Q_02 Q_03 Q_04 Q_05 Q_06 Q_07 Q_08 Q_09 Q_10 Q_11 Q_12 Q_13 Q_14 Q_15 Q_16 Q_17 Q_18 Q_19 Q_20 Q_21 Q_22 Q_23
Anti-image Covariance Q_01 .627 -.014 .033 -.103 -.104 .012 .013 -.028 -.011 -.009 -.022 -.004 -.050 -.025 .057 -.153 -.027 -.013 .009 -.011 .004 .001 -.045
Q_02 -.014 .812 -.109 -.029 .008 -.036 .011 -.021 -.153 -.010 .023 .021 -.005 .016 .027 -.008 -.018 .012 -.023 .045 .028 -.100 -.002
Q_03 .033 -.109 .602 .051 .024 -.025 .040 -.004 -.097 -.011 .034 .051 -.018 .042 .005 .046 .019 .021 -.083 .052 .040 -.005 -.057
Q_04 -.103 -.029 .051 .615 -.088 -.004 -.049 -.042 .020 .004 -.012 -.092 .013 -.003 -.039 -.021 -.020 -.014 -.024 -.003 -.050 -.023 -.013
Q_05 -.104 .008 .024 -.088 .709 -.022 -.027 -.016 -.015 -.070 -1.887E-005 -.037 .003 -.017 .010 -.059 -.011 .001 -.013 -.008 -.029 .027 -.004
Q_06 .012 -.036 -.025 -.004 -.022 .573 -.152 .013 .007 -.079 -.043 .026 -.092 -.059 -.079 .057 .022 -.132 -.010 .033 .022 .028 .013
Q_07 .013 .011 .040 -.049 -.027 -.152 .530 -.008 -.019 -.022 .023 -.024 -.021 -.031 -.045 -.011 -.042 -.045 .044 .030 -.112 .008 -.006
Q_08 -.028 -.021 -.004 -.042 -.016 .013 -.008 .510 -.062 .033 -.202 .018 .001 -.013 -.019 -.003 -.150 .012 .030 .013 -.011 -.015 .002
Q_09 -.011 -.153 -.097 .020 -.015 .007 -.019 -.062 .780 .034 .022 -.002 .040 -.030 .049 .034 -.043 -.003 -.087 .029 -.020 -.101 -.078
Q_10 -.009 -.010 -.011 .004 -.070 -.079 -.022 .033 .034 .803 -.057 -.013 -.040 -.008 -.068 -.054 .008 -.017 -.007 .033 .011 .015 .013
Q_11 -.022 .023 .034 -.012 -1.887E-005 -.043 .023 -.202 .022 -.057 .470 -.003 -.050 .019 -.029 .002 -.112 -.011 -.004 -.048 -.002 .021 .006
Q_12 -.004 .021 .051 -.092 -.037 .026 -.024 .018 -.002 -.013 -.003 .576 -.111 -.048 -.016 -.022 .003 -.079 .027 -.042 -.044 .012 -.020
Q_13 -.050 -.005 -.018 .013 .003 -.092 -.021 .001 .040 -.040 -.050 -.111 .549 -.057 -.005 .014 -.048 -.090 .006 .011 -.018 .035 -.021
Q_14 -.025 .016 .042 -.003 -.017 -.059 -.031 -.013 -.030 -.008 .019 -.048 -.057 .607 -.059 -.046 -.016 -.081 .030 .001 -.036 .021 -.019
Q_15 .057 .027 .005 -.039 .010 -.079 -.045 -.019 .049 -.068 -.029 -.016 -.005 -.059 .656 -.138 -.049 .022 .007 -.026 .021 .018 -.018
Q_16 -.153 -.008 .046 -.021 -.059 .057 -.011 -.003 .034 -.054 .002 -.022 .014 -.046 -.138 .537 -.039 -.047 .030 -.003 -.046 -.002 .016
Q_17 -.027 -.018 .019 -.020 -.011 .022 -.042 -.150 -.043 .008 -.112 .003 -.048 -.016 -.049 -.039 .506 -.017 -.030 .009 -.021 .007 .037
Q_18 -.013 .012 .021 -.014 .001 -.132 -.045 .012 -.003 -.017 -.011 -.079 -.090 -.081 .022 -.047 -.017 .508 .019 -.002 -.038 -.016 .015
Q_19 .009 -.023 -.083 -.024 -.013 -.010 .044 .030 -.087 -.007 -.004 .027 .006 .030 .007 .030 -.030 .019 .791 .069 .021 -.093 -.033
Q_20 -.011 .045 .052 -.003 -.008 .033 .030 .013 .029 .033 -.048 -.042 .011 .001 -.026 -.003 .009 -.002 .069 .730 -.204 -.009 -.023
Q_21 .004 .028 .040 -.050 -.029 .022 -.112 -.011 -.020 .011 -.002 -.044 -.018 -.036 .021 -.046 -.021 -.038 .021 -.204 .546 -.016 .009
Q_22 .001 -.100 -.005 -.023 .027 .028 .008 -.015 -.101 .015 .021 .012 .035 .021 .018 -.002 .007 -.016 -.093 -.009 -.016 .833 -.154
Q_23 -.045 -.002 -.057 -.013 -.004 .013 -.006 .002 -.078 .013 .006 -.020 -.021 -.019 -.018 .016 .037 .015 -.033 -.023 .009 -.154 .914
Anti-image Correlation Q_01 .930a -.020 .053 -.167 -.156 .020 .023 -.049 -.016 -.012 -.041 -.007 -.085 -.040 .089 -.264 -.047 -.023 .012 -.016 .006 .001 -.059
Q_02 -.020 .875a -.157 -.041 .010 -.053 .016 -.033 -.193 -.012 .038 .031 -.008 .023 .037 -.011 -.029 .018 -.029 .059 .041 -.121 -.002
Q_03 .053 -.157 .951a .084 .037 -.042 .072 -.007 -.142 -.016 .064 .087 -.032 .069 .008 .081 .035 .039 -.121 .078 .070 -.007 -.076
Q_04 -.167 -.041 .084 .955a -.134 -.007 -.087 -.075 .030 .006 -.022 -.154 .023 -.004 -.062 -.036 -.035 -.025 -.034 -.004 -.086 -.033 -.017
Q_05 -.156 .010 .037 -.134 .960a -.035 -.044 -.027 -.020 -.093 -3.269E-005 -.058 .004 -.026 .014 -.096 -.018 .002 -.018 -.011 -.046 .035 -.005
Q_06 .020 -.053 -.042 -.007 -.035 .891a -.275 .024 .011 -.116 -.084 .045 -.164 -.099 -.128 .102 .041 -.244 -.015 .051 .039 .040 .018
Q_07 .023 .016 .072 -.087 -.044 -.275 .942a -.015 -.030 -.033 .045 -.043 -.039 -.054 -.077 -.020 -.080 -.087 .068 .048 -.208 .013 -.008
Q_08 -.049 -.033 -.007 -.075 -.027 .024 -.015 .871a -.099 .051 -.412 .033 .002 -.023 -.033 -.006 -.296 .024 .047 .021 -.020 -.023 .002
Q_09 -.016 -.193 -.142 .030 -.020 .011 -.030 -.099 .834a .043 .037 -.003 .061 -.043 .068 .052 -.068 -.006 -.111 .038 -.031 -.126 -.092
Q_10 -.012 -.012 -.016 .006 -.093 -.116 -.033 .051 .043 .949a -.092 -.019 -.060 -.012 -.093 -.082 .012 -.026 -.009 .043 .017 .019 .015
Q_11 -.041 .038 .064 -.022 -3.269E-005 -.084 .045 -.412 .037 -.092 .906a -.005 -.099 .035 -.052 .005 -.230 -.022 -.006 -.082 -.005 .034 .010
Q_12 -.007 .031 .087 -.154 -.058 .045 -.043 .033 -.003 -.019 -.005 .955a -.198 -.082 -.026 -.040 .006 -.146 .040 -.065 -.079 .018 -.028
Q_13 -.085 -.008 -.032 .023 .004 -.164 -.039 .002 .061 -.060 -.099 -.198 .948a -.099 -.008 .026 -.090 -.170 .009 .018 -.033 .052 -.030
Q_14 -.040 .023 .069 -.004 -.026 -.099 -.054 -.023 -.043 -.012 .035 -.082 -.099 .967a -.093 -.081 -.028 -.145 .044 .001 -.063 .029 -.026
Q_15 .089 .037 .008 -.062 .014 -.128 -.077 -.033 .068 -.093 -.052 -.026 -.008 -.093 .940a -.232 -.085 .038 .009 -.037 .035 .025 -.024
Q_16 -.264 -.011 .081 -.036 -.096 .102 -.020 -.006 .052 -.082 .005 -.040 .026 -.081 -.232 .934a -.076 -.090 .047 -.005 -.085 -.003 .023
Q_17 -.047 -.029 .035 -.035 -.018 .041 -.080 -.296 -.068 .012 -.230 .006 -.090 -.028 -.085 -.076 .931a -.034 -.047 .015 -.041 .010 .055
Q_18 -.023 .018 .039 -.025 .002 -.244 -.087 .024 -.006 -.026 -.022 -.146 -.170 -.145 .038 -.090 -.034 .948a .030 -.003 -.072 -.024 .023
Q_19 .012 -.029 -.121 -.034 -.018 -.015 .068 .047 -.111 -.009 -.006 .040 .009 .044 .009 .047 -.047 .030 .941a .091 .031 -.115 -.038
Q_20 -.016 .059 .078 -.004 -.011 .051 .048 .021 .038 .043 -.082 -.065 .018 .001 -.037 -.005 .015 -.003 .091 .889a -.323 -.011 -.028
Q_21 .006 .041 .070 -.086 -.046 .039 -.208 -.020 -.031 .017 -.005 -.079 -.033 -.063 .035 -.085 -.041 -.072 .031 -.323 .929a -.024 .013
Q_22 .001 -.121 -.007 -.033 .035 .040 .013 -.023 -.126 .019 .034 .018 .052 .029 .025 -.003 .010 -.024 -.115 -.011 -.024 .878a -.176
Q_23 -.059 -.002 -.076 -.017 -.005 .018 -.008 .002 -.092 .015 .010 -.028 -.030 -.026 -.024 .023 .055 .023 -.038 -.028 .013 -.176 .766a
a. Measures of Sampling Adequacy (MSA)

Appendix C: Communalities

Communalities
| Item | Initial | Extraction |
| --- | --- | --- |
| Q_01 | 1.000 | .435 |
| Q_02 | 1.000 | .414 |
| Q_03 | 1.000 | .530 |
| Q_04 | 1.000 | .469 |
| Q_05 | 1.000 | .343 |
| Q_06 | 1.000 | .654 |
| Q_07 | 1.000 | .545 |
| Q_08 | 1.000 | .739 |
| Q_09 | 1.000 | .484 |
| Q_10 | 1.000 | .335 |
| Q_11 | 1.000 | .690 |
| Q_12 | 1.000 | .513 |
| Q_13 | 1.000 | .536 |
| Q_14 | 1.000 | .488 |
| Q_15 | 1.000 | .378 |
| Q_16 | 1.000 | .487 |
| Q_17 | 1.000 | .683 |
| Q_18 | 1.000 | .597 |
| Q_19 | 1.000 | .343 |
| Q_20 | 1.000 | .484 |
| Q_21 | 1.000 | .550 |
| Q_22 | 1.000 | .464 |
| Q_23 | 1.000 | .412 |

Extraction Method: Principal Component Analysis.

Appendix D: Component Matrix

Component Matrix
Component
1 2 3 4
Q_18 .701
Q_07 .685
Q_16 .679
Q_13 .673
Q_12 .669
Q_21 .658
Q_14 .656
Q_11 .652 -.400
Q_17 .643
Q_04 .634
Q_03 -.629
Q_15 .593
Q_01 .586
Q_05 .556
Q_08 .549 .401 -.417
Q_10 .437
Q_20 .436 -.404
Q_19 -.427
Q_09 .627
Q_02 .548
Q_22 .465
Q_06 .562 .571
Q_23 .507
Extraction Method: Principal Component Analysis.
a. 4 components extracted.

Appendix E: Rotated Component Matrix

Rotated Component Matrix

| Item | 1 | 2 | 3 | 4 |
| --- | --- | --- | --- | --- |
| I have little experience of computers | .800 | | | |
| SPSS always crashes when I try to use it | .684 | | | |
| I worry that I will cause irreparable damage because of my incompetence with computers | .647 | | | |
| All computers hate me | .638 | | | |
| Computers have minds of their own and deliberately go wrong whenever I use them | .579 | | | |
| Computers are useful only for playing games | .550 | | | |
| Computers are out to get me | .459 | | | |
| I can’t sleep for thoughts of eigenvectors | | .677 | | |
| I wake up under my duvet thinking that I am trapped under a normal distribution | | .661 | | |
| Standard deviations excite me | | -.567 | | |
| People try to tell you that SPSS makes statistics easier to understand but it doesn’t | .473 | .523 | | |
| I dream that Pearson is attacking me with correlation coefficients | | .516 | | |
| I weep openly at the mention of central tendency | | .514 | | |
| Statistics makes me cry | | .496 | | |
| I don’t understand statistics | | .429 | | |
| I have never been good at mathematics | | | .833 | |
| I slip into a coma whenever I see an equation | | | .747 | |
| I did badly at mathematics at school | | | .747 | |
| My friends are better at statistics than me | | | | .648 |
| My friends are better at SPSS than I am | | | | .645 |
| If I’m good at statistics my friends will think I’m a nerd | | | | .586 |
| My friends will think I’m stupid for not being able to cope with SPSS | | | | .543 |
| Everybody looks at me when I use SPSS | | | | .428 |

Extraction Method: Principal Component Analysis.
Rotation Method: Varimax with Kaiser Normalization.
Rotation converged in 8 iterations.