Activity Theory & TPCK Theory
Technological Pedagogical Content Knowledge (TPACK) Framework

What is TPACK?
mbalenhle's insight:
The article explains what the TPACK framework is. It stipulates that certain tools work well in providing guidance for a subject, and that the application of technology has to be considered in relation to content and pedagogy. This would also help teachers respond to the dynamic cultures in the classroom.
Prudence Matsega's curator insight, May 19, 5:07 AM
The TPACK theory illustrates the importance of fusing pedagogical content knowledge (PCK) with the technological aspect. In the times we are teaching in, it is not enough to have the skills and knowledge to teach a subject; you also need to use the right tools that will enhance the learning process for the learners. Everything in the world is moving further into technology, so it only makes sense for the education sector to increase its knowledge and expertise in technology as well. That sends a strong message to teachers to familiarize themselves with the use of technology.
How Education And Technology Are Evolving Together

What will education and learning look like ten years from now? This question was originally answered on Quora by Mike Silagadze.
mbalenhle's insight:
This article reminds us of the learning that also takes place outside the classroom. It motivates using technology in the classroom in innovative ways that build productive relationships with learners, and it calls for balance in preparing people for the learning environment, where technical skills, general critical thinking, and communication skills function together.
Prudence Matsega's curator insight, May 19, 5:28 AM
Technology plays an important role in the world we live in. We are slowly moving away from huge textbooks and multiple exercise books, and we need to reach a point where education is paperless; paper frustrates both the educator and the learner. The use of gadgets makes it possible to manage lessons better, and in my opinion the use of technology saves time.
When knowing is believing: A multi-trait analysis of self-reported TPCK - Krauskopf - Journal of Computer Assisted Learning - Wiley Online Library

Abstract

In an effort to understand teachers' technology use, recent scholarship has explored the idea of technological pedagogical content knowledge (TPCK or TPACK). Many studies have used self-reports to measure this knowledge (SR TPCK). Several studies have examined the construct validity of these assessments by analysing the internal relationships of the knowledge domains, but little attention has been paid to how SR TPCK relates to external criteria. We tackled this question of discriminant validity by reanalysing 2 data sets. We used correlation and multiple regression analyses to explore whether conceptually related constructs explain any variance in participants' SR TPCK. In Study 1, we applied this strategy to German pre-service teachers using technology use, attitudinal variables, and objective measures of teachers' knowledge of technology and pedagogy as external criteria. In Study 2, we examined measures of technology knowledge, experience, and pro-technology beliefs for in-service teachers in the United States. Across both studies, a sizeable amount of the variance in SR TPCK is explained by teachers' prior technology use and pro-technology attitudes. In contrast, fact-based tests of technology and pedagogy are distinct from SR TPCK. We discuss implications for these findings and argue that researchers should gather complementary measures in concert.

Lay Description

What is already known about this topic:
- Technological pedagogical content knowledge (TPCK) is theorized to be an important aspect of teachers' use of technology.
- TPCK self-reports seem intuitively helpful, and they are an easy way to quantify teachers' perceptions.
- The validity of TPCK self-reports as knowledge measures is unproven.

What this paper adds:
- Results of 2 studies show that TPCK self-reports relate to beliefs and experience more than to knowledge of technology facts.
- Pre-service teachers' self-reports of their knowledge may be influenced by their enthusiasm for teaching with technology.
- In-service teachers' beliefs about the value of technology appear to contribute meaningfully to TPCK self-reports.

Implications for practice and/or policy:
- Measures of experience and beliefs should be included when studying teachers' knowledge of technology.
- TPCK self-reports may be used to complement factual knowledge measures as indicators of readiness to teach with technology.

1 INTRODUCTION

Teachers play a key role in deciding what tools are used in the classroom. To better understand the conditions under which emerging digital technologies are adopted, the technological pedagogical content knowledge (TPCK or TPACK) framework [2] has focused scholarly attention on the types of knowledge teachers need in order to adopt new technologies (see Angeli & Valanides, 2009; Cox & Graham, 2009; Koehler & Mishra, 2008; Krauskopf, Zahn, & Hesse, 2015; Mishra & Koehler, 2006). Over the last decade, a number of researchers have tackled the task of measuring these forms of knowledge. In a review of 141 studies, Koehler, Shin, and Mishra (2012) identified five major approaches. Among these, self-report measures accounted for a substantial portion of the research. Self-report measures complement other approaches by allowing researchers to quickly and economically answer the question "How much did participants change?" in response to interventions. It becomes critical that researchers understand the nature of the construct underlying these survey instruments, to understand the true impact of those interventions.
1.1 Validity of self-report measures

An early self-report TPCK (SR TPCK) instrument developed by Schmidt et al. (2009) has seen wide adoption and also scrutiny regarding its validity. Researchers have examined several types of validity, with mixed results. Studies have addressed content validity through the use of conceptual analysis by experts (e.g., Cox & Graham, 2009; Yeh, Hsu, Wu, Hwang, & Lin, 2014). These studies generally show strong alignment between the questions and scholars' understanding of the theory. Construct validity has been investigated by means of testing the internal consistency (e.g., Schmidt et al., 2009) and factor structure (e.g., Archambault & Barnett, 2010; Koh, Chai, & Tsai, 2010) of questionnaires. Although internal consistency is typically high, factor analyses have often shown that survey items load on factors that do not clearly align with their intended domain in the TPACK conceptual framework. However, those of us using these measures in studies know little about the criterion validity of SR TPCK, that is, how it aligns to other conceptually related constructs. In their review of the TPACK literature, Voogt, Fisser, Pareja Roblin, Tondeur, and van Braak (2013) concluded that "teacher knowledge and beliefs are closely related, [so we] need further research focused on the complex relationship between TPACK (teacher knowledge), teacher practical knowledge and teacher beliefs" (p. 12). This underscores the need to examine whether variance in teachers' SR TPCK is explained by conceptually related constructs.

1.2 Convergent and divergent validity

Convergent validity is the degree to which constructs that are related in theory also correlate when measured. Correlations should be higher with theoretically related constructs than with constructs that are dissimilar (divergent validity). Further, Campbell and Fiske (1959) argue for examining correlations of related constructs across different methods of data collection. We suggest that TPACK research needs to start following this rationale of multi-trait and multi-method approaches. A recent study by Drummond and Sweeney (2017) attempted a mono-trait, multi-method analysis of TPCK-Deep (Kabakci Yurdakul et al., 2012) by comparing self-reported TPCK-Deep scores with a battery of true/false knowledge questions. Their study provided initial evidence that a SR TPCK measure shows only small correlations with a fact-based knowledge measure. Theirs is the only study we have found in peer-reviewed journals that links an SR TPCK measure with a fact-based TPCK measure (mono-trait, multi-method). We found two studies investigating correlations between SR TPCK and technology-related beliefs (multi-trait, mono-method). Both studies focused on comparing the self-reports of the various TPACK subdomains to beliefs. Abbitt (2011) reports correlations and regression analyses between SR TPCK and self-efficacy beliefs. He found that SR TPCK uniquely accounted for approximately 63% of the variance in self-efficacy beliefs in the pretest. However, SR TPCK no longer accounted for any variance in self-efficacy at the post-test. Messina and Tabone (2013) found positive correlations between SR TPCK, self-rated computer software proficiency, and positive beliefs about using technology. Unfortunately, they did not identify the unique relationships of each variable to SR TPCK, controlling for the others.
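To make the multi-trait, multi-method logic concrete, here is a minimal sketch in Python with synthetic data; the variable names, sample size, and effect sizes are invented for illustration and are not taken from any study cited above. It simply compares the correlation of a self-reported knowledge score with a fact-based test of the same trait against its correlation with a self-report of a different trait.

```python
import numpy as np
import pandas as pd

# Synthetic scores for 100 teachers; names and effect sizes are invented.
rng = np.random.default_rng(0)
n = 100
beliefs = rng.normal(4.0, 0.8, n)          # self-reported pro-technology beliefs
tk_test = rng.normal(7.0, 2.0, n)          # fact-based technology knowledge test
sr_tpck = 0.6 * beliefs + 0.1 * tk_test + rng.normal(0, 0.5, n)

df = pd.DataFrame({"sr_tpck": sr_tpck, "tk_test": tk_test, "beliefs": beliefs})

# Campbell & Fiske's rationale: if SR TPCK really measured knowledge, it should
# correlate more strongly with the knowledge test (same trait, different method)
# than with the belief scale (different trait, same self-report method).
print(df.corr().round(2))
```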
These preliminary but inconclusive findings suggest that the SR TPCK measure operates in concert with beliefs, self-efficacy, and personal experience with certain kinds of software. Clearly, research is needed to further explore the convergent and discriminant validity of SR TPCK instruments to understand what scores imply for interpreting results. This need is acute given the frequency with which these instruments are used in current research.

1.3 The current studies

We sought to contribute to the convergent and divergent validity of a popular SR TPCK measure (Schmidt et al., 2009). We took an approach to fill in some of the gaps in a communal multi-trait, multi-method (Campbell & Fiske, 1959) matrix. We examined SR TPCK simultaneously with measures of knowledge, beliefs, and experience (multi-trait). As a first step toward this goal, we reanalysed two data sets, both of which included similar SR TPCK items. We employed correlation and multiple regression analyses to explore to what extent conceptually related constructs explained the variance in participants' SR TPCK. We compared these results across the two samples. Our goal was to determine how much of the variance in SR TPCK is explained by other constructs. We wanted to know if SR TPCK is conceptually closer to other knowledge measures (as would be suggested by an integrative interpretation of the TPACK framework, see Angeli & Valanides, 2009) or to non-knowledge constructs related to teaching with technology (beliefs, attitudes, and experiences). We investigated two different groups of teachers. Study 1 focused on German pre-service teachers, and Study 2 focused on accomplished teachers in the United States. We aimed to strengthen our findings by uncovering patterns across two different groups. We included data collected in different phases of teacher training and in distinct cultural contexts. Both studies included the SR TPCK measure developed by Schmidt et al. (2009). Both studies included measures of beliefs or attitudes about technology, and personal as well as professional experience with technology in the classroom. The two studies handled the variability of content and technology in complementary ways. Study 1 specified a sample technology (a collaborative video tool), while leaving the subject area of application undefined. Study 2 left the technology unspecified but focused the SR TPCK questions on the particular subject(s) taught by the respondent. The similar result patterns that appeared across these two groups provide key insights into what SR TPCK measures measure. This contributes to a clearer understanding of what conclusions researchers can draw when using SR TPCK instruments.

2 STUDY 1

Teachers' pedagogical knowledge (PK), technological knowledge (TK), pedagogical beliefs (cf. Law, 2008), and attitudes towards technology (enthusiasm) were assessed concurrently with the SR TPCK scale. The measures were based on the general model of aspects of teacher competence by Baumert and colleagues (Kunter & Baumert, 2011; Kunter et al., 2007).

2.1 Participants

Pre-service teachers of different subject areas who had completed their fourth semester at the University of Tuebingen in Germany were recruited via the mailing list of the University. The final sample consisted of 82 pre-service teachers (mean [M] = 24.85 years old, standard deviation [SD] = 4.36; 80% female). All were enrolled in the university's teacher training to become secondary educators (academic track high school).
Ninety-six per cent had already completed teaching internships where they themselves had taught (M = 34.6 hr/week, SD = 2.7). In Germany, teachers usually teach two subjects; in this sample, 53% of participants studied German language arts, 34% sciences, 18% history, 11% mathematics, and 59% other subjects.

2.2 Procedure

Participants responded to three online questionnaires about 2 weeks apart. Most measures reported here are from the first questionnaire. The TK assessment was administered as part of the second questionnaire (for more details, see Krauskopf, 2012). Participants who completed the full study received 25€ and general information on the study's results.

2.3 Measures

2.3.1 SR TPCK

The TPCK subscale of the self-assessment instrument developed by Schmidt et al. (2009) was translated into German (translation by corresponding author). Because this study was directed to participants teaching a range of subjects, the general terms "in my classroom" or "lesson" were used. For example, "I can teach lessons that appropriately combine content, technologies and teaching approaches" (back translation from German). Overall, the scale showed good internal consistency, Cronbach's α = .88 (see Table 1 for descriptive statistics).

Table 1. Correlations and descriptive statistics (Study 1)

| Variable | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | M | SD |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 1 SR TPCK | .03 | −.15 | .17 | .38*** | .00 | −.29** | .16 | .39*** | .03 | .59*** | 4.31 | 0.92 |
| Personal | | | | | | | | | | | | |
| 2 Gender (a) | | −.29** | .04 | .14 | .00 | −.23* | −.03 | .23* | .02 | .00 | 1.79 | 0.41 |
| 3 Age | | | −.05 | −.67*** | .18 | .86*** | −.04 | −.27* | −.14 | −.26* | 24.85 | 4.35 |
| 4 High school grade average (b) | | | | .12 | .08 | −.16 | −.03 | .01 | .32** | .06 | 2.17 | 0.51 |
| 5 Personal computer use frequency | | | | | −.12 | −.80*** | .19 | .21 | .22* | .34** | 4.90 | 0.51 |
| 6 TK (c) | | | | | | .13 | .21 | .02 | −.05 | .05 | 7.12 | 2.22 |
| Teaching | | | | | | | | | | | | |
| 7 Time enrolled in teacher ed. program | | | | | | | −.15 | −.31** | −.24* | −.34** | 9.54 | 6.63 |
| 8 PK (d) | | | | | | | | .12 | .01 | .08 | 9.05 | 2.38 |
| Beliefs and attitudes | | | | | | | | | | | | |
| 9 TPB constructivist orientation | | | | | | | | | −.49*** | .29** | 4.71 | 1.19 |
| 10 TPB explicit instruction orientation | | | | | | | | | | −.02 | 3.47 | 1.03 |
| 11 Enthusiasm for teaching with technology | | | | | | | | | | | 4.67 | 0.95 |

Note. SR TPCK = self-report technological pedagogical content knowledge; TK = technological knowledge; PK = pedagogical knowledge; TPB = technological pedagogical beliefs; M = mean; SD = standard deviation. (a) Gender was dummy coded: male = 1, female = 2. (b) 1 = highest grade. (c) N = 74. Theoretical maximum = 10. (d) Theoretical maximum = 18. * p < .05. ** p < .01. *** p < .001.

2.3.2 Personal variables

Participants provided information on their age, gender, and high school grade point average.

Personal computer use frequency. Personal computer use frequency was assessed by self-reports and rated on one item using a scale from 1 = less than once a week to 5 = daily.

Specific technological knowledge. A subset of participants (n = 74) also completed an online tutorial. They were introduced to a sample video technology for collaborative learning, WebDIVER (http://diver.stanford.edu/what.html). The written introduction to WebDIVER covered 10 technological functions, such as playing and pausing a video, selecting and cutting out still images, zooming in on details before cutting out, or commenting on own and other users' cut outs (for more details, see Krauskopf, Zahn, Hesse, & Pea, 2014). After a short overview of WebDIVER's graphical user interface and its general features, each of the 10 technological functions was introduced in more detail. After reading the instructions, participants explored WebDIVER individually without any specific instruction. They were asked to recall as many of the 10 technological functions as possible.
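The internal-consistency values reported for these scales (Cronbach's α) can be computed directly from an item-by-respondent score matrix. The sketch below uses synthetic ratings rather than the study's data; the same function applies to any of the scales described in this section, provided reverse-scored items are flipped first.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) matrix of item scores."""
    k = items.shape[1]
    sum_item_var = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)        # variance of the summed scale
    return k / (k - 1) * (1 - sum_item_var / total_var)

# Illustrative ratings: 82 respondents, 5 items on a 1-5 scale (not study data).
rng = np.random.default_rng(1)
latent = rng.normal(4.0, 0.8, size=(82, 1))
ratings = np.clip(np.round(latent + rng.normal(0, 0.5, size=(82, 5))), 1, 5)
print(round(cronbach_alpha(ratings), 2))
```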
The number of functions recalled was summed up into a TK score indicating their basic TK of WebDIVER.

2.3.3 Teaching variables

Pedagogical experience. We used the number of semesters participants had been enrolled in the teacher education program as a measure of pedagogical experience. All participants had been studying for at least five semesters.

Pedagogical knowledge. Declarative aspects of participants' general PK were assessed using 18 items from the German teacher training guidelines (Schulte, Bögeholz, & Watermann, 2008) and the Educational Testing Service (2006) Praxis Test. These items are available online but generally unknown to German students. They show internal consistency (Cronbach's α = .70). The multiple choice items had one correct answer each. As an example: Which of the following would be the best indication to a teacher that students are beginning to think critically about science?
(a) They talk about earthquakes, space probes, and science-related information in the news.
(b) They begin to read more books and articles about science on their own.
(c) They successfully plan and carry out simple experiments to test questions raised in classroom discussions. (correct answer)
(d) They correctly answer the teacher's questions about the procedures used after observing science experiments being done.
A PK indicator was computed from the overall sum (for more on these measures, see Krauskopf, Zahn, & Hesse, 2012).

2.3.4 Beliefs and attitudes

Technological pedagogical (TP) beliefs about video. These beliefs were assessed in order to be able to differentiate between knowledge and more global pedagogical assumptions, here about using a specific technology, namely, video. The measure was adapted from Souvignier and Mokhlesgerami (2005), and participants rated items on two subscales: constructivist orientation items (two items, e.g., "Students should be allowed to explore their own ways of dealing with video material before you show them how to approach it") and explicit instruction orientation items (three items, e.g., "Students learn how to deal with video material most effectively when you provide them with instructions on how to go about working such material"). Responses were given on a 4-point Likert scale from 1 = completely disagree to 4 = completely agree. Both scales showed sufficient internal consistencies (Cronbach's α ≥ .75).

Enthusiasm for technology in teaching. Pre-service teachers' enthusiasm for the use of technology was derived from the average of two items based on Kunter et al. (2008): "I myself am enthusiastic about the possibilities of new media" and "I am enjoying it a lot to use new media in my teaching." Items were rated on a 4-point Likert scale from 1 = completely disagree to 4 = completely agree and showed sufficient internal consistency (Cronbach's α = .76).

2.4 Results

2.4.1 Correlations

In order to determine the construct validity of the SR TPCK measure in the German sample, we first computed zero-order correlations between SR TPCK and the demographic information and the other conceptually relevant constructs (see Table 1).

Personal variables. There was a significant positive correlation of SR TPCK and the reported personal computer use frequency, r(82) = .38, p < .001. Participants reporting a higher frequency of personal computer use also reported higher SR TPCK. There were no significant correlations between SR TPCK and age, r(82) = −.15, p = .19, or gender, r(82) = .03, p = .78.
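The zero-order correlations reported here are ordinary Pearson coefficients with two-sided p-values; a small illustrative computation with made-up stand-in data (not the study's variables) might look as follows.

```python
import numpy as np
from scipy import stats

# Made-up stand-ins for two of the self-report variables (n = 82).
rng = np.random.default_rng(2)
computer_use = rng.integers(1, 6, size=82).astype(float)    # 1-5 frequency rating
sr_tpck = 3.0 + 0.3 * computer_use + rng.normal(0, 0.8, 82)

r, p = stats.pearsonr(computer_use, sr_tpck)
print(f"r({len(sr_tpck)}) = {r:.2f}, p = {p:.3f}")
```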
Participants recalled on average 7.12 (SD = 2.22) of the 10 technological features introduced to them about the sample technology. There was no significant correlation of this specific TK indicator and SR TPCK, r(74) = .00, p = .97.

Teaching variables. There was a negative correlation between SR TPCK and the length of time participants had been enrolled in their teacher program at university, r(82) = −.29, p = .01. The longer participants had studied, the lower they rated their SR TPCK. In addition, longer enrollment correlated highly with less frequent personal computer use, r(82) = −.80, p < .001, older age, r(82) = .86, p < .001, and the likelihood of being a male participant, r(82) = −.23, p = .04. In sum, time enrolled in the teacher program seems to be a very unspecific indicator, confounded with personal demographic indicators. We chose to exclude it from further analyses due to collinearity issues caused by the high negative correlation with personal computer use. There was no significant correlation between PK and SR TPCK, r(82) = .16, p = .15.

Beliefs and attitudes. We found positive relations of SR TPCK with a constructivist orientation regarding the use of video in teaching, r(82) = .39, p < .001, and with enthusiasm for technology in teaching, r(82) = .59, p < .001. This indicates that pre-service teachers who reported a stronger constructivist orientation and more enthusiasm about technology for teaching also reported higher SR TPCK.

2.4.2 Regression analyses

To determine how much variance in participants' SR TPCK could be explained by each of the variables that were significantly correlated with this scale, we ran multiple hierarchical regression analyses. We use regressions to understand how each construct related to SR TPCK while controlling for the other variables (see Table 2), not to predict SR TPCK in a causal sense. In the first model, participants' personal computer use was entered; in the second model, participants' beliefs about constructivist teaching using video were entered; and the final model additionally included their reported enthusiasm for technology in teaching.

Table 2. Hierarchical regression models predicting SR TPCK (Study 1)

| Variable | Model 1: B | SE B | β | Model 2: B | SE B | β | Model 3: B | SE B | β |
|---|---|---|---|---|---|---|---|---|---|
| Personal | | | | | | | | | |
| Personal computer use frequency | 0.68 | 0.19 | .38*** | 0.55 | 0.18 | .31** | 0.31 | 0.16 | .17 |
| Beliefs and attitudes | | | | | | | | | |
| TPB constructivist orientation | | | | 0.25 | 0.08 | .33** | 0.17 | 0.07 | .22* |
| Enthusiasm for teaching with technology | | | | | | | 0.46 | 0.09 | .47*** |

Adjusted R² = .13 (Model 1), .23 (Model 2), .41 (Model 3). F for change in R² = 13.19*** (Model 1), 10.75** (Model 2), 25.57*** (Model 3).

Note. TPB = technological pedagogical beliefs; SE = standard error; SR TPCK = self-report technological pedagogical content knowledge. * p < .05. ** p < .01. *** p < .001.

In the first model, personal computer use was significantly related to SR TPCK, β = .38, t(80) = 3.63, p < .001. This effect remained significant after entering the constructivist orientation, β = .31, t(79) = 3.33, p = .003, in Model 2. After entering enthusiasm ratings into the regression in Model 3, however, the effect of personal computer use was rendered non-significant, β = .17, t(78) = 1.87, p = .07, and that of participants' constructivist orientation reduced in size, β = .22, t(78) = 2.44, p = .02, whereas enthusiasm ratings showed a significant relationship to SR TPCK, β = .47, t(78) = 5.06, p < .001. Adding enthusiasm for technology in teaching to the model accounted for an additional 18% of the variance in SR TPCK.
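The blockwise models summarized in Table 2 follow the standard hierarchical (block-by-block) ordinary least squares procedure: fit nested models, read off standardized coefficients, and test the change in R² at each step. Below is a minimal sketch with statsmodels, using synthetic stand-in variables rather than the study's data; the names and effect sizes are hypothetical, and compare_f_test supplies the incremental F reported as "F for change in R²".

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Synthetic stand-in data (82 cases, like Study 1); not the study's data.
rng = np.random.default_rng(3)
n = 82
df = pd.DataFrame({
    "computer_use": rng.normal(0, 1, n),
    "constructivist": rng.normal(0, 1, n),
    "enthusiasm": rng.normal(0, 1, n),
})
df["sr_tpck"] = (0.2 * df["computer_use"] + 0.25 * df["constructivist"]
                 + 0.5 * df["enthusiasm"] + rng.normal(0, 0.8, n))

def fit_block(cols):
    # z-standardize predictors and outcome so coefficients read like betas
    X = sm.add_constant((df[cols] - df[cols].mean()) / df[cols].std())
    y = (df["sr_tpck"] - df["sr_tpck"].mean()) / df["sr_tpck"].std()
    return sm.OLS(y, X).fit()

blocks = [
    ["computer_use"],
    ["computer_use", "constructivist"],
    ["computer_use", "constructivist", "enthusiasm"],
]
previous = None
for i, cols in enumerate(blocks, start=1):
    model = fit_block(cols)
    if previous is None:
        f_change, p_change = model.fvalue, model.f_pvalue
    else:
        # incremental F-test for the change in R^2 when the new block is added
        f_change, p_change, _ = model.compare_f_test(previous)
    print(f"Model {i}: adj R2 = {model.rsquared_adj:.2f}, "
          f"F change = {f_change:.2f}, p = {p_change:.4f}")
    previous = model
```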
In sum, each step in the regression added a significant amount of explained variance in participants' SR TPCK, resulting in 41% explained variance (adjusted R²) in the final model.

2.5 Study 1 discussion

Study 1 investigated a sample of German pre-service teachers. Results show that gender and age were not related to SR TPCK scores, suggesting that sociodemographic factors belong to an unrelated construct. Furthermore, SR TPCK is not related to performance on a test of PK nor to recall of facts about a sample technology (TK). We conclude that SR TPCK shows discriminant validity from these sub-constructs of the TPACK framework, when measured by fact-based tests. We find that SR TPCK is significantly related to reports of personal computer use, beliefs, and attitudes. When other variables were held constant, two showed significant relationships with SR TPCK: constructivist teaching beliefs and enthusiasm for teaching with technology. In this study, enthusiasm for new technologies accounts for the largest part of SR TPCK variance (22.1%). Our multi-trait analysis indicates that SR TPCK captures a construct reflecting attitudes towards using technology above and beyond prior experience. It is an important contribution to identify the convergent validity of pre-service teachers' SR TPCK with this attitudinal variable.

3 STUDY 2

The findings in Study 1 were true for less skilled, relatively inexperienced teachers. It is possible that inexperienced teachers rely more heavily on beliefs about the value of technology given their relatively lower levels of knowledge and experience. Yet if we find the same pattern with experienced teachers, the construct validity of SR TPCK is more firmly established. We now ask the question: How are SR TPCK, knowledge, experience, and beliefs related for teachers with substantial teaching experience? Study 2 reanalyzed data from a two-part online survey of teachers in the United States who had been certified as accomplished teachers (for more on this study, see Forssell, 2011). The focus on accomplished teachers allowed for the study of the variability in SR TPCK while limiting the variability in pedagogical content knowledge.

3.1 Participants

Of the 307 in-service teachers who completed the SR TPCK scale items, 81% were female and 19% male. They ranged in age from 30 to 66 (M = 48.9, SD = 9.1). Their classroom experience ranged from 6 to 46 years, with an average of over 19 years of experience (M = 19.1, SD = 7.9). They represented a broad mix of subjects, grades, and school populations. Not all of the 307 respondents to the survey who completed the SR TPCK measure also completed the other focal measures of this study; therefore, the number of respondents ranged from 209 to 307 on the various measures. A total of 178 participants responded to all eight measures in this study.

3.2 Measures

3.2.1 SR TPCK

Respondents rated the degree to which they agreed with statements about their TPCK on five items, slightly modified from the survey of pre-service teachers designed by Schmidt et al. (2009). References to pre-service education were removed or modified to reflect the experiences of in-service teachers. Furthermore, each respondent indicated what subjects were taught, and those subject areas were specified in the SR TPCK survey items.
For example, "I can use strategies that combine [social studies] content, technologies, and teaching approaches that I learned about elsewhere, in my classroom." Analyses of the five items for each subject area showed that they were internally consistent (Cronbach's α ≥ .94). Responses to the 5-point scale (1 = strongly disagree to 5 = strongly agree) were averaged for each subject area taught. In cases where respondents taught more than one subject, the highest SR TPCK score was used (see Table 3 for descriptive statistics for all variables).

Table 3. Correlations and descriptive statistics (Study 2)

| Variable | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | M | SD |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 1 SR TPCK | −.08 | −.04 | .42** | .46** | .20** | .30** | .38** | .30** | .39** | 3.91 | 0.80 |
| Personal | | | | | | | | | | | |
| 2 Gender | | −.05 | −.21** | −.14* | .07 | −.07 | −.17** | .04 | .05 | 1.81 | 0.39 |
| 3 Age | | | −.16** | −.07 | .04 | .13 | −.01 | −.03 | .06 | 48.88 | 9.03 |
| 4 Technology knowledge | | | | .60** | .01 | .21** | .41** | .15* | .14* | 2.97 | 0.90 |
| 5 Personal activities | | | | | .06 | .32** | .51** | .20** | .15* | 6.78 | 3.08 |
| Teaching | | | | | | | | | | | |
| 6 Frequency | | | | | | .25** | .08 | .39** | .32** | 2.44 | 1.32 |
| 7 Internet for teaching | | | | | | | .27** | .26** | .25** | 2.85 | 1.11 |
| 8 Student activities | | | | | | | | .09 | .00 | 3.95 | 3.39 |
| Beliefs | | | | | | | | | | | |
| 9 TP belief | | | | | | | | | .48** | 3.46 | 0.58 |
| 10 TPC beliefs | | | | | | | | | | 3.93 | 0.57 |

Note. TP = technological pedagogical; TPC = technological pedagogical content; SR TPCK = self-report technological pedagogical content knowledge; M = mean; SD = standard deviation. * p < .05. ** p < .01.

3.2.2 Personal variables

In addition to age and gender, participants reported on their experience using computers for a variety of activities.

Personal activities. Items developed by Barron (2004) were used to measure the extent to which participants had engaged in 16 computer-based activities. The activities included creativity, communication, and problem solving (e.g., create a piece of art, start a blog, or design a two-dimensional or three-dimensional model). Respondents were asked to indicate the number of times they had participated in each activity in their personal lives. The responses were recoded into 0 = never and 1 = at least once and then summed.

Technology knowledge. Participants rated their familiarity with 27 Internet-related terms from 1 = none to 5 = full on a scale developed and validated by Hargittai (2005). The scale has been shown to correlate highly with observed search behaviours on the Internet. Thus, this knowledge measure has demonstrated criterion validity, even though it is self-reported. Each participant received a score based on the average of all items completed.

3.2.3 Teaching variables

Several variables captured aspects of the ways in which participants used new technologies in their work with students.

Student activities. At the same time participants reported on the activities they engaged in personally, they also reported whether they had assigned those activities to students. The total number of activities respondents had asked their students to engage in during class time at least once became the student activities score.

Frequency. A follow-up survey was sent to the respondents who had indicated they were teaching at the time of the survey. It included a measure of frequency of computer use with students. Participants were asked "On average, how often do you plan for a typical student to use a computer during class?" and asked to choose from 0 = Never, 1 = Less than Once a Month, 2 = 1–2 Times a Month, 3 = 1–2 Times a Week, and 4 = 3 Times a Week or more.

Internet for teaching. The follow-up survey included a measure of how often teachers use the Internet to plan their teaching (Markow & Cooper, 2008).
Participants were asked "This school year, how often have you used an Internet resource to get teaching ideas?" (0 = Never, 1 = Less than Once a Month, 2 = 1–2 Times a Month, 3 = 1–2 Times a Week, and 4 = 3 Times a Week or more).

3.2.4 Beliefs variables

Two measures addressed the degree to which participants believed that the use of technology in the classroom is beneficial to student learning.

Technological pedagogical (TP) belief. The follow-up survey included the statement "Digital resources such as classroom technology and Web-based programs help my students' academic achievement" (from Mayer & Phillips, 2010), which participants rated on a scale from 1 = strongly disagree to 4 = strongly agree.

Technological pedagogical content (TPC) beliefs. Participants were asked to rate eight items regarding positive beliefs on a 5-point scale from 1 = strongly disagree to 5 = strongly agree. This scale drew on items from prior studies such as, for example, "Technology helps students grasp difficult [subject] concepts more easily" (Russell, Bebell, O'Dwyer, & O'Connor, 2003). Of the items, half (four) reflected positive and half negative impacts of technology use on student learning. Codes for the negative statements were reversed before averaging the responses. The scale showed high internal consistency (Cronbach's α coefficients ranged from .84 to .89 depending on subject). The highest of all subject-specific scales was used for each participant.

3.3 Results

3.3.1 Correlations

Parallel to Study 1, we first computed zero-order correlations of SR TPCK with the personal information and other teaching-related constructs (see Table 3).

Personal variables. SR TPCK showed no relationship to age or gender. In contrast, there was a significant positive correlation between the SR TPCK scale and the breadth of personal activities, r(304) = .46, p < .01. Participants reporting a broader range of personal activities reported higher SR TPCK. They also reported higher TK, r(302) = .42, p < .01. Teachers who were familiar with more Internet terms tended to have higher SR TPCK scores.

Teaching variables. SR TPCK consistently showed statistically significant relationships to technology-related teaching variables. SR TPCK correlated most strongly with the student activities score, r(302) = .38, p < .01. Participants reporting higher SR TPCK also assigned a broader range of activities to their students. There was a significant positive correlation between the SR TPCK scale and the frequency with which the teacher used the Internet for teaching, r(211) = .30, p < .01, with participants reporting higher SR TPCK also reporting getting teaching ideas from the Internet more frequently. Similarly, participants reporting higher SR TPCK reported more frequent use of computers in class, r(209) = .20, p < .01.

Beliefs variables. Finally, participants with higher SR TPCK reported more positive beliefs about the value of technology to support students' learning, both on the TP belief item, r(209) = .30, p < .01, and on the TPC beliefs scale, r(264) = .39, p < .01.

3.3.2 Regression analyses

Similar to Study 1, we ran multiple hierarchical regression analyses to determine how much variance in participants' SR TPCK could be explained by each of the variables, while controlling for the other variables (see Table 4). We included all variables that showed significant correlations with SR TPCK. We ran three different models.
In the first, we entered the strongest correlates, which were the two personal technology variables: breadth of personal activities and TK. In Model 2, we added the teaching-related variables: student activities and use of the Internet for teaching. In the final model, we added the TP and TPC beliefs variables.

Table 4. Hierarchical regression models predicting SR TPCK (Study 2)

| Variable | Model 1: B | SE B | β | Model 2: B | SE B | β | Model 3: B | SE B | β |
|---|---|---|---|---|---|---|---|---|---|
| Personal | | | | | | | | | |
| Personal activities | 0.07 | 0.02 | .33*** | 0.05 | 0.02 | .22* | 0.04 | 0.02 | .19* |
| Technology knowledge | 0.17 | 0.07 | .21* | 0.17 | 0.06 | .22** | 0.15 | 0.06 | .20* |
| Teaching | | | | | | | | | |
| Frequency | | | | 0.09 | 0.04 | .17* | 0.03 | 0.04 | .05 |
| Student activities | | | | 0.02 | 0.02 | .10 | 0.03 | 0.02 | .15* |
| Internet for teaching | | | | 0.06 | 0.04 | .11 | 0.02 | 0.04 | .04 |
| Beliefs | | | | | | | | | |
| TPC beliefs | | | | | | | 0.26 | 0.08 | .22** |
| TP belief | | | | | | | 0.15 | 0.08 | .15** |

Adjusted R² = .23 (Model 1), .28 (Model 2), .35 (Model 3). F for change in R² = 27.52*** (Model 1), 4.76** (Model 2), 10.47*** (Model 3).

Note. TP = technological pedagogical; TPC = technological pedagogical content; SE = standard error; SR TPCK = self-report technological pedagogical content knowledge. * p < .05. ** p < .01. *** p < .001.

In Model 1, the variables related to technology in a teacher's personal life explained 23% of the variance in SR TPCK scores. Each of the variables contributed significantly: number of personal activities, β = .33, t(177) = 3.99, p < .001, and TK, β = .21, t(177) = 2.59, p = .01. The addition of teaching-related variables in Model 2 added only 5% to the overall explained variance. The breadth of student activities did not contribute significantly, β = .10, t(174) = 1.28, p = .20. Neither did the frequency with which teachers used the Internet to get teaching ideas, β = .11, t(174) = 1.56, p = .12. The frequency with which teachers assigned a typical student to use a computer in class did provide a statistically significant contribution to explaining the variance in SR TPCK scores, β = .17, t(174) = 2.55, p = .01. In Model 3, the TPC beliefs scale, β = .22, t(172) = 3.10, p < .01, and TP belief item, β = .15, t(172) = 2.03, p = .04, showed significant relationships to SR TPCK. The number of personal activities, β = .19, t(172) = 2.22, p = .03, TK, β = .19, t(172) = 2.52, p = .01, and breadth of student activities, β = .15, t(172) = 2.04, p = .04, retained significant relationships to SR TPCK when controlling for beliefs, whereas frequency of classroom use did not, β = .05, t(172) = 0.79, p = .43. Adding the beliefs variables to the model accounted for an additional 7% of explained variance. Together, personal, teaching, and beliefs variables explained 35% of the variance in SR TPCK.

3.4 Study 2 discussion

Study 2 investigated a large sample of accomplished teachers in the United States. Findings show that age and gender are not related to the SR TPCK score, suggesting that it captures a construct that is independent of these sociodemographic factors. In contrast, teachers' beliefs about the value of technology for teaching and learning show significant relationships with SR TPCK when other variables are held constant. Personal technology knowledge and experience also contribute to explaining the variability in SR TPCK. Of the variables capturing technology use in teaching, only the breadth of activities shows a small, statistically significant relationship with SR TPCK when beliefs are held constant.

4 GENERAL DISCUSSION

We investigated the nature of a broadly used SR TPCK measure by exploring to what extent assessments of constructs beyond TPCK can explain variance in participants' SR TPCK.
Recent research has shown that information captured by SR TPCK surveys only minimally converges with objective measures of TPCK (Drummond & Sweeney, 2017). We extend this work, following the rationale of a multi-trait approach (cf. Campbell & Fiske, 1959). We provide evidence for convergent and divergent validity of SR TPCK when analysed together with constructs outside of TPCK. Our analyses indicate that SR TPCK is entirely unrelated to gender or age. Neither objective measures of PK nor TK of a sample technology relate to SR TPCK among pre-service teachers, which we interpret as evidence of discriminant validity. Both studies found that personal computer use and teachers' beliefs and attitudes about technology use in teaching each explained a meaningful portion of the variability in SR TPCK. These results suggest some convergent validity based on relationships with experience and beliefs. These findings emerged in both studies, despite differences in population and the measures used. The similar patterns in responses by pre- and in-service teachers in distinct cultural contexts add strength to the findings and suggest that these results help to uncover general underlying relations between TPCK and conceptually related constructs.

4.1 Other knowledge domains

Researchers must carefully consider the types of knowledge about technology measured. The tool-specific measure used in Study 1 appears to be more distinct from SR TPCK than a more general technological awareness used in Study 2. Some level of TK and PK is arguably a prerequisite for developing actual TPCK. We consider the low correlations between SR TPCK, TK, and PK to be evidence for the discriminant validity of SR TPCK. Future studies should use these measures in concert when examining the complex implementation of technology in the classroom.

4.2 Beliefs and attitudes

Both studies show that teachers' beliefs and attitudes about technology for learning explain meaningful proportions of variance in SR TPCK. Our analyses indicate that correlations between SR TPCK and other variables such as computer use at home and in the classroom are mediated by beliefs; when beliefs variables are added to a regression, classroom use variables no longer explain variability in SR TPCK scores. This provides evidence for convergent validity of SR TPCK, suggesting that it represents positive beliefs about the use of technology in teaching.

4.3 Experience with technology

The evidence for SR TPCK convergent validity with actual practical experiences is mixed. Though home computer use in the first study appeared to have a strong relationship to SR TPCK in the first regression model, controlling for beliefs in subsequent models weakened the association. In Study 2, the relationships of home and classroom use to SR TPCK were smaller but significant when controlling for positive technology beliefs in Model 3. These mixed findings suggest that it is important to gather a range of evidence of actual experience. It is possible that there is a complex mutual relationship between beliefs and a tendency to seek out corresponding experiences, which in turn contribute to objectively verifiable knowledge. This hypothesis should be empirically tested in longitudinal studies that relate SR TPCK, objective TPCK measures, belief scales, and accounts of experience over time.

4.4 Implications

Placing SR TPCK within the constellation of knowledge, beliefs, and experience is urgently needed.
It has implications for judging the effectiveness of interventions aimed at increasing technology use in teaching. It would be counterproductive to assess a course or workshop using a measure that does not align with the learning outcomes of the endeavour. Our findings suggest that SR TPCK will be more impacted by an intervention that specifically targets beliefs and attitudes around technology use than by one focused solely on facts and skills.

4.5 Limitations

Both studies followed a correlational design, and we cannot draw conclusions about causality from the presented findings. This means that we cannot determine whether higher TPCK renders teachers more enthusiastic about using technology in teaching, or if teachers with positive attitudes toward a tool, or a pre-existing disposition in favor of technology, seek out opportunities to engage with it. It is possible that some other factor influences all these variables. The biggest limitation is the lack of an objective TPCK measure in these data sets. Such measures were not yet developed at the time these data were collected. New studies could and should build on more recent scholarship to confirm or challenge Drummond and Sweeney's (2017) multi-method, mono-trait findings in a multi-trait, multi-method approach. Such studies will complement our multi-trait approach by dissociating variance attributable to shared methods (objective measures vs. self-reports) from that attributable to the respective constructs.

5 CONCLUSIONS

What do researchers actually measure when applying SR TPCK measures? Overall, the results of two studies across two different populations suggest that SR TPCK captures a conglomerate of several constructs, which is dominantly explained by positive beliefs about engaging with digital technologies to support student learning. Additionally, SR TPCK seems to cover aspects of actual technology experience in personal and teaching contexts. Our first study provides some evidence that SR TPCK is distinct from other knowledge domains, such as TK or PK. This is an important first contribution to a multi-trait approach to validating SR TPCK. Future research should also integrate a multi-method strategy, that is, include multiple measures of the investigated constructs. We would generally encourage researchers to include related constructs, such as beliefs and attitudes or objective TPCK measures, when employing SR TPCK instruments. This holds especially true when evaluating the effectiveness of interventions that aim at fostering TPCK development.

ACKNOWLEDGEMENTS

This project is part of the "Qualitaetsoffensive Lehrerbildung," a joint initiative of the Federal Government and the Laender that aims to improve the quality of teacher training. The programme is funded by the Federal Ministry of Education and Research. The authors are responsible for the content of this publication.

Notes
[2] To disambiguate the overall framework and the knowledge domain at its center, we will use TPACK when referring to the framework and TPCK when referring to the knowledge domain.
mbalenhle's insight:
This article highlights the importance of paying attention to self-reported TPCK (SR TPCK), which reflects how teachers used technology before and how they use it in the present moment. It takes the view that fact-based measures of technology and pedagogy knowledge are distinct from SR TPCK.
 
How to Create an Intuitive Design | Interaction Design Foundation

“The main thing in our design is that we have to make things intuitively obvious,” the founder and former CEO of Apple, Steve Jobs, explained. We can easily agree that design should be intuitive. We can also easily agree that something is intuitive when we can use it without thinking about it.
mbalenhle's insight:
The article highlights how design can improve the physical and cultural environment. It establishes that human cognition and the physical and cultural environment are intertwined: people grow up using technology physically, and they use it within their cultural environments.