
Computers & Education 55 (2010) 1656–1662


Revisiting technological pedagogical content knowledge: Exploring the TPACK framework

Leanna M. Archambault a,*, Joshua H. Barnett a

a Arizona State University, Mary Lou Fulton Teachers College, PO Box 37100, Mail Code 3151, Phoenix, AZ 85069, United States

Article info

Article history:
Received 16 November 2009
Received in revised form 18 July 2010
Accepted 19 July 2010

Keywords:
Technological pedagogical content knowledge
TPACK
Technology framework
Online learning
Factor analysis

* Corresponding author. Tel.: +1 602 543 6338; fax: +1 602 543 6350. E-mail addresses: [email protected] (L.M. Archambault), [email protected] (J.H. Barnett).

0360-1315/$ – see front matter © 2010 Elsevier Ltd. All rights reserved. doi:10.1016/j.compedu.2010.07.009

Abstract

This study examines the nature of technological pedagogical content knowledge (TPACK) through the use of a factor analysis. Using a survey with 24 items designed to measure each of the areas described by the TPACK framework, and measuring the responses of 596 online teachers from across the United States, data suggest that while the framework is helpful from an organizational standpoint, it is difficult to separate out each of the domains, calling into question their existence in practice. Three major factors become evident, but rather than being comprised of pedagogy, content, and technology, the only clear domain that distinguishes itself is that of technology. This research examines the validity of the TPACK model and suggests that measuring each of these domains is complicated and convoluted, potentially due to the notion that they are not separate.

© 2010 Elsevier Ltd. All rights reserved.

1. Introduction

Prior to the articulation of technological pedagogical content knowledge (TPACK) (Mishra & Koehler, 2006), the notion of a unifying conceptual framework was lacking in the educational technology literature. As a result, the development of the TPACK framework has taken the technology field by storm (Cox & Graham, 2009), and various researchers have developed related curricula, texts, professional development models, and methods of measurement, as well as advancements to the framework itself (Angeli & Valanides, 2009; Harris, Mishra, & Koehler, 2009; Niess, 2008; Schmidt et al., 2009). However, while TPACK is potentially useful, especially when conceptualizing how the affordances of technology might be leveraged to improve teaching and learning, it requires additional examination to understand if technology, content, and pedagogy meld together to form the unique domains described by the framework.

The purpose of this study is to explore the nature of technological pedagogical content knowledge (TPACK), defined as understanding the connections and interactions between and among content knowledge (subject-matter that is to be taught), technological knowledge (computers, the Internet, digital video, etc.), and pedagogical knowledge (practices, processes, strategies, procedures, and methods of teaching and learning) to improve student learning (Koehler & Mishra, 2005). This framework describes seven unique factors: pedagogy, content, technology, pedagogical content, technological pedagogy, technological content, and technological pedagogical content. To date, two survey instruments have been created to measure TPACK, one tailored toward undergraduate students (Schmidt et al., 2009) and the other specific to online teaching (Archambault & Crippen, 2009). However, further analysis is necessary to determine if data from either survey support the identification of the seven factors as described by the TPACK framework. This study uses survey responses collected from 596 K-12 online teachers to explore the nature of the factors comprising the TPACK model.

2. Related literature

In order to understand the origins of the TPACK framework and its impact on the field of educational technology, it is necessary to examine its roots in pedagogical content knowledge (PCK). Shulman (1986) introduced the concept, recognizing the need for a more coherent




theoretical framework concerning what teachers should know and be able to do, including what content knowledge they needed to possess and how this knowledge is related to that of good teaching practices. Shulman developed the idea of pedagogical content knowledge (PCK) to describe the relationship between the amount and organization of knowledge of a particular subject-matter (content) and knowledge related to how to teach various content (pedagogy). According to Shulman, PCK includes knowledge of how to teach specific content or subject-matter knowledge, extending beyond simply knowing the content alone. PCK is described as encompassing "the most useful forms of representation of those ideas, the most powerful analogies, illustrations, examples, explanations, and demonstrations – in a word, the ways of representing and formulating the subject that make it comprehensible to others" (p. 9).

Shulman's articulation of pedagogical content knowledge has become "common currency" in the field of teacher education and in the related literature (Segall, 2004). However, as Segall points out, "Yet while it [pedagogical content knowledge] has often been cited, much used, seldom has the term or the lens it provides for the educative endeavor been questioned, engaged critically" (p. 490). While the teacher education community acknowledges the usefulness of the framework, especially for examining what teachers know and how that might impact the ways in which they teach, there are some valid concerns, especially regarding the distinct nature of each of the domains, pedagogy and content. Are they two distinct areas, or are they inherently meshed? Can teachers consider a content area without thinking about how they might go about teaching it? According to McEwan and Bull (1991), "We are concerned, however, that his [Shulman's] distinction between content knowledge and pedagogical content knowledge introduces an unnecessary and untenable complication to the conceptual framework on which the research is based" (p. 318). The authors go on to argue that content, in the form of scholarship, cannot exist without pedagogy, and that explanations of concepts are inherently pedagogical in nature (McEwan & Bull, 1991; Segall, 2004). This perplexity has made it difficult to validate pedagogical content knowledge as a framework and to define what constitutes knowledge from each of the domains of pedagogy, content, and the complex notion of pedagogical content knowledge.

Despite problems with the initial framework, Koehler and Mishra built on PCK and added technology as a key component of the framework, creating technological pedagogical content knowledge (TPACK). TPACK, as described in the literature, involves an understanding of the complexity of relationships among students, teachers, content, technologies, practices, and tools. According to Koehler and Mishra (2005), "We view technology as a knowledge system that comes with its own biases, and affordances that make some technologies more applicable in some situations than others" (p. 132). Koehler and Mishra define TPACK as the connections and interactions between content knowledge (subject-matter that is to be taught), technological knowledge (computers, the Internet, digital video, etc.), pedagogical knowledge (practices, processes, strategies, procedures and methods of teaching and learning), and the transformation that occurs when combining these domains: "Good teaching is not simply adding technology to the existing teaching and content domain. Rather, the introduction of technology causes the representation of new concepts and requires developing a sensitivity to the dynamic, transactional relationship between all three components suggested by the TPCK framework" (p. 134).

The TPACK framework considers three distinct and interrelated areas of teaching, as represented by Fig. 1.

Fig. 1. Graphic representation of technological pedagogical content knowledge (TPACK).

The notion of TPACK is quickly becoming ubiquitous within the educational technology community, popular among researchers and practitioners alike, as it attempts to describe the complex relationship between and among the domains of content, pedagogy, and technology-related knowledge. However, while the theory of TPACK is compelling, more work measuring the relationship between these domains is necessary before curriculum and textbooks are re-written. Specifically, before this model is offered as the proverbial panacea for redressing the challenges of teaching the 21st century student, scholarship investigating the confusion between and among each of the domains described by the framework is needed. Cox and Graham (2009) acknowledge the difficulty and necessity of conducting such work:

While Koehler, Mishra, and others have attempted to define and measure TPACK, the framework is not yet fully understood (Angeli & Valanides, 2009). Thus far, the explanations of technological pedagogical content knowledge and its associated constructs that have been provided are not clear enough for researchers to agree on what is and is not an example of each construct... the boundaries between them are still quite fuzzy, thus making it difficult to categorize borderline cases (p. 60).

Specifically, more research regarding the validity and applicability of the framework is needed. This study provides one step toward that goal in asking the following research question: What do online teachers' ratings of their perceived knowledge levels related to TPACK suggest regarding the nature of the framework itself?

3. Methodology

A Web-based survey instrument was developed comprising 24 items concerning online teachers' technological pedagogical content knowledge (Archambault & Crippen, 2009). Responses were on a Likert-type scale: 1 = Poor, 2 = Fair, 3 = Good, 4 = Very Good, and 5 = Excellent. To establish construct validity, the instrument underwent expert review and two rounds of think-aloud piloting (Archambault & Crippen, 2009).

3.1. Procedure

The survey was deployed to 1795 online teachers employed at virtual schools from across the nation using Dillman's (2007) Tailored Design survey methodology. Email addresses for these teachers were gathered via virtual school websites. A total of 596 responses from 25 different states were gathered, representing an overall response rate of 33%. While the response rate is modest, it is recognized as acceptable for a Web-based survey (Manfreda, Bosnjak, Berzelak, Hass, & Vehovar, 2008; Shih & Fan, 2008). In addition, this survey represents a unique examination of practitioners' responses to the TPACK theoretical framework and should be regarded as an initial yet integral step in understanding if this theory is tenable.

3.2. Respondents

Participants were predominantly female, with 456 responses (77%) versus 139 (23%) male (consistent with the averages among educators), and were between the ages of 26 and 45 (63%). The majority of respondents (559, 92%) reported having a bachelor's degree, and 380 (62%) indicated that they had earned a master's degree, while 7 (2%) reported they were currently working toward their master's degrees. Of the 62% with master's degrees, 148 (48%) were education (M.Ed.) degrees, including those in curriculum and instruction, while 73 (19%) reported having a degree in a particular content area, such as mathematics, science, social studies, or English.

3.3. Analytic strategy

The responses to the survey were analyzed using the Statistical Package for the Social Sciences (SPSS), version 16. A factor analysis using varimax rotation was performed on the total survey. The purpose of a factor analysis, according to Gorsuch (1983), is to "summarize the interrelationships among the variables in a concise but accurate manner as an aid in conceptualization" (p. 2). This method assists the researchers in establishing a level of construct validity (Bryman & Cramer, 1990). Coefficients of internal consistency were obtained for the total survey and by the seven expected constructs. Additionally, the relationship between each of the 24 items in the survey and each of the computed subscale variables was analyzed using a Pearson r correlation to conduct a corrected item-total correlation analysis.
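For readers who want to trace the mechanics of this analytic strategy outside of SPSS, the sketch below shows one conventional way to perform a principal-component extraction with varimax rotation in Python, including the eigenvalue-greater-than-one retention rule used in the Results. It is a minimal illustration only: the input file name, the column layout (one 1-5 rating column per survey item a-x), and the hand-rolled varimax routine are assumptions, not the authors' actual SPSS procedure.

```python
import numpy as np
import pandas as pd

def varimax(loadings, max_iter=100, tol=1e-6):
    """Rotate a loading matrix with the varimax criterion (Kaiser, 1958)."""
    p, k = loadings.shape
    rotation = np.eye(k)
    variance = 0.0
    for _ in range(max_iter):
        rotated = loadings @ rotation
        # SVD of the gradient of the varimax criterion
        u, s, vt = np.linalg.svd(
            loadings.T @ (rotated ** 3
                          - rotated @ np.diag((rotated ** 2).sum(axis=0)) / p))
        rotation = u @ vt
        new_variance = s.sum()
        if new_variance < variance * (1 + tol):
            break  # criterion no longer improving
        variance = new_variance
    return loadings @ rotation

# Hypothetical input: one row per respondent, one rating column per item.
responses = pd.read_csv("tpack_survey_responses.csv")

corr = np.corrcoef(responses.values, rowvar=False)    # item correlation matrix
eigvals, eigvecs = np.linalg.eigh(corr)               # ascending order
order = np.argsort(eigvals)[::-1]                     # re-sort descending
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

n_factors = int((eigvals > 1.0).sum())                # eigenvalues > 1 retained
explained = eigvals[:n_factors].sum() / len(eigvals)  # proportion of variance

# Unrotated principal-component loadings, then varimax rotation.
loadings = eigvecs[:, :n_factors] * np.sqrt(eigvals[:n_factors])
rotated = varimax(loadings)
communalities = (rotated ** 2).sum(axis=1)            # per-item communalities
print(f"{n_factors} factors, {explained:.2%} of variance explained")
```

Note that SPSS applies Kaiser normalization (rescaling each row of the loading matrix by its communality) before rotating, so a bare sketch like this would reproduce the structure of Tables 2-4 only approximately.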

4. Results

To determine if the survey items were reliable, internal consistency (Cronbach's alpha) values were computed. The values are presented alongside descriptive statistics in Table 1 and indicate that the subscales, which have alpha values from 0.70 to 0.89, are reasonable for internal reliability (Morgan, Leech, Gloeckner, & Barrett, 2004).
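Cronbach's alpha, as reported in Table 1, follows directly from the item variances and the variance of the summed scale: alpha = (k/(k-1))(1 - sum of item variances / variance of the total score). A minimal sketch of that computation follows; the column labels are assumptions drawn from Table 4, where items a, g, and q form the technology subscale.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: (respondents x k) array of ratings for one subscale."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# e.g., the three technology items from Table 4 (column names assumed):
# alpha_technology = cronbach_alpha(responses[["a", "g", "q"]].values)
```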

Additionally, a factor analysis was performed on the 24-item survey. This analysis confirmed the existence of three separate factors within the survey, retaining the components with eigenvalues greater than one. The amount of variance explained by the three factors was 58.21%. Tables 2-4 illustrate how the survey items loaded by factor, as indicated by the rotated component matrix, which converged in five iterations. The communalities for each item are also presented.

These results indicate that the highly accepted seven mutually exclusive domains of the TPACK theory may not exist in practice. Specifically, the respondents reported the existence of three factors: pedagogical content knowledge, technological-curricular content knowledge, and technological knowledge. From the responses provided, practitioners indicated strong connections between content knowledge and pedagogical knowledge, noted by the interconnection of responses to the content, pedagogy, and pedagogical content questions. Respondents also reported a connection between technological content, technological pedagogy, and technological pedagogical content questions.

Table 1. Summary of descriptive statistics for subscales.

Domain | Number of survey items | Mean | Standard deviation | Cronbach's alpha
Pedagogy | 3 | 4.04 | 0.78 | 0.77
Technology | 3 | 3.23 | 1.12 | 0.89
Content | 3 | 4.02 | 0.89 | 0.76
Pedagogical content | 4 | 4.04 | 0.81 | 0.80
Technological content | 3 | 3.87 | 1.03 | 0.70
Technological pedagogy | 4 | 3.65 | 1.03 | 0.77
Technological pedagogical content | 4 | 3.79 | 0.95 | 0.79

Table 2. Rotated component matrix – factor 1: pedagogical content knowledge.

Survey item | Subscale | Factor 1 | Communalities
(m) My ability to plan the sequence of concepts taught within my class | Content | 0.75 | 0.61
(s) My ability to comfortably produce lesson plans with an appreciation for the topic | Pedagogical content | 0.74 | 0.61
(j) My ability to determine a particular strategy best suited to teach a specific concept | Pedagogical content | 0.74 | 0.59
(d) My ability to decide on the scope of concepts taught within my class | Content | 0.72 | 0.55
(c) My ability to use a variety of teaching strategies to relate various concepts to students | Pedagogy | 0.71 | 0.56
(u) My ability to assist students in noticing connections between various concepts in a curriculum | Pedagogical content | 0.69 | 0.55
(r) My ability to adjust teaching methodology based on student performance/feedback | Pedagogy | 0.69 | 0.51
(f) My ability to distinguish between correct and incorrect problem solving attempts by students | Pedagogical content | 0.67 | 0.52
(i) My ability to anticipate likely student misconceptions within a particular topic | Pedagogical content | 0.64 | 0.45
(b) My ability to create materials that map to specific district/state standards | Content | 0.55 | 0.39


However, respondents did not distinguish among these constructs. Instead, responses to these items loaded together with no clear separation. Finally, respondents did separate the technological knowledge items, where no reference to content or pedagogy was used.

In an effort to test whether these findings were a reflection of the items within the instrument or an accurate reflection of the perceptions of respondents, each of the 24 items was analyzed using a Pearson r correlation to conduct a corrected item-total correlation analysis. The overall internal consistency of the instrument was r = 0.94, and as indicated by Table 5, the removal of any one item does not improve the reliability of the survey, an indication that the survey is most reliable retaining all 24 items. This also gives credibility to the conclusion that the previously described factors accurately reflect the perceptions of the respondents rather than any issues with the instrument.
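The corrected item-total correlation reported in Table 5 pairs each item with the sum of the remaining 23 items, and the "alpha if item deleted" column simply recomputes Cronbach's alpha on the reduced item set. A brief sketch, reusing the cronbach_alpha helper shown earlier (the responses DataFrame and its column names remain assumptions):

```python
import numpy as np
import pandas as pd

def item_total_analysis(items: pd.DataFrame) -> pd.DataFrame:
    """Corrected item-total correlation and alpha-if-deleted for each item."""
    rows = []
    for col in items.columns:
        rest = items.drop(columns=col)           # the other 23 items
        corrected_r = np.corrcoef(items[col], rest.sum(axis=1))[0, 1]
        rows.append({"item": col,
                     "corrected_item_total_r": corrected_r,
                     "alpha_if_deleted": cronbach_alpha(rest.values)})
    return pd.DataFrame(rows)
```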

5. Discussion and implications

Although the TPACK framework is helpful from an organizational standpoint, the data from this study suggest that it faces the same problems as pedagogical content knowledge in that it is difficult to separate out each of the domains, calling into question their existence in practice. The fact that three major factors become evident is noteworthy, but rather than comprising pedagogy, content, and technology, the only clear domain that distinguishes itself is that of technology. Based on these results, it seems that the technological pedagogical content knowledge (TPACK) framework experiences the same difficulty as Shulman's quarter-century-old conception of pedagogical content knowledge.

It is possible that when experienced educators consider teaching a particular topic, the methods of doing so are considered part and parcel of the content, and when considering an online context, the domain of technology is added to the equation as a natural part of the medium, making it difficult to separate aspects of content, pedagogy, and technology. This was illustrated by the second phase of the think-aloud pilot, for which the lead researcher met with three different teachers who all taught various classes online. After being provided with definitions of each TPACK construct, the online teachers were asked to read each item of the instrument and decide under which TPACK domain they thought the item fit. In doing so, all three encountered difficulty separating out specific issues of content and pedagogy. For example, Item d – "My ability to decide on the scope of concepts taught within my class" – was interpreted by two teachers as belonging to the domain of pedagogical content rather than content alone. The same misinterpretation happened with Item b – "My ability to create materials that map to specific district/state standards" – which participants saw as part of pedagogical content (Archambault & Crippen, 2009). These examples, coupled with the results from the factor analysis, support the notion that TPACK creates additional boundaries along an already ambiguous line drawn between pedagogy and content knowledge.

Confounding the measurement of TPACK is the difficulty in developing an instrument or methodology to assess each of the domains described by the framework that will apply in different contexts. One of the major problems surfaces when attempting to measure content knowledge, defined by Mishra and Koehler (2006) as knowledge of the subject-matter to be taught (e.g., earth science, mathematics, language arts, etc.). Items that were developed to measure this construct within the current instrument were written with the intent of being generalizable so that teachers could apply them to their own subject-matter. These included questions regarding the ability to create materials that map to specific district/state standards and the ability to decide on the scope and sequence of concepts taught within a class.

Table 3. Rotated component matrix – factor 2: technological-curricular content knowledge.

Survey item | Subscale | Factor 2 | Communalities
(n) My ability to moderate online interactivity among students | Technological content | 0.79 | 0.65
(p) My ability to encourage online interactivity among students | Technological pedagogy | 0.78 | 0.63
(l) My ability to implement different methods of teaching online | Technological pedagogy | 0.69 | 0.66
(v) My ability to use various courseware programs to deliver instruction (e.g., Blackboard, Centra) | Technological content | 0.67 | 0.56
(h) My ability to create an online environment which allows students to build new knowledge and skills | Technological pedagogy | 0.65 | 0.62
(x) My ability to meet the overall demands of online teaching | Technological pedagogical content | 0.60 | 0.53
(o) My ability to use technological representations (i.e., multimedia, visual demonstrations, etc.) to demonstrate specific concepts in my content area | Technological content | 0.59 | 0.53
(w) My ability to use technology to create effective representations of content that depart from textbook knowledge | Technological pedagogical content | 0.58 | 0.50
(e) My ability to use online student assessment to modify instruction | Technological pedagogical content | 0.58 | 0.50
(t) My ability to implement district curriculum in an online environment | Technological content | 0.54 | 0.53
(k) My ability to use technology to predict students' skill/understanding of a particular topic | Technological pedagogical content | 0.48 | 0.51

Table 4. Rotated component matrix – factor 3: technological knowledge.

Survey item | Subscale | Factor 3 | Communalities
(a) My ability to troubleshoot technical problems associated with hardware | Technology | 0.88 | 0.81
(g) My ability to address various computer issues related to software | Technology | 0.85 | 0.81
(q) My ability to assist students with troubleshooting technical problems with their personal computers | Technology | 0.82 | 0.79


The challenge becomes creating and validating an instrument that is applicable in a multitude of contexts, including different content areas. If this is not possible, then the conceptualization of TPACK may need to be different for every imaginable content area, including subject domains within each of these areas. This calls into question the value of the framework itself as a cohesive, overarching model.

Despite the issue with content-related items, the inability to differentiate between and among the constructs of the TPACK framework is significant, and it calls into question the framework's precision, namely whether or not the domains described by the model exist independently (Gess-Newsome & Lederman, 1999). The inability to separate the components of the framework suggests that further refinement may be necessary. This is echoed by Angeli and Valanides (2009):

...Koehler et al.'s (2007) conceptualization of TPACK needs further theoretical clarity. It is argued that if TPACK is to be considered as an analytical theoretical framework for guiding and explaining teachers' thinking about technology integration in teaching and learning, then TPACK's degree of precision needs to be put under scrutiny... Furthermore, the boundaries between some components of TPACK, such as, for example, what they define as Technological content knowledge and Technological pedagogical knowledge, are fuzzy, indicating a weakness in accurate knowledge categorization or discrimination, and, consequently, a lack of precision in the framework (p. 157).

Because the TPACK domains do not statistically distinguish themselves, the heuristic value of the model is also diminished. Specifically, the heuristic value describes the extent to which the framework helps researchers predict outcomes or reveal new knowledge. This is a weakness in the current model, as effective models can be judged on their ability to explain and predict various phenomena (Järvelin & Wilson, 2003).

In addition to a model's explanatory power, Järvelin and Wilson (2003) lay out the following criteria for evaluating effective conceptual models:

- Simplicity: simpler is better, other things being equal.
- Accuracy: accuracy and explicitness in concepts are desirable.
- Scope: a broader scope is better because it subsumes narrower ones.
- Systematic power: the ability to organize concepts, relationships, and data in meaningful, systematic ways is desirable.
- Reliability: the ability, within the range of the model, to provide valid representations across the full range of possible situations.
- Validity: the ability to provide valid representations and findings is desirable.
- Fruitfulness: the ability to suggest problems for solving and hypotheses for testing is desirable.

In addition to weaknesses in TPACK's precision and heuristic value, the framework is also limited in its ability to assist researchers in predicting outcomes or revealing new knowledge. While it focuses on three major areas of teaching, namely content, pedagogy, and technology, it does not represent the causative interaction or the direction of the relationship between and among these domains. This makes it difficult for TPACK to be a fruitful model, as it does not suggest problems for solving or hypotheses for testing within the field of educational technology. It would appear from this study that there is room to continue to build on TPACK or even conceptualize other models that provide a less complex, more precise way of representing the effective integration of technology to improve student learning.

Table 5. Online teacher TPACK survey item analysis.

Survey item | Subscale | Corrected item-total correlation | Cronbach's alpha if item deleted
a | Technology | 0.474 | 0.935
b | Content | 0.531 | 0.933
c | Pedagogy | 0.586 | 0.933
d | Content | 0.536 | 0.933
e | Technological pedagogical content | 0.644 | 0.931
f | Pedagogical content | 0.582 | 0.933
g | Technology | 0.561 | 0.933
h | Technological pedagogy | 0.723 | 0.930
i | Pedagogical content | 0.517 | 0.933
j | Pedagogy | 0.565 | 0.933
k | Technological pedagogical content | 0.653 | 0.931
l | Technological pedagogy | 0.734 | 0.930
m | Content | 0.596 | 0.932
n | Technological content | 0.602 | 0.932
o | Technological content | 0.642 | 0.932
p | Technological pedagogy | 0.587 | 0.933
q | Technology | 0.577 | 0.933
r | Pedagogy | 0.549 | 0.933
s | Pedagogical content | 0.549 | 0.933
t | Technological content | 0.645 | 0.932
u | Pedagogical content | 0.599 | 0.932
v | Technological content | 0.582 | 0.933
w | Technological pedagogical content | 0.633 | 0.932
x | Technological pedagogical content | 0.660 | 0.931

Despite its fuzzy boundaries, the TPACK framework has theoretical appeal, providing an analytical structure highlighting the importance of content knowledge when incorporating the use of technology. As Koehler and Mishra (2008) recognize, "Instead of applying technological tools to every content area uniformly, teachers should come to understand that the various affordances and constraints of technology differ by curricular subject-matter content or pedagogical approach" (p. 22). This focus on subject-matter content is important when considering the effective use of technology.

However, this appeal is tempered by the difficulty in measuring each of the constructs described by the framework. Further, using this model, what changes can colleges of education enact to produce more skilled teachers? As Harris et al. (2009) point out:

TPACK is a framework for teacher knowledge, and as such, it may be helpful to those planning professional development for teachers by illuminating what teachers need to know about technology, pedagogy, and content and their interrelationships. The TPACK framework does not specify how this should be accomplished, recognizing that there are many possible approaches to knowledge development of this type. (p. 403)

There is confusion within the field of educational technology, not only concerning the definitions, but also the specific activities and methods to develop TPACK. This makes it difficult to implement knowledge from a framework that has yet to be fully defined, which limits its practical application. This is an important area for future research, including detailed examples of TPACK as it pertains to teacher practice (Cox & Graham, 2009).

5.1. Limitations

A national quantitative study, although rich in data, is not without its drawbacks. Because survey research consists of self-report rather than the measurement of observable behavior, concerns of accuracy are intrinsic. Although a survey methodology is appropriate when seeking to examine characteristics of a given population, it is not as accurate as actual observable behavior. This is a dilemma now facing the field as various methodologies are being developed in an attempt to measure and validate the TPACK model. That is, how do researchers balance the need for a valid and reliable instrument that can be readily distributed across settings with alternate, field-based methods that can document the notion of technological pedagogical content knowledge in practice?

In an effort to create a quantitative instrument to measure TPACK, it is worth noting that any survey is limited by its items and scales. To be certain, items contained within the current survey would benefit from additional refinement, including the consideration of content-specific questions and an update of the technology-related questions to include advancements in the read/write Web and Web 2.0 tools. Along with the limitations of instrument development and survey research, there are also inherent difficulties in attempting to measure a relatively new conceptual model with a relatively small, but burgeoning, literature base. This study represents an important step in beginning to question the model itself, how the data might support or refute the notion of TPACK, and how the field of educational technology might critically consider what has become a widely known and accepted framework.

6. Conclusion

This research examines the validity of the TPACK model, which might be effective in the hallways of academia but perhaps provides limited benefit to administrators, teachers, and, most importantly, students. From the practitioner data contained within this research, it seems that from the onset, measuring each of these domains is complicated and convoluted, potentially due to the notion that they are not separate. The data emerging from the current study support such a conclusion. This leads the researchers to consider what type of model might more accurately describe teachers' content, pedagogical, and technological knowledge, and how this model might better inform colleges of education and teacher education programs in preparing future educators for the challenges of teaching in the 21st century.

Appendix. Supplementary data

Supplementary data associated with this article can be found, in the online version, at doi:10.1016/j.compedu.2010.07.009.

References

Angeli, C., & Valanides, N. (2009). Epistemological and methodological issues for the conceptualization, development, and assessment of ICT–TPCK: advances in technological pedagogical content knowledge (TPCK). Computers & Education, 52(1), 154–168.
Archambault, L., & Crippen, K. (2009). Examining TPACK among K-12 online distance educators in the United States. Contemporary Issues in Technology and Teacher Education, 9(1). Retrieved from http://www.citejournal.org/vol9/iss1/general/article2.cfm
Bryman, A., & Cramer, D. (1990). Quantitative data analysis for social scientists. London: Routledge.
Cox, S., & Graham, C. R. (2009). Diagramming TPACK in practice: using an elaborated model of the TPACK framework to analyze and depict teacher knowledge. TechTrends, 53(5), 60–69.
Dillman, D. A. (2007). Mail and Internet surveys: The tailored design method (2nd ed.). New York: Wiley.
Gess-Newsome, J., & Lederman, N. G. (Eds.). (1999). Examining pedagogical content knowledge. Dordrecht: Kluwer.
Gorsuch, R. L. (1983). Factor analysis (2nd ed.). Hillsdale, NJ: Lawrence Erlbaum Associates.
Harris, J., Mishra, P., & Koehler, M. (2009). Teachers' technological pedagogical content knowledge and learning activity types: curriculum-based technology integration reframed. Journal of Research on Technology in Education, 41(4), 393–416.
Järvelin, K., & Wilson, T. D. (2003). On conceptual models for information seeking and retrieval research. Information Research, 9(1). Retrieved from http://InformationR.net/ir/9-1/paper163.html
Koehler, M., & Mishra, P. (2005). What happens when teachers design educational technology? The development of technological pedagogical content knowledge. Journal of Educational Computing Research, 32(2), 131–152.
Koehler, M., & Mishra, P. (2008). Introducing TPCK. In AACTE Committee on Innovation and Technology (Ed.), Handbook of technological pedagogical content knowledge (TPCK). New York: Routledge.
Manfreda, K. L., Bosnjak, M., Berzelak, J., Hass, I., & Vehovar, V. (2008). Web surveys versus other survey modes: a meta-analysis comparing response rates. International Journal of Market Research, 50(1), 79–104.
McEwan, H., & Bull, B. (1991). The pedagogic nature of subject matter knowledge. American Educational Research Journal, 28(2), 316–334.
Mishra, P., & Koehler, M. (2006). Technological pedagogical content knowledge: a framework for integrating technology in teacher knowledge. Teachers College Record, 108(6), 1017–1054.
Morgan, G. A., Leech, N. L., Gloeckner, G. W., & Barrett, K. C. (2004). SPSS for introductory statistics: Use and interpretation. Mahwah, NJ: Lawrence Erlbaum Associates.
Niess, M. L. (2008). Guiding preservice teachers in developing TPCK. In N. Silverman (Ed.), Handbook of technological pedagogical content knowledge (TPCK) for educators (pp. 223–250). New York: Routledge.
Schmidt, D., Baran, E., Thompson, A., Koehler, M. J., Shin, T., & Mishra, P. (2009, April). Technological pedagogical content knowledge (TPACK): the development and validation of an assessment instrument for preservice teachers. Paper presented at the annual meeting of the American Educational Research Association, April 13–17, San Diego, CA.
Segall, A. (2004). Revisiting pedagogical content knowledge: the pedagogy of content/the content of pedagogy. Teaching and Teacher Education, 20(5), 489–504.
Shih, T., & Fan, X. (2008). Comparing response rates from Web and mail surveys: a meta-analysis. Field Methods, 20(3), 249–271.
Shulman, L. (1986). Paradigms and research programs in the study of teaching: a contemporary perspective. In M. C. Wittrock (Ed.), Handbook of research on teaching (3rd ed., pp. 3–36). New York: Macmillan.