Intention to Use Biometric Systems
© 2011 e-Service Journal. All rights reserved. No copies of this work may be distributed in print or electronically without express written permission from Indiana University Press.
Benjamin Ngugi, Suffolk University
Arnold Kamis, Suffolk University
Marilyn Tremaine, Rutgers University
ABSTRACT
This study set out to investigate the critical factors that determine user intention to use a
biometric system. We integrated previous research in the technology acceptance and biomet-
ric engineering literatures and identified six important factors: Perceived System Security,
Perceived False Acceptance Rate, Perceived False Rejection Rate, Perceived System Invasiveness, System Trust, and Facilitating Conditions. Eleven hypotheses were developed regarding
the six factors to formulate direct and indirect influences of the six factors in a biometric
intention-to-use model. The results support the eleven hypotheses and validate the model,
which has an average variance explained of 31.1%. The contributions of the paper are
discussed and future research directions are suggested.
Keywords: Biometrics, Security, IT adoption, User attitudes, Trust, Privacy, Survey
INTRODUCTION
A lot of effort and money are being invested in the research and development of new information system technologies. Unfortunately, about fifty percent of such systems are considered failures, i.e., they fall short of the expectations set forth by technology managers.
It would help if managers could anticipate the critical factors that determine the success of
such technologies and act accordingly. A case in point is the unexpectedly slow rate of
biometric technologies acceptance. Biometric technologies offer several advantages over
traditional authentication methods. They are more accurate, harder to copy and they do
not require remembering passwords or carrying tokens, comparing favorably to knowl-
edge-based (e.g., password) or token-based (e.g., smart card) systems (O’Gorman, 2004).
The biometric community expected that biometric technologies would be quickly
and widely accepted by the public. However, the actual acceptance of biometric technolo-
gies has been considerably lower than expected (Albrecht et al., 2003). This slow rate of
adoption suggests that some critical factors need to be addressed to help the user decide, after a cost/benefit analysis, whether or not to use a new biometric technology. Various technical and non-technical challenges have been suggested to explain the slow uptake. Technical challenges include the fact that the performance of biometrics in the field
has been much lower than corresponding laboratory tests due to background noise, poor
quality of biometric scans, lack of scalability and lack of access to all individuals. Non-
technical challenges include issues with invasiveness, security concerns and user non-ac-
ceptance, for various reasons (Etter, 2003; Jain et al., 2004; James et al., 2006; National
Center for State Courts, 2002).
Table 1 shows a comparison of different biometrics. Biometrics can be grouped
into two categories. The first type is physical biometrics, for example, fingerprint and
face recognition systems which depend on the physical characteristics of an individual
(Clarke, 1994). The second type is behavioral biometrics, such as the typing pattern biometric and voice print recognition, which depend on the behavioral characteristics of an individual (in Table 1, the behavioral biometrics are speaker recognition, signature, and keystroke recognition). Physical biometrics, as seen
in Table 1, are more accurate, due to their stability and many years of research and refinement. However, they are proprietary, more expensive, and invasive (National Center for State Courts, 2002). Further, they cannot be changed (revoked) if the biometric database is compromised, since a human being has but one set.

Table 1: Comparison of Different Biometrics (National Center for State Courts, 2002)

Biometric              Security Level  Accuracy  User Acceptance  Non-Invasive  Hardware
Fingerprint            ***             ****      **               ***           Special, Inexpensive
Facial Recognition     **              ***       **               ****          Common, Inexpensive
Hand Geometry          **              ***       **               ****          Special, Mid-price
Speaker Recognition    **              **        ***              ****          Common, Inexpensive
Iris Scan              ***             ****      **               ****          Special, Expensive
Retinal Scan           ***             ****      **               *             Special, Expensive
Signature              **              **        **               ****          Special, Mid-price
Keystroke Recognition  **              *         ***              ****          Common, Inexpensive
DNA                    ****            ****      *                *             Special, Mid-price
As such, researchers are increasingly looking to behavioral biometrics as an alterna-
tive because of their acceptability and low cost. Since they are not as accurate as physical
biometrics, a two or three factor combination is recommended, in which the behavioral
biometric layer is combined with the current authentication system made of knowledge
and token layers (O’Gorman, 2004). This follows the “defense-in-depth” strategy which
utilizes several security controls with each layer, compensating the weakness of the other
to produce a much stronger overall defense (Whitman & Mattord, 2005). However
more research is required to better understand the critical factors that would result in
successful adoption of behavioral biometrics and the relationships between these factors.
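The "defense-in-depth" combination described above can be sketched in a few lines; this is an illustrative toy under assumed interfaces (the `User`, `Token`, and `TypingProfile` classes, the dwell-time threshold, and the credentials are all invented), not a production design:

```python
import hashlib

def hash_password(pw: str) -> str:
    # Illustrative only; a real system would use a salted, slow password hash.
    return hashlib.sha256(pw.encode()).hexdigest()

class Token:
    """Stands in for a token layer, e.g., a smart card or OTP device."""
    def __init__(self, secret):
        self.secret = secret
    def verify(self, code):
        return code == self.secret

class TypingProfile:
    """Stands in for a behavioral biometric layer based on typing rhythm."""
    def __init__(self, mean_dwell, tolerance=0.05):
        self.mean_dwell = mean_dwell   # average key-hold time in seconds
        self.tolerance = tolerance
    def matches(self, sample_dwell):
        return abs(sample_dwell - self.mean_dwell) <= self.tolerance

class User:
    def __init__(self, password, token_secret, mean_dwell):
        self.password_hash = hash_password(password)
        self.token = Token(token_secret)
        self.typing_profile = TypingProfile(mean_dwell)

def authenticate(user, password, token_code, sample_dwell):
    # Defense in depth: all three layers (knowledge, token, behavioral
    # biometric) must pass, so each compensates for the others' weaknesses.
    return (user.password_hash == hash_password(password)
            and user.token.verify(token_code)
            and user.typing_profile.matches(sample_dwell))

alice = User("s3cret", "482913", mean_dwell=0.12)
print(authenticate(alice, "s3cret", "482913", 0.13))   # all layers pass
print(authenticate(alice, "s3cret", "482913", 0.30))   # biometric layer fails
```

The point of the sketch is the conjunction in `authenticate`: an attacker who steals the password and token is still stopped by an unfamiliar typing rhythm.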
The next section of this paper identifies several critical factors for the acceptance of biometrics. Eleven hypotheses are proposed, incorporating relevant variables from the biometric and technology acceptance literatures. A methodology for validating the proposed model is then suggested, using a future-based scenario involving ATM systems upgraded with a biometric system. The study then explains the design of measurement
scales, the creation of viable constructs and the incorporation of these constructs into a
survey. We document development and administration of the survey as well as present
the results. Finally, we discuss the major contributions of the paper and suggest future
research directions.
RESEARCH MODEL DEVELOPMENT
Scholars have for a long time been interested in the factors that users consider in making
the decision to accept or reject new technologies. A substantial stream of technology ac-
ceptance research in Information Systems is dedicated to investigating technology acceptance (Ash, 1997; Ajzen & Fishbein, 1980; Ajzen, 1991; Davis, 1989; Davis et al., 1992;
Gefen et al., 2003; Mathieson, 1991; Venkatesh, 2000; Venkatesh et al., 2003; Whit-
worth & Zaic, 2006). This stream has produced different technology acceptance mod-
els, each with different strengths and weaknesses. The goals of most of the studies have
been to find factors that can be used to motivate the user to accept and start using the
new technology. The behavioral intention to use a new technology has been identified as
the best predictor of technology use; hence it will be the main dependent variable of this
study (Ajzen & Fishbein, 1980; Ajzen, 1991).
Biometrics are new technologies that have not been well explored in the technology
acceptance literature compared to some of the more established technologies, e.g., databases (Tanniru & Agarwal, 2002). Thus there is a need to integrate the perspectives of the
biometric researchers to produce a more complete model. Biometric researchers believe
that biometric technologies have the potential to become an irreplaceable part of authen-
tication systems (Jain et al., 2004). However, several factors need to be addressed before
the technology is widely accepted. The next subsections analyze the main factors from
previous work and their impact on the behavioral intention to use typing pattern
biometrics.
Biometric System Trust
A biometric system is a specialized information system with the goal of differentiating the
genuine user from impostors. It has all the components of a generic information system,
which is defined as an organized combination of people, hardware, software, communica-
tions networks, data resources, policies and procedures, that stores, retrieves, transforms,
and disseminates information in an organization (O’Brien & Marakas, 2007). A serious
challenge in implementing biometric systems is building public confidence (Chandra & Calderon, 2005). Building public confidence in biometric technology requires users to
trust the biometric system with all its components as defined above. Trust is defined as the
confidence in the goodwill of others not to cause harm to oneself when one is vulnerable
to them (Ring & Van de Ven, 1992). There are various types of trust (McKnight et al.,
2002); this study concentrates on biometric technology system trust.
A biometric system must provide a trusted computing platform to users (Chen et
al., 2000). Such a platform must provide “protection against theft and misuse of secrets
held in the biometric system and also provide a mechanism to prove its identity while
maintaining anonymity; and a mechanism to prove that it is executing in the expected
manner” (International Standards Organization, 2009). In deciding to use the new bio-
metric system, the user does not have complete information on the benefits and adverse
effects of using the new technology. This means that the user makes a leap of faith to
commit to the new technology, referred to as system trust (Bahmanziari et al., 2003).
The users are unlikely to accept the system unless they trust it, as previous research has
found. Trust was found to be generally important in the adoption of new technologies
(Fukuyama, 1995). For example, trust was found to be a predictor of an individual’s be-
havioral intention to use new RFID technologies (Sill et al., 2007). RFID has some simi-
larity with a biometric system in terms of the trust and privacy issues, but it is not known
whether the users will respond the same way, which leads to the following hypothesis:
H1: System Trust will positively influence the Behavioral Intention to Use biometric technology.
System Invasiveness
Civil Libertarians associate biometric technology with the stigma of being invasive to
personal privacy and having the potential of mass profiling (Chandra & Calderon,
2005). Likewise, biometric researchers believe that privacy is one of the factors inhibiting
broad usage of biometrics (Jain et al., 2004). Privacy refers to the control over when and
by whom the various biometric aspects of an individual can be sensed by others (Parker,
1983). This suggests that the user should have as much control as possible over when and by whom his or her biometric patterns are taken. Biometrics, by design, can provide irrefutable proof of
the identity of a person (Jain et al., 2004). This irrefutable proof can be used positively to
prevent unauthorized entry, but it can also be used negatively to invade personal privacy.
Privacy can be examined from two perspectives (Cavoukian & Tapscott, 1997). First
is the privacy of the person from the collection of personal behavior patterns. This is also
called system Invasiveness, a key construct in this study. The fact that biometrics can iden-
tify the owner of a given pattern can be abused to covertly identify and monitor users going
about their business without their consent and without them being aware that they are
being monitored. For example, previous studies have shown that the typing patterns made
when typing are unique for each individual (Monrose & Rubin, 2000). This means that a
biometric system has the potential to continuously monitor what the user is doing on his or
her computer, much like key-logging spyware. Good biometric technologies need to conform to
the limits of social and ethical acceptability (Albrecht, 2001). The second perspective is the
privacy of the individual’s stored data. Many people fear that the system may either be col-
lecting unauthorized or unnecessary data or may be using authorized data for other unin-
tended purposes (Cavoukian & Tapscott, 1997). The current best practice is that the actual
biometric collected from a person should not be stored. Rather, only a sanitized template
consisting of the required features need be stored (Reid, 2004). Physical invasiveness was
found to have a negative impact on an individual’s intention to adopt new biometric tech-
nologies (James et al., 2006). The typing pattern biometric is less invasive compared to
physical biometrics (National Center for State Courts, 2002); hence the effect of this factor
on different biometrics is not known. This suggests the following hypothesis:
H2: Perceived System Invasiveness will negatively influence the Behavioral Intention to Use a biometric system.
Trust and invasiveness have an inverse relationship in social interactions. Previous
work found that trust and relational quality between college students and their parents
decrease when the parents become highly invasive of the students' lifestyle (Spitzberg & Cupach, 1998). The same happens with technology. For example, privacy invasiveness
was found to be a negative predictor of trust during the adoption of RFID technologies
(Sill et al., 2007). Trust can be earned, which increases consumers' willingness to give out their private data. Over 72% of Web users said they “would give Web sites their demographic information if the sites would only provide a statement regarding how the information collected would be used” (Hoffman et al., 1999). This suggests that such an information collection statement would increase trust and thereby reduce the
feeling of invasiveness. This experience suggests that the more invasive a biometric,
the less trustworthy users find it to be. This leads to the following hypothesis:
H3: Perceived System Invasiveness will negatively influence the Trust placed in a biometric system.
Biometric System Security
A biometric system should be secure to make sure that the stored patterns do not end up
in the wrong hands. Security refers to the assurance that the sensed measurements are
not fraudulent (Jain et al., 2004). Biometric systems can fail from two categories of
weaknesses. The first is intrinsic failure, which results from a weakness in any of the biometric system's key modules (sensing, feature extraction, or matching), leading to an incorrect decision. The second failure can result from an adversary
attack. The adversary may be an insider who uses the given privileges to tamper with the
biometric settings and thus compromise the system. The adversary could also be an ex-
ternal attacker who hijacks the system and fools it using fake biometrics (Jain et al.,
2008). The biometric system, then, must be secure enough to prevent the insertion of
fraudulently obtained patterns by the hacker. The less secure a biometric system is, the
higher the chance that hackers can hijack the system and monitor user behavior, thus in-
vading personal privacy. This suggests an inverse relationship between Perceived System
Invasiveness and Perceived System Security leading to the following hypothesis:
H4: Perceived System Security will negatively influence Perceived System Invasiveness.
Trust and security have been suggested as the two most critical factors that determine the
acceptability of a website (Ruppel et al., 2003). Biometric system security is also linked
to trust as a good system design discourages misuse and advocates the interest of the user
(Bernecker, 2006). A strong biometric system can be used to enhance security, which in
turn can reduce system mistrust (Kleist, 2007). This will enhance consumers’ trust,
which has been found to depend on the perceived security of the technology (Chellappa
& Pavlou, 2002). This suggests that users will trust a biometric system more if they feel
that it is secure, which leads to the following hypothesis:
H5: Perceived System Security will positively influence the user’s trust in a biometric system.
System Accuracy
Biometric system accuracy is another factor that biometric researchers believe is inhibit-
ing broad usage of biometrics. Accuracy refers to the ability to accurately and efficiently
represent and recognize biometric patterns (Jain et al., 2004). Biometric accuracy can be
hindered by two main types of errors: (1) false acceptances and (2) false rejections. False
acceptance rate (FAR) is the probability that an impostor will be falsely accepted as the
correct user, also called Type II error. False rejection rate (FRR), also called Type I error,
is the probability that the correct user will be falsely rejected by the system in the authen-
tication of his or her biometric features (Graeventiz, 2003). There is an inverse relation-
ship between the two errors such that reducing one type of error results in the increase of
the other. Thus the normal procedure is to settle on a compromise according to the security posture of an organization. For example, a highly security-conscious military organization may sacrifice FRR for FAR, while a university system may sacrifice FAR for FRR. Further, a good biometric should be scalable (Jain et al., 2004). Scale refers to the number of subjects using the system. There is usually a tendency for the system's accuracy
to decrease with an increasing number of users. This is because the higher the number of
users, the higher the probability that patterns of two or more users may overlap. How-
ever, a good system should degrade gracefully, i.e., in a gradual manner. FAR is impor-
tant because it determines the strength of the matching algorithm, suggesting that the
stronger the algorithm, the less likely that a false authentication will happen (Reid,
2004). A biometric system with high FAR errors is highly insecure, since it allows hack-
ers to access the protected resource. Users would perceive such a system as insecure,
which leads to the following hypothesis:
H6: Perceived False Acceptance Rate will negatively influence the Perceived System Security offered by a biometric system.
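The FAR/FRR definitions and the threshold tradeoff described above can be illustrated with a small sketch; the match scores and threshold values below are invented for demonstration:

```python
# Illustrative computation of FAR and FRR at a decision threshold.
# A matcher produces a similarity score per attempt; accept if score >= threshold.
genuine_scores  = [0.91, 0.85, 0.78, 0.60, 0.95]   # genuine-user attempts
impostor_scores = [0.20, 0.35, 0.55, 0.65, 0.10]   # impostor attempts

def far_frr(threshold):
    # FAR (Type II error): fraction of impostors accepted (score >= threshold).
    far = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    # FRR (Type I error): fraction of genuine users rejected (score < threshold).
    frr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    return far, frr

# Raising the threshold lowers FAR but raises FRR, and vice versa:
print(far_frr(0.50))   # lenient threshold: FAR 0.4, FRR 0.0
print(far_frr(0.70))   # strict threshold:  FAR 0.0, FRR 0.2
```

The two printed operating points show the inverse relationship the text describes: the security-conscious deployment picks the strict threshold (no false acceptances, some lockouts), the convenience-oriented one picks the lenient threshold.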
Further, security is a critical concern to most customers, especially if personal or financial
information is concerned (Stafford, 2001). This means that organizations must convince
their customers that their systems are secure and the privacy of the customers' information has been safeguarded (Nath et al., 2001). This would not be possible if the said systems are giving access to hackers due to high false acceptance rates. Rather, such a high
level of false acceptance rate will breed mistrust in the users if we are to judge from expe-
rience in other fields of technology. For example, customer trust in mobile commerce
devices was found to positively correlate with accuracy (Siau & Shen, 2003). This leads
to the following hypothesis:
H7: Perceived False Acceptance Rate will negatively influence the Biometric System Trust.
The three key pillars of security require that a system ensure confidentiality, integrity,
and availability to the genuine user whenever the services are needed (Whitman & Mat-
tord, 2005). But if a system is rejecting the user due to high FRR, then it means it falls
short on the availability pillar. This is annoying to the user and will make the user doubt
the overall system. Doubts create mistrust. Thus, trust, ease of use and accuracy are
complementary, as previous work supports. For example, researchers found
that quick adoption of personalization systems required that the system be easy to use, be
highly accurate and build trust in the delivered personalization (Buczak et al., 2002). A
system with a high level of FRR will frequently reject and thus lock out the genuine user. This means it is not easy to use and it is not accurate, as it cannot recognize
the genuine user. Failure on the ease of use and accuracy criteria suggests that it
will also fail on the trust criterion. Thus the user will not trust such a system to do the job
of providing security to protected assets, which leads to the next hypothesis:
H8: Perceived False Rejection Rate will negatively influence the Trust placed in a Biometric System.
Facilitating Conditions
A facilitating condition is defined as the degree to which an individual believes that an
organizational and technical infrastructure exists to support the use of a system (Ven-
katesh, 2000). The existence of a supporting infrastructure can be a precondition for the
success of a new technology. For example, typing pattern biometric technology needs to
be standardized and supported by mainstream manufacturers. The technology needs to
be compatible with current hardware and software (Clarke, 1994). The more experience
a user has with a similar biometric system, the more confident he or she will feel about
using it without being rejected erroneously, which leads to the following hypothesis:
H9: Facilitating Conditions will negatively influence the Perceived False Rejection Rate.
Previous work also suggests a relationship between trust and facilitating conditions. For
example, having safety mechanisms in the online world was shown to increase the trust
that a customer has in a web site (Gefen et al., 2003). Thus safety mechanisms serve as
facilitating conditions, which enhance trust. The perception by the user of the environ-
ment and the institution in which the technology is being used has been found to increase trust (McKnight et al., 2002). McKnight et al. (2002) found that users were more
hesitant to engage in online shopping because they feared the environment. Facilitating
conditions were also found to influence the perceived trust in mobile devices (Liu et al.,
2005), which suggests that the same should happen for the biometric systems. This leads
to the following hypothesis:
H10: Facilitating Conditions will positively influence the Trust in a biometric system.
The amount of expertise that a customer needs to acquire in order to use a new technology
has been shown to affect the customer’s intention to adopt a technology (Chircu et al.,
2000). Thus a customer would be quicker to adopt a new technology if he or she already
possesses most of the required expertise. In line with previous research (Venkatesh et
al., 2003), particularly in the early stages of training and introduction, we suggest that fa-
cilitating conditions will be positively correlated with the behavioral intention to use the
typing pattern biometric. This effect should disappear in later stages of adoption. Thus,
the following hypothesis is proposed:
H11: Facilitating Conditions will positively influence the Behavioral Intention to use a biometric system.
A structural model was then built to represent the eleven hypotheses connecting the seven constructs, as shown in Figure 1.
RESEARCH DESIGN
The previous section helped develop a biometric model that shows the role of trust and
other related factors in the formation of the intention to use a new typing pattern biometric technology. This study investigates the important factors that would determine the intention to use new behavioral biometrics, as represented by typing pattern biometrics.

Figure 1: Biometric Intention to Use Model
The typing pattern biometric technology creates unique profiles of the patterns made by
a user when typing as a basis for later identification or verification. This biometric was
chosen for this study because the technology can be added easily to an existing authenti-
cation system (password or token-based system) simply by adding software. Further, this
biometric is inexpensive, easy to use, and less invasive than other biometric technologies. A practical
three factor authentication application could add a typing pattern biometric layer to an
ATM smart card with a PIN system. This can be achieved by installing computer software that monitors the typing duration patterns and builds a unique profile to verify
each individual. The resulting model should apply to general biometrics and other infor-
mation security technologies although they may load differently on the various factors.
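A minimal sketch of how such a typing duration profile might be built and checked is given below; it assumes only key-hold (dwell) times and a simple z-score rule, both invented for illustration, whereas real keystroke dynamics systems use richer timing features and statistical models:

```python
from statistics import mean, stdev

def build_profile(training_samples):
    # Each sample is a list of dwell times (seconds), one per key of the PIN.
    per_key = list(zip(*training_samples))        # group timings by key position
    return [(mean(k), stdev(k)) for k in per_key]

def verify(profile, sample, z_limit=2.5):
    # Accept only if every keystroke timing lies within z_limit standard
    # deviations of the enrolled mean for that key position.
    for (mu, sigma), t in zip(profile, sample):
        if sigma > 0 and abs(t - mu) / sigma > z_limit:
            return False
    return True

# Enrollment: three entries of the same 4-digit PIN by the genuine user.
enrollment = [[0.10, 0.12, 0.11, 0.13],
              [0.11, 0.13, 0.10, 0.12],
              [0.09, 0.12, 0.12, 0.14]]
profile = build_profile(enrollment)

print(verify(profile, [0.10, 0.12, 0.11, 0.13]))   # genuine-like rhythm
print(verify(profile, [0.30, 0.40, 0.35, 0.45]))   # impostor-like rhythm
```

Even with the correct PIN digits, the second sample is rejected because its rhythm deviates from the enrolled profile, which is the verification idea behind the scenario in this study.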
Previous research has shown that well designed scenarios can be used in experi-
ments to trigger user attitudes towards some future event (Gregory & Duran, 2001). In
order to validate the model in Figure 1, users were shown a future biometric ATM-based
scenario. A questionnaire to measure the users’ attitudes toward a typing pattern biomet-
ric was designed and developed while taking steps to ensure its reliability and validity.
The questionnaire was developed in iterative phases (Churchill, 1979; Malhotra &
Groover, 1998) and refined following the methods of previous instrument development
research (Mohtadi, 1998; Templeton et al., 2002). The development phases were: (1)
conceptual specification and definition of constructs, (2) construction of items, (3) data
collection, and (4) measurement purification.
The seven constructs were specified and defined as in the previous work section.
They are perceived system security, perceived false acceptance rate, perceived false rejection
rate, perceived system invasiveness, system trust, facilitating conditions and the behavioral
intention to use a new biometric technology. The relationships between the seven constructs were also given and eleven hypotheses suggested, as shown in the model in Figure 1.
The original draft of the questionnaire included a total of 21 question items. Each
question item was designed using a five-point Likert scale. Responses ranged from strongly
agree (4 points) to strongly disagree (0 points). This research used validated question items
from previous research as much as possible. However, there were several instances where
no items were available from previous work. New question items were created in the latter
case, and care was taken to ground these items in previous theory, as explained below.
1. System Trust—This consisted of two new question items grounded in previous
trust research (McKnight et al., 2002; McKnight et al., 1998).
2. Perceived System Invasiveness—This consisted of three new question items
grounded in previous research (Cavoukian & Tapscott, 1997).
3. Perceived False Acceptance Rate—Two question items were designed, grounded in
the biometric literature (Graeventiz, 2003).
4. Perceived False Rejection Rate—Two question items were designed, grounded in
the biometric literature (Graeventiz, 2003).
5. Behavioral Intention to Use—This construct consisted of three question items
which were adapted from the original TAM study (Davis, 1989).
6. Facilitating Conditions—This construct consisted of one question item on timing
from the marketing literature (Moore & McKenna, 1999) and two question items
based on standardization and interoperability literature (Thompson et al., 1991).
7. Perceived System Security—This consisted of three new question items grounded
in the biometric literature (Jain et al., 2004).
Issues of validity and reliability were addressed by evaluating the questionnaire
items against a yardstick of recommended ideal survey attributes (Malhotra & Groover, 1998). This authoritative best practice is used in information systems and marketing to
give guidance on how question items should be designed. Care was taken to satisfy reli-
ability and validity concerns in each phase.
The unit of analysis was defined as potential users of biometric keypad bank ATMs.
All question items were refined to directly address the unit of analysis. One question item from the facilitating conditions construct (on compatibility and standards) was
dropped as the target audience was deemed unknowledgeable in such matters. Every effort
was taken to avoid measurement errors through the use of multi-item scales, content validation and pretesting. The first step was a clear definition of each construct and its determinants, as explained in the above section on model development. The second step was the use
of existing measurement scales wherever possible or alternatively making sure that new ques-
tion items were grounded in previous literature in the few cases where there were no existing
scales. The questions were pretested and refined iteratively until a final set was reached.
Subjects and Task
The target population of the survey was defined to be users of bank ATMs. However, due
to time constraints, the sample was drawn from the college student population. They are
technically savvy and look optimistically to the use of new technologies (Schooley,
2005). Hence they are normally early adopters of technology, e.g., mobile technology,
video games and social networking technology. This group represents the future; hence
its attitude to biometrics is a critical predictor of technology adoption in general. They
are targeted first when introducing such a new innovation (Rogers, 1995).
Subjects were drawn from a four-year university and a two-year community college in the Northeastern USA: 159 students responded out of 270 invited from 8 undergraduate classes. This gave an overall response rate of 59 percent, above the required
minimum of 20 percent to be a representative sample of the population surveyed (Yu &
Cooper, 1983).
The subjects were given a brochure with a future-based scenario. In the scenario,
users were told that the current bank ATM machines were going to be replaced with new
biometric keypads. The only difference being that the user typing patterns will be cap-
tured and used along with the PIN to verify that the user is indeed the genuine user and
not an imposter (See Appendix A). The goal of the scenario was to create a mental image
of the biometric keypad in the subject’s mind. The scenario was designed following rec-
ommended best practice (Gregory & Duran, 2001). A key requirement from the best
practice was that the scenario should depict the occurrence of a future event using a con-
crete example that was representative. This study used a bank ATM example, which is a
common example of an application which could be substantially improved by the addi-
tion of a typing pattern biometric layer in a two or three factor authentication (O’Gorman,
2004). The subjects were then asked to complete a questionnaire on their attitudes to-
ward such a typing pattern biometric ATM.
RESULTS
This section gives results from the user attitude survey including descriptive statistics, a
validation of the biometric acceptance model and the diagnostic statistics.
Descriptive Statistics
All the questions used a Likert scale ranging from 0–4 with 4 being the highest score.
Negatively phrased question scores were reversed so that all the scores were in one direc-
tion. Some constructs had more items than other constructs. The mean of each construct
was computed so that the constructs could be compared. Table 2 summarizes the mean
and standard deviation results from this survey.
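The reverse-scoring and construct-mean computation described above can be sketched as follows; the item responses and the choice of which item is negatively phrased are invented for illustration:

```python
# Reverse negatively phrased 0-4 Likert items, then average each
# construct's items for one respondent.

def reverse(score, max_score=4):
    return max_score - score            # 0 <-> 4, 1 <-> 3, 2 stays 2

respondent = {"SI1": 1, "SI2": 3, "SI3": 2, "BUI1": 4, "BUI2": 3, "BUI3": 3}
negatively_phrased = {"SI2"}            # assumed reverse-coded item

adjusted = {item: (reverse(v) if item in negatively_phrased else v)
            for item, v in respondent.items()}

constructs = {"SystemInvasiveness": ["SI1", "SI2", "SI3"],
              "BehavIntentToUse":   ["BUI1", "BUI2", "BUI3"]}

# Per-construct mean, so constructs with different item counts are comparable.
means = {c: sum(adjusted[i] for i in items) / len(items)
         for c, items in constructs.items()}
print(means)
```

Averaging (rather than summing) is what makes constructs with two, three, or more items comparable on the same 0-4 scale, as the text notes.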
Test for Normality
Parametric statistics assume that sample data is from a normal distribution; hence it is
prudent to confirm whether this assumption is true before performing parametric tests.
A normality test was conducted to investigate whether the sample data was normally dis-
tributed. The Shapiro-Wilk test is the standard measure for testing normality when N is
less than 2000 (Conover, 1999). The test works by assuming that the data is normally
distributed (H0). The null hypothesis is rejected if the p-value is less than 0.05, in which case we infer that the data is not normally distributed.
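A Shapiro-Wilk test of this kind can be run with SciPy's `stats.shapiro`; the sample below is invented, but like the study's responses it consists of discrete 0-4 Likert scores:

```python
from scipy import stats

# Invented 0-4 Likert responses (50 observations); discrete, tied values.
likert_scores = [0, 1, 1, 2, 2, 2, 2, 3, 3, 4] * 5

# shapiro returns the W statistic and the p-value for H0: data is normal.
w_stat, p_value = stats.shapiro(likert_scores)
if p_value < 0.05:
    print(f"W={w_stat:.3f}, p={p_value:.4f}: reject H0; data not normal")
else:
    print(f"W={w_stat:.3f}, p={p_value:.4f}: cannot reject normality")
```

With heavily tied discrete data such as Likert scores, the test typically rejects normality, which is the pattern Table 3 reports for all constructs.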
Table 3 shows the Shapiro-Wilk values and the p-values calculated for the con-
structs examined in this study. All constructs had p-values less than 0.0001, meaning
that their data was not normally distributed. Therefore, we chose to deploy PLS, a distri-
bution-free approach, for validating the biometric acceptance model (Garson, 2009). In
addition, it was noted that some of the items were similar but not loading together. On
further analysis, we realized that they were formative rather than reflective. Reflective measures are indicators of the latent construct, while formative measures combine to form the latent construct (Chin & Newsted, 1998). PLS software was therefore also appropriate because several of the constructs were formative rather than reflective, which LISREL cannot handle, and because PLS estimates of path weights tend to be more conservative than those of LISREL (Chin, 1998; Chin & Newsted, 1998). For our model, we needed a minimum sample size of only 40, since no variable has more than four incoming paths or four formative indicators. The sample size of 159 was substantially greater than this minimum. The measurement model was validated in three ways: (1) content validity, (2) construct reliability, and (3) discriminant/convergent validity.
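The minimum-sample heuristic used above, ten times the larger of the maximum number of incoming structural paths and the maximum number of formative indicators on any construct (Chin & Newsted, 1998), can be expressed as:

```python
def pls_min_sample_size(max_incoming_paths, max_formative_indicators):
    """Ten-times rule of thumb for the minimum PLS sample size."""
    return 10 * max(max_incoming_paths, max_formative_indicators)

minimum = pls_min_sample_size(4, 4)  # no construct exceeds four of either
```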
Table 2: Descriptive Statistics for the Six Constructs

Construct                         Item   Mean (stdev)
BehavIntentToUse                  BUI1   2.491 (1.037)
                                  BUI2   2.421 (0.970)
                                  BUI3   2.748 (0.849)
P. System Security                SV1    2.019 (1.128)
                                  SV2    2.145 (1.2265)
                                  SV3    1.931 (1.197)
P. System Invasiveness            SI1    2.101 (1.186)
                                  SI2    2.409 (0.949)
                                  SI3    2.598 (0.880)
Facilitating Conditions           TM1    2.528 (0.899)
                                  FC1    2.642 (0.844)
System Trust                      TST1   1.994 (1.094)
                                  TST2   2.541 (0.862)
Perceived False Rejection Rate    FRR1   2.818 (0.967)
                                  FRR2   1.667 (0.979)
Perceived False Acceptance Rate   FAR1   2.642 (0.888)
                                  FAR2   2.572 (0.958)
Content Validity
Content validity requires that the questionnaire items represent the materials (or content
areas) that they are supposed to represent (Rosenthal & Rosnow, 1991). This was en-
sured in two ways: (1) By adapting measurement scales tested in prior studies whenever
such scales could be found. (2) By grounding new measurement scales in previous litera-
ture whenever previous measurement scales could not be found.
Construct Validity and Reliability
Tables 4 and 5 show that the construct reliability values for all the constructs tend to be above the recommended minimum of 0.7 (Nunnally, 1978). Thus, we see a correspondence between the measurement variables and their theoretical constructs, which validates that the manifest variables are good measures of the constructs.
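As a check on the reliability figures, the standard composite reliability formula (our assumption about how the Table 5 values were computed) reproduces the BehavInt entry from its item loadings:

```python
def composite_reliability(loadings):
    """CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of (1 - loading^2))."""
    explained = sum(loadings) ** 2
    error = sum(1.0 - l * l for l in loadings)
    return explained / (explained + error)

# BehavInt item loadings from Table 5; the reported CR is 0.882.
cr_behav_int = composite_reliability([0.8142, 0.8404, 0.8785])
```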
Discriminant/Convergent Validity
Discriminant validity checks whether the construct is different from the other con-
structs. Measures that are supposed to be related should, in practice, correlate well to-
gether and not correlate to dissimilar measures (Rosenthal & Rosnow, 1991). Table 6
shows the association values for the manifest variables used to measure their constructs as
well as their associations with other constructs. It is observed that measures of each of the
constructs relate well with their theoretical construct through high associations and do
not relate as much to the other constructs. Note that convergence tends to be lower for
formative constructs than for reflective constructs.
Table 3: Tests for Normality
Construct Shapiro-Wilk Statistic p-value
Perceived False Rejection Rate 0.951 <0.0001
Perceived False Acceptance Rate 0.907 <0.0001
Perceived System Security 0.958 <0.0001
Perceived System Invasiveness 0.957 <0.0001
System Trust 0.957 <0.0001
Facilitating Conditions 0.974 <0.0001
Behavioral Intention to Use 0.956 <0.0001
These values indicate that the measures used in this study have adequate discriminant validity. The only measure below 0.6 was that of Facilitating Conditions, most likely because it is a formative construct; formative constructs typically converge more weakly than reflective constructs (Chin & Newsted, 1998; Gefen et al., 2000).
Since the measures have been shown to have both a high convergent validity and
high discriminant validity, we can conclude that they have adequate construct validity.
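The comparison described above can be sketched as a Fornell-Larcker-style check (our reading of Table 6: the construct's diagonal entry is compared against its correlations with the other constructs):

```python
def discriminant_ok(diagonal_value, correlations_with_others):
    """Discriminant validity holds when the construct's diagonal entry in the
    validity matrix exceeds its absolute correlation with every other construct."""
    return all(diagonal_value > abs(r) for r in correlations_with_others)

# Behavioral Intention row of Table 6: diagonal 0.714 vs. its correlations.
bi_ok = discriminant_ok(0.714, [0.416, -0.476, 0.601, 0.666, -0.542, -0.288])
```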
Test of Structural Model
A partial least squares analysis resulted in the model shown in Figure 2 (degrees of freedom = 499).
The model shows 17.4 percent variance explained for Perceived System Security,
15.7 percent variance explained for Perceived System Invasiveness, 53.5 percent variance
explained for System Trust, and 55.3 percent variance explained for the Behavioral In-
tention to Use the biometric system.
The beta coefficient for H1 was 0.414, statistically significant at a level of p < 0.001. This was not surprising as the expectation from the biometric literature was that System Trust is one of the most important factors in the intention to accept a new biometric technology. System Trust was the most influential predictor of Behavioral Intention to Use.

Table 4: Factor Analysis of Reflective Constructs, Principal Components, Varimax Rotation

Construct               Item   BehavInt  P. Sys. Sec.  P. Sys. Inv.  P. FAR   P. FRR
BehavInt                BUI1    0.849     0.085        -0.028         0.251    0.013
                        BUI2    0.704     0.278         0.352        -0.095   -0.284
                        BUI3    0.727     0.183         0.303         0.181   -0.206
P. System Security      SV1     0.061     0.774         0.340         0.216    0.019
                        SV2     0.258     0.841        -0.016        -0.007    0.012
                        SV3     0.110     0.842         0.114         0.212   -0.116
P. System Invasiveness  SI1    -0.338    -0.060        -0.637        -0.257   -0.246
                        SI2    -0.081    -0.195        -0.880        -0.022   -0.023
P. FAR                  FAR1   -0.517    -0.206        -0.062        -0.641    0.000
                        FAR2   -0.108    -0.205        -0.147        -0.840    0.153
P. FRR                  FRR1   -0.048    -0.154         0.064         0.064    0.877
                        FRR2   -0.198     0.121         0.091        -0.315    0.727
Cronbach's Alpha                0.798     0.826         0.694         0.611    0.604
The beta coefficient for H2 was -0.166, statistically significant at a level of p < 0.01.
This demonstrates that Perceived System Invasiveness is important and a specific concern
apart from FRR and System Trust. This was the least influential predictor of Behavioral
Intention.
The beta coefficient for H3 was -0.246, statistically significant at a level of p < 0.001. This demonstrates that Perceived System Invasiveness, the annoyance associated with physically intruding upon the user, is distinct from and antecedent to System Trust. This was the second most influential predictor of System Trust. To the extent that the system is perceived to be invasive, it is perceived to be less trustworthy. Conversely, a good, trustworthy system does not need to be invasive.

Table 5: Construct Reliability of the Measurement Model, df = 499; *** p < 0.001, ** p < 0.01

Construct               Type        Composite Reliability   Item   Loading   t-stat
BehavInt                Reflective  0.882                   BUI1   0.8142    28.4129***
                                                            BUI2   0.8404    23.4232***
                                                            BUI3   0.8785    30.5873***
P. Sys. Security        Reflective  0.896                   SV1    0.8800    41.4708***
                                                            SV2    0.8128    22.2421***
                                                            SV3    0.8896    44.7908***
P. System Invasiveness  Reflective  0.837                   SI1    0.8337    19.0957***
                                                            SI2    0.8438    19.3027***
FacCond                 Formative   0.708                   TM1    0.8914    14.8905***
                                                            FC1    0.6602     6.3247***
                                                            FC2    0.2984     2.4914**
                                                            FC3    0.5436     4.3308***
System Trust            Formative   0.782                   TST1   0.8204    15.7855***
                                                            TST2   0.7866    14.3015***
P. FAR                  Reflective  0.868                   FAR1   0.8949    29.7460***
                                                            FAR2   0.8546    24.9385***
P. FRR                  Reflective  0.828                   FRR1   0.7732    10.3746***
                                                            FRR2   0.9062    25.3571***
The beta coefficient for H4 was -0.396, statistically significant at a level of p < 0.001. This demonstrates that system security is a technology pre-condition, necessary but not sufficient for low Perceived System Invasiveness. We reason analogously to the relationship with System Trust: if a system is perceived to be secure, it will be perceived as less invasive. There will be a positive association between Perceived System Security and the user's trust in a biometric system.
The beta coefficient for H5 was 0.297, statistically significant at a level of p < 0.001.
This demonstrates that Perceived System Security is a technology pre-condition, distinct
from and antecedent to System Trust. Perceived System Security means to users the in-
trinsic capability of the technology, which, if high, is worthy of trust. Perceived System
Security is the most influential predictor of System Trust.
The beta coefficient for H6 was -0.417, statistically significant at a level of p < 0.001. This reflects the technical merit of Perceived System Accuracy (FAR), i.e., the algorithms in the software. Fundamentally, if the system is not only difficult to hack but also algorithmically robust, it will be accurate and will not allow imposters to use the system.
Table 6: Convergent/Discriminant Validity (AVE on the diagonal); *formative constructs

                    Behav. Int.  P. Sys. Sec.  P. Sys. Inv.  Fac. Cond.  Sys. Trust  P. FAR  P. FRR
Behav. Int.          0.714
P. Sys. Security     0.416        0.742
P. Sys. Inv.        -0.476       -0.435        0.600
Fac. Cond.           0.601        0.228       -0.272         0.404*
Sys. Trust           0.666        0.522       -0.475         0.522       0.646*
P. FAR              -0.542       -0.417        0.381        -0.558      -0.528      0.766
P. FRR              -0.288       -0.087       -0.090        -0.372      -0.349      0.228   0.710
The beta coefficient for H7 was -0.132, statistically significant at a level of p < 0.05. This demonstrates that the fundamental algorithms of the software have a direct influence on System Trust. That is, apart from the technology's vulnerability to hacking or its assurance of sensitive personal information, fundamental technical accuracy matters. Perceived System Accuracy (FAR) is the weakest influence on System Trust.
The beta coefficient for H8 was -0.228, statistically significant at a level of p < 0.001. This suggests that a system perceived as overly vigilant, i.e., one that rejects legitimate users, will be perceived as untrustworthy. This was the second most influential predictor of Behavioral Intention to Use, weaker only than Perceived System Security.
The beta coefficient for H9 was -0.371, statistically significant at a level of p <
0.001. This is a very strong influence. The more positive the facilitating conditions, the
lower the perceived false rejection rate should be. That is, facilitating conditions strongly
influence the extent to which the system is perceived to be overly vigilant.
The beta coefficient for H10 was 0.223, statistically significant at p < 0.01. This is the fourth strongest influence on System Trust. The more positive the facilitating conditions, the more trustworthy the system is perceived to be.
The beta coefficient for H11 was 0.334, statistically significant at p < 0.001. This is a moderately strong influence on Behavioral Intention to Use, second only to the influence of System Trust. The more positive the Facilitating Conditions, the greater the Behavioral Intention to Use.
Figure 2: Results
In conclusion, all eleven hypotheses were supported and the biometric model is
validated.
DISCUSSION
This study set out to investigate the critical factors that determine the intention to use
biometric technologies. This was done, first, by integrating previous research done in
both the technology acceptance and biometric engineering literature. Six important fac-
tors were identified and eleven hypotheses were proposed incorporating these factors into
a biometric acceptance model. Finally, we designed and conducted a survey and analyzed the
results with PLS analysis. The results supported the proposed hypotheses and the bio-
metric model was validated. We show in this research a parsimonious nomological net-
work, including both direct and indirect influences on the Behavioral Intention to Use a
biometric system.
The model demonstrated the important role played by facilitating conditions in reducing the perceived False Rejection Rate and in enhancing both System Trust and the Behavioral Intention to Use a biometric system. This shows that facilitating conditions cannot be taken for granted or ignored. Users felt that the time is right to start using biometric ATMs, perhaps because of the increasing cases of identity fraud. Further, users felt that they had the necessary knowledge and skills to use a biometric system without making mistakes, given their long usage of similar-looking ATMs. This suggests that the success of a technology is not only dependent on its attributes but also on the readiness of the general environment to receive such a technology. The fact that users are still apprehensive about the accuracy, security and invasiveness of the technology also suggests that the technology is still too new or unknown and will take some time to mature and be widely accepted in the market.
The model shows that the loading from False Acceptance Rate explains 17.4% of the variance in the system security construct. This suggests a very strong relationship between FAR and system security, in line with previous literature. This result is a stark reminder to biometric designers that the primary role of biometrics is to provide security by controlling who gets access to protected organizational assets. There is no way that a biometric can provide this important service if it cannot accurately separate the impostors from the authorized users. Thus the path to high security is through the reduction of the False Acceptance Rate.
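For readers unfamiliar with the two error rates, a minimal sketch (illustrative match scores, not data from this study) shows how FAR and FRR are computed around an acceptance threshold:

```python
def far_frr(impostor_scores, genuine_scores, threshold):
    """FAR: fraction of impostor attempts accepted (score >= threshold).
    FRR: fraction of genuine attempts rejected (score < threshold)."""
    far = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    frr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    return far, frr

# Invented match scores, with an accept threshold of 0.5.
far, frr = far_frr([0.10, 0.30, 0.55, 0.20], [0.90, 0.80, 0.45, 0.70], 0.5)
```

Raising the threshold lowers FAR but raises FRR, which is precisely the security-versus-vigilance trade-off the model captures.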
Facilitating conditions, false acceptance rate, false rejection rate, perceived system
security and perceived system invasiveness are five distinct and important predictors of
system trust, the key mediating variable, and behavioral intention to use, the dependent
variable. Thus system trust is seen to be at the center of the model and has some mediating
effect between all the other factors and the behavioral intention construct. This demon-
strates the central role of system trust in forming the intention to use biometric systems.
System trust, to the best of our knowledge, has not been given the importance that it de-
serves in the biometric literature. Hence the lesson, particularly for biometric engineers, is
that the introduction of new biometric systems will be unsuccessful without trust, even if
the system is technologically accurate, secure and less invasive and even with all the facili-
tating conditions in place (in reality and the perceptions thereof). It is further proof to the
information systems community that trust remains the key to biometric acceptance.
Another interesting relationship is that between the typing pattern biometric sys-
tem security and system invasiveness. Typing pattern biometrics are usually taken to be
less invasive compared to other biometrics, e.g., the Iris Scan (National Center for State
Courts, 2002). Indeed, this construct was not expected to load highly in the overall
model as users have been using ATM systems for a long time. However, the fact that the
users perceive a negative association between security and invasiveness is striking. The implication from the model is that the typing pattern biometric may be perceived as more invasive than it really is if it is not seen as sufficiently secure; this is a key issue that needs to be addressed when implementing such a technology.
The model also demonstrates the cascading effect of one factor onto the next. For example, perceived poor facilitating conditions will result in a higher perceived false rejection rate. Similarly, a perceived high false acceptance rate will result in poor perceived system security, which results in high perceived system invasiveness, which eventually results in poor behavioral intention to use the biometric system. This suggests that the improvement of all factors is important. A
chain is only as strong as its weakest link. Successful acceptance will only result when the
facilitating conditions are in place, when the typing pattern biometric is accurate, secure
and not invasive to the user. This will lead to a high level of trust, a necessary require-
ment for users to start using the biometric system.
This model can be used to predict the behavioral intention to use general biometric
technologies and indeed of other security technologies which work on similar principles.
The model refines the general technology acceptance models, notably TAM (Venkatesh,
2000) and UTAUT (Venkatesh et al., 2003), by providing a model that is specific to se-
curity technologies. The constructs are more sensitive to security issues, building upon
the extensively validated acceptance models. For example, the usefulness of a biometric
system is in providing security to a protected asset. This suggests that the Perceived Sys-
tem Security construct is a particular type of Perceived Usefulness (from TAM) and
particular type of Performance Expectancy (from UTAUT). Similarly, the Perceived False Rejection Rate determines the ease of use of a biometric technology, making it a particular type of Ease of Use (from TAM) and Effort Expectancy (from UTAUT).
Contributions
This study makes several contributions. First, the successful validation of the model pro-
vides a yardstick that can be used in the design and evaluation of biometric technologies.
The designers can use the model to identify the factors that they should prioritize in
order to maximize the probability of the biometrics being accepted and used. Such a
model is useful, because the design of systems is usually a balancing act between several
opposing factors, and it is not always obvious what to focus on when faced with two op-
posing objectives.
Second, the study developed a measurement instrument for biometric technolo-
gies. This was done either by adapting questions used in previous research studies or cre-
ating new questions when none existed, but taking care to ground these new items in the
relevant literature. Thus the study generated and tested measurement scales for perceived
system accuracy, perceived system security, system trust and perceived system invasive-
ness. A new question item based on timing was added to the facilitating condition con-
struct. We recommend that this instrument be verified, used and extended by future
researchers in biometric technology.
Last, the overall study provides new insights for biometric technology researchers
and designers. If a new biometric technology is to be accepted by users, it is impor-
tant to make sure that the biometric system is accurate, secure, trusted and not invasive.
Further, the facilitating conditions need to be introduced at the right time. Several of the
factors were found to be dependent on one another. For example, the perceived system inva-
siveness is dependent on the perceived system security, and, in turn, the perceived system
security provided by the system is dependent on the perceived false acceptance rate. This
means that the different factors should all be addressed in order to maximize the behav-
ioral intention to use the biometric technology.
LIMITATIONS AND FUTURE DIRECTIONS
This study surveyed a student population regarding their perceptions of a biometric key-
pad system. Students are among the most frequent users of ATM systems; future studies
will need to test the model on other sectors of the population. Another limitation is the
number of subjects used and their diversity. One-hundred fifty-nine subjects successfully
completed the questionnaire out of a total of 270 who were initially contacted. Students
are also early technology adopters; this study should be extended to include more diverse
populations, including early to late adopters and technology skeptics as well as technology
optimists. That is, this research should be extended to include more heterogeneous groups.
The biometric user acceptance model was validated using a biometric keypad sys-
tem. Several extension studies can be done to test the model's applicability to other types
of biometrics and security technologies. One possibility is to test the model against
physical biometric technologies such as fingerprint systems and retinal scan to examine
the generalizability of the model to biometrics which are more accurate but at the same
time more invasive than the biometric keypad system. Further, the above study was done
by testing the user acceptance before the introduction of the biometric technology. An-
other approach would be to do a longitudinal study that would test the biometric tech-
nology acceptance at three points: (1) before use, (2) during use, and (3) after use. The
benefits would include more measurements and tracings of the actual usage, as well as
observations of the usage patterns across the introduction and acceptance process.
The resulting set of critical factors can be used iteratively to meet two goals: first, the engineering activities taking place in the design of biometric technologies can be re-aligned to enhance the positive and minimize the negative critical factors; second, the resulting critical acceptance factors can be used to define and prioritize the engineering activities needed to ensure that the resulting products meet the expected outcomes and thus increase the probability of acceptance by users.
Overall, more work is needed to integrate the gains made from both the biometric
engineering and technology acceptance research. For example, much of the work being
done in biometric engineering research is geared toward improving the technical perfor-
mance and scalability of biometric technologies. Such integration would contribute greatly to the improvement of factors such as system accuracy and security, which were
identified in this study as important in the uptake of biometrics. This study also demon-
strated the central role of system trust, an important topic in the technology acceptance
literature. Trust was shown to depend upon perceived system invasiveness, facilitating
conditions, perceived systems accuracy (of FAR and FRR) and perceived system security.
Incorporating the gains from each of these streams of research will help fill in some miss-
ing pieces of the biometric puzzle. Further, the biometric system engineers frequently in-
teract with the end-users in the product development cycle, starting with the gathering of
user requirements and ending with the last stages of implementation, training and main-
tenance (Rogers, 1995). They are well positioned in the product development cycle to
understand user needs. This can provide critical feedback to the technology acceptance
researchers on user attitudes toward biometric technologies. The continued synthesis of
these two research streams provides an extensive agenda for future research.
REFERENCES

Albrecht, A. (2001). "Understanding the Issues behind User Acceptance". Biometric Technology Today, 1(1), 7–8.
Albrecht, A., Behrens, M., Mansfield, T., McMeechan, W., Rejman-Greene, M., Savastano, M., et al. (2003). BioVision: Roadmap for Biometrics in Europe to 2010 (No. PNA-E0303). Centrum voor Wiskunde en Informatica.
Ash, J. (1997). Factors for Information Technology Innovation Diffusion and Infusion in Health Sciences Organizations: A Systems Approach. Portland State University.
Ajzen, I., & Fishbein, M. (1980). Understanding Attitudes and Predicting Social Behavior. New Jersey, USA: Prentice-Hall.
Ajzen, I. (1991). "Theory of Planned Behavior". Organizational Behavior and Human Decision Processes, 50, 179–211.
Bahmanziari, T., Pearson, J. M., & Crosby, L. (2003). “Is Trust important in Technology Adoption? A
Policy Capturing Approach”. The Journal of Computer Information Systems, 43(4), 46.
Bernecker, O. (2006). "Biometrics Security: An End User Perspective". Information Security Technical Report, 11(3), 111–118.
Buczak, A. L., Zimmerman, J., & Kurapati, K. (2002). "Personalization: Improving Ease-of-Use, Trust and Accuracy of a TV Show Recommender". In Proceedings of the TV'02 Workshop on Personalization in TV, Malaga, Spain.
Cavoukian, A., & Tapscott, D. (1997). Who Knows: Safeguarding your Privacy in a Networked World. New
York: McGraw-Hill.
Chandra, A., & Calderon, T. (2005). "Challenges and Constraints to the Diffusion of Biometrics in Information Systems". Communications of the ACM, 48(12), 101–106.
Chellappa, R. K., & Pavlou, P. A. (2002). “Perceived Information Security, Financial Liability and Consumer
Trust in Electronic Commerce Transactions”. Logistics Information Management, 15(5/6), 358–368.
Chen, L., Pearson, S., & Vamvakas, A. (2000). “On Enhancing Biometric Authentication with Data Pro-
tection”. In Proceedings of the Fourth International Conference on Knowledge-Based Intelligent Engineer-ing Systems and Allied Technologies, Brighton, UK.
Chin, W. (1998). “Issues and Opinion on Structural Equation Modeling”. MIS Quarterly, 22(1), 7–16.
Chin, W. W., & Newsted, P. R. (1998). “Structural Equation Modeling Analysis with Small Samples
Using Partial Least Squares”. “In R. H. Hoyle (Ed.), Statistical Strategies for Small-Sample Research”.
Thousand Oaks, CA: Sage Publications, Inc.
Chircu, A. M., Davis, G. B., & J., K. R. (2000, August). “Trust, Expertise, and E-Commerce Intermediary
Adoption”. In Proceedings of the Americas Conference on Information Systems, Long Beach, CA, USA.
Churchill, G. A. (1979). "A Paradigm for Developing Better Measures of Marketing Constructs". Journal of Marketing Research, 16(1), 64–73.
Clarke, R. (1994). “Human Identification in Information Systems”. Information Technology and People, 7(4), 6–37.
Conover, W. (1999). Practical Nonparametric Statistics (3rd ed.). New York: John Wiley.
Davis, F. D. (1989). "Perceived Usefulness, Perceived Ease of Use, and User Acceptance of Information Technology". MIS Quarterly, 13(3), 319–340.
Davis, F. D., Bagozzi, R. P., & Warshaw, P. R. (1992). “Extrinsic and Intrinsic Motivation to Use Comput-
ers in the Workplace”. Journal of Applied Social Psychology, 22(14), 1111–1132.
Etter, D. (2003). Biometrics: The Promises and the Challenges. Paper presented at the Biometric Consortium Conference, Arlington, VA, USA.
Fukuyama, F. (1995). Trust: The Social Virtues & the Creation of Prosperity. New York, NY: The Free Press.
Garson, D. (2009). Partial Least Squares Regression (PLS). from http://faculty.chass.ncsu.edu/garson/
PA765/pls.htm
Gefen, D., Karahanna, E., & Straub, D. (2003). “Trust and TAM in Online Shopping: An Integrated
Model”. MIS Quarterly, 27(1), 51–90.
Gefen, D., Straub, D. W., & Boudreau, M. C. (2000). “Structural Equation Modeling and Regression:
Guidelines for Research Practice”. Communications of the Association for Information Systems, 4(7), 1–77.
Graevenitz, G. V. (2003). "Biometrics in Access Control". A&S International Automation & Security, 50, 102–104.
Gregory, L., & Duran, A. (2001). "Scenarios and Acceptance of Forecasts". In J. S. Armstrong (Ed.), Principles of Forecasting: A Handbook for Researchers and Practitioners (1st ed.). Springer.
Hoffman, D. L., Novak, T. P., & Peralta, M. (1999). "Building Consumer Trust Online". Communications of the ACM, 42(4), 80–85.
International Standards Organization. (2009). ISO/IEC 11889–1:2009: Information Technology — Trusted Platform Module.
Jain, A., Prabhakar, S., Hong, L., Ross, A., & Wayman, J. (2004). "Biometrics: A Grand Challenge". In Proceedings of the International Conference on Pattern Recognition, Cambridge, UK.
Jain, A. K., Nandakumar, K., & Nagar, A. (2008). “Biometric Template Security”. EURASIP Journal of Advanced Signal Processing, 2008, 1–17.
James, T., Pirim, T., Boswell, K., Reithel, B., & Barkhi, R. (2006). “Determining the Intention to Use
Biometric Devices: An Application and Extension of the Technology Acceptance Model”. Journal of Organizational and End-User Computing, 18(3), 1–24.
Kleist, V. F. (2007). “Building Technologically Based Online Trust: Can the Biometrics Industry Deliver
the Online Trust Silver Bullet?” Information Systems Management, 24(4), 319–329.
Liu, C., Yu, C.-S., & Lu, J. (2005). "Facilitating Conditions, Wireless Trust and Adoption Intention". Journal of Computer Information Systems, 46(1), 17–24.
Malhotra, M., & Grover, V. (1998). "An Assessment of Survey Research in POM: From Constructs to Theory". Journal of Operations Management, 16(4), 403–423.
Mathieson, K. (1991). “Predicting User Intentions: Comparing the Technology Acceptance Model with
the Theory of Planned Behavior”. Information Systems Research, 2(3), 173–191.
McKnight, H., Choudhury, V., & Kacmar, C. (2002). “Developing and Validating Trust Measures for e-
Commerce: An Integrative Typology”. Information Systems Research, 13(2), 334–359.
Mohtadi, N. (1998). “Development and Validation of the Quality of Life Outcome Measure (Question-
naire) for Chronic Anterior Cruciate Ligament Deficiency”. The American Journal of Sports Medicine, 26(3), 350–359.
Monrose, F., & Rubin, A. (2000). “Keystroke Dynamics as a Biometric for Authentication.” Future Gen-eration Computer Systems, 16(4), 351–359.
Moore, G. A., & McKenna, R. (1999). Crossing the Chasm: Marketing and Selling High-Tech Products to Mainstream Customers. New York, NY, USA: HarperCollins Publishers.
Nath, R., Schrick, P., & Parzinger, M. (2001). “Bankers’ Perspectives on Internet Banking”. e-Service Jour-nal, 1(1), 21–36.
National Center for State Courts. (2002). “Biometrics and the Courts”. In Proceedings of the E-Court Con-ference Las Vegas, CA, USA.
Nunnally, J. C. (1978). Psychometric Theory. New York, USA: McGraw-Hill
O'Gorman, L. (2004). "Securing Business's Front Door: Passwords, Tokens and Authentication". In S. Ghosh, M. Malek & E. A. Stohr (Eds.), Guarding Your Business: A Management Approach to Business (pp. 119–149). Springer USA.
O’Brien, J., & Marakas, G. (2007). Introduction to Information Systems (13th ed.). New York, NY:
McGraw-Hill.
Parker, R. (1983). "A New Definition of Privacy for the Law". Law and Philosophy, 2(3).
Reid, P. (2004). Biometrics for Network Security. Upper Saddle River, NJ: Prentice Hall.
Ring, R. S., & Van de Ven, A. H. (1992). “Structuring Cooperative Relationships between Organizations”.
Strategic Management Journal, 13, 483–498.
Rogers, E. (1995). Diffusion of Innovations (Fourth ed.). New York, USA: The Free Press.
Rosenthal, R., & Rosnow, R. (1991). Essentials of Behavioral Research: Methods and Data Analysis. New York: McGraw-Hill.
Ruppel, C., Underwood-Queen, L., & Harrington, S. J. (2003). “e-Commerce: The Roles of Trust, Secu-
rity, and Type of e-Commerce Involvement”. e-Service Journal, 2(2), 25–45.
Schooley, C. (2005). Get Ready: The Millennials are Coming: Forrester Research.
Siau, K., & Shen, Z. (2003). "Building Customer Trust in Mobile Commerce". Communications of the ACM, 46(4), 91–94.
Sill, H., Fisher, S., & Wasserman, M. (2007). "Consumer Reactions to Potential Intrusiveness and Benefits of RFID". International Journal of Information Technology and Management, 7(1), 76–97.
Spitzberg, B. H., & Cupach, W. R. (1998). The Dark Side of Close Relationships. Routledge.
Stafford, B. (2001). "Risk Management and Internet Banking: What Every Banker Needs to Know". Community Banker, 10(2), 48–49.
Tanniru, M. R., & Agarwal, R. (2002). “Applied Technology in Business Program—An Industry-Aca-
demia Partnership to Support Student Learning”. e-Service Journal, 1(2), 5–23.
Templeton, G., Lewis, B., & Snyder, C. (2002). "Development of a Measure for the Organizational Learning Construct". Journal of Management Information Systems, 19(2), 175–218.
Thompson, R., Higgins, C. A., & Howell, J. (1991). “Personal Computing: Toward a Conceptual Model
of Utilization”. MIS Quarterly, 15(1), 125–143.
Venkatesh, V. (2000). “Determinants of Perceived Ease of Use: Integrating Control, Intrinsic Motivation,
and Emotion into the Technology Acceptance Model”. Information Systems Research, 11(4),
342–365.
Venkatesh, V., Morris, M., Davis, G., & Davis, F. (2003). “User Acceptance of Information Technology:
Toward a Unified View”. MIS Quarterly, 27(3), 425–478.
Whitman, M., & Mattord, H. (2005). Principles of Information Security (2nd ed.). Boston, Massachusetts,
USA: Course Technology.
Whitworth, B., & Zaic, M. (2006). “The WOSP Model: Balanced Information Systems Design and Eval-
uation”. Communications of the ACM, 49(5), 93–99.
Yu, J., & Cooper, H. (1983). “A Quantitative Review of Research Design Effects on Response Rates to
Questionnaires”. Journal of Marketing Research, 20(1), 36–44.
APPENDIX A: DESCRIPTION OF BIOMETRIC ATM
The information systems department is working on new technology that will sense the
pressure and time patterns exhibited by a user when typing his or her personal identification
number (PIN). This use of unique behavior patterns to identify a person is called a biometric
measure. The first application of this biometric technology will be on ATMs. We will replace
the current ATM keypads with new keypads that can detect a person’s typing pressure and
time patterns. However, the new keypads will look and feel exactly like the existing keypads.
Likewise, the current ATM procedures will remain the same. The only difference will be that
the user’s typing pressure and time patterns will be captured and used along with the PIN to
verify that the user is indeed the genuine user and not an impostor.
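The verification step described above can be sketched in a few lines of code. This is an illustrative simplification, not the authors’ implementation: it assumes the keypad reports a vector of inter-key timings for each PIN entry, averages several enrollment samples into a template, and accepts a user only if both the PIN matches and the typing rhythm falls within a (hypothetical) deviation threshold.

```python
# Illustrative sketch of keystroke-dynamics verification for a biometric ATM.
# Timing vectors, template averaging, and the 0.25 s threshold are all
# assumptions made for this example.

def enroll(samples):
    """Average several timing vectors (seconds between keystrokes)
    from enrollment into a single template."""
    n = len(samples)
    length = len(samples[0])
    return [sum(s[i] for s in samples) / n for i in range(length)]

def verify(pin_entered, pin_on_file, timings, template, threshold=0.25):
    """Accept only if the PIN matches AND the typing rhythm is close
    enough to the template (mean absolute deviation <= threshold)."""
    if pin_entered != pin_on_file:
        return False
    if len(timings) != len(template):
        return False
    deviation = sum(abs(t - m) for t, m in zip(timings, template)) / len(template)
    return deviation <= threshold

template = enroll([[0.20, 0.31, 0.27], [0.22, 0.29, 0.25]])
print(verify("1234", "1234", [0.21, 0.30, 0.26], template))  # genuine rhythm
print(verify("1234", "1234", [0.60, 0.05, 0.90], template))  # impostor rhythm
```

Note how the second call is rejected even though the PIN is correct: the timing pattern, not the PIN alone, is what distinguishes the genuine user from an impostor. Tightening the threshold lowers the false acceptance rate at the cost of a higher false rejection rate, which is exactly the trade-off the FAR and FRR constructs capture.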
APPENDIX B: MEASUREMENT SCALES FOR USER PERCEPTIONS
Measurement scales were developed to measure the constructs in the structural model. All
constructs in the survey were measured using a 5-point Likert scale, as follows.
Table 7: Measurement Scales for User Perceptions

Perceived False Acceptance Rate (New)
FAR1: I believe that a biometric ATM will be more likely to prevent thieves from using my ATM card than a standard ATM.
FAR2: I believe that it will be harder for thieves to trick a biometric ATM.

Perceived False Rejection Rate (New)
FRR1: I am worried that a biometric ATM will not always recognize me as a valid user.
FRR2: I am sure that a biometric ATM will have no problems verifying me as the real user.

Behavioral Intention to Use (Venkatesh, 2000)
BIU1: I would avoid using a biometric ATM.
BIU2: I intend to use biometric ATMs once they are installed.
BIU3: I would use a biometric ATM if it were conveniently located.

Perceived System Invasiveness (New)
SI1: The biometric ATM system is an invasion of personal privacy.
SI2: The biometric ATM process of authenticating a user as valid via typing patterns is degrading.

Perceived System Security (New)
SV1: I worry that someone could steal my personal identity from a biometric ATM.
SV2: I am concerned that someone could steal my typing patterns from a biometric ATM.

System Trust (New)
TST1: I really don’t believe banks are protecting me from identity fraud when they install the biometric ATMs.
TST2: I fully believe that the sole reason the banks are introducing the biometric ATMs is for better protection of my money.

Facilitating Conditions (Moore & McKenna, 1999; Thompson et al., 1991)
TM1: Given the increasing cases of identity fraud, I feel that the time is right to start using the biometric ATMs.
FC1: My experience with computer keyboards and ATM keypads will make the switch to biometric ATMs easy.
FC2: I believe that no special knowledge or skill is needed to use the new biometric ATMs.