Journal of Purchasing & Supply Management ∎ (∎∎∎∎) ∎∎∎–∎∎∎
Contents lists available at ScienceDirect
Journal of Purchasing & Supply Management journal homepage: www.elsevier.com/locate/pursup
A structured review of partial least squares in supply chain management research

Lutz Kaufmann*, Julia Gaeckler

International Business & Supply Management, WHU – Otto Beisheim School of Management, Burgplatz 2, 56179 Vallendar, Germany
ARTICLE INFO

ABSTRACT
Article history: Received 10 June 2014 Received in revised form 21 March 2015 Accepted 22 April 2015
The application of structural equation modeling (SEM) in the supply chain management (SCM) context has experienced increasing popularity in recent years. Although most researchers are well equipped with a basic understanding of traditional covariance-based SEM (CBSEM) techniques, they are less familiar with the appropriate use of partial least squares (PLS) SEM. To fill this gap, the current paper critically reviews the use of PLS in 75 articles published in leading SCM journals from 2002 until 2013. The review indicates the potential of PLS, but also its limitations. A comparison across PLS reviews from various disciplines suggests that SCM research applies reporting standards in performing PLS analyses and reporting their results that are the same as, or even higher than, those in other disciplines that use PLS (e.g., marketing or strategic management). However, SCM researchers often do not fully exploit the method's capabilities, and sometimes they even misapply it. This review thus offers guidelines for the appropriate application of PLS in future SCM research. © 2015 Published by Elsevier Ltd.
Keywords: Partial least squares Structural equation modeling Supply chain management
1. Introduction

Structural equation modeling (SEM) has become the norm for analyzing cause–effect relations between latent constructs (Hair et al., 2011). SEM techniques can be divided into two general families: covariance-based techniques and variance-based techniques (Henseler et al., 2009). Researchers have so far concentrated primarily on covariance-based SEM (CBSEM) techniques (Medsker et al., 1994; Shook et al., 2004; Steenkamp and Baumgartner, 2000). However, one variance-based technique – partial least squares (PLS) – has gained in popularity, and various disciplines, including supply chain management (SCM) (Hartmann and De Grahl, 2011), marketing (O'Cass and Weerawardena, 2010), and management information systems (Furneaux and Wade, 2011), have increasingly used PLS in recent years because violations of some key assumptions of CBSEM limit its applicability. For example, the steady growth in the use of PLS can be attributed to the claim that the approach can estimate research models using small samples and can model both reflective and formative constructs (Peng and Lai, 2012). The application of PLS, however, is controversial: its opponents state that PLS is less rigorous than CBSEM and ineffective for testing theory (Rönkkö and Evermann, 2013).
* Corresponding author.
E-mail addresses: [email protected] (L. Kaufmann), [email protected] (J. Gaeckler).
The growing number of articles published using PLS in SCM (e.g., Caniëls et al., 2013; Hartmann and de Grahl, 2012; Thornton et al., 2013) and the controversy regarding the application of PLS in various disciplines (e.g., Hair et al., 2011; Henseler et al., 2014; Rigdon, 2012; Rönkkö, 2014; Rönkkö and Evermann, 2013) suggest the need to compare and contrast how PLS is being used in the SCM literature. Thus, a structured review targeted directly at SCM research – one that includes the pros and cons of applying PLS – seems warranted. The discipline also faces particular challenges, such as less developed empirical research and increasing difficulties in collecting large samples (De Beuckelaer and Wagner, 2012; Peng and Lai, 2012), that enhance the value of such a review at this time. Guidelines and minimum reporting standards are crucial for advancing research (Ringle et al., 2012b); they not only help authors develop and execute their own studies, but also help in evaluating the work of others (Gefen et al., 2011). The importance of guidelines for the use of PLS has already been recognized in other fields, including marketing, IT management and accounting, strategic management, and operations management (Hair et al., 2012a; Henseler et al., 2009; Hulland, 1999; Lee et al., 2011; Peng and Lai, 2012; Ringle et al., 2012b). Users of the PLS method can benefit from it only if they fully understand its underlying principles, apply it correctly, and report the results properly (Hair et al., 2012c). The objective of this study, therefore, is to provide a comprehensive, detailed, and organized overview of the use of PLS in SCM. Specifically, we investigate 75 applications of PLS published in ten major SCM journals from 2002 to 2013.
http://dx.doi.org/10.1016/j.pursup.2015.04.005
1478-4092/© 2015 Published by Elsevier Ltd.
Please cite this article as: Kaufmann, L., Gaeckler, J., A structured review of partial least squares in supply chain management research. Journal of Purchasing and Supply Management (2015), http://dx.doi.org/10.1016/j.pursup.2015.04.005
Following Hair et al. (2012a), we comment on our findings and provide researchers and reviewers in our field with guidelines they can use as a checklist for effectively applying PLS and interpreting its results. This paper thus contributes to a more balanced and informed application of PLS in SCM. The remainder of the article is structured as follows: In the next section, we give a concise overview of PLS and describe its working principles. In Section 3, we address PLS in the context of SCM research and present the findings from our extensive literature review, weaving into these findings our recommendations on how to evaluate and use PLS in SCM research. Finally, Section 4 sums up the key findings and draws conclusions.
2. Overview of PLS

Originally developed under the name nonlinear iterative partial least squares (NIPALS) by Wold in the 1960s (Wold, 1966) and extended by Lohmöller a few decades later (Lohmöller, 1989), PLS was designed as an alternative to CBSEM for modeling complex multivariate relationships among observed and latent variables (Esposito Vinzi et al., 2010). It gained popularity with the publications of Fornell and Bookstein (1982) and Chin (1998) and has been used frequently across disciplines ever since. A PLS path model consists of two components: a measurement model and a structural model (Henseler et al., 2009). The measurement model, also called the outer model, shows the unidirectional predictive relationships between each latent construct and its associated observed indicator variables; each indicator variable is associated with a single latent construct. PLS distinguishes between two types of measurement models: reflective and formative. Reflective indicators are seen as functions of the latent construct, meaning that changes in the latent construct lead to changes in the indicator variables. Formative indicators, in contrast, cause a latent construct, and changes in the indicator variables are reflected in changes in the latent construct (Hair et al., 2011). The structural model, also called the inner model, reflects the relationships between the unobserved, or latent, constructs. Within the structural model, a distinction is made between exogenous and endogenous constructs. Exogenous constructs have no structural path relationships pointing at them; thus, they are not caused by any other construct in the model. Endogenous constructs have at least one structural path relationship pointing at them; thus, they are caused by at least one construct in the model (Hair et al., 2011). The basic PLS algorithm comprises three stages (Henseler et al., 2009). Stage 1 consists of the iterative estimation of latent variable scores.
In this stage, a four-step iterative process is repeated until convergence is obtained:

1. The outer proxies of the latent construct scores are computed as linear combinations of the values of all (standardized) indicators associated with a particular latent construct.
2. The PLS algorithm computes the inner weights for the structural model relationships.
3. The inner proxies of the latent construct scores are calculated as linear combinations of the outer proxies of the respective adjacent latent constructs, using the previously determined inner weights.
4. The outer weights are recalculated. The approach for this calculation differs based on the type of measurement model each construct represents: When a construct is measured reflectively, the outer weights are calculated as the correlations between the inner proxy of each latent construct and its indicator variables. When a construct is measured formatively, the outer weights result from the ordinary least squares regression of the inner proxy of each latent variable on its indicators (Hair et al., 2011).

These four steps are repeated until the sum of the changes in outer weights between two iterations falls below a predefined limit; a threshold value of 10⁻⁵ is recommended to ensure the convergence of the PLS algorithm. The algorithm ends after Stage 1, delivering latent variable scores for all latent variables. Stage 2 then comprises the estimation both of outer weights/loadings and of path coefficients. The final stage, Stage 3, consists of the estimation of the means and the location parameters (i.e., OLS intercepts) for the indicators and latent variables in the model (Henseler et al., 2009; Lee et al., 2011). PLS aims to maximize the explained variance of the dependent latent constructs by estimating partial model relationships using composites in an iterative sequence of ordinary least squares regressions (Hair et al., 2011). CBSEM's objective, in contrast, is to reproduce the theoretical covariance matrix, without focusing on explained variance (Hair et al., 2011). Latent variables in PLS, unlike in CBSEM, are estimated as exact linear combinations of their indicators and are therefore not true latent variables as defined in SEM (Marcoulides et al., 2009). CBSEM requires that a set of assumptions or conventions be fulfilled, including, for example, multivariate normality of the data and a minimum sample size (Hair et al., 2011). The use of PLS is advocated if these assumptions cannot be maintained because, as some researchers have argued, PLS places minimal requirements on sample size and data characteristics (Hair et al., 2011; Peng and Lai, 2012). Furthermore, while the inclusion of formative and reflective indicators in CBSEM might cause identification problems, PLS is well suited for both formative and reflective indicators (Henseler et al., 2009).
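The iterative procedure described above can be made concrete in code. The following is a minimal sketch of Stage 1 only, restricted to Mode A (reflective) outer weight estimation and the centroid inner weighting scheme, and assuming every construct has at least one structural neighbor; all function and variable names are our own, and established implementations (e.g., SmartPLS or the semPLS R package) differ in many details:

```python
import numpy as np

def pls_stage1(X_blocks, adjacency, tol=1e-5, max_iter=300):
    """Stage 1 of the basic PLS algorithm: iterative estimation of latent
    variable scores. Mode A (correlation) outer weights and the centroid
    inner weighting scheme only."""
    # Standardize all indicator blocks (mean 0, unit variance).
    Xs = [(X - X.mean(0)) / X.std(0) for X in X_blocks]
    weights = [np.ones(X.shape[1]) for X in Xs]  # initial outer weights
    for _ in range(max_iter):
        # Step 1: outer proxies = linear combinations of own indicators.
        T = np.column_stack([X @ w for X, w in zip(Xs, weights)])
        T /= T.std(0)
        # Step 2: inner weights (centroid scheme: sign of the correlation
        # between structurally adjacent constructs).
        inner = np.sign(np.corrcoef(T, rowvar=False)) * adjacency
        # Step 3: inner proxies = combinations of adjacent outer proxies.
        Z = T @ inner
        Z = (Z - Z.mean(0)) / Z.std(0)
        # Step 4: Mode A outer weight update: correlation between each
        # indicator and its construct's inner proxy.
        new_w = [X.T @ Z[:, j] / len(X) for j, X in enumerate(Xs)]
        change = sum(np.abs(np.abs(nw) - np.abs(w)).sum()
                     for nw, w in zip(new_w, weights))
        weights = new_w
        if change < tol:  # the 10^-5 stopping rule mentioned in the text
            break
    return T, weights

# Synthetic two-construct example: eta1 -> eta2, three indicators each.
rng = np.random.default_rng(0)
eta1 = rng.normal(size=200)
eta2 = 0.7 * eta1 + rng.normal(scale=0.5, size=200)
X1 = np.column_stack([eta1 + rng.normal(scale=0.4, size=200) for _ in range(3)])
X2 = np.column_stack([eta2 + rng.normal(scale=0.4, size=200) for _ in range(3)])
scores, outer_w = pls_stage1([X1, X2], np.array([[0, 1], [1, 0]]))
```

On such data, the estimated scores of the two connected constructs correlate strongly, mirroring the structural link built into the simulation.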
In addition, PLS can be used to estimate highly complex models, in which the number of latent and observed variables might be high compared to the number of observations (Henseler et al., 2009). Moreover, some researchers argue that PLS is more suitable for prediction and theory development, while CBSEM is more appropriate for theory testing and confirmation (Hair et al., 2011). PLS has thus been the subject of much debate. One half of the research community argues that PLS has advantages when correctly used, and might even be a "silver bullet" (Hair et al., 2011); the other half is strictly against its use, arguing that it is inferior to traditional CBSEM techniques (Antonakis et al., 2010; Rönkkö, 2014; Rönkkö and Evermann, 2013). Researchers opposing the use of PLS criticize the bias and inconsistency of its parameter estimates, its inability to model measurement error, and the lack of an over-identification test, which would allow for causally testing a model (Hwang et al., 2010; Peng and Lai, 2012; Rönkkö and Evermann, 2013). In their recent article about commonly held but mistaken beliefs about PLS, Rönkkö and Evermann (2013) state that most of the common beliefs about PLS are based not on statistical theory or simulation studies but on previously published articles that offer no proof of the claims made. Commenting on Rönkkö and Evermann's (2013) article, Henseler et al. (2014) argue that these claims about PLS, in turn, are not justified and that PLS does offer advantages for exploratory research. McIntosh et al. (2014) take stock of the two articles, contending that PLS should divorce itself from the factor-analytic tradition and develop further as a purely composite-based statistical methodology. The composite factor model differs from the common factor model by relaxing the strong assumption that all the covariation between a block of indicators is explained by a common factor.
Thus, the composite factor model does not impose any restrictions on the covariances between indicators of the same construct. Rather, composites are formed as linear combinations of
their respective indicators (Henseler et al., 2014). However, McIntosh et al. (2014) question whether PLS can be superior to CBSEM in the composite case. In his latest paper, Rönkkö (2014, p. 170) notes that PLS is “unsuitable for any type of statistical inference” because it capitalizes on chance correlations between the error terms. Rigdon (2012), in contrast, emphasizes the strength of PLS as a tool for prediction but suggests that PLS should emphasize its status as a purely composite-based method to separate itself from factor-based SEM. Making this clear distinction would involve completing and validating a purely composite-based approach, as well as developing a different approach to measure validation. As a reaction to Rigdon (2012), Dijkstra (2014, p. 1) states “that when Professor Rigdon's advice is adhered to, we may waste a potentially very useful tool,” and Bentler and Huang (2014) propose to maintain and further improve PLS to mimic CBSEM results perfectly, while Sarstedt et al. (2014) argue that PLS should retain its predictive character. This capability could be achieved by extending PLS evaluation criteria to include new criteria that assess the predictive capabilities of the model and by developing model fit measures. As evidenced by this summary of the ongoing debate among its critics and proponents, PLS as a statistical method still must be developed further. Rather than reflecting on the key issues of the current heated debate about PLS, this article presents methodological guidelines intended to improve the use of PLS in its current form as applied by many SCM researchers. Despite the ongoing debate, the use of PLS in SCM is still growing; taking stock and deriving a guideline targeted at the SCM community – one that also points to the common problems associated with the use of PLS – thus seems warranted. 
Although acknowledging the current pitfalls of PLS and developing statistical approaches to fix them is important, we should not preclude the use of PLS from the outset because we know that no single empirical methodology is perfect (Hair et al., 2013; Rigdon, 2012).
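The point made above that PLS "latent variables" are exact linear combinations of their indicators (Marcoulides et al., 2009) can be illustrated in a few lines; the weights here are arbitrary illustrative values, not the output of an actual PLS run:

```python
import numpy as np

rng = np.random.default_rng(1)
# Three standardized indicators of one construct (100 observations).
X = rng.normal(size=(100, 3))
X = (X - X.mean(0)) / X.std(0)

# Outer weights as a PLS run might deliver them (arbitrary values here).
w = np.array([0.5, 0.3, 0.2])

# The composite score is an exact, deterministic function of the observed
# data -- unlike a common-factor score, it carries no unobserved residual.
score = X @ w
reconstructed = 0.5 * X[:, 0] + 0.3 * X[:, 1] + 0.2 * X[:, 2]
assert np.allclose(score, reconstructed)
```

This determinism is what the composite-based camp in the debate treats as a feature (prediction) and the factor-based camp treats as a limitation (no measurement error model).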
3. A summary of the use of PLS in supply chain management research

3.1. Review methodology

This paper aims to provide a critical analysis of the application of PLS in SCM. The use of PLS has been well documented in disciplines such as marketing (Henseler et al., 2009), accounting (Lee et al., 2011), and strategic management (Hair et al., 2012c; Hulland, 1999). Recently, Peng and Lai (2012) assessed the use of PLS in operations management (OM), reviewing 42 articles published between 2000 and 2011 dealing with OM-related topics. The use of PLS in SCM-related journals has not yet been evaluated. We therefore conducted a structured literature review to identify the status of the use of PLS in SCM research, following the methodology described by Miemczyk et al. (2012). Following Chen and Paulraj (2004), who define SCM as the combination of distribution, production, and purchasing, and using previously published literature reviews in SCM research (Carter et al., 2009; Giunipero et al., 2008; Rungtusanatham et al., 2003) as a guideline, we reviewed key journals in these specific areas. Specifically, we considered the following ten journals in our review: Decision Sciences (DS), Journal of Operations Management (JOM), Journal of Purchasing and Supply Management (JPSM), Journal of Supply Chain Management (JSCM), International Journal of Physical Distribution and Logistics Management (IJPDLM), Management Science (MS), Journal of Business Logistics (JBL), Supply Chain Management: An International Journal (SCM:IJ), International Journal of Logistics Management (IJLM), and Transportation Research Part E: The Logistics and Transportation Review (TRE). Prior research has repeatedly
documented the leading status of these journals in terms of relevance, and numerous citations justify the importance of these journals in the SCM discipline (see Appendix A) (Denk et al., 2012; Giunipero et al., 2008; Näslund et al., 2010; Zsidisin et al., 2007). All these journals have received an Impact Factor rating higher than 1.0 from Thomson Reuters in the recent past. We limited our search to articles published during the 12-year period between 2002 and 2013 because the use of PLS among business research communities, and specifically in the rather young discipline of SCM, is a relatively new phenomenon. In line with Peng and Lai (2012), we therefore concentrate on issues commonly observed in recent research. Moreover, many methodological advances in the PLS technique have only recently been introduced (Esposito Vinzi et al., 2010; Sarstedt et al., 2011). Using the keywords "partial least squares" and "PLS," a search for relevant articles in important academic publishing databases (e.g., ABI/Inform, Elsevier ScienceDirect, and Emerald Insight) and in the databases of the respective journal publishers yielded 113 articles. After manually reviewing each article, we deleted 37 articles because they did not contain applications of PLS. Moreover, we omitted one additional article, which had a sample size of 4166, because it clearly is an outlier and would distort the mean of our data. Accordingly, 75 studies constitute the basis for our review. Table 1 depicts the frequency and timing of the articles we identified that use PLS, by journal and year. A list of the reviewed articles is included in the appendix. A first glance at Table 1 reveals the timeliness of the topic: While 33 articles were published in the eight-year period from 2002 to 2009, 42 articles were published during the four-year period from 2010 to 2013. This increase clearly shows that PLS as a statistical analysis technique has gained popularity, as also noted by Henseler et al.
(2009), Hair et al. (2012a), and Ringle et al. (2012b) for marketing research, management information systems research, and strategic management research, respectively. Each article was evaluated based on the following criteria: methodological reasons for using PLS, outer and inner model evaluations, and reporting. We also differentiated between three time periods – 2002 to 2005, 2006 to 2009, and 2010 to 2013 – to evaluate whether the use of PLS has changed over time. We opted for this structure to allow for a certain time lag in the study of PLS use, taking into consideration the early, influential PLS articles published in the late 1990s (e.g., Chin, 1998; Hulland, 1999), the changes apparent in articles published in the middle years of our study period (e.g., Petter et al., 2007; Tenenhaus et al., 2005), and the newest developments and methodological advancements in the community, as reflected in the most recent articles (e.g., Hair et al., 2012c; Peng and Lai, 2012).

3.2. Reasons for using PLS

The initial specification of the model is naturally the first step in an SEM analysis. This step includes the definition of the presumed relations among the latent variables and the expression of the hypotheses in the form of a structural equation model. However, even before the initial specification of the research model, researchers should carefully evaluate whether to apply CBSEM or PLS, depending on the objectives of their study, model complexity, and the conceptualization of the constructs. Of the 75 articles analyzed, 22 (29%) do not provide a rationale for using PLS, while the rest explicitly state why PLS was chosen. Among the most often stated reasons for the use of PLS are small sample size (n = 31; 58%), non-normal data (n = 22; 42%), the use of formative constructs (n = 17; 32%), and the exploratory nature of the study (n = 16; 30%) (see Table 2). Sample size is an important consideration in SEM because
Table 1. Frequency of PLS publications by journal and year.

Year    DS  JOM  JPSM  JSCM  IJPDLM  MS  JBL  SCM:IJ  IJLM  TRE  Total
2002     1    0     0     0       0   0    0       0     0    0      1
2003     1    0     0     0       0   1    0       0     0    0      2
2004     0    1     0     0       0   1    0       0     0    0      2
2005     0    0     0     0       0   0    0       0     0    0      0
2006     2    0     0     1       0   1    0       0     0    0      4
2007     3    2     0     0       0   0    0       0     0    0      5
2008     5    0     0     1       0   0    2       2     0    1     11
2009     3    2     0     0       0   0    1       1     1    0      8
2010     5    1     0     0       1   1    0       1     0    0      9
2011     2    0     3     1       2   1    0       0     0    0      9
2012     4    3     1     1       2   0    1       0     1    0     13
2013     1    7     2     1       0   0    0       0     0    0     11
Total   27   16     6     5       5   5    4       4     2    1     75
sample size directly affects the reliability of parameter estimates, model fit, and statistical power (Peng and Lai, 2012; Shah and Goldstein, 2006). Descriptive statistics of our sample reveal a mean sample size of 274.42, with values ranging from 35 to 2465 (see Table 3). However, 31 articles (41%) have fewer than 150 observations, and 13 articles (17%) have fewer than 100 observations. Studies of this size are more likely to lack statistical power and to produce unstable results (De Beuckelaer and Wagner, 2012). It has been argued that PLS performs better than traditional CBSEM techniques when dealing with considerably smaller sample sizes because CBSEM might lead to nonconvergence problems and improper solutions in small samples (Boomsma and Hoogland, 2001; Henseler et al., 2014). However, simulation studies that have compared the small-sample performance of PLS and CBSEM have neglected several recent methodological innovations developed to improve CBSEM's performance (McIntosh et al., 2014); a final judgment thus cannot be made. For example, researchers using CBSEM can reduce the required sample size by reducing the number of indicators through parceling. Parceling involves summing or averaging item scores from two or more items and using these parcel scores, instead of the item scores, in the analysis (Bandalos, 2002; Hall et al., 1999). The risk is that collapsing indicators into aggregates can conceal misspecification in the measurement portion of the model, resulting in overly optimistic fit statistics and inflated estimates of structural parameters (McIntosh et al., 2014). Parceling should therefore be used cautiously. Henseler et al. (2009) argue that the claim that PLS is more efficient with small sample sizes is generally misleading because the goal of achieving sufficient statistical power is neglected. Moreover, they maintain that certain common rules of thumb for
Table 3. Sampling characteristics.

Criterion                      Total (n=75)   Proportion (%)   2002 to 2005 (n=5)   2006 to 2009 (n=28)   2010 to 2013 (n=42)
Sample size
  Mean                           274.42             –               512.80                207.79                290.85
  Median                         168                –               223                   181.50                148
  Range                          (35; 2465)         –               (83; 1781)            (50; 651)             (35; 2465)
Number of studies with
  Less than 100 observations      13               17                 1                     5                     7
  Nonresponse bias                35               47                 2                    14                    19
PLS are not valid, including the rules that PLS requires a sample size of either (1) only ten times the number of indicators of the scale with the largest number of formative indicators, or (2) only ten times the largest number of structural paths directed at a particular construct in the inner path model (Henseler et al., 2009). These rules of thumb do not take into account effect size, reliability, the number of indicators, and other important factors that affect statistical power. Researchers using PLS should therefore keep the fundamentals of sampling theory in mind and ensure that their sample size is large enough to achieve adequate levels of statistical power (Hair et al., 2013; Riedl et al., 2014); a minimum power level of 0.80 is generally considered reasonable (Verma and Goodale, 1995). We thus advise researchers to calculate the minimum sample size a priori or to calculate statistical power in the analysis phase to determine sample size adequacy (Cohen, 1992; MacCallum et al., 1996). To calculate a minimum sample size for predetermined power levels, researchers may turn to specialized programs such as SamplePower or nQuery Advisor (Elashoff, 2007), or they might use free sample size calculators available on the Internet (De Beuckelaer and Wagner, 2012; Lane, 2008). Furthermore, researchers can use the SAS program of MacCallum et al. (1996), which was originally developed for calculating a minimum sample size for CBSEM. To summarize, sample size alone should never be the main criterion for the application of PLS.

Of the analyzed articles, 29% identify non-normal data as the rationale for using PLS. One argument for PLS use has been that, in contrast to CBSEM, PLS makes less strict assumptions about the data distribution; in particular, it does not require a multivariate normal distribution. PLS can handle nominal-, ordinal-, and interval-scaled variables (Peng and Lai, 2012).
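The a priori power calculation advised above can be sketched with standard statistical tooling. The snippet below computes the power of the overall F test of a multiple regression (a common proxy for the largest regression embedded in a path model) using the noncentral F distribution and Cohen's (1992) effect-size conventions; the function names are our own, and dedicated programs such as the ones cited in the text remain the standard route:

```python
from scipy import stats

def regression_power(n, n_predictors, f2=0.15, alpha=0.05):
    """Power of the overall F test of a multiple regression with n
    observations, using Cohen's effect size f^2
    (0.02 / 0.15 / 0.35 = small / medium / large)."""
    df1 = n_predictors
    df2 = n - n_predictors - 1
    noncentrality = f2 * n            # Cohen's lambda = f^2 * (u + v + 1)
    f_crit = stats.f.ppf(1 - alpha, df1, df2)
    return 1 - stats.ncf.cdf(f_crit, df1, df2, noncentrality)

def min_sample_size(n_predictors, f2=0.15, alpha=0.05, target=0.80):
    """Smallest n whose power reaches the target level (0.80 is the
    conventional minimum cited in the text)."""
    n = n_predictors + 2              # smallest n with df2 >= 1
    while regression_power(n, n_predictors, f2, alpha) < target:
        n += 1
    return n

# E.g., a construct with five incoming structural paths, medium effect:
print(min_sample_size(5))            # on the order of 90 observations
```

Note how the required n for a medium effect already far exceeds what the "ten times" rules of thumb would suggest for a five-path model.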
In CBSEM, non-normal data can lead to underestimated standard errors and inflated goodness-of-fit statistics (Peng and Lai, 2012). Nevertheless, even though PLS is very robust when used with non-normal data, bootstrapped standard errors might become inflated, especially when the
Table 2. Reasons for using PLS.

Reason                            Number of studies   Proportion (%)   2002 to 2005 (n=5)   2006 to 2009 (n=28)   2010 to 2013 (n=42)
Specific reason stated                   53                 –                  4                    19                    30
  Small sample size                      31                58                  2                     9                    20
  Non-normal data                        22                42                  2                     8                    12
  Formative measures                     17                32                  1                     7                     9
  Exploratory research                   16                30                  2                     5                     9
  Focus on prediction                    13                25                  2                     3                     8
  Model complexity                       11                21                  0                     3                     8
  Consistent with study objective         3                 6                  1                     1                     1
  Other                                  16                30                  2                     7                     7

Note: Proportions are relative to the 53 studies stating a specific reason; studies may state more than one reason.
sample size is small. This, in turn, can result in lower levels of statistical power (Hair et al., 2012c). Researchers should therefore examine and report the degree to which data are non-normal, using, for example, the Kolmogorov–Smirnov test or the Shapiro–Wilk test (Lilliefors, 1967; Royston, 1983). Researchers should then consider the degree of non-normality; in cases where the test is highly significant and the data exhibit extreme departures from normality, they might either try to increase the sample size or refrain from using PLS altogether. Just like CBSEM, PLS suffers somewhat (i.e., loses power) under extreme departures from normality, but it is relatively robust to moderate departures (Goodhue et al., 2012). Alternative estimation procedures are also available for CBSEM when the data are non-normal, including weighted least squares and unweighted least squares, so the choice of PLS over CBSEM should not be based solely on data distribution characteristics (Hair et al., 2012a).

Another commonly stated reason for using PLS is that it allows for the use of formative constructs (23%). Even though CBSEM can accommodate formative indicators, their use often leads to unidentified models (Jarvis et al., 2003). Researchers thus must adhere to rules that impose specific constraints on the model to ensure model identification (Hair et al., 2012a). For example, researchers might choose between three different scaling options, follow the 2+ emitted paths rule or the exogenous X rule, or include one more reflectively measured construct in an endogenous position to make the model theoretically defensible and operationally testable (see Bollen and Davis (2009) and Diamantopoulos and Riefler (2011) for more information). Researchers who include formative constructs in their models should also collect data on additional constructs because the original model might be modified (Diamantopoulos and Riefler, 2011).
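Returning to the normality screening recommended above: examining and reporting the degree of non-normality, rather than a bare significance flag, might look as follows; the helper name and reporting format are illustrative only:

```python
import numpy as np
from scipy import stats

def normality_screen(samples, alpha=0.05):
    """Per-indicator normality screen: Shapiro-Wilk p-value plus skewness
    and excess kurtosis, so the *degree* of non-normality can be judged
    and reported, not just tested."""
    report = {}
    for name, x in samples.items():
        _, p_value = stats.shapiro(x)          # suitable for n <= 5000
        report[name] = {
            "shapiro_p": p_value,
            "skewness": float(stats.skew(x)),
            "excess_kurtosis": float(stats.kurtosis(x)),  # 0 under normality
            "non_normal": bool(p_value < alpha),
        }
    return report

# Illustration: one roughly normal and one strongly skewed indicator.
rng = np.random.default_rng(42)
report = normality_screen({
    "near_normal": rng.normal(size=300),
    "skewed": rng.exponential(size=300),
})
```

The skewness and kurtosis values are what let a researcher distinguish a moderate departure (where PLS remains robust) from an extreme one (where increasing the sample or switching methods is advisable).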
Using formative constructs in PLS, in contrast, does not create identification problems for recursive models, because the algorithms performed in a PLS analysis consist of a series of ordinary least squares analyses (Peng and Lai, 2012). This advantage of PLS over CBSEM suggests that PLS should be used when analyzing models with formative constructs. Formative constructs were used in 26 (35%) of the articles in our sample. However, the use of such constructs implies that the indicators determine the entire construct domain (Hair et al., 2012c). Researchers should therefore make construct specification decisions carefully when using formative constructs. Finally, 21% of the articles state that the exploratory nature of the study is the rationale for using PLS. PLS is prediction-oriented
and should be used if the theory on which the model at hand is based is not yet well understood and if the predictive validity of exogenous constructs needs to be assessed. CBSEM, in contrast, is parameter-oriented and is more appropriate when the proposed research model is based on well-established theories (Peng and Lai, 2012). However, an ongoing debate questions whether PLS can be used for exploratory modeling (Henseler et al., 2014; Rönkkö and Evermann, 2013). According to McIntosh et al. (2014), who balance the viewpoints of Henseler et al. (2014) and Rönkkö and Evermann (2013), PLS can be used for exploratory modeling, but researchers should use local specification tests to detect the specific source of any misspecification. We therefore conclude that PLS can be a valuable tool for exploratory research, but it should be applied with care. In addition, we advise researchers to stay on top of the ongoing debate and to pay attention to new approaches for exploratory modeling that are added to the methodological toolbox of PLS. What is important to recognize, in light of our review, is that the weaknesses of one technique do not imply that another technique is superior. Researchers should carefully choose between PLS and CBSEM and support their choice with adequate arguments, avoiding generalized statements about the ability of PLS to estimate models using small samples or non-normal data. PLS is definitely not a silver bullet and should not be applied without care.

3.3. Measurement model characteristics

To see which latent constructs were studied in the 75 papers identified, we used text mining via tag clouds (Kaufmann and Saw, 2014; Rozemeijer et al., 2012). Our tool of choice was "Wordle", a free software program developed by IBM (www.wordle.net), into which we inserted more than 500 latent construct names.
Tag clouds visualize key subjects directly, giving prominence to words that appear more frequently. As Fig. 1 shows, researchers using PLS to estimate structural equation models are primarily concerned with analyzing the effect of various antecedents on a number of different performance measures, accounting for environmental conditions. Moreover, the research focuses on "management", "supplier", "information/knowledge", and "relationship". Descriptive statistics for the main elements of the measurement models included in our review can be found in Table 4. On average, the models contain 7.05 latent variables and 28.11 indicators. Of the 75 studies, 49 (65%) use only
Fig. 1. Tag cloud for latent constructs.
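The counting step behind such a tag cloud can be sketched in a few lines. In this hedged illustration, the construct names are invented stand-ins for the more than 500 names fed into Wordle:

```python
# Minimal sketch of the word-frequency counting behind a tag cloud.
# The construct names below are hypothetical examples, not the review's data.
from collections import Counter

construct_names = [
    "supplier integration", "relationship commitment", "supply chain agility",
    "supplier integration", "information sharing", "relationship commitment",
    "supplier integration",
]
words = Counter(w for name in construct_names for w in name.split())
print(words.most_common(3))  # the most frequent words receive the largest font
```

A tool such as Wordle performs the same counting internally before scaling font sizes to frequencies.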
Please cite this article as: Kaufmann, L., Gaeckler, J., A structured review of partial least squares in supply chain management research. Journal of Purchasing and Supply Management (2015), http://dx.doi.org/10.1016/j.pursup.2015.04.005i
L. Kaufmann, J. Gaeckler / Journal of Purchasing & Supply Management ∎ (∎∎∎∎) ∎∎∎–∎∎∎
Table 4. Model descriptive statistics.

| Criterion | All models | 2002–2005 (n = 5) | 2006–2009 (n = 28) | 2010–2013 (n = 42) |
|---|---|---|---|---|
| Number of latent variables: mean / median / range | 7.05 / 6.0 / (3; 21) | 5.8 / 5 / (4; 9) | 7.21 / 6 / (3; 21) | 7.1 / 7 / (3; 17) |
| Number of structural model relations: mean / median / range | 8.77 / 8 / (3; 25) | 8.4 / 8 / (4; 13) | 9.36 / 7.5 / (3; 25) | 8.43 / 8 / (3; 17) |
| Total number of indicators: mean / median / range | 29.11 / 26.5 / (9; 70) | 20.5 / 21 / (17; 23) | 21.21 / 28.5 / (11; 70) | 28.52 / 26 / (9; 57) |
| Only reflective constructs | 49 (65%) | 3 | 17 | 29 |
| Only formative constructs | 0 | 0 | 0 | 0 |
| Reflective and formative constructs | 26 (35%) | 2 | 11 | 13 |
| Studies with single-item constructs | 22 (29%) | 3 | 6 | 13 |
| Studies with higher order constructs | 26 (35%) | 2 | 10 | 14 |
| Item wording reported | 70 (93%) | 4 | 27 | 39 |
| Scales reported | 72 (96%) | 4 | 27 | 41 |
| Scale means and standard deviations reported | 53 (71%) | 4 | 20 | 29 |
| Correlation/covariance matrix reported | 63 (84%) | 4 | 23 | 36 |
reflective constructs, and 26 (35%) use a combination of reflective and formative constructs. Among the latter, 17 (65%) also mention the use of formative constructs as a reason for applying PLS. Most studies that report the use of formative constructs measure more than one construct formatively (e.g., Gray and Meister, 2004; Klein, 2007; Xu et al., 2010). In total, 61 formatively measured constructs could be identified. We observed that some constructs are measured both formatively and reflectively across studies. For example, the construct environmental uncertainty is part of different models and is most often, but not always, measured reflectively (e.g., Hoffmann et al., 2013; Yigitbasioglu, 2010; Yu et al., 2013). For scientific results to be valid, researchers must properly specify reflective and formative constructs (Petter et al., 2007). We advise researchers who model constructs as formative to perform a thorough evaluation and to explain why the constructs are measured formatively.
Our review further reveals that 22 of the 75 models (29%) use single-item measures. In the articles we reviewed, constructs such as "service delivery performance", "ethical decision", "repurchase intention", or "trusting relationships" are measured with only a single item. Single-item constructs can be problematic because using few items to measure a construct aggravates PLS's tendency to underestimate inner model relationships; researchers should thus avoid single-item measures (Hair et al., 2012c). Similar to CBSEM, PLS has a set of procedures for evaluating the reliability and validity of constructs. Note that different evaluation criteria apply to reflective versus formative constructs. For reflective constructs, researchers should generally evaluate construct reliability through Cronbach's alpha and composite reliability, and convergent validity using the average variance extracted (AVE) (Peng and Lai, 2012). In addition, the Fornell–Larcker criterion or cross-loadings are checked for discriminant validity (Henseler et al., 2009). In our sample, 63 articles (84%) report indicator loadings, 31 articles (41%) report Cronbach's alpha, 59 articles (79%) report composite reliability, and 61 articles (81%) state the AVE for reflective constructs. In addition, 60 articles (80%) report the Fornell–Larcker criterion, while 56 articles (75%) show cross-loadings. An overview of the reported reflective measurement model statistics is provided in Table 5. Cronbach's alpha should be greater than 0.70, composite reliability should exceed 0.60, and the AVE should be greater than 0.50 (Henseler et al., 2009). Compared to Cronbach's alpha, composite reliability is regarded as the more appropriate criterion for establishing internal consistency because PLS gives greater weight to indicators with higher reliability (Hair et al., 2012c).
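For illustration, composite reliability and AVE can be computed directly from standardized indicator loadings. The following hedged sketch uses invented loadings, not data from the reviewed studies:

```python
# Illustrative sketch: composite reliability (CR) and average variance
# extracted (AVE) from standardized loadings, checked against the thresholds
# cited above (CR > 0.60, AVE > 0.50). The loadings are hypothetical.

def composite_reliability(loadings):
    """CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances)."""
    total = sum(loadings)
    error = sum(1 - l ** 2 for l in loadings)  # standardized indicators
    return total ** 2 / (total ** 2 + error)

def average_variance_extracted(loadings):
    """AVE = mean of the squared standardized loadings."""
    return sum(l ** 2 for l in loadings) / len(loadings)

loadings = [0.82, 0.78, 0.74, 0.69]  # one hypothetical reflective construct
cr = composite_reliability(loadings)
ave = average_variance_extracted(loadings)
print(f"CR = {cr:.3f} (threshold 0.60), AVE = {ave:.3f} (threshold 0.50)")
```

With these loadings, both criteria clear their respective thresholds, so the construct would pass this part of the outer model evaluation.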
The Fornell–Larcker criterion postulates that the AVE of each latent variable should be higher than its squared correlations with all other latent variables (Fornell and Larcker, 1981). Based on cross-loadings, the appropriateness of the model under review should be reconsidered if an indicator correlates more highly with another latent variable than with its own. In cases where some of the evaluation criteria are violated, researchers have several options for moving forward, such as excluding individual indicators, reconsidering the position of the indicator in the model, or revising the path model altogether (Henseler et al., 2009; Martinez-Ruiz and Aluja-Banet, 2009). Formative indicators cannot be analyzed using the widely accepted standard procedures because internal consistency is not a useful validation criterion for them – that is, formative indicators should not be highly correlated (Hair et al., 2006). For formative constructs, each indicator's contribution to the construct should always be assessed by checking and reporting each formative item's weight, sign, and magnitude. Moreover, researchers should check for multicollinearity, because high multicollinearity implies that some items may be redundant (Peng and Lai, 2012); it can be evaluated by examining the variance inflation factor (VIF). Despite the importance of using the right criteria to evaluate formative constructs, only 19 of the 26 articles that use formative constructs (73%) report the indicator weights, and only 5 of the 26 articles (19%) report their sign and magnitude.
Table 5. Reported reflective measurement model statistics.

| Criterion | Number of models | Proportion (%) | 2002–2005 (n = 5) | 2006–2009 (n = 28) | 2010–2013 (n = 42) |
|---|---|---|---|---|---|
| Indicator loadings | 63 | 84 | 4 | 21 | 38 |
| Composite reliability | 59 | 79 | 3 | 21 | 35 |
| Cronbach's alpha | 31 | 41 | 1 | 14 | 16 |
| AVE | 61 | 81 | 4 | 21 | 36 |
| Fornell and Larcker criterion | 60 | 80 | 3 | 20 | 37 |
| Cross-loadings | 56 | 75 | 3 | 19 | 34 |
Table 6. Reported formative measurement model statistics.

| Criterion | Number of models (n = 26) | Proportion (%) | 2002–2005 | 2006–2009 | 2010–2013 |
|---|---|---|---|---|---|
| Reflective criteria used to evaluate formative constructs | 6 | 23 | 0 | 3 | 3 |
| Indicator weight | 19 | 73 | 1 | 6 | 12 |
| Standard errors, significance levels, t-values/p-values for indicator weights | 5 | 19 | 0 | 1 | 4 |
| VIF | 10 | 38 | 0 | 2 | 8 |
| Other | 5 | 19 | 0 | 2 | 3 |
Table 8. Technical reporting.

| Criterion | Number of models | Proportion (%) | 2002–2005 (n = 5) | 2006–2009 (n = 28) | 2010–2013 (n = 42) |
|---|---|---|---|---|---|
| Software used: SmartPLS | 29 | 39 | 0 | 2 | 27 |
| Software used: PLS-Graph | 22 | 29 | 5 | 12 | 5 |
| Software used: VisualPLS | 1 | 1 | 0 | 1 | 0 |
| Software used: not reported | 23 | 31 | 0 | 13 | 10 |
| Resampling method: use mentioned | 53 | 71 | 4 | 19 | 30 |
| Resampling method: algorithmic options | 38 | 51 | 4 | 12 | 22 |
Moreover, only 10 articles (38%) report the VIF. Also of concern is that 6 of the 26 research models that use formative constructs (23%) are inappropriately evaluated using reflective evaluation criteria (see Table 6): measures such as composite reliability or AVE are reported even though the constructs are defined as formative, neglecting fundamental principles of outer model evaluation (Hair et al., 2012c). The item weight should be statistically significant, the sign should be consistent with theory, and the value of the item weight should not be less than 0.10 (Peng and Lai, 2012). A VIF greater than 5 suggests the presence of multicollinearity (Hair et al., 2011). In general, researchers should be more careful when analyzing formative measures and follow established and more recently proposed guidelines (Hair et al., 2013). To check the validity of the intended set of formative indicators in measuring the construct of interest, researchers can, for example, evaluate the correlation between the formatively measured construct and a reflective measure of the same construct (Hair et al., 2012c) – a step called redundancy analysis (Chin, 1998).
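The redundancy analysis step can be illustrated as follows. All data, outer weights, and the noise level in this sketch are invented; in practice the formative score comes from the PLS estimation and the reflective criterion from a separately collected global measure:

```python
# Hedged sketch of a redundancy analysis (Chin, 1998) with simulated data:
# the score of a formatively measured construct should correlate strongly
# with a reflective (e.g., single global item) measure of the same construct.
import numpy as np

rng = np.random.default_rng(7)
n = 200
indicators = rng.normal(size=(n, 3))             # hypothetical formative indicators
weights = np.array([0.5, 0.3, 0.4])              # hypothetical outer weights
formative_score = indicators @ weights
# hypothetical global reflective item measuring the same construct
global_item = formative_score + rng.normal(scale=0.5, size=n)

r = np.corrcoef(formative_score, global_item)[0, 1]
print(f"redundancy correlation = {r:.2f}")  # high values support validity
```

A high correlation between the two operationalizations supports the validity of the chosen formative indicator set.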
In addition, 38 of these 53 studies (72%) report the number of bootstrap samples (see Table 8). Meanwhile, 8 of 75 studies (11%) state the effect size f², while another 10 (13%) report
the Stone–Geisser Q². Table 7 summarizes the review of the structural model evaluation. None of the studies reports computational settings, such as the weighting scheme or the abort criterion. In light of this review, we strongly recommend that researchers accurately report the main criteria and be cautious when interpreting structural model results. We further advise that alternative analyses – for example, a comparison of ordinary least squares path analysis results with PLS results (Peng and Lai, 2012) – be performed to check the robustness of the results. According to Chin (1998), R² values of 0.67, 0.33, and 0.19 are deemed substantial, moderate, and weak, respectively. For a hypothesis to be supported, the sign of the structural path should be in line with the a priori postulated expectation. To estimate the significance of path coefficients, resampling techniques, such as bootstrapping or jackknifing, have to be used in PLS. We advise researchers to generate as many bootstrap samples as possible (>500) because doing so reduces the effect of random sampling errors (Peng and Lai, 2012). The effect size can be evaluated using Cohen's f² (Cohen, 1988), which is calculated as the gain in R² relative to the proportion of variance of the endogenous latent variable that remains unexplained. Values of 0.02, 0.15, and 0.35 represent small, medium, and large effects, respectively (Cohen, 1988). To illustrate, Setia and Patel (2013) report effect sizes between 0.104 and 0.139 for their model measuring the influence of information systems design on absorptive capacity, and Yu et al. (2013) report effect sizes between 0.25 and 0.67 for their model analyzing the effect of network embeddedness on relationship performance. Finally, the predictive relevance of the structural model can be assessed with the Stone–Geisser Q², which demands that the model be able to predict the endogenous latent variable's indicators (Geisser, 1975; Stone, 1974).
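The f² computation just described amounts to a one-line formula. The R² values in this sketch are invented for illustration:

```python
# Worked example of Cohen's f^2: the gain in R^2 when a predictor construct is
# included, relative to the unexplained variance of the endogenous construct.
# The R^2 values are hypothetical, not taken from the reviewed studies.

def effect_size_f2(r2_included, r2_excluded):
    return (r2_included - r2_excluded) / (1 - r2_included)

f2 = effect_size_f2(r2_included=0.45, r2_excluded=0.37)
print(f"f2 = {f2:.3f}")  # benchmarks: 0.02 small, 0.15 medium, 0.35 large
```

In this example, f² is roughly 0.145, which would be read as a small-to-medium effect against Cohen's benchmarks.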
Although it provides a gauge for out-of-sample prediction, the Stone–Geisser Q² is seldom reported in PLS studies, and even when it is reported, it is usually not interpreted (Sarstedt et al., 2014). Q² can be computed using blindfolding procedures, and values greater than zero show that the model has predictive relevance. The relative effect of the predictive relevance can be evaluated with the measure q²; values of 0.02, 0.15, and 0.35 indicate a small, medium, or large predictive relevance of a certain latent variable, respectively (Henseler et al., 2009). Because no established test of overall model fit exists for PLS, researchers have started to compute the goodness of fit (GoF) (D'Arcy and Devaraj, 2012; Peng and Lai, 2012; Tenenhaus et al., 2005), which is defined as the geometric mean of the average communality and the average R². However, Henseler and Sarstedt (2013) have recently shown that the GoF is not suitable for model validation, and we therefore do not recommend applying this measure. Current developments in the research community might eventually yield a more appropriate evaluation criterion, such as the standardized root mean square residual, which might be used to detect model misspecifications (Sarstedt et al., 2014).
Table 7. Reported structural model statistics.

| Criterion | Number of models | Proportion (%) | 2002–2005 (n = 5) | 2006–2009 (n = 28) | 2010–2013 (n = 42) |
|---|---|---|---|---|---|
| Coefficient of determination (R²) | 71 | 95 | 4 | 26 | 41 |
| Effect size (f²) | 8 | 11 | 0 | 2 | 6 |
| Q² | 10 | 13 | 0 | 2 | 8 |
| Effect size (q²) | 0 | 0 | 0 | 0 | 0 |
| Path coefficients | 73 | 97 | 5 | 26 | 42 |
| Standard errors, significance levels, t-values, p-values | 73 | 97 | 5 | 26 | 42 |
| Confidence intervals | 3 | 4 | 0 | 1 | 2 |
| GoF | 4 | 5 | 0 | 0 | 4 |
3.5. Reporting

First, researchers should report the PLS software they use in the analysis (Peng and Lai, 2012). Only 52 articles in our sample (69%) report the specific PLS software (see Table 8). SmartPLS (n = 29; 39%) and PLS-Graph (n = 22; 29%) are the most popular. Reporting the software used is important because applications differ in their default settings. Second, researchers should report item wording and item scales. Of the 75 articles in our sample, 70 (93%) report the item wording; 72 (96%) report item scales; and 63 (84%) include the correlation or covariance matrix, thereby creating a high level of transparency. Moreover, 53 studies (71%) report scale means and standard deviations. Third, we suggest that researchers calculate and report statistical power (e.g., as Hall et al. (2012) and Reuter et al. (2012) do) to demonstrate that their studies have adequate power (>0.80) (MacCallum et al., 1996).
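As an illustration of such a power check, the power of the F test for a regression with k predictors can be approximated by Monte Carlo simulation. The settings below (f² = 0.15, k = 5, n = 100) are invented, and the simulation approach is a sketch of ours, not a procedure taken from the reviewed articles:

```python
# Rough Monte Carlo sketch of post hoc statistical power for the overall F test
# of a regression with k predictors, sample size n, and effect size f^2,
# against the power > 0.80 guideline cited above. All settings are hypothetical.
import numpy as np

def regression_power(f2, k, n, alpha=0.05, sims=200_000, seed=1):
    rng = np.random.default_rng(seed)
    dfn, dfd = k, n - k - 1
    # critical value of the central F distribution, approximated by simulation
    f_central = (rng.chisquare(dfn, sims) / dfn) / (rng.chisquare(dfd, sims) / dfd)
    f_crit = np.quantile(f_central, 1 - alpha)
    # noncentral F under the alternative; noncentrality parameter = f^2 * n
    nc = rng.noncentral_chisquare(dfn, f2 * n, sims)
    f_noncentral = (nc / dfn) / (rng.chisquare(dfd, sims) / dfd)
    return (f_noncentral > f_crit).mean()

print(round(regression_power(f2=0.15, k=5, n=100), 2))
```

With a medium effect (f² = 0.15), five predictors, and n = 100, the simulated power exceeds the 0.80 guideline; smaller samples would fall short of it.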
4. Discussion and conclusion

On the basis of 26 key dimensions, we compared our SCM study results with the main findings of recent reviews of PLS applications from a variety of other disciplines, such as IT management (Ringle et al., 2012a), marketing (Hair et al., 2012a), operations management (Peng and Lai, 2012), and strategic management (Hair et al., 2012c). Our results suggest that SCM research applies the same or even higher reporting standards than other disciplines do (see Table 9). For example, the average sample size in the articles we reviewed is larger than the sample sizes found in any other discipline, as shown in Table 9; in addition, fewer studies in SCM have a sample size smaller than 100, and fewer make use of single-item constructs. Moreover, the proportion of SCM studies reporting the criteria used for the evaluation of the outer and inner models is equivalent to the proportion of studies from other disciplines reporting these criteria. Nevertheless, not all of the articles we reviewed provide test statistics that adequately document the results of the PLS analysis. Some researchers diligently performed all tests and reported data on all necessary criteria, but others documented their approach only superficially. For example, criteria for evaluating the structural model, such as the predictive relevance or the effect size, seem to be underreported. The reviews of PLS in disciplines such as marketing and strategic management show similar results. Moreover, even if data on all necessary criteria are reported, researchers often do not critically reflect on or interpret their results (Hair et al., 2013). The use of PLS in SCM research has clearly increased in recent years. However, a comparison of the articles published from 2002 to 2005, from 2006 to 2009, and from 2010 to 2013 shows that some of the more recent articles continue to exhibit the same issues as previous ones in evaluating and reporting the results of the
PLS analysis. For example, sample size adequacy remains an issue: the minimum sample size is 83 in the 2002–2005 period, 50 in the 2006–2009 period, and 35 in the 2010–2013 period (see Table 3). The number of studies reporting the use of a resampling method has decreased from 80% to 71% (see Table 8), and three articles in each of the 2006–2009 and 2010–2013 time periods inappropriately use reflective criteria to analyze formative constructs (see Table 6). Our advice to SCM researchers who rely on PLS is to follow rigorous standards when applying PLS and when reporting their results. Researchers should make use of the full range of established criteria to assess a PLS model (Hair et al., 2012c). Moreover, they should continuously follow the latest discussions in the community as the PLS toolbox is extended still further and as new evaluation criteria are developed (e.g., the use of root mean squared covariances, the PLS estimator PLSc, and PLSe2) (Bentler and Huang, 2014; Sarstedt et al., 2014). Our advice focuses on the practical use of PLS and should support researchers, reviewers, and editors alike. CBSEM is better grounded in statistical theory than PLS and should be the method of choice if all of its assumptions are met (Chin, 1995; Peng and Lai, 2012). In cases where some of the key assumptions of CBSEM are violated and the focus lies on predictive analysis, PLS is an acceptable analytical method and a realistic alternative, as long as researchers carefully report their reasons for choosing PLS (Henseler et al., 2009). However, PLS is "not a silver bullet" (Hair et al., 2011; Rönkkö and Evermann, 2013), and its application should be precisely executed and reported. Nevertheless, the disadvantages of PLS should not preclude its use; as recognized already, no single empirical methodology is perfect (Hair et al., 2013; Henseler et al., 2014; Rigdon, 2012).
Moreover, any extreme position that neglects the beneficial features of another technique cannot be considered good research practice and does not advance the general understanding of methods (Hair et al., 2012b). In addition, recent methodological advances, such as confirmatory tetrad analysis, importance-performance matrix analysis, and the use of the PLSc estimator, equip researchers with more flexibility and allow for a more nuanced testing of theoretical concepts (Bentler and Huang, 2014; Hair et al., 2013). We acknowledge the following limitations of our study. First, this paper evaluates a subset of SCM journals. A wider journal search, including journals such as International Journal of Production Economics, International Journal of Production Research, and International Journal of Operations & Production Management, might reveal additional relevant articles and increase the generalizability of our findings. Second, we focus mainly on the drawbacks of PLS and do not examine the extent to which CBSEM or other methodologies also might be problematic (McQuitty, 2004; Ringle et al., 2012a). Third, this paper emphasizes basic criteria of PLS analysis, but researchers can benefit from a wider range of recent methodological extensions of PLS. For example, evaluating the contributions of confirmatory tetrad analysis, PLS multi-group analyses, and response-based segmentation approaches (Gudergan et al., 2008; Hair et al., 2011;
Table 9. Comparison of key criteria across disciplines.

| # | Criterion | IT management (Ringle et al., 2012a) | Marketing (Hair et al., 2012a) | Operations management (Peng and Lai, 2012) | Strategic management (Hair et al., 2012c) | Supply chain management (this study) |
|---|---|---|---|---|---|---|
| – | Overall number of analyzed models | 65 articles (109 models) | 204 articles (311 models) | 42 articles | 37 articles (112 models) | 75 articles |
| – | Time period | 1992–2011 | 1981–2010 | 2000–2011 | 1981–2010 | 2002–2013 |
| 1 | Reasons for using PLS reported | 70.77% | n.r. | 71.43% | 86.50% | 71% |
| 2 | Number of latent variables (mean / median / range) | 8.12 / 7 / (3; 36) | 7.94 / 7 / (2; 29) | n.r. | 7.5 / 6 / (2; 31) | 7.05 / 6.0 / (3; 21) |
| 3 | Number of structural model relations (mean / median / range) | 11.38 / 8 / (2; 64) | 10.56 / 8 / (1; 38) | n.r. | 10.4 / 9 / (2; 39) | 8.77 / 8 / (3; 25) |
| 4 | Mode of measurement model (only reflective / only formative / both) | 42.20% / 1.83% / 30.28% | 42.12% / 6.43% / 39.55% | n.r. | 10.70% / 10.70% / 50% | 65% / 0 / 35% |
| 5 | Studies with single-item constructs | 47.69% | 46.30% | n.r. | 67.90% | 29% |
| 6 | Higher order constructs | 23.08% | n.r. | n.r. | n.r. | 35% |
| 7 | Sample size (mean / median / range) | 238.12 / 198 / (17; 1449) | 211.29 / n.r. / n.r. | 246 / 126 / (35; 3926) | 154.9 / 83 / n.r. | 274.42 / 168 / (35; 2465) |
| 8 | Less than 100 observations | 22.94% | 24.44% | 33.33% | 51.79% | 17% |
| 9 | Software used reported | 58.46% | 49.02% | 61.90% | 48.70% | 69% |
| 10 | Resampling method mentioned | 93.85% | 66.18% | 52.38% | 54.10% | 71% |
| 11 | Reflective criteria used to evaluate formative constructs | 14.29% | 23.08% | 26.32% | 25.00% | 23% |
| 12 | Indicator weight | 68.57% | 23.08% | 73.68% | 38.20% | 73% |
| 13 | Significance of weight | 57.14% | 17.48% | 21.05% | 4.40% | 19% |
| 14 | VIF | 25.71% | 11.89% | n.r. | 1.50% | 38% |
| 15 | Indicator loadings | 88.61% | 61.81% | n.r. | 77.90% | 84% |
| 16 | CR | 56.96% | 55.91% | n.r. | 45.59% | 79% |
| 17 | Alpha | 10.13% | 40.94% | n.r. | 30.88% | 41% |
| 18 | AVE | 88.61% | 57.48% | n.r. | 42.70% | 81% |
| 19 | Fornell–Larcker criterion | 50.63% | 55.91% | n.r. | 19.10% | 80% |
| 20 | Cross-loadings | 78.48% | 16.93% | n.r. | 19.10% | 75% |
| 21 | R² | 96.33% | 88.42% | 85.71% | 80.40% | 95% |
| 22 | Effect size | 11.93% | 5.14% | 14.29% | 10.70% | 11% |
| 23 | Q² | 0 | 16.40% | 9.52% | 2.70% | 13% |
| 24 | q² | 0 | 0 | n.r. | 0 | 0 |
| 25 | Path coefficients | 98.17% | 95.82% | 100% | 95.50% | 97% |
| 26 | Significance | 98.17% | 92.28% | 100% | 95.50% | 97% |
Sarstedt et al., 2011) would be a fruitful avenue for SCM researchers using the PLS methodology. A natural direction for further developing the methodology would be to apply PLS to the most current research topics in the SCM field. For example, PLS could be applied to advance our understanding of the exercise of power in modern supply chains, to increase our knowledge of the antecedents of sustainable procurement, or to analyze the influencing factors of
offshoring decisions. Survey research in general should contribute to solving complex SCM problems and can be part of longitudinal or multi-method research designs (Fawcett et al., 2014). Despite the limitations we have articulated, we hope that our recommendations provide a useful framework for a more informed application of PLS in future empirical SCM research.
Appendix A. Citations justifying journals assessed
| Journal | Authors justifying quality and importance |
|---|---|
| Decision Sciences (DS) | Giunipero et al. (2008), Rao et al. (2013), Rungtusanatham et al. (2003) |
| International Journal of Logistics Management (IJLM) | Coleman et al. (2012), Liao-Troth et al. (2012) |
| International Journal of Physical Distribution & Logistics Management (IJPDLM) | Beuckelaer and Wagner (2012), Coleman et al. (2012), Crum and Poist (2011) |
| Journal of Business Logistics (JBL) | Beuckelaer and Wagner (2012), Coleman et al. (2012), Ellinger and Chapman (2011), Giunipero et al. (2008) |
| Journal of Operations Management (JOM) | Gorman and Kanet (2011), Rao et al. (2013) |
| Journal of Purchasing and Supply Management (JPSM) | Giannakis (2012), Giunipero et al. (2008), Harland (2013), Igarashi et al. (2013), Wynstra (2010) |
| Journal of Supply Chain Management (JSCM) | Coleman et al. (2012), Ellinger and Chapman (2011), Giunipero et al. (2008) |
| Management Science (MS) | Giunipero et al. (2008), Rao et al. (2013), Rungtusanatham et al. (2003) |
| Supply Chain Management: An International Journal (SCM:IJ) | Ellinger and Chapman (2011), Igarashi et al. (2013), Hoejmose and Adrien-Kirby (2012) |
| Transportation Research Part E: The Logistics and Transportation Review (TRE) | Carter et al. (2009), Denk et al. (2012) |
Appendix B. Reviewed articles using PLS
| # | Article reference | Year | Journal |
|---|---|---|---|
| 1 | Adomavicius et al. | 2013 | JOM |
| 2 | Ahuja et al. | 2003 | MS |
| 3 | Al-Natour et al. | 2008 | DS |
| 4 | Autry et al. | 2008 | JBL |
| 5 | Braunscheidel and Suresh | 2009 | JOM |
| 6 | Byrd et al. | 2008 | JBL |
| 7 | Caniëls et al. | 2013 | JPSM |
| 8 | Chen et al. | 2013 | JOM |
| 9 | Cheung et al. | 2010 | JOM |
| 10 | Chu and Wang | 2012 | JSCM |
| 11 | Claassen et al. | 2008 | SCM:IJ |
| 12 | D'Arcy and Devaraj | 2012 | DS |
| 13 | Davis et al. | 2009 | IJLM |
| 14 | Ettlie and Pavlou | 2006 | DS |
| 15 | Golicic et al. | 2012 | JBL |
| 16 | Gray and Meister | 2004 | MS |
| 17 | Hall et al. | 2012 | IJLM |
| 18 | Hartmann and de Grahl | 2012 | IJPDLM |
| 19 | Hartmann and de Grahl | 2011 | JSCM |
| 20 | Hofenk et al. | 2011 | JPSM |
| 21 | Hoffmann et al. | 2013 | JPSM |
| 22 | Hsieh et al. | 2011 | MS |
| 23 | Hsu et al. | 2012 | DS |
| 24 | Hu et al. | 2012 | DS |
| 25 | Ilie et al. | 2009 | DS |
| 26 | Jeffers et al. | 2008 | DS |
| 27 | Johnston et al. | 2004 | JOM |
| 28 | Keil et al. | 2007 | DS |
| 29 | Kern et al. | 2012 | IJPDLM |
| 30 | Klein | 2007 | JOM |
| 31 | Klein et al. | 2007 | DS |
| 32 | Kocabasoglu and Suresh | 2006 | JSCM |
| 33 | Lai et al. | 2008 | JSCM |
| 39 | Looney et al. | 2008 | DS |
| 40 | Martin et al. | 2010 | DS |
| 41 | McCarthy-Byrne and Mentzer | 2011 | IJPDLM |
| 42 | McCormack et al. | 2008 | SCM:IJ |
| 43 | Miocevic | 2011 | JPSM |
| 44 | Mishra et al. | 2013 | JOM |
| 45 | Morgan et al. | 2007 | JOM |
| 46 | Oh et al. | 2012 | JOM |
| 47 | Palvia et al. | 2010 | DS |
| 48 | Park and Keil | 2009 | DS |
| 49 | Perols et al. | 2013 | JOM |
| 50 | Preston et al. | 2008 | DS |
| 51 | Rai and Hornyak | 2013 | JOM |
| 52 | Ranganathan and Sethi | 2002 | DS |
| 53 | Ray and Kim | 2011 | DS |
| 54 | Reuter et al. | 2012 | JPSM |
| 55 | Rexhausen et al. | 2012 | JOM |
| 56 | Rosenzweig | 2009 | JOM |
| 57 | Saeed et al. | 2010 | DS |
| 58 | Sarker et al. | 2010 | DS |
| 59 | Sarker et al. | 2011 | DS |
| 60 | Sawhney | 2013 | JOM |
| 61 | Setia and Patel | 2013 | JOM |
| 62 | Sun et al. | 2009 | SCM:IJ |
| 63 | Tassabehji | 2010 | SCM:IJ |
| 64 | Teigland and Wasko | 2003 | DS |
| 65 | Teo and Lai | 2009 | JBL |
| 66 | Thornton et al. | 2013 | JSCM |
| 67 | Venkatesh and Bala | 2008 | DS |
| 68 | Venkatesh and Agarwal | 2006 | MS |
| 69 | Venkatesh et al. | 2012 | JOM |
| 70 | Wallace et al. | 2009 | DS |
| 71 | Wang and Wei | 2007 | DS |
Appendix B (continued)

| # | Article reference | Year | Journal |
|---|---|---|---|
| 34 | Large and Thomsen | 2011 | JPSM |
| 35 | Large et al. | 2011 | IJPDLM |
| 36 | Li et al. | 2010 | DS |
| 37 | Lin et al. | 2012 | DS |
| 38 | Looney et al. | 2006 | DS |
| 72 | Xu et al. | 2010 | MS |
| 73 | Yigitbasioglu | 2010 | IJPDLM |
| 74 | Yoon et al. | 2008 | TRE |
| 75 | Yu et al. | 2013 | DS |
Appendix C. Full references of reviewed articles using PLS

Adomavicius, G., Curley, S.P., Gupta, A., Sanyal, P., 2013. User acceptance of complex electronic market mechanisms: role of information feedback. Journal of Operations Management 31 (6), 489–503.
Ahuja, M.K., Galletta, D.F., Carley, K.M., 2003. Individual centrality and performance in virtual R&D groups: an empirical study. Management Science 49 (1), 21–38.
Al-Natour, S., Benbasat, I., Cenfetelli, R.T., 2008. The effects of process and outcome similarity on users' evaluations of decision aids. Decision Sciences 39 (2), 175–211.
Autry, C.W., Skinner, L.R., Lamb, C.W., 2008. Interorganizational citizenship behaviors: an empirical study. Journal of Business Logistics 29 (2), 53–74.
Braunscheidel, M.J., Suresh, N.C., 2009. The organizational antecedents of a firm's supply chain agility for risk mitigation and response. Journal of Operations Management 27 (2), 119–140.
Byrd, T.A., Pitts, J.P., Adrian, A.M., Davidson, N.W., 2008. Examination of a path model relating information technology infrastructure with firm performance. Journal of Business Logistics 29 (2), 161–187.
Caniëls, M.C.J., Gehrsitz, M.H., Semeijn, J., 2013. Participation of suppliers in greening supply chains: an empirical analysis of German automotive suppliers. Journal of Purchasing and Supply Management 19 (3), 134–143.
Chen, D.Q., Preston, D.S., Xia, W., 2013. Enhancing hospital supply chain performance: a relational view and empirical test. Journal of Operations Management 31 (6), 391–408.
Cheung, M.-S., Myers, M.B., Mentzer, J.T., 2010. Does relationship learning lead to relationship value? a cross-national supply chain investigation. Journal of Operations Management 28 (6), 472–487.
Chu, Z., Wang, Q., 2012. Drivers of relationship quality in logistics outsourcing in China. Journal of Supply Chain Management 48 (3), 78–96.
Claassen, M.J.T., van Weele, A.J., van Raaij, E.M., 2008. Performance outcomes and success factors of vendor managed inventory (VMI). Supply Chain Management: An International Journal 13 (6), 406–414.
D'Arcy, J., Devaraj, S., 2012. Employee misuse of information technology resources: testing a contemporary deterrence model. Decision Sciences 43 (6), 1091–1124.
Davis, D.F., Golicic, S.L., Marquardt, A., 2009. Measuring brand equity for logistics services. The International Journal of Logistics Management 20 (2), 201–212.
Ettlie, J.E., Pavlou, P.A., 2006. Technology-based new product development partnerships. Decision Sciences 37 (2), 117–147.
Golicic, S.L., Fugate, B.S., Davis, D.F., 2012. Examining market information and brand equity through resource-advantage theory: a carrier perspective. Journal of Business Logistics 33 (1), 20–33.
Gray, P.H., Meister, D.B., 2004. Knowledge sourcing effectiveness. Management Science 50 (6), 821–834.
Hall, D.J., Skipper, J.B., Hazen, B.T., Hanna, J.B., 2012. Inter-organizational IT use, cooperative attitude, and inter-organizational collaboration as antecedents to contingency planning effectiveness. The International Journal of Logistics Management 23 (1), 50–76.
Hartmann, E., de Grahl, A., 2012. Logistics outsourcing interfaces: the role of customer partnering behavior. International Journal of Physical Distribution & Logistics Management 42 (6), 526–543.
Hartmann, E.V.I., De Grahl, A., 2011. The flexibility of logistics service providers and its impact on customer loyalty: an empirical study. Journal of Supply Chain Management 47 (3), 63–85.
Hofenk, D., Schipper, R., Semeijn, J., Gelderman, C., 2011. The influence of contractual and relational factors on the effectiveness of third party logistics relationships. Journal of Purchasing and Supply Management 17 (3), 167–175.
Hoffmann, P., Schiele, H., Krabbendam, K., 2013. Uncertainty, supply risk management and their impact on performance. Journal of Purchasing and Supply Management 19 (3), 199–211.
Hsieh, J.J.P.-A., Rai, A., Xu, S.X., 2011. Extracting business value from IT: a sensemaking perspective of post-adoptive use. Management Science 57 (11), 2018–2039.
Hsu, J.S.-C., Lin, T.-C., Cheng, K.-T., Linden, L.P., 2012. Reducing requirement incorrectness and coping with its negative impact in information system development projects. Decision Sciences 43 (5), 929–955.
Hu, Q., Dinev, T., Hart, P., Cooke, D., 2012. Managing employee compliance with information security policies: the critical role of top management and organizational culture. Decision Sciences 43 (4), 615–660.
Ilie, V., Van Slyke, C., Parikh, M.A., Courtney, J.F., 2009. Paper versus electronic medical records: the effects of access on physicians' decisions to use complex information technologies. Decision Sciences 40 (2), 213–241.
Jeffers, P.I., Muhanna, W.A., Nault, B.R., 2008. Information technology and process performance: an empirical investigation of the interaction between IT and non-IT resources. Decision Sciences 39 (4), 703–735.
Johnston, D.A., McCutcheon, D.M., Stuart, F.I., Kerwood, H., 2004. Effects of supplier trust on performance of cooperative supplier relationships. Journal of Operations Management 22 (1), 23–38.
Keil, M., Depledge, G., Rai, A., 2007. Escalation: the role of problem recognition and cognitive bias. Decision Sciences 38 (3), 391–421.
Kern, D., Moser, R., Hartmann, E., Moder, M., 2012. Supply risk management: model development and empirical analysis. International
L. Kaufmann, J. Gaeckler / Journal of Purchasing & Supply Management ∎ (∎∎∎∎) ∎∎∎–∎∎∎
Journal of Physical Distribution & Logistics Management 42 (1), 60–82.
Klein, R., 2007. Customization and real time information access in integrated eBusiness supply chain relationships. Journal of Operations Management 25 (6), 1366–1381.
Klein, R., Rai, A., Straub, D.W., 2007. Competitive and cooperative positioning in supply chain logistics relationships. Decision Sciences 38 (4), 611–646.
Kocabasoglu, C., Suresh, N.C., 2006. Strategic sourcing: an empirical investigation of the concept and its practices in U.S. manufacturing firms. Journal of Supply Chain Management 42 (2), 4–16.
Lai, F., Li, D., Wang, Q., Zhao, X., 2008. The information technology capability of third-party logistics providers: a resource-based view and empirical evidence from China. Journal of Supply Chain Management 44 (3), 22–38.
Large, R.O., Gimenez Thomsen, C., 2011. Drivers of green supply management performance: evidence from Germany. Journal of Purchasing and Supply Management 17 (3), 176–184.
Large, R.O., Kramer, N., Hartmann, R.K., 2011. Customer-specific adaptation by providers and their perception of 3PL-relationship success. International Journal of Physical Distribution & Logistics Management 41 (9), 822–838.
Li, D., Chau, P.Y.K., Lai, F., 2010. Market orientation, ownership type, and e-business assimilation: evidence from Chinese firms. Decision Sciences 41 (1), 115–145.
Lin, T.-C., Cheng, H.K., Wang, F.-S., Chang, K.-J., 2012. A study of online auction sellers' intention to switch platform: the case of Yahoo! Kimo versus Ruten_eBay. Decision Sciences 43 (2), 241–272.
Looney, C.A., Akbulut, A.Y., Poston, R.S., 2008. Understanding the determinants of service channel preference in the early stages of adoption: a social cognitive perspective on online brokerage services. Decision Sciences 39 (4), 821–857.
Looney, C.A., Valacich, J.S., Todd, P.A., Morris, M.G., 2006. Paradoxes of online investing: testing the influence of technology on user expectancies. Decision Sciences 37 (2), 205–246.
Martin, P., Guide, J.V.D.R., Craighead, C.W., 2010. Supply chain sourcing in remanufacturing operations: an empirical investigation of remake versus buy. Decision Sciences 41 (2), 301–324.
McCarthy-Byrne, T.M., Mentzer, J.T., 2011. Integrating supply chain infrastructure and process to create joint value. International Journal of Physical Distribution & Logistics Management 41 (2), 135–161.
McCormack, K., Bronzo Ladeira, M., Paulo Valadares de Oliveira, M., 2008. Supply chain maturity and performance in Brazil. Supply Chain Management: An International Journal 13 (4), 272–282.
Miocevic, D., 2011. Organizational buying effectiveness in supply chain context: conceptualization and empirical assessment. Journal of Purchasing and Supply Management 17 (4), 246–255.
Mishra, A.N., Devaraj, S., Vaidyanathan, G., 2013. Capability hierarchy in electronic procurement and procurement process performance: an empirical analysis. Journal of Operations Management 31 (6), 376–390.
Morgan, N.A., Kaleka, A., Gooner, R.A., 2007. Focal supplier opportunism in supermarket retailer category management. Journal of Operations Management 25 (2), 512–527.
Oh, L.-B., Teo, H.-H., Sambamurthy, V., 2012. The effects of retail channel integration through the use of information technologies on firm performance. Journal of Operations Management 30 (5), 368–381.
Palvia, P.C., King, R.C., Xia, W., Palvia, S.C.J., 2010. Capability, quality, and performance of offshore IS vendors: a theoretical framework and empirical investigation. Decision Sciences 41 (2), 231–270.
Park, C., Keil, M., 2009. Organizational silence and whistle-blowing on IT projects: an integrated model. Decision Sciences 40 (4), 901–918.
Perols, J., Zimmermann, C., Kortmann, S., 2013. On the relationship between supplier integration and time-to-market. Journal of Operations Management 31 (3), 153–167.
Preston, D.S., Chen, D., Leidner, D.E., 2008. Examining the antecedents and consequences of CIO strategic decision-making authority: an empirical study. Decision Sciences 39 (4), 605–642.
Rai, A., Hornyak, R., 2013. The impact of sourcing enterprise system use and work process interdependence on sourcing professionals' job outcomes. Journal of Operations Management 31 (6), 474–488.
Ranganathan, C., Sethi, V., 2002. Rationality in strategic information technology decisions: the impact of shared domain knowledge and IT unit structure. Decision Sciences 33 (1), 59–86.
Ray, S., Ow, T., Kim, S.S., 2011. Security assurance: how online service providers can influence security control perceptions and gain trust. Decision Sciences 42 (2), 391–412.
Reuter, C., Goebel, P., Foerstl, K., 2012. The impact of stakeholder orientation on sustainability and cost prevalence in supplier selection decisions. Journal of Purchasing and Supply Management 18 (4), 270–281.
Rexhausen, D., Pibernik, R., Kaiser, G., 2012. Customer-facing supply chain practices—the impact of demand and distribution management on supply chain success. Journal of Operations Management 30 (4), 269–281.
Rosenzweig, E.D., 2009. A contingent view of e-collaboration and performance in manufacturing. Journal of Operations Management 27 (6), 462–478.
Saeed, K.A., Abdinnour, S., Lengnick-Hall, M.L., Lengnick-Hall, C.A., 2010. Examining the impact of pre-implementation expectations on post-implementation use of enterprise systems: a longitudinal study. Decision Sciences 41 (4), 659–688.
Sarker, S., Sarker, S., Chatterjee, S., Valacich, J.S., 2010. Media effects on group collaboration: an empirical examination in an ethical decision-making context. Decision Sciences 41 (4), 887–931.
Sarker, S., Sarker, S., Kirkeby, S., Chakraborty, S., 2011. Path to "stardom" in globally distributed hybrid teams: an examination of a knowledge-centered perspective using social network analysis. Decision Sciences 42 (2), 339–370.
Sawhney, R., 2013. Implementing labor flexibility: a missing link between acquired labor flexibility and plant performance. Journal of Operations Management 31 (1–2), 98–108.
Setia, P., Patel, P.C., 2013. How information systems help create OM capabilities: consequents and antecedents of operational absorptive capacity. Journal of Operations Management 31 (6), 409–431.
Sun, S.-Y., Hsu, M.-H., Hwang, W.-J., 2009. The impact of alignment between supply chain strategy and environmental uncertainty on
SCM performance. Supply Chain Management: An International Journal 14 (3), 201–212.
Tassabehji, R., 2010. Understanding e-auction use by procurement professionals: motivation, attitudes and perceptions. Supply Chain Management: An International Journal 15 (6), 425–437.
Teigland, R., Wasko, M.M., 2003. Integrating knowledge through information trading: examining the relationship between boundary spanning communication and individual performance. Decision Sciences 34 (2), 261–286.
Teo, T.S.H., Lai, K.-h., 2009. Usage and performance impact of electronic procurement. Journal of Business Logistics 30 (2), 125–139.
Thornton, L.M., Autry, C.W., Gligor, D.M., Brik, A.B., 2013. Does socially responsible supplier selection pay off for customer firms? a cross-cultural comparison. Journal of Supply Chain Management 49 (3), 66–89.
Venkatesh, V., Agarwal, R., 2006. Turning visitors into customers: a usability-centric perspective on purchase behavior in electronic channels. Management Science 52 (3), 367–382.
Venkatesh, V., Bala, H., 2008. Technology acceptance model 3 and a research agenda on interventions. Decision Sciences 39 (2), 273–315.
Venkatesh, V., Chan, F.K.Y., Thong, J.Y.L., 2012. Designing e-government services: key service attributes and citizens' preference structures. Journal of Operations Management 30 (1–2), 116–133.
Wallace, D.W., Johnson, J.L., Umesh, U.N., 2009. Multichannels strategy implementation: the role of channel alignment capabilities. Decision Sciences 40 (4), 869–900.
Wang, E.T.G., Wei, H.-L., 2007. Interorganizational governance value creation: coordinating for information visibility and flexibility in supply chains. Decision Sciences 38 (4), 647–674.
Xu, X., Venkatesh, V., Tam, K.Y., Hong, S.-J., 2010. Model of migration and use of platforms: role of hierarchy, current generation, and complementarities in consumer settings. Management Science 56 (8), 1304–1323.
Yigitbasioglu, O.M., 2010. Information sharing with key suppliers: a transaction cost theory perspective. International Journal of Physical Distribution & Logistics Management 40 (7), 550–578.
Yoon, K.B., Kim, H.S., Sohn, S.Y., 2008. An air force logistics management index for effective aircraft operation. Transportation Research Part E: Logistics and Transportation Review 44 (6), 1188–1204.
Yu, K., Cadeaux, J., Song, H., 2013. Distribution channel network and relational performance: the intervening mechanism of adaptive distribution flexibility. Decision Sciences 44 (5), 915–950.
References

Antonakis, J., Bendahan, S., Jacquart, P., Lalive, R., 2010. On making causal claims: a review and recommendations. Leadersh. Q. 21 (6), 1086–1120.
Bandalos, D.L., 2002. The effects of item parceling on goodness-of-fit and parameter estimate bias in structural equation modeling. Struct. Equ. Model. 9 (1), 78–102.
Bentler, P.M., Huang, W., 2014. On components, latent variables, PLS and simple methods: reactions to Rigdon's rethinking of PLS. Long Range Plan. 47 (3), 138–145.
Beuckelaer, A.D., Wagner, S.M., 2012. Small sample surveys: increasing rigor in supply chain management research. Int. J. Phys. Distrib. Logist. Manag. 42 (7), 615–639.
Bollen, K.A., Davis, W.R., 2009. Causal indicator models: identification, estimation, and testing. Struct. Equ. Model.: Multidiscip. J. 16 (3), 498–522.
Boomsma, A., Hoogland, J.J., 2001. The robustness of LISREL modeling revisited. In: Cudeck, R., Du Toit, S., Sörbom, D. (Eds.), Structural Equation Modeling: Present and Future. Scientific Software International, Chicago, IL, pp. 139–168.
Caniëls, M.C.J., Gehrsitz, M.H., Semeijn, J., 2013. Participation of suppliers in greening supply chains: an empirical analysis of German automotive suppliers. J. Purch. Supply Manag. 19 (3), 134–143.
Carter, C.R., Easton, P.L., Vellenga, D.B., Allen, B.J., 2009. Affiliation of authors in transportation and logistics academic journals: a reevaluation. Transp. J. 48 (1), 42–52.
Chen, I.J., Paulraj, A., 2004. Towards a theory of supply chain management: the constructs and measurements. J. Oper. Manag. 22 (2), 119–150.
Chin, W.W., 1995. Partial least squares is to LISREL as principal components analysis is to common factor analysis. Technol. Stud. 2, 315–319.
Chin, W.W., 1998. Issues and opinion on structural equation modeling. MIS Q. 22 (1).
Cohen, J., 1988. Statistical Power Analysis for the Behavioral Sciences. Lawrence Erlbaum, Hillsdale, NJ.
Cohen, J., 1992. A power primer. Psychol. Bull. 112 (1), 155–159.
Coleman, B.J., Bolumole, Y.A., Frankel, R., 2012. Benchmarking individual publication productivity in logistics. Transp. J. 51 (2), 164–196.
Crum, M.R., Poist, R.F., 2011. IJPDLM's 40th anniversary: an overview and retrospective analysis. Int. J. Phys. Distrib. Logist. Manag. 41 (1), 5–15.
D'Arcy, J., Devaraj, S., 2012. Employee misuse of information technology resources: testing a contemporary deterrence model. Decis. Sci. 43 (6), 1091–1124.
De Beuckelaer, A., Wagner, S.M., 2012. Small sample surveys: increasing rigor in supply chain management research. Int. J. Phys. Distrib. Logist. Manag. 42 (7), 615–639.
Denk, N., Kaufmann, L., Carter, C.R., 2012. Increasing the rigor of grounded theory research – a review of the SCM literature. Int. J. Phys. Distrib. Logist. Manag. 42 (8/9), 742–763.
Diamantopoulos, A., Riefler, P., 2011. Using formative measures in international marketing models: a cautionary tale using consumer animosity as an example. In: Sarstedt, M., Schwaiger, M., Taylor, C.R. (Eds.), Measurement and Research Methods in International Marketing. Emerald Group Publishing Ltd., pp. 11–30.
Dijkstra, T.K., 2014. PLS' Janus face – response to Professor Rigdon's 'Rethinking Partial Least Squares Modeling: In Praise of Simple Methods'. Long Range Plan. 47 (3), 146–153.
Elashoff, J.D., 2007. nQuery Advisor Version 7.0 User's Guide. Statistical Solutions, Los Angeles, CA.
Ellinger, A.E., Chapman, K., 2011. Benchmarking leading supply chain management and logistics strategy journals. Int. J. Logist. Manag. 22 (3), 403–419.
Esposito Vinzi, V., Chin, W.W., Henseler, J., Wang, H., 2010. Handbook of Partial Least Squares: Concepts, Methods and Applications. Springer, Heidelberg.
Fawcett, S.E., Waller, M.A., Miller, J.W., Schwieterman, M.A., Hazen, B.T., Overstreet, R.E., 2014. A trail guide to publishing success: tips on writing influential conceptual, qualitative, and survey research. J. Bus. Logist. 35 (1), 1–16.
Fornell, C., Larcker, D.F., 1981. Evaluating structural equation models with unobservable variables and measurement error. J. Mark. Res. 18 (1), 39–50.
Furneaux, B., Wade, M., 2011. An exploration of organizational level information systems discontinuance intentions. MIS Q. 35 (3), 573–598.
Gefen, D., Rigdon, E.E., Straub, D., 2011. An update and extension to SEM guidelines for administrative and social science research. MIS Q. 35 (2), iii–A7.
Geisser, S., 1975. A predictive approach to the random effect model. Biometrika 61 (1), 101–107.
Giannakis, M., 2012. The intellectual structure of the supply chain management discipline: a citation and social network analysis. J. Enterp. Inf. Manag. 25 (2), 136–169.
Giunipero, L.C., Hooker, R.E., Joseph-Matthews, S., Brudvig, S., Yoon, T., 2008. A decade of SCM literature: past, present, and future implications. J. Supply Chain Manag. 44 (4), 66–86.
Goodhue, D.L., Lewis, W., Thompson, R., 2012. Does PLS have advantages for small sample size or non-normal data? MIS Q. 36 (3), 981–1001.
Gorman, M.F., Kanet, J.J., 2011. A survey-based evaluation of logistics and transportation research journal quality. Transp. J. 50 (4), 390–415.
Gray, P.H., Meister, D.B., 2004. Knowledge sourcing effectiveness. Manag. Sci. 50 (6), 821–834.
Gudergan, S.P., Ringle, C.M., Wende, S., Will, A., 2008. Confirmatory tetrad analysis in PLS path modeling. J. Bus. Res. 61 (12), 1238–1249.
Hair, J., Sarstedt, M., Ringle, C., Mena, J., 2012a. An assessment of the use of partial least squares structural equation modeling in marketing research. J. Acad. Mark. Sci. 40 (3), 414–433.
Hair, J.F., Black, W.C., Babin, B.J., Anderson, R.E., Tatham, R.L., 2006. Multivariate Data Analysis. Pearson/Prentice Hall, Upper Saddle River, NJ.
Hair, J.F., Ringle, C.M., Sarstedt, M., 2011. PLS-SEM: indeed a silver bullet. J. Mark. Theory Pract. 19 (2), 139–152.
Hair, J.F., Ringle, C.M., Sarstedt, M., 2012b. Partial least squares: the better approach to structural equation modeling? Long Range Plan. 45 (5–6), 312–319.
Hair, J.F., Ringle, C.M., Sarstedt, M., 2013. Partial least squares structural equation modeling: rigorous applications, better results and higher acceptance. Long Range Plan. 46 (1–2), 1–12.
Hair, J.F., Sarstedt, M., Pieper, T.M., Ringle, C.M., 2012c. The use of partial least squares structural equation modeling in strategic management research: a review of past practices and recommendations for future applications. Long Range Plan. 45 (5–6), 320–340.
Hall, D.J., Skipper, J.B., Hazen, B.T., Hanna, J.B., 2012. Interorganizational IT use, cooperative attitude, and inter-organizational collaboration as antecedents to contingency planning effectiveness. Int. J. Logist. Manag. 23 (1), 50–76.
Hall, R.J., Snell, A.F., Foust, M.S., 1999. Item parceling strategies in SEM: investigating the subtle effects of unmodeled secondary constructs. Org. Res. Methods 2 (3), 233–256.
Harland, C.M., 2013. Supply chain management research impact: an evidence-based perspective. Supply Chain Manag. 18 (5), 483–496.
Hartmann, E., de Grahl, A., 2012. Logistics outsourcing interfaces: the role of customer partnering behavior. Int. J. Phys. Distrib. Logist. Manag. 42 (6), 526–543.
Hartmann, E.V.I., De Grahl, A., 2011. The flexibility of logistics service providers and its impact on customer loyalty: an empirical study. J. Supply Chain Manag. 47 (3), 63–85.
Henseler, J., Dijkstra, T.K., Sarstedt, M., Ringle, C.M., Diamantopoulos, A., Straub, D.W., Ketchen, D.J., Hair, J.F., Hult, G.T.M., Calantone, R.J., 2014. Common beliefs and reality about PLS: comments on Rönkkö and Evermann (2013). Org. Res. Methods 17 (2), 182–209.
Henseler, J., Ringle, C.M., Sinkovics, R.R., 2009. The use of partial least squares path modeling in international marketing. Adv. Int. Mark. 20, 277–319.
Henseler, J., Sarstedt, M., 2013. Goodness-of-fit indices for partial least squares path modeling. Comput. Stat. 28 (2), 565–580.
Hoejmose, S.U., Adrien-Kirby, A.J., 2012. Socially and environmentally responsible procurement: a literature review and future research agenda of a managerial issue in the 21st century. J. Purch. Supply Manag. 18 (4), 232–242.
Hoffmann, P., Schiele, H., Krabbendam, K., 2013. Uncertainty, supply risk management and their impact on performance. J. Purch. Supply Manag. 19 (3), 199–211.
Hulland, J., 1999. Use of partial least squares (PLS) in strategic management research: a review of four recent studies. Strat. Manag. J. 20 (2), 195–204.
Hwang, H., Malhotra, N.K., Kim, Y., Tomiuk, M.A., Hong, S., 2010. A comparative study on parameter recovery of three approaches to structural equation modeling. J. Mark. Res. 47 (4), 699–712.
Igarashi, M., de Boer, L., Fet, A.M., 2013. What is required for greener supplier selection? a literature review and conceptual model development. J. Purch. Supply Manag. 19 (4), 247–263.
Jarvis, C.B., MacKenzie, S.B., Podsakoff, P.M., Mick, D.G., Bearden, W.O., 2003. A critical review of construct indicators and measurement model misspecification in marketing and consumer research. J. Consum. Res. 30 (2), 199–218.
Kaufmann, L., Saw, A., 2014. Using a multiple-informant approach in SCM research. Int. J. Phys. Distrib. Logist. Manag. 44 (6), 511–527.
Klein, R., 2007. Customization and real time information access in integrated eBusiness supply chain relationships. J. Oper. Manag. 25 (6), 1366–1381.
Lane, D.M., 2008. HyperStat Online Statistics Textbook. Rice Virtual Lab in Statistics, Houston, TX (accessed 17.09.2014).
Lee, L., Petter, S., Fayard, D., Robinson, S., 2011. On the use of partial least squares path modeling in accounting research. Int. J. Account. Inf. Syst. 12 (4), 305–328.
Liao-Troth, S., Thomas, S., Fawcett, S.E., 2012. Twenty years of IJLM: evolution in research. Int. J. Logist. Manag. 23 (1), 4–30.
Lilliefors, H.W., 1967. On the Kolmogorov–Smirnov test for normality with mean and variance unknown. J. Am. Stat. Assoc. 62 (318), 399–402.
Lohmöller, J.B., 1989. Latent Variable Path Modeling with Partial Least Squares. Physica-Verlag, Heidelberg, Germany.
MacCallum, R.C., Browne, M.W., Sugawara, H.M., 1996. Power analysis and determination of sample size for covariance structure modeling. Psychol. Methods 1 (2), 130–149.
Marcoulides, G.A., Chin, W.W., Saunders, C., 2009. A critical look at partial least squares modeling. MIS Q. 33 (1), 171–175.
Martinez-Ruiz, A., Aluja-Banet, T., 2009. Toward the definition of a structural equation model of patent value: PLS path modelling with formative constructs. REVSTAT-Stat. J. 7 (3), 265–290.
McIntosh, C.N., Edwards, J.R., Antonakis, J., 2014. Reflections on partial least squares path modeling. Org. Res. Methods 17 (2), 210–251.
McQuitty, S., 2004. Statistical power and structural equation models in business research. J. Bus. Res. 57 (2), 175–183.
Medsker, G.J., Williams, L.J., Holahan, P.J., 1994. A review of current practices for evaluating causal models in organizational behavior and human resources management research. J. Manag. 20 (2), 439–464.
Miemczyk, J., Johnsen, T.E., Macquet, M., 2012. Sustainable purchasing and supply management: a structured literature review of definitions and measures at the dyad, chain and network levels. Supply Chain Manag. 17 (5), 478–496.
Näslund, D., Kale, R., Paulraj, A., 2010. Action research in supply chain management – a framework for relevant and rigorous research. J. Bus. Logist. 31 (2), 331–355.
O'Cass, A., Weerawardena, J., 2010. The effects of perceived industry competitive intensity and marketing-related capabilities: drivers of superior brand performance. Ind. Mark. Manag. 39 (4), 571–581.
Peng, D.X., Lai, F., 2012. Using partial least squares in operations management research: a practical guideline and summary of past research. J. Oper. Manag. 30 (6), 467–480.
Petter, S., Straub, D., Rai, A., 2007. Specifying formative constructs in information systems research. MIS Q. 31 (4), 623–656.
Rao, S., Iyengar, D., Goldsby, T.J., 2013. On the measurement and benchmarking of research impact among active logistics scholars. Int. J. Phys. Distrib. Logist. Manag. 43 (10), 814–832.
Reuter, C., Goebel, P., Foerstl, K., 2012. The impact of stakeholder orientation on sustainability and cost prevalence in supplier selection decisions. J. Purch. Supply Manag. 18 (4), 270–281.
Riedl, D.F., Kaufmann, L., Gaeckler, J., 2014. Statistical power of structural equation models in SCM research. J. Purch. Supply Manag. 20 (3), 208–212.
Rigdon, E.E., 2012. Rethinking partial least squares path modeling: in praise of simple methods. Long Range Plan. 45 (5–6), 341–358.
Ringle, C.M., Sarstedt, M., Straub, D.W., 2012a. A critical look at the use of PLS-SEM in MIS Quarterly. MIS Q. 36 (1).
Ringle, C.M., Sarstedt, M., Straub, D.W., 2012b. Editor's comments: a critical look at the use of PLS-SEM in MIS Quarterly. MIS Q. 36, iii–xiv.
Rönkkö, M., 2014. The effects of chance correlations on partial least squares path modeling. Org. Res. Methods 17 (2), 164–181.
Rönkkö, M., Evermann, J., 2013. A critical examination of common beliefs about partial least squares path modeling. Org. Res. Methods 16 (3), 425–448.
Royston, J.P., 1983. Some techniques for assessing multivariate normality based on the Shapiro-Wilk W. J. R. Stat. Soc. Ser. C (Appl. Stat.) 32 (2), 121–133.
Rozemeijer, F., Quintens, L., Wetzels, M., Gelderman, C., 2012. Vision 20/20: preparing today for tomorrow's challenges. J. Purch. Supply Manag. 18 (2), 63–67.
Rungtusanatham, M.J., Choi, T.Y., Hollingworth, D.G., Wu, Z., Forza, C., 2003. Survey research in operations management: historical analyses. J. Oper. Manag. 21 (4), 475–488.
Sarstedt, M., Henseler, J., Ringle, C.M., 2011. Multigroup analysis in partial least squares (PLS) path modeling: alternative methods and empirical results. In: Sarstedt, M., Schwaiger, M., Taylor, C.R. (Eds.), Measurement and Research Methods in International Marketing (Advances in International Marketing). Emerald Group Publishing Limited, pp. 195–218.
Sarstedt, M., Ringle, C.M., Henseler, J., Hair, J.F., 2014. On the emancipation of PLS-SEM: a commentary on Rigdon (2012). Long Range Plan. 47 (3), 154–160.
Setia, P., Patel, P.C., 2013. How information systems help create OM capabilities: consequents and antecedents of operational absorptive capacity. J. Oper. Manag. 31 (6), 409–431.
Shah, R., Goldstein, S.M., 2006. Use of structural equation modeling in operations management research: looking back and forward. J. Oper. Manag. 24 (2), 148–169.
Shook, C.L., Ketchen, D.J., Hult, G.T.M., Kacmar, K.M., 2004. An assessment of the use of structural equation modeling in strategic management research. Strat. Manag. J. 25 (4), 397–404.
Steenkamp, J.-B.E.M., Baumgartner, H., 2000. On the use of structural equation models for marketing modeling. Int. J. Res. Mark. 17 (2–3), 195–202.
Stone, M., 1974. Cross-validatory choice and assessment of statistical predictions. J. R. Stat. Soc. 36, 111–147.
Tenenhaus, M., Vinzi, V.E., Chatelin, Y.-M., Lauro, C., 2005. PLS path modeling. Comput. Stat. Data Anal. 48 (1), 159–205.
Thornton, L.M., Autry, C.W., Gligor, D.M., Brik, A.B., 2013. Does socially responsible supplier selection pay off for customer firms? a cross-cultural comparison. J. Supply Chain Manag. 49 (3), 66–89.
Verma, R., Goodale, J.C., 1995. Statistical power in operations management research. J. Oper. Manag. 13 (2), 139–152.
Wold, H., 1966. Estimation of principal components and related models by iterative least squares. In: Krishnaiah, P.R. (Ed.), International Symposium on Multivariate Analysis. Academic Press, Dayton, OH, pp. 391–420.
Wynstra, F., 2010. What did we do, who did it and did it matter? A review of fifteen volumes of the (European) Journal of Purchasing and Supply Management. J. Purch. Supply Manag. 16 (4), 279–292.
Xu, X., Venkatesh, V., Tam, K.Y., Hong, S.-J., 2010. Model of migration and use of platforms: role of hierarchy, current generation, and complementarities in consumer settings. Manag. Sci. 56 (8), 1304–1323.
Yigitbasioglu, O.M., 2010. Information sharing with key suppliers: a transaction cost theory perspective. Int. J. Phys. Distrib. Logist. Manag. 40 (7), 550–578.
Yu, K., Cadeaux, J., Song, H., 2013. Distribution channel network and relational performance: the intervening mechanism of adaptive distribution flexibility. Decis. Sci. 44 (5), 915–950.
Zsidisin, G.A., Smith, M.E., McNally, R.C., Kull, T.J., 2007. Evaluation criteria development and assessment of purchasing and supply management journals. J. Oper. Manag. 25 (1), 165–183.
Please cite this article as: Kaufmann, L., Gaeckler, J., A structured review of partial least squares in supply chain management research. Journal of Purchasing and Supply Management (2015), http://dx.doi.org/10.1016/j.pursup.2015.04.005i