ERSA European Regional Science Association

ERSA 2003 Congress

Abstracts

The abstract for paper number 138:

Roger von Haefen, University of Arizona, Tucson, USA, H. Spencer Banzhaf, Resources for the Future, Washington, USA
Alternative Strategies for Incorporating Weak Complementarity into Consumer Demand System Specifications

Weak complementarity represents one of the key behavioral restrictions that environmental economists use to recover welfare measures for quality changes from revealed preference data. Although the theoretical implications of weak complementarity have been extensively studied (see, e.g., Smith and Banzhaf [2002], Palmquist [2001]), a systematic investigation of alternative empirical strategies for developing weakly complementary demand system models has not been conducted. In this paper we conceptually and empirically compare four strategies for developing weakly complementary demand system models that have been proposed in the nonmarket valuation literature. The alternative strategies are presented in their most general forms, evaluated in terms of their implications for consumer behavior and welfare measurement, and compared empirically with a common data set. In sum, our investigation represents a comprehensive assessment of the empirical implications of weak complementarity for valuing quality changes.

As developed by Mäler [1974], Bradford and Hildebrand [1977], and Willig [1978], weak complementarity implies that individuals who do not consume a good are not willing to pay for marginal improvements in the good’s quality. If this restriction is valid, the analyst can recover consistent welfare measures for changes in quality from Hicksian demand functions. At present there are four generic strategies for developing demand system models that are consistent with weak complementarity. The first and oldest approach assumes that quality enters preferences as either an additive or a multiplicative augmentation to price (e.g., Willig [1976], Griliches [1964]). This approach to developing weakly complementary preferences has been widely used in applied demand analysis, and its implications for behavior and welfare measurement are well understood.
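
To fix ideas, a minimal sketch of this structure (the notation here is purely illustrative and need not match the paper’s own specification): let u(x, z, q) denote utility over a quality-differentiated good x with quality q and a numeraire z. Weak complementarity requires

    \partial u(0, z, q) / \partial q = 0,

so that quality is valueless when the good is not consumed; equivalently, the expenditure function satisfies \partial e(p, q, u) / \partial q = 0 at the choke price where Hicksian demand for the good is zero. Price augmentation imposes this restriction by letting quality enter only through an effective price, for example

    (p - g(q)) x + z = y  (additive)    or    (p / g(q)) x + z = y  (multiplicative),

with g increasing in q and u = u(x, z) itself free of q; when x = 0, q drops out of the consumer’s problem entirely, so weak complementarity holds by construction.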

A second approach, introduced indirectly by Larson [1991] and employed recently by Herriges, Kling, and Phaneuf [forthcoming], is relatively new to the nonmarket valuation literature. For lack of a better term, we label the approach “subtracting off” because utility at zero consumption is subtracted off the utility function to impose weak complementarity. Using a decomposition of total value proposed by Herriges, Kling, and Phaneuf, we show that the “subtracting off” approach assumes that nonuse values can be large and negative, a somewhat troubling implication for models that are estimated with only revealed preference data.
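
In the same stylized notation (again only a sketch), the “subtracting off” device starts from an arbitrary utility function u(x, z, q), in which quality may matter even at zero consumption, and defines

    \tilde{u}(x, z, q) = u(x, z, q) - u(0, z, q),

so that \tilde{u}(0, z, q) = 0 for every q: a nonuser is indifferent to quality by construction, and weak complementarity holds. The Herriges, Kling, and Phaneuf decomposition referred to above splits the total value of a quality change into a use component consistent with this weakly complementary representation and a residual nonuse component, and it is this residual that can be large and negative.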

A third strategy is Larson’s [1992] “integrating back” approach to developing weakly complementary demand systems. It begins with a system of demand equations that depend on environmental quality. In some cases, these demand equations can be linked back to a quasi-expenditure function that is known only up to a constant of integration, which is itself a function of environmental quality. Weak complementarity places structure on how quality enters the constant of integration, and in some cases the analyst can exploit this structure to recover a quasi-expenditure function whose constant of integration is independent of quality. The analyst thereby recovers a complete characterization of how quality enters preferences. Although Larson’s strategy is a theoretically consistent route to recovering weakly complementary preferences, we argue that it yields little additional insight relative to the approaches discussed above. The two examples that Larson considers can be interpreted as special cases of the augmenting and “subtracting off” approaches, and our own implementation of his approach with twelve linear and semi-log empirical demand specifications suggests that it may not generally be possible to recover a full description of preferences in this way except when quality enters as an augmentation to price.
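
A compact way to see the mechanics (a sketch in illustrative notation): given a Marshallian demand x(p, q, y), the quasi-expenditure function e(p, q, u) solves the differential equation

    \partial e(p, q, u) / \partial p = x(p, q, e(p, q, u)),

whose solution is determined only up to a constant of integration C(q, u). Weak complementarity then requires

    \partial e(\tilde{p}(q, u), q, u) / \partial q = 0,

where \tilde{p}(q, u) is the choke price at which demand falls to zero, and this condition restricts how q may enter C(q, u). Whether the restriction pins down C(q, u) completely depends on the functional form of the demand system, which is why full recovery of preferences can fail for the linear and semi-log specifications mentioned above.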

A final approach to developing weakly complementary demand systems relaxes the smooth relationship between utility and quality that the other approaches implicitly assume. The non-smoothness approach, originally suggested by Bockstael, Hanemann, and Strand [1986], assumes that there are regime-specific conditional direct utility functions, where regimes are defined by the combination of interior and corner solutions for the M quality-differentiated goods. Each regime-specific utility function has a weak complementarity assumption embedded in it and therefore depends only on the quality of the goods that are consumed in strictly positive quantities. The unconditional direct utility function is the maximum across the regime-specific conditional direct utility functions. Although we show that estimating a restricted version of this preference specification is relatively straightforward, we argue that the resulting welfare measures for quality changes are not independent of how the analyst arbitrarily introduces non-smoothness into preferences.
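
Formally (illustrative notation for the idea just described), let \omega denote a regime, i.e., a subset of the M quality-differentiated goods consumed in strictly positive quantities, with conditional direct utility u_\omega(x, z, (q_j)_{j \in \omega}) that omits the qualities of goods outside \omega. The unconditional direct utility function is

    u(x, z, q) = \max_{\omega} u_\omega(x, z, (q_j)_{j \in \omega}).

Because each conditional function omits the qualities of unconsumed goods, nonusers of a good do not value its quality, so weak complementarity is embedded regime by regime; the max operator, however, makes preferences non-smooth, and different ways of introducing this non-smoothness can generate different welfare measures, which is the arbitrariness noted above.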

To illustrate and compare the empirical properties of the four strategies described above, we use a carefully designed data set of recreational fishing trips in the Fox River region of Wisconsin, assembled by Desvousges, MacNair, and Smith [2000] but not yet introduced in the published literature. The data are a panel of the actual trips taken by over 500 anglers from June to September 1998 and include approximately 3,300 single-day recreational fishing trips undertaken at 700 locations. Because they come from a panel of anglers, the data can support both traditional random utility models and Kuhn-Tucker demand models. Additively separable, random coefficient, Kuhn-Tucker continuous demand system specifications that impose weak complementarity in each of the four ways described above are estimated within a Bayesian framework. In contrast to the classical estimation strategies employed by von Haefen, Phaneuf, and Parsons [unpublished], our Bayesian framework is attractive because it allows all coefficients entering consumer preferences to vary randomly across the population without jeopardizing the computational tractability of estimation. Variations of an adaptive Markov chain Monte Carlo simulation algorithm are used to construct welfare measures for multiple quality-change scenarios. We use statistical criteria, the plausibility of the models’ behavioral predictions, and the properties of the welfare measures discussed in our conceptual comparison to assess which of the four approaches generates welfare measures that are most defensible for policy purposes.
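
As a stylized illustration of the econometric framework (generic functional forms, not necessarily those used in the paper), an additively separable Kuhn-Tucker demand system can be written as

    \max_{x \ge 0, z} \sum_{j=1}^{M} \Psi_j(q_j, \varepsilon_j) \phi(x_j) + \theta(z)    s.t.    \sum_j p_j x_j + z = y,

with Kuhn-Tucker first-order conditions \Psi_j(q_j, \varepsilon_j) \phi'(x_j) \le \lambda p_j, holding with equality whenever x_j > 0. Inverting these conditions links each angler’s observed pattern of interior and corner solutions to the unobserved errors \varepsilon_j and yields the likelihood. In the random coefficient Bayesian treatment described above, the preference parameters vary across anglers according to a population distribution whose hyperparameters are updated jointly with the individual-level parameters by the Markov chain Monte Carlo sampler, and welfare measures for each quality-change scenario are then simulated from the resulting posterior draws.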

Unfortunately, the full paper has not been submitted.
