<?xml version="1.0" encoding="UTF-8"?>
<feed xmlns="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
<title>Center for Innovation in Product Development (CIPD)</title>
<link href="https://hdl.handle.net/1721.1/3764" rel="alternate"/>
<subtitle/>
<id>https://hdl.handle.net/1721.1/3764</id>
<updated>2026-04-06T17:13:07Z</updated>
<dc:date>2026-04-06T17:13:07Z</dc:date>
<entry>
<title>Complex System Classification</title>
<link href="https://hdl.handle.net/1721.1/6753" rel="alternate"/>
<author>
<name>Magee, Christopher</name>
</author>
<author>
<name>de Weck, Olivier</name>
</author>
<id>https://hdl.handle.net/1721.1/6753</id>
<updated>2025-05-14T14:52:31Z</updated>
<published>2004-07-24T00:00:00Z</published>
<summary type="text">Complex System Classification
Magee, Christopher; de Weck, Olivier
Terms such as “Engineering Systems”, “system of systems”, and others have come into greater use over the past decade to denote systems of importance with implied higher complexity than the term “system” alone. This paper searches for a useful taxonomy or classification scheme for complex systems. There are two aspects to this problem: 1) distinguishing between Engineering Systems (the term we use) and other systems, and 2) differentiating among Engineering Systems. Engineering Systems are found to be differentiated from other complex systems by being human-designed and having both significant human complexity and significant technical complexity. As for differentiating among various engineering systems, it is suggested that functional type is the most useful attribute for classification. Information, energy, value, and mass acted upon by various processes are the foundation concepts underlying the technical types.
</summary>
<dc:date>2004-07-24T00:00:00Z</dc:date>
</entry>
<entry>
<title>Architecting and Innovating</title>
<link href="https://hdl.handle.net/1721.1/5064" rel="alternate"/>
<author>
<name>Campbell, Ronald B. Jr.</name>
</author>
<id>https://hdl.handle.net/1721.1/5064</id>
<updated>2025-05-14T14:52:32Z</updated>
<published>2004-04-14T00:00:00Z</published>
<summary type="text">Architecting and Innovating
Campbell, Ronald B. Jr.
Innovating is essential to sustained industrial growth and profitability. But experience amply demonstrates how difficult innovation is, especially for large companies. The synthesis of valued offerings by aligning customer needs with technology possibilities lies at the heart of innovation. System architects working at the strategic level are ideally positioned, as a consequence of their experience and training, to play a key and even a leadership role in enabling, energizing, and leading this synthesis. The scope of the architecting effort must include the process architecture of the entire value chain as well as the more conventional product architecture to address all potential wellsprings of innovation. This paper outlines an architecture-centric approach to innovation, based on the concept of the system platform architecture.
</summary>
<dc:date>2004-04-14T00:00:00Z</dc:date>
</entry>
<entry>
<title>Drive Out Fear (Unless You Can Drive It In): The role of agency and job security in process improvement</title>
<link href="https://hdl.handle.net/1721.1/3962" rel="alternate"/>
<author>
<name>Repenning, Nelson</name>
</author>
<id>https://hdl.handle.net/1721.1/3962</id>
<updated>2019-04-12T08:27:55Z</updated>
<published>1998-11-01T00:00:00Z</published>
<summary type="text">Drive Out Fear (Unless You Can Drive It In): The role of agency and job security in process improvement
Repenning, Nelson
Understanding the wide range of outcomes achieved by firms trying to implement TQM and similar process improvement initiatives presents a challenge to management science and organization theory: a few firms reap sustained benefits from their programs, but most efforts fail and are abandoned. A defining feature of such techniques is the reliance on the front-line workforce to do the work of improvement, thus creating the possibility of agency problems: different incentives facing managers and workers. Specifically, successfully improving productivity can lead to lay-offs. The literature provides two opposing theories of how agency interacts with the ability of quality-oriented improvement techniques to dramatically increase productivity. The 'Drive Out Fear' school argues that firms must commit to job security, while the 'Drive In Fear' school emphasizes the positive role that insecurity plays in motivating change. In this study a contract theoretic model is developed to analyze the role of agency in process improvement. The main insight of the study is that there are two types of job security, internal and external, that have opposite impacts on the firm's ability to implement improvement initiatives. The distinction is useful in explaining the results of different case studies and can reconcile the two change theories.
</summary>
<dc:date>1998-11-01T00:00:00Z</dc:date>
</entry>
<entry>
<title>Understanding Fire Fighting in New Product Development</title>
<link href="https://hdl.handle.net/1721.1/3961" rel="alternate"/>
<author>
<name>Repenning, Nelson</name>
</author>
<id>https://hdl.handle.net/1721.1/3961</id>
<updated>2019-04-09T15:27:12Z</updated>
<published>2001-03-01T00:00:00Z</published>
<summary type="text">Understanding Fire Fighting in New Product Development
Repenning, Nelson
Despite documented benefits, the processes described in the new product development literature often prove difficult to follow in practice. A principal source of such difficulties is the phenomenon of fire fighting: the unplanned allocation of resources to fix problems discovered late in a product's development cycle. While it has been widely criticized, fire fighting is a common occurrence in many product development organizations. To understand both its existence and persistence, in this article I develop a formal model of fire fighting in a multi-project development environment. The major contributions of this analysis are to suggest that: (1) fire fighting can be a self-reinforcing phenomenon; and (2) multi-project development systems are far more susceptible to this dynamic than is currently appreciated. These insights suggest that many of the current methods for aggregate resource and product portfolio planning, while necessary, are not sufficient to prevent fire fighting and the consequent low performance.
</summary>
<dc:date>2001-03-01T00:00:00Z</dc:date>
</entry>
<entry>
<title>Metrics Thermostat</title>
<link href="https://hdl.handle.net/1721.1/3960" rel="alternate"/>
<author>
<name>Hauser, John</name>
</author>
<id>https://hdl.handle.net/1721.1/3960</id>
<updated>2019-04-11T07:03:42Z</updated>
<published>2001-07-01T00:00:00Z</published>
<summary type="text">Metrics Thermostat
Hauser, John
The explosion of information and information technology has led many firms to evolve a dispersed product development process with people and organizations spread throughout the world. To coordinate such dispersed processes managers attempt to establish a culture that implicitly rewards product development teams based on their ability to perform against a set of strategic metrics such as customer satisfaction, time to market, defect reduction, or platform reuse. Many papers have focused on selecting the right metrics and establishing the culture. In this paper we focus on a practical method to fine-tune a firm's relative emphasis on the metrics that they have chosen. In particular, we seek to advise a firm whether to increase or decrease their emphasis on each metric such that the change in emphasis improves profits. Using a thermostat analogy we apply an adaptive control feedback mechanism in which we estimate the incremental improvements in priorities that will increase profits. Iterations of adaptive control seek to maximize profits even if the environment is changing. We demonstrate the metric thermostat’s use in an application to a firm with over $20 billion in revenue. In developing the metric thermostat we recognize that there are hundreds of detailed actions, such as the use of the house of quality and the use of robust design, among which the product development team must choose. We also recognize that they will act in their own best interests to choose the actions that maximize their own implicit rewards as determined by the metrics. Management need not observe or dictate these detailed actions, but rather control the process by establishing the culture that sets the implicit weights on the metrics. The thermostat works by changing those implicit weights. 
We define the problem, introduce the adaptive control mechanism, modify “agency” theory to deal with incremental changes about an operating point, and derive methods that are practical and robust in light of the data that firms have available. Our methods include statistical estimation and internal surveys. The mathematics identify the critical few parameters that need to be determined and highlight how to estimate them. Both the measures and the estimation are illustrated in our initial application to a large office equipment firm. The metrics thermostat suggests that this firm has about the right emphasis on time-to-market, but has overshot on platform reuse and has lost its focus on customer satisfaction. We describe how the firm reacted to the recommendations and changed its organization. We describe additional ongoing applications with the US Air Force, the US Navy, and a major automobile and truck manufacturer.
</summary>
<dc:date>2001-07-01T00:00:00Z</dc:date>
</entry>
<entry>
<title>Transferring, Translating and Transforming: An Integrative Framework</title>
<link href="https://hdl.handle.net/1721.1/3959" rel="alternate"/>
<author>
<name>Carlile, Paul</name>
</author>
<id>https://hdl.handle.net/1721.1/3959</id>
<updated>2019-04-09T17:47:25Z</updated>
<published>2002-03-10T00:00:00Z</published>
<summary type="text">Transferring, Translating and Transforming: An Integrative Framework
Carlile, Paul
Organizations must establish processes for managing knowledge across boundaries because of the specialized and task-dependent forms of knowledge required to deliver products and services. To address this challenge an integrative framework is developed that identifies and integrates the value of different approaches to managing knowledge in organizations that are often presented as incompatible in the literature. The framework describes three progressively complex types of boundaries: syntactic, semantic and pragmatic. Each increasingly complex boundary requires a more complex process to facilitate communication and innovation across specialized forms of knowledge. The framework categorizes types of boundaries, gauges their complexity, and then describes the processes involved in managing knowledge across each of them. The development of a new engineering tool in an automotive firm is presented to illustrate the conceptual strength of this framework.
</summary>
<dc:date>2002-03-10T00:00:00Z</dc:date>
</entry>
<entry>
<title>Platform Architecture: A Two-Level Optimization Approach</title>
<link href="https://hdl.handle.net/1721.1/3958" rel="alternate"/>
<author>
<name>de Weck, Olivier</name>
</author>
<author>
<name>Suh, Eun Suk</name>
</author>
<id>https://hdl.handle.net/1721.1/3958</id>
<updated>2019-04-12T08:08:29Z</updated>
<published>2002-10-03T00:00:00Z</published>
<summary type="text">Platform Architecture: A Two-Level Optimization Approach
de Weck, Olivier; Suh, Eun Suk
Presentation outline: 1 - Introduction to Platform Architecture in Products (VW Golf); 2 - Automotive Platforming Example: A Two-Level Optimization Approach; 3 - Discussion
</summary>
<dc:date>2002-10-03T00:00:00Z</dc:date>
</entry>
<entry>
<title>Putting Patents in Context: Exploring Knowledge Transfer from MIT</title>
<link href="https://hdl.handle.net/1721.1/3957" rel="alternate"/>
<author>
<name>Agrawal, Ajay</name>
</author>
<author>
<name>Henderson, Rebecca</name>
</author>
<id>https://hdl.handle.net/1721.1/3957</id>
<updated>2019-04-10T12:16:28Z</updated>
<published>2001-08-09T00:00:00Z</published>
<summary type="text">Putting Patents in Context: Exploring Knowledge Transfer from MIT
Agrawal, Ajay; Henderson, Rebecca
In this paper we explore the degree to which patents are representative of the magnitude, direction, and impact of the knowledge spilling out of the university by focusing on MIT, and in particular on the departments of Mechanical and Electrical Engineering. Drawing on both qualitative and quantitative data, we show that patenting is a minority activity: a majority of the faculty in our sample never patent, and publication rates far outstrip patenting rates. Most faculty members estimate that patents account for less than 10% of the knowledge that transfers from their labs. Our results also suggest that in two important ways patenting is not representative of the patterns of knowledge generation and transfer from MIT: patent volume does not predict publication volume, and those firms that cite MIT papers are in general not the same firms as those that cite MIT patents. However, patent volume is positively correlated with paper citations, suggesting that patent counts may be reasonable measures of research impact. We close by speculating on the implications of our results for the difficult but important question of whether, in this setting, patenting acts as a substitute or a complement to the process of fundamental research.
</summary>
<dc:date>2001-08-09T00:00:00Z</dc:date>
</entry>
<entry>
<title>Why Firefighting Is Never Enough: Preserving High-Quality Product Development</title>
<link href="https://hdl.handle.net/1721.1/3956" rel="alternate"/>
<author>
<name>Black, Laura</name>
</author>
<author>
<name>Repenning, Nelson</name>
</author>
<id>https://hdl.handle.net/1721.1/3956</id>
<updated>2019-04-10T19:40:56Z</updated>
<published>2000-01-01T00:00:00Z</published>
<summary type="text">Why Firefighting Is Never Enough: Preserving High-Quality Product Development
Black, Laura; Repenning, Nelson
Understanding the wide range of outcomes achieved by firms trying to implement TQM and similar process improvement initiatives presents a challenge to management science and organization theory: a few firms reap sustained benefits from their programs, but most efforts fail and are abandoned. A defining feature of such techniques is the reliance on the front-line workforce to do the work of improvement, thus creating the possibility of agency problems: different incentives facing managers and workers. Specifically, successfully improving productivity can lead to lay-offs. The literature provides two opposing theories of how agency interacts with the ability of quality-oriented improvement techniques to dramatically increase productivity. The 'Drive Out Fear' school argues that firms must commit to job security, while the 'Drive In Fear' school emphasizes the positive role that insecurity plays in motivating change. In this study a contract theoretic model is developed to analyze the role of agency in process improvement. The main insight of the study is that there are two types of job security, internal and external, that have opposite impacts on the firm's ability to implement improvement initiatives. The distinction is useful in explaining the results of different case studies and can reconcile the two change theories.
</summary>
<dc:date>2000-01-01T00:00:00Z</dc:date>
</entry>
<entry>
<title>Metrics Thermostat</title>
<link href="https://hdl.handle.net/1721.1/3823" rel="alternate"/>
<author>
<name>Hauser, John</name>
</author>
<id>https://hdl.handle.net/1721.1/3823</id>
<updated>2019-04-12T08:06:53Z</updated>
<published>2001-07-01T00:00:00Z</published>
<summary type="text">Metrics Thermostat
Hauser, John
The explosion of information and information technology has led many firms to evolve a dispersed product development process with people and organizations spread throughout the world. To coordinate such dispersed processes managers attempt to establish a culture that implicitly rewards product development teams based on their ability to perform against a set of strategic metrics such as customer satisfaction, time to market, defect reduction, or platform reuse. Many papers have focused on selecting the right metrics and establishing the culture. In this paper we focus on a practical method to fine-tune a firm's relative emphasis on the metrics that they have chosen. In particular, we seek to advise a firm whether to increase or decrease their emphasis on each metric such that the change in emphasis improves profits.
Using a thermostat analogy we apply an adaptive control feedback mechanism in which we estimate the incremental improvements in priorities that will increase profits. Iterations of adaptive control seek to maximize profits even if the environment is changing. We demonstrate the metric thermostat’s use in an application to a firm with over $20 billion in revenue.
In developing the metric thermostat we recognize that there are hundreds of detailed actions, such as the use of the house of quality and the use of robust design, among which the product development team must choose. We also recognize that they will act in their own best interests to choose the actions that maximize their own implicit rewards as determined by the metrics. Management need not observe or dictate these detailed actions, but rather control the process by establishing the culture that sets the implicit weights on the metrics. The thermostat works by changing those implicit weights.
We define the problem, introduce the adaptive control mechanism, modify “agency” theory to deal with incremental changes about an operating point, and derive methods that are practical and robust in light of the data that firms have available. Our methods include statistical estimation and internal surveys. The mathematics identify the critical few parameters that need to be determined and highlight how to estimate them. Both the measures and the estimation are illustrated in our initial application to a large office equipment firm. The metrics thermostat suggests that this firm has about the right emphasis on time-to-market, but has overshot on platform reuse and has lost its focus on customer satisfaction. We describe how the firm reacted to the recommendations and changed its organization. We describe additional ongoing applications with the US Air Force, the US Navy, and a major automobile and truck manufacturer.
</summary>
<dc:date>2001-07-01T00:00:00Z</dc:date>
</entry>
<entry>
<title>Untangling the Origins of Competitive Advantage</title>
<link href="https://hdl.handle.net/1721.1/3822" rel="alternate"/>
<author>
<name>Cockburn, Iain</name>
</author>
<author>
<name>Henderson, Rebecca</name>
</author>
<author>
<name>Stern, Scott</name>
</author>
<id>https://hdl.handle.net/1721.1/3822</id>
<updated>2019-04-12T08:06:53Z</updated>
<published>2000-03-01T00:00:00Z</published>
<summary type="text">Untangling the Origins of Competitive Advantage
Cockburn, Iain; Henderson, Rebecca; Stern, Scott
What are the origins of competitive advantage? Although this question is fundamental to strategy research, it is one to which we lack a clear answer. As strategy researchers we believe that some firms consistently outperform others, and we have some evidence consistent with this belief (Rumelt, 1991; McGahan and Porter, 1997). We also have a number of well developed theories as to why, at any given moment, it is possible for some firms (and some industries) to earn supranormal returns. As of yet, however, we have no generally accepted theory, and certainly no systematic evidence, as to the origins or the dynamics of such differences in performance. We know, for example, why high barriers to entry coupled with a differentiated product positioning obtained through unique organizational competencies may provide a firm with competitive advantage. But we know much less about how barriers to entry are built: about why this firm and not that one developed the competencies that underlie advantage, and about the dynamic process out of which competitive advantage first arises and then erodes over time.
A paper prepared for the SMJ Special Issue on The Evolution of Firm Capabilities
</summary>
<dc:date>2000-03-01T00:00:00Z</dc:date>
</entry>
<entry>
<title>Establishing Quantitative Economic Value for Features and Functionality of New Products and New Services (CHAPTER N)</title>
<link href="https://hdl.handle.net/1721.1/3821" rel="alternate"/>
<author>
<name>Otto, Kevin</name>
</author>
<author>
<name>Tang, Victor</name>
</author>
<author>
<name>Seering, Warren</name>
</author>
<id>https://hdl.handle.net/1721.1/3821</id>
<updated>2025-05-14T14:52:32Z</updated>
<published>2003-09-13T00:00:00Z</published>
<summary type="text">Establishing Quantitative Economic Value for Features and Functionality of New Products and New Services (CHAPTER N)
Otto, Kevin; Tang, Victor; Seering, Warren
This chapter has two key themes: (1) a list of customer needs is interesting, but insufficient for many development decisions; (2) establishing a quantified, dollar value for each requirement is more helpful. To that end, we present an approach and method for establishing the quantitative monetary value of new product features and performance. This approach is targeted to product development managers and engineers engaged at the “front-end” of the product development process, when the decisions about selection and trade-off of product functions and features are made. This approach examines the customer’s business operations, essentially establishing the customer’s business case for your product down to the feature and performance levels. This provides for much better trade-off decisions in new product development. This approach also helps to identify whitespace opportunities: those new product and/or service opportunities that are not being served by any current product. Moreover, because the methodology is fine grained, the whitespace opportunities are resolved into clear and actionable product development projects.
</summary>
<dc:date>2003-09-13T00:00:00Z</dc:date>
</entry>
<entry>
<title>Organizational Languages</title>
<link href="https://hdl.handle.net/1721.1/3820" rel="alternate"/>
<author>
<name>Wernerfelt, Birger</name>
</author>
<id>https://hdl.handle.net/1721.1/3820</id>
<updated>2025-05-14T14:52:32Z</updated>
<published>2003-11-11T00:00:00Z</published>
<summary type="text">Organizational Languages
Wernerfelt, Birger
The paper is concerned with communication within a team of players trying to coordinate in response to information dispersed among them. The problem is nontrivial because they cannot communicate all information instantaneously, but have to send longer or shorter sequences of messages, using coarse codes. We focus on the design of these codes and show that members may gain compatibility advantages by using identical codes, and that this can support the existence of several, more or less efficient, symmetric equilibria. Asymmetric equilibria may exist only if coordination across different sets of members is of sufficiently different importance. The results are consistent with the stylized facts that firms differ even within industries and that coordination between divisions is harder than coordination inside divisions.
</summary>
<dc:date>2003-11-11T00:00:00Z</dc:date>
</entry>
<entry>
<title>Product Development Processes, Three Vectors Of Improvement</title>
<link href="https://hdl.handle.net/1721.1/3819" rel="alternate"/>
<author>
<name>Holmes, Maurice</name>
</author>
<author>
<name>Campbell, Ronald</name>
</author>
<id>https://hdl.handle.net/1721.1/3819</id>
<updated>2025-05-14T14:52:32Z</updated>
<published>2003-01-01T00:00:00Z</published>
<summary type="text">Product Development Processes, Three Vectors Of Improvement
Holmes, Maurice; Campbell, Ronald
Product Development Processes have achieved a state of some maturity in recent years, but have focused primarily on structuring technical activities from the initiation of development to launch. We advocate major advances on three fronts: first, implementing an end-to-end process from the front end through field operations; second, integrating business considerations much better into the end-to-end process; and third, incorporating a performance improvement closed loop into the process. We call the resulting process a Product Development Business Process. Three initial applications are summarized.
Improving product development processes along three key vectors leads to greatly improved business performance.
</summary>
<dc:date>2003-01-01T00:00:00Z</dc:date>
</entry>
<entry>
<title>Surviving the Gales of Creative Destruction: The Patterns of Innovative Activity in the Desktop Laser Printer Industry</title>
<link href="https://hdl.handle.net/1721.1/3818" rel="alternate"/>
<author>
<name>de Figueiredo, John</name>
</author>
<author>
<name>Kyle, Margaret</name>
</author>
<id>https://hdl.handle.net/1721.1/3818</id>
<updated>2025-05-14T14:52:32Z</updated>
<published>2000-07-14T00:00:00Z</published>
<summary type="text">Surviving the Gales of Creative Destruction: The Patterns of Innovative Activity in the Desktop Laser Printer Industry
de Figueiredo, John; Kyle, Margaret
In this paper, we examine the product life cycle in the desktop laser printer industry, from its inception in 1984 through 1996. During this time, the industry experienced a significant degree of innovation, as well as an enormous amount of product introduction and subsequent exit. The relative roles of market structure, innovation, and firm effects are explored in more detail using a multidimensional product space. We introduce a very detailed product-level dataset on the desktop laser printer industry. We have a number of findings: (1) product portfolios of firms are growing larger on average, as fewer firms offer more products; (2) products on the technological frontier have better survival prospects than printers behind the frontier; (3) product characteristics, such as page description language, speed, and resolution, have the largest effect on product survival rates; (4) awards granted to models and firms by leading PC publications have no effect on hazard rates of the current product portfolios of firms, but lead to much higher entry rates by the firm; and (5) while there are many similarities between dominant and fringe firms, differences in innovative and product life cycle behavior persist, a fact often overlooked in current studies of economic activity.
</summary>
<dc:date>2000-07-14T00:00:00Z</dc:date>
</entry>
<entry>
<title>Modeling Complex Behavior Simply with Embedded System Engineering</title>
<link href="https://hdl.handle.net/1721.1/3817" rel="alternate"/>
<author>
<name>Salminen, Vesa</name>
</author>
<author>
<name>Pillai, Balan</name>
</author>
<id>https://hdl.handle.net/1721.1/3817</id>
<updated>2025-05-14T14:52:32Z</updated>
<published>2000-01-01T00:00:00Z</published>
<summary type="text">Modeling Complex Behavior Simply with Embedded System Engineering
Salminen, Vesa; Pillai, Balan
Newton’s second law appears simple as written, but using it correctly is a complex task; in most cases the law as written is good only for conceptual discussion or development. Normally the sign convention is not shown, which creates problems in solving, and as a result many do not fully understand Newton’s second law. Modeling complex behavior is like the Newtonian law: people tend to misuse it. In this article we model complex behavior simply with an embedded system engineering scheme, which is conceptually a new approach. In some cases Newton’s second law is applied at the center of gravity G, but the correct point in the formulation is the center of mass C, which is why the subscript c, not G, is chosen. Nevertheless, in many engineering applications the gravity field may be considered uniform, so the center of mass and the center of gravity coincide, C = G, even though they are different by definition.
</summary>
<dc:date>2000-01-01T00:00:00Z</dc:date>
</entry>
<entry>
<title>Do Venture Capitalists Affect Commercialization Strategies at Start-ups?</title>
<link href="https://hdl.handle.net/1721.1/3816" rel="alternate"/>
<author>
<name>Hsu, David</name>
</author>
<id>https://hdl.handle.net/1721.1/3816</id>
<updated>2025-05-14T14:52:32Z</updated>
<published>2000-06-16T00:00:00Z</published>
<summary type="text">Do Venture Capitalists Affect Commercialization Strategies at Start-ups?
Hsu, David
I empirically study the effect of venture capital (VC) on product development and commercialization strategy of start-up organizations. In doing so, I segment entrant commercialization strategies into two camps according to competitive effect: to “cooperate” is to license-out technology or be acquired, while to “compete” is to develop technology independently. Building on the work of Gans, Hsu, and Stern (2000) on the drivers of entrant commercialization strategy, this paper examines the direct and indirect effects of VC on product development and competition. I start with two important determinants of start-up commercialization strategy: (1) the entrant’s relative investment cost of acquiring and controlling complementary assets needed to successfully commercialize its innovation, and (2) the entrant’s ability to effectively protect its intellectual property. I then test a novel sample of 118 technology-based projects divided almost evenly between two mechanisms of entrepreneurial finance. These two mechanisms differ in institutional detail in ways that allow a quasi-experiment of the effect of VC on start-up commercialization strategy. The U.S. Small Business Innovative Research (SBIR) program provides a grant to R&amp;D without taking equity in a start-up or changing the corporate governance of project development. In contrast, VCs take an equity stake and participate in corporate governance in exchange for capital. Neither of these financing mechanisms, however, alters the underlying complementary asset or intellectual property regime associated with the project. Two main findings about the commercialization strategy and product market effects of venture capital emerge: (1) VC-backing skews commercialization strategies across industries toward cooperating, and (2) VCs make their portfolio firms more sensitive to the business environment.
</summary>
<dc:date>2000-06-16T00:00:00Z</dc:date>
</entry>
<entry>
<title>Competition, Innovation, and Product Exit</title>
<link href="https://hdl.handle.net/1721.1/3815" rel="alternate"/>
<author>
<name>de Figueiredo, John</name>
</author>
<author>
<name>Kyle, Margaret</name>
</author>
<id>https://hdl.handle.net/1721.1/3815</id>
<updated>2025-05-14T14:52:32Z</updated>
<published>2001-03-19T00:00:00Z</published>
<summary type="text">Competition, Innovation, and Product Exit
de Figueiredo, John; Kyle, Margaret
Why do products exit markets? This paper integrates rationales for product exit from a number of different literatures and compares the statistical and substantive effects of these explanations. We use a novel dataset covering every product introduced into the desktop laser printer industry since its inception. Using hedonic models, hazard rate models, and count models, this study generates three main findings. First, innovation does not drive products out of the market per se. Managers do not pull products off the market when they innovate; rather, they seem to keep the incumbent products on the market and add the newer, more innovative products, which have longer expected lives, to the marketplace. Second, competition has a large impact on driving products out of markets. These non-innovative products remain in the product portfolios of companies until competition, not managerial decisions, drives them out of markets. Third, holding other factors constant, scale and learning have a marginal statistical and substantive effect on product exit.
</summary>
<dc:date>2001-03-19T00:00:00Z</dc:date>
</entry>
<entry>
<title>Product Concept Metrics: a Preliminary Study Working Paper</title>
<link href="https://hdl.handle.net/1721.1/3814" rel="alternate"/>
<author>
<name>Takala, Roope</name>
</author>
<author>
<name>Hölttä, Katja</name>
</author>
<id>https://hdl.handle.net/1721.1/3814</id>
<updated>2025-05-14T14:52:32Z</updated>
<published>2001-07-01T00:00:00Z</published>
<summary type="text">Product Concept Metrics: a Preliminary Study Working Paper
Takala, Roope; Hölttä, Katja
Metrics for product concept evaluation and screening are a relatively unstudied topic in product development. Having a clearly documented set of metrics for concept screening decisions is a prerequisite for an educated and traceable decision-making process. Measuring product concepts and comparing the results of previous products and their success rates to the metrics documented for their screening provides a basis on which to improve the efficiency of product development work. In this study a list of product concept screening metrics, or issues if you please, is put forth. This list is prioritized according to the importance of the metrics. The relative importance of the metrics is determined for a number of different groups, including academia and industry, along with geographical samples from Finland and the USA. Furthermore, a brief study on published product concepts is presented to show that some of these metrics are indeed researched by companies and to determine the information that companies seek to obtain by publishing product concepts. Finally, the relation of the results of these studies is discussed in terms of their implications for the management of research and development in corporations.
</summary>
<dc:date>2001-07-01T00:00:00Z</dc:date>
</entry>
<entry>
<title>Transferring, Translating and Transforming: An Integrative Framework</title>
<link href="https://hdl.handle.net/1721.1/3813" rel="alternate"/>
<author>
<name>Carlile, Paul</name>
</author>
<id>https://hdl.handle.net/1721.1/3813</id>
<updated>2025-05-14T14:52:32Z</updated>
<published>2002-03-10T00:00:00Z</published>
<summary type="text">Transferring, Translating and Transforming: An Integrative Framework
Carlile, Paul
Organizations must establish processes for managing knowledge across boundaries because of the specialized and task-dependent forms of knowledge required to deliver products and services. To address this challenge an integrative framework is developed that identifies and integrates the value of different approaches to managing knowledge in organizations that are often presented as incompatible in the literature. The framework describes three progressively complex types of boundaries: syntactic, semantic and pragmatic. Each increasingly complex boundary requires a&#13;
more complex process to facilitate communication and innovation across specialized forms of&#13;
knowledge. The framework categorizes types of boundaries, gauges their complexity, and then&#13;
describes the processes involved in managing knowledge across each of them. The development&#13;
of a new engineering tool in an automotive firm is presented to illustrate the conceptual strength&#13;
of this framework.
</summary>
<dc:date>2002-03-10T00:00:00Z</dc:date>
</entry>
<entry>
<title>Towards a Theory of Complicatedness: Framework for Complex Systems Analysis and Design</title>
<link href="https://hdl.handle.net/1721.1/3812" rel="alternate"/>
<author>
<name>Tang, Victor</name>
</author>
<author>
<name>Salminen, Vesa</name>
</author>
<id>https://hdl.handle.net/1721.1/3812</id>
<updated>2025-05-14T14:52:31Z</updated>
<published>2001-08-01T00:00:00Z</published>
<summary type="text">Towards a Theory of Complicatedness: Framework for Complex Systems Analysis and Design
Tang, Victor; Salminen, Vesa
The global, dynamic, and competitive business environment has increased complexity in products, services, operational processes, and the human side. Much engineering effort goes into
reducing systems complexity. We argue that the real issue is reducing complicatedness. This is&#13;
an important distinction. Complexity can be a desirable property of systems provided it is&#13;
architected complexity that reduces complicatedness. Complexity and complicatedness are not synonyms. Complexity is an inherent property of systems; complicatedness is a derived&#13;
function of complexity. We introduce the notion of complicatedness of complex systems,&#13;
present equations for each and show they are separate and distinct properties. To make these&#13;
ideas actionable, we present a design methodology to address complicatedness. We show&#13;
examples and discuss how our equations reflect the fundamental behavior of complex systems&#13;
and how our equations are consistent with our intuition and system design experience. We&#13;
discuss validation experiments with global firms and address potential areas for further&#13;
research. We close with a discussion of the implications for systems design engineers. As&#13;
engineers, we believe our strongest contributions are to the analysis, design, and managerial&#13;
practice of complex systems analysis and design.&#13;
We illustrate the difference between complexity and complicatedness. Relative to a manual&#13;
transmission, a car’s automatic transmission has more parts and more intricate linkages. It is&#13;
more complex. To drivers, it is unquestionably less complicated, but to mechanics who have to&#13;
fix it, it is more complicated. This illustrates a fundamental fact about systems: decision units act on systems to manage their behavior. Complexity is an inherent property of systems. Complicatedness is a derived property that characterizes an execution unit's ability to manage a complex system. A system of complexity level Ca may present different degrees of complicatedness, K, to distinct execution units E and F; KE = KE(Ca) ≠ KF = KF(Ca).
We summarize relevant literature on systems complexity in Figure 1. Columns [1] to [15] are keyed to the references. Rows identify key areas of research results, e.g., metrics, complicatedness, etc. We make five observations about the locus of results. One, there is a dearth of quantitative frameworks or metrics; there is no research on complicatedness and complexity as distinct properties of systems. Two, research seems to cluster around engineering management and physical products; the focus is on modularization and interactions, with a bias toward linear systems and qualitative metrics. Three, there are efforts on methodologies and tools, but theory, foundations, and software have a demonstrably lesser presence. Ferdinand's work on software systems complexity is a happy exception [1]: it is analytical, rigorous, and elegant. Four, services and enterprise solutions are barely addressed. This is a serious omission given the high proportion of services in industrialized economies. Five, although layering of abstract systems and reintegration have a long history, the literature is skewed toward decomposition rather than integration.
To be presented at the 13th International Conference on Engineering Design, Glasgow, Scotland, August 2001.
</summary>
<dc:date>2001-08-01T00:00:00Z</dc:date>
</entry>
<entry>
<title>Architecting Option Content</title>
<link href="https://hdl.handle.net/1721.1/3811" rel="alternate"/>
<author>
<name>Otto, Kevin</name>
</author>
<id>https://hdl.handle.net/1721.1/3811</id>
<updated>2019-04-09T18:35:28Z</updated>
<published>2000-01-01T00:00:00Z</published>
<summary type="text">Architecting Option Content
Otto, Kevin
This paper presents an approach to determine the proper number of levels required on independent product architectural attributes, given their ability to generate added revenue through more direct targeting to smaller segments, and given the added costs of doing so. This is done in as simple and readily implementable a manner as possible, making use only of conjoint data and cost estimates. From this, the order in which to consider added breakouts across the different attributes is prioritized. Then, for any minimum level of profit worth considering, a set of attribute levels to offer on each architectural attribute can be selected. Finally, for any selected set of attribute levels, the most effective product family using those levels is determined from the permutations.
</summary>
<dc:date>2000-01-01T00:00:00Z</dc:date>
</entry>
<entry>
<title>Modularization to Support Multiple Brand Platforms</title>
<link href="https://hdl.handle.net/1721.1/3810" rel="alternate"/>
<author>
<name>Agus, Sudjianto</name>
</author>
<author>
<name>Otto, Kevin</name>
</author>
<id>https://hdl.handle.net/1721.1/3810</id>
<updated>2019-04-12T08:17:22Z</updated>
<published>2001-09-09T00:00:00Z</published>
<summary type="text">Modularization to Support Multiple Brand Platforms
Agus, Sudjianto; Otto, Kevin
Methods to determine acceptable architectures for multiple platforms supporting multiple brands must represent both platform cost-saving commonization and revenue-enhancing brand distinctions. Functional architecting methods determine modularization based upon functional concerns. Brand identity is additionally determined by sensory aesthetics.
We introduce three architecting rules to maintain brand identity in platforms. A dominant theme must be ensured on each product of a brand, and this must be transferred to each product's specifications and aesthetics. Elements critical to brand identity must be made common across all products in a brand. For any platform, brand-specific elements must be kept unique on each product variant. The set of elements not identified as brand carriers can be made common to a platform. A matrix representation of each platform and its supported brand variants is useful as an architecting tool.
</summary>
<dc:date>2001-09-09T00:00:00Z</dc:date>
</entry>
<entry>
<title>Modularizing Product Architectures Using Dendrograms</title>
<link href="https://hdl.handle.net/1721.1/3809" rel="alternate"/>
<author>
<name>Hölttä, Katja</name>
</author>
<author>
<name>Tang, Victor</name>
</author>
<author>
<name>Seering, Warren</name>
</author>
<id>https://hdl.handle.net/1721.1/3809</id>
<updated>2019-04-10T19:23:00Z</updated>
<published>2003-01-01T00:00:00Z</published>
<summary type="text">Modularizing Product Architectures Using Dendrograms
Hölttä, Katja; Tang, Victor; Seering, Warren
A module is a structurally independent building block of a larger system with well-defined interfaces. A module is fairly loosely connected to the rest of the system, allowing independent development of the module as long as the interconnections at the interfaces are well thought out. [1][2]
The advantages of modularity are possible economies of scale and scope and economies in parts sourcing [1]. Modularity also provides flexibility that enables product variations and technology development without changes to the overall design [2]. The same flexibility also allows for independent development of modules, which is useful in concurrent design or overlapped product development [3], in collaborative projects, or when buying the module from a supplier [4]. Modularity also eases the management of complex product architectures [2] and therefore also their development. Modularity can also be used to create product families [5][6][7]. This saves design and testing costs and can allow for greater variation, but one must be aware of possible excess-functionality costs if a low-cost, low-functionality part is replaced by a higher-cost part in order to use the same part in both products [8][9].
Modularity and product platforms have been shown to be useful [e.g. 6], but there seem to be few methods for choosing the best modules for a product family or joint development platform. Baldwin and Clark [1] discuss how to modularize, but they do not address the problem of what exactly should be included in a module. Ericsson [2] has developed a modularization method called Modular Function Deployment (MFD), but it is intended for single products only, not product families. Design Structure Matrix (DSM) clustering [10][11] is also intended for single products, but it has the advantage of having been reduced to a repeatable algorithm that can be run by a computer, which enables the modularization of even complex systems. Stake [11] introduces a clustering algorithm for MFD to group functions according to modularity driver scores. He and Blackenfelt [12] also show how MFD and DSM can be integrated to combine the benefits of the two methods, but these are still intended for single products only. Kota et al. [13] present a benchmark method to compare one's own platform to a competitor's platform.
The method takes manufacturing, component size, and material into account in addition to functionality, but it is not a platforming tool. Stone et al. [14] discuss heuristics to group functions in a function structure [for more about function structures see 15] into modules within a product, and Zamirowski and Otto [7] add three additional heuristics to apply across products in a product family. Dahmus et al. [5] apply the heuristics and introduce a modularity matrix to help decide which modules should and should not be shared across a product family. The weakness of the heuristics is that they are not repeatable, since the functional decomposition and the use of the heuristics depend on the user's point of view. Our goal is to overcome these weaknesses by introducing a more systematic method for grouping functions into modules.
Another weakness of the existing methods is that they use nominal or ordinal scales instead of more rigorous ratio scales. Sosa et al. [16] use an ordinal scale (-2, -1, 0, 1, 2) in component DSMs, as do Ericsson [2] in MFD and Stake [11] and Blackenfelt [12] in their combined MFD/DSM approach. Dahmus [5] as well as Zamirowski and Otto [7] suggest the use of Pugh's concept selection, which is also based on ordinal scales. Kmenta and Ishii [17] discuss the problem of performing arithmetic operations on ordinal measures: stated simply, it produces inconsistent results. Otto and Wood [18] discuss more broadly the strengths and weaknesses of these different types of measures. Kurshid and Sahai [19] present a rigorous treatment of these measures. Ratio scales are most useful because the point zero has meaning, and mathematical operations such as multiplying and dividing have meaning, e.g., meters/second.
In this paper, we address the weaknesses of all the above. We use a more flexible flow method [20] for identifying possible modules in a function structure, and our algorithm can be implemented on a computer. In addition, we develop a genuine metric space with a distance function based on the flow characteristics, and we use a ratio scale.
This algorithm is designed especially for the flow method [20], but it could possibly be used in conjunction with other modularization methods as well. The flow method is based on the heuristics introduced by Stone et al. [14] and further developed for product families by Zamirowski and Otto [7]. The difference is that in the flow method the focus is on the flows instead of the functions in a function structure. Functions can even be ignored, since often the end result (outputs) and the requirements needed to achieve it (inputs) are all that matter. The flow method was designed to identify commonalities between different products. It is more flexible than the function-focused heuristics and can therefore also be used in the case of joint development of a common module for even very different products. It is also applicable in product family platforming.
The problem we address in this study is how to group functions in a functional decomposition, such as a function structure, to form a module commonality hierarchy that can be used to define common modules across products. The following section introduces the grouping algorithm. We then show an example of this method applied to four products. We end the article with our conclusions and suggestions for further study.
</summary>
<dc:date>2003-01-01T00:00:00Z</dc:date>
</entry>
<entry>
<title>Patterns of Product Development Interactions</title>
<link href="https://hdl.handle.net/1721.1/3808" rel="alternate"/>
<author>
<name>Eppinger, Steven</name>
</author>
<author>
<name>Salminen, Vesa</name>
</author>
<id>https://hdl.handle.net/1721.1/3808</id>
<updated>2019-04-10T12:16:28Z</updated>
<published>2001-08-21T00:00:00Z</published>
<summary type="text">Patterns of Product Development Interactions
Eppinger, Steven; Salminen, Vesa
Development of complex products and large systems is a highly interactive social process&#13;
involving hundreds of people designing thousands of interrelated components and making&#13;
millions of coupled decisions. Nevertheless, we have created methods to study the&#13;
development process, identify its underlying structures, and critique its operation.&#13;
In this article, we introduce three views of product development complexity: a process&#13;
view, a product view, and an organization view. We are able to learn about the complex&#13;
social phenomenon of product development by studying the patterns of interaction across&#13;
the decomposed elements within each view. We also compare the alignment of the&#13;
interaction patterns, between the product, process, and organization domains. We then&#13;
propose metrics of product development complexity by studying and comparing these&#13;
interaction patterns. Finally, we develop hypotheses regarding the patterns of product&#13;
development interactions, which will be helpful to guide future research.
INTERNATIONAL CONFERENCE ON ENGINEERING DESIGN&#13;
ICED 01 GLASGOW, AUGUST 21-23, 2001
</summary>
<dc:date>2001-08-21T00:00:00Z</dc:date>
</entry>
<entry>
<title>Product Development Process Modeling Using Advanced Simulation</title>
<link href="https://hdl.handle.net/1721.1/3807" rel="alternate"/>
<author>
<name>Cho, Soo-Haeng</name>
</author>
<author>
<name>Eppinger, Steven</name>
</author>
<id>https://hdl.handle.net/1721.1/3807</id>
<updated>2019-04-12T08:09:14Z</updated>
<published>2001-09-01T00:00:00Z</published>
<summary type="text">Product Development Process Modeling Using Advanced Simulation
Cho, Soo-Haeng; Eppinger, Steven
This paper presents a product development process modeling and analysis technique using advanced simulation. The model computes the probability distribution of lead time in a resource-constrained project network where iterations take place among sequential, parallel, and overlapped tasks. The model uses the design structure matrix representation to capture the information flows between tasks. In each simulation run,
the expected durations of tasks are initially sampled using the Latin Hypercube Sampling method and decrease over time as the model simulates the progress of dynamic stochastic processes. It is assumed that the rework of a task occurs for the&#13;
following reasons: (1) new information is obtained from overlapped tasks after starting to work with preliminary inputs,&#13;
(2) inputs change when other tasks are reworked, and (3) outputs fail to meet established criteria. The model can be used for better project planning and control by identifying leverage points for process improvements and evaluating alternative planning and execution strategies. An industrial example is used to illustrate the utility of the model.
</summary>
<dc:date>2001-09-01T00:00:00Z</dc:date>
</entry>
<entry>
<title>Patent Litigation As a Leading Market Indicator</title>
<link href="https://hdl.handle.net/1721.1/3806" rel="alternate"/>
<author>
<name>Tang, Victor</name>
</author>
<author>
<name>Huang, Biao</name>
</author>
<id>https://hdl.handle.net/1721.1/3806</id>
<updated>2019-04-10T12:16:28Z</updated>
<published>2001-01-01T00:00:00Z</published>
<summary type="text">Patent Litigation As a Leading Market Indicator
Tang, Victor; Huang, Biao
The purpose of this paper is to introduce patent litigation as a leading indicator of market&#13;
growth. We model the intensity of patent litigation and the market growth for the&#13;
personal computer and cellular phone market in the US. By means of these analytic&#13;
models, we show that patent litigation is a leading indicator of market growth. We are
also able to very precisely delineate discrete stages of the product’s market life cycle and&#13;
demarcate the time when life-cycle transitions are about to take place. We close this&#13;
paper with a discussion on new lines of patent research that are potentially useful for&#13;
managerial practice and for investment decisions.
</summary>
<dc:date>2001-01-01T00:00:00Z</dc:date>
</entry>
<entry>
<title>Meanings, Measures, Maps, and Models: Understanding the Mechanisms of Continuous Change</title>
<link href="https://hdl.handle.net/1721.1/3805" rel="alternate"/>
<author>
<name>Repenning, Nelson</name>
</author>
<id>https://hdl.handle.net/1721.1/3805</id>
<updated>2019-04-10T12:16:28Z</updated>
<published>2000-11-01T00:00:00Z</published>
<summary type="text">Meanings, Measures, Maps, and Models: Understanding the Mechanisms of Continuous Change
Repenning, Nelson
There is now considerable controversy concerning the role that incremental change plays in the process of organizational transformation. Some scholars assert that incremental change is the primary source of resistance to more radical re-orientations, while others argue that on occasion, ongoing incremental change can produce dramatic transformation. To help reconcile these competing perspectives, in this paper I report the results of an inductive study of one firm's successful attempt to improve continuously and incrementally its core manufacturing process. The principal results of this effort are: (1) to challenge the current view of the source of change in process-oriented improvement initiatives; and (2) to offer an alternative characterization of the mechanisms through which competence-enhancing, incremental change actually occurs. The theory emerging from this analysis provides one path to resolving the dilemma posed by incremental change processes that can, on occasion, produce organizational transformation, but more often limit the organization's ability to adapt to its environment.
</summary>
<dc:date>2000-11-01T00:00:00Z</dc:date>
</entry>
<entry>
<title>Why Firefighting Is Never Enough: Preserving High-Quality Product</title>
<link href="https://hdl.handle.net/1721.1/3804" rel="alternate"/>
<author>
<name>Black, Laura</name>
</author>
<author>
<name>Repenning, Nelson</name>
</author>
<id>https://hdl.handle.net/1721.1/3804</id>
<updated>2019-04-09T16:25:04Z</updated>
<published>2001-01-01T00:00:00Z</published>
<summary type="text">Why Firefighting Is Never Enough: Preserving High-Quality Product
Black, Laura; Repenning, Nelson
In this paper, we add to insights already developed in single-project models about insufficient resource allocation and the "firefighting" and last-minute rework that often result by asking why dysfunctional resource allocation persists from project to project. We draw on data collected from a field site concerned about its new product development process and its quality of output to construct a simple model that portrays resource allocation in a multi-project development environment. The main insight of the analysis is that under-allocating resources to the early phases of a given project in a multi-project environment can create a vicious cycle of increasing error rates, overworked engineers, and declining performance in all future projects. Policy analysis begins with those that were under consideration by the organization described in our data set. Those policies turn out to offer relatively low leverage in offsetting the problem. We then test a sequence of new policies, each designed to reveal a different feature of the system's structure and conclude with a strategy that we believe can significantly offset the dysfunctional dynamics we discuss. The paper concludes with a discussion of the challenges managers may face in implementing the strategy that can prevent persistent under-allocation of resources to projects.
</summary>
<dc:date>2001-01-01T00:00:00Z</dc:date>
</entry>
<entry>
<title>A Simulation-Based Approach to Understanding the Dynamics of Innovation Implementation</title>
<link href="https://hdl.handle.net/1721.1/3803" rel="alternate"/>
<author>
<name>Repenning, Nelson</name>
</author>
<id>https://hdl.handle.net/1721.1/3803</id>
<updated>2019-04-12T08:27:55Z</updated>
<published>1999-10-01T00:00:00Z</published>
<summary type="text">A Simulation-Based Approach to Understanding the Dynamics of Innovation Implementation
Repenning, Nelson
The history of management practice is filled with innovations that failed to live up to the promise suggested by their early success. A paradox facing organization theory is that the failure of these innovations often cannot be attributed to an intrinsic lack of efficacy. To resolve this paradox, in this paper I study the process of innovation implementation. Working from existing theoretical frameworks, I synthesize a model that describes the process through which participants in an organization develop commitment to using a newly adopted innovation. I then translate that framework into a formal model and analyze it using computer simulation. The analysis suggests three new constructs—reversion, regeneration and the motivation threshold—characterizing the dynamics of implementation. Taken together, these constructs offer an alternative explanation for the paradox of innovations that produce early results but fail to find a permanent home in the organizations that adopt them.
</summary>
<dc:date>1999-10-01T00:00:00Z</dc:date>
</entry>
<entry>
<title>Integrated Simulation and Design Synthesis</title>
<link href="https://hdl.handle.net/1721.1/3802" rel="alternate"/>
<author>
<name>Wallace, David</name>
</author>
<author>
<name>Yang, Elaine</name>
</author>
<author>
<name>Senin, Nicola</name>
</author>
<id>https://hdl.handle.net/1721.1/3802</id>
<updated>2019-04-10T12:16:24Z</updated>
<published>2001-01-01T00:00:00Z</published>
<summary type="text">Integrated Simulation and Design Synthesis
Wallace, David; Yang, Elaine; Senin, Nicola
The potential benefits of mathematically predicting and analyzing the integrated behavior&#13;
of product concepts throughout the design synthesis cycle are widely recognized. Better&#13;
up-front integrated design will not only reduce development time and cost, but also will&#13;
yield higher quality products with improved performance. Many academic researchers&#13;
and companies have attempted to develop integrated simulation environments, and it has&#13;
been observed consistently that significant difficulties arise because of the large scale,&#13;
complexity, rate-of-change, heterogeneity, and proprietary barriers associated with&#13;
product design synthesis. However, the focus of most integration efforts has been on&#13;
enabling technology, while the process of how integrated systems are constructed has not&#13;
been questioned.&#13;
The literature acknowledges that it is very difficult to represent and structure emergent&#13;
processes using explicit system definition techniques like those that have been almost&#13;
universally adopted. The belief that design synthesis is an emergent system definition&#13;
process drives the search for a different approach to building integrated design&#13;
simulations. Inspired by a vision of the World-Wide Web as an emergent information-network-building environment, a World-Wide Simulation Web concept is proposed for
defining an emergent, integrated, simulation-building environment. Participants should&#13;
be able to make interfaces to local sub-system simulations parametrically operable and&#13;
accessible over the Internet. Furthermore, any participant should be able to make&#13;
relationships between parameters in different simulation interfaces or to create additional&#13;
models that bridge interfaces to different simulations distributed over the Internet.&#13;
The DOME (Distributed Object-based Modeling Environment) project has developed a&#13;
software infrastructure for the purpose of refining and testing emergent simulation&#13;
definition concepts. A federating solving mechanism has been developed that allows&#13;
local solvers to respond in a manner that is consistent with the overall system structure&#13;
even though there is no centralized coordination of the simulation. Results from several&#13;
pilot studies support the belief that an emergent, decentralized approach to building&#13;
integrated simulations can resolve many of the difficulties associated with integrated&#13;
system simulation.
</summary>
<dc:date>2001-01-01T00:00:00Z</dc:date>
</entry>
<entry>
<title>Integrated Design in a Service Marketplace</title>
<link href="https://hdl.handle.net/1721.1/3801" rel="alternate"/>
<author>
<name>Abrahamson, Shaun</name>
</author>
<author>
<name>Wallace, David</name>
</author>
<author>
<name>Senin, Nicola</name>
</author>
<author>
<name>Sferro, Peter</name>
</author>
<id>https://hdl.handle.net/1721.1/3801</id>
<updated>2019-04-12T08:09:07Z</updated>
<published>2000-01-01T00:00:00Z</published>
<summary type="text">Integrated Design in a Service Marketplace
Abrahamson, Shaun; Wallace, David; Senin, Nicola; Sferro, Peter
This paper presents a service marketplace vision for enterprise-wide integrated design&#13;
modeling. In this environment, expert participants and product development&#13;
organizations are empowered to publish their geometric design, CAE, manufacturing, or&#13;
marketing capabilities as live services that are operable over the Internet. These services&#13;
are made available through a service marketplace. Product developers, small or large, can&#13;
subscribe to and flexibly inter-relate these services to embody a distributed product&#13;
development organization, while simultaneously creating system models that allow the&#13;
prediction and analysis of integrated product performance. It is hypothesized that product&#13;
development services will become commodities, much like many component-level&#13;
products are today. It will be possible to rapidly interchange equivalent design service&#13;
providers so that the development of the product and the definition of the product&#13;
development organization become part of the same process. Computer-aided design tools&#13;
will evolve to facilitate the publishing of live design services. A research prototype&#13;
system called DOME is used to illustrate the concept and a pilot study with Ford Motor&#13;
Company is used in a preliminary assessment of the vision.
</summary>
<dc:date>2000-01-01T00:00:00Z</dc:date>
</entry>
<entry>
<title>Fast Polyhedral Adaptive Conjoint Estimation</title>
<link href="https://hdl.handle.net/1721.1/3800" rel="alternate"/>
<author>
<name>Toubia, Olivier</name>
</author>
<author>
<name>Simester, Duncan</name>
</author>
<author>
<name>Hauser, John</name>
</author>
<id>https://hdl.handle.net/1721.1/3800</id>
<updated>2019-04-09T19:03:39Z</updated>
<published>2002-02-01T00:00:00Z</published>
<summary type="text">Fast Polyhedral Adaptive Conjoint Estimation
Toubia, Olivier; Simester, Duncan; Hauser, John
We propose and test a new adaptive conjoint analysis method that draws on recent polyhedral “interior-point” developments in mathematical programming. The method is designed to offer accurate estimates after relatively few questions in problems involving many parameters. Each respondent’s questions are adapted based upon prior answers by that respondent. The method requires computer support but can operate in both Internet and off-line environments with no noticeable delay between questions. We use Monte Carlo simulations to compare the performance of the method against a broad array of relevant benchmarks. While no method dominates in all situations, polyhedral algorithms appear to hold significant potential when (a) metric profile comparisons are more accurate than the self-explicated importance measures used in benchmark methods, (b) respondent wear out is a concern, and (c) product development and/or marketing teams wish to screen many features quickly. We also test hybrid methods that combine polyhedral algorithms with existing conjoint analysis methods. We close with suggestions on how polyhedral methods can be used to address other marketing problems.
</summary>
<dc:date>2002-02-01T00:00:00Z</dc:date>
</entry>
<entry>
<title>Application and Test of Web-based Adaptive Polyhedral Conjoint Analysis</title>
<link href="https://hdl.handle.net/1721.1/3773" rel="alternate"/>
<author>
<name>Dahan, Ely</name>
</author>
<author>
<name>Hauser, John</name>
</author>
<author>
<name>Simester, Duncan</name>
</author>
<author>
<name>Toubia, Olivier</name>
</author>
<id>https://hdl.handle.net/1721.1/3773</id>
<updated>2019-04-11T07:03:42Z</updated>
<published>2002-01-01T00:00:00Z</published>
<summary type="text">Application and Test of Web-based Adaptive Polyhedral Conjoint Analysis
Dahan, Ely; Hauser, John; Simester, Duncan; Toubia, Olivier
In response to the need for more rapid and iterative feedback on customer preferences, researchers are developing new web-based conjoint analysis methods that adapt the design of conjoint questions based on a respondent’s answers to previous questions. Adapting within a respondent is a difficult dynamic optimization problem and until recently adaptive conjoint analysis (ACA) was the dominant method available for addressing this adaptation. In this paper we apply and test a new polyhedral method that uses “interior-point” math programming techniques. This method is benchmarked against both ACA and an efficient non-adaptive design (Fixed). &#13;
&#13;
Over 300 respondents were randomly assigned to different experimental conditions and were asked to complete a web-based conjoint exercise. The conditions varied based on the design of the conjoint exercise. Respondents in one group completed a conjoint exercise designed using the ACA method, respondents in another group completed an exercise designed using the Fixed method, and the remaining respondents completed an exercise designed using the polyhedral method. Following the conjoint exercise respondents were given $100 and allowed to make a purchase from a Pareto choice set of five new-to-the-market laptop computer bags. The respondents received their chosen bag together with the difference in cash between the price of their chosen bag and the $100. &#13;
&#13;
We compare the methods on both internal and external validity. Internal validity is evaluated by comparing how well the different conjoint methods predict several holdout conjoint questions. External validity is evaluated by comparing how well the conjoint methods predict the respondents’ selections from the choice sets of five bags. &#13;
&#13;
The results reveal a remarkable level of consistency across the two validation tasks. The polyhedral method was consistently more accurate than both the ACA and Fixed methods. However, even better performance was achieved by combining (post hoc) different components of each method to create a range of hybrid methods. Additional analyses evaluate the robustness of the predictions and explore alternative estimation methods such as Hierarchical Bayes. At the time of the test, the bags were prototypes. Based, in part, on the results of this study these bags are now commercially available.
</summary>
<dc:date>2002-01-01T00:00:00Z</dc:date>
</entry>
<entry>
<title>The Virtual Customer</title>
<link href="https://hdl.handle.net/1721.1/3772" rel="alternate"/>
<author>
<name>Hauser, John</name>
</author>
<author>
<name>Dahan, Ely</name>
</author>
<id>https://hdl.handle.net/1721.1/3772</id>
<updated>2019-04-12T08:25:03Z</updated>
<published>2001-12-01T00:00:00Z</published>
<summary type="text">The Virtual Customer
Hauser, John; Dahan, Ely
Communication and information technologies are adding new capabilities for rapid and&#13;
inexpensive customer input to all stages of the product development (PD) process. In this article&#13;
we review six web-based methods of customer input as examples of the improved Internet capabilities&#13;
of communication, conceptualization, and computation. For each method we give examples&#13;
of user-interfaces, initial applications, and validity tests. We critique the applicability of the&#13;
methods for use in the various stages of PD and discuss how they complement existing methods.&#13;
For example, during the fuzzy front end of PD the information pump enables customers&#13;
to interact with each other in a web-based game that provides incentives for truth-telling and&#13;
thinking hard, thus providing new ways for customers to verbalize the product features that are&#13;
important to them. Fast polyhedral adaptive conjoint estimation enables PD teams to screen larger&#13;
numbers of product features inexpensively to identify and measure the importance of the&#13;
most promising features for further development. Meanwhile, interactive web-based conjoint&#13;
analysis interfaces are moving this proven set of methods to the web while exploiting new capabilities&#13;
to present products, features, product use, and marketing elements in streaming multimedia&#13;
representations. User design exploits the interactivity of the web to enable users to design&#13;
their own virtual products thus enabling the PD team to understand complex feature interactions&#13;
and enabling customers to learn their own preferences for new products. These methods can be&#13;
valuable for identifying opportunities, improving the design and engineering of products, and&#13;
testing ideas and concepts much earlier in the process when less time and money is at risk. As&#13;
products move toward pretesting and testing, virtual concept testing on the web enables PD&#13;
teams to test concepts without actually building the product. Further, by combining virtual concepts&#13;
and the ability of customers to interact with one another in a stock-market-like game, securities&#13;
trading of concepts provides a novel way to identify winning concepts.&#13;
Prototypes of all six methods are available and have been tested with real products and&#13;
real customers. These tests demonstrate reliability for web-based conjoint analysis, polyhedral&#13;
methods, virtual concept testing, and stock-market-like trading; external validity for web-based&#13;
conjoint analysis and polyhedral methods; and consistency for web-based conjoint analysis vs.&#13;
user design. We report on these tests, commercial applications, and other evaluations.
</summary>
<dc:date>2001-12-01T00:00:00Z</dc:date>
</entry>
</feed>
