  • Publication
    Toward a functional reference model for master data quality management
    The quality of master data has become an issue of increasing prominence in companies. One reason for this is the growing number of regulatory and legal provisions companies need to comply with; another is the growing importance of information systems in supporting decision-making, which requires master data that is up-to-date, accurate, and complete. While improving and maintaining master data quality is an organizational task that cannot be accomplished simply by implementing a suitable software system, system support is mandatory in order to meet these challenges efficiently and achieve good results. This paper describes the design process toward a functional reference model for master data quality management (MDQM). The model design process spanned several iterations comprising multiple design and evaluation cycles, including the model's application in a participative case study at consumer goods manufacturer Beiersdorf. Practitioners may use the reference model as an instrument for the analysis, design, and implementation of a company's MDQM system landscape. Moreover, the reference model facilitates the evaluation of software systems and supports company-internal and external communication. From a scientific perspective, the reference model is a design artifact; hence, it represents a theory for designing information systems in the area of MDQM. A sketch of one possible representation of such a model follows this entry.
    Scopus© Citations 38
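To make the idea of a functional reference model more concrete, the following is a minimal sketch, not taken from the paper: it represents hypothetical MDQM function areas as a small Python data structure that could support a gap analysis of a company's system landscape. All area and function names are illustrative assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class FunctionArea:
    """One functional area of a (hypothetical) MDQM reference model."""
    name: str
    functions: list[str] = field(default_factory=list)

# Illustrative areas only -- not the model actually proposed in the paper.
REFERENCE_MODEL = [
    FunctionArea("Data lifecycle management", ["create", "update", "deactivate"]),
    FunctionArea("Data quality assurance", ["profiling", "cleansing", "monitoring"]),
    FunctionArea("Metadata and standards", ["data model maintenance", "business rules"]),
]

def gap_analysis(implemented: set[str]) -> dict[str, list[str]]:
    """Return, per area, the reference functions a system landscape lacks."""
    return {
        area.name: [f for f in area.functions if f not in implemented]
        for area in REFERENCE_MODEL
    }

if __name__ == "__main__":
    print(gap_analysis({"create", "update", "profiling"}))
```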
  • Publication
    Collaborative management of business metadata
    Legal provisions, cross-company data exchange, and intra-company reporting or planning procedures require business objects (e.g. materials, customers, and suppliers) to be specified comprehensively, unambiguously, understandably, and in a timely manner. On the one hand, this business metadata has to cover miscellaneous regional peculiarities in order to enable business activities anywhere in the world. On the other hand, data structures need to be standardized throughout the entire company in order to be able to perform, for example, global spend analysis. In addition, business objects should adapt to new market conditions or regulatory requirements as quickly and consistently as possible. Centrally organized corporate metadata managers (e.g. within a central IT department) are hardly able to meet all these demands on their own. They should be supported by key users from several business divisions and regions who contribute expert knowledge. However, despite the advantages regarding high metadata quality on a corporate level, a collaborative metadata management approach of this kind has to ensure low effort for knowledge contributors, as in most cases these regional or divisional experts do not benefit from metadata quality themselves. Therefore, the paper at hand identifies requirements to be met by a business metadata repository, i.e. a tool that can effectively support collaborative management of business metadata. In addition, the paper presents the results of an evaluation of these requirements with business experts from various companies and of scenario tests with a wiki-based prototype at Bayer CropScience AG. The evaluation shows two things: first, collaboration is a success factor when it comes to establishing effective business metadata management and integrating metadata with enterprise systems; and second, semantic wikis are well suited to realizing business metadata repositories. A sketch of what a single repository entry might look like follows this entry.
    Scopus© Citations 30
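As an illustration of what one entry of such a business metadata repository might hold, here is a minimal sketch under stated assumptions: the object, attribute, and region names are invented for illustration and are not taken from the Bayer CropScience prototype.

```python
from dataclasses import dataclass, field

@dataclass
class AttributeSpec:
    """Specification of one attribute of a business object."""
    name: str
    definition: str
    data_type: str

@dataclass
class BusinessObjectMetadata:
    """One entry of a (hypothetical) business metadata repository."""
    object_name: str                       # e.g. "material", "supplier"
    global_definition: str                 # corporate-wide standard wording
    attributes: list[AttributeSpec] = field(default_factory=list)
    regional_notes: dict[str, str] = field(default_factory=dict)  # region -> peculiarity
    last_reviewed_by: str = ""             # key user who last confirmed the entry

material = BusinessObjectMetadata(
    object_name="material",
    global_definition="Any good the company purchases, produces, or sells.",
    attributes=[AttributeSpec("gross_weight", "Weight incl. packaging", "decimal")],
    regional_notes={"BR": "Requires NCM customs classification code."},
    last_reviewed_by="key.user@example.com",
)
```

Keeping regional peculiarities as annotations on a single global entry, rather than as separate regional entries, is one way to reconcile the paper's two demands of regional coverage and company-wide standardization.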
  • Publication
    Product data quality in supply chains: the case of Beiersdorf
    (Springer, 2011-06-01) Schierning, Andreas, et al.
    A number of business requirements (e.g. compliance with regulatory and legal provisions, diffusion of global standards, supply chain integration) are forcing consumer goods manufacturers to increase their efforts to provide product data (e.g. product identifiers, dimensions) at business-to-business interfaces in a timely and accurate manner. The quality of such data is a critical success factor for efficient and effective cross-company collaboration. If compliance-relevant data (e.g. dangerous goods indicators) is missing or false, consumer goods manufacturers risk being fined and seeing their company's image damaged. And if logistics data (e.g. product dimensions, gross weight) is inaccurate or not provided in time, business with key account trading partners is endangered. To be able to manage the risk of business-critical data defects, companies must be able to a) identify such data defects, and b) specify and use metrics that allow the data's quality to be monitored. As scientific research on both these issues has produced only few results so far, this case study explores the process of identifying business-critical product data defects at German consumer goods manufacturer Beiersdorf AG. Despite advanced data quality management structures, such defects still occur and can result in complaints, service level impairment, and avoidable costs. The case study analyzes product data use and maintenance in Beiersdorf's ecosystem, identifies typical product data defects, and proposes a set of data quality metrics for monitoring those defects. A sketch of one such metric follows this entry.
    Scopus© Citations 33
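As an illustration of what a defect-monitoring metric could look like, here is a minimal sketch assuming simple product records; the field names and the completeness/plausibility rule are illustrative assumptions, not the metrics proposed in the case study.

```python
# Minimal sketch of a product data quality metric: the share of products
# whose logistics attributes are complete and plausible.
products = [
    {"id": "4005800001234", "gross_weight_kg": 0.25, "height_cm": 5.0},
    {"id": "4005800005678", "gross_weight_kg": None, "height_cm": 7.5},
    {"id": "4005800009012", "gross_weight_kg": -1.0, "height_cm": 4.0},
]

def is_defect_free(product: dict) -> bool:
    """A record counts as defect-free if its weight is present and positive."""
    weight = product.get("gross_weight_kg")
    return weight is not None and weight > 0

def logistics_data_quality(records: list[dict]) -> float:
    """Fraction of defect-free records, in [0, 1]."""
    return sum(is_defect_free(p) for p in records) / len(records)

print(f"logistics DQ = {logistics_data_quality(products):.2f}")  # 0.33
```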
  • Publication
    Dealing with Complexity: A Method to adapt and implement a Maturity Model for Corporate Data Quality Management
    Reference models usually serve as starting points for developing company-specific models. Unfortunately, successful usage of reference models is often impeded by various factors, such as a lack of acceptance among employees, incorrect model implementation, or high project costs, all of which more often than not result from an imbalance between the model's complexity and the complexity of a company's specific structures. The paper at hand develops a methodical approach for taking a given reference model (the Maturity Model for Corporate Data Quality Management) and transforming it into a company-specific model, with a particular focus on the specific complexity of a company's structures. Corporate Data Quality Management describes the quality-oriented organization and control of a company's key data assets such as material, customer, and vendor data. Two case studies show how the method has been successfully applied in real-world scenarios. A sketch of how such a maturity model might be represented and tailored follows this entry.
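To illustrate the general idea of adapting a reference maturity model to company-specific structures, here is a minimal sketch; the dimensions and criteria below are invented placeholders, not the actual Maturity Model for Corporate Data Quality Management.

```python
# Minimal sketch of tailoring a maturity model to a company: the criteria
# below are illustrative placeholders, not the actual CDQ maturity model.
REFERENCE_CRITERIA = {
    "strategy":     ["DQ strategy documented", "executive sponsorship"],
    "organization": ["data steward roles defined", "escalation process"],
    "systems":      ["central master data system", "automated validation rules"],
}

def tailor(relevant_dimensions: set[str]) -> dict[str, list[str]]:
    """Keep only the dimensions that match the company's structures."""
    return {d: c for d, c in REFERENCE_CRITERIA.items() if d in relevant_dimensions}

def maturity(model: dict[str, list[str]], fulfilled: set[str]) -> float:
    """Share of fulfilled criteria across the tailored model, in [0, 1]."""
    criteria = [c for cs in model.values() for c in cs]
    return sum(c in fulfilled for c in criteria) / len(criteria)

company_model = tailor({"strategy", "systems"})
print(maturity(company_model, {"DQ strategy documented"}))  # 0.25
```

Tailoring before assessment, as sketched here, reflects the paper's point that an unadapted reference model can be more complex than the company's structures warrant.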
  • Publication
    Identification of Business Oriented Data Quality Metrics
    (Curran, 2011-11-07) Bowen, Paul; Elmagarmid, Ahmed K.; Sattler, Kai-Uwe, et al.
    Corporate data of poor quality can have a negative impact on the performance of business processes and thereby on the success of companies. Similar to machine tools, corporate data show signs of wear (imagine a customer moving to a new address, for example) and have to be monitored continuously for quality defects. Effective quality control of corporate data requires metrics that monitor the potential data defects with the most significant impact on the performance of a company's processes. However, due to company-specific success factors and IT landscapes, it is hardly possible to provide generic metrics that can be implemented without any adjustment. This paper presents a method for the identification of business-oriented data quality metrics. The presented approach takes into account company-specific requirements from both a business and an IT perspective. The method's design and evaluation process is discussed in the context of three real-world cases. A sketch of how a business requirement might be traced to a metric follows this entry.
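The following is a minimal sketch of what a business-oriented metric definition could look like, tying a metric to the business requirement it serves and the system where it is measured; the metric, system, and threshold are illustrative assumptions, not the method's output.

```python
from dataclasses import dataclass

@dataclass
class DQMetric:
    """A (hypothetical) business-oriented data quality metric."""
    name: str
    business_requirement: str   # business perspective: why the metric matters
    measured_system: str        # IT perspective: where it is computed
    threshold: float            # alert when the measured value drops below this

# Illustrative traceability from requirement to metric -- assumptions only.
metrics = [
    DQMetric(
        name="customer_address_currency",
        business_requirement="Invoices must reach customers at their current address.",
        measured_system="CRM",
        threshold=0.98,
    ),
]

def alerts(measured: dict[str, float]) -> list[str]:
    """Return the metrics whose current value violates their threshold."""
    return [m.name for m in metrics if measured.get(m.name, 1.0) < m.threshold]

print(alerts({"customer_address_currency": 0.95}))  # ['customer_address_currency']
```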
  • Publication
    A Cybernetic View on Data Quality Management
    (Association for Information Systems, 2010-08-14)
    Corporate data of poor quality can have a negative impact on the performance of business processes and thereby on the success of companies. In order to be able to work with data of good quality, data quality requirements must be clearly defined. In doing so, one has to take into account that both the provision of high-quality data and the damage caused by low-quality data bring about considerable costs. As each company's database is a dynamic system, the paper proposes a cybernetic view on data quality management (DQM). First, the principles of a closed-loop control system are transferred to the field of DQM. After that, a meta-model is developed that accounts for the central relations between data quality, business process performance, and related costs. The meta-model then constitutes the basis of a simulation technique which aims at the explication of assumptions (e.g. on the effect of improving a data architecture) and the support of DQM decision processes. A sketch of the closed-loop idea follows this entry.
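To illustrate the closed-loop idea in code, here is a sketch of the general control principle only, not the paper's meta-model or simulation technique: quality is measured each period, compared against a setpoint, and a proportional cleansing effort counteracts the natural "wear" of the data. All constants are made up.

```python
# Sketch of a closed-loop control view on DQM: measure quality, compare to
# a target, act, and let the "plant" (the database) drift and be corrected.
TARGET_DQ = 0.95      # setpoint: required data quality level
GAIN = 0.5            # controller gain: how strongly to react to deviation
DECAY = 0.02          # data "wears out" each period (customers move, etc.)

dq = 0.90             # current data quality level in [0, 1]
for period in range(1, 6):
    error = TARGET_DQ - dq                        # deviation from the setpoint
    cleansing_effort = max(0.0, GAIN * error)     # corrective action
    dq = min(1.0, dq + cleansing_effort) - DECAY  # plant: correction, then wear
    print(f"period {period}: dq={dq:.3f}, effort={cleansing_effort:.3f}")
```

Run as-is, the loop settles near the point where the corrective effort exactly offsets the decay, which is the kind of equilibrium reasoning a cybernetic view makes explicit.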
  • Publication
    Design Alternatives for the Evaluation of Design Science Research Artifacts
    (Association for Computing Machinery, 2009-05-07) Gubler, Philipp; Vaishnavi, Vijay, et al.
    When cost effectiveness is taken into consideration, the evaluation of design science research artifacts is of major importance. In the past, a plenitude of approaches has been developed for this purpose, partly artifact-specific, partly artifact-neutral. Nonetheless, there is a lack of a comprehensive overview of existing methods as well as of a systematization of those methods with regard to fundamental structuring criteria. The paper at hand surveys existing methods and introduces a framework that equally supports the designer and the user of artifact evaluation approaches. Subsequent to the embedding of the framework into the design science research process, two exemplary application scenarios are described.
    Scopus© Citations 77