  • Publication
    Management of the Master Data Lifecycle: A Framework for Analysis
    (Emerald Group Publ., 2013-05-01)
    Straub, Kevin
    Purpose - The purpose of the paper is to propose a reference model describing a holistic view of the master data lifecycle, including strategic, tactical and operational aspects. The Master Data Lifecycle Management (MDLM) map provides a structured approach for analyzing the master data lifecycle.
    Design/methodology/approach - Embedded in a design-oriented research process, the paper applies the Component Business Model (CBM) method and suggests a reference model that identifies the business components required to manage the master data lifecycle. CBM is a patented IBM method for analyzing the key components of a business domain. The paper uses a participative case study to evaluate the suggested model.
    Findings - Based on a participative case study, the paper shows how the reference model supports analysis of the master data lifecycle on a strategic, a tactical and an operational level, and how it helps identify areas of improvement.
    Research limitations/implications - The paper presents design work and a participative case study. The reference model is grounded in existing literature and represents a comprehensive framework forming the foundation for future analysis of the master data lifecycle. Furthermore, the model represents an abstraction of an organization's master data lifecycle; hence, it forms a theory for designing. More research is needed in order to evaluate the presented model more thoroughly in a variety of real-life settings.
    Practical implications - The paper shows how the reference model enables practitioners to analyze the master data lifecycle and how it helps identify areas of improvement.
    Originality/value - The paper reports on an attempt to establish a holistic view of the master data lifecycle, including strategic, tactical and operational aspects, in order to provide more comprehensive support for its analysis and improvement.
    Scopus© Citations 34
  • Publication
    Integrating a data quality perspective into business process management
    Purpose: The purpose of this paper is to conceptualize data quality (DQ) in the context of business process management and to propose a DQ-oriented approach for business process modeling. The approach is based on key concepts and metrics from the data quality management domain and supports decision-making in process re-design projects on the basis of process models.
    Design/methodology/approach: The paper applies a design-oriented research approach, in the course of which a modeling method is developed as a design artifact. To do so, method engineering is used as a design technique. The artifact is theoretically founded and incorporates DQ considerations into process re-design. Furthermore, the paper uses a case study to evaluate the suggested approach.
    Findings: The paper shows that the DQ-oriented process modeling approach facilitates and improves managerial decision-making in the context of process re-design. Data quality is considered a success factor for business processes and is conceptualized using a rule-based approach.
    Research limitations/implications: The paper presents design research and a case study. More research is needed to triangulate the findings and to allow generalizability of the results.
    Practical implications: The paper supports decision-makers in enterprises in taking a DQ perspective in business process re-design initiatives.
    Originality/value: The paper reports on integrating DQ considerations into business process management in general and into process modeling in particular, in order to provide more comprehensive decision-making support in process re-design projects. The paper represents one of the first contributions to the literature regarding a contemporary phenomenon of high practical and scientific relevance.
    Scopus© Citations 33
  • Publication
    Dealing with Complexity: A Method to Adapt and Implement a Maturity Model for Corporate Data Quality Management
    Reference models usually serve as starting points for developing company-specific models. Unfortunately, successful usage of reference models is often impeded by various aspects, such as a lack of acceptance among employees, incorrect model implementation, or high project costs - all of which more often than not result from an imbalance between the model's complexity and the complexity of a company's specific structures. The paper at hand develops a methodical approach for taking a given reference model (the Maturity Model for Corporate Data Quality Management) and transforming it into a company-specific model, with a particular focus on the specific complexity of a company's structures. Corporate Data Quality Management describes the quality-oriented organization and control of a company's key data assets such as material, customer, and vendor data. Two case studies show how the method has been successfully implemented in real-world scenarios.
  • Publication
    Towards a Process Reference Model for Information Supply Chain Management
    High-quality information is a prerequisite for companies to accomplish business and strategic goals, such as global reporting, customer relationship management or compliance with legal provisions. In recent years, experts in the field of information quality have begun to realize that a paradigm shift is needed to solve information quality issues in organizations. Information should be treated as a product, and information quality is possible only through the quality management of information supply chains. The paper at hand contributes to this new direction by proposing a process reference model for quality management of information supply chains (Information Product Supply Chain Management, IPSCM) by leveraging the SCOR-Model, a widely accepted standard for supply chain management. The IPSCM-Model enables users to address, improve, and communicate information creation practices within and between all interested parties.
  • Publication
    Strategic Business Requirements for Master Data Management Systems
    (AIS Association for Information Systems, 2011-08-07)
    Master Data Management (MDM) is of increasing importance because it is seen as a promising approach for companies to respond to a number of strategic business requirements, such as complying with an increasing number of regulations, supporting internal and external business process integration, and establishing a 360-degree view of the customer. As a result, software vendors such as IBM, Oracle, SAP, and TIBCO are offering MDM application systems. However, the user community perceives a significant mismatch between its own strategic requirements and the functionality currently offered by the software products. As the Information Systems (IS) research community has remained silent so far regarding this research problem, the research presented in this paper makes intensive use of knowledge from the practitioners' community in order to design a framework for strategic business requirements to be met by MDM systems. As an outcome of a design-oriented research process, the framework is an artifact which advances the scientific body of knowledge while at the same time providing benefit for practitioners. The framework includes seven design principles which are translated into 23 requirements. The requirements form a baseline for internal and external communication in companies and for the design of concrete MDM systems.
  • Publication
    Towards a Maturity Model for Corporate Data Quality Management
    (ACM, 2009)
    Shin, Dongwan
    High-quality corporate data is a prerequisite for worldwide business process harmonization, global spend analysis, integrated service management, and compliance with regulatory and legal requirements. Corporate Data Quality Management (CDQM) describes the quality-oriented organization and control of a company's key data assets such as material, customer, and vendor data. With regard to the aforementioned business drivers, companies demand an instrument to assess the progress and performance of their CDQM initiative. This paper proposes a reference model for CDQM maturity assessment. The model is intended to be used for supporting the build process of CDQM. A case study shows how the model has been successfully implemented in a real-world scenario.
    Scopus© Citations 41
  • Publication
    Case Study B. Braun Melsungen - Global Master Data Management
    (Institut für Wirtschaftsinformatik, Universität St. Gallen, 2009-03-01)
    Weber, Kristin
  • Publication
    Case Study Ciba - Organizing Master Data Management
    (Institute for Information Management, University of St. Gallen, 2008-09-01)
    Weber, Kristin
    As part of the Enterprise project, Ciba started a Master Data Management (MDM) initiative with the vision to consolidate master data across the company, to have strict master data governance rules and responsibilities in place, to formalize master data maintenance processes and validations in order to ensure master data quality, and to document key objects in a central database repository. Six key principles set the top-level framework for master data and strengthen data quality awareness among master data stakeholders and users. The new MDM organization consists of three closely interlinked parts: a business data ownership model, a stewardship organization, and a maintenance organization. Local and global workflows have been implemented to support master data maintenance processes.
  • Publication
    Data Quality Management from a Process Perspective: Methods and Models
    (Institut für Wirtschaftsinformatik, Universität St. Gallen, 2013-10-01)