Conceptual Framework

From Metadata-Registry

Metadata Management Conceptual Framework

Central data stores associated with institutional repositories and other metadata aggregators currently face a number of challenges: data is often stored in remote locations, held in incompatible formats, and of varying quality, yet it must be aggregated, normalized, and integrated. Existing data is frequently highly variable in quality, often incomplete, and may not conform to established standards. [1]

The Metadata Management Services integrated toolset seeks to solve a number of these problems by providing services to improve the quality of existing metadata, manage the aggregation of metadata (both from original metadata sources and quality improvement services), and provide services to redistribute the improved metadata in multiple formats.

The broad architectural framework underpinning the toolset involves the following functional systems and services:

  • A central repository that will aggregate item-level metadata from multiple non-equivalent and otherwise incompatible data sources, in multiple formats (using OAI as the data interchange glue), ultimately providing a base for other services.
  • Harvest services to manage the routine repeat harvesting of metadata from both data providers and data enhancement services, via OAI, including:
    • Data provider and service registration systems
    • An integrated OAI harvest service that includes flexible handling and reporting of routine low-level data validation errors
    • A harvest management and scheduling service to automate repeat data harvests
    • Event logging and history interfaces
    • Error-tracking and notification services
    • User access management
    • User forums and information sharing systems
    • Harvest diagnostics and helpdesk support, enabling extensive problem solving by staff without strong technical/programming skills
  • A comprehensive suite of OAI servers and related data interface tools to enable data providers to easily create interfaces between an OAI server and any number of internal data storage systems regardless of format or storage mechanism. These servers should:
    • be available in multiple programming languages and run on multiple operating systems to provide out-of-the-box functionality in a broad range of environments
    • share a single object model across languages and operating systems to make it easy for developers and consultants to switch languages and operating environments
    • provide a single, simple plugin API to support multiple data sources, making it easy to share data plugins
  • Testing services that will allow data providers to fully test new OAI server installations for functionality, protocol compliance, and data validity, as well as ongoing validation and testing to support troubleshooting and diagnostics
  • Data transformation services to provide:
    • data normalization and cleanup
    • data enhancement
    • crosswalking to multiple formats
  • Data distribution services to allow data consumers, including the original data providers, to access the normalized, enhanced, and aggregated data in the central repository in a variety of data formats and schemas, in order to support a broad range of downstream data-based services
    • All data transformations and normalizations are fully transparent to downstream users, with complete data provenance maintained at all levels [2] [3]
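The harvest services above revolve around OAI's ListRecords exchange: the harvester repeatedly fetches batches of records, following resumption tokens until the harvest is complete. As a rough sketch (in Python; the namespace URIs and element names are fixed by the OAI-PMH 2.0 specification, but the function name and sample response are illustrative only), the low-level parsing step of such a harvester might look like:

```python
import xml.etree.ElementTree as ET

# Namespace URIs fixed by the OAI-PMH 2.0 specification.
OAI_NS = "http://www.openarchives.org/OAI/2.0/"

def parse_list_records(xml_text):
    """Parse an OAI-PMH ListRecords response.

    Returns (records, resumption_token), where each record is an
    (identifier, datestamp) tuple and resumption_token is None once the
    harvest is complete. Fetching over HTTP is left to the caller, so the
    scheduling service can retry, log, and report validation errors per batch.
    """
    root = ET.fromstring(xml_text)
    records = []
    for rec in root.iter(f"{{{OAI_NS}}}record"):
        header = rec.find(f"{{{OAI_NS}}}header")
        identifier = header.findtext(f"{{{OAI_NS}}}identifier")
        datestamp = header.findtext(f"{{{OAI_NS}}}datestamp")
        records.append((identifier, datestamp))
    token_el = root.find(f".//{{{OAI_NS}}}resumptionToken")
    token = token_el.text if token_el is not None and token_el.text else None
    return records, token

# Illustrative response fragment, not from a real repository.
SAMPLE = """<OAI-PMH xmlns="http://www.openarchives.org/OAI/2.0/">
  <ListRecords>
    <record><header>
      <identifier>oai:example:1</identifier>
      <datestamp>2007-01-01</datestamp>
    </header></record>
    <resumptionToken>tok123</resumptionToken>
  </ListRecords>
</OAI-PMH>"""
```

A repeat-harvest loop would call this on each response and issue the next request with the returned token until it comes back as None.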
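The "single, simple plugin API" called for in the OAI server suite can be sketched as an abstract contract that every data-source plugin implements, so that the same server core can sit in front of any storage backend. The class and method names below are hypothetical, not from any actual toolset release:

```python
from abc import ABC, abstractmethod

class DataSourcePlugin(ABC):
    """Hypothetical plugin contract: one interface regardless of how or
    where the provider's metadata is actually stored."""

    @abstractmethod
    def list_identifiers(self):
        """Yield the identifiers of all items in the data source."""

    @abstractmethod
    def get_record(self, identifier):
        """Return one item's metadata as a dict of field name -> value."""

class InMemoryPlugin(DataSourcePlugin):
    """Trivial dict-backed store for illustration; a real plugin might
    wrap a SQL database, a MARC file, or a spreadsheet export."""

    def __init__(self, items):
        self._items = items

    def list_identifiers(self):
        return iter(self._items)

    def get_record(self, identifier):
        return self._items[identifier]
```

Because the server core only ever talks to the two abstract methods, sharing a plugin between institutions means sharing one small class, which is what makes a shared object model across languages practical.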
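At its simplest, the crosswalking step in the data transformation services is a field-to-field mapping from a provider's local schema into a target schema such as Dublin Core. A minimal sketch, assuming a hypothetical local schema (the field names on the left are invented for illustration):

```python
# Hypothetical mapping from a local schema to simple Dublin Core elements.
CROSSWALK = {
    "main_title": "dc:title",
    "author": "dc:creator",
    "pub_year": "dc:date",
}

def crosswalk_record(record, mapping=CROSSWALK):
    """Apply a field-to-field crosswalk to one record.

    Unmapped source fields are dropped; repeated source fields that map to
    the same target accumulate as a list, since Dublin Core elements are
    repeatable.
    """
    out = {}
    for field, value in record.items():
        target = mapping.get(field)
        if target:
            out.setdefault(target, []).append(value)
    return out
```

Keeping the mapping as data rather than code means each target format the distribution services support is just another table, and the original record can be retained alongside the transformed one to preserve provenance.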