The official definition provided by DAMA International, the professional organization for those in the data management profession, is: "Data Resource Management is the development and execution of architectures, policies, practices and procedures that properly manage the full data lifecycle needs of an enterprise." This definition is fairly broad and encompasses a number of professions that may not have direct technical contact with lower-level aspects of data management, such as relational database management.
Alternatively, the definition provided in the DAMA Data Management Body of Knowledge (DAMA-DMBOK) is: "Data management is the development, execution and supervision of plans, policies, programs and practices that control, protect, deliver and enhance the value of data and information assets."
The concept of "Data Management" arose in the 1980s as technology moved from sequential processing (first cards, then tape) to random access processing. Since it was now technically possible to store a single fact in a single place and access it via random access disk, those arguing that "Data Management" was more important than "Process Management" pointed to problems such as "a customer's home address is stored in 75 (or some other large number) places in our computer systems." During this period, random access processing was not competitively fast, so those arguing that "Process Management" was more important than "Data Management" used batch processing time as their primary argument. As applications moved into real-time, interactive use, it became obvious to most practitioners that both management disciplines were important: if the data was not well defined, it would be misused in applications; if the process was not well defined, it was impossible to meet user needs.
Corporate Data Quality Management (CDQM) is, according to the European Foundation for Quality Management and the Competence Center Corporate Data Quality (CC CDQ, University of St. Gallen), the whole set of activities, both reactive and preventive, intended to improve corporate data quality. The main premise of CDQM is the business relevance of high-quality corporate data. CDQM comprises several activity areas.
"The DAMA Guide to the Data Management Body of Knowledge" (DAMA-DMBOK Guide) was produced under the guidance of a new DAMA-DMBOK Editorial Board and has been available since April 5, 2009.
In modern management usage, one can easily discern a trend away from the term "data" in composite expressions toward the terms "information" or even "knowledge" when talking in a non-technical context. Thus there exists not only data management, but also information management and knowledge management. This trend is misleading, as it obscures the fact that, on second look, it is still traditional data that are being managed or otherwise processed. The distinction between data and derived values can be seen in the information ladder. While data can exist as such, "information" and "knowledge" are always in the "eye" (or rather the brain) of the beholder and can only be measured in relative units. Data has, however, staged a comeback with the popularisation of the term big data, which refers to the collection and analysis of very large data sets in order to interpret and predict.
Several organisations have established a data management centre (DMC) for their operations.
Integrated data management (IDM) is a tools approach to facilitating data management and improving performance. IDM consists of an integrated, modular environment to manage enterprise application data and to optimize data-driven applications over their lifetime.
A Data Management Framework (DMF) is a system of thinking, terminology, documentation, resources and insights that allows its users to view data-related concepts, and the information available to them, correctly, both in their own context and in the broader context of the Framework, thereby ensuring that all who use the Framework can integrate their conversations and work consistently.
There are a number of Data Management Frameworks in existence today.
The Data Atom Data Management Framework version 1.0 was developed, tested and expanded by William Richard Evans between 2010 and 2014, at which point Evans became an internationally Certified Data Management Professional through DAMA International.
The Data Atom Data Management Framework version 2.0 was developed, tested and expanded by William Richard Evans between 2014 and 2017.
With the advent of Artificial Intelligence, the Internet of Things, Data Lakes, and many other new considerations for Data Management, Evans replaced his Data Atom Data Management Framework version 2.0 with "The Multi Dimensional Data Management Framework V3.0".
Data Management Center (DMC): The Data Management Center is the data center for all of the deployed cluster networks. Through the DMC, the LMF allows the user to list the services in any cluster member belonging to any cluster [...].