Content needs to be modular, structured, reusable, and device- and platform-independent.
ETL data flows are usually developed within tools specialised to manage those flows in a proprietary way.
Data Warehousing describes the operational extract, cleansing, transformation, control, and load processes that maintain the data in a data warehouse.
Drivers for data governance most often focus on reducing risk or improving processes. Please select the elements that relate to the reduction in risk:
Bold means doing something that might cause short-term pain, not just something that looks good in a marketing email.
Part of alignment includes developing organizational touchpoints for data governance work. Some examples of touchpoints include: Procurement and Contracts; Budget and Funding; Regulatory Compliance; and the SDLC framework.
Change always requires change agents; they are especially important when there is little to no adoption.
The IT security policy provides categories for individual applications, database roles, user groups, and information sensitivity.
Consistent input data reduces the chance of errors in associating records. Preparation processes include:
Not all data is of equal importance. Data quality management efforts should focus on the most important data in the organization.
Data governance requires control mechanisms and procedures, including but not limited to the assignment and tracking of action items.
Real-time data integration is usually triggered by events, such as the creation or update of a record; batch processing is better suited to historic data.
A deliverable in the data architecture context diagram includes an implementation roadmap.
The roles associated with enterprise data architecture are data architects, data modellers, and data stewards.
Data governance can be understood in terms of political governance. It includes the following three function types:
A roadmap for enterprise data architecture describes the architecture’s 3 to 5-year development path. The roadmap should be guided by a data management maturity assessment.
Examples of concepts that can be standardized within the data quality knowledge area include:
In gathering requirements for DW/BI projects, begin with the business goals and strategies first.
Media monitoring and text analysis are automated methods for retrieving insights from large volumes of unstructured or semi-structured data, such as social media posts, blogs, and web news sites.
Enterprise data architecture influences the scope boundaries of project and system releases. An example of influence is data replication control.
In the Data Warehousing and Business Intelligence Context Diagram, a primary deliverable is the DW and BI Architecture.
A Data Management Maturity Assessment (DMMA) can be used to evaluate data management overall, or it can be used to focus on a single Knowledge Area or even a single process.
Logical abstraction entities become separate objects in the physical database design using one of two methods.
Data security includes the planning, development and execution of security policies and procedures to provide authentication, authorisation, access and auditing of data and information assets.
Data profiling is a form of data analysis used to inspect data and assess quality.
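A minimal profiling pass can be sketched in Python; the column name and sample records here are illustrative, not taken from any particular system:

```python
from collections import Counter

def profile(rows, column):
    """Compute simple profile statistics for one column:
    row count, null/blank count, distinct values, and top values."""
    values = [row.get(column) for row in rows]
    nulls = sum(1 for v in values if v in (None, ""))
    counts = Counter(v for v in values if v not in (None, ""))
    return {
        "rows": len(values),
        "nulls": nulls,
        "distinct": len(counts),
        "top": counts.most_common(3),
    }

customers = [
    {"country": "NL"}, {"country": "NL"}, {"country": ""},
    {"country": "DE"}, {"country": None},
]
print(profile(customers, "country"))
```

Real profiling tools add pattern analysis, type inference, and cross-column checks, but the idea is the same: inspect the data to assess its quality before trusting it.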
The ethics of data handling are complex, but are centred on several core concepts. Please select the correct answers.
ANSI 859 recommends taking into account the following criteria when determining which control level applies to a data asset:
In Resource Description Framework (RDF) terminology, a triple store is composed of a subject that denotes a resource, the predicate that expresses a relationship between the subject and the object, and the object itself.
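The subject-predicate-object structure can be sketched as a toy triple store in Python; the `ex:` resource names are invented for illustration:

```python
# A toy triple store: each triple is (subject, predicate, object).
triples = {
    ("ex:Alice", "ex:worksFor", "ex:Acme"),
    ("ex:Alice", "ex:knows", "ex:Bob"),
    ("ex:Bob", "ex:worksFor", "ex:Acme"),
}

def match(s=None, p=None, o=None):
    """Return triples matching a pattern; None acts as a wildcard."""
    return {
        t for t in triples
        if (s is None or t[0] == s)
        and (p is None or t[1] == p)
        and (o is None or t[2] == o)
    }

# Who works for ex:Acme?
print(sorted(t[0] for t in match(p="ex:worksFor", o="ex:Acme")))
# → ['ex:Alice', 'ex:Bob']
```

Production triple stores answer such patterns via SPARQL, but the pattern-matching model is the same.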
In matching, false positives occur when two references that do not represent the same entity are linked with a single identifier.
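A false positive can be demonstrated with a deliberately naive match rule (names and dates below are made up); matching on name alone links two different people under one identifier:

```python
def match_key(record):
    # Naive match rule: same normalized name → same identifier.
    return record["name"].strip().lower()

records = [
    {"name": "John Smith", "dob": "1970-01-01"},
    {"name": "john smith", "dob": "1985-06-30"},  # a different person
]
ids = {}
for r in records:
    key = match_key(r)
    r["id"] = ids.setdefault(key, len(ids) + 1)

# Both records get id 1 even though the dates of birth differ:
# a false positive caused by an over-broad match rule.
print([r["id"] for r in records])  # → [1, 1]
```

Adding discriminating attributes (date of birth, address) to the match rule reduces false positives at the cost of more false negatives.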
Service accounts are convenient because enhanced access can be tailored for the processes that use them.
There are numerous methods of implementing databases on the cloud. The most common are:
Validity, as a dimension of data quality, refers to whether data values are consistent with a defined domain of values.
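Validity can be measured as the fraction of values falling within the defined domain; the country-code domain and observed values below are illustrative:

```python
# A defined domain of values (illustrative ISO country codes).
VALID_COUNTRY_CODES = {"NL", "DE", "FR", "GB"}

def validity_ratio(values, domain):
    """Fraction of values that fall within the defined domain."""
    return sum(1 for v in values if v in domain) / len(values)

observed = ["NL", "DE", "XX", "FR", ""]
print(validity_ratio(observed, VALID_COUNTRY_CODES))  # → 0.6
```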
Obfuscating or redacting data is the practice of making information anonymous or removing sensitive information. Risks are present in the following instances:
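Two common techniques can be sketched as follows, assuming a simple email pattern and a card number kept as a digit string; both examples are illustrative, not a complete anonymisation scheme:

```python
import re

def redact_email(text):
    """Replace email addresses with a fixed placeholder (redaction)."""
    return re.sub(r"[\w.+-]+@[\w-]+\.[\w.]+", "[REDACTED]", text)

def mask_card(number):
    """Keep only the last four digits of a card number (masking)."""
    return "*" * (len(number) - 4) + number[-4:]

print(redact_email("Contact jane.doe@example.com for details."))
print(mask_card("4111111111111111"))  # → ************1111
```

Note that naive redaction can still leak information through context or through other fields that remain linkable, which is why the risks listed here matter.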
The Data Governance Council (DGC) manages data governance initiatives, issues, and escalations.
A deliverable in the data modelling and design context diagram is the logical data model.
Architects seek to design in a way that brings value to an organisation. To reach these goals, data architects define and maintain specifications that:
Subtype absorption: the subtype entity's attributes are included as nullable columns in a table representing the supertype entity.
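Subtype absorption can be sketched with SQLite from Python's standard library; the Person/Employee/Student model and column names are invented for illustration:

```python
import sqlite3

# Subtype absorption: a Person supertype table with the Employee and
# Student subtype attributes folded in as nullable columns.
con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE person (
        person_id   INTEGER PRIMARY KEY,
        name        TEXT NOT NULL,
        person_type TEXT NOT NULL,  -- discriminator: 'EMPLOYEE'/'STUDENT'
        salary      REAL,           -- nullable: populated for employees
        student_no  TEXT            -- nullable: populated for students
    )
""")
con.execute("INSERT INTO person VALUES (1, 'Ada', 'EMPLOYEE', 75000.0, NULL)")
con.execute("INSERT INTO person VALUES (2, 'Ben', 'STUDENT', NULL, 'S-001')")
for row in con.execute("SELECT name, person_type, salary, student_no FROM person"):
    print(row)
```

The alternative method keeps each subtype in its own table joined to the supertype on the primary key, trading nullable columns for extra joins.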
Tools required to manage and communicate changes in data governance programs include:
The term data quality refers to both the characteristics associated with high quality data and to the processes used to measure or improve the quality of data.
The most important reason to implement operational data quality measurements is to inform data consumers about levels of data effectiveness.