Glossary: Claravine and Industry Terms

Julia Randall




account

A connector to an external account, including any required information for authentication (username, password, API key, etc.).


Administrator

The Administrator access level provides full permissions to see and edit all areas of the account, including users, groups, connectors, targets, field sets, lists, templates, folders, patterns, account settings, and organization reports.

application programming interface (API)

A software intermediary that allows two applications to talk to each other.


approval process

The approval process alerts one or more designated administrators (via email) that they need to review and approve or reject submission data before it can be fully processed.



concatenation

A field concatenation strings together any fields within a field set. Concatenations can be used in other concatenations (for example, Final Campaign Name=Brand-LOB_Campaign Name_Flight Date). Field concatenations can be modified by editing the submission; Claravine updates the concatenation as the included fields are updated.

Example uses include UTM parameters, Campaign Manager and Facebook naming conventions, and content naming conventions. Because field concatenations can be modified, we do not recommend using concatenated fields for tracking code, campaign code, key, or UTM ID purposes.
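As an illustrative sketch (the field names and delimiters below are hypothetical, not Claravine's actual configuration), a concatenation joins field values with fixed delimiters between them:

```python
# Illustrative sketch of a field concatenation: join field values with
# fixed delimiters. Field names and delimiters here are hypothetical.
def concatenate(fields, order, delimiters):
    """Join the values of `fields` listed in `order`, inserting the
    delimiter that follows each element (no delimiter after the last)."""
    parts = []
    for name, delim in zip(order, delimiters + [""]):
        parts.append(fields[name] + delim)
    return "".join(parts)

fields = {
    "Brand": "Acme",
    "LOB": "Retail",
    "Campaign Name": "SpringSale",
    "Flight Date": "2024-04-01",
}

# Final Campaign Name = Brand-LOB_Campaign Name_Flight Date
result = concatenate(
    fields,
    ["Brand", "LOB", "Campaign Name", "Flight Date"],
    ["-", "_", "_"],
)
# result == "Acme-Retail_SpringSale_2024-04-01"
```

If an included field value later changes, re-running the join reproduces the updated concatenation, which mirrors how edited submissions update concatenated values.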


connector

Claravine's term for an integration. Programmatically automates data flows in and out of Claravine to accelerate business workflows.

content lifecycle management

The process of planning, creating, optimizing, organizing, distributing, measuring, and preserving content throughout its lifetime in your organization. It's a process that aims to improve efficiency and remove barriers to success.

content management system (CMS)

A software application that enables users to create, edit, collaborate on, publish, and store digital content. This typically includes two major components: a content management application (CMA), as the front-end user interface that allows a user to add, modify, and remove content from a website; and a content delivery application (CDA) that compiles the content and updates the website. Example: WordPress.


cookie

A piece of data from a website that is stored within a web browser and that the website can retrieve at a later time. Cookies are used to tell the server that users have returned to a particular website. First-party cookies are created and stored by the websites you are visiting directly to improve on-site experience and personalization. Third-party cookies are leveraged across other domains and hosted by ad servers, aggregators, etc., for retargeting and segmentation.

customer data platform

A customer data platform is a collection of software that creates a persistent, unified customer database that is accessible to other systems. Data is pulled from multiple sources, cleaned, and combined to create a single customer profile. This structured data is then made available to other marketing systems.


data accuracy

A component of data quality, it refers to whether the data values stored for an object are the correct values. To be correct, a data value must be the right value and must be represented in a consistent and unambiguous form.

data clean room

A place where organizations can aggregate data to analyze and gain insights from it while adhering to strict controls, such as ensuring that customer-level data never leaves the clean room. For example, an advertiser can add first-party data to Amazon’s clean room and see how it matches up with the aggregated data of that platform to inform ad buys across audiences.

data completeness

The comprehensiveness or wholeness of data. If data is complete, there should be no gaps or missing information that may be required by a connecting technology.

data consistency

The process of keeping information uniform as it moves across a network and between various applications on a computer. There are typically three types of data consistency: point in time consistency, transaction consistency, and application consistency.

data dictionary

A centralized repository of information used to catalog and communicate the structure and content of a database or dataset, such as the names of measured variables or descriptions for individually named data objects. A data dictionary explains the purpose of data elements and includes metadata within the context of a project to guide accepted meanings and representation for each dataset. It is the foundation for a data taxonomy.

data downtime

Period of time when your data is partial, erroneous, missing, or otherwise inaccurate. Data downtime is highly costly for data-driven organizations today and affects almost every team.

data fabric

An architecture and set of data services that provide consistent capabilities across a choice of endpoints spanning hybrid multi-cloud environments. It is a powerful architecture that standardizes data management practices and practicalities across cloud, on-premise, and edge devices.

data governance

A collection of processes, roles, policies, standards, and metrics that ensure the effective and efficient use of information in enabling an organization to achieve its goals. It defines who can take what action, upon what data, in what situations, using what methods.

data latency

The time it takes for data to travel from one place to another. Companies usually measure latency as the time-lapse between the occurrence of a data-generating event and the arrival of this data at the company database or system.

data layer

The behind-the-scenes structure that websites and mobile apps tap into for timely and consistent visitor data. Typically a JavaScript object is used to pass information from the website to a Tag Manager container, which can then use that information to populate variables and activate triggers in your tag configurations. Every tool you hook up to your website—analytics, heat-mapping, live chat, etc.—accesses this one layer of data, which ensures two things: that each tool gets the data it needs, and that the data each tool uses is the same.

data management

The practice of managing data as a valuable resource to unlock its potential for an organization. Managing data effectively requires having a data strategy and reliable methods to access, integrate, cleanse, govern, store, and prepare data for analytics.

data mapping

Ability to map the field schema of third-party applications or systems to another application or system.
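A minimal sketch of the idea (the field names and mapping table are hypothetical): remap a record's keys from one system's schema to another's using a lookup table.

```python
# Hypothetical sketch of data mapping: rename a record's keys from one
# system's field schema to another's. The mapping table is illustrative.
FIELD_MAP = {"utm_campaign": "Campaign Name", "utm_source": "Source"}

def map_record(record, field_map):
    """Return a copy of `record` with keys renamed per `field_map`;
    keys without a mapping pass through unchanged."""
    return {field_map.get(k, k): v for k, v in record.items()}

mapped = map_record(
    {"utm_campaign": "spring_sale", "utm_source": "email"},
    FIELD_MAP,
)
# mapped == {"Campaign Name": "spring_sale", "Source": "email"}
```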

data model

Organizes elements of data and standardizes how they relate to one another and to the properties of real-world entities.

data observability

Tooling or a technical solution that allows teams to fully understand the health of the data in a system and actively debug a system.  Observability is based on exploring properties and patterns not defined in advance.

data orchestration

Data orchestration is the process of taking siloed data from multiple data storage locations, combining and organizing it, and making it available for data analysis tools. Data orchestration enables businesses to automate and streamline data-driven decision making.

data quality

The measure of how well suited a data set is to serve its specific purpose. Measures of data quality are based on data quality characteristics such as accuracy, completeness, consistency, validity, uniqueness, and timeliness.
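Two of the characteristics named above, completeness and uniqueness, can be sketched as simple checks over rows (the sample data and thresholds are illustrative):

```python
# Illustrative checks for two data quality characteristics:
# completeness (no missing required values) and uniqueness (no
# duplicate key values). Sample rows are hypothetical.
def completeness(rows, required):
    """Fraction of rows with a non-empty value for every required field."""
    ok = sum(
        1 for r in rows
        if all(r.get(f) not in (None, "") for f in required)
    )
    return ok / len(rows)

def is_unique(rows, key):
    """True if no two rows share a value for `key`."""
    values = [r[key] for r in rows]
    return len(values) == len(set(values))

rows = [
    {"id": "a1", "campaign": "spring"},
    {"id": "a2", "campaign": ""},        # incomplete: empty campaign
    {"id": "a1", "campaign": "summer"},  # duplicate id
]
score = completeness(rows, ["id", "campaign"])  # 2 of 3 rows complete
unique = is_unique(rows, "id")                  # False: "a1" repeats
```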

data standard

An assembled collection of data components—fields, data types, taxonomies, validation rules, and mappings—that uniformly describe data to the expectation of all data consumers.

data taxonomy

Hierarchical classifications and naming conventions for touchpoint dimensions. These are the elements of marketing and advertising initiatives such as creative, placement, keyword, publisher, and more.

data usability

Considers the ability of a user to derive useful information from data. For example, data provided in a series of text files that require weeks of complex processing to be in a form suitable for analysis are not very usable.


Comprehensive, tabular view of all rows and fields for all data associated with a particular template. Data is added to datasets via submissions; once submissions are approved, they become part of the dataset.

Dataset Editor

A view in Claravine that enables users to browse, search, and bulk-edit data.

dataset management

Features and functionality that support working with datasets, including data validation and dataset searching, filtering, editing, and creation.


digital asset management (DAM)

Software that allows businesses to organize, distribute, collaborate on, and securely store digital files that make up a digital asset library. DAM platforms contain features such as permission controls, rights management, and asset performance analytics. Example: Adobe Experience Manager.



element

An element is the basic building block of a pattern. Examples of elements include: a constant, a variable (placeholder for values that will change in the template by text fields or lists), an auto # (auto-generated sequential number), a date, a random number (auto-generated string value), and a delimiter (character that separates each element in a pattern).

extract, transform, and load (ETL)

A type of data integration used to blend data from multiple sources. Data is taken (extracted) from a source system, converted (transformed) into a format that can be analyzed, and stored (loaded) into a data warehouse or other system.
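The three steps can be sketched in miniature (the source data and destination are hypothetical stand-ins for a real source system and warehouse):

```python
# Minimal ETL sketch: extract rows from a CSV-like source, transform
# them into an analysis-ready shape, and load them into a destination
# list standing in for a warehouse table. Data is hypothetical.
import csv
import io

source = "campaign,spend\nspring,100\nsummer,250\n"
warehouse = []

# Extract: read raw rows from the source system.
rows = list(csv.DictReader(io.StringIO(source)))

# Transform: convert types and normalize values for analysis.
transformed = [
    {"campaign": r["campaign"].upper(), "spend": int(r["spend"])}
    for r in rows
]

# Load: write the converted rows into the destination.
warehouse.extend(transformed)
# warehouse == [{"campaign": "SPRING", "spend": 100},
#               {"campaign": "SUMMER", "spend": 250}]
```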


field set

A collection of columns that can be used across templates to simplify management and enforce consistency.


folder

Folders enable administrators to group content managed through the platform together to efficiently manage data access.



group

Custom-built sets of users allowing administrators to efficiently manage access to data in the platform by template. A user can be a member of one or more groups.


inbound integration

Type of data integration allowing data created or managed in another business system (for example, cloud application, data infrastructure, etc.) to be imported.


integration

See connector.


Knowledge Base

Administrator and user documentation, FAQs, and release notes for Claravine.



Manager

The Manager access level provides intermediate permissions that an Administrator assigns.

marketing mix modeling (MMM)

A regression technique that quantifies the impact of several marketing inputs on sales or market share. The purpose of using MMM is to understand how much each marketing input contributes to sales, and how much to spend on each. Sometimes referred to as media mix modeling, depending on the objective.

metadata management

Metadata management establishes policies and procedures that enable organizations to derive maximum business value from their digital assets. Good metadata management ensures digital assets are properly maintained and can be discovered, integrated, linked, shared, and analyzed across the organization. Metadata Management tools leverage database-level metadata to derive insights/value. (Different from Claravine in that we manage operational or business-level metadata.)

mobile ad ID (MAID)

A sequence of random symbols assigned by the mobile device's operating system. It is shared with the servers of the apps the user is using to track the customer journey and “remember” their choices. Examples: Apple’s Identifier for Advertisers (IDFA), Google’s Android Advertising ID (AAID).

multi-touch attribution

A marketing effectiveness measurement technique that takes all of the touchpoints on the consumer journey into consideration and assigns fractional credit to each so that a marketer can see how much influence each channel has on a sale.



notifications

Automated notifications (via email) informing users of product activity and issues in data to enable quick, delegated resolution.

Number of Users

The number of platform users authorized to access your instance of The Data Standards Cloud.


outbound integration

Type of data integration allowing data to be pushed back to its place of origin or to other dependent business systems (e.g. analytics systems).



pattern

A pattern is a custom-built value uniquely identifying a row of data; it creates a key to which metadata is associated in both Adobe Analytics and Google Analytics. A pattern can consist of a single value (such as a random number) or a composite key consisting of multiple attributes of a record. One example of pattern use is managing the taxonomy that will be used to generate tracking codes.

Pattern elements can include: constants, variables (fields that change and are not static; these can be a text field, a list, a date, an autofill, a concatenation), dates, auto numbering (unique keys can be created with an incrementing number), and random numbering (unique keys can be created with a random string of numbers and/or letters). 

Patterns lock once the submission has been committed and cannot be edited by a user. If the pattern elements include a variable that the user wants to change (such as Flight Date, Campaign Name, or Category), a new submission must be created because the pattern does not update to match the edited value.
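The element types above can be sketched as follows (the element values, delimiter, and key format are hypothetical, not Claravine's implementation):

```python
# Hypothetical sketch of combining pattern elements into a unique key:
# a constant, a variable, an auto number, a random string, and a
# delimiter. Illustrative only, not Claravine's implementation.
import itertools
import random
import string

auto_number = itertools.count(1)  # auto #: incrementing sequence

def build_pattern(campaign_name, delimiter="_"):
    rand = "".join(
        random.choices(string.ascii_lowercase + string.digits, k=6)
    )
    elements = [
        "BRAND",                 # constant
        campaign_name,           # variable
        str(next(auto_number)),  # auto-generated sequential number
        rand,                    # auto-generated random string
    ]
    return delimiter.join(elements)

key = build_pattern("SpringSale")
# e.g. "BRAND_SpringSale_1_k3x9qz"
```

Because the auto number and random string are generated at build time, the same inputs never reproduce the same key, which is why a locked pattern cannot simply be regenerated after an edit.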

pending submission

Newly added data accessed and managed through inbound connectors to a dataset.

pick list

Predefined set of acceptable values for a specified field that users can select from. Helps prevent data outside of the standard from being entered.
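A sketch of the validation a pick list enforces (the field and allowed values are illustrative):

```python
# Sketch of pick list validation: accept only predefined options for a
# field, rejecting anything outside the standard. Values are illustrative.
CHANNEL_PICK_LIST = {"email", "paid_social", "display", "search"}

def validate_choice(value, allowed):
    """Return `value` if it is in the allowed set; otherwise raise."""
    if value not in allowed:
        raise ValueError(f"{value!r} is not an allowed value")
    return value

validate_choice("email", CHANNEL_PICK_LIST)   # passes
# validate_choice("fax", CHANNEL_PICK_LIST)   # would raise ValueError
```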

powered list

Type of pick list where select options are driven by data already in or added to a dataset.



report

A report can refer to a dashboard, chart, or data export that customers use to monitor, audit, and visualize activity within the product.

roles and permissions

Standard or customized sets of permissions allowing administrators to grant users the right level of authorization and group assignment in the platform.


row

A row of data in a submission, which contains any number of fields, depending on how the template is configured.



search

Quickly scan values across all datasets to locate matching records of data.

single sign-on (SSO) authentication

Programmatic user management for enterprise customers. Enables users to sign into The Data Standards Cloud using their organization-managed credentials, instead of managing a different login specific to this platform. In practice, users log in at your company’s login page.


submission

Allows users to draft and save new data to a dataset following the data standards defined on a template. Includes sending data to anywhere there is an outbound integration.



target

A location where a template field is mapped to send data (multi-targets enable you to map the same submission to multiple locations).


template

The basic Claravine work unit; a collection of taxonomy elements, such as fields or patterns, that express data standards for a specific type of data, such as a campaign or digital asset. Data integrations are part of the template configuration, including the settings around link validations.

tracking code and URL generation

The automation of tracking code and URL creation in the format defined by the user.
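As a sketch of the underlying mechanics (the landing page URL and parameter values are hypothetical), generating a tracking URL amounts to appending encoded parameters to a base URL in the user-defined format:

```python
# Illustrative sketch of tracking URL generation: append UTM parameters
# to a landing page URL. The URL and parameter values are hypothetical.
from urllib.parse import urlencode

def build_tracking_url(base_url, utm):
    """Return `base_url` with `utm` appended as an encoded query string."""
    return f"{base_url}?{urlencode(utm)}"

url = build_tracking_url(
    "https://example.com/landing",
    {
        "utm_source": "email",
        "utm_medium": "newsletter",
        "utm_campaign": "spring_sale",
    },
)
# url == "https://example.com/landing?utm_source=email"
#        "&utm_medium=newsletter&utm_campaign=spring_sale"
```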



User

The User access level allows a user to create submissions using the templates assigned at the Group level.


validation (data)

The verification that data is following the defined standard. This can happen inside the Data Standards Cloud (for example, a single cell), or can be used to verify data that exists outside of the platform (for example, tag validation).

validation (tag)

Automated verification of marketing tag placement and configuration on owned landing (web) pages entered by the user, including analytics and/or marketing tag definitions that should be present on the defined page.

validation (scheduled)

Scheduling of validations per template: weekly, daily, hourly.

validation (on demand)

Upon completion of a submission, all link fields are validated and the results are displayed in the completed submission. A user can manually run a validation by clicking the refresh arrow on the page.


