Learn the best practices for importing data and updating the Adobe Campaign database using workflows.
Summary: Consistency in data imports is important to avoid errors or unwanted results in further processing such as data management, targeting, and exports. This document presents some of the most important principles for creating good import workflows.
Digital Marketing solutions: Adobe Campaign v6.11, Adobe Campaign v7, Adobe Campaign Standard
Audience: Administrators, Advanced users

If you have questions about this article or about any other Adobe Campaign topic, ask the Community.

Using import templates

Most import workflows must contain the following activities:

  • Data loading (file)
  • Enrichment
  • Split
  • Deduplication
  • Update data

Using import templates makes it convenient to prepare similar imports and ensure data consistency within the database. Learn how to build workflow templates in the documentation.

In many projects, import workflows are built without a Deduplication activity because the files used during the project do not contain duplicates. Later, in production, duplicates can appear across different files and end up in the database, where they are difficult to remove. A deduplication step is therefore a good precaution in all import workflows.

Do not assume that the incoming data is consistent and correct, or that the IT department or the Adobe Campaign supervisor will take care of it. Keep data cleansing in mind throughout the project: deduplicate, reconcile, and maintain consistency when you import data.

An import template example is available in the documentation.

Using flat files

Flat files are the most efficient format for imports.

Flat files can be imported in bulk mode at the database level.

For example:

  • Separator: tab or semicolon
  • First line with headers
  • No string delimiter
  • Date format: YYYY/MM/DD HH:mm:ss
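
A file that follows these conventions could look like this (the column names are only an example):

email;firstName;lastName;lastModified
john.doe@example.com;John;Doe;2016/03/01 14:05:00
jane.smith@example.com;Jane;Smith;2016/03/02 09:30:00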

Adobe Campaign cannot import XML files using standard file import activities. It is possible to import XML files using JavaScript but only with small volumes: less than 10K records per file.
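
For such a small XML import, a minimal sketch of a JavaScript activity is shown below. It assumes that the file content has already been read into a string named xmlSource, that the file contains <recipient> elements under a root element, and that the schema and attribute names are only examples; each record is written with the standard NLWS.xtkSession.Write() method.

// Parse the XML content with E4X (the <?xml?> declaration, if any, must be
// stripped from xmlSource first)
var doc = new XML(xmlSource);
var count = 0;
for each (var rec in doc.recipient) {
  // Insert or update each record; nms:recipient and its attributes are used
  // here only as an example
  NLWS.xtkSession.Write(
    <recipient xtkschema="nms:recipient"
               _operation="insertOrUpdate" _key="@email"
               email={rec.@email} lastName={rec.@lastName}/>);
  count++;
}
logInfo(count + " records imported from the XML file");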

Using compression and encryption

Use zipped files for imports and exports when possible.

On Linux, it is possible to decompress a file and import it at the same time by using a command line that streams the decompressed content. For example:

zcat nl6/var/vp/import/filename.gz

It is also good practice to encrypt files that are sent over an unsecured network. GPG can be used for this task.
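
For example, a file could be encrypted by the sender and decrypted before the import with GPG (the file names and the recipient key are hypothetical):

gpg --output profiles.csv.gpg --encrypt --recipient partner@example.com profiles.csv
gpg --output profiles.csv --decrypt profiles.csv.gpg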

Loading in batch

Loading data in batch is more effective than loading one line at a time.

Imports using web services are not efficient. It is best to use files whenever possible.

Calling external web services to enrich profiles in real time is known to cause performance problems and memory leaks, because the calls are made one record at a time.

If you want to import data from a web service, it is better to do it in batch using a workflow than in real time using a Web application or a web service.

Using Data Management

Loading in iterative mode (line by line) using JavaScript must be limited to small volumes.

For better efficiency, always use the Data Loading (file) activity in data management workflows.

Importing in Delta mode

Regular imports must be done in delta mode: only modified or new data is sent to Adobe Campaign, instead of the whole table every time.
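
A minimal sketch of how the reference date of the last delta import could be tracked in a JavaScript activity, assuming a custom option named LastProfileImport (hypothetical) and the standard getOption() and setOption() functions:

// Date of the last successful delta import (the option name is hypothetical)
var lastImport = getOption("LastProfileImport");
logInfo("Importing records created or modified since " + lastImport);
// The source system only exports rows created or modified after this date;
// the full table is sent only once, for the initial load.
// Once the import has completed successfully, store the new reference date.
// vars.importStartDate is a hypothetical workflow variable set at the start
// of the workflow.
setOption("LastProfileImport", vars.importStartDate);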

Full imports must be used for the initial load only.

Import data using data management rather than JavaScript.

Maintaining consistency

To maintain the data consistency in the Adobe Campaign database, follow the principles below:

  • If the imported data matches a reference table in Adobe Campaign, it must be reconciled with that table in the workflow, and records that do not match must be rejected.
  • Ensure that the imported data is always "normalized" (email, phone number, direct mail address) and that this normalization is reliable and does not change over the years. Otherwise, duplicates are likely to appear in the database, and because Adobe Campaign does not provide tools for "fuzzy" matching, they are difficult to manage and remove.
  • Transactional data must have a reconciliation key and be reconciled with the existing data to avoid creating duplicates (see the sketch after this list).
  • Import related files in order. If the import is composed of multiple files, the workflow must make sure that the files are imported in the correct order: if one file fails, the remaining files must not be imported.
  • Deduplicate, reconcile, and maintain consistency when you import data.
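
As an illustration of the reconciliation key principle, the sketch below writes a transaction with an explicit key, so that reimporting the same record updates it instead of creating a duplicate. The cus:transaction schema, its attributes, and the values are hypothetical; the call uses the standard NLWS.xtkSession.Write() method.

// Insert or update a transaction, reconciled on its external identifier
// (cus:transaction and its attributes are hypothetical)
NLWS.xtkSession.Write(
  <transaction xtkschema="cus:transaction"
               _operation="insertOrUpdate" _key="@transactionId"
               transactionId="TX-2016-000123"
               amount="49.90"
               transactionDate="2016/03/01 14:05:00"/>);

Running this Write twice with the same transactionId updates the existing record rather than creating a second one.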
