User Guide

Test attributes in Adobe Acrobat Analyzer

Test how an attribute performs across real documents before you rely on it at scale.

Use this process to evaluate how an attribute extracts values across multiple documents in a collection. Reviewing real results helps you identify inconsistencies, unexpected matches, and gaps in extraction before you apply the attribute more broadly.

Note

You are responsible for creating, testing, and validating custom attributes to ensure they meet your organization’s requirements. Adobe Acrobat Analyzer provides tools for evaluation and feedback; however, accuracy depends on your input and subsequent refinement.

Before you begin

  • You must have permission to view and edit attributes.
  • The attribute you want to test must already exist.
  • You need access to documents that are expected to contain the attribute.
  • Testing supports selecting up to 10 documents at a time.

Test your attribute

  1. Open the Attributes tab.

  2. Select the attribute you want to test.

    The attribute definition opens automatically.

  3. Select Test under the attribute definition.

    The attribute page with the definition exposed and the "Evaluate attribute" button highlighted.

  4. Select 2–3 documents to test (up to 10 at a time).

    • The All files collection is selected by default.
    • Use the collections dropdown to filter the list to a single collection.
    • Use the Search field to narrow the list of documents.
  5. Select Add files.

    The page refreshes to display the evaluation results.

    The "Evaluate attribute" list with filters applied.

Review extracted results

After the test completes, a four-column table lists each selected document, with the extracted outputs and user feedback shown next to the file name.

The first three columns show the extracted output:

  • {Attribute name} – The extracted value based on your query from the Detailed attribute description.
  • Attribution content – Citations from the source document supporting the extracted result.
  • Explanation – Assumptions and conditions Acrobat Analyzer used to return the extracted result.

The fourth column, User feedback, tracks the feedback you give to refine the attribute.
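The results table can be pictured as a simple data structure. The sketch below is purely illustrative: the field names mirror the columns described above but are assumptions, not an Acrobat Analyzer API.

```python
from dataclasses import dataclass

# Hypothetical sketch: one row of the evaluation results table.
# Field names mirror the four columns described above; they are
# illustrative only, not an Acrobat Analyzer API.
@dataclass
class EvaluationRow:
    file_name: str
    attribute_value: str      # {Attribute name} column
    attribution_content: str  # citations from the source document
    explanation: str          # assumptions/conditions behind the result
    user_feedback: str = ""   # "correct", "incorrect", or "" if ungraded

def rows_needing_review(rows):
    """Return rows that have not yet been graded."""
    return [r for r in rows if not r.user_feedback]

rows = [
    EvaluationRow("invoice-01.pdf", "2024-03-15", "p. 1: 'Date: 2024-03-15'",
                  "Matched the 'Date' label on page 1", "correct"),
    EvaluationRow("invoice-02.pdf", "", "", "No date label found"),
]
print([r.file_name for r in rows_needing_review(rows)])  # ['invoice-02.pdf']
```

Tracking which rows remain ungraded this way mirrors the workflow below: every document is reviewed before you decide whether the attribute needs refinement.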

Hover over the file name and select Review to see the full text of the Attribute value, Attribution content, and Explanation.
Provide feedback by grading the extractions and giving inclusion and exclusion examples of what should and should not be found.

The "Evaluate attribute" page with the "Review" button highlighted.

Note

Excel and .csv files cannot be previewed in Acrobat Analyzer.
When testing attributes using Excel or .csv files, download the extracted results to review the source content and validate extraction accuracy.
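As a hedged illustration of that review step, a downloaded results file could be spot-checked with a short script. The column names below are assumptions based on the table described in this guide, not a documented Acrobat Analyzer export format.

```python
import csv
import io

# Hypothetical sketch: flag rows in a downloaded results file where
# no value was extracted. Column names are illustrative assumptions,
# not a documented export format.
downloaded = io.StringIO(
    "file_name,attribute_value,explanation\n"
    "report.csv,42,Value taken from the 'Total' row\n"
    "data.xlsx,,No matching field found\n"
)
reader = csv.DictReader(downloaded)
missing = [row["file_name"] for row in reader if not row["attribute_value"]]
print(missing)  # ['data.xlsx']
```

A check like this makes it easy to see at a glance which spreadsheet documents returned no value and therefore need closer inspection against the source content.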

After selecting the Review button, you'll see:

  • On the left, the document viewer highlights the location in the file where the value was extracted.
  • On the right, you’ll see the full text of the extracted outputs (Attribute name, Attribution content, and Explanation).
  • Under the extracted outputs, there is a feedback option that lets you identify whether the value is correct or incorrect.
The document citation with the in-document location highlighted and the text of the extraction beside it.

Tip

Navigation arrows next to the document name let you move forward or backward through the selected documents.
Reviewing multiple documents helps you confirm whether the extraction is consistent or needs adjustment.

When you identify an Incorrect value, you'll be prompted to give inclusion and exclusion examples of what should and should not be found.

  • Be specific and exact in the examples you give – quote them verbatim from the document.
  • Acrobat Analyzer will generalize from these examples to extract answers with similar patterns.
The "Is this value correct" input panel when Incorrect is selected.
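To see why verbatim examples matter, here is a hypothetical sketch of generalizing from an example to similar patterns. This is not Acrobat Analyzer's actual algorithm, only an illustration of the idea that exact examples let similar values be recognized.

```python
import re

# Hypothetical illustration of "generalizing from examples": given a
# verbatim inclusion example, derive a pattern that matches values
# with the same shape. Not Acrobat Analyzer's algorithm, just a
# sketch of why exact, verbatim examples are useful.
def shape_pattern(example: str) -> str:
    """Replace each digit run with \\d+ and escape the rest."""
    parts = re.split(r"(\d+)", example)
    return "".join(r"\d+" if p.isdigit() else re.escape(p) for p in parts)

pattern = shape_pattern("PO-1234")
print(bool(re.fullmatch(pattern, "PO-98765")))    # True
print(bool(re.fullmatch(pattern, "Invoice 12")))  # False
```

A vague or paraphrased example ("the purchase order number") gives nothing concrete to generalize from, which is why the guidance above asks for exact text from the document.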

Select Back to table to return to the results table.

  • The feedback you provide is shown in the User feedback column.
  • Inclusion and exclusion examples provided through user feedback are automatically added to the Attribute Editor under Advanced options.
The attribute evaluation page with the User feedback column highlighted.

Refine the attribute

Review the user feedback you provided.

If you’re satisfied with all the examples added, select Retest to see how they have refined your extracted outputs.

  • If you graded the attributes but didn’t provide examples, nothing will have changed in the Attribute Editor, so the results will be the same.

When refinements beyond adding examples are needed, tune the Detailed attribute description and retest. Iterating the description can improve accuracy over time.

The attribute evaluation page with the "Re-analyze the document" button highlighted.

Tip

If extracted values are incorrect or inconsistent, carefully review the Explanation column, as that gives you insight into how Acrobat Analyzer understands your attribute description. Use this insight to refine your description and Retest.

Use Advanced options when needed

If refinement alone does not produce acceptable results, use Advanced options in the attribute definition.

Advanced options let you provide examples that guide extraction behavior:

  • Add inclusion examples to show what should be found in the extracted outputs.
  • Add exclusion examples to show what should not be found in the extracted outputs.
  • Use 1–5 examples to avoid overfitting.

These examples help reduce ambiguity and improve consistency across documents.
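As a rough illustration of how inclusion and exclusion examples narrow results (this sketches the intent of the options above, not the product's internal behavior):

```python
# Hypothetical sketch: inclusion examples describe what should be
# found; exclusion examples drop look-alikes that should not be.
# The values are invented for illustration.
inclusion = {"Net 30", "Net 60"}
exclusion = {"Net proceeds"}

candidates = ["Net 30", "Net proceeds", "Net 60"]
kept = [c for c in candidates if c in inclusion and c not in exclusion]
print(kept)  # ['Net 30', 'Net 60']
```

Keeping the example sets small (1–5 entries) limits how strongly they constrain extraction, which is what the overfitting guidance above is pointing at.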

The attribute definition highlighting the Advanced options

When to refine an attribute

Refine your attribute if you notice:

  • Missing values that should have been found.
  • Values that are too broad, such as full sentences instead of specific terms.
  • Incorrect values that resemble the target concept.
  • Inconsistent results across similar documents.

Small adjustments to your description, type, or examples often resolve these issues.

When to recreate an attribute

If repeated refinement does not produce consistent results, it may be more effective to recreate the attribute with:

  • A simpler description.
  • A narrower scope.
  • Fresh examples.
  • A more specific attribute type (text, date, or number).

Recreating an attribute can help remove earlier assumptions or overfitting introduced during refinement.

When your attribute is ready

An attribute is ready for broader use when:

  • Extractions consistently return the correct value across evaluated documents.
  • Documents that do not contain the attribute correctly return no value.
  • Formatting and answer structure follow your defined expectations.
  • Results remain stable after minor refinements.
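The readiness checks above can be sketched as a small script. The field names and data are illustrative only; in practice you would grade results through the Review workflow described earlier.

```python
# Hypothetical sketch: an attribute is a candidate for broader use
# when every graded extraction is correct and documents that do not
# contain the attribute return no value. Data is invented.
results = [
    {"file": "a.pdf", "expected_present": True,  "value": "NDA", "grade": "correct"},
    {"file": "b.pdf", "expected_present": False, "value": "",    "grade": "correct"},
]

all_correct = all(r["grade"] == "correct" for r in results)
empty_when_absent = all(
    r["value"] == "" for r in results if not r["expected_present"]
)
print(all_correct and empty_when_absent)  # True
```

Note the second condition: correctly returning nothing for documents without the attribute is as important a signal as returning the right value where it exists.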

Once validated on a small set of documents, expand the evaluation to larger datasets to identify remaining patterns or edge cases.

You can also refresh attributes within curated collections to observe behavior at scale.

Adobe, Inc.
