Apply masks in compositing workflows

Last updated on Apr 20, 2026

Learn how to apply masks in composite workflows to edit only a part of your image.

Masks let you work on specific regions of an image during compositing, filling, and placement operations, helping you change only the parts you need while preserving the rest.

Binary mask interpretation

Masking nodes and mask inputs accept binary image masks: grayscale or single-channel images where pixel values determine where effects apply.

  • White (high luminance) marks regions where edits are applied (compositing, generative fills, or adjustments).
  • Black marks areas that remain unmodified.

Check each node's documentation for its specific behavior: some operations invert these semantics or support soft (grayscale) masks, so details vary.
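The white-applies, black-protects rule amounts to a per-pixel blend between the base image and the edited layer. A minimal NumPy sketch, for illustration only (actual node behavior may differ, especially for soft masks):

```python
import numpy as np

def apply_mask(base, edit, mask):
    """Composite `edit` over `base` where `mask` is white (1.0).

    All arrays are floats in [0, 1]. A soft (gray) mask blends the
    two images proportionally; 1.0 applies the edit, 0.0 protects.
    """
    if mask.ndim == base.ndim - 1:          # broadcast over color channels
        mask = mask[..., np.newaxis]
    return mask * edit + (1.0 - mask) * base

# 2x2 example: white mask pixels take the edit, black pixels keep the base.
base = np.zeros((2, 2, 3))                  # black base image
edit = np.ones((2, 2, 3))                   # white edit layer
mask = np.array([[1.0, 0.0],
                 [0.0, 1.0]])               # diagonal white pixels
out = apply_mask(base, edit, mask)
```

With a hard binary mask this reduces to a pixel-wise switch; with a feathered mask it produces the soft transition described above.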

You can create masks manually in external apps for precise art direction or generate them programmatically within workflows using luminosity-driven or segmentation nodes.

Create masks in external apps

In Photoshop, Illustrator, or another editor, create a grayscale or black-and-white layer matching the size and alignment of your target image.

Paint white where the workflow should act and black where it should not, using soft edges only if the target node supports feathered masks.

Export the mask as PNG or another compatible format.

Add the mask file through Input images or the mask-specific input your node exposes.

This approach works best when you need exact control over silhouettes—logos, product cutouts, or hand-refined regions.
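Before wiring an externally authored mask into a workflow, it can help to confirm that it matches the base image's dimensions and is strictly black and white. A sketch using NumPy arrays as stand-ins for images you would normally load from disk (the shapes and values here are hypothetical):

```python
import numpy as np

# Stand-ins for loaded images; in practice, load the exported PNG and
# the base image with your preferred image library.
base_image = np.zeros((512, 512, 3), dtype=np.uint8)
mask = np.zeros((512, 512), dtype=np.uint8)
mask[128:384, 128:384] = 255        # white square: region to edit

# The mask must match the base image's width and height exactly.
assert mask.shape == base_image.shape[:2], "mask/base size mismatch"

# A strictly binary mask contains only pure black (0) and pure white (255).
is_binary = bool(np.isin(np.unique(mask), (0, 255)).all())
```

Gray values are not an error if the target node supports soft masks, but a check like this catches unintended anti-aliasing when a hard-edged mask is expected.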

Generate masks inside workflows

You do not have to author every mask by hand. Workflows include nodes that generate masks programmatically (for example, luminosity-driven or segmentation-style masks), so downstream nodes receive a ready-made mask input.

Add an input node for your base image.

Connect a masking or analysis node that outputs a mask image, such as Apply luminosity mask or Remove background.

Connect the mask output to a node accepting mask inputs, such as Composite images (2D), Fill region, or similar operations.

Generated masks follow the white-applies, black-protects interpretation unless the node specifies otherwise.
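A luminosity-driven mask of the kind these nodes produce can be sketched as a threshold on per-pixel luminance. This is an illustrative approximation, not the node's actual implementation:

```python
import numpy as np

def luminosity_mask(rgb, threshold=0.5):
    """Build a binary mask from per-pixel luminance (Rec. 709 weights).

    Pixels brighter than `threshold` become white (edit region);
    darker pixels become black (protected). `rgb` is float in [0, 1].
    """
    luma = rgb @ np.array([0.2126, 0.7152, 0.0722])
    return (luma > threshold).astype(np.float64)

# One bright pixel and one dark pixel: bright maps to white (1.0).
img = np.array([[[0.9, 0.9, 0.9],
                 [0.1, 0.1, 0.1]]])
mask = luminosity_mask(img)
```

The resulting array follows the white-applies, black-protects convention and can feed directly into a compositing or fill step.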

Connect masks in the graph

Identify whether your target node expects a separate mask input or a combined asset with both mask and image in one parameter.

Verify dimensions and color space are compatible by checking node requirements, resizing or converting in an upstream step if needed.

Run validation on the downstream node to catch mask-related errors such as missing connections, size mismatches, or all-black or all-white masks.
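The checks above can be mirrored in a small helper, useful when preparing masks outside the workflow. The function name and messages are illustrative, not part of any node's API:

```python
import numpy as np

def validate_mask(mask, base_shape):
    """Return a list of problems that would trip downstream validation:
    missing connection, size mismatch, or a degenerate uniform mask."""
    problems = []
    if mask is None:
        problems.append("mask input is not connected")
        return problems
    if mask.shape[:2] != base_shape[:2]:
        problems.append("mask size does not match base image")
    if not mask.any():
        problems.append("mask is all black: no region will be edited")
    elif mask.min() == mask.max():
        problems.append("mask is all one value: entire image will be edited")
    return problems
```

Running the downstream node's own validation remains the authoritative check; a helper like this just catches the common degenerate cases earlier.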

Use cases

  • Selective compositing: Place subjects or products only inside white mask regions
  • Generative edits: Restrict Firefly-style fills or expansions to masked areas while preserving surrounding content
  • Batch consistency: Apply rule-based masks (luminosity, subject separation) to process multiple images uniformly without manual brushing

Troubleshooting

| Symptom | What to check |
| --- | --- |
| Nothing changes in the image | Mask may be all black, inverted, or disconnected; confirm white covers the intended edit region. |
| Wrong area is affected | Invert the mask in an external tool, or toggle invert if the node provides it; verify alignment with the base image. |
| Harsh edges | Use feathering in the source art, or a node that supports soft masks, if hard edges appear incorrect. |