
Data Forwarding: Support batching, field filtering #112653

@Angelodaniel

Description

Problem Statement

Data engineering teams using Sentry's Data Forwarding (via SQS → S3) run into two friction points at scale:

No batching — each event triggers an individual S3 write, leading to a high volume of requests and increased costs on the receiving end.
No field filtering — the full event payload is always forwarded, but most teams only need a subset (e.g. stacktraces, account, device, and network details) for downstream enrichment in their data warehouse. There's no way to include/exclude specific fields.

These teams typically want to join Sentry data with other signals in a data warehouse to analyze bug impact, and the current forwarding setup makes that expensive to operate.

Solution Brainstorm

Allow configuring a batch window (by time or event count) so multiple events are grouped into a single forwarding write
Allow customers to define an allowlist or denylist of fields to include in forwarded payloads
Add Databricks as a native forwarding destination alongside the existing SQS/S3 option (see also ISWF-1681)
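To make the first two ideas concrete, here is a rough Python sketch of what a batch window plus an allowlist could look like on the forwarding side. This is illustrative only, not Sentry code: `BatchingForwarder`, `max_events`, `flush_interval`, and the `write` callback are hypothetical names standing in for a forwarder that issues one S3 put per batch instead of one per event.

```python
import json
import time


def filter_fields(event: dict, allowlist: set) -> dict:
    """Keep only the top-level event fields named in the allowlist."""
    return {k: v for k, v in event.items() if k in allowlist}


class BatchingForwarder:
    """Hypothetical forwarder: buffers filtered events, flushes in batches."""

    def __init__(self, write, allowlist, max_events=100, flush_interval=5.0):
        self.write = write                  # e.g. a single S3 put per batch
        self.allowlist = set(allowlist)
        self.max_events = max_events        # batch window by event count
        self.flush_interval = flush_interval  # batch window by time (seconds)
        self.buffer = []
        self.last_flush = time.monotonic()

    def add(self, event: dict) -> None:
        self.buffer.append(filter_fields(event, self.allowlist))
        if (len(self.buffer) >= self.max_events
                or time.monotonic() - self.last_flush >= self.flush_interval):
            self.flush()

    def flush(self) -> None:
        if self.buffer:
            # One write covering many events, instead of one write per event.
            self.write(json.dumps(self.buffer))
        self.buffer = []
        self.last_flush = time.monotonic()
```

A denylist variant would just invert the comprehension in `filter_fields`; the batching logic is unchanged either way.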

Product Area

Settings - Integrations

Metadata

Status

Waiting for: Product Owner
