

PowerDMARC Elastic SIEM Implementation Guide

Introduction

With PowerDMARC’s Elastic SIEM integration, you can seamlessly ingest and monitor your email authentication and domain security audit data directly within your Elastic Stack environment. By leveraging the PowerDMARC API and Elastic Agent’s built-in httpjson input, organizations can build a streamlined SIEM integration without external scripts or complex configurations — simply configure through the Kibana web interface, deploy, and gain centralized visibility into their email security posture across all domains.


This guide covers the complete setup: from installing the Elastic Agent to configuring the API integration, building ingest pipelines for field mapping and enrichment, and creating a monitoring dashboard — all performed through the Kibana web UI and Dev Tools Console.


API Documentation: https://app.powerdmarc.com/swagger-ui/index.html

Alternative Documentation: https://api.powerdmarc.com/


Architecture Overview

This guide uses the audit log endpoint to illustrate the end-to-end flow; the same pattern applies to other PowerDMARC API endpoints.


PowerDMARC REST API

        ↓

Elastic Agent (httpjson input) — installed on your host

        ↓

Elasticsearch Ingest Pipeline (ECS mapping, GeoIP, deduplication)

        ↓

Elasticsearch Data Stream (logs-powerdmarc.audit-*)

        ↓

Kibana (Discover, Dashboards, Alerts, Detection Rules)


The Elastic Agent runs on your host machine (Windows, Linux, or macOS) and polls the PowerDMARC API at a configurable interval. Each response is split into individual audit log events, processed through an ingest pipeline for field normalization and enrichment, and stored in an Elasticsearch data stream. Kibana provides the visualization and alerting layer.

Prerequisites

Before starting, ensure you have:


  • Elastic Stack 8.x deployment (Elastic Cloud or self-managed) with Kibana access

  • Fleet enabled in your Elastic deployment

  • Permission to create Ingest Pipelines, Agent Policies, and Package Policies

  • A host machine (Windows, Linux, or macOS) to install the Elastic Agent on

  • A PowerDMARC API Bearer Token with permission to access Audit Logs

  • Network connectivity from the host to both the PowerDMARC API and your Elasticsearch cluster

Configuration Steps

Nearly all configuration is performed through the Kibana Dev Tools Console, which provides a single interface for creating pipelines, managing Fleet policies, and verifying data ingestion. Only the agent installation itself (Step 5) happens outside Kibana.

Step 1: Open the Dev Tools Console

  1. Open your Kibana URL in the browser.

  2. Click the hamburger menu (☰) on the top left.

  3. Scroll down to Management and click Dev Tools.

  4. The Console editor opens with a left panel (input) and right panel (response).

  5. Clear any default example code in the left panel.


Step 2: Create the Ingest Pipeline

The ingest pipeline normalizes raw PowerDMARC API fields into Elastic Common Schema (ECS) format, enriches IP addresses with GeoIP data, and generates a fingerprint-based document ID for deduplication.


Paste and run the following in Dev Tools:


PUT _ingest/pipeline/powerdmarc-audit-pipeline

{

  "description": "Process PowerDMARC audit logs into ECS fields",

  "processors": [

    {

      "json": {

        "field": "message",

        "target_field": "powerdmarc"

      }

    },

    {

      "date": {

        "field": "powerdmarc.created_at",

        "formats": ["yyyy-MM-dd HH:mm:ss"],

        "target_field": "@timestamp"

      }

    },

    {

      "rename": {

        "field": "powerdmarc.user_name",

        "target_field": "user.name",

        "ignore_missing": true

      }

    },

    {

      "rename": {

        "field": "powerdmarc.action",

        "target_field": "event.action",

        "ignore_missing": true

      }

    },

    {

      "rename": {

        "field": "powerdmarc.ip_address",

        "target_field": "source.ip",

        "ignore_missing": true

      }

    },

    {

      "rename": {

        "field": "powerdmarc.a_username",

        "target_field": "user.target.name",

        "ignore_missing": true

      }

    },

    {

      "rename": {

        "field": "powerdmarc.other",

        "target_field": "event.reason",

        "ignore_missing": true

      }

    },

    { "set": { "field": "event.kind",       "value": "event" } },

    { "set": { "field": "event.category",   "value": "configuration" } },

    { "set": { "field": "observer.vendor",  "value": "PowerDMARC" } },

    { "set": { "field": "observer.product", "value": "PowerDMARC" } },

    {

      "geoip": {

        "field": "source.ip",

        "target_field": "source.geo",

        "ignore_missing": true

      }

    },

    {

      "fingerprint": {

        "fields": ["user.name", "event.action",

                    "source.ip", "@timestamp"],

        "target_field": "_id",

        "ignore_missing": true

      }

    },

    {

      "remove": {

        "field": ["powerdmarc", "message"],

        "ignore_missing": true

      }

    }

  ]

}


Expected response:

{ "acknowledged": true }
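Before wiring up the integration, you can dry-run the pipeline with the _simulate API. The field values below are illustrative samples, not real PowerDMARC data:

```
POST _ingest/pipeline/powerdmarc-audit-pipeline/_simulate
{
  "docs": [
    {
      "_source": {
        "message": "{\"created_at\":\"2024-01-15 10:30:00\",\"user_name\":\"admin@example.com\",\"action\":\"User logged in\",\"ip_address\":\"8.8.8.8\"}"
      }
    }
  ]
}
```

The simulated document should come back with @timestamp parsed from created_at, the renamed ECS fields (user.name, event.action, source.ip), and, if the bundled GeoIP database is available, a source.geo block.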


Step 3: Find Your Agent Policy ID

You need the ID of the Agent Policy where the integration will be attached. Run:


GET kbn:/api/fleet/agent_policies


In the response, locate the policy you want to use and copy its id value. If you have multiple policies, use the one assigned to the host that runs the Elastic Agent — typically not the Fleet Server policy.
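The response is a JSON object containing an items array. A trimmed example of the shape (the id and name here are placeholders) looks like:

```
{
  "items": [
    {
      "id": "a1b2c3d4-...",
      "name": "My Agent Policy",
      "namespace": "default"
    }
  ],
  "total": 1
}
```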

Step 4: Create the httpjson Integration

This creates a Fleet-managed httpjson integration that polls the PowerDMARC API, handles authentication, pagination, and response splitting — all configured through a single API call.


Before running, replace two placeholders:

  • YOUR_AGENT_POLICY_ID - the ID from Step 3

  • YOUR_POWERDMARC_API_KEY - your PowerDMARC API Bearer token


POST kbn:/api/fleet/package_policies

{

  "policy_ids": ["YOUR_AGENT_POLICY_ID"],

  "package": { "name": "httpjson", "version": "1.24.0" },

  "name": "powerdmarc-audit-logs",

  "description": "PowerDMARC Audit Logs REST API Integration",

  "namespace": "default",

  "inputs": {

    "generic-httpjson": {

      "enabled": true,

      "streams": {

        "httpjson.generic": {

          "enabled": true,

          "vars": {

            "data_stream.dataset": "powerdmarc.audit",

            "pipeline": "powerdmarc-audit-pipeline",

            "request_url": "https://app.powerdmarc.com/api/v1/audit-logs",

            "request_interval": "60m",

            "request_method": "GET",

            "request_transforms": [

              {

                "set": {

                  "target": "url.params.api_key",

                  "value": "YOUR_POWERDMARC_API_KEY"

                }

              }

            ],

            "response_split": "target: body.data",

            "response_pagination": [

              {

                "set": {

                  "target": "url.params.page",

                  "value": "[[add .last_response.body.current_page 1]]",

                  "fail_on_template_error": true

                }

              }

            ],

            "request_redirect_headers_ban_list": [],

            "oauth_scopes": [],

            "tags": ["forwarded", "powerdmarc-audit"]

          }

        }

      }

    }

  }

}



Expected response:

{

  "item": {

    "id": "<integration-id>",

    "name": "powerdmarc-audit-logs",

    ...

  }

}
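To confirm the integration was created and attached to the intended policy, you can list the existing package policies:

```
GET kbn:/api/fleet/package_policies
```

Look for the powerdmarc-audit-logs entry in the response and check that its policy_ids value matches the Agent Policy ID from Step 3.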

Step 5: Install the Elastic Agent

The Elastic Agent must be installed on a host machine that has network access to both the PowerDMARC API and your Elasticsearch cluster. The agent runs as a service and is managed remotely through Fleet.

You can download the agent directly from Elastic's website; Fleet → Add agent in Kibana provides the platform-specific download, install, and enrollment commands.


Step 6: Validate Data Ingestion

After the agent picks up the policy (typically within 1–2 minutes), the first API fetch should happen automatically. Verify data is flowing:


6a. Check Document Count

In Dev Tools:

GET logs-powerdmarc.audit-*/_count


A count greater than 0 confirms data is landing.


6b. Inspect a Sample Document

GET logs-powerdmarc.audit-*/_search?size=1


Verify the document contains properly parsed ECS fields:

  • user.name — the username who performed the action

  • event.action — description of the action taken

  • source.ip — originating IP address

  • source.geo.* — GeoIP enrichment (country, city, coordinates)

  • @timestamp — parsed event timestamp
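To check just the mapped fields without the full document, a source-filtered search keeps the output readable:

```
GET logs-powerdmarc.audit-*/_search
{
  "size": 1,
  "sort": [{ "@timestamp": "desc" }],
  "_source": ["@timestamp", "user.name", "event.action", "source.ip", "source.geo"]
}
```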


6c. Verify in Discover

  1. Navigate to Analytics → Discover.

  2. Click the data view dropdown and select or create a data view for logs-powerdmarc.audit-*.

  3. Set the time range to Last 30 days.

  4. Audit log events should appear with all mapped fields.

Step 7: Create a Kibana Data View

Create a dedicated data view so the PowerDMARC audit data appears in Discover and can be used in dashboard visualizations.


Run in Dev Tools:


POST kbn:/api/data_views/data_view

{

  "data_view": {

    "title": "logs-powerdmarc.audit-*",

    "name": "PowerDMARC Audit Logs",

    "timeFieldName": "@timestamp"

  }

}


Note the returned id value — it is needed if you want to create dashboard panels programmatically.
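If you did not record the id, you can retrieve it later by listing the existing data views:

```
GET kbn:/api/data_views
```

Find the entry whose title is logs-powerdmarc.audit-* and copy its id.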

Step 8: Build a Dashboard

You can download the dashboard.txt file attached at the end of this guide and run its contents in Dev Tools to create the dashboard automatically.

Next Steps

At this point, PowerDMARC Audit Logs are successfully ingested into Elastic. You can now:


  • Create detection rules for suspicious activity (e.g., logins from unexpected IPs or countries, failed logins, bulk configuration changes)

  • Set up alerting via Elastic’s built-in alerting framework or connectors to email, Slack, etc.

  • Correlate PowerDMARC audit data with other security logs in your SIEM for comprehensive threat detection

  • Extend the integration to additional PowerDMARC API endpoints (DMARC aggregate reports, forensic reports)

  • Build compliance reports using Kibana’s reporting capabilities


PowerDMARC is the author of this solution article.