
PowerDMARC and Splunk - Integration Guide

With PowerDMARC's Splunk integration, you can seamlessly ingest and monitor your email authentication and domain security data directly within your Splunk environment. By leveraging the PowerDMARC API, organizations can build a streamlined SIEM integration without complex configurations—simply connect, run, and gain centralized visibility into their email security posture across all domains.

The guide intentionally focuses on setup and ingestion. Splunk dashboards and advanced visualizations are out of scope for this document.

API Documentation

  • Swagger Documentation: https://app.powerdmarc.com/swagger-ui/index.html

  • Alternative Documentation: https://api.powerdmarc.com/

Note: You are not restricted to the naming conventions used in this documentation; the index, token, and file names shown here are examples.


Architecture Overview

This example uses the Audit Log endpoint for illustration and testing purposes.

PowerDMARC API

    ↓

Python Script (Scheduled via cron/Task Scheduler)

    ↓

Splunk HTTP Event Collector (HEC)

    ↓

Splunk (Search, Dashboards, Alerts, Correlation)

Splunk receives data through its HTTP Event Collector (HEC) endpoint, which allows secure data ingestion from external sources.
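Conceptually, the integration wraps each record returned by the PowerDMARC API in the standard HEC event envelope and posts it to the /services/collector/event endpoint. A minimal sketch of that envelope in Python (field values are illustrative and mirror the index and source names used later in this guide):

import json

audit_entry = {
    "user_name": "John Doe",
    "action": "Updated attached domains",
    "created_at": "2025-01-04 14:29:24",
}

hec_payload = {
    "event": audit_entry,                   # the raw PowerDMARC record
    "index": "powerdmarc",                  # index created in Step 1
    "sourcetype": "powerdmarc:auditlog",    # sourcetype used in the search examples
    "source": "powerdmarc:api",             # matches the source name override in Step 3
}

print(json.dumps(hec_payload, indent=2))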


Prerequisites

Before starting, ensure you have:

  • Splunk Enterprise or Splunk Cloud with administrative access

  • Permission to:

    • Create HTTP Event Collector (HEC) tokens

    • Create indexes

    • Configure data inputs

  • Python 3.7+ installed on the system running the integration script

  • PowerDMARC API Bearer Token with permission to access Audit Logs

  • Network connectivity from the script execution environment (a quick connectivity check is sketched after this list) to:

    • PowerDMARC API (https://app.powerdmarc.com)

    • Splunk HEC endpoint
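Optionally, you can verify the connectivity requirement before installing anything else. A minimal check, assuming the Splunk hostname below is a placeholder you replace with your own:

import requests

ENDPOINTS = {
    "PowerDMARC API": "https://app.powerdmarc.com",
    "Splunk HEC": "https://your-splunk-instance:8088/services/collector/health",
}

for name, url in ENDPOINTS.items():
    try:
        resp = requests.get(url, timeout=10)
        print(f"{name}: reachable (HTTP {resp.status_code})")
    except requests.RequestException as exc:
        print(f"{name}: NOT reachable ({exc})")

The /services/collector/health endpoint responds without authentication when HEC is enabled, which makes it convenient for a reachability test.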


Splunk Configuration

Step 1: Create a Dedicated Index

  1. Navigate to Settings → Indexes

  2. Click New Index

  3. Configure:

    • Index Name: powerdmarc

    • Index Data Type: Events

    • App: search (or your preferred app)

    • Leave other settings as default or adjust based on retention requirements

  4. Click Save

Step 2: Enable HTTP Event Collector (HEC)

  1. Navigate to Settings → Data inputs

  2. Click HTTP Event Collector

  3. Click Global Settings

  4. Configure:

    • All Tokens: Enabled

    • Enable SSL: Enabled (recommended)

    • HTTP Port Number: 8088 (default)

  5. Click Save

Step 3: Create HEC Token

  1. Still in Settings → Data inputs → HTTP Event Collector

  2. Click New Token

  3. Configure token settings:

    • Name: PowerDMARC_Integration

    • Source name override: powerdmarc:api

    • Description: Token for PowerDMARC audit log ingestion

  4. Click Next

  5. Input Settings:

    • Source type: Select Automatic

    • Allowed Indexes: Select powerdmarc (the list of allowed indexes must include powerdmarc)

    • Default Index: powerdmarc

  6. Click Review

  7. Click Submit

  8. Important: Copy and save the token value immediately (you cannot retrieve it later)
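You can confirm the new token works by posting a single test event before building the full integration. A minimal sketch (the URL and token are placeholders; a successful response returns HTTP 200 with {"text":"Success","code":0}):

import requests

SPLUNK_HEC_URL = "https://your-splunk-instance:8088/services/collector/event"
SPLUNK_HEC_TOKEN = "<paste-hec-token-here>"

resp = requests.post(
    SPLUNK_HEC_URL,
    headers={"Authorization": f"Splunk {SPLUNK_HEC_TOKEN}"},
    json={"event": "PowerDMARC HEC token test", "index": "powerdmarc"},
    timeout=10,
    verify=True,   # set to False only for self-signed certificates in test environments
)
print(resp.status_code, resp.text)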


Integration Script Setup

Step 4: Prepare the Python Environment

Option A: Online Installation (Recommended)

Install required Python libraries from PyPI (requires internet connection):

pip install requests

Or use the included requirements.txt file:

pip install -r requirements.txt
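The contents of the included requirements.txt are not reproduced here; as a working assumption, a minimal version only needs to pin the requests dependency:

requests>=2.31.0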

Option B: Offline Installation

For environments without internet access, use the included requirements.txt file.

Download packages on a machine with internet access:

pip download -r requirements.txt -d ./packages

Transfer the packages folder to your target system, then install:

pip install --no-index --find-links=./packages -r requirements.txt

Verify Installation:

python3 -c "import requests; print(f'requests version: {requests.__version__}')"

Expected output: requests version: 2.31.0 (or similar)

Step 5: Create the Integration Script

Create powerdmarc_to_splunk.py and paste the code provided in powerdmarc_to_splunk.txt.
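The full code is distributed in powerdmarc_to_splunk.txt and is not reproduced verbatim here. For orientation, the sketch below shows the general shape such a script takes: fetch audit log entries from the PowerDMARC API for a time window, then forward each entry to HEC. The PowerDMARC endpoint path, query parameter names, and response shape used in this sketch are assumptions; confirm them against the Swagger documentation linked above.

import sys
from datetime import datetime, timedelta, timezone
import requests

# --- Configuration (see Step 6) ---
POWERDMARC_API_TOKEN = "<powerdmarc-bearer-token>"
POWERDMARC_API_URL = "https://app.powerdmarc.com/api/v1/audit-logs"   # assumed path; check Swagger
SPLUNK_HEC_URL = "https://your-splunk-instance:8088/services/collector/event"
SPLUNK_HEC_TOKEN = "<splunk-hec-token>"
SPLUNK_INDEX = "powerdmarc"
DAYS_TO_FETCH = 1

def fetch_audit_logs(start, end):
    """Pull audit log entries from the PowerDMARC API for the given window."""
    resp = requests.get(
        POWERDMARC_API_URL,
        headers={"Authorization": f"Bearer {POWERDMARC_API_TOKEN}"},
        params={"from": start.isoformat(), "to": end.isoformat()},   # assumed parameter names
        timeout=30,
    )
    resp.raise_for_status()
    data = resp.json()
    # The response shape is an assumption; adjust to the documented schema.
    return data.get("data", []) if isinstance(data, dict) else data

def send_to_splunk(events):
    """Post each entry to the Splunk HTTP Event Collector."""
    sent, failed = 0, 0
    headers = {"Authorization": f"Splunk {SPLUNK_HEC_TOKEN}"}
    for entry in events:
        payload = {"event": entry, "index": SPLUNK_INDEX, "sourcetype": "powerdmarc:auditlog"}
        try:
            resp = requests.post(SPLUNK_HEC_URL, headers=headers, json=payload, timeout=30, verify=True)
            resp.raise_for_status()
            sent += 1
        except requests.RequestException as exc:
            print(f"Failed to send event: {exc}")
            failed += 1
    return sent, failed

def main():
    end = datetime.now(timezone.utc)
    start = end - timedelta(days=DAYS_TO_FETCH)
    print(f"Fetching PowerDMARC logs from {start} to {end}...")
    events = fetch_audit_logs(start, end)
    print(f"Successfully fetched {len(events)} audit log entries")
    sent, failed = send_to_splunk(events)
    print(f"Ingestion Summary: sent={sent}, failed={failed}, total={len(events)}")
    return 0 if failed == 0 else 1

if __name__ == "__main__":
    sys.exit(main())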

Step 6: Configure the Script

Edit the powerdmarc_to_splunk.py file and update:

  1. POWERDMARC_API_TOKEN: Your PowerDMARC API bearer token

  2. SPLUNK_HEC_URL: Your Splunk HEC endpoint

    • Format: https://your-splunk-instance:8088/services/collector/event

    • For Splunk Cloud: https://input-<your-instance>.cloud.splunk.com:8088/services/collector/event

  3. SPLUNK_HEC_TOKEN: The HEC token created in Step 3

  4. SPLUNK_INDEX: powerdmarc (or your chosen index name)

  5. DAYS_TO_FETCH: Number of days of historical data to fetch (default: 1)

Step 7: Test the Script

Run the script manually to verify connectivity and data ingestion:

python3 powerdmarc_to_splunk.py

Expected output:

============================================================

PowerDMARC to Splunk Integration

============================================================


Fetching PowerDMARC logs from 2025-01-03 00:00:00 to 2025-01-04 00:00:00...

Successfully fetched 15 audit log entries

Sending 15 events to Splunk...


Ingestion Summary:

  Successfully sent: 15

  Failed: 0

  Total: 15


============================================================

Integration completed successfully

============================================================


Schedule Automated Execution

For Linux/Unix (cron)

  1. Edit crontab:

crontab -e


  2. Add an entry to run hourly:

0 * * * * /usr/bin/python3 /path/to/powerdmarc_to_splunk.py >> /var/log/powerdmarc_splunk.log 2>&1


  3. Or run every 15 minutes:

*/15 * * * * /usr/bin/python3 /path/to/powerdmarc_to_splunk.py >> /var/log/powerdmarc_splunk.log 2>&1

For Windows (Task Scheduler)

  1. Open Task Scheduler

  2. Click Create Task

  3. Configure:

    • Name: PowerDMARC Splunk Integration

    • Description: Scheduled ingestion of PowerDMARC audit logs

    • Security options: Run whether user is logged on or not

  4. Triggers tab: Click New

    • Begin: On a schedule

    • Settings: Daily, repeat every 1 hour

  5. Actions tab: Click New

    • Action: Start a program

    • Program: python.exe

    • Arguments: C:\path\to\powerdmarc_to_splunk.py

  6. Click OK


Validate Data Ingestion in Splunk

Step 8: Verify Data in Splunk

  1. Navigate to Search & Reporting app

  2. Run the following SPL query:

index=powerdmarc sourcetype=powerdmarc:auditlog

| sort - _time

| head 20

| table _time, user_name, action, ip_address, created_at

Expected Results

You should see audit log entries with fields such as:

  • user_name: User who performed the action

  • action: Description of the action performed

  • ip_address: IP address of the user

  • created_at: Timestamp of the audit event

Sample Event Structure

{

  "user_name": "John Doe",

  "action": "Updated attached domains",

  "ip_address": "12.111.67.123",

  "a_username": null,

  "other": null,

  "created_at": "2025-01-04 14:29:24"

}


Troubleshooting

Common Issues

Issue: No data appearing in Splunk

  • Verify HEC token is correct and enabled

  • Check HEC endpoint URL is accessible

  • Verify index powerdmarc exists and is accessible

  • Check firewall rules allow outbound HTTPS to Splunk

  • Review script execution logs for errors

Issue: SSL certificate errors

  • For self-signed certificates, set verify=False in the script (not recommended for production)

  • Or install proper SSL certificates on Splunk

Issue: PowerDMARC API authentication failures

  • Verify API token is valid and has correct permissions

  • Check token hasn't expired

  • Confirm API endpoint URL is correct

Issue: Script runs but no logs fetched

  • Verify the time range parameters

  • Check if audit logs exist for the specified period

  • Review PowerDMARC API documentation for parameter format


Next Steps

At this point:

  • PowerDMARC Audit Logs are successfully ingested into Splunk

  • You can now:

    • Create custom dashboards for visualization

    • Set up alerts for specific audit events

    • Correlate PowerDMARC data with other security logs

    • Build reports for compliance and security monitoring

    • Use Splunk's analytics capabilities for threat detection

Recommended Enhancements

  1. Add Additional API Endpoints: Extend the script to fetch DMARC aggregate reports, forensic reports, or domain data

  2. Implement Checkpoint Logic: Store the last successful run timestamp to avoid duplicate ingestion (see the sketch after this list)

  3. Add Logging: Implement proper logging with rotation for production environments

  4. Error Notifications: Configure email or Slack notifications on script failures

  5. Deploy as Splunk Add-on: Package the integration as a Splunk TA (Technology Add-on) for easier deployment
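For enhancement 2 above, a minimal approach is to persist the end of the last successful run to a small state file and use it as the start of the next window. A hedged sketch (the file path and function names are illustrative, not part of the shipped script):

import json
from datetime import datetime, timedelta, timezone
from pathlib import Path

CHECKPOINT_FILE = Path("/var/lib/powerdmarc_splunk/last_run.json")   # illustrative location

def load_checkpoint(default_days=1):
    """Return the start time for this run: the saved timestamp, or a default look-back window."""
    if CHECKPOINT_FILE.exists():
        saved = json.loads(CHECKPOINT_FILE.read_text())
        return datetime.fromisoformat(saved["last_run"])
    return datetime.now(timezone.utc) - timedelta(days=default_days)

def save_checkpoint(run_end):
    """Persist the end of a successful run."""
    CHECKPOINT_FILE.parent.mkdir(parents=True, exist_ok=True)
    CHECKPOINT_FILE.write_text(json.dumps({"last_run": run_end.isoformat()}))

The main script would call load_checkpoint() to determine the window start and save_checkpoint() only after every event in that window has been sent successfully.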


Security Considerations

  • Store credentials securely: Use environment variables or encrypted configuration files (a sketch follows this list)

  • Enable SSL verification: Always use verify=True for production environments

  • Rotate API tokens regularly: Follow security best practices for credential management

  • Restrict HEC token permissions: Limit to specific indexes and source types

  • Monitor script execution: Set up alerts for failed executions or anomalies

  • Review Splunk access controls: Ensure only authorized users can access PowerDMARC data
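For the first point above, the simplest pattern is to read the tokens from environment variables instead of hard-coding them in the script. A sketch (the variable names are illustrative):

import os

POWERDMARC_API_TOKEN = os.environ["POWERDMARC_API_TOKEN"]   # raises KeyError if unset
SPLUNK_HEC_TOKEN = os.environ["SPLUNK_HEC_TOKEN"]
SPLUNK_HEC_URL = os.getenv("SPLUNK_HEC_URL", "https://your-splunk-instance:8088/services/collector/event")

The cron entry or Task Scheduler action would then set these variables (for example via an environment file readable only by the service account) before invoking the script.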


Support and Resources

  • PowerDMARC API Documentation: https://api.powerdmarc.com/

  • Splunk HEC Documentation: https://docs.splunk.com/Documentation/Splunk/latest/Data/UsetheHTTPEventCollector

  • Splunk Answers: https://community.splunk.com/
