Data Connections

Connecting an AWS account

Overview

To onboard your AWS environment to DigiUsher, a cross-account IAM role with read-only access to cost, usage, resource, and security metadata is required. This document describes exactly what permissions are requested, why each is needed, and what credentials to share.

Summary of Access Required

  • Identity type: Cross-account IAM Role (no credentials stored in your account)
  • Authentication: STS AssumeRole with ExternalId (temporary session tokens only)
  • Trust relationship: Cross-account IAM role trusting DigiUsher AWS account 058264546051 with ExternalId
  • Base permissions: Read-only across cost, compute, database, storage, networking, security, and organizational metadata
  • Optional permissions: EC2 start/stop, commitment purchases, tag management, automation (all disabled by default)
  • Data access: S3 bucket containing Cost and Usage Reports (CUR data only)
  • Scope: Single AWS account (root or linked)

What this access enables:

  • Billing data: cost analytics, chargeback/showback, budgeting, forecasting, anomaly detection
  • Resource inventory: asset discovery, idle resource detection, tag-based cost allocation
  • Optimization recommendations: rightsizing, commitment analysis (RI/SP), idle resource cleanup
  • Utilization metrics: CPU, memory, network, and disk usage for rightsizing analysis

DigiUsher cannot create, modify, or delete any of your AWS resources (unless optional write permissions are explicitly enabled).

Use CloudFormation for the fastest setup

We strongly recommend the DigiUsher CloudFormation template for the most efficient and reliable setup. It handles both root accounts and linked accounts in a single deployment, automates all resource creation, and simplifies future maintenance.

To proceed with automation, select the appropriate AccountType parameter during template deployment.


Source: github.com/digiusher/digiusher-iac
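The deployment can also be driven from the CLI. A minimal sketch, assuming you have downloaded the template from the repository above; the stack name and template filename are illustrative, and AccountType is the parameter mentioned above:

```shell
# Deploy the DigiUsher connection stack. "digiusher-template.yaml" and the stack
# name are placeholders; set AccountType to "root" for a payer account or
# "linked" for a member account, per the template's parameter documentation.
aws cloudformation deploy \
  --stack-name digiusher-connection \
  --template-file digiusher-template.yaml \
  --parameter-overrides AccountType=root \
  --capabilities CAPABILITY_NAMED_IAM
```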

If your organization's policies require manual resource provisioning, follow the steps below.


Prerequisites

Information to Gather

  • AWS Account ID: AWS Console > Account Settings, or run aws sts get-caller-identity
  • ExternalId: DigiUsher platform > AWS account connection settings
  • Region: all billing-related resources (S3 bucket, CUR exports) must be in us-east-1

Roles Required by the Person Performing Setup

  • IAM Administrator: to create IAM roles and policies
  • S3 Administrator (root accounts only): to create S3 buckets and bucket policies
  • Billing Administrator (root accounts only): to create CUR exports

About ExternalId

The ExternalId is generated by DigiUsher and is required for secure cross-account role assumption. Including it as a condition in the trust policy prevents the confused deputy problem: a third party cannot trick DigiUsher into assuming your role, because only requests that present your unique ExternalId succeed.
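For illustration, the role assumption DigiUsher performs looks like the following; the account ID, role name, and ExternalId are placeholders, and the call succeeds only from DigiUsher's AWS account:

```shell
# Conceptual only: without the matching --external-id, this AssumeRole call is
# denied by the trust policy's condition, which is the confused-deputy protection.
aws sts assume-role \
  --role-arn arn:aws:iam::123456789012:role/DigiUsher-ReadAccess \
  --role-session-name digiusher-sync \
  --external-id YOUR_EXTERNAL_ID
```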

Network & Email Access (For Regulated Environments)

If your organization restricts outbound internet access or email domains, ensure the following are in place before starting:

  • Domain allowlist: Add *.digiusher.com to your network/firewall allowlist so that users in your organization can access the DigiUsher platform from their browsers.
  • Email allowlist: Add digiusher.com as an approved sender domain in your email security gateway. DigiUsher sends onboarding confirmations, alerts, and reports from @digiusher.com addresses.

Linked Account Setup

Linked (member) accounts only need an IAM role — no S3 bucket or CUR export is required. The root/payer account's CUR covers all linked accounts.

Step 1: Create the IAM Role

Create an IAM role that allows DigiUsher to assume it with your ExternalId.

  1. Go to IAM → Roles → Create role
  2. Select Another AWS account
  3. Enter Account ID: 058264546051
  4. Check Require external ID and paste your ExternalId from DigiUsher
  5. Click Next (do not attach any managed policies)
  6. Name the role (e.g., DigiUsher-ReadAccess)
  7. Add a tag: ManagedBy = DigiUsher
  8. Create the role

Alternatively, via the CLI, first create a trust policy file:

cat > digiusher-trust-policy.json << 'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowDigiUsherAssumeRole",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::058264546051:root"
      },
      "Action": "sts:AssumeRole",
      "Condition": {
        "StringEquals": {
          "sts:ExternalId": "YOUR_EXTERNAL_ID"
        }
      }
    }
  ]
}
EOF

Then create the role:

aws iam create-role \
  --role-name DigiUsher-ReadAccess \
  --assume-role-policy-document file://digiusher-trust-policy.json \
  --tags Key=ManagedBy,Value=DigiUsher

Step 2: Attach the Core Permissions Policy

Attach the read-only policy that grants DigiUsher access to cost, usage, and resource metadata.

  1. Go to IAM → Roles → select your DigiUsher role
  2. Click Add permissions → Create inline policy
  3. Switch to the JSON tab
  4. Paste the policy JSON below
  5. Name the policy DigiUsherCorePermissions

Save the policy JSON below to a file named digiusher-core-policy.json, then run:

aws iam put-role-policy \
  --role-name DigiUsher-ReadAccess \
  --policy-name DigiUsherCorePermissions \
  --policy-document file://digiusher-core-policy.json

Note on non-read actions

The core policy includes one action that is not strictly read-only: compute-optimizer:UpdateEnrollmentStatus (a one-time, non-destructive enrollment that enables Compute Optimizer recommendations). This action does not create, modify, or delete your resources.

Step 3: Note the Role ARN

After creating the role, copy its ARN and provide it to DigiUsher to complete the connection.

  1. Go to IAM → Roles → select your DigiUsher role
  2. Copy the Role ARN from the summary section (e.g., arn:aws:iam::123456789012:role/DigiUsher-ReadAccess)

Or via the CLI:

aws iam get-role \
  --role-name DigiUsher-ReadAccess \
  --query 'Role.Arn' \
  --output text

That's it for linked accounts. Provide the Role ARN back to DigiUsher to complete the setup.


Root (Payer) Account Setup

Root account setup includes everything from the linked account setup above, plus an S3 bucket and a Cost and Usage Report (CUR) export. Choose one of the scenarios below based on your existing infrastructure.

Important

All billing resources (S3 bucket, CUR exports) must be in the us-east-1 region.

Scenario 1: New Bucket + New CUR

Use this if you don't have an existing CUR or billing bucket.

Step 1: Create the S3 Bucket

  1. Go to S3 → Create bucket
  2. Bucket name: choose a globally unique name (e.g., acme-corp-digiusher-cur)
  3. Region: US East (N. Virginia) us-east-1
  4. Block all public access: enabled (default)
  5. Bucket Versioning: Enable
  6. Create the bucket
  7. After creation, go to the bucket → Management tab → Create lifecycle rule:
    • Rule name: DeleteOldVersions
    • Apply to all objects
    • Under Noncurrent version expiration: set to 90 days
  8. Create another lifecycle rule:
    • Rule name: TransitionToIntelligentTiering
    • Apply to all objects
    • Under Transition current versions: Transition to Intelligent-Tiering after 180 days
  9. Add tags: ManagedBy = DigiUsher, Purpose = CostAndUsageReports

Or via the CLI:

BUCKET_NAME="acme-corp-digiusher-cur"

# Create the bucket
aws s3api create-bucket \
  --bucket "$BUCKET_NAME" \
  --region us-east-1

# Block public access
aws s3api put-public-access-block \
  --bucket "$BUCKET_NAME" \
  --public-access-block-configuration \
    BlockPublicAcls=true,IgnorePublicAcls=true,BlockPublicPolicy=true,RestrictPublicBuckets=true

# Enable versioning
aws s3api put-bucket-versioning \
  --bucket "$BUCKET_NAME" \
  --versioning-configuration Status=Enabled

# Set lifecycle rules
cat > lifecycle-rules.json << 'EOF'
{
  "Rules": [
    {
      "ID": "DeleteOldVersions",
      "Status": "Enabled",
      "Filter": {},
      "NoncurrentVersionExpiration": {
        "NoncurrentDays": 90
      }
    },
    {
      "ID": "TransitionToIntelligentTiering",
      "Status": "Enabled",
      "Filter": {},
      "Transitions": [
        {
          "Days": 180,
          "StorageClass": "INTELLIGENT_TIERING"
        }
      ]
    }
  ]
}
EOF

aws s3api put-bucket-lifecycle-configuration \
  --bucket "$BUCKET_NAME" \
  --lifecycle-configuration file://lifecycle-rules.json

# Add tags
aws s3api put-bucket-tagging \
  --bucket "$BUCKET_NAME" \
  --tagging 'TagSet=[{Key=ManagedBy,Value=DigiUsher},{Key=Purpose,Value=CostAndUsageReports}]'

Step 2: Add the Bucket Policy

The bucket policy allows AWS billing services to write CUR data to your bucket.

  1. Go to S3 → select your bucket → Permissions tab
  2. Under Bucket policy, click Edit
  3. Paste the policy below (replace YOUR_BUCKET_NAME and YOUR_ACCOUNT_ID)
  4. Save changes

Save the policy JSON below as bucket-policy.json (with your values substituted), then apply:
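A representative bucket policy, modeled on AWS's documented delivery policy for Cost and Usage Reports and Data Exports; the service principals match those listed in the verification checklist, and YOUR_BUCKET_NAME and YOUR_ACCOUNT_ID must be replaced with your values:

```shell
# Write the bucket policy to a local file. The two Service principals are the
# AWS billing services that deliver CUR 1.0 and CUR 2.0 data respectively; the
# SourceAccount condition restricts delivery to reports from your own account.
cat > bucket-policy.json << 'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowBillingServicesBucketAccess",
      "Effect": "Allow",
      "Principal": {
        "Service": ["billingreports.amazonaws.com", "bcm-data-exports.amazonaws.com"]
      },
      "Action": ["s3:GetBucketAcl", "s3:GetBucketPolicy"],
      "Resource": "arn:aws:s3:::YOUR_BUCKET_NAME",
      "Condition": {
        "StringEquals": { "aws:SourceAccount": "YOUR_ACCOUNT_ID" }
      }
    },
    {
      "Sid": "AllowBillingServicesObjectWrite",
      "Effect": "Allow",
      "Principal": {
        "Service": ["billingreports.amazonaws.com", "bcm-data-exports.amazonaws.com"]
      },
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::YOUR_BUCKET_NAME/*",
      "Condition": {
        "StringEquals": { "aws:SourceAccount": "YOUR_ACCOUNT_ID" }
      }
    }
  ]
}
EOF
```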

aws s3api put-bucket-policy \
  --bucket "$BUCKET_NAME" \
  --policy file://bucket-policy.json

Step 3: Create the CUR Export

Choose either CUR 2.0 (recommended) or CUR 1.0.

CUR 2.0 (console):

  1. Go to Billing and Cost Management → Data Exports
  2. Click Create export
  3. Export type: Standard data export
  4. Export name: DigiUsher_CUR_Export
  5. Select the table COST_AND_USAGE_REPORT
  6. Time granularity: Daily
  7. Include resource IDs: Yes
  8. Include split cost allocation data: Yes
  9. S3 bucket: select your bucket
  10. S3 prefix: reports/cur2
  11. File format: Parquet
  12. Compression: Parquet
  13. Overwrite: Overwrite existing report
  14. Create the export

Or via the CLI:

ACCOUNT_ID=$(aws sts get-caller-identity --query Account --output text)

aws bcm-data-exports create-export \
  --region us-east-1 \
  --export '{
    "Name": "DigiUsher_CUR_Export",
    "Description": "CUR 2.0 export for DigiUsher",
    "DataQuery": {
      "QueryStatement": "SELECT bill_bill_type, bill_billing_entity, bill_billing_period_end_date, bill_billing_period_start_date, bill_invoice_id, bill_invoicing_entity, bill_payer_account_id, bill_payer_account_name, capacity_reservation_capacity_reservation_arn, capacity_reservation_capacity_reservation_status, capacity_reservation_capacity_reservation_type, cost_category, discount, discount_bundled_discount, discount_total_discount, identity_line_item_id, identity_time_interval, line_item_availability_zone, line_item_blended_cost, line_item_blended_rate, line_item_currency_code, line_item_legal_entity, line_item_line_item_description, line_item_line_item_type, line_item_net_unblended_cost, line_item_net_unblended_rate, line_item_normalization_factor, line_item_normalized_usage_amount, line_item_operation, line_item_product_code, line_item_resource_id, line_item_tax_type, line_item_unblended_cost, line_item_unblended_rate, line_item_usage_account_id, line_item_usage_account_name, line_item_usage_amount, line_item_usage_end_date, line_item_usage_start_date, line_item_usage_type, pricing_currency, pricing_lease_contract_length, pricing_offering_class, pricing_public_on_demand_cost, pricing_public_on_demand_rate, pricing_purchase_option, pricing_rate_code, pricing_rate_id, pricing_term, pricing_unit, product, product_comment, product_fee_code, product_fee_description, product_from_location, product_from_location_type, product_from_region_code, product_instance_family, product_instance_type, product_instancesku, product_location, product_location_type, product_operation, product_pricing_unit, product_product_family, product_region_code, product_servicecode, product_sku, product_to_location, product_to_location_type, product_to_region_code, product_usagetype, reservation_amortized_upfront_cost_for_usage, reservation_amortized_upfront_fee_for_billing_period, reservation_availability_zone, reservation_effective_cost, reservation_end_time, 
reservation_modification_status, reservation_net_amortized_upfront_cost_for_usage, reservation_net_amortized_upfront_fee_for_billing_period, reservation_net_effective_cost, reservation_net_recurring_fee_for_usage, reservation_net_unused_amortized_upfront_fee_for_billing_period, reservation_net_unused_recurring_fee, reservation_net_upfront_value, reservation_normalized_units_per_reservation, reservation_number_of_reservations, reservation_recurring_fee_for_usage, reservation_reservation_a_r_n, reservation_start_time, reservation_subscription_id, reservation_total_reserved_normalized_units, reservation_total_reserved_units, reservation_units_per_reservation, reservation_unused_amortized_upfront_fee_for_billing_period, reservation_unused_normalized_unit_quantity, reservation_unused_quantity, reservation_unused_recurring_fee, reservation_upfront_value, savings_plan_amortized_upfront_commitment_for_billing_period, savings_plan_end_time, savings_plan_instance_type_family, savings_plan_net_amortized_upfront_commitment_for_billing_period, savings_plan_net_recurring_commitment_for_billing_period, savings_plan_net_savings_plan_effective_cost, savings_plan_offering_type, savings_plan_payment_option, savings_plan_purchase_term, savings_plan_recurring_commitment_for_billing_period, savings_plan_region, savings_plan_savings_plan_a_r_n, savings_plan_savings_plan_effective_cost, savings_plan_savings_plan_rate, savings_plan_start_time, savings_plan_total_commitment_to_date, savings_plan_used_commitment, split_line_item_actual_usage, split_line_item_net_split_cost, split_line_item_net_unused_cost, split_line_item_parent_resource_id, split_line_item_public_on_demand_split_cost, split_line_item_public_on_demand_unused_cost, split_line_item_reserved_usage, split_line_item_split_cost, split_line_item_split_usage, split_line_item_split_usage_ratio, split_line_item_unused_cost, tags FROM COST_AND_USAGE_REPORT",
      "TableConfigurations": {
        "COST_AND_USAGE_REPORT": {
          "TIME_GRANULARITY": "DAILY",
          "INCLUDE_RESOURCES": "TRUE",
          "INCLUDE_MANUAL_DISCOUNT_COMPATIBILITY": "FALSE",
          "INCLUDE_SPLIT_COST_ALLOCATION_DATA": "TRUE",
          "INCLUDE_CAPACITY_RESERVATION_DATA": "TRUE"
        }
      }
    },
    "RefreshCadence": {
      "Frequency": "SYNCHRONOUS"
    },
    "DestinationConfigurations": {
      "S3Destination": {
        "S3Bucket": "'"$BUCKET_NAME"'",
        "S3Prefix": "reports/cur2",
        "S3Region": "us-east-1",
        "S3OutputConfigurations": {
          "Overwrite": "OVERWRITE_REPORT",
          "Format": "PARQUET",
          "Compression": "PARQUET",
          "OutputType": "CUSTOM"
        }
      }
    }
  }' \
  --resource-tags Key=ManagedBy,Value=DigiUsher

CUR 1.0 (console):

  1. Go to Billing and Cost Management → Cost & Usage Reports
  2. Click Create report
  3. Report name: DigiUsher_CUR_Export
  4. Include resource IDs: checked
  5. Split cost allocation data: checked
  6. Time unit: Daily
  7. S3 bucket: select your bucket
  8. S3 prefix: reports/cur1
  9. Report versioning: Overwrite existing report
  10. Compression: Parquet
  11. Create the report

Or via the CLI:

aws cur put-report-definition \
  --region us-east-1 \
  --report-definition '{
    "ReportName": "DigiUsher_CUR_Export",
    "TimeUnit": "DAILY",
    "Format": "Parquet",
    "Compression": "Parquet",
    "AdditionalSchemaElements": ["RESOURCES", "SPLIT_COST_ALLOCATION_DATA"],
    "S3Bucket": "'"$BUCKET_NAME"'",
    "S3Prefix": "reports/cur1",
    "S3Region": "us-east-1",
    "RefreshClosedReports": true,
    "ReportVersioning": "OVERWRITE_REPORT",
    "AdditionalArtifacts": []
  }'

Step 4: Create the IAM Role

Follow the same instructions as Linked Account — Step 1 and Step 2 above.

Step 5: Attach the S3 CUR Access Policy

The root account role needs an additional policy to read from the CUR bucket.

  1. Go to IAM → Roles → select your DigiUsher role
  2. Click Add permissions → Create inline policy
  3. Switch to the JSON tab, paste the policy below
  4. Name it DigiUsherS3CURAccess

Or via the CLI:

cat > digiusher-s3-cur-policy.json << EOF
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "DigiUsherCURPermissions",
      "Effect": "Allow",
      "Action": [
        "s3:GetObject",
        "s3:ListBucket"
      ],
      "Resource": [
        "arn:aws:s3:::$BUCKET_NAME",
        "arn:aws:s3:::$BUCKET_NAME/*"
      ]
    }
  ]
}
EOF

aws iam put-role-policy \
  --role-name DigiUsher-ReadAccess \
  --policy-name DigiUsherS3CURAccess \
  --policy-document file://digiusher-s3-cur-policy.json

Step 6: Note the Role ARN

Follow Linked Account — Step 3 to retrieve and share the Role ARN with DigiUsher.

Scenario 2: Existing Bucket + New CUR

Use this if you already have an S3 bucket with the appropriate billing service permissions.

  1. Verify your bucket policy — ensure it includes the billing service permissions shown in New Bucket + New CUR — Step 2. The bucket must be in us-east-1.
  2. Create the CUR export — follow New Bucket + New CUR — Step 3, pointing to your existing bucket.
  3. Create the IAM role — follow Linked Account Steps 1-2.
  4. Attach the S3 CUR access policy — follow New Bucket + New CUR — Step 5, using your existing bucket name.
  5. Share the Role ARN with DigiUsher.

Scenario 3: Existing CUR

Use this if you already have a CUR configured and just need the DigiUsher IAM role.

  1. Create the IAM role — follow Linked Account Steps 1-2.
  2. Attach the S3 CUR access policy — follow New Bucket + New CUR — Step 5, using your existing bucket name.
  3. Share the Role ARN with DigiUsher, along with the existing bucket name and CUR export name.

Optional Permissions

These permissions are disabled by default and can be added as separate inline policies on the DigiUsher IAM role if needed.


Carbon Footprint Export (Optional)

If you want to track your AWS carbon emissions, create an additional data export alongside your CUR.

  1. Go to Billing and Cost Management → Data Exports
  2. Click Create export
  3. Export type: Standard data export
  4. Export name: DigiUsher_Carbon_Export
  5. Select the table CARBON_EMISSIONS
  6. S3 bucket: the same bucket used for your CUR
  7. S3 prefix: reports/carbon-footprint
  8. File format: Parquet
  9. Compression: Parquet
  10. Overwrite: Overwrite existing report
  11. Create the export

Or via the CLI:

aws bcm-data-exports create-export \
  --region us-east-1 \
  --export '{
    "Name": "DigiUsher_Carbon_Export",
    "Description": "Carbon Footprint export for DigiUsher",
    "DataQuery": {
      "QueryStatement": "SELECT last_refresh_timestamp, location, model_version, payer_account_id, product_code, region_code, total_lbm_emissions_unit, total_lbm_emissions_value, total_mbm_emissions_unit, total_mbm_emissions_value, total_scope_1_emissions_unit, total_scope_1_emissions_value, total_scope_2_lbm_emissions_unit, total_scope_2_lbm_emissions_value, total_scope_2_mbm_emissions_unit, total_scope_2_mbm_emissions_value, total_scope_3_lbm_emissions_unit, total_scope_3_lbm_emissions_value, total_scope_3_mbm_emissions_unit, total_scope_3_mbm_emissions_value, usage_account_id, usage_period_end, usage_period_start FROM CARBON_EMISSIONS",
      "TableConfigurations": {}
    },
    "RefreshCadence": {
      "Frequency": "SYNCHRONOUS"
    },
    "DestinationConfigurations": {
      "S3Destination": {
        "S3Bucket": "'"$BUCKET_NAME"'",
        "S3Prefix": "reports/carbon-footprint",
        "S3Region": "us-east-1",
        "S3OutputConfigurations": {
          "Overwrite": "OVERWRITE_REPORT",
          "Format": "PARQUET",
          "Compression": "PARQUET",
          "OutputType": "CUSTOM"
        }
      }
    }
  }' \
  --resource-tags Key=ManagedBy,Value=DigiUsher

Connect in DigiUsher

After completing the setup, enter the following into the DigiUsher platform to complete the connection:

  • Role ARN: IAM → Roles → DigiUsher-ReadAccess → ARN
  • Bucket Name (root only): the S3 bucket name you used for CUR data
  • CUR Export Name (root only): the name of your CUR export (e.g., DigiUsher_CUR_Export)
  • CUR Version (root only): CUR1 or CUR2

Security Note

Never share your ExternalId publicly. DigiUsher already has it from the account connection process.

For automated deployment, we recommend the CloudFormation template, which handles all of the above in a single deployment.


Verification Checklist

Verify via CLI

# Check the role exists and has the correct trust policy
aws iam get-role --role-name DigiUsher-ReadAccess

# List attached inline policies
aws iam list-role-policies --role-name DigiUsher-ReadAccess

# For root accounts — verify the CUR export:
# CUR 2.0
aws bcm-data-exports list-exports --region us-east-1
# CUR 1.0
aws cur describe-report-definitions --region us-east-1

All Accounts

  • Obtained ExternalId from DigiUsher platform
  • *.digiusher.com allowlisted in network/firewall (if applicable)
  • digiusher.com allowlisted for incoming email (if applicable)
  • IAM role created with trust policy
  • ExternalId configured in trust policy
  • DigiUsherCorePermissions inline policy attached
  • Role ARN shared with DigiUsher

Root/Payer Accounts Only

  • S3 bucket created in us-east-1 (or existing bucket verified)
  • Bucket policy grants access to billingreports.amazonaws.com and bcm-data-exports.amazonaws.com
  • Public access blocked, versioning enabled
  • CUR export created (CUR 1.0 or CUR 2.0)
  • DigiUsherS3CURAccess inline policy attached with correct bucket name
  • Bucket name and CUR export name shared with DigiUsher

Optional (if enabled)

  • EC2 Start/Stop policy attached (DigiUsherEC2StartStopPolicy)
  • Commitment Purchase policy attached (DigiUsherCommitmentPurchasePolicy)
  • Tag Management policy attached (DigiUsherTagManagementPolicy)
  • Automation policy attached (DigiUsherAutomationPolicy)
  • Carbon Footprint export created

Security

What DigiUsher CAN Access (Read-Only)

  • Cost and usage data via S3 (CUR exports)
  • Resource metadata (names, types, regions, tags) across compute, database, storage, networking, and security services
  • Utilization metrics via CloudWatch
  • Optimization recommendations via Compute Optimizer and Trusted Advisor
  • Reserved Instance and Savings Plan information
  • Organization, account, and OU hierarchy

The core policy includes compute-optimizer:UpdateEnrollmentStatus. This is a one-time, non-destructive enrollment that enables Compute Optimizer recommendations; it does not create, modify, or delete your resources. You can omit this permission if desired; however, without it DigiUsher cannot provide Compute Optimizer-based rightsizing recommendations.

What DigiUsher CANNOT Do

  • Create, modify, or delete any AWS resources (unless optional write permissions are explicitly enabled)
  • Access application data, databases, or storage contents (beyond CUR data)
  • Modify IAM policies or permissions
  • Read secrets, credentials, or encryption keys
  • Access network traffic or logs content
  • Make purchases or modify billing settings (unless optional commitment purchase permissions are enabled)

Monitoring

Monitor role assumption activity in CloudTrail, filtering by AssumeRole events for the DigiUsher-ReadAccess principal.
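A sketch of such a check from the CLI; the lookup window and the grep-based filtering are illustrative:

```shell
# List recent AssumeRole events and surface those involving the DigiUsher role.
# CloudTrail event history covers the last 90 days by default.
aws cloudtrail lookup-events \
  --lookup-attributes AttributeKey=EventName,AttributeValue=AssumeRole \
  --max-results 50 \
  --query 'Events[].CloudTrailEvent' \
  --output text | grep DigiUsher-ReadAccess
```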

Credential Rotation

  • CloudFormation: Update the stack to rotate the ExternalId if needed.
  • Manual: Update the trust policy on the IAM role with a new ExternalId, then update the ExternalId in the DigiUsher platform.
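For the manual path, the trust policy update can be applied from the CLI; this assumes digiusher-trust-policy.json (from the role creation step) has been edited to contain the new ExternalId:

```shell
# Replace the role's trust policy with one containing the new ExternalId.
aws iam update-assume-role-policy \
  --role-name DigiUsher-ReadAccess \
  --policy-document file://digiusher-trust-policy.json
```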

Revocation

  • CloudFormation: Delete the stack. This removes the IAM role, all policies, and the S3 bucket (if it was created by the template).
  • Manual: Delete the DigiUsher-ReadAccess role in IAM → Roles. This immediately revokes all access.
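Manual revocation from the CLI looks like the following; IAM requires inline policies to be deleted before the role itself, and the policy names below match those used earlier in this guide (adjust if you named them differently):

```shell
# Remove inline policies first (the S3 policy exists only on root-account roles),
# then delete the role, which immediately revokes DigiUsher's access.
aws iam delete-role-policy --role-name DigiUsher-ReadAccess --policy-name DigiUsherCorePermissions
aws iam delete-role-policy --role-name DigiUsher-ReadAccess --policy-name DigiUsherS3CURAccess
aws iam delete-role --role-name DigiUsher-ReadAccess
```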


Troubleshooting

"Access Denied" when assuming the role

  1. Verify the trust policy has the correct DigiUsher account ID (058264546051)
  2. Confirm the ExternalId matches what's configured in the DigiUsher platform
  3. Ensure the role ARN was copied correctly (no extra spaces or characters)

CUR data not appearing

  1. CUR exports can take up to 24 hours to deliver the first report
  2. Verify the export exists: Billing > Data Exports (CUR 2.0) or Cost & Usage Reports (CUR 1.0)
  3. Confirm the S3 bucket is in us-east-1
  4. Check that the bucket policy allows billingreports.amazonaws.com and bcm-data-exports.amazonaws.com

Missing Compute Optimizer recommendations

Compute Optimizer requires enrollment. The compute-optimizer:UpdateEnrollmentStatus permission in the core policy allows DigiUsher to opt in. Recommendations may take 12-24 hours to appear after enrollment.
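You can confirm the enrollment state from the CLI:

```shell
# Returns the account's Compute Optimizer status ("Active" once enrolled).
aws compute-optimizer get-enrollment-status --region us-east-1
```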


Need Help?

If you encounter any issues not covered above, contact us at support@digiusher.com and we'll help you get set up.