Datadog Usage Metering
Use the Datadog integration to collect billable and usage metrics for your organization.
Configure Datadog in Spot Connect
- Sign in to your Datadog account.
- Create an API key in Datadog.
- Copy the API key value.
- Create an application key in Datadog.
- Copy the application key.
- In the Spot console, go to Connect > Settings > Integrations.
- Click Datadog > Add Integration.
- Paste the application key into the Datadog Application Key field.
- Paste the API key into the Datadog API Key field.
- Click Add Instance.
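The keys created above are the same credentials Datadog's public API accepts as the DD-API-KEY and DD-APPLICATION-KEY request headers. As a quick sanity check, you can validate the API key outside the console before adding the instance; the sketch below assumes Python with the requests library, the default datadoghq.com site, and a DD_API_KEY environment variable.

```python
# Optional sanity check before adding the instance: verify the Datadog API key
# against the public key-validation endpoint. DD_API_KEY is a placeholder env var name.
import os
import requests

DD_SITE = "https://api.datadoghq.com"  # adjust if your org is on another Datadog site

resp = requests.get(
    f"{DD_SITE}/api/v1/validate",
    headers={"DD-API-KEY": os.environ["DD_API_KEY"]},
    timeout=30,
)
resp.raise_for_status()
print(resp.json())  # {"valid": true} when the key is accepted
```

The application key is only exercised when you call an endpoint that requires it, such as the usage metering endpoints used by the actions below.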
Integration Action: Hourly Usage by Product Family
- Set up usage metering in Datadog to get hourly usage by product family.
- In the Spot console, select Connect > Workflows.
- Click New Workflow and enter a name for the workflow.
- Select Manual Trigger > Create Workflow.
- In the center panel of the workflow builder, click the Manual Trigger node to open the right panel.
- Add a Loop action, set List of Items to 1, and click Save.
- Add a Datadog Usage Metering action to the Loop action.
- Select the Datadog Instance, Usage Category > Hourly usage by product family, Product Families, and Start Time.
- Click Save Workflow.
Input
Parameter | Description | Required |
---|---|---|
Datadog Instance | The instance added in the integration | Required |
Usage Category | Hourly usage by product family | Required |
Product Families | The list of product families to retrieve usage for | Required
Start Time | Start of the usage period, for example 2024-03-01T06 | Required
End Time | End of the usage period, for example 2024-05-01T06 | Optional
Include Descendants | Include usage from child organizations in the response (true/false) | Optional
S3 Bucket | The S3 bucket in which to store the query result | Optional
S3 Bucket Key | The name of the S3 key under which the result is stored | Optional
Export File Name | Custom name for the exported S3 file | Optional
Output
Parameter | Type | Description |
---|---|---|
execution_status | String | Status of the run, such as S_OK or E_FAIL
output | Map | The Datadog usage API response
s3_url | String | URL where the output is saved
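For reference, this usage category lines up with Datadog's public hourly-usage-by-product-family endpoint (GET /api/v2/usage/hourly_usage). The sketch below is an assumption about the underlying call rather than Spot Connect's implementation; it shows how the inputs above would map onto that endpoint, using Python's requests library, the default datadoghq.com site, and placeholder DD_API_KEY / DD_APP_KEY environment variables.

```python
# Hedged sketch (not Spot Connect's actual code): the Datadog endpoint that
# appears to back this usage category, GET /api/v2/usage/hourly_usage.
import os
import requests

DD_SITE = "https://api.datadoghq.com"  # adjust if your org is on another Datadog site

resp = requests.get(
    f"{DD_SITE}/api/v2/usage/hourly_usage",
    headers={
        "DD-API-KEY": os.environ["DD_API_KEY"],
        "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
    },
    params={
        "filter[timestamp][start]": "2024-03-01T06",     # Start Time
        "filter[timestamp][end]": "2024-05-01T06",       # End Time (optional)
        "filter[product_families]": "infra_hosts,logs",  # Product Families (comma-separated)
        "filter[include_descendants]": "true",           # Include Descendants (optional)
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json())  # hourly usage records, presumably what the workflow returns as `output`
```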
Integration Action: Billable Across Account
- Set up usage metering in Datadog to get billable usage across your account.
- In the Spot console, select Connect > Workflows.
- Click New Workflow and enter a name for the workflow.
- Select Manual Trigger > Create Workflow.
- In the center panel of the workflow builder, click the Manual Trigger node to open the right panel.
- Add a Loop action, set List of Items to 1, and click Save.
- Add a Datadog Usage Metering action to the Loop action.
- Select the Datadog Instance, Usage Category > Billable across account, and Start Month.
- Click Save Workflow.
Input
Parameter | Description | Required |
---|---|---|
Datadog Instance | The instance added in the integration | Required |
Usage Category | Billable across account | Required |
Start Month | Month in which the usage period starts, for example 2024-03 | Required
S3 Bucket | The S3 bucket in which to store the query result | Optional
S3 Bucket Key | The name of the S3 key under which the result is stored | Optional
Export File Name | Custom name for the exported S3 file | Optional
Output
Parameter | Type | Description |
---|---|---|
execution_status | String | Status of the run, such as S_OK or E_FAIL
output | Map | The Datadog usage API response
s3_url | String | URL where the output is saved
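This category lines up with Datadog's billable-summary endpoint (GET /api/v1/usage/billable-summary). As before, the sketch below is an assumption about the underlying call, not Spot Connect's implementation, and uses the same placeholder environment variables.

```python
# Hedged sketch (not Spot Connect's actual code): Datadog's billable usage
# endpoint, GET /api/v1/usage/billable-summary, which this category appears to wrap.
import os
import requests

DD_SITE = "https://api.datadoghq.com"

resp = requests.get(
    f"{DD_SITE}/api/v1/usage/billable-summary",
    headers={
        "DD-API-KEY": os.environ["DD_API_KEY"],
        "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
    },
    params={"month": "2024-03"},  # Start Month, at month precision
    timeout=60,
)
resp.raise_for_status()
print(resp.json())  # billable summary for the month, presumably the workflow's `output`
```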
Integration Action: Usage Across Account
- Set up usage metering in Datadog to get usage across your account.
- In the Spot console, select Connect > Workflows.
- Click New Workflow and enter a name for the workflow.
- Select Manual Trigger > Create Workflow.
- In the center panel of the workflow builder, click the Manual Trigger node to open the right panel.
- Add a Loop action, set List of Items to 1, and click Save.
- Add a Datadog Usage Metering action to the Loop action.
- Select the Datadog Instance, Usage Category > Usage across account, and Start Month.
- Click Save Workflow.
Input
Parameter | Description | Required |
---|---|---|
Datadog Instance | The instance added in the integration | Required |
Usage Category | Usage across account | Required |
Start Month | Month in which the usage period starts, for example 2024-03 | Required
End Month | Month in which the usage period ends, for example 2024-05 | Optional
Include Org Details | Include usage summaries for each sub-organization (true/false) | Optional
S3 Bucket | The S3 bucket in which to store the query result | Optional
S3 Bucket Key | The name of the S3 key under which the result is stored | Optional
Export File Name | Custom name for the exported S3 file | Optional
Output
Parameter | Type | Description |
---|---|---|
execution_status | String | Status of the run, such as S_OK or E_FAIL
output | Map | The Datadog usage API response
s3_url | String | URL where the output is saved
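This category lines up with Datadog's account-wide usage summary endpoint (GET /api/v1/usage/summary). The sketch below is again an assumption about the underlying call, showing how Start Month, End Month, and Include Org Details would map onto its query parameters.

```python
# Hedged sketch (not Spot Connect's actual code): Datadog's account-wide usage
# endpoint, GET /api/v1/usage/summary, which this category appears to wrap.
import os
import requests

DD_SITE = "https://api.datadoghq.com"

resp = requests.get(
    f"{DD_SITE}/api/v1/usage/summary",
    headers={
        "DD-API-KEY": os.environ["DD_API_KEY"],
        "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
    },
    params={
        "start_month": "2024-03",       # Start Month
        "end_month": "2024-05",         # End Month (optional)
        "include_org_details": "true",  # Include Org Details (optional)
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json())  # usage summary, presumably the workflow's `output`
```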
Integration Action: Historical Cost Across Account
- Set up usage metering in Datadog to get historical cost across your account.
- In the Spot console, select Connect > Workflows.
- Click New Workflow and enter a name for the workflow.
- Select Manual Trigger > Create Workflow.
- In the center panel of the workflow builder, click the Manual Trigger node to open the right panel.
- Add a Loop action, set List of Items to 1, and click Save.
- Add a Datadog Usage Metering action to the Loop action.
- Select the Datadog Instance, Usage Category > Historical cost across account, and Start Month.
- Click Save Workflow.
Input
Parameter | Description | Required |
---|---|---|
Datadog Instance | The instance added in the integration | Required |
Usage Category | Historical cost across account | Required |
Start Month | Month in which the usage period starts, for example 2024-03 | Required
End Month | Month in which the usage period ends, for example 2024-05 | Optional
Include Org Details | Include usage summaries for each sub-organization (true/false) | Optional
S3 Bucket | The S3 bucket in which to store the query result | Optional
S3 Bucket Key | The name of the S3 key under which the result is stored | Optional
Export File Name | Custom name for the exported S3 file | Optional
Output
Parameter | Type | Description |
---|---|---|
execution_status | String | Status of the run, such as S_OK or E_FAIL
output | Map | The Datadog usage API response
s3_url | String | URL where the output is saved
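This category lines up with Datadog's historical cost endpoint (GET /api/v2/usage/historical_cost). The sketch below is an assumption about the underlying call; mapping Include Org Details to the endpoint's view=sub-org parameter is also an assumption.

```python
# Hedged sketch (not Spot Connect's actual code): Datadog's historical cost
# endpoint, GET /api/v2/usage/historical_cost, which this category appears to wrap.
import os
import requests

DD_SITE = "https://api.datadoghq.com"

resp = requests.get(
    f"{DD_SITE}/api/v2/usage/historical_cost",
    headers={
        "DD-API-KEY": os.environ["DD_API_KEY"],
        "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
    },
    params={
        "start_month": "2024-03",  # Start Month
        "end_month": "2024-05",    # End Month (optional)
        "view": "sub-org",         # per-suborg detail; use "summary" for totals only
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json())  # historical cost data, presumably the workflow's `output`
```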