Datadog metric gets rounded to 2 decimal places
I am using the Python package `datadog_api_client` to publish a metric to Datadog. The code looks like:
```python
import datetime

from datadog_api_client.v1 import ApiClient, ApiException, Configuration
from datadog_api_client.v1.api import metrics_api
from datadog_api_client.v1.model.metrics_payload import MetricsPayload
from datadog_api_client.v1.model.point import Point
from datadog_api_client.v1.model.series import Series

configuration = Configuration()  # reads DD_API_KEY / DD_APP_KEY from the environment
metric_tags = ["env:test"]       # example tags

now = datetime.datetime.now().timestamp()
metric_name = "test.metric.count"

with ApiClient(configuration) as api_client:
    # Create an instance of the API class
    api_instance = metrics_api.MetricsApi(api_client)
    body = MetricsPayload(
        series=[
            Series(
                host="test_tool",
                interval=20,
                metric=metric_name,
                points=[
                    Point([now, float(1127023)]),
                ],
                tags=metric_tags,
                type="count",
            ),
        ],
    )
    try:
        # Submit metrics
        api_response = api_instance.submit_metrics(body)
        print(f"Metric: {metric_name} : {api_response}")
    except ApiException as e:
        raise Exception(f"Exception when calling MetricsApi->submit_metrics: {e}") from e
```
When I run this code, Datadog displays the metric value as 1.13M, which is not fully accurate. I have checked a lot of documentation but cannot find any attribute I might be missing that would remove this rounding.

Can anyone please help me get the exact numbers?
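To illustrate what I am seeing: formatting the raw value with a two-decimal SI suffix (which I assume is what the dashboard does purely for display) reproduces exactly the 1.13M I observe. The helper below is hypothetical, just a sketch of that display rounding:

```python
def si_abbreviate(value: float) -> str:
    """Format a number the way a dashboard might: two-decimal SI suffix.

    Hypothetical helper for illustration only; it is not part of
    datadog_api_client.
    """
    for threshold, suffix in ((1e9, "G"), (1e6, "M"), (1e3, "K")):
        if abs(value) >= threshold:
            return f"{value / threshold:.2f}{suffix}"
    return f"{value:.2f}"

print(si_abbreviate(1127023))  # prints "1.13M", matching what the graph shows
```

So the submitted value 1127023 and the displayed 1.13M are consistent with rounding happening at display time rather than at ingestion.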
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow