Why are task and stage numbers decimal numbers? - Apache Spark

I'm trying to understand an error I get while using Spark (EMR). The stderr of the step contains:

TaskSetManager: Lost task 1435.6 in stage 152.0 (TID 102906) on ip-172-11-47-9.ec2.internal, executor 203: org.apache.hadoop.fs.FileAlreadyExistsException (File already exists:s3://<path>) [duplicate 5]

You can see that the task and stage numbers are decimal numbers, and the digit to the right of the dot isn't always zero. I see this a lot; the line above is just one example. What does the number to the right of the dot indicate?
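For context on the format being asked about: in Spark's TaskSetManager log lines, `task 1435.6 in stage 152.0` is generally read as task index 1435 on attempt 6 (attempts start at 0, so this is the seventh try of that task) within stage 152 on stage attempt 0 (stages get a new attempt number if they are resubmitted, e.g. after a fetch failure). A minimal sketch that pulls these fields out of such a line; the regex and variable names are my own, not part of Spark:

```python
import re

# Pattern for "Lost task X.Y in stage S.A (TID N)" lines, written to match
# the example in this question (hypothetical helper, not a Spark API).
LOST_TASK_RE = re.compile(
    r"Lost task (?P<task>\d+)\.(?P<task_attempt>\d+) "
    r"in stage (?P<stage>\d+)\.(?P<stage_attempt>\d+) \(TID (?P<tid>\d+)\)"
)

line = (
    "TaskSetManager: Lost task 1435.6 in stage 152.0 (TID 102906) "
    "on ip-172-11-47-9.ec2.internal, executor 203: "
    "org.apache.hadoop.fs.FileAlreadyExistsException"
)

m = LOST_TASK_RE.search(line)
# task 1435, attempt 6 -> this is the 7th attempt of task index 1435
print("task index:", m.group("task"), "attempt:", m.group("task_attempt"))
# stage 152, attempt 0 -> first attempt of stage 152
print("stage id:", m.group("stage"), "attempt:", m.group("stage_attempt"))
```

The attempt number matters when debugging: a high value after the dot (like the `.6` here) means the same task has already failed and been retried several times, which fits the `[duplicate 5]` suffix in the quoted error.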



Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow
