ValueError: invalid literal for int() with base 2: '1.0'

I am using a genetic algorithm to optimize something, and binary codes are used to encode the variables.

When the variables, represented as sequences of binary digits, were decoded to their corresponding values, the following error was raised: ValueError: invalid literal for int() with base 2: '1.0'

Specifically,

child = '1.0'
V = int(child, 2)
# ValueError: invalid literal for int() with base 2: '1.0'

Could you please give me some clues on how to fix this error?



Solution 1:[1]

You can't convert a string containing a decimal point to an integer. You need to remove the decimal point and everything after it before parsing the string as base 2.

child = '1.0'
V = int(child.split('.')[0], 2)
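As a minimal sketch, assuming each chromosome string contains at most one decimal point and the trailing ".0" comes from the bits having been stored as floats at some stage, the truncate-then-parse approach works like this:

```python
# Hypothetical chromosome strings produced by the GA; the trailing
# ".0" suggests the bit strings passed through a float conversion.
children = ['1.0', '101.0', '1101.0']

# Keep only the part before the decimal point, then parse as base 2.
values = [int(child.split('.')[0], 2) for child in children]
print(values)  # [1, 5, 13]
```

If the values really are floats rather than strings, a cleaner long-term fix may be to keep the genes as integer bit strings throughout the algorithm, so no truncation is needed at decode time.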

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution 1: Mark Ransom