Why does conversion from Data to Int16 give the wrong answer in Swift?

In my application I am trying to convert Data to Int16, but I must be doing something wrong. Here is an example of my problem:

let x = "0000001110000111"

if let number = Int(x, radix: 2) {
    print(number) // This gives 903, which is what I would expect
}

let y = Data(x.utf8)

let convertedData = y.withUnsafeBytes { pointer in
    return pointer.load(as: Int16.self)
}

print(convertedData) // This gives 12336, which is not what I was expecting

let str = String(decoding: y, as: UTF8.self)

print(str) // I wanted to check and make sure I got the correct binary back.
           // This gives 0000001110000111, as I would expect.

What am I doing wrong here?



Solution 1:[1]

It's unclear why you expect the .utf8 view of a string to have anything at all to do with its interpretation as an integer. Those are characters, not bits (or digits): you have a sequence of bytes with values 48 and 49 (the UTF-8 codes for the characters "0" and "1"), which naturally is not going to give you anything like the actual number. To see this, just say

for c in y { print(c) }
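
To make that concrete, here is a minimal sketch (reusing the x and y from the question) showing which bytes actually get loaded and how to parse the digits directly; the numeric comments assume a little-endian platform:

import Foundation

let x = "0000001110000111"
let y = Data(x.utf8)

// The UTF-8 view is just character codes: "0" is 48, "1" is 49.
print(Array(y.prefix(2)))   // [48, 48]

// Loading the first two bytes as a little-endian Int16 gives
// 48 + 48 * 256 = 12336 (0x3030), which is where that value comes from.
let raw = y.withUnsafeBytes { $0.load(as: Int16.self) }
print(raw)                  // 12336

// To interpret the string as binary digits, use the radix initializer instead.
let parsed = Int16(x, radix: 2)
print(parsed as Any)        // Optional(903)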

By the way, the usual way to convert a string to Data is

let y = x.data(using: .utf8)

and the way to convert it back is

let str = String(data: y!, encoding: .utf8)
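
If the goal is to carry the numeric value itself in a Data rather than its textual digits (an assumption about what you ultimately want), one way is to copy the integer's raw bytes and load them back:

import Foundation

// Store the binary representation of the number, not its characters.
let value = Int16(903)
let data = withUnsafeBytes(of: value) { Data($0) }  // two bytes: 0x87, 0x03 on little-endian

// Reading it back recovers the original value.
let restored = data.withUnsafeBytes { $0.load(as: Int16.self) }
print(restored)  // 903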

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

[1] Solution 1 – Source: Stack Overflow