What's the simplest way to convert from a single character String to an ASCII value in Swift?
I just want to get the ASCII value of a single char string in Swift. This is how I'm currently doing it:
var singleChar = "a"
println(singleChar.unicodeScalars[singleChar.unicodeScalars.startIndex].value) //prints: 97
This is so ugly though. There must be a simpler way.
Solution 1:[1]
UnicodeScalar("1")!.value // returns 49
Swift 3.1
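Because this String-taking UnicodeScalar initializer is failable (hence the force unwrap above), a safer variant might use optional binding. A minimal sketch:
if let scalar = UnicodeScalar("1") {
    print(scalar.value) // 49
}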
Solution 2:[2]
Now in Xcode 7.1 and Swift 2.1
var singleChar = "a"
singleChar.unicodeScalars.first?.value
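Since first returns an optional, the result can be unwrapped safely. A minimal sketch:
if let asciiValue = singleChar.unicodeScalars.first?.value {
    print(asciiValue) // prints: 97
}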
Solution 3:[3]
You can use NSString's characterAtIndex to accomplish this...
var singleCharString = "a" as NSString
var singleCharValue = singleCharString.characterAtIndex(0)
println("The value of \(singleCharString) is \(singleCharValue)") // The value of a is 97
Solution 4:[4]
Swift 4.2
The easiest way to get the ASCII values from a Swift string is to iterate over its utf8 view:
let str = "Swift string"
for ascii in str.utf8 {
    print(ascii)
}
Output:
83
119
105
102
116
32
115
116
114
105
110
103
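If an array is needed rather than printed values, the UTF-8 view can be collected directly. A minimal sketch (note these are UTF-8 bytes, which coincide with ASCII values only for ASCII characters):
let asciiValues = Array("Swift string".utf8)
print(asciiValues) // [83, 119, 105, 102, 116, 32, 115, 116, 114, 105, 110, 103]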
Solution 5:[5]
The way you're doing it is right. If you don't like the verbosity of the indexing, you can avoid it by cycling through the unicode scalars:
var x : UInt32 = 0
let char = "a"
for sc in char.unicodeScalars {x = sc.value; break}
You can actually omit the break in this case, of course, since there is only one unicode scalar.
Or, convert to an Array and use Int indexing (the last resort of the desperate):
let char = "a"
let x = Array(char.unicodeScalars)[0].value
Solution 6:[6]
A slightly shorter way of doing this could be:
first(singleChar.unicodeScalars)!.value
As with the subscript version, this will crash if your string is actually empty, so if you’re not 100% sure, use the optional:
if let ascii = first(singleChar.unicodeScalars)?.value {
}
Or, if you want to be extra-paranoid,
if let char = first(singleChar.unicodeScalars) where char.isASCII() {
    let ascii = char.value
}
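For reference, in Swift 3 and later the free first() function becomes the .first property and the where clause becomes a comma. A minimal sketch:
if let char = singleChar.unicodeScalars.first, char.isASCII {
    let ascii = char.value
    print(ascii) // 97
}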
Solution 7:[7]
Here's my implementation; it returns an array of the ASCII values.
extension String {
    func asciiValueOfString() -> [UInt32] {
        var retVal = [UInt32]()
        for val in self.unicodeScalars where val.isASCII() {
            retVal.append(UInt32(val))
        }
        return retVal
    }
}
Note: Yes it's Swift 2 compatible.
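A usage sketch for the extension above:
let values = "abc".asciiValueOfString()
print(values) // [97, 98, 99]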
Solution 8:[8]
Swift 4.1
https://oleb.net/blog/2017/11/swift-4-strings/
let flags = "99_problems"
flags.unicodeScalars.map {
    "\(String($0.value, radix: 16, uppercase: true))"
}
Result:
["39", "39", "5F", "70", "72", "6F", "62", "6C", "65", "6D", "73"]
Solution 9:[9]
Swift 4+
Char to ASCII
let ch: Character = "a"
let charVal = String(ch).unicodeScalars
var asciiVal = charVal[charVal.startIndex].value // 97
ASCII to Char
let char = Character(UnicodeScalar(asciiVal)!) // back to "a"
Solution 10:[10]
var singchar = "a" as NSString
print(singchar.character(at: 0))
Swift 3.1
Solution 11:[11]
There's also the UInt8(ascii:) initializer on UInt8, which takes a Unicode.Scalar.
var singleChar = "a"
UInt8(ascii: singleChar.unicodeScalars[singleChar.startIndex])
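Note that UInt8(ascii:) traps if the scalar is outside the ASCII range, so a guarded variant might look like this (a minimal sketch assuming Swift 4 or later):
let scalar = singleChar.unicodeScalars[singleChar.startIndex]
if scalar.isASCII {
    print(UInt8(ascii: scalar)) // 97
}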
Solution 12:[12]
With Swift 5, you can pick one of the following approaches in order to get the ASCII numeric representation of a character.
#1. Using Character's asciiValue property
Character has a property called asciiValue. asciiValue has the following declaration:
var asciiValue: UInt8? { get }
The ASCII encoding value of this character, if it is an ASCII character.
The following Playground sample code shows how to use asciiValue to get the ASCII encoding value of a character:
let character: Character = "a"
print(character.asciiValue) //prints: Optional(97)
let string = "a"
print(string.first?.asciiValue) //prints: Optional(97)
let character: Character = "é"
print(character.asciiValue) //prints: nil
#2. Using Character's isASCII property and Unicode.Scalar's value property
As an alternative, you can check that the first character of a string is an ASCII character (using Character's isASCII property), then get the numeric representation of its first Unicode scalar (using Unicode.Scalar's value property). The Playground sample code below shows how to proceed:
let character: Character = "a"
if character.isASCII, let scalar = character.unicodeScalars.first {
    print(scalar.value)
} else {
    print("Not an ASCII character")
}
/*
prints: 97
*/
let string = "a"
if let character = string.first, character.isASCII, let scalar = character.unicodeScalars.first {
    print(scalar.value)
} else {
    print("Not an ASCII character")
}
/*
prints: 97
*/
let character: Character = "é"
if character.isASCII, let scalar = character.unicodeScalars.first {
    print(scalar.value)
} else {
    print("Not an ASCII character")
}
/*
prints: Not an ASCII character
*/
Solution 13:[13]
Swift 4
print("c".utf8["c".utf8.startIndex])
or
let cu = "c".utf8
print(cu[cu.startIndex])
Both print 99. Works for any ASCII character.
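The UTF-8 view also exposes first, which avoids the index bookkeeping and returns an optional. A minimal sketch:
if let byte = "c".utf8.first {
    print(byte) // 99
}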
Solution 14:[14]
var input = "Swift".map { Character(extendedGraphemeClusterLiteral: $0).asciiValue! }
// [83, 119, 105, 102, 116]
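Since a String's elements are already Characters in Swift 4 and later, the same idea can be written without the explicit literal initializer. A sketch assuming Swift 5, where asciiValue is available (the force unwrap is only safe for ASCII-only strings):
let values = "Swift".map { $0.asciiValue! }
// [83, 119, 105, 102, 116]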
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
