How can you verify that a multiple-code-point emoji is supported?

For context, I'm trying to create a mapping from code point strings to emojis, and I need to know whether the system supports each emoji:

("1F9AE") -> "🦮"
("1FAE0") -> "🫠" (iOS 15.4+) / nil (if below 15.4, since it would show as "򪪺")
("1F415-200D-1F9BA") -> "🐕‍🦺"
("1F415-1F9BA") -> nil (since it would normally be "🐕🦺", which isn't a single emoji)

I've gotten this to work for the single-code-point case with:

// Returns the emoji for a single hex code point (e.g. "1F9AE"),
// or nil if the scalar is invalid or not an emoji on this system.
func emoji(for codepoint: String) -> String? {
    guard let int = Int(codepoint, radix: 16),
        let scalar = UnicodeScalar(int),
        scalar.properties.isEmoji
    else { return nil }
    return String(scalar)
}
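
For example, for the single-code-point mappings above:

emoji(for: "1F9AE")  // "🦮"
emoji(for: "200D")   // nil (a lone zero-width joiner is not an emoji)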

However, I can't figure out what the corresponding isEmoji check would be for multiple code points.

// assume I had to make these scalars via a `String`
let scalars = [UnicodeScalar(0x1F415)!, UnicodeScalar(0x200D)!, UnicodeScalar(0x1F9BA)!]
let scalarView = String.UnicodeScalarView(scalars)
// How can I check that this `UnicodeScalarView` is a single, supported emoji, since I can't check `isEmoji`?
print(String(scalarView))

For example, "1FAE0-1F3FD" should be nil, since it's not a single emoji ("🫠🏽"). However, in future versions, the melting face might work with skin variation, in which case it should return that single valid emoji.
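
For completeness, this is roughly how I turn the hyphen-separated input into scalars (just a sketch; the helper name is arbitrary):

func scalars(from codepoints: String) -> [UnicodeScalar]? {
    // "1F415-200D-1F9BA" -> [U+1F415, U+200D, U+1F9BA]
    var result: [UnicodeScalar] = []
    for part in codepoints.split(separator: "-") {
        guard let value = UInt32(part, radix: 16),
              let scalar = UnicodeScalar(value)
        else { return nil }
        result.append(scalar)
    }
    return result
}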



Solution 1:[1]

According to Emoji 14.0's data files, an emoji is either a basic emoji, a keycap sequence, a flag, a modifier sequence, or a ZWJ sequence. In each of those cases, there will be at least one code point in the sequence for which isEmoji returns true, and the sequence will form a single glyph.

So, you should make a string out of the Unicode scalars first:

let scalars = [UnicodeScalar(0x1F415)!, UnicodeScalar(0x200D)!, UnicodeScalar(0x1F9BA)!]
let scalarView = String.UnicodeScalarView(scalars)
let string = String(scalarView)

Then, you can check whether it is a single emoji like this (the Core Text calls require import CoreText):

CTLineGetGlyphCount(CTLineCreateWithAttributedString(
    NSAttributedString(string: string)
)) == 1 &&
string.unicodeScalars.contains { $0.properties.isEmoji }
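
Wrapped up as a self-contained helper (a minimal sketch; the name isSingleEmoji is mine, not an API):

import CoreText
import Foundation

func isSingleEmoji(_ string: String) -> Bool {
    // A single laid-out glyph that contains at least one emoji scalar
    let line = CTLineCreateWithAttributedString(NSAttributedString(string: string))
    return CTLineGetGlyphCount(line) == 1 &&
        string.unicodeScalars.contains { $0.properties.isEmoji }
}

isSingleEmoji("🐕\u{200D}🦺")  // true  (the ZWJ sequence lays out as one glyph)
isSingleEmoji("🐕🦺")          // false (two glyphs)
isSingleEmoji("a")             // false (one glyph, but no emoji scalar)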

Alternatively, since you just want to check whether the emoji can be displayed properly, you can use CTFontGetGlyphsForCharacters to see whether Apple Color Emoji has glyphs for the characters.

import UIKit      // UIFont (toll-free bridged to CTFont)
import CoreText   // CTFontGetGlyphsForCharacters, CTLine APIs

let font = UIFont(name: "AppleColorEmoji", size: 20)! as CTFont
var text = Array(string.utf16)
var glyphs = Array(repeating: 0 as CGGlyph, count: text.count)
// True only if the font maps every UTF-16 code unit to a glyph
// and the whole string still lays out as a single glyph.
let isEmoji = CTFontGetGlyphsForCharacters(font, &text, &glyphs, text.count) &&
    CTLineGetGlyphCount(CTLineCreateWithAttributedString(
        NSAttributedString(string: string)
    )) == 1
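
Putting the pieces together in the shape the question asks for, here is a minimal sketch (this particular emoji(for:) implementation is my own assembly of the checks above, and it assumes an iOS target where Apple Color Emoji is available):

import UIKit
import CoreText

func emoji(for codepoints: String) -> String? {
    // Build the candidate string from hyphen-separated hex code points,
    // e.g. "1F415-200D-1F9BA"
    var scalarView = String.UnicodeScalarView()
    for part in codepoints.split(separator: "-") {
        guard let value = UInt32(part, radix: 16),
              let scalar = UnicodeScalar(value)
        else { return nil }
        scalarView.append(scalar)
    }
    let string = String(scalarView)

    // The sequence must contain at least one emoji scalar
    guard string.unicodeScalars.contains(where: { $0.properties.isEmoji })
    else { return nil }

    // Apple Color Emoji must have a glyph for every UTF-16 code unit
    let font = CTFontCreateWithName("AppleColorEmoji" as CFString, 20, nil)
    var utf16 = Array(string.utf16)
    var glyphs = Array(repeating: 0 as CGGlyph, count: utf16.count)
    guard CTFontGetGlyphsForCharacters(font, &utf16, &glyphs, utf16.count)
    else { return nil }

    // ...and the whole sequence must lay out as a single glyph
    let line = CTLineCreateWithAttributedString(NSAttributedString(string: string))
    guard CTLineGetGlyphCount(line) == 1 else { return nil }

    return string
}

emoji(for: "1F415-200D-1F9BA")  // "🐕‍🦺"
emoji(for: "1F415-1F9BA")       // nil (lays out as two glyphs)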

Note that both of these methods can return false positives (non-emoji characters reported as emojis; a plain digit such as "1" passes the first check, since digit scalars have isEmoji == true), but they will not return false negatives.

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

[1] Source: Stack Overflow