Dictionary x[0:1] only gives me x[0]
While writing a decryptor, I can't solve this one problem. I assign two digits to a letter using a dictionary, but it only assigns one digit:
szyfr={x[0:1]:'a', x[2:3]:'b',x[4:5]:'c', x[6:7]:'d'}
x holds 54 digits read from a file; that's the key. The problem is that when I print this dict, I get:
{'1': 'a', '3': 'b', '9': 'c', '6': 'd', '': ' '}
The first key should have been "14" and the second should have been "37".
Has anyone run into a similar problem before? (Sorry if I'm doing something wrong, but this is my first post here and English is my second language.)
Solution 1:[1]
Python slicing does not include the element at the end index of the slice, so x[0:1] covers only index 0.
You need to change it to szyfr = {x[0:2]: 'a', ...}
to get {'14': 'a', ...} and so on.
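A minimal sketch of the corrected dictionary, using a short hypothetical key string in place of the 54 digits read from the file (the value of x below is made up; only the first two pairs, "14" and "37", come from the question):

```python
# Hypothetical 8-digit key standing in for the 54 digits loaded from the file.
x = "14379662"

# Slices are half-open: x[0:2] takes indices 0 and 1, i.e. "14".
szyfr = {x[0:2]: 'a', x[2:4]: 'b', x[4:6]: 'c', x[6:8]: 'd'}
print(szyfr)  # {'14': 'a', '37': 'b', '96': 'c', '62': 'd'}
```

For the full 54-digit key, the same half-open slicing pattern (x[2*i:2*i+2] for pair i) avoids writing out every slice by hand, though how the remaining letters map onto pairs depends on the asker's key layout.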
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | ROOP AMBER |
