pandas Windows vs Unix: Calling Index.__init__ with dtype='int' yields either RangeIndex or Int64Index
I have observed a difference in pandas behavior between Windows and Unix.
I created a fresh Python 3.9.12 environment on both Windows and Unix and installed pandas 1.3.4 in each. I observe the following:
Input:
pd.Index(range(3), dtype='int')
Output on Windows:
Int64Index([0, 1, 2], dtype='int64')
Output on Unix:
RangeIndex(start=0, stop=3, step=1)
Summary: Windows and Unix disagree for dtype='int'.
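The disagreement presumably traces back to NumPy, where (on NumPy 1.x at least) the 'int' alias resolves to the platform C long: 64-bit on most Unix builds but 32-bit on 64-bit Windows. A minimal sketch to check this on a given machine (the printed dtype depends on the platform):

```python
import numpy as np

# On NumPy 1.x, 'int' is an alias for the C long type, whose
# width is platform-dependent: typically int64 on Unix and
# int32 on 64-bit Windows.
print(np.dtype('int'))  # int64 on Unix, int32 on Windows

# Fixed-width names are unambiguous everywhere.
print(np.dtype('int64').itemsize)  # always 8 bytes
print(np.dtype('int32').itemsize)  # always 4 bytes
```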
Input:
pd.Index(range(3), dtype='int32')
Output on Windows:
Int64Index([0, 1, 2], dtype='int64')
Output on Unix:
Int64Index([0, 1, 2], dtype='int64')
Summary: Both Windows and Unix return Int64Index for dtype='int32'.
Input:
pd.Index(range(3), dtype='int64')
Output on Windows:
RangeIndex(start=0, stop=3, step=1)
Output on Unix:
RangeIndex(start=0, stop=3, step=1)
Summary: Both Windows and Unix return RangeIndex for dtype='int64'.
My question: what should I write? I want consistent behavior across Windows and Unix. It seems I should avoid 'int' entirely, so should I write 'int32' or 'int64' instead?
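Based on the observations above, one sketch of a platform-independent spelling is to always pass an explicit fixed-width dtype rather than the 'int' alias (the resulting index class shown here is the one reported for pandas 1.3.4 above):

```python
import pandas as pd

# Spelling out a fixed-width dtype sidesteps the platform-dependent
# 'int' alias; per the outputs above, 'int64' behaved identically on
# both Windows and Unix under pandas 1.3.4.
idx = pd.Index(range(3), dtype='int64')
print(idx)  # RangeIndex(start=0, stop=3, step=1) on pandas 1.3.4
```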
Related discussion: Why do Pandas integer `dtypes` not behave the same on Unix and Windows?
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
