Check to see if a dataframe increases toward a corner
I have numerous dataframes that look something like this:
| first_column | second_column | third_column |
| ------------ | ------------- | ------------ |
| 25 | 20 | 18 |
| 20 | 21 | 18 |
| 17 | 18 | 16 |
I want to see how well they converge to a corner (i.e., how well the values increase as you move toward a corner of the dataframe). I've tried a couple of different methods. For example, I used `df.diff() < 0` to check whether the values decrease as you go down each column, converted the resulting True/False values to integers (1 or 0) and summed them, then did the same thing on the transposed dataframe and added the two sums together, roughly as sketched below. But that doesn't seem to correctly identify the dataframes that best converge to a corner.
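A minimal sketch of the kind of check described above, using the example table; the column names and the `score` variable are only illustrative:

```python
import pandas as pd

# Example dataframe from the table above
df = pd.DataFrame({
    "first_column":  [25, 20, 17],
    "second_column": [20, 21, 18],
    "third_column":  [18, 18, 16],
})

# Count how many cells decrease as you move down each column...
down_decreases = (df.diff() < 0).astype(int).sum().sum()

# ...and how many decrease as you move across each row (via the transpose)
across_decreases = (df.T.diff() < 0).astype(int).sum().sum()

# The combined count is meant to score how strongly the values
# fall away from the top-left corner
score = down_decreases + across_decreases
print(score)
```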
I know this is oddly specific, but is there a good way to check whether the values of a dataframe increase as you move toward one of its corners?