matplotlib quiver arrows point in the wrong directions?
I'm using matplotlib.pyplot.quiver to plot a vector field. My data are the coordinates of the start and end of each vector: A(xa, ya) is where an arrow starts and B(xb, yb) is where it should end. The A points do sit at the tails of my arrows, but the B points are not at the tips; instead they look scattered like random points. Is it because of some autoscaling, or something else?
import numpy
import matplotlib.pyplot
import pandas

# columns: tail coordinates (xa, ya) and tip coordinates (xb, yb)
data = pandas.read_csv(filename, header=None, sep=',', names=['xa', 'ya', 'xb', 'yb'])
xa = data['xa']
ya = data['ya']
xb = data['xb']
yb = data['yb']

# vector components from A to B
u = xb - xa
v = yb - ya
orientation = numpy.arctan2(v, u)

# the fifth positional argument is taken as the color array C
matplotlib.pyplot.quiver(xa, ya, u, v, orientation)
And what I get is ok.
But when I add
matplotlib.pyplot.scatter(xa, ya, color='blue')
matplotlib.pyplot.scatter(xb, yb, color='red')
[Image: example of what I get, here with a random data set]
Can someone help me understand why the B points are not where they should be? It's a bit confusing to me. I also tested with random values and always got the same result, so I'm wondering whether my method is wrong or whether there is something I can do to fix it. Thanks.
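In case it helps, here is a self-contained version with random data instead of my CSV (the arrays below are just made up for illustration). The quiver call here uses angles='xy', scale_units='xy', scale=1, which, as far as I understand from the docs, draws u and v directly in data coordinates without rescaling; I'd still like to understand whether the autoscaling really is the cause:

import numpy
import matplotlib.pyplot

# hypothetical stand-in for the CSV columns: random tails A and tips B
numpy.random.seed(0)
xa = numpy.random.rand(10)
ya = numpy.random.rand(10)
xb = xa + numpy.random.rand(10) - 0.5
yb = ya + numpy.random.rand(10) - 0.5

u = xb - xa
v = yb - ya

# draw each arrow from (xa, ya) with components (u, v) in data coordinates,
# i.e. with no automatic rescaling of arrow lengths
matplotlib.pyplot.quiver(xa, ya, u, v, angles='xy', scale_units='xy', scale=1)
matplotlib.pyplot.scatter(xa, ya, color='blue')
matplotlib.pyplot.scatter(xb, yb, color='red')
matplotlib.pyplot.show()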