[Not sure if this information is completely what you're looking for, but it certainly is relevant. Please give a more specific problem statement or a simple worked example and I'll be happy to expand/refine my answer.]
The angle between vectors $\vec{x}$ and $\vec{y}$ is defined using the dot product like so:
$$ \cos(\theta) = \frac{\vec{x}\cdot \vec{y}}{\|\vec{x}\| \ \|\vec{y}\|}$$ where the expression $\|\vec{a}\| = \sqrt{a_1^2 + a_2^2 + a_3^2}$ is the magnitude/norm of a vector. The magnitude of a vector in 3D space is just the square root of the sum of the squares of the
components of that vector.
By using the inverse cosine function, you can determine the angle between the vectors. You'll have to pay attention to the sign of the dot product to determine if the resulting angle is acute (positive dot product), perpendicular (zero dot product), or obtuse (negative dot product).
Answer from Xoque55 on Stack Exchange: Angle between two 3D Vectors
How do I calculate the angle between two vectors in 3D?
To calculate the angle between two vectors in a 3D space:
- Find the dot product of the vectors.
- Divide the dot product by the magnitude of the first vector.
- Divide the resultant by the magnitude of the second vector.
Mathematically, angle α between two vectors [xa, ya, za] and [xb, yb, zb] can be written as:
α = arccos[(xa xb + ya yb + za zb) / (√(xa² + ya² + za²) × √(xb² + yb² + zb²) )].
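The steps above can be sketched in Python (a minimal example; the function name and clamping guard are my own additions):

```python
import math

def angle_between_3d(a, b):
    """Unsigned angle between two 3D vectors, in radians, via the arccos formula."""
    dot = a[0]*b[0] + a[1]*b[1] + a[2]*b[2]
    mag_a = math.sqrt(a[0]**2 + a[1]**2 + a[2]**2)
    mag_b = math.sqrt(b[0]**2 + b[1]**2 + b[2]**2)
    # Clamp the cosine to [-1, 1] so rounding error can't push acos out of its domain.
    cos_alpha = max(-1.0, min(1.0, dot / (mag_a * mag_b)))
    return math.acos(cos_alpha)
```

For example, `angle_between_3d((1, 0, 0), (0, 1, 0))` gives π/2 (90°).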
How do I calculate the angle between two vectors in 2D?
To calculate the angle between two vectors in a 2D space:
- Find the dot product of the vectors.
- Divide the dot product by the magnitude of the first vector.
- Divide the resultant by the magnitude of the second vector.
Mathematically, angle α between two vectors [xa, ya] and [xb, yb] can be written as:
α = arccos[(xa xb + ya yb) / (√(xa² + ya²) × √(xb² + yb²))].
How to define the angle formed by two vectors?
The angle formed between two vectors is defined as the inverse cosine of the ratio of the dot product of the two vectors to the product of their magnitudes.
I am trying to find the angle between two, three-dimensional lines in ArcPro that meet at a point in 3D space. The two vector lines are known, and the point at which they meet is known.
Any advice on how to accomplish this?
atan2(crossproduct.length,scalarproduct)
The reason for using atan2 instead of arccos or arcsin is accuracy. arccos behaves very badly close to 0 degrees: small computation errors in the argument lead to disproportionately big errors in the result. arcsin has the same problem close to 90 degrees.
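A sketch of this atan2 formulation in Python (vectors assumed to be plain 3-tuples; names are mine):

```python
import math

def angle_atan2(a, b):
    """Unsigned angle between two 3D vectors via atan2(|a x b|, a . b)."""
    # Cross product components
    cx = a[1]*b[2] - a[2]*b[1]
    cy = a[2]*b[0] - a[0]*b[2]
    cz = a[0]*b[1] - a[1]*b[0]
    cross_len = math.sqrt(cx*cx + cy*cy + cz*cz)
    dot = a[0]*b[0] + a[1]*b[1] + a[2]*b[2]
    return math.atan2(cross_len, dot)  # result in [0, pi]
```

Because atan2 takes both the sine-like and cosine-like parts, it stays accurate near 0° and 90° where acos and asin respectively lose precision.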
Computing the altitude angle
OK, it might be I finally understood your comment below about the result being independent of the y angle, and about how it relates to the two vectors. It seems you are not really interested in two vectors and the angle between these two, but instead you're interested in the difference vector and the angle that one forms against the horizontal plane. In a horizontal coordinate system (often used in astronomy), that angle would be called “altitude” or “elevation”, as opposed to the “azimuth” you compute with the formula in your (edited) question. “altitude” closely relates to the “tilt” of your camera, whereas “azimuth” relates to “panning”.
We still have a 2D problem. One coordinate of the 2D vector is the y coordinate of the difference vector. The other coordinate is the length of the vector after projecting it on the horizontal plane, i.e. sqrt(x*x + z*z). The final solution would be
x = A.x - B.x
y = A.y - B.y
z = A.z - B.z
alt = toDegrees(atan2(y, sqrt(x*x + z*z)))
az = toDegrees(atan2(-x, -z))
The order (A - B as opposed to B - A) was chosen such that “A above B” yields a positive y and therefore a positive altitude, in accordance with your comment below. The minus signs in the azimuth computation above should replace the + 180 in the code from your question, except that the range now is [-180, 180] instead of your [0, 360]. Just to give you an alternative, choose whichever you prefer. In effect you compute the azimuth of B - A either way. The fact that you use a different order for these two angles might be somewhat confusing, so think about whether this really is what you want, or whether you want to reverse the sign of the altitude or change the azimuth by 180°.
Orthogonal projection
For reference, I'll include my original answer below, for those who are actually looking for the angle of rotation around some fixed x axis, the way the original question suggested.
If this x angle you mention in your question is indeed the angle of rotation around the x axis, as the camera example suggests, then you might want to think about it this way: set the x coordinate to zero, and you will end up with 2D vectors in the y-z plane. You can think of this as an orthogonal projection onto said plane. Now you are back to a 2D problem and can tackle it there.
Personally I'd simply call atan2 twice, once for each vector, and subtract the resulting angles:
toDegrees(atan2(A.z, A.y) - atan2(B.z, B.y))
The x=0 is implicit in the above formula simply because I only operate on y and z.
I haven't fully understood the logic behind your single atan2 call yet, but the fact that I have to think about it this long indicates that I wouldn't want to maintain it, at least not without a good explanatory comment.
I hope I understood your question correctly, and this is the thing you're looking for.
The solution I'm currently using seems to be missing here.
Assuming that the plane normal is normalized (|Vn| == 1), the signed angle is simply:
For the right-handed rotation from Va to Vb:
atan2((Va x Vb) . Vn, Va . Vb)
For the left-handed rotation from Va to Vb:
atan2((Vb x Va) . Vn, Va . Vb)
which returns an angle in the range [-PI, +PI] (or whatever the available atan2 implementation returns).
. and x are the dot and cross product respectively.
No explicit branching and no division/vector length calculation is necessary.
Explanation for why this works: let alpha be the direct angle between the vectors (0° to 180°) and beta the angle we are looking for (0° to 360°) with beta == alpha or beta == 360° - alpha
Va . Vb == |Va| * |Vb| * cos(alpha) (by definition)
== |Va| * |Vb| * cos(beta) (cos(alpha) == cos(-alpha) == cos(360° - alpha))
Va x Vb == |Va| * |Vb| * sin(alpha) * n1
(by definition; n1 is a unit vector perpendicular to Va and Vb with
orientation matching the right-hand rule)
Therefore (again assuming Vn is normalized):
n1 . Vn == 1 when beta < 180
n1 . Vn == -1 when beta > 180
==> (Va x Vb) . Vn == |Va| * |Vb| * sin(beta)
Finally
tan(beta) = sin(beta) / cos(beta) == ((Va x Vb) . Vn) / (Va . Vb)
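A sketch of this signed-angle formula in Python (Vn assumed to be a unit normal, as above; helper names are mine):

```python
import math

def cross(a, b):
    """Cross product of two 3D vectors."""
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def dot(a, b):
    """Dot product of two 3D vectors."""
    return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

def signed_angle(va, vb, vn):
    """Right-handed signed angle from va to vb around unit normal vn, in (-pi, pi]."""
    return math.atan2(dot(cross(va, vb), vn), dot(va, vb))
```

As the answer notes, there is no branching, no division, and no explicit vector-length computation.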
Use cross product of the two vectors to get the normal of the plane formed by the two vectors. Then check the dotproduct between that and the original plane normal to see if they are facing the same direction.
angle = acos(dotProduct(Va.normalize(), Vb.normalize()));
cross = crossProduct(Va, Vb);
if (dotProduct(Vn, cross) < 0) { // Or > 0
    angle = -angle;
}
Use the inner product of vectors. The relation is $\cos \theta = \dfrac{a \cdot b}{\|a\|\,\|b\|}$, with $\operatorname{norm}(v) = \|v\| = \sqrt{v \cdot v}$.
Take the two vectors $a$ and $b$. Compute the inner product $a \cdot b$, and compute the norms $\|a\|$ and $\|b\|$. So the angle becomes
$$\theta = \arccos \frac{a \cdot b}{\|a\|\,\|b\|}.$$
Find components of each vector and substitute.
This is not an answer to the stated question, but more like an extended comment; specifically, that there is an even easier way to find out $d$, the minimum distance between the line segment $\mathbf{p}_1 \mathbf{p}_2$ and point $\mathbf{c}$.
The question implies that OP is interested in whether a spherical shell, let's say of radius $r$, centered at $\mathbf{c}$, intersects the line segment between $\mathbf{p}_1$ and $\mathbf{p}_2$.
If $\lVert \mathbf{p}_1 - \mathbf{c} \rVert \le r$ and $\lVert \mathbf{p}_2 - \mathbf{c} \rVert \le r$, then both points $\mathbf{p}_1$ and $\mathbf{p}_2$ are within the spherical shell. Because the shell is convex, the line segment between $\mathbf{p}_1$ and $\mathbf{p}_2$ is also completely inside the spherical shell (and therefore there is no intersection per se).
In practice, the checks are better written with squared distances, i.e. $\lVert \mathbf{p}_1 - \mathbf{c} \rVert^2 \le r^2$ and $\lVert \mathbf{p}_2 - \mathbf{c} \rVert^2 \le r^2$.
Otherwise,
If $\lVert \mathbf{p}_1 - \mathbf{c} \rVert^2 \le r^2$, then point $\mathbf{p}_1$ is within the spherical shell. Because $\mathbf{p}_2$ isn't, the line segment between $\mathbf{p}_1$ and $\mathbf{p}_2$ must pass through the spherical shell.
Otherwise,
If $\lVert \mathbf{p}_2 - \mathbf{c} \rVert^2 \le r^2$, then point $\mathbf{p}_2$ is within the spherical shell. Because $\mathbf{p}_1$ isn't, the line segment between $\mathbf{p}_1$ and $\mathbf{p}_2$ must pass through the spherical shell.
Otherwise,
Calculate the distance squared $d^2$ between point $\mathbf{c}$ and the line that passes through points $\mathbf{p}_1$ and $\mathbf{p}_2$ (using e.g. Point-Line distance from Wolfram MathWorld), i.e.
$$d^2 = \frac{\lVert (\mathbf{c} - \mathbf{p}_1) \times (\mathbf{c} - \mathbf{p}_2) \rVert^2}{\lVert \mathbf{p}_2 - \mathbf{p}_1 \rVert^2}.$$
Now, if $d^2 > r^2$, the entire line passing through $\mathbf{p}_1$ and $\mathbf{p}_2$ is outside the spherical shell, and there cannot be any intersection.
Otherwise,
Calculate the relative position $t$ of the point closest to $\mathbf{c}$ on the line passing through $\mathbf{p}_1$ and $\mathbf{p}_2$:
$$t = \frac{(\mathbf{c} - \mathbf{p}_1) \cdot (\mathbf{p}_2 - \mathbf{p}_1)}{\lVert \mathbf{p}_2 - \mathbf{p}_1 \rVert^2}.$$
If $0 \le t \le 1$, then the closest point to $\mathbf{c}$ on the line is between $\mathbf{p}_1$ and $\mathbf{p}_2$, and it is either inside ($d < r$) or on ($d = r$) the spherical shell.
This test can also be written without the division as $0 \le (\mathbf{c} - \mathbf{p}_1) \cdot (\mathbf{p}_2 - \mathbf{p}_1) \le \lVert \mathbf{p}_2 - \mathbf{p}_1 \rVert^2$.
Otherwise, there is no intersection. (The line that passes through $\mathbf{p}_1$ and $\mathbf{p}_2$ does intersect the $r$-radius spherical shell centered at $\mathbf{c}$, but the intersections do not occur in the segment between $\mathbf{p}_1$ and $\mathbf{p}_2$.)
If I counted right, the maximum cost of the above tests, total, is 25 multiplications, one division, and 46 additions or subtractions (but much fewer multiplications and additions or subtractions if you use temporary variables so you don't do the same operations repeatedly). On a computer, that is roughly comparable to the work done to evaluate a single trigonometric function; so if you use that as the metric for "easy" (I do), this is definitely "easier".
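The whole test chain can be sketched in Python (variable names are my own: `p1`, `p2` are the segment endpoints, `c` the shell center, `r` its radius; the function reports whether the segment crosses the shell surface):

```python
def segment_intersects_sphere(p1, p2, c, r):
    """True if the segment p1-p2 touches or crosses the sphere of radius r at c."""
    def sub(a, b): return (a[0]-b[0], a[1]-b[1], a[2]-b[2])
    def dot(a, b): return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]
    d1 = sub(p1, c)
    d2 = sub(p2, c)
    r2 = r * r
    in1 = dot(d1, d1) <= r2
    in2 = dot(d2, d2) <= r2
    if in1 and in2:
        return False  # both endpoints inside; by convexity, no crossing of the shell
    if in1 or in2:
        return True   # one endpoint inside, one outside: the segment must cross
    seg = sub(p2, p1)
    w = sub(c, p1)
    seg2 = dot(seg, seg)
    t_num = dot(w, seg)
    # Squared distance from c to the infinite line through p1 and p2
    d2_line = dot(w, w) - t_num * t_num / seg2
    if d2_line > r2:
        return False  # the whole line misses the sphere
    # Closest point lies within the segment iff 0 <= t <= 1, i.e. 0 <= t_num <= seg2
    return 0.0 <= t_num <= seg2
```

Note that only the line-distance step needs a division; everything else uses squared distances, matching the operation count discussed above.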
Hint: Calculate
$\displaystyle \cos \theta = \frac{a . b}{|a||b|}.$
What do you get?
$\displaystyle a . b = \left(\frac{\sqrt{3} \sqrt{2}}{3 \times 2}\right) + \left(\frac{\sqrt{3} \sqrt{2}}{3 \times 2}\right) + (0) = \frac{\sqrt{6}}{3} = \sqrt{\frac{2}{3}}$
$\displaystyle |a| .|b| = \left|~\sqrt{\left(\frac{\sqrt{3}}{3}\right)^2 + \left(\frac{\sqrt{3}}{3}\right)^2 + \left(\frac{\sqrt{3}}{3}\right)^2 }~\right|. \left|~\sqrt{\left(\frac{\sqrt{2}}{2}\right)^2 + \left(\frac{\sqrt{2}}{2}\right)^2 +0^2}~\right| = |1|.|1| = 1$
This comes out to: $\displaystyle \cos^{-1} \sqrt{\frac{2}{3}}$
If you're going to check with Wolfram Alpha, you can ask your question somewhat more directly:
angle between (sqrt(3)/3, sqrt(3)/3, sqrt(3)/3) and (sqrt(2)/2, sqrt(2)/2, 0)
You'll see that the answer is $\arccos\left(\sqrt{2/3}\right) \approx 0.61$, which is exactly what you got.
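A quick numerical check of this worked example (Python; the variable names are mine):

```python
import math

a = (math.sqrt(3)/3, math.sqrt(3)/3, math.sqrt(3)/3)
b = (math.sqrt(2)/2, math.sqrt(2)/2, 0.0)
dot = sum(x*y for x, y in zip(a, b))
# Both vectors are unit length, so the angle is simply arccos of the dot product.
theta = math.acos(dot)
print(theta)  # approximately 0.6155, i.e. arccos(sqrt(2/3))
```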
The answer is to provide a reference UP vector:
public static float AngleBetweenThreePoints(Point3D[] points, Vector3D up)
{
    var v1 = points[1] - points[0];
    var v2 = points[2] - points[1];
    var cross = Vector3D.CrossProduct(v1, v2);
    var dot = Vector3D.DotProduct(v1, v2);
    var angle = Math.Atan2(cross.Length, dot);
    var test = Vector3D.DotProduct(up, cross);
    if (test < 0.0) angle = -angle;
    return (float) angle;
}
This came from here: https://stackoverflow.com/a/5190354/181622
Are you looking for this ?
θ_radian = arccos ( (P⋅Q) / (∣P∣∣Q∣) ) with vectors P and Q
θ_radian = θ_degree * π / 180
EDIT: 0–360 range
angle = angle * 360 / (2*Math.PI);
if (angle < 0) angle = angle + 360;
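The radians-to-degrees conversion with a 0–360 wrap can be sketched as (Python; `theta` assumed to be the arccos/atan2 result in radians):

```python
import math

def to_degrees_0_360(theta):
    """Convert an angle in radians to degrees in the range [0, 360)."""
    angle = theta * 360.0 / (2.0 * math.pi)
    if angle < 0:
        angle += 360.0
    return angle
```

For instance, an atan2 result of -π/2 maps to 270°.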
Say I have two vectors:
V1 = { x: 5, y: 13, z: 32 }
V2 = { x: 8, y: -14, z: 0 }
How can I figure out what angle v1 needs to be rotated to look directly at v2?
Put into English: say I knew exactly where I was in space, and exactly where another person was somewhere else in space. Mathematically, how could I figure out what angles to put my finger at to point at them?