Perspective Geometry: Turning 2D Video Into Trustworthy Angles

A plain swing video is full of distortions—perspective geometry helps us correct for them so coaches can trust what they’re seeing frame by frame.

If you’ve ever filmed a swing and thought, “That doesn’t look like what I saw in person,” you’ve already run into perspective geometry. The camera didn’t lie—it just told a different version of the truth.

Every golf swing is a 3D motion happening in space. Your phone camera, on the other hand, captures a 2D projection of that motion. That projection is shaped by where the camera is, how it’s pointed, and the optics of the lens. Change any of those, and the exact same swing can suddenly look “across the line,” “laid off,” “too flat,” or “too upright.”

Perspective geometry is the math we use in BRD to understand that projection and correct for it. The goal is simple: when you draw a line or measure an angle on screen, it should mean the same thing every time, not depend on whoever was holding the phone that day.

What “perspective” really means for golf video

At a high level, you can think of your phone camera as a “pinhole” that projects 3D points in the world onto a 2D image plane. Objects farther away look smaller, parallel lines appear to converge, and anything not square to the camera gets skewed.
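The pinhole model above fits in a few lines of code. This is a minimal sketch of the ideal model, not BRD's implementation; `focal_px` is an illustrative focal length in pixels, and the coordinates are in the camera's own frame:

```python
def project(point_3d, focal_px=1000.0):
    """Project a 3D point (x, y, z) in camera coordinates onto the 2D
    image plane using the ideal pinhole model: u = f*x/z, v = f*y/z."""
    x, y, z = point_3d
    if z <= 0:
        raise ValueError("point must be in front of the camera (z > 0)")
    return (focal_px * x / z, focal_px * y / z)

# The same half-metre offset appears half as large at twice the distance:
near = project((0.5, 0.0, 2.0))   # 2 m from the camera
far  = project((0.5, 0.0, 4.0))   # 4 m from the camera
print(near[0], far[0])            # 250.0 vs 125.0
```

The division by `z` is the whole story: it is why farther objects shrink and why parallel lines converge in the image.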

In a golf context, this shows up as a few familiar visual traps:

  • Plane illusions: A swing that’s perfectly on plane can look flat or steep depending on whether the camera is too far inside, too high, or aimed slightly off target.
  • Joint angle distortion: Hip, spine, and wrist angles can look more or less extreme if the golfer is rotated slightly relative to the camera.
  • Club path “mysteries”: The club can appear to travel left or right simply because the camera isn’t centered on the ball–target line.

Perspective geometry is the toolkit we use to decode all of this: how the camera sees the world, and how to map what’s on screen back to something physically meaningful.

Why raw 2D angles can’t be trusted (by themselves)

If you draw a line along the shaft at the top and another along the shoulder turn, you’ve technically drawn an angle. But that angle is “in image space,” not in the actual 3D space where the swing happened.

Two big effects get in the way:

  • Foreshortening: When something is tilted toward or away from the camera, its apparent length shrinks. A perfectly straight lead arm can look slightly bent if it’s angled toward the lens.
  • Out-of-plane rotation: The swing plane is a tilted 3D plane. If the camera isn’t aligned with that plane, lines that are truly straight or parallel in 3D will appear crooked or converging in 2D.
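The size of the foreshortening effect is easy to estimate. Under a simple orthographic approximation (reasonable when the segment is small relative to its distance from the camera), the projected length scales with the cosine of the out-of-plane tilt. The numbers below are made up for illustration:

```python
import math

def apparent_length(true_length, tilt_deg):
    """Projected length of a segment tilted tilt_deg out of the image
    plane (0 = parallel to the sensor). Under an orthographic
    approximation, the projection scales by cos(tilt)."""
    return true_length * math.cos(math.radians(tilt_deg))

# A 60 cm lead arm tilted 20 degrees toward the lens:
print(round(apparent_length(60.0, 20.0), 1))  # 56.4 cm
```

A 20-degree tilt costs only a few centimeters of apparent length, which is exactly the kind of subtle shrinkage that makes a straight arm read as slightly bent.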

This doesn’t mean 2D tools are useless. It means you want to either:

  • Control the geometry (consistent camera setup), or
  • Model the geometry (use perspective geometry to understand and correct it).

BRD does both: we encourage good camera habits, and we layer on geometric reasoning to make what you see more trustworthy.

Step one: treat camera placement as part of the “measurement”

Before we ever get to equations, we treat camera setup itself as a measurable, repeatable part of the system. Perspective geometry starts with a few questions:

  • How far is the camera from the golfer?
  • Is it roughly level with the hands, chest, or head?
  • Is it pointing along the target line, or off to the side?
  • Is the lens close to the “standard” focal length we expect?

The more consistent these are, the more consistent your visual angles become. In BRD, we’ll eventually bake this into setup guidance: a visual overlay that helps coaches approximate a standard face-on or down-the-line view, the same way launch monitors have “place ball here” markers.

Once we know the camera is in a reasonable zone, perspective geometry lets us go further and correct for the residual distortions that remain.

Reading the scene: lines, vanishing points, and planes

A huge part of perspective geometry is learning to use the course or range itself as a calibration tool. Straight lines in the world leave fingerprints in the image:

  • Mat edges and alignment sticks give us a proxy for the target line and ground plane.
  • Bay dividers and roof beams define verticals and long straight edges that help estimate camera tilt.
  • Flagsticks, posts, or walls help us understand what “true vertical” looks like in the frame.

Mathematically, we’re interested in vanishing points—locations in the image where parallel lines in the world appear to converge. With even a couple of these, we can estimate:

  • how the camera is rotated relative to the ground, and
  • how the ground plane itself is tilted in the image.

Once we have a handle on the ground plane, we can start mapping 2D image coordinates back to a consistent “top view” or “side view” reference, even if the camera wasn’t placed perfectly.
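Finding a vanishing point is itself simple geometry: extend two image lines that are parallel in the world (say, two mat edges) and intersect them. A clean way to do this is with homogeneous coordinates, where the line through two points is their cross product and two lines meet at the cross product of their line vectors. The pixel coordinates below are invented for illustration:

```python
import numpy as np

def vanishing_point(line_a, line_b):
    """Intersect two image lines, each given as a pair of (x, y) points.
    In homogeneous coordinates the line through p and q is p x q, and
    two lines meet at the cross product of their line vectors."""
    def homog_line(p, q):
        return np.cross([*p, 1.0], [*q, 1.0])
    v = np.cross(homog_line(*line_a), homog_line(*line_b))
    return v[:2] / v[2]

# Two parallel mat edges receding toward the target in the image;
# they converge at roughly (457.1, 380.0):
vp = vanishing_point(((0, 700), (400, 420)), ((800, 700), (500, 420)))
print(vp)
```

The vertical position of that point relative to the image center is what tells us how much the camera is tilted up or down.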

Homographies: flattening the ground so angles make sense

One of the core tools here is called a homography. You can think of it as a fancy 2D warp that takes points from the image and maps them to a flat plane in the real world—like unskewing a photo of a tilted rectangle until it looks top-down.

In our setting, that “rectangle” is often something like the hitting mat or a patch of ground around the ball. If we know (or can estimate) where a few corners are in both the image and in real space, a homography lets us:

  • turn the distorted mat region into a consistent top-down footprint,
  • reason about the ball–target line as a true straight line,
  • and estimate where the golfer is relative to that line.

The payoff: when you draw a line for club path or stance alignment, it’s no longer “just where it looks on screen”—it’s tied to an underlying geometric model of the ground.
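For four point correspondences, the homography can be solved exactly with a small linear system (the direct linear transform). This sketch maps hypothetical image corners of a hitting mat to a metric top-down frame; the pixel coordinates and mat dimensions are invented for the example:

```python
import numpy as np

def homography_from_points(src, dst):
    """Solve the 8-DOF homography H mapping 4 src points to 4 dst
    points (direct linear transform, exact for four correspondences)."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def warp_point(H, pt):
    """Apply H to a single (x, y) image point."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return (x / w, y / w)

# Image corners of a 0.5 m x 1.5 m hitting mat, mapped to a metric
# top-down frame (units: metres):
mat_in_image = [(220, 640), (580, 640), (520, 430), (280, 430)]
mat_topdown  = [(0.0, 0.0), (0.5, 0.0), (0.5, 1.5), (0.0, 1.5)]
H = homography_from_points(mat_in_image, mat_topdown)
print(warp_point(H, (400, 640)))  # midpoint of the near edge: (0.25, 0.0)
```

Once `H` is known, any point on the ground (the ball, the feet, an alignment stick) can be warped into the same metric top-down frame, which is what makes the ball–target line a true straight line to reason about.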

Approximating the swing plane from a single camera

With only one camera, we can’t fully reconstruct 3D motion the way a dual-phone stereo setup can. But perspective geometry still lets us build useful approximations of the swing plane.

At a high level, we assume:

  • the swing happens mostly on a tilted plane anchored near the ball,
  • that plane has a relationship to the ground (roughly a certain angle up from the ball–target line),
  • and the camera has a known “standard” vantage point relative to that plane.

With those assumptions plus the homography of the ground, we can:

  • estimate the orientation of the swing plane in 3D,
  • project the club shaft and key joints onto that plane, and
  • compute angles that are closer to the true 3D values than naive 2D measurements.
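The projection step in the middle bullet is plain vector geometry: drop the component of a direction along the plane's normal and keep what lies in the plane. The plane tilt and shaft direction below are made-up numbers, not values BRD assumes:

```python
import numpy as np

def project_onto_plane(v, normal):
    """Remove the component of v along the plane normal, leaving the
    in-plane part of the vector."""
    n = normal / np.linalg.norm(normal)
    return v - np.dot(v, n) * n

def angle_deg(a, b):
    """Angle between two vectors, in degrees."""
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

# A swing plane hinged on the target line (the x axis), tilted 55
# degrees up from the ground; its normal is perpendicular to both:
tilt = np.radians(55.0)
plane_normal = np.array([0.0, -np.sin(tilt), np.cos(tilt)])

shaft_3d = np.array([0.8, 0.3, 0.5])     # illustrative shaft direction
in_plane = project_onto_plane(shaft_3d, plane_normal)
print(angle_deg(shaft_3d, in_plane))     # small off-plane residual angle
```

The residual angle between the raw direction and its in-plane projection is also a useful diagnostic: when it grows large, the planar assumption is breaking down for that part of the swing.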

We also use this to build “guardrails”: if the camera angle is too far off from what the model expects, BRD can warn you that certain angles might not be trustworthy and suggest adjusting the setup.

Reference distances and scale: when we care about centimeters

Perspective geometry isn’t just about angles; it also helps with scale. If we know any real-world distance in the scene (the length of a hitting mat, the distance between two alignment sticks, even the player’s height), we can use it as a yardstick.

This lets us estimate things like:

  • how far the hands travel relative to the body during the backswing,
  • how much the pelvis shifts laterally,
  • or how much the head moves relative to the ball.

In practice, we rarely care if that’s “exactly 14.3 cm” versus “about 15 cm.” What matters is consistency: using the same geometric assumptions every session so changes in the numbers reflect changes in the swing, not changes in the camera.
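The yardstick idea reduces to a single ratio. The caveat, reflected in the comments, is that a flat pixels-per-centimeter scale is only valid for motion at roughly the same depth as the reference object; the mat length and pixel spans here are made-up numbers:

```python
def pixels_per_cm(ref_length_cm, ref_length_px):
    """Scale factor from a known reference in the scene, e.g. a 150 cm
    hitting mat spanning 450 px in the image. Only valid for motion at
    roughly the same depth as the reference."""
    return ref_length_px / ref_length_cm

def px_to_cm(distance_px, scale):
    """Convert an on-screen pixel distance to centimeters."""
    return distance_px / scale

scale = pixels_per_cm(150.0, 450.0)   # 3 px per cm
print(px_to_cm(42.0, scale))          # 14.0 cm of head movement
```

Because the same reference and the same assumptions are reused every session, the resulting numbers are comparable across videos even if the absolute scale is a little off.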

What perspective geometry looks like inside BRD

Under the hood, BRD doesn’t ask coaches to think in terms of homographies and vanishing points. Instead, we translate that math into practical behaviors:

  • Setup-aware overlays: As we learn the camera orientation from the scene, we can align swing planes, reference lines, and recommended checkpoints to the actual geometry instead of “eyeballing” them.
  • Angle sanity checks: If your camera angle is too extreme, we can flag that certain measurements (e.g., shoulder tilt at the top) may be unreliable, preventing overcoaching on bad data.
  • Consistent visual language: When we say “this player’s shaft is more upright than last month,” that’s anchored to a stable geometric model—not a different phone held a bit higher.

The result is subtle but important: your annotations start to behave more like measurements and less like drawings.

From “good enough” video to a measurement instrument

Most golf-tech workflows today treat video as something you annotate. Perspective geometry nudges us toward treating it as something you can measure.

We’re not trying to replace force plates or 12-camera motion-capture setups. But we are trying to squeeze as much trustworthy information as possible out of the devices coaches already carry in their pockets—by respecting the math of how cameras see the world.

As BRD evolves, perspective geometry will keep showing up in little ways: smarter guidance on where to put the phone, more stable angles across sessions, and better integration with future stereo and simulator experiences.

The end goal: when you look at a swing inside BRD and say, “That shaft is steeper than last month,” you can trust that judgment is grounded in more than just a hunch—it’s supported by the geometry underneath.