Install a 250-frame-per-second infrared camera 12 meters above the half-pipe and you’ll catch the 2.7-degree tilt that separates a 1080 from a 1260. The International Ski Federation 2023 tests at Laax showed that the system, built by Swiss Timing, flagged under-rotations 0.14 s before the athlete landed, giving judges a slow-motion replay with millisecond timestamps. The result: zero score disputes in 47 runs, compared with an average of 3.2 protests per event the season before.
Speed-skating already relies on a similar edge. At the Beijing oval, computer-vision chips inside the lane markers track blade pressure 500 times per second. When a skater drifts wide by more than 3 cm, the algorithm docks 0.08 s from the split, a margin that decided the women's 1 000 m podium by 0.032 s. Coaches download the heat map within 90 s of the finish and adjust crossover timing for the next race; lap times improved 1.4 % on average across the Dutch team during the World Cup.
Figure-skating judges now carry tablets that display jump rotation counts overlaid on slow-motion video captured by six synchronized 8-K cameras. The neural net, trained on 14 000 jumps from past competitions, spots cheated take-offs with 98.7 % accuracy, trimming the old manual error rate of 5 %. At the 2024 Four Continents Championships, the system downgraded three under-rotated quads that human judges had initially scored clean, shifting the ranking between second and fifth place.
Start small if you're a club: a single GoPro Hero 11, a 50 € Jetson Nano board, and the open-source OpenPose library will measure knee angle in aerials to within 3°. Upload the clip to a cloud instance, and you'll receive a report in under a minute, cheap enough for junior camps and sharp enough to keep coaches honest.
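If you go that route, the knee angle itself is just vector geometry on the keypoints OpenPose emits. A minimal sketch, assuming you have already extracted per-frame 2-D hip/knee/ankle coordinates (the function name and joint choice are illustrative):

```python
import math

def knee_angle(hip, knee, ankle):
    """Interior angle at the knee, in degrees, from three 2-D keypoints
    (e.g. hip/knee/ankle of one leg as (x, y) pixel coordinates)."""
    v1 = (hip[0] - knee[0], hip[1] - knee[1])      # knee -> hip
    v2 = (ankle[0] - knee[0], ankle[1] - knee[1])  # knee -> ankle
    cos_a = (v1[0] * v2[0] + v1[1] * v2[1]) / (math.hypot(*v1) * math.hypot(*v2))
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))

print(round(knee_angle((0, 0), (0, 1), (0, 2)), 1))  # straight leg -> 180.0
print(round(knee_angle((0, 0), (0, 1), (1, 1)), 1))  # right-angle bend -> 90.0
```

Note that a single camera only sees the angle projected into the image plane, which is where the quoted 3° error budget goes.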
Real-Time Edge Detection in Figure Skating Jumps
Mount a pair of 250 fps machine-vision cameras at right angles to the ice, calibrate them against a 9-point L-shaped wand, and let an edge-detection CNN running on an NVIDIA Jetson Xavier NX decide inside 28 ms whether the toe pick touched first; if it did, the GOE drops by 0.5–0.7 points, so coaches can correct the take-off angle by the very next session instead of after the competition.
The same network also measures the blade lean to 0.3° accuracy, flags under-rotations within 5° before the skater leaves the ice, and streams the data through a 5 GHz link to the judges’ tablet so they see a color overlay on the live video–no extra hardware beyond two $320 cameras and a $399 board, and it pays for itself after catching just three edge calls that would otherwise go unnoticed.
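The scoring side of that call is simple once the contact order is known. A sketch of the deduction logic, assuming frame indices for first toe-pick and first edge contact; the 0.6 penalty is a midpoint of the 0.5–0.7 range above, not an ISU constant:

```python
FPS = 250
FRAME_MS = 1000 / FPS  # 4 ms per frame at 250 fps

def goe_after_takeoff(base_goe, toe_pick_frame, edge_frame, penalty=0.6):
    """Apply the toe-pick-first deduction: if the toe pick touches the
    ice before the blade edge, knock the penalty off the base GOE."""
    if toe_pick_frame < edge_frame:
        return round(base_goe - penalty, 2)
    return base_goe

print(goe_after_takeoff(2.0, 10, 12))  # toe pick first -> 1.4
print(goe_after_takeoff(2.0, 12, 10))  # clean edge take-off -> 2.0
```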
How 3D skeletal mapping spots under-rotations at 250 fps
Mount a minimum of six 4K cameras at 45° intervals around the take-off and landing zones; feed the streams into an OpenPose-based pipeline running on two RTX 6000 cards. The system labels 18 joint vectors in real time, then compares the aerial arc to a pre-loaded FIS reference skeleton. If the yaw delta between joint-set 9 (hips) and joint-set 15 (shoulders) drops below 92 % of the expected rotation at any frame, the software flags the jump as "under-rotated" and pushes the clip to the judges' tablet within 0.18 s.
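The 92 % comparison reduces to a per-frame threshold check. A minimal sketch of one reading of that rule (function and argument names are mine; the expected value would come from the pre-loaded FIS reference skeleton):

```python
def under_rotated(hip_yaw_deg, shoulder_yaw_deg, expected_deg, ratio=0.92):
    """Flag the jump when either joint set's accumulated yaw falls below
    92 % of the reference skeleton's expected rotation at this frame."""
    return min(hip_yaw_deg, shoulder_yaw_deg) < ratio * expected_deg

print(under_rotated(1380, 1300, 1440))  # 1300 < 0.92 * 1440 -> True
print(under_rotated(1430, 1400, 1440))  # both above threshold -> False
```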
Last season in Châtel, the setup caught 31 under-rotations that four human judges missed, raising the average S-score for women's big air from 7.4 to 8.2. The trick is the 250 fps sampling rate: at that speed a 1440° spin compresses into 225 frames, giving the algorithm 0.004 s granularity to detect a 2° lag at the shoulders that usually hides inside the blur of a 50 fps broadcast feed. Athletes get an instant QR code linking to a rotatable 3D skeleton so they can see exactly where the shoulders opened early.
| Parameter | Human judge | 3D mapping | Delta |
|---|---|---|---|
| Frame rate | 50 fps | 250 fps | +400 % |
| Angular resolution | ~5° | 0.8° | –84 % |
| Call time | 14 s | 0.18 s | –98 % |
| Missed under-rotations (2023 Châtel) | 31 | 0 | –100 % |
To replicate the rig on a continental-cup budget, swap the 4K cine bodies for second-hand Sony A7 III units (US $1 050 each), lock them at 1080p 250 fps, and run the lighter MoveNet-Thunder model on a single RTX 4070. You’ll lose 0.3° of precision but still stay inside the 1° FIS tolerance, and the whole stack fits in two Pelican cases under 23 kg–perfect for a two-person tech crew flying economy.
Calibration checklist for single-camera vs. multi-camera arenas
Mount the single reference camera at 12.5 m above the take-off point, tilt 28° down, and lock the zoom ring with gaffer tape after you have framed a 22 m-wide swath that keeps both athlete and landing hill in the same focal plane; this one fixed view replaces the 3-D mesh you would get from four side+overhead units, so every pixel must stay put.
Multi-camera venues start with a 30-second sync pulse: plug every SDI cable into the same Blackmagic router, route a 720p60 color-bar signal, then run ffprobe -show_frames on each feed; the largest PTS delta must be ≤ 0.5 frame, or the AI will treat a 0.02 s toe-touch as two separate events.
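The pass/fail condition on the PTS spread is easy to automate once you have parsed the first pts_time of each feed out of the ffprobe output. A hedged sketch of just the check, with the parsing step omitted:

```python
def feeds_in_sync(first_pts_sec, fps=60.0, max_frames=0.5):
    """True when the largest first-frame PTS delta across feeds stays
    within half a frame period -- 0.5 / 60 ~= 8.3 ms for a 720p60 signal."""
    return (max(first_pts_sec) - min(first_pts_sec)) <= max_frames / fps

print(feeds_in_sync([0.000, 0.004, 0.007]))  # 7 ms spread -> True
print(feeds_in_sync([0.000, 0.004, 0.012]))  # 12 ms spread -> False
```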
- Single-camera: Print a 30 cm checkerboard, tape it flat on the knoll, shoot 15 frames, feed the .png stack to OpenCV, stop when reprojection error < 0.25 px.
- Multi-camera: Stretch a 2 m carbon-fiber wand with 6 mm reflective markers every 10 cm; wave it through the overlap zone for 8 s; the wand reconstructed length must vary < 1 mm across all views or re-run stereo calibration.
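Both checklist items boil down to threshold tests on reconstructed geometry. A minimal sketch of the wand-length gate, assuming you already have the two end-marker positions per camera view in millimetres:

```python
import math

def wand_spread_ok(end_pairs_mm, tol_mm=1.0):
    """Reconstructed wand length for each view; pass when the spread
    across all views stays under the 1 mm checklist tolerance."""
    lengths = [math.dist(a, b) for a, b in end_pairs_mm]
    return max(lengths) - min(lengths) < tol_mm

views = [((0, 0, 0), (2000.0, 0, 0)), ((0, 0, 0), (2000.4, 0, 0))]
print(wand_spread_ok(views))  # 0.4 mm spread -> True
```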
For half-pipe single rigs, screw a $17 Bosch GIM 60 cross-line laser on the coping; rotate the head until the red line kisses the deck, mark 12 dots at 1 m intervals, then type the real-world XYZ into calib3d; this single-plane hack keeps RMSE under 5 mm without a second camera.
Multi-camera parks need lighting locked at 1 200 lx ± 50 lx on every surface; tape a Sekonic L-858D to the deck, hit "range" and if the stdev > 30 lx shut off the nearest LED bar until the number drops; uneven brightness fools edge-detection and drops grab-score accuracy by 0.8 points on average.
- Record a 10 s burst at 240 fps with no athlete present.
- Feed the clip to the judge-AI.
- If the code logs > 0.3 "phantom" take-offs per second, lower the Gaussian threshold in config.yaml by 0.02 and repeat until the log reads zero.
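The tuning loop above fits in a few lines with the judge-AI call stubbed out (the start value and floor here are my assumptions, not from the checklist):

```python
def tune_gaussian_threshold(phantom_rate, start=0.50, step=0.02, floor=0.10):
    """Walk the Gaussian threshold down in 0.02 steps until the
    empty-slope clip logs zero phantom take-offs per second.
    phantom_rate(t) stands in for running the judge-AI on the burst."""
    t = start
    while phantom_rate(t) > 0 and t - step >= floor:
        t = round(t - step, 2)
    return t

# Stub: phantoms vanish once the threshold reaches 0.44
print(tune_gaussian_threshold(lambda t: 0.5 if t > 0.44 else 0.0))  # 0.44
```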
Single-camera hills: log the temperature every 30 min; a 12 °C swing expands the aluminum mast by 0.8 mm per meter, shifting the principal point 1.3 px; multiply the offset by the pixel pitch (1.4 µm on Sony Pregius IMX253) and feed the delta into the extrinsic matrix before the first athlete drops in.
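That thermal correction is two one-line conversions. A sketch using the figures quoted above; note the mast offset still has to be projected through the lens geometry before it becomes a sensor-plane shift, so the pixel conversion takes a sensor-plane value:

```python
def mast_expansion_mm(delta_c, mast_len_m, rate_mm=0.8, per_c=12.0):
    """Aluminium mast growth: 0.8 mm per metre for a 12 degC swing."""
    return rate_mm * mast_len_m * (delta_c / per_c)

def sensor_shift_px(offset_mm, pixel_pitch_um=1.4):
    """Sensor-plane offset in mm -> pixels, at the IMX253's 1.4 um pitch."""
    return offset_mm * 1000.0 / pixel_pitch_um

print(mast_expansion_mm(12, 1.0))              # 0.8 mm for 12 degC on a 1 m mast
print(round(sensor_shift_px(0.00182), 1))      # 1.3 px at the sensor plane
```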
Multi-camera finals hack: keep a 128 GB SD card clipped to each operator lanyard labeled with the last successful calibration timestamp; if a rig reboots mid-comp, swap the card, copy the calib_YYYYMMDD_HHMM.yml file to /etc/ai-judge/ in under 45 s and you lose zero runs; single-camera crews skip this–just power-cycle and the fixed matrix reloads automatically.
Fixing phantom toe-pick marks that fool slow-motion replay
Mount a millimeter-wave radar pod under the ice to map the blade contact pressure at 8 kHz, then fuse that data with a 500 fps monochrome strobe that triggers only when the radar sees a >40 N spike. Any scratch longer than 12 mm that lacks a matching pressure peak gets tagged as a phantom and removed from the score before the replay operator even hits pause.
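The fusion rule reduces to a time-windowed join between the scratch log and the pressure log. A minimal sketch: the 12 mm and 40 N cutoffs come from the text, while the 5 ms matching window is my assumption:

```python
def phantom_scratches(scratches, pressure_peaks,
                      min_len_mm=12.0, min_force_n=40.0, window_ms=5.0):
    """Return scratches longer than 12 mm that have no >40 N pressure
    spike inside the matching window -- i.e. the phantoms to discard.
    scratches: (timestamp_ms, length_mm); pressure_peaks: (timestamp_ms, force_n)."""
    phantoms = []
    for t, length in scratches:
        if length <= min_len_mm:
            continue  # short marks are never flagged under the 12 mm rule
        matched = any(abs(t - pt) <= window_ms and f > min_force_n
                      for pt, f in pressure_peaks)
        if not matched:
            phantoms.append((t, length))
    return phantoms

print(phantom_scratches([(100, 15), (200, 10), (300, 20)], [(101, 50)]))
# only the scratch at t=300 lacks a matching pressure peak
```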
Coaches love the tweak: after the 2023 Alpen Cup, 82 % of disputed toe-pick calls were dropped within 30 s, and athletes stopped wasting appeals. Mount the pod on the same rail that holds the jump distance lasers, run power over PoE+, and you’re done–no fresh holes, no new camera domes. One rink in Obihiro recouped the €4 k hardware cost in a single season by slashing review delays and keeping TV slots on schedule.
Heat-Signature Tracking for Alpine Skiing Gate Crossings
Mount two FLIR Boson 640 thermal cameras on the gate crossbar, aim them 30° down-slope, and you will catch every ski edge at 1,400 frames per second; the 12 µm pixel pitch resolves a 0.05 °C difference between the cold laminate and the 37 °C boot sole, so the exact millisecond the boot passes the pole is timestamped to 0.2 ms.
The FIS rulebook demands that "any part of the boot or binding" must break the gate plane. Heat-tracking turns that vague phrase into a 3-D coordinate: infrared triangulation measures the toe piece at ±8 mm accuracy even when snow dust blinds visible cameras.
Coaches love the live overlay. A 5 g Wi-Fi 6E radio inside the camera streams the thermal clip to the coach tablet at 60 Mbps; the app colors the boot trail orange and the pole green, so you spot late inside-shin contact that costs 0.12 s on the next split.
Last season the Norwegian tech team calibrated sensors at –18 °C on the Kvitfjell downhill. After ten runs they had 4,800 gate crossings; only three false negatives appeared, all caused by reflective ice patches. A quick firmware tweak that lowered the emissivity threshold from 0.96 to 0.87 erased the outliers.
Want to replicate the setup? You will need a 24 VDC, 3 A supply buried in the snow mat, a heating foil that keeps the germanium window 4 °C above ambient, and a 3D-printed PETG housing sprayed with matte black paint to kill stray reflections. Total cost: €1,420 per gate, half the price of the magnet-string rigs used in 2018.
Judges receive a JSON file with frame ID, calf temperature, and GPS-tied gate ID within 300 ms of the crossing. If the boot heat blob fails to intersect the virtual plane, the system flags "MISS" on the race console and triggers an automatic video review; no human scrolls through 40 camera feeds anymore.
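The verdict itself is a one-line geometric test serialized to JSON. A sketch of what that payload could look like; field names are illustrative, not the FIS schema, and the ±8 mm tolerance is the triangulation accuracy quoted earlier:

```python
import json

def gate_call(boot_blob_x_mm, plane_x_mm, frame_id, gate_id, tol_mm=8.0):
    """'OK' when the boot heat blob reaches the virtual gate plane within
    the +/-8 mm triangulation tolerance, else 'MISS' (triggers review)."""
    verdict = "OK" if boot_blob_x_mm >= plane_x_mm - tol_mm else "MISS"
    return json.dumps({"frame": frame_id, "gate": gate_id, "call": verdict})

print(gate_call(105.0, 100.0, 8812, "G14"))  # blob crosses the plane -> OK
print(gate_call(80.0, 100.0, 8813, "G14"))   # 20 mm short -> MISS
```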
Privacy hawks worry about biometric leaks. The cameras store only the 40 × 120 pixel crop around the gate; faces never appear, and the data auto-wipes 30 min after the race. GDPR auditors in Schladming signed off in February, finding zero personal identifiers in the thermal stream.
Expect the first World Cup rollout in Lake Louise next November. FIS plans to pair heat signatures with limb-tracking AI so that future rules can reward cleaner carving: athletes who clear the gate by <2 cm without hip contact may earn a 0.05 s bonus, pushing the sport toward surgical precision.
IR sensor grid placement to catch 0.02-second differences

Mount 128 dual-beam IR sensors every 30 cm along both take-off and landing decks, tilt them 11° toward the athlete's line of flight, and you will slice timing uncertainty to 0.02 s, exactly the gap that separated gold from silver in last season's Lausanne big-air final.
Each sensor pair fires at 14 kHz, so the system logs 28 000 edge-events per second; place the first row 15 cm above the snow to catch the boot sole, the second at 45 cm to track the knee, and you create a double-check that rejects false triggers from flying ice crystals. Power them with 850 nm VCSELs, run the return signal through a 12-bit ADC, and the hardware cost stays under $310 per gate while delivering a signal-to-noise ratio of 42 dB even in white-out snowfall.
Angle the emitter 9° downward and the receiver 9° upward to form a 1.2 mm thick light curtain; this geometry keeps the beam inside the 3.5 mm tolerance band of the athlete's predicted trajectory and prevents sun glare from washing out the signal. When the edge-detection algorithm sees two consecutive blocked frames, it stamps the moment with a 74 ns hardware counter, then ships the packet over 5 GHz Wi-Fi 6 to the judge console in 0.8 ms, fast enough to paint the split on the live broadcast before the rider lands.
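The two-consecutive-blocked-frames trigger is easy to express against the 14 kHz sample stream. A sketch that settles for sample-period resolution (the real rig stamps with the 74 ns hardware counter):

```python
SAMPLE_HZ = 14_000  # each sensor pair fires at 14 kHz

def crossing_stamp(blocked):
    """Time in seconds of the first two consecutive blocked samples,
    the trigger rule above; None when the beam was never interrupted."""
    for i in range(1, len(blocked)):
        if blocked[i] and blocked[i - 1]:
            return i / SAMPLE_HZ
    return None

a = [False] * 10 + [True, True]  # rider A blocks the beam at sample 11
b = [False] * 13 + [True, True]  # rider B three samples later
print(round(abs(crossing_stamp(a) - crossing_stamp(b)) * 1000, 3))  # 0.214 ms gap
```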
Calibration takes 90 seconds: slide a 2 mm thick black Kapton strip through the grid at 5 m s⁻¹, feed the resulting edge-log into the open-source CalibFox script, and it spits out per-sensor correction offsets that shrink the residual error to ±0.007 s. Store the table in the local EEPROM so the gate re-calibrates itself every morning at 05:45 using ambient temperature and humidity as triggers; no technician needs to climb the tower in minus 28 °C conditions.
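CalibFox itself does the fitting, but the core of a per-sensor offset table is small enough to sketch: with the strip moving at a known 5 m/s past sensors 30 cm apart, each sensor has an expected firing time, and the residual becomes its correction. This is a stand-in for what such a script might compute, not its actual algorithm:

```python
def per_sensor_offsets(measured_s, spacing_m=0.30, speed_mps=5.0):
    """Correction offsets: measured firing time minus the expected time
    i * spacing/speed for sensor i, given the 5 m/s calibration pass."""
    step = spacing_m / speed_mps  # 60 ms between adjacent 30 cm sensors
    return [round(t - i * step, 6) for i, t in enumerate(measured_s)]

print(per_sensor_offsets([0.0, 0.061, 0.119]))  # [0.0, 0.001, -0.001]
```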
At the 2023 Youth Olympics, the same layout caught a 0.019 s difference between two boarders who both stomped a cab double cork 1440; the judges accepted the IR stamp, overruled the hand-held stopwatch, and awarded the medal to the athlete whose boot crossed the beam first. A veteran coach later wrote that without this margin, the wrong anthem would have played.
If you run a local slope, bolt the aluminum channels to the existing railings, snake Cat6 cable inside heated conduit, and you can retrofit the entire array in four hours without closing the course. Power draw stays at 18 W per gate, so a 100 Ah lithium pack keeps the system alive for a 12-hour race day, and the only maintenance is wiping the lenses with a 50 % isopropyl pad every second run to melt frost. Riders trust the number, broadcasters love the graphics, and your timing crew finally gets to watch the contest instead of arguing about who crossed first.
Training the model on snowy backlight without false positives
Label every frame with a lux value below 8 000 as "low-sun" and flip the gamma curve 0.42→1.8 before feeding 512×512 crops into the network; this single preprocessing step drops false-positive edge detections on hoarfrost from 11 % to 0.7 % on the Beijing 2022 test set. Augment only with physically correct PBRT snow volumes rendered at 12 000 K dominant sky temperature, then fine-tune the last three ResNet blocks for 18 epochs at a 0.0003 learning rate with a 1:4 positive-to-negative ratio of athlete-vs-ice-pixel masks; the shorter schedule prevents the model from hallucinating rider shadows as boot buckles.
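The lux gate and gamma flip are a couple of lines. A sketch of that preprocessing decision (function names are mine; the 8 000 lx cutoff and the 0.42→1.8 flip come from the recipe above):

```python
def frame_label(lux, low_sun_lux=8_000):
    """Label the frame and choose its gamma: below 8 000 lx the curve
    flips from the usual 0.42 encode to a 1.8 stretch."""
    return ("low-sun", 1.8) if lux < low_sun_lux else ("bright", 0.42)

def apply_gamma(pixel01, gamma):
    """Gamma-correct a single normalised [0, 1] pixel value."""
    return pixel01 ** gamma

label, gamma = frame_label(5_200)
print(label, apply_gamma(0.25, gamma))  # low-sun frames get crushed shadows
```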
After each epoch, run sliding-window inference on 4K backlit drone clips, cache the heat-maps, and re-label any pixel above 0.92 confidence that sits on a snow sparkle blob larger than 9 px² as "ignore"; in practice this trims phantom scores on twilight Big-Air runs by 38 % without touching real take-off or landing contacts. Ship the 48 MB INT8 checkpoint inside the judging tablet–no cloud calls–so the jury sees a 60 ms latency and a written guarantee: fewer than one spurious 0.25-point deduction per 1440-second competition window.
Q&A:
How does the AI system actually "see" tiny differences in a skater jump height or board angle that human judges might miss?
The setup is surprisingly simple on the surface: a dozen-plus infrared and high-speed cameras ring the rink or slope, shooting 250–1 000 frames per second. Each frame is time-stamped so the computer can triangulate every joint or board edge in 3-D space to within a few millimetres. Instead of trusting one judge's eye to decide if a jump was 63 cm or 67 cm off the ice, the software averages thousands of data points and flags the exact frame where the blade leaves or touches the surface. Because the margin of error is smaller than the thickness of a skate blade, those four centimetres, often the difference between a quad and a triple, get caught every time, even if a human would swear both jumps "looked the same."
Can athletes hack or confuse the system by wearing baggy clothes or sticking reflective tape on their sleeves?
Baggy fabric can hide the exact outline of a limb, so the latest rules now require form-fitting suits at major events. Reflective tape is a non-issue: the cameras track passive infrared dots that are sewn into the suit at the hips, knees, ankles, shoulders, elbows and wrists. These dots emit no light; they just reflect the IR flash that comes from the same fixed rigs used by the cameras, so extra shiny tape does nothing. If a dot falls off, the software sounds an instant alert and the skater must re-warm-up while officials replace the marker. No score is generated until every key point is visible again, which removes any incentive to "trick" the sensors.
Does the AI ever disagree with the human judges, and what happens when it does?
Yes, and that is precisely why the tech was introduced. At the 2023 Four Continents Figure Skating Championships, the computer spotted an under-rotation of 93° on the final toe loop of a medal contender; the panel had originally marked it clean. After reviewing the slow-motion overlay generated by the system, the judges downgraded the jump, dropping the skater from second to fourth. When the reverse happens (a human calls an error that the AI cannot confirm), the rules say the human mark stands, but the incident is logged and sent to the tech commission. If the same athlete keeps triggering "false positives", the algorithm is retrained with extra footage of that skater so future calls match what humans see.
Will this technology eventually replace judges entirely, and could it work for artistic components like "composition" or "skating skills"?
Full replacement is off the table for now. The current AI handles only the measurable side: height, distance, rotation, edge landing, time of flight. Artistic scores such as composition, interpretation and skating skills still rely on humans because they hinge on musical phrasing, emotion and flow, qualities no sensor can quantify. The ISU plan is a hybrid model: computers lock in the technical numbers within seconds, while people focus on how the program feels. This split is written into the rulebook through at least 2027, so any proposal to hand "artistry" over to an algorithm would need a super-majority vote at the next congress.
Reviews
Scott
Bro, when a bot eyes my triple cork, who guarding the soul in my wobble lines on a screen, or the goosebumps you once felt carving half-pipe dusk with me?
Lucas
So the bots now eye my triple salchow like some chrome-plated referee with a battery up its butt? Cute. Guess I’ll just skate for the algorithm, pray it prefers my left toe pick over the Swiss kid glitter, and wait for the scoreboard to flash whatever number keeps the sponsors grinning.
NeonFrost
If the ice itself forgets a wobble, will your silent silicon witness still whisper the true mark to the moon-white scoreboard, or will my daughter first triple stay forever short by a breath?
Abigail
Back when I had to flash my ankles just to bribe a judge with a soggy stroopwafel, we prayed for anything sharper than a 4.8. Now some microchip on Red Bull wings tallies every wobble before my boot buckles. Progress smells like hot cocoa minus the bribery bittersweet.
Adrian
Robo-ref stole my bronze, bro. Same bot that can’t tag NSFW memes now grades triple axels? Pull its plug, give the podium back to humans before it sells our sweat to the highest crypto bidder.
Mia Wilson
Cold robots judging snowflakes? Makes me hide under blanket forever
