Stanford EE Computer Systems Colloquium

4:15PM, Wednesday, April 9, 2014
NEC Auditorium, Gates Computer Science Building Room B1

Augmented Reality at the America's Cup, the Technology Behind AC LiveLine
Augmented Reality in Broadcast Sports, from the Hockey Puck, to the Yellow First Down Line, to the America's Cup

Stan Honey
America's Cup Event Authority
About the talk:

Augmented reality was first used in live broadcast sports with the Fox Trax highlighted hockey puck, introduced at the NHL All-Star Game in 1996. Far more popular was the Yellow First Down Line, which was introduced in 1998 and is now ubiquitous in broadcast football. AC LiveLine is the latest use of augmented reality to enhance the live broadcast of sport, but it is a significant development beyond prior systems because the video to be augmented is all taken from a helicopter-mounted camera, which introduces significant challenges.

The talk will describe the technology developed to track the America's Cup catamarans to within 2 cm, five times per second, and to superimpose graphics elements such as ahead-behind lines and laylines on the live helicopter footage of the race. Previous America's Cup broadcasts featured graphics only in an animated view of the race. The LiveLine graphics package was designed to help viewers follow the intense action of the AC72s as they flew around the race course on foils, all in live action.
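The core of superimposing graphics on helicopter footage is projecting a tracked world-space boat position into the camera's image, given the camera's estimated position and orientation. The sketch below shows only the standard pinhole-camera geometry such a system rests on; the function name, the simple float parameters, and the omission of lens distortion and sensor-fusion details are all illustrative assumptions, not the actual LiveLine implementation.

```python
import numpy as np

def project_to_image(p_world, cam_pos, R_cam, f_px, cx, cy):
    """Project a world-frame point into pixel coordinates using a
    simple pinhole model (illustrative sketch only).

    p_world, cam_pos: 3-vectors in the world frame (meters)
    R_cam: 3x3 rotation taking world-frame vectors into the camera frame
    f_px: focal length in pixels; (cx, cy): principal point
    """
    p_cam = R_cam @ (p_world - cam_pos)  # world frame -> camera frame
    if p_cam[2] <= 0:
        return None                      # point is behind the camera
    u = cx + f_px * p_cam[0] / p_cam[2]  # perspective divide
    v = cy + f_px * p_cam[1] / p_cam[2]
    return (u, v)
```

In a real system the camera pose would come from instrumented helicopter and camera-mount sensors refined against the video itself; here the pose is simply passed in.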

In addition to its use in broadcast, this tracking system was used by the America's Cup Race Management team to revolutionize the on-the-water management of the sport. With two-centimeter tracking of the catamarans on the race course, event organizers quickly saw the opportunity to leverage the system for race management. Telemetry of the course allowed marks to be moved rapidly and course limits to be controlled, while real-time overlap and zone-entry determinations enabled umpires to make the most accurate umpiring decisions ever possible in sailing.
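A zone-entry determination reduces to a geometric test on the tracked positions: under the Racing Rules of Sailing, the zone is the area within three hull lengths of a mark. The sketch below shows that test in its simplest form; the function name, the flat 2D coordinates, and the approximate AC72 hull length are illustrative assumptions (a real system would measure distance from the hull, not a single tracked point).

```python
import math

AC72_HULL_LENGTH_M = 22.0  # approximate AC72 hull length (assumption)
ZONE_LENGTHS = 3           # the zone is three hull lengths around a mark

def in_zone(boat_xy, mark_xy, hull_length=AC72_HULL_LENGTH_M):
    """Return True if a boat's tracked position lies inside the
    three-length zone around a mark (illustrative sketch only)."""
    dx = boat_xy[0] - mark_xy[0]
    dy = boat_xy[1] - mark_xy[1]
    return math.hypot(dx, dy) <= ZONE_LENGTHS * hull_length
```

With 2 cm position accuracy at 5 Hz, such a test can be evaluated on every update, which is what lets umpires call zone entry and overlap far more precisely than by eye.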

The technical and operational developments that led to the LiveLine graphics system and the related race management and electronic umpiring systems will be described, along with the problems and pitfalls.

Some Action Photos

[finish flag overlay]

There is no downloadable version of the slides for this talk available at this time.

About the speaker:

[speaker-photo] Bay Area resident Stan Honey (USA), a three-time Emmy winner for technical innovations in sports TV broadcast, the Rolex Yachtsman of the Year for 2010, a member of the National Sailing Hall of Fame, and an inventor on 29 patents in navigation and graphics, was Director of Technology for the 34th America's Cup for the America's Cup Event Authority (ACEA).

A major figure in technological innovation in sports television, Honey led the development of the Fox Trax hockey puck system in 1996, the first use of augmented reality in live sports broadcasts. Honey co-founded Sportvision in 1998, where he led the development of the yellow first-down line widely used in the broadcast of American football, the ESPN K-Zone baseball pitch tracking and highlighting system, and the Race/FX tracking and highlighting system used in NASCAR. Honey is also recognized as one of the most successful professional navigators in sailing, having navigated ABN AMRO to victory in the 2005-2006 Volvo Ocean Race (around the world) and having navigated Groupama 3 in setting the Jules Verne record for the fastest circumnavigation of the world under sail in 2010.

Prior to co-founding Sportvision in 1998, Honey worked as Executive VP of Technology for News Corporation from 1993 through 1998. In 1983 he co-founded ETAK Inc., the company that pioneered vehicle navigation systems; ETAK was sold to News Corporation in 1989 and is now part of TomTom. Before founding ETAK, Honey worked as a Research Engineer at SRI International while completing his Stanford MSEE. Honey earned his BS at Yale in 1978.

Contact information:

Stan Honey
stan (at)