Automated no-ball detection, proof of concept
Russell Degnan

The first Test between Australia and New Zealand may not have hung on the non-dismissal of Voges at the end of the first day, but the 232 runs it cost New Zealand certainly made it a lot harder. Predictably, a noticeably wrong umpiring decision led to a renewed call for third umpire reviews on every delivery, the return of the back-foot no-ball rule, and some less predictable non-sequiturs about punishment.

But there is an easy solution.

Tennis has, for over 30 years, used electronic means to judge service calls and, more recently, motion sensors to detect lets. These are relayed to the chair umpire, who uses them in calling the point. No-balls in cricket are slightly more complex, as they depend on the position of the foot over (not necessarily on) a line, with confounding shadows and the curved surface of the ground preventing the use of the light beams that worked (mostly) for tennis.

But there is an easy solution.

With a fixed side-on camera with a clear view of the line (two is preferable), it is incredibly easy to build a system that will detect a landing foot within an area and decide whether it fell beyond or behind the line.

Computer vision techniques, of the sort used by path-finding robots, have been around for several decades. I learned the basics (in 1998), and those and more advanced techniques have been developed into the free OpenCV library I used for the code outlined below. To give a sense of how easy it would be to implement automated no-ball checking: my code, using not-particularly-high-res footage, and without any setup program, live stream, or device for communicating with the umpire (though a phone would suffice for that), took me around 14 hours to write. And that included learning the OpenCV library from scratch, installing Java and SBT, and relearning some coding techniques.

There is no excuse for no-ball calls not to be automated. It is a trivially easy application of computer technology to a glaring issue.

Step 1. The code [downloadable here] uses VideoCapture to load the video, and BackgroundSubtractorMOG to detect movement across the non-filled part of the crease.
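To illustrate roughly what the background subtractor is doing, here is a simplified stand-in I have sketched in Python: plain frame differencing against a fixed background, with a crude shadow test (OpenCV's BackgroundSubtractorMOG builds a statistical background model per pixel; the function name and thresholds below are my own illustrative choices).

```python
import numpy as np

def foreground_mask(background, frame, move_thresh=30, shadow_ratio=0.6):
    """Label each pixel: 0 = unchanged, 127 = shadow, 255 = movement.

    A crude stand-in for OpenCV's BackgroundSubtractorMOG: a pixel that
    merely darkens (same surface, less light) is called shadow, while
    any other large change is called genuine movement.
    """
    bg = background.astype(np.int32)
    fr = frame.astype(np.int32)
    changed = np.abs(fr - bg) > move_thresh
    # Shadow: darker than the background, but keeping most of its brightness.
    shadow = changed & (fr < bg) & (fr > bg * shadow_ratio)
    mask = np.zeros(bg.shape, dtype=np.uint8)
    mask[changed] = 255
    mask[shadow] = 127
    return mask

# Toy 4x4 greyscale frames standing in for video stills of the crease.
bg = np.full((4, 4), 200, dtype=np.uint8)
fr = bg.copy()
fr[0, 0] = 40    # large change: the moving bowler -> white
fr[1, 1] = 150   # uniformly darker: a cast shadow -> grey
mask = foreground_mask(bg, fr)
```

The white/grey distinction this produces is exactly what the image below shows: white for movement, grey for shadow.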

This image shows white areas where there is movement from the previous sequence of frames; grey areas show where there is a change, but the same colour as before (indicating shadow). You can see the outline of the bowler as he moves through the crease, and the non-striker backing up.

Step 2. Each frame is examined within the space shown below, looking for objects that will land within it.

In a real-world application the system would need to be told which side of the pitch to view, and be turned on and off for each ball (as Hawk-Eye also must be, so the same operator could be used).
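That per-ball switching and cropping might look something like this (the window coordinates and function name are hypothetical; real values would come from calibrating the fixed camera at each ground and end):

```python
import numpy as np

# Hypothetical detection window around the popping crease, as
# (top, bottom, left, right) pixel coordinates. These values are
# illustrative only; a real system would configure them per camera.
WINDOW = (300, 320, 80, 400)

def crease_region(frame, armed, window=WINDOW):
    """Crop the frame to the detection window, but only while the
    operator has armed the system for the current delivery."""
    if not armed:
        return None
    top, bottom, left, right = window
    return frame[top:bottom, left:right]

# A blank stand-in for one 640x480 greyscale frame from the side-on camera.
frame = np.zeros((480, 640), dtype=np.uint8)
roi = crease_region(frame, armed=True)
```

Between deliveries the operator simply leaves the system disarmed and no frames are examined.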

Step 3. A relatively simple formula was used to calculate if a foot was within the frame:

  • There must be at least ten rows of pixels (out of 20) containing a continuous line longer than 60 pixels (about 8 inches) and shorter than 120 pixels.
  • A threshold is used to determine continuity, as there are often bits of noise at the edges.
  • Two edges are determined for the back edge of the foot: a hard edge (more than 10 rows) and a soft edge (more than 3 rows), to account for movement of the foot. That left a 3 pixel margin of error in this instance (around 10mm), but further testing could improve it. This is the blue box in the first image.
  • That edge is compared to the crease line, which is configured beforehand (the red line in the first image).
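The rules above might be sketched like this in Python (the thresholds come from the list; the function names, and the assumption that the bowler runs left to right so the back edge of the foot is its leftmost pixel, are mine):

```python
import numpy as np

MIN_RUN, MAX_RUN = 60, 120    # foot-sized run limits in pixels (~8 inches)
HARD_ROWS, SOFT_ROWS = 10, 3  # rows needed for the hard and soft edges
NOISE_GAP = 2                 # continuity threshold: bridge gaps this small

def runs(row, gap=NOISE_GAP):
    """Foreground runs in one pixel row as (start, end) pairs, merging
    runs separated by no more than `gap` pixels of edge noise."""
    segs, start = [], None
    for x, v in enumerate(list(row) + [0]):      # sentinel closes the last run
        if v and start is None:
            start = x
        elif not v and start is not None:
            segs.append([start, x - 1])
            start = None
    merged = []
    for s in segs:
        if merged and s[0] - merged[-1][1] - 1 <= gap:
            merged[-1][1] = s[1]                 # bridge the small gap
        else:
            merged.append(s)
    return merged

def back_foot_edges(mask):
    """Hard (10-row) and soft (3-row) estimates of the rearmost edge of
    the foot over the detection window; None if too few rows qualify."""
    lefts = []
    for row in mask:
        for s, e in runs(row):
            if MIN_RUN < (e - s + 1) < MAX_RUN:  # plausibly foot-sized
                lefts.append(s)
                break
    if len(lefts) < SOFT_ROWS:
        return None
    lefts.sort()
    hard = lefts[HARD_ROWS - 1] if len(lefts) >= HARD_ROWS else None
    soft = lefts[SOFT_ROWS - 1]
    return hard, soft

def is_no_ball(mask, crease_x):
    """True if no part of the foot is behind (left of) the crease line;
    None if no foot was confidently detected in this frame."""
    edges = back_foot_edges(mask)
    if edges is None or edges[0] is None:
        return None
    hard, _soft = edges
    return bool(hard >= crease_x)
```

The gap between the hard and soft edges is the margin-of-error band mentioned above: a foot whose hard edge is behind the line is clearly legal, while one whose soft edge is beyond it is clearly over.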

Needless to say, on the ball in question, the bowler was unquestionably behind the line (by 9-12 pixels, or 3-4cm).

Assuming a stream from the fixed cameras could be obtained at the ground, a working and fully tested system could be in place in less than a month. Sometimes, there really is an easy solution.

Cricket - Analysis 22nd February, 2016 20:24:00   [#]