
'We're looking for some direction on how DRS is to be used in the future'

The ICC's general manager, Geoff Allardice, talks about tests that attempt to assess the efficiency of protocols used in the review system

Nagraj Gollapudi
Last July, ICC chief executive Dave Richardson said that the ICC cricket committee would be presented with the findings of an independent assessment of the performance of the technologies in the DRS. That project, overseen by engineers from the field intelligence unit at the Massachusetts Institute of Technology (MIT), is now complete, and the committee will be briefed during its ongoing two-day meeting, which began at Lord's on Tuesday.
Only two men from the ICC have been privy to the details of the research so far: Anil Kumble, the former India captain and legspinner, who is chairman of the cricket committee, and ICC general manager Geoff Allardice. The latter explains the testing process, which involved the experts building special apparatus to test both ball-tracking and the two types of edge detection used in the DRS.
Can you tell us about the research MIT has been doing in regard to the DRS over the last year?
The first step was to try and develop apparatus to be able to assess the performance of technologies used in the DRS: ball-tracking and the two types of edge detection - one based on noise and one based on Hot Spot.
They have an apparatus to help them assess the performance of edge detection technologies and they are just finalising apparatus now to assess the performance of ball-tracking. Late last year we tested Hawk-Eye's Ultra Edge, their sound-based edge detection system.
What apparatus are these?
There is an apparatus with a swinging arm where we generate fine contact between ball and bat on a regular basis.
One of the difficulties in testing these edge-detection products is that their performance is probably best assessed when there is really fine contact. But to generate enough really fine contacts repeatedly, for a big enough sample, if you had somebody throw a ball and somebody try to produce thin edges, you'd be there for a week doing it. So the swinging arm does that job.
It is the relativity of speed [of movement] that is important. The faster the ball goes, the more sound is generated. It's the slower speeds where the margins and thickness of contacts matter. The speed the ball passes the bat is not what you see on the speedometer. It's not 140kph - that is at release. The speed past the bat is much different.
The coordinates when the bat registers contact are captured through vibration sensors and compared to the output of Ultra Edge or Snicko. All this is done behind closed doors, where they can control the environment.
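The check Allardice describes, a ground-truth sensor on the rig versus the edge-detection system's output, amounts to asking whether the system registered each known contact. A minimal sketch of that kind of comparison, with entirely illustrative timestamps, tolerance, and function names (nothing here is the actual MIT/ICC protocol):

```python
# Hypothetical sketch of the edge-detection check described above:
# the swinging-arm rig's vibration sensor gives ground-truth contact
# times; we ask whether the edge-detection system registered a spike
# within a small window of each one. All values are illustrative.

TOLERANCE_MS = 5.0  # assumed matching window, not an ICC figure

def detected(contact_ms, spikes_ms, tol=TOLERANCE_MS):
    """True if any detected spike falls within tol of the true contact."""
    return any(abs(contact_ms - s) <= tol for s in spikes_ms)

# Illustrative ground-truth contacts and system-detected spikes (ms)
contacts = [1000.0, 2500.0, 4100.0]
spikes = [1002.1, 4097.5]  # here the system missed the faint second contact

hits = [detected(c, spikes) for c in contacts]
print(f"detected {sum(hits)}/{len(contacts)} contacts")  # → detected 2/3 contacts
```

A controlled rig makes this workable: because the machine produces many fine edges per hour, the sample gets large enough for the detection rate at the faintest contacts, where these systems differ most, to be meaningful.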
Then we went a couple of days later to Lord's [last September, in an ODI between England and Australia], where they set up the Ultra Edge on a desk and assessed that. They looked at the timing of the sounds. So the ball hits the middle of the bat and there is a sound that goes up, and they make sure that appears at the moment the ball strikes the bat. We did a closed session and a match session, just to see practically what it is like.
What was the aim of these assessments?
The aim is to try and understand the strengths and weaknesses of each of the technologies. They perform really well in some areas and not so well in others. Our aim is just to try to understand what those areas are. Overall we're getting a pretty good result from a combination of ball-tracking and sound, to the satisfaction of most countries.
"For the last couple of years, we have been trying to get the DRS working as best as we possibly can. It is still doing the job of increasing our percentage of correct decisions by 4-5%, which has been pretty consistent over a long period"
It is just about making sure that the umpires have all the info they need, and whether there is any recommendation about how we set these technologies up on a match day, how they are operated, and when we should be putting more store by the outputs of these systems versus when they are less reliable. It's not just a blanket grading of these products, and it is not a pass-or-fail test. This is about finding out what's what: here is an assessment of the technologies being used today, and now what is the next step?
How did MIT get involved?
It came up when there were some issues with Hot Spot a couple of years ago. We were put in touch with one of the professors at MIT, an Indian gentleman, Dr Sanjay Sarma, who was very interested in the issue and has been the main point of contact on the project. We've got other researchers working on the testing and doing the legwork, but he is the lead guy. He is very well respected within the MIT community and loves his cricket.
Professor Sarma was the initial sponsor of the project in [terms of] the construction of the apparatus, making sure it was fit for purpose. Then there was the running of the actual testing and comparisons, which was overseen by Dr Jaco Pretorius. [The apparatus] was tested first in Boston, then shipped to the UK, reassembled, and checked to make sure it was working again.
Does all of this help at a time when the BCCI is perceived to be softening its stance on the DRS?
Anil [Kumble, the chairman of the ICC cricket committee] is very closely involved in this project. He has visited Boston and seen the work they are doing, and he is very encouraged by the direction they are heading in.
We are aware of the public comments of BCCI about what may happen. I think at this stage we are just trying to get all the info together to present to the cricket committee. And for the last couple of years, we have been trying to get the DRS working as best as we possibly can. We get it working really well in certain games. It is still doing the job of increasing our percentage of correct decisions by 4-5%, which has been pretty consistent over a long period.
Can you tell us about the Hawk-Eye testing from earlier this year?
There was some trialling of Hawk-Eye's ball-tracking system carried out in Winchester in April, in the form of frame tests of where the ball passes [the plane of] the stumps.
There were scaffoldings with cameras positioned similarly to how they usually are at a ground. There was a bowling machine, and our aim was to get the ball through the frame. In the frame are laser fields, both vertical and horizontal, so when the ball goes through, they [capture] the exact coordinates of where it entered the frame. You then compare those with the Hawk-Eye projection.
There is an infra-red camera placed over a good length. A bit like Hot Spot: as the ball pitches, it leaves a heat mark. So it compares whether Hawk-Eye got the pitching point right and where it crossed the stumps right.
The bowlers had to land it on a ten-cent piece and get it through the frame. They record all the images and all the points of interception through the laser field. The Hawk-Eye data is obtained at the end, and they compare the two. They do about 150 balls, one spinner, one quick, with red balls, white balls and pink balls, just to see how they adjusted the settings for each. All this involved two or three days of testing at Winchester, where 150-200 deliveries were sent down by bowling machines and the bowlers.
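The frame test boils down to a per-delivery error measurement: the laser field records where the ball actually crossed the stump plane, and the tracking system supplies its projected coordinates. A minimal sketch of that comparison, with entirely illustrative coordinates and function names (this is not the actual MIT/ICC analysis code):

```python
# Hypothetical sketch of the frame-test comparison described above:
# for each delivery, compare the laser-measured interception point at
# the stump plane with the tracking system's projected point, then
# summarise the disagreement. All coordinates are illustrative.

import math

def tracking_error(measured, projected):
    """Euclidean distance (mm) between a laser-measured and a
    projected (x, y) interception point at the stump plane."""
    return math.dist(measured, projected)

# Illustrative (x, y) coordinates in millimetres for three deliveries
laser_points = [(112.0, 540.0), (98.5, 610.0), (130.2, 480.5)]
projected_points = [(114.1, 538.2), (97.0, 612.5), (131.0, 482.0)]

errors = [tracking_error(m, p)
          for m, p in zip(laser_points, projected_points)]
print(f"mean error: {sum(errors) / len(errors):.1f} mm, "
      f"max error: {max(errors):.1f} mm")
```

Repeating this over a mixed sample of pace and spin, and red, white and pink balls, is what lets the testers say where a tracking system is strong and where it is less reliable, rather than giving it a single blanket grade.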
"There will be a discussion around the technology, whether umpires continue to make decisions on the field and technology reviews those decisions, whether players continue to initiate reviews or somebody else does"
You are going to present all these findings to the cricket committee this week?
The aim is, the researchers will present their findings on each of the technologies they have assessed or observed to the CC [Cricket Committee] - their observations of the technology and the suitability for use. The technology will have areas where it is strong, where it is not strong, and that is what we have asked them to identify. Are there any recommendations to be made around installation, calibration or operation that might give a more consistent performance, or improve the performance of the systems? That is what we have charged them with doing.
The predictive element of ball-tracking is an element the BCCI and like-minded people have been sceptical about.
Yes, that has been raised as one of the concerns around the use of DRS, and I think this will be an important test. They are working through the data and will probably be ready just ahead of the cricket committee meeting, and it will be presented to the group. So there will be a discussion around the technology and around the protocols, how the technology is used. There will also be talk about other things: whether umpires continue to make decisions on the field and technology reviews those decisions, and whether players continue to initiate reviews or somebody else does.
There are lots of different schools of thought on the umpire's call, but the current interpretation flows from the sequence: the umpire's decision is made, the review asks whether it was an error, and replays can overturn it. There was some connection to how the game has been umpired historically.
We're looking for the cricket committee to provide some direction about how they see DRS being used in the future. Most series are running around the world with good results using all the technologies, so I don't think our aim is for us to be saying yes or no to any technology but just more to understand their performance.
I know umpires are of the view that technology should be used consistently around the world. At the moment we have some games with Hot Spot, some without. They would prefer to see a more consistent role for technology match to match. That doesn't mean having Real Time Snicko in one game and Ultra Edge in another; one or the other at every game is probably the sort of consistency they are looking for.
With inputs from Osman Samiuddin, sportswriter at The National

Nagraj Gollapudi is a senior assistant editor at ESPNcricinfo