Using the imagery captured by the Data Bike to measure cycle path surface quality.
Last month I shared a link to the IEEE Global Road Damage Detection Challenge 2020, after realising that measuring the surface quality of a bike path from a phone's sensors is not particularly accurate at scale.
You can see the results of this year's Global Road Damage Detection Challenge in this paper.
What I really like about this challenge is that all entries are required to be open-sourced.
There were 12 submissions to the 2020 challenge. The report's leaderboard lists, for each entry: the GRDDC rank, team name, Test1 score, Test2 score, and a link to the source code.
The aim of the competition was to build models that could identify areas of a road where damage is present and the severity.
However, I realised this wasn't completely suited to our aims, for two main reasons.
Firstly, many cycle paths cover a range of terrain in addition to asphalt roads (which is what these models were built around).
Secondly, although single areas of damage are important (a large pothole can be very dangerous to a cyclist), ride quality, in my mind, is more about the condition of an entire stretch of path. Yes, a large pothole lowers ride quality and should be fixed, but if the path is wide and smooth and the pothole can easily be avoided, then I'd say the overall quality is pretty good.
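To make that idea concrete, here's a minimal sketch of one way to score a whole stretch of path rather than a single defect. It's a hypothetical aggregation I'm using for illustration, not anything from the challenge code: per-frame quality scores (invented values here) are combined with rolling medians, so one isolated pothole frame barely moves the segment's score.

```python
# Hypothetical sketch: aggregate per-frame quality scores (0 = bad, 1 = good)
# over a path segment so that one isolated defect doesn't dominate the rating.
# The frame scores and window size below are invented, illustrative values.
from statistics import median

def segment_quality(frame_scores, window=5):
    """Median of rolling medians: robust to isolated bad frames."""
    if len(frame_scores) < window:
        return median(frame_scores)
    rolling = [
        median(frame_scores[i:i + window])
        for i in range(len(frame_scores) - window + 1)
    ]
    return median(rolling)

# A single pothole frame (0.1) amid an otherwise smooth path
# barely affects the overall segment score.
scores = [0.9, 0.85, 0.9, 0.1, 0.9, 0.88, 0.92]
print(segment_quality(scores))  # → 0.9
```

A mean would let that one 0.1 frame drag the whole segment down, which is exactly the behaviour I'd want to avoid when a defect is easily avoidable.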
I then stumbled across another research paper, Road Surface Classification with Images Captured From Low-cost Camera.
There’s a nice supplementary blog post here on the research too.
The codebase from the research can be obtained here.
As you can see from the above video, the software detects the class of road (either asphalt, paved, or unpaved) and also determines a quality score (either good, regular, or bad). The numbers shown in the video indicate the probability of a correct detection.
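The outputs described above amount to two classification heads, each producing a probability distribution over its classes. Here's a small illustrative sketch of that shape using a plain softmax; the logit values are invented (a real model would compute them from the image), and this is not the paper's actual code.

```python
# Hypothetical sketch of a two-headed output like the one described above:
# one head for surface class (asphalt / paved / unpaved) and one for
# quality (good / regular / bad). The logits below are invented values;
# in the real system they would come from a CNN run on a camera frame.
import math

def softmax(logits):
    """Convert raw scores into probabilities that sum to 1."""
    exps = [math.exp(x) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

surface_classes = ["asphalt", "paved", "unpaved"]
quality_classes = ["good", "regular", "bad"]

surface_probs = softmax([3.1, 0.4, -1.2])  # invented logits
quality_probs = softmax([0.2, 1.8, -0.5])  # invented logits

# The reported number is the probability of the top-scoring class,
# which matches the figures overlaid in the video.
surface = surface_classes[surface_probs.index(max(surface_probs))]
quality = quality_classes[quality_probs.index(max(quality_probs))]
print(surface, round(max(surface_probs), 2))
print(quality, round(max(quality_probs), 2))
```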
Over the next few weeks I’m going to dig further into these models and code, see how they perform with images from various data bikes, and see if they’re suited for use at scale.