Learning to analyse traffic patterns in developing countries using a neural network
Road traffic accidents are one of the leading causes of death in many Low- and Middle-Income Countries (LMICs). Infrastructure is limited and typically prioritises the needs of motorised transport. City authorities in LMICs would benefit from rapid assessments of road traffic hazards at key locations (road crossings, other transport infrastructure, etc.), both to assess needs and to determine the impacts of any road improvements.
In the UK, we collect data from sensors in the roads or mounted on street furniture around junctions. These data enable the calculation of metrics such as flow, occupancy and velocity for identifying traffic movements, congestion and incidents. If video footage is available, it can also be used to identify and count vehicles and their types.
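As a minimal illustration of the flow and occupancy metrics mentioned above, the sketch below computes them from timestamped vehicle events at a single sensor. The event format (per-vehicle arrival and departure times) and function names are assumptions for illustration, not part of any existing pipeline:

```python
# Illustrative sketch (assumed data format): each event is a
# (arrival_time_s, departure_time_s) pair for one vehicle passing the sensor.

def flow_and_occupancy(events, window_s):
    """Flow = vehicles per hour; occupancy = fraction of time the sensor
    is occupied, both over a window of window_s seconds."""
    flow = len(events) * 3600.0 / window_s
    occupied_s = sum(dep - arr for arr, dep in events)
    occupancy = occupied_s / window_s
    return flow, occupancy

# Three hypothetical vehicles in a 60-second window:
events = [(0.0, 0.4), (5.2, 5.7), (11.0, 11.5)]
flow, occ = flow_and_occupancy(events, window_s=60.0)  # flow -> 180.0 veh/h
```

Velocity would additionally need either paired sensors a known distance apart or, as in this project, positions tracked across video frames.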
This advanced infrastructure is lacking in many LMICs. Here, drone footage would allow the rapid collection of imagery that could then be used to identify vehicles, their speeds and density, transport modes, and the interactions between different modes (cars vs. trucks, cars vs. motorbikes, cars vs. pedestrians, etc.). It is a much cheaper and more flexible approach to traffic analysis and management, requiring no new traffic infrastructure.
This project will automate the identification and analysis of different transport modes from this drone video footage. We have already successfully installed a widely available deep neural network (DNN) that identifies cars in drone footage relatively accurately (Fig 1).
However, the network must be trained to identify other types of vehicle (trucks, bikes) and pedestrians. The student will label existing video footage generated by the University of York and use this labelled data to train the network further using a well-established pipeline.
This may be followed by risk (proximity) assessments and hotspot identification for predicting accidents. This project has the potential to analyse traffic congestion and aid decision support to reduce both congestion and, potentially, traffic fatalities across a wide range of LMICs.
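In its simplest form, a proximity assessment of the kind described above could compare the detected bounding boxes within each frame and flag pairs that pass close to one another. A minimal sketch, assuming pixel-space boxes from the detector; the function names and distance threshold are illustrative:

```python
import math

def centroid(box):
    """Centre of an (x, y, w, h) bounding box in pixels."""
    x, y, w, h = box
    return (x + w / 2.0, y + h / 2.0)

def close_pairs(boxes, threshold_px):
    """Return index pairs of boxes whose centroids are closer than threshold_px."""
    pairs = []
    for i in range(len(boxes)):
        for j in range(i + 1, len(boxes)):
            (xi, yi), (xj, yj) = centroid(boxes[i]), centroid(boxes[j])
            if math.hypot(xi - xj, yi - yj) < threshold_px:
                pairs.append((i, j))
    return pairs

# Three hypothetical detections; the first two are close together:
boxes = [(10, 10, 20, 20), (25, 15, 20, 20), (200, 200, 20, 20)]
print(close_pairs(boxes, threshold_px=50))  # -> [(0, 1)]
```

Aggregating such pairs over many frames (and converting pixel distances to metres using the drone's altitude) would be one route to the hotspot maps mentioned above.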
(Figure 1: Automatically labelled drone footage: in a single video frame, 10 cars are identified automatically. The labelling works accurately at more than 30 fps on UoY drone footage.)
We are looking for one student for this project, with the following skills:
- Ability to label items in images.
- Ability to install and run existing video analytics software for vehicle detection
- Ability to analyse the results and outputs of the video analytics by analysing labels and bounding boxes defined in plain text files
- Understanding of basic statistics
- Familiarity with Linux command line
- Ability to extend this to (one or more of):
- counting the vehicles automatically,
- identifying vehicle types,
- estimating vehicle density automatically,
- estimating vehicle speeds automatically,
- determining source-destination of vehicles across the footage.
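As an illustration of analysing labels and bounding boxes defined in plain text files, the sketch below counts detections per class from one label file. It assumes a YOLO-style format (one `class_id cx cy w h` line per detection, with normalised coordinates), which is common for such detectors but not confirmed for this project's pipeline; the class-ID mapping is hypothetical:

```python
from collections import Counter

# Hypothetical class-ID mapping; the real one depends on the trained network.
CLASS_NAMES = {0: "car", 1: "truck", 2: "motorbike", 3: "pedestrian"}

def count_classes(label_text):
    """Count detections per class from the text of one label file.
    Each non-empty line is assumed to be: class_id cx cy w h."""
    counts = Counter()
    for line in label_text.splitlines():
        fields = line.split()
        if not fields:
            continue
        class_id = int(fields[0])
        counts[CLASS_NAMES.get(class_id, "unknown")] += 1
    return counts

sample = "0 0.5 0.5 0.1 0.2\n0 0.3 0.4 0.1 0.2\n2 0.7 0.1 0.05 0.1"
print(count_classes(sample))  # Counter({'car': 2, 'motorbike': 1})
```

Summing such per-frame counts over a video, and normalising by the area covered, would be one simple route to the automatic counting and density estimates listed above.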
How to apply
For more details on the summer school application process (including eligibility and funding) please go here: https://digitalcreativity.ac.uk/news/dc-labs-summer-school-2021
If you would like to ask informal questions about the project please contact Dr Victoria Hodge (email@example.com), Dr Steve Cinderby (firstname.lastname@example.org), or Dr Alex Wade (email@example.com).
For general questions about the programme, logistics, etc. please contact Ella Eyre, DC Labs Administrator, at firstname.lastname@example.org.