The Defense Innovation Unit Experimental (DIUx), a United States Department of Defense (DoD) organization focused on accelerating the adoption of commercial technology by the US military, and the National Geospatial-Intelligence Agency (NGA) have jointly launched a contest offering $100,000 in cash awards to develop algorithms that can interpret high-resolution satellite images.
DIUx’s goal is to advance key frontiers in computer vision and develop new solutions for national security and disaster response.
The contest, called the xView Detection Challenge, starts next month and ends in May, followed by a workshop in June. In addition to the cash prizes, awards include cloud compute credits.
Entrants will use xView’s publicly available datasets of overhead imagery to train algorithms to identify details relevant to disaster relief or humanitarian missions. xView’s dataset contains images of complex scenes around the world, annotated with bounding boxes, and is released under a public, noncommercial license for anyone to use. The xView Detection Challenge is focused on accelerating progress in four computer vision frontiers:
- Reduce minimum resolution for detection
- Improve learning efficiency
- Enable discovery of more object classes
- Improve detection of fine-grained classes
Analyzing satellite data with machine learning
It is now widely accepted that applying computer vision and machine learning to overhead imagery has the potential to detect emerging natural disasters, improve response, quantify direct and indirect impacts, and save lives. The agencies believe the xView Challenge will enable research and applications for important disaster relief missions; areas of interest include utility trucks, damaged buildings, vehicle lots, and helipads/helicopters, among others.
The satellite imagery is at 0.3-meter resolution and covers about 1,415 sq km. The images span visible and infrared light and have been hand-annotated with a million examples of 60 different object classes.
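To give a feel for working with bounding-box annotations like these, here is a minimal sketch of parsing a GeoJSON-style annotation file and tallying objects per class. The field names (`bounds_imcoords`, `type_id`) and the comma-separated box encoding are assumptions for illustration, not a confirmed description of the xView release format.

```python
from collections import Counter

# Hypothetical GeoJSON-style annotation snippet. The property names
# ("bounds_imcoords", "type_id") are assumed for this sketch and may
# differ from the actual xView dataset schema.
sample = {
    "features": [
        {"properties": {"bounds_imcoords": "10,20,50,80", "type_id": 17}},
        {"properties": {"bounds_imcoords": "100,40,140,90", "type_id": 17}},
        {"properties": {"bounds_imcoords": "5,5,25,30", "type_id": 73}},
    ]
}

def parse_annotations(geojson):
    """Return (boxes, class_ids) from a GeoJSON-like annotation dict.

    Each box is an (xmin, ymin, xmax, ymax) tuple in pixel coordinates.
    """
    boxes, class_ids = [], []
    for feature in geojson["features"]:
        props = feature["properties"]
        xmin, ymin, xmax, ymax = map(int, props["bounds_imcoords"].split(","))
        boxes.append((xmin, ymin, xmax, ymax))
        class_ids.append(props["type_id"])
    return boxes, class_ids

boxes, class_ids = parse_annotations(sample)
print(Counter(class_ids))  # per-class object counts for the sample
```

With a million annotated objects across 60 classes, a per-class tally like this is typically the first step in checking class balance before training a detector.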
The competition follows in the footsteps of challenges such as Common Objects in Context (COCO) and seeks to build on SpaceNet and Functional Map of the World (FMoW), applying computer vision to the growing volume of imagery available from space to better understand the visual world and address a range of important applications.
Check who is eligible
More details about the dataset and initial experiments can be found in the NIPS poster presented at the Machine Learning for the Developing World workshop. It is also a good idea to read the terms and conditions before registering for the challenge, since there is a long list of restrictions. Notably, residents of Iran, Crimea, Cuba, North Korea, Sudan, and Syria, of other countries on the US State Department's State Sponsors of Terrorism list (due to OFAC restrictions), and of any jurisdiction where participation is prohibited by law are not eligible. Current employees or contractors of DIUx, NGA, or their affiliates are also ineligible, while US government officials can only enter the challenge in their personal capacity and may be ineligible to receive certain benefits awarded to winners.
While participants retain the rights to their submitted output files and source code, DIUx and NGA will obtain the right to use and build on the winning submissions. Winners may also be offered follow-on work on other defense missions.