In the paper titled “Transfer Learning from Deep Features for Remote Sensing and Poverty Mapping”, an interdisciplinary team of researchers from the Computer Science and Earth System Science departments has introduced a new transfer learning approach for estimating poverty from satellite imagery.
Electricity use and nighttime lights are modern indicators of economic activity, and the absence of artificial light at night is regarded as a sign of relative poverty. A machine learning system with access to a treasure trove of daytime and nighttime satellite imagery, together with “ground truth” (poverty data collected using traditional survey methods), can learn associations that can then be used to map poverty in areas where such ground truth is unavailable.
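
To make the two-step transfer idea concrete, here is a minimal sketch, assuming a PyTorch plus scikit-learn setup: a CNN is first adapted to predict nighttime-light intensity from daytime tiles (the cheap, globally available proxy), and its features are then reused to fit a regressor on the few surveyed locations. The ResNet-18 backbone, ridge regressor, and placeholder random tensors standing in for real imagery and survey data are illustrative choices and do not come from the paper itself.

```python
import torch
import torch.nn as nn
from torchvision import models
from sklearn.linear_model import Ridge

# --- Step 1: proxy task -------------------------------------------------
# Placeholder data: daytime tiles and binned nightlight intensity labels.
daytime_tiles = torch.randn(32, 3, 224, 224)   # batch of RGB daytime tiles
nightlight_bins = torch.randint(0, 3, (32,))   # low / medium / high lights

cnn = models.resnet18(weights="IMAGENET1K_V1")
cnn.fc = nn.Linear(cnn.fc.in_features, 3)      # predict 3 nightlight classes

optimizer = torch.optim.Adam(cnn.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()
for _ in range(5):                             # a few toy training steps
    optimizer.zero_grad()
    loss = loss_fn(cnn(daytime_tiles), nightlight_bins)
    loss.backward()
    optimizer.step()

# --- Step 2: transfer to poverty estimation ------------------------------
# Reuse the trained trunk as a feature extractor and fit a simple ridge
# regression on the small set of locations with survey ("ground truth") data.
feature_extractor = nn.Sequential(*list(cnn.children())[:-1])
feature_extractor.eval()

survey_tiles = torch.randn(8, 3, 224, 224)     # tiles at surveyed sites
survey_poverty = torch.rand(8).numpy()         # e.g. a consumption index

with torch.no_grad():
    feats = feature_extractor(survey_tiles).flatten(1).numpy()

regressor = Ridge(alpha=1.0).fit(feats, survey_poverty)

# The fitted regressor can now score tiles from regions with no survey data.
unsurveyed_tiles = torch.randn(4, 3, 224, 224)
with torch.no_grad():
    new_feats = feature_extractor(unsurveyed_tiles).flatten(1).numpy()
print(regressor.predict(new_feats))
```

The point of the intermediate nightlight task is that proxy labels exist everywhere imagery does, so the feature extractor can be trained at scale before the scarce survey data ever enter the picture.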

Read Full Story @ http://geoawesomeness.com/mapping-poverty-using-machine-learning-and-satellite-imagery/
