NatureLocator: Leaf Watch
Strapline: A geospatial smartphone application enabling public engagement in
biological survey work via the crowdsourcing of data collection and data validation
Primary University Users: Researchers and students
Primary Other Users: Anyone interested in collecting or providing crowd-sourced, geotagged photographic data
NatureLocator’s Leaf Watch app provides a cross-platform solution for crowdsourcing the collection of geospatially tagged photographic data. The project has also integrated crowdsourced data validation, so that the data collected can be checked and verified by the general public.
Why we created the Leaf Watch app
The Leaf Watch app was developed to augment an existing project, Conker Tree Science, which aims to collect as much information as possible about the current and changing distribution of the horse chestnut leaf miner moth. Prior to the app’s production, data collection had been crowdsourced but the Conker Tree Science project was reliant on people submitting their data using a website form. Submissions over the season numbered around 500 and were text-based only. Verification of records was therefore not possible, and the location of affected trees had to be recorded using an OS map.
The app has greatly facilitated the data collection and submission process (by making it portable) while simultaneously increasing data accuracy by incorporating a GPS-tagged photographic image of the subject. The convenience the app provides has encouraged many more people to participate in the study and yielded a much larger data set: some 5,200 data submissions. One such user, Laura Illsley, keeps her eyes open for affected horse chestnut trees when out walking her dog. Laura generally has her phone with her when she’s out walking, which means there is little effort involved in recording stricken trees. Laura is an example of a large group of data collectors who would not ordinarily have participated in the study but who have contributed valuable data.
View the NatureLocator Leaf Watch website here:
This site contains information on the app itself, and also hosts a web application we have built in order to crowdsource the validation of the data collected.
View the Validation page here:
The Leaf Watch App
The app itself has been designed to be very simple to use whilst still collecting valid biological data. It is available free for both the iPhone and Android platforms and, once downloaded, can be used immediately without any registration.
After starting the app it’s very straightforward to submit a record.
Clicking on “Add new record” takes you to the “New Record” screen where you follow a sequence of tasks:
- Take a photo of a typical leaf
- Give the leaf a damage score
- Use your phone’s GPS to add the location (or if no GPS, you can put in a text based location)
- Assess the type of ground cover
You can also add your e-mail address if you want to be contacted by the team with information on the results, future projects and so on.
Once all fields are completed, you can then send the record in.
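The steps above amount to assembling a small structured record for upload. The sketch below illustrates that idea; the field names and the damage-score scale are our assumptions for illustration, not the app’s actual schema:

```python
# Illustrative sketch of a Leaf Watch submission record.
# Field names and the 0-4 damage scale are assumptions, not the real schema.

DAMAGE_SCORES = range(0, 5)  # assumed scale: 0 = no damage .. 4 = severe

def build_record(photo_path, damage_score, lat=None, lon=None,
                 place_text=None, ground_cover=None, email=None):
    """Assemble one submission, requiring either GPS or a text-based location."""
    if damage_score not in DAMAGE_SCORES:
        raise ValueError("damage score out of range")
    if (lat is None or lon is None) and not place_text:
        raise ValueError("need GPS coordinates or a text-based location")
    location = ({"lat": lat, "lon": lon} if lat is not None and lon is not None
                else {"text": place_text})
    return {
        "photo": photo_path,          # path to the photo of a typical leaf
        "damage_score": damage_score, # the user's damage assessment
        "location": location,         # GPS fix, or free-text fallback
        "ground_cover": ground_cover, # type of ground cover under the tree
        "email": email,               # optional, for follow-up contact only
    }

record = build_record("leaf.jpg", 2, lat=51.4545, lon=-2.5879,
                      ground_cover="grass")
```

Keeping the e-mail field optional mirrors the app’s no-registration design: a record is valid without any personal details.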
To download the app for the iPhone or iPad visit the App Store: http://itunes.apple.com/gb/app/conker-tree-science-leaf-watch/id445371129?mt=8
To download the app for the Android platform visit the Android Market: https://market.android.com/details?id=uk.ac.bris.ilrt.leafwatch
The Data Validation Web Application
The validation web application adds a valuable extra level of functionality; as a tool for research, the two products together can be seen as providing a rounded “service”. Designed as a mechanism by which large quantities of data can be checked and verified by the general public, the validation process is based on the same concept as Galaxy Zoo. It also provides an element of “gamification” in the form of a “Validator’s Leaderboard” to encourage a degree of gentle competition.
Importantly, the website also features a Results page where we are feeding back information gathered by the app to the public in a visual form. We have an interactive map to enable people to track results over time so they can get a feel of how their data has contributed to the overall picture.
Feeling like a bit of validation tonight…?
Information is provided on how to log in to the validation application. In addition, comprehensive guidance on accurately evaluating the image data is provided. This means that even if you’ve never set eyes on a horse chestnut tree before, you will still be able to take part and contribute valuable data. We will be measuring the amalgamated accuracy of the public’s scoring against an expert’s in order to gauge the accuracy of this type of crowdsourcing. We’ll use the results to inform decisions about employing it in future projects.
The first screen allows the user to log in via one of three authentication providers: Google, Facebook or Twitter. This provides us with a username for the leaderboard and also enables us to match sessions against individual IDs in case of abuse. Users with none of these accounts are given instructions on how to sign up for a Google account.
The second screen is simply an authentication notice from the service you have chosen to log in with.
Once into the web application itself, first-time users are presented with a message linking to information to help them with the validation process:
Users can then start applying leaf damage ratings to the images presented to them by the web app. There is an “Invalid” option to say that the leaf isn’t a horse chestnut leaf and also a “Don’t know” option should the image be unclear.
How the images are presented by the web app
After a good deal of discussion we decided that presenting images to users in a standardised order made the most sense on balance. This was to make the statistics we gather as useful as possible, since it is likely that few people, if anyone, will validate all 5,200 records (go on, I dare you). If 10 people each validate 1,000, we will at least have a good number of images evaluated multiple times against which to check accuracy.
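The accuracy check this ordering enables can be sketched in outline: for each image that has been validated several times, take a consensus of the public scores and compare it with the expert’s score. The code below is a minimal illustration of that idea, assuming a simple majority vote; the actual analysis method used by the project may differ:

```python
from collections import Counter

def consensus_score(votes):
    """Most common damage score among public votes, ignoring
    'invalid'/'don't know' responses (represented here as None)."""
    counted = Counter(v for v in votes if v is not None)
    return counted.most_common(1)[0][0] if counted else None

def agreement_rate(public_votes_by_image, expert_scores):
    """Fraction of images where the crowd consensus matches the expert score."""
    matches = total = 0
    for image_id, expert in expert_scores.items():
        crowd = consensus_score(public_votes_by_image.get(image_id, []))
        if crowd is not None:
            total += 1
            matches += (crowd == expert)
    return matches / total if total else 0.0

# Hypothetical data: two images, each scored by several members of the public.
votes = {"img1": [2, 2, 3, None], "img2": [0, 1, 1]}
experts = {"img1": 2, "img2": 1}
rate = agreement_rate(votes, experts)  # -> 1.0 for this toy data
```

Because everyone sees the images in the same order, the early images accumulate many votes each, which is exactly what a comparison like this needs.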
What we hope to do next
The team are very excited about the potential to develop the app further. The kinds of things we are thinking about include the following:
- User feedback – e.g. data could be pushed to the phone based on the user’s location, on request or by opt-in. In this fashion the app could alert you to “hotspots” (areas of particular interest to the user or biologist). This would enable recording effort to be focused, or give people a good idea of where to look for a particular species.
- Dynamic mapping, to show distribution of records to app users. They can see their own submitted records immediately and those of others.
- Data management enhancements to the backend to facilitate analysis of data “on the fly”. Enhancements might include a user friendly interface with tools to enable non-technical people to access and use the data effectively and efficiently.
- Additional platform support to include Blackberry and Windows OS
- Templates to assist with optimal recording of photographic images (one problem we had was people submitting images that weren’t appropriately taken and couldn’t be verified)
- Facility to “share” the app with friends (to facilitate its viral spread)
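The “hotspot” alert idea in the first bullet could be built on a simple great-circle distance check against a list of areas of interest. Here is one possible sketch using the haversine formula; the hotspot data and coordinates are purely illustrative:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * asin(sqrt(a))  # 6371 km = mean Earth radius

def nearby_hotspots(user_lat, user_lon, hotspots, radius_km=5.0):
    """Hotspots within radius_km of the user's position, nearest first."""
    measured = [(haversine_km(user_lat, user_lon, h["lat"], h["lon"]), h)
                for h in hotspots]
    measured.sort(key=lambda pair: pair[0])
    return [h for d, h in measured if d <= radius_km]

# Hypothetical hotspot list; names and coordinates are illustrative only.
spots = [{"name": "Clifton Down", "lat": 51.465, "lon": -2.620},
         {"name": "Bath", "lat": 51.381, "lon": -2.359}]
near = nearby_hotspots(51.4545, -2.5879, spots, radius_km=5.0)
```

A production version would run this server-side against live record data and push alerts to the phone, but the core proximity test is this simple.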
Meet the people behind Leaf Watch: http://naturelocator.ilrt.bris.ac.uk/about/
The NatureLocator team will ensure that the app and website remain available for at least three years. The Conker Tree Science project itself looks set to remain operational for the foreseeable future ensuring that the app and website will be well utilised.
The code for the app will be made available under the following licences, and people are free to adapt it to meet their own objectives.
Phone App: Modified BSD
Website App: Modified BSD
Data: Available September 2012* under a CC0 licence
* The scientists involved in the project will still be busily working away on the data until that point.