Pl@ntNet-API: integrate a plant visual identification engine into your citizen science app


Pl@ntNet-API enables citizen science apps and other third-party applications, developed by industrial, academic, or associative stakeholders, to easily integrate automatic plant identification. To make this possible, the Pl@ntNet team has developed an Application Programming Interface (API) through which developers can call the Pl@ntNet identification engine.

Once the service is integrated into a citizen science app, your users can identify plant species quickly. How? When a user uploads a plant photo, the service returns a ranked list of candidate species, each with a score indicating how likely the image is to show that species. The service can also retrieve the most visually similar photos (and their species) from the Pl@ntNet database for an uploaded observation, and it can identify plants from multi-organ photos (different parts of the same plant), such as the flower, the fruit, or the leaf.
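As a rough illustration of this flow, the sketch below parses a candidate list of the kind the service returns. The endpoint URL follows the public my.plantnet.org v2 documentation, but the exact response field names (`results`, `score`, `species.scientificNameWithoutAuthor`), the API key, and the sample data are assumptions for illustration, not guaranteed by this article.

```python
# Hedged sketch of consuming an identification response.
# Endpoint per the public my.plantnet.org v2 docs; "YOUR_API_KEY"
# and all field names below are illustrative assumptions.
API_URL = "https://my-api.plantnet.org/v2/identify/all"

def top_candidates(response_json, k=3):
    """Return up to k (species, score) pairs, best score first.

    Assumes the response carries a "results" list whose items have a
    "score" and a "species" -> "scientificNameWithoutAuthor" field.
    """
    results = response_json.get("results", [])
    ranked = sorted(results, key=lambda r: r["score"], reverse=True)
    return [
        (r["species"]["scientificNameWithoutAuthor"], r["score"])
        for r in ranked[:k]
    ]

# A request could be sent with the third-party `requests` library
# roughly like this (not executed here; multipart upload assumed):
#
#   import requests
#   with open("flower.jpg", "rb") as img:
#       resp = requests.post(
#           API_URL,
#           params={"api-key": "YOUR_API_KEY"},
#           files=[("images", img)],
#           data={"organs": ["flower"]},
#       )
#   candidates = top_candidates(resp.json())

# Mock response illustrating the assumed shape:
mock = {
    "results": [
        {"score": 0.87,
         "species": {"scientificNameWithoutAuthor": "Papaver rhoeas"}},
        {"score": 0.06,
         "species": {"scientificNameWithoutAuthor": "Papaver dubium"}},
    ]
}
print(top_candidates(mock))
```

In a real app you would replace the mock with the JSON body returned by the service and surface the top candidates to the user.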

As an app developer, you can also customize which flora your users monitor. For example, if you want to monitor flora in the Mediterranean mountains, you can select that filter, and the visual identification engine will restrict its suggestions to plants living in that area.
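In API terms, this filtering could amount to selecting a project identifier in the endpoint path, as sketched below. The base URL follows the public my.plantnet.org v2 documentation; the regional slug "mediterranean-mountains" is a hypothetical example, not a confirmed project name.

```python
# Hedged sketch: choosing a flora project via the endpoint path.
# "all" targets the whole Pl@ntNet flora; regional slugs such as the
# hypothetical "mediterranean-mountains" are illustrative only.
BASE = "https://my-api.plantnet.org/v2/identify"

def identify_url(project="all"):
    """Build the identification endpoint for a given flora project."""
    return f"{BASE}/{project}"

print(identify_url())                          # whole flora
print(identify_url("mediterranean-mountains")) # hypothetical regional flora
```

Which project slugs actually exist should be checked against the service's own documentation before shipping.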

This service is regularly updated and enriched because it is connected to the Pl@ntNet database, which continuously receives new flora images. Additionally, Pl@ntNet's infrastructure and data-management software are highly scalable, so as a developer of a downstream application you do not have to manage the workload yourself.

Using the service will improve the quality control of your users' identifications. As a result, you will help build more trustworthy citizen science plant databases for use in scientific research, for example to track invasive and endangered species.

Pl@ntNet-API has been developed by Inria within the framework of the Horizon 2020 Cos4Cloud project, coordinated by ICM-CSIC, and is part of the EOSC Portal. The service is ready to use and available in the EOSC Marketplace.

Access the EOSC Marketplace now and get it!

31 January 2022