EarthAI is an advanced analysis platform for conducting geospatial analysis. The purpose of the EarthAI platform is:
- To make Earth observation data discoverable and easily available
- To reduce the barriers that exist for those who seek to learn more about the planet
- To enable companies, NGOs, non-profit organizations, educational institutions, and individuals to leverage the wealth of remote sensing data collected every day to answer important questions
The EarthAI platform comprises several key parts:
At the heart of EarthAI is our catalog of data. The EarthAI Catalog indexes public data sources such as Sentinel, Landsat, MODIS, and NAIP, and it also allows organizations and individuals to organize and manage their own spatial assets. EarthAI Catalog has a Python API available in EarthAI Notebook as well as a web-service API for integration with other applications.
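To give a flavor of the kind of spatio-temporal query a catalog API supports, here is a minimal sketch using a small in-memory index. The names here (`search_catalog`, the `SCENES` structure) are hypothetical illustrations, not the actual EarthAI Catalog API:

```python
from datetime import date

# Toy in-memory index standing in for a catalog of scenes. Each entry
# records the collection, acquisition date, and bounding box as
# (min_lon, min_lat, max_lon, max_lat).
SCENES = [
    {"collection": "landsat", "date": date(2020, 6, 1),
     "bbox": (-122.6, 37.6, -121.8, 38.1)},
    {"collection": "sentinel", "date": date(2020, 6, 15),
     "bbox": (-100.0, 30.0, -99.0, 31.0)},
]

def intersects(a, b):
    """True if two (min_lon, min_lat, max_lon, max_lat) boxes overlap."""
    return not (a[2] < b[0] or b[2] < a[0] or a[3] < b[1] or b[3] < a[1])

def search_catalog(bbox, start, end):
    """Hypothetical catalog search: filter scenes by area and date range."""
    return [s for s in SCENES
            if intersects(s["bbox"], bbox) and start <= s["date"] <= end]

# Find scenes covering the San Francisco Bay Area in June 2020.
results = search_catalog((-122.5, 37.7, -122.0, 38.0),
                         date(2020, 6, 1), date(2020, 6, 30))
print([s["collection"] for s in results])  # → ['landsat']
```

The same two filters, a spatial footprint and a time window, are the core of any Earth observation catalog query.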
Built on top of the EarthAI Catalog, Earth OnDemand is a visual discovery tool that helps users find geospatial data in space and time. With simple visual analysis tools, you can preview images, view them in full resolution, and compare them. Earth OnDemand also supports a variety of export options for GIS users, PNG download, and further analysis in EarthAI Notebook.
EarthAI Notebook is a JupyterLab notebook pre-loaded with the most commonly used geospatial libraries and built on a scalable, cloud-native architecture. Because Notebook is built on Kubernetes, users can run their notebooks on a variety of infrastructures, including GPU nodes, Spark clusters, and a range of individual server types. Notebooks can easily switch from one infrastructure to another, so you can build and test your analytics on small nodes, train your models on GPUs, and score your models on Spark clusters.
RasterFrames is the open-source core of the EarthAI platform, designed to enable at-scale analysis of raster data on Spark. RasterFrames offers two key advantages to users seeking to process large volumes of imagery. First, it allows users to represent rasters as data frames, which makes working with rasters much easier. Second, it can automatically break raster files up into tiles and manage distributed computation jobs on a Spark cluster. This means that if you build your analytic using RasterFrames, you can scale it to a global extent simply by moving your notebook to a larger cluster; you no longer need to rewrite your code to manage the scale.
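The tiling idea behind this can be sketched in plain Python. This is an illustration of the concept only, not RasterFrames' actual implementation; in RasterFrames the tiles become cells in a Spark DataFrame column and the per-tile work is distributed across the cluster:

```python
def split_into_tiles(raster, tile_size):
    """Break a 2-D raster (list of rows) into square tiles.

    Each tile is returned with its (row, col) offset so results can be
    mapped back to the original grid -- the bookkeeping that lets a
    distributed engine process tiles independently.
    """
    height, width = len(raster), len(raster[0])
    tiles = []
    for r in range(0, height, tile_size):
        for c in range(0, width, tile_size):
            tile = [row[c:c + tile_size] for row in raster[r:r + tile_size]]
            tiles.append(((r, c), tile))
    return tiles

def tile_mean(tile):
    """A per-tile statistic, e.g. the mean pixel value."""
    values = [v for row in tile for v in row]
    return sum(values) / len(values)

# A 4x4 raster split into 2x2 tiles yields four independent work units.
raster = [[1, 2, 3, 4],
          [5, 6, 7, 8],
          [9, 10, 11, 12],
          [13, 14, 15, 16]]
tiles = split_into_tiles(raster, 2)
print(len(tiles))              # → 4
print(tile_mean(tiles[0][1]))  # → 3.5  (mean of 1, 2, 5, 6)
```

Because each tile carries its own offset and can be processed independently, the same per-tile function runs unchanged whether there are four tiles on a laptop or millions of tiles on a cluster.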