Amazon Web Services (AWS), Amazon’s cloud computing platform, has made the data captured by NOAA’s GOES-R series satellites available on its Amazon S3 storage service. NOAA operates a constellation of Geostationary Operational Environmental Satellites (GOES) to provide continuous weather imagery and to monitor meteorological and space environment data for the protection of life and property across the United States.
The satellites provide advanced weather imagery with increased spatial resolution, 16 spectral channels, and scan frequencies as high as one image per minute, enabling more accurate forecasts and timely warnings. According to AWS, the platform will offer a real-time feed and a full historical archive of original-resolution Advanced Baseline Imager (ABI) radiance data (Level 1b) and full-resolution Cloud and Moisture Imagery (CMI) products (Level 2), freely available for anyone to use.
While the GOES-16 ABI L1b and CMI data have reached provisional validation, keep in mind that the GOES-16 satellite has not yet been declared operational, so the available data are still considered preliminary and are undergoing testing.
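As an illustration of how such an archive can be used programmatically, the sketch below builds the S3 key prefix for one hour of full-disk ABI L1b radiances. The bucket name `noaa-goes16` and the `<product>/<year>/<day-of-year>/<hour>/` key layout are assumptions about how the archive is organized, not details stated in this article.

```python
from datetime import datetime, timezone

# Assumed bucket and key layout for the GOES-16 archive on S3:
#   s3://noaa-goes16/<product>/<year>/<day-of-year>/<hour>/...
BUCKET = "noaa-goes16"

def goes16_prefix(product: str, when: datetime) -> str:
    """Build the assumed hourly S3 key prefix for a GOES-16 product."""
    # %j is the zero-padded day of the year (001-366)
    return "{}/{:%Y}/{:%j}/{:%H}/".format(product, when, when, when)

# Full-disk ABI Level 1b radiances for 2017-07-04 12:00 UTC
prefix = goes16_prefix("ABI-L1b-RadF", datetime(2017, 7, 4, 12, tzinfo=timezone.utc))
print("s3://{}/{}".format(BUCKET, prefix))  # s3://noaa-goes16/ABI-L1b-RadF/2017/185/12/
```

Given such a prefix, any S3 client (for example, the AWS CLI or boto3) can list and fetch that hour’s files directly from the Cloud, without holding a local copy of the archive.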
This data availability is the result of NOAA’s Big Data Project (BDP), which explores the potential benefits of storing copies of key observations and model outputs in the Cloud so that users can compute directly on the data without requiring further distribution.
Amazon is at the forefront of hosting open datasets in the Cloud, including Landsat, SpaceNet, Sentinel-2 and many others, amassing a multitude of satellite data in the process. In a recent interview with Geospatial World, Jed Sundwall, Open Data Lead at Amazon Web Services, also discussed the company’s involvement with Radiant.Earth, the open geospatial data platform created to have a positive impact on the world’s greatest social, economic and environmental challenges.
Data availability through such initiatives gives people access to tremendous amounts of computing power without requiring them to buy their own computers. Today, if you want to experiment with a machine that has 16 GPUs, or 2 TB of RAM, you can do that on AWS without having to buy and configure an expensive piece of hardware. Further, you can experiment with massive volumes of imagery using a range of computing tools, without the risk of heavy investment in physical capital.
Value of open data
Opening up data does not magically solve problems or create start-ups. Organizations sometimes make the mistake of launching an open data portal and thinking their job is done. The first thing you need to consider when opening up data for innovation is whether there is real demand for the insights the data can provide. You have to think beyond simple “wouldn’t it be cool if…?” questions, and think about actual problems that can be solved with the data, and who will benefit from cost savings as those problems are solved.
Opening up data at this scale can only be done in the Cloud. Earlier, if you wanted to use a 100 TB dataset, you would need 100 TB of storage space of your own, then figure out how to transfer a copy of the data, and then you would also need the computing power to run analysis on it. This severely limited the number of potential users and the innovation that could take place. When data is staged for analysis in the Cloud, anyone can analyze it without needing to download or store their own copy. Putting data in the Cloud grants everyone equal access and truly democratizes the data.