Orbital Sky

A huge number of satellites in Earth's orbit support our day-to-day life on the ground. Your challenge is to develop a method to improve public knowledge of these satellites, with an eye towards driving user engagement, enthusiasm, and exploration.

Satellite of Things

Summary

Most people don't know what a satellite does. Our challenge is to connect the daily life of any inhabitant of the planet with the solutions that satellites provide, directly or indirectly, in their lives.

How We Addressed This Challenge

Most people do not know what a satellite does or how it improves their quality of life; many do not even know that satellites exist or where they are in space. We therefore proposed a fun way to connect the objects that surround us every day with the activity of a satellite. We believe it is important to get many more people interested and, through the knowledge and appreciation this field deserves, to encourage potential students to pursue this scientific sector.

This project focuses on building an application trained to recognize objects that people encounter in their daily lives. From an image, the software identifies the object and associates it with related satellite data, showing the role that satellites play in everyday life. In this way, we hope that an audience with no prior interest in astronomy or satellites will, through a didactic experience, get closer to the machines that are in orbit right now.


How We Developed This Project

The application uses machine learning: the user opens the app, points the camera at any nearby object, and the app recognizes it. Through augmented reality, the app then displays a visualization of a satellite and tells the user which types of satellites collect data related to that object. For example, if the user points the camera at a glass of water, the app shows which satellites collect water-related data, such as the Copernicus Marine Monitoring satellites, which gather data on marine areas, species, polluted waters, and more.
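The object-to-satellite association described above can be sketched as a simple lookup from the label returned by the recognizer to a list of relevant missions. This is a minimal sketch: the label strings and the mapping table are illustrative examples of our own, not an exhaustive catalogue.

```python
# Minimal sketch of the object-to-satellite lookup, assuming the
# recognizer returns a plain label string such as "glass of water".
# The table entries are illustrative, not exhaustive.
OBJECT_TO_SATELLITES = {
    "glass of water": [
        "Sentinel-3 (ocean colour, water quality)",
        "Copernicus Marine Monitoring service satellites",
    ],
    "plant": [
        "Sentinel-2 (vegetation monitoring)",
        "Landsat 8 (land cover, vegetation)",
    ],
    "cloud": [
        "Sentinel-5P (atmospheric composition, cloud parameters)",
    ],
}


def satellites_for(label: str) -> list:
    """Return the satellites associated with a recognized object label."""
    return OBJECT_TO_SATELLITES.get(label.strip().lower(), [])
```

In the real app this table would be backed by a larger database, but the control flow (recognize, normalize the label, look up missions) stays the same.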




How We Used Space Agency Data in This Project

Below we describe the types of satellites available, their functions, and how we search for and obtain their data. The Landsat series of satellites from NASA / US Geological Survey is similar to Sentinel-2 (they capture visible and infrared wavelengths) and can also capture thermal infrared (Landsat 8). This platform gives access to images acquired by Landsat 5, 7, and 8. Each satellite carries different sensors: the Operational Land Imager (OLI) and Thermal Infrared Sensor (TIRS) on board Landsat 8, the Enhanced Thematic Mapper Plus (ETM+) on board Landsat 7, and the Thematic Mapper (TM) on board Landsat 5. Landsat 8 provides multispectral bands at 30 m spatial resolution; the exceptions are the thermal bands at 100 m and the panchromatic band at 15 m. The revisit time for each of Landsat 7 and Landsat 8 is 16 days, with the two spacecraft offset so that one or the other revisits the same place every 8 days. In this work, archived Landsat 8 data is used. Data from this satellite series is used for vegetation monitoring, land use, land cover maps, change monitoring, etc.
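For quick reference inside the app, the Landsat 8 resolutions and the paired revisit schedule described above can be encoded in a small lookup. The structure and key names are our own sketch, not an official data product.

```python
# Summary of the Landsat 8 spatial resolutions described above,
# as a lookup the app can query (key names are our own).
LANDSAT8_RESOLUTION_M = {
    "multispectral (OLI)": 30,
    "panchromatic (OLI)": 15,
    "thermal (TIRS)": 100,
}


def combined_revisit_days(single_revisit: int = 16, n_satellites: int = 2) -> int:
    """Effective revisit time when identical satellites are evenly offset.

    Landsat 7 and Landsat 8 each revisit every 16 days; offset by half a
    cycle, the pair covers the same site every 8 days.
    """
    return single_revisit // n_satellites
```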


ESA is developing five SENTINEL mission families specifically for Copernicus. The SENTINELS will provide a unique set of observations for Copernicus.


Sentinel-1 imagery is provided by two polar-orbiting satellites carrying C-band synthetic aperture radar, operating day and night and able to acquire images regardless of weather. The main applications are monitoring sea ice, oil spills, sea winds, waves and currents, land use change, and soil deformation, among others, and responding to emergencies such as floods and earthquakes. The identical satellites orbit the Earth 180° apart at an altitude of almost 700 km, which gives a global revisit time of 6 to 12 days depending on the area (check the observation scenario). The Sentinel-1 radar can operate in four modes, and the spatial resolution depends on the mode: approx. 5 m × 20 m for IW mode and approx. 20 m × 40 m for EW mode.


Sentinel-2 carries a multispectral imaging instrument (MSI) that offers high-resolution optical imagery for land monitoring, emergency response, and security services, commonly used for land cover and change detection maps, vegetation monitoring, and burned-area mapping. The MSI provides 13 spectral bands: 4 visible (10 m spatial resolution), 6 near-infrared (20 m), and 3 shortwave-infrared (60 m). Sentinel-2A and Sentinel-2B together have a revisit time of 5 days. Acquisitions are available at the L1C and L2A processing levels, with L2A being atmospherically corrected.
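Since vegetation monitoring is one of the main uses named above, a typical derived product is the NDVI, computed from Sentinel-2's red band (B04, 10 m) and near-infrared band (B08, 10 m). A minimal sketch with NumPy, assuming the bands arrive as reflectance arrays:

```python
import numpy as np


def ndvi(red: np.ndarray, nir: np.ndarray) -> np.ndarray:
    """NDVI = (NIR - Red) / (NIR + Red), values in [-1, 1].

    Healthy vegetation reflects strongly in NIR and absorbs red,
    so it shows up as high NDVI values.
    """
    red = red.astype(float)
    nir = nir.astype(float)
    # Small epsilon avoids division by zero over dark/no-data pixels.
    return (nir - red) / (nir + red + 1e-10)
```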

Sentinel-3 is a moderate-size low-Earth-orbit satellite compatible with small launchers, including VEGA and ROCKOT. The main objective of the mission is to measure sea surface topography, land and sea surface temperature, and land and ocean surface color with high precision and reliability to support ocean forecasting systems, environmental monitoring, and climate monitoring. The Ocean and Land Colour Instrument (OLCI) provides 21 bands ranging from visible to near-infrared light (400 nm < λ < 1020 nm). The Sea and Land Surface Temperature Radiometer (SLSTR) provides 11 bands ranging from visible/near-infrared to thermal infrared (554.27 nm < λ < 10854 nm). Sentinel-3 has a revisit time of less than 2 days and provides images with a spatial resolution of 300 m for OLCI and 500-1000 m for SLSTR.


Sentinel-5P provides atmospheric measurements related to air quality, climate forcing, ozone, and ultraviolet radiation. Its data is used to monitor concentrations of carbon monoxide (CO), nitrogen dioxide (NO2), and ozone (O3) in the air, as well as the UV aerosol index (AER_AI) and various geophysical cloud parameters (CLOUD). EO Browser offers the Level 2 geophysical products. The TROPOspheric Monitoring Instrument (TROPOMI) on board the satellite operates from shortwave infrared to ultraviolet across 7 spectral bands: UV-1 (270-300 nm), UV-2 (300-370 nm), VIS (370-500 nm), NIR-1 (685-710 nm), NIR-2 (755-773 nm), SWIR-1 (1590-1675 nm), and SWIR-3 (2305-2385 nm). Its spatial resolution is less than 8 km for wavelengths greater than 300 nm and less than 50 km for wavelengths less than 300 nm.

The applications are divided into six main categories: land management services, marine environment services, atmosphere-related services, emergency response services, security services, and climate change services.


To use this data you can work with EO Browser or Sentinel Hub Playground; we used EO Browser. First we select the place: Hamburg, Germany. Then we select the satellite that covers the area and contains the data of interest, according to the phenomenon to be studied, in this case Sentinel-3 OLCI.



You can use the default options or customize the display of the different satellite bands, multiplying each by a factor to adjust its brightness. Each band shows its reflective characteristics in the assigned red, green, or blue channel (the first band appears red, the second green, and the third blue).
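The brightness multiplication described above is just a gain applied to each band before it is mapped to a display channel. A minimal sketch, assuming the three chosen bands arrive as reflectance arrays in [0, 1] (the gain of 2.5 is the factor often used in EO Browser true-color visualizations):

```python
import numpy as np


def to_display(band_r: np.ndarray, band_g: np.ndarray, band_b: np.ndarray,
               gain: float = 2.5) -> np.ndarray:
    """Map three bands to an RGB display image.

    Each band is multiplied by `gain` to brighten it, then clipped
    to [0, 1] so over-brightened pixels saturate instead of wrapping.
    """
    rgb = np.stack([band_r, band_g, band_b], axis=-1).astype(float)
    return np.clip(gain * rgb, 0.0, 1.0)
```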

A time-lapse of the area can then be created in EO Browser.

We also use Sentinel Hub Playground and ArcGIS: with the Sentinel Hub Playground tool we link the layers corresponding to the satellites via WMS, and through ArcGIS we access higher-resolution terrain data, since this software is a powerful Geographic Information System (GIS).

We will follow the example with the city of Hamburg, Germany:

We choose the corresponding satellite and copy its WMS code to link it:
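Linking a layer over WMS boils down to issuing a standard OGC WMS GetMap request against the copied endpoint. A minimal sketch of building such a request URL; the endpoint placeholder and layer name used in the test are hypothetical, to be replaced with the values copied from Sentinel Hub:

```python
from urllib.parse import urlencode


def wms_getmap_url(base_url: str, layer: str, bbox,
                   width: int = 512, height: int = 512) -> str:
    """Build an OGC WMS 1.3.0 GetMap request URL.

    `base_url` is the WMS endpoint copied from Sentinel Hub
    (hypothetical here); `bbox` is (min_lat, min_lon, max_lat, max_lon)
    in EPSG:4326, which uses lat/lon axis order in WMS 1.3.0.
    """
    params = {
        "SERVICE": "WMS",
        "REQUEST": "GetMap",
        "VERSION": "1.3.0",
        "LAYERS": layer,
        "CRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    return base_url + "?" + urlencode(params)
```

ArcGIS accepts the same endpoint directly when adding a WMS server connection, so the URL only needs to be built by hand when the app fetches tiles itself.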

We search for the city in ArcGIS:

We apply the agriculture layer:

Now the image that the user captures will be compared with this reference image.
Project Demo

Prototype


Screen views


The following describes how the app works on each screen.

On this screen, the user signs in with a previously registered email and password in order to use the app.


Registration requires an email and password.

On this screen, once the user has signed in, they see the app's home screen.

On this screen we can see the user interacting with the environment. In this example, the user points the camera at a glass of water, and the app recognizes the object using machine learning.

On this screen the user can see the satellite's location.

On this screen the user can see that there are different types of satellites, each with different functions.


Scalability: in the future, the app will also support voice recognition, so the user will be able to say the name of a satellite and the app will display information about it. This will be one more function in the app.



Conclusion:


The idea is to reach a large part of the general public who have no knowledge of space or of these technologies, including those without technical backgrounds, and thus bring more people closer to space topics.