Can You Hear Me Now?

Human missions to Mars are moving from the realm of science fiction to science fact. Your challenge is to design an interactive application to explore the challenge of communicating with astronauts on Mars from Earth.

Speaking Hands

Summary

Our idea is simply to use speaking hands, the sign-to-text gloves used for communicating with the deaf community. Since communication between astronauts is difficult, especially during current exploration activities, such a device has become a real need. The idea lets astronauts communicate by reading hand movements (sign language) and converting them into words on a screen that the other person sees. This screen should be placed somewhere clearly visible, such as a phone screen or the helmet, while the gloves are built into the suit. The project consists of three phases: reading the hand signs, processing them, and translating them into words on the Arduino-driven screen.

How We Addressed This Challenge

Given the well-known communication problems in space, both in method and in cost, we believe the best solution is this device: it provides an easy way to communicate and it is inexpensive. We are confident it will save both time and effort by reducing existing problems.

We developed this idea by adapting an assistive template from the deaf community and making it available to astronauts. Speaking hands are well known in the deaf world, and we think they will also become known in the space world. Also, instead of the translated text being received on small handheld devices, it will be displayed at a larger scale on the astronauts' own equipment.

This project is important because of the capabilities it provides when used correctly, and because of the problems it addresses.

Ultimately, we aim to provide the simplest possible way for astronauts to communicate with each other, since this will also facilitate the research and experimental work carried out outside the spacecraft and help report important findings from another world. Moreover, we aim to reduce the high cost of the wireless devices and radio links astronauts currently need in order to talk to one another.

It is an easy, simple, and inexpensive solution, and the prototype is relatively simple compared to existing NASA communication devices.

Prototype:

1) Gloves containing a set of sensors, an Arduino, and the supporting circuits

2) A screen to display the translation of the signals received from the gloves.

We used the Arduino IDE to upload the code to the controller, which takes the readings from the input sensors, compares them against reference values, and converts them to constant strings. Tests were performed by observing the readings on the serial monitor of the Arduino IDE.
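
As a concrete illustration of this read-compare-translate loop, a minimal Arduino sketch might look like the following; the pins, threshold, and phrases are assumed values for the example, not the exact ones from our prototype.

    // Illustrative Arduino sketch: read the flex sensors, compare the readings
    // against a threshold, and translate the resulting pattern into a constant
    // string that is printed to the serial monitor for testing.
    // Pin numbers, the threshold, and the phrases are assumptions for illustration.

    const int FLEX_PINS[5] = {A0, A1, A2, A3, A4};  // one flex sensor per finger
    const int BENT_THRESHOLD = 600;                 // ADC value treated as "bent"

    void setup() {
      Serial.begin(9600);
    }

    void loop() {
      // Build a 5-bit pattern: bit i is set when finger i is bent.
      int pattern = 0;
      for (int i = 0; i < 5; i++) {
        if (analogRead(FLEX_PINS[i]) > BENT_THRESHOLD) {
          pattern |= (1 << i);
        }
      }

      // Map recognised patterns to constant strings (example gestures only).
      const char *phrase = "";
      switch (pattern) {
        case 0b00000: phrase = "HELLO"; break;  // open hand
        case 0b11111: phrase = "HELP";  break;  // closed fist
        case 0b00001: phrase = "YES";   break;  // only the thumb bent
        case 0b00011: phrase = "NO";    break;  // thumb and index bent
        default:      break;                    // unknown gesture, print nothing
      }

      if (phrase[0] != '\0') {
        Serial.println(phrase);  // observe the result on the serial monitor
      }
      delay(300);
    }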

How We Developed This Project

After reading all the available challenges, we found that the "Connect: Can You Hear Me Now?" challenge suited us for several reasons, including our passion for space science as well as programming and design, in addition to our desire to innovate and take part in solving the challenges that astronauts face.

Our idea is based on creativity as well as a scientific basis. After analyzing the challenge and thinking critically, we noted that astronauts cannot rely on hearing in space, since sound does not travel through the vacuum outside the suit. From there, we approached the problem the way assistive devices for deaf people do. The prototype consists of gloves connected to an Arduino, a circuit, and sensors, plus a screen to display the translation of the signals received from the gloves. We considered where to place these two parts and found that the most appropriate solution is to build the gloves into the suit, so that they become an essential part of it, and to mount the screen inside the helmet.

Materials:

Input hardware materials:

1) Flex sensor: measures how much a finger is bent through changes in its resistance

2) MPU-6050: an inertial measurement unit used to measure the hand's tilt and motion across 6 axes

3) Special gloves: the sensors and the controller are mounted on these gloves (see the sensor-reading sketch after this list).
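
Under assumed wiring (flex sensor as a voltage divider on A0, MPU-6050 on the I2C bus), a minimal sketch for reading these two inputs could look like this; the register addresses follow the MPU-6050's standard register map, while the pins and divider wiring are assumptions.

    // Illustrative reading of the two input sensors on an Arduino Uno.
    // Flex sensor: wired as a voltage divider into analog pin A0 (assumed wiring).
    // MPU-6050: raw accelerometer read over I2C using its standard register map.

    #include <Wire.h>

    const int FLEX_PIN = A0;
    const int MPU_ADDR = 0x68;  // default MPU-6050 I2C address

    // Read one big-endian 16-bit value from the already-requested I2C buffer.
    int16_t read16() {
      uint8_t hi = Wire.read();
      uint8_t lo = Wire.read();
      return (int16_t)((hi << 8) | lo);
    }

    void setup() {
      Serial.begin(9600);
      Wire.begin();
      // Wake the MPU-6050 (it powers up in sleep mode) by clearing PWR_MGMT_1.
      Wire.beginTransmission(MPU_ADDR);
      Wire.write(0x6B);
      Wire.write(0);
      Wire.endTransmission(true);
    }

    void loop() {
      // Finger bend: the ADC value shifts as the flex sensor's resistance changes.
      int bend = analogRead(FLEX_PIN);

      // Hand orientation: read the six accelerometer bytes starting at 0x3B.
      Wire.beginTransmission(MPU_ADDR);
      Wire.write(0x3B);
      Wire.endTransmission(false);
      Wire.requestFrom(MPU_ADDR, 6, true);
      int16_t ax = read16();
      int16_t ay = read16();
      int16_t az = read16();

      Serial.print("bend="); Serial.print(bend);
      Serial.print(" ax=");  Serial.print(ax);
      Serial.print(" ay=");  Serial.print(ay);
      Serial.print(" az=");  Serial.println(az);
      delay(200);
    }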


Processing hardware materials:

1) LCD screen: used to display the words translated from the hand movements (see the display sketch after this list)

2) Arduino Uno: the main controller of the project; its code is open source, and it also drives the screen.
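
A minimal sketch for this output stage, assuming a standard 16x2 character LCD driven with the LiquidCrystal library and an arbitrary pin wiring, could look like this:

    // Illustrative output stage: show a translated phrase on a 16x2 character LCD
    // using the standard LiquidCrystal library. The pin wiring is an assumption.

    #include <LiquidCrystal.h>

    // LiquidCrystal(rs, enable, d4, d5, d6, d7)
    LiquidCrystal lcd(12, 11, 5, 4, 3, 2);

    void setup() {
      lcd.begin(16, 2);          // 16 columns, 2 rows
    }

    // Display whichever constant string the gesture matcher selected.
    void showPhrase(const char *phrase) {
      lcd.clear();
      lcd.setCursor(0, 0);
      lcd.print("Glove says:");
      lcd.setCursor(0, 1);
      lcd.print(phrase);
    }

    void loop() {
      showPhrase("HELLO");       // placeholder; real code passes the matched phrase
      delay(1000);
    }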


Software materials:

1) Arduino IDE, used to compile and upload the code

2) Processing Software for simulation and testing

3) Excel, for data analysis (see the logging sketch after this list)
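
For the Excel analysis, one simple approach (with assumed pins and sample rate) is to log the sensor readings in CSV form over the serial port, for example:

    // Illustrative data logging: print the flex readings as comma-separated values
    // so a capture of the serial monitor can be pasted into Excel for analysis.
    // Pins and sample rate are assumptions.

    const int FLEX_PINS[5] = {A0, A1, A2, A3, A4};

    void setup() {
      Serial.begin(9600);
      Serial.println("t_ms,thumb,index,middle,ring,pinky");  // CSV header row
    }

    void loop() {
      Serial.print(millis());
      for (int i = 0; i < 5; i++) {
        Serial.print(',');
        Serial.print(analogRead(FLEX_PINS[i]));
      }
      Serial.println();
      delay(100);  // roughly 10 samples per second
    }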


We encountered problems with research sources: some are undocumented and very scarce, especially in this field. We solved this by contacting specialists in space science as well as in programming. In the end, we were able to find a suitable, effective, and inexpensive solution to this challenge.

How We Used Space Agency Data in This Project

- We used the NASA website to learn which needs astronauts struggle with most during their missions; we found that communication is a very important one, so we decided to offer a solution to this problem.
- We learned the durations of their missions and the materials available on spacecraft, and we arrived at a solution suited to those materials and that environment.
- We learned how information-carrying signals can be sent from the ground to space without using Wi-Fi.

Tags
#Speaking_Hands #plus_Ultra #NASA_Space_app #Deaf #ESA #JAXA #CSA #CNES