Lunar Data Analysis

Lunar Iron

The Moon has been studied for 50 years by NASA, ESA, and JAXA, producing archives that exceed hundreds of terabytes of instrument measurements, imagery, and survey data. Onboard sensors, microcontrollers, cameras, and other devices will feed millions of measurements and images into Prospector’s AI engine, which will use its image recognition, object detection, and deep learning capabilities to analyse the data and make real-time decisions.

Operating as an autonomous mining rover with robotic mechanics requires a deep learning framework that runs neural network inference directly on Prospector’s microcontrollers and other on-board devices. These microcontrollers are small, low-power computation devices embedded in Prospector’s hardware using adaptive manufacturing techniques.
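As a rough illustration of that workflow, the sketch below converts a small image classifier into a compact, quantized model of the kind that can be deployed to a microcontroller. It assumes a TensorFlow Lite-style toolchain; the network architecture, input size, class count, and file name are illustrative assumptions, not Prospector's actual design.

```python
# Minimal sketch: convert a small Keras classifier into a quantized
# TensorFlow Lite flatbuffer suitable for microcontroller deployment.
# Architecture, input size, and file names are illustrative only.
import tensorflow as tf

# A deliberately small network, sized for a low-power microcontroller.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(96, 96, 1)),        # grayscale camera frame
    tf.keras.layers.Conv2D(8, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(4, activation="softmax"),   # e.g. 4 surface classes
])

# Post-training quantization shrinks the model and speeds up on-device inference.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()

with open("prospector_vision.tflite", "wb") as f:
    f.write(tflite_model)
```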

By running machine learning inference on these microcontrollers, Prospector’s AI engine can control system hardware and analyse data from chemical sensors, spectrometers, cameras, and other image inputs, as well as command the orbital and maneuvering thrusters and other functions required for fully autonomous operation.
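The following sketch shows what that inference loop might look like for one sensor stream. The TensorFlow Lite interpreter in Python stands in for the on-device runtime, and the model file, input shape, and mineral classes are hypothetical placeholders.

```python
# Sketch: classify a window of spectrometer readings with a TensorFlow Lite
# interpreter (standing in here for the on-device microcontroller runtime).
# The model file and the 256-channel input shape are hypothetical.
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="prospector_spectra.tflite")
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()[0]
output_details = interpreter.get_output_details()[0]

def classify_spectrum(window: np.ndarray) -> int:
    """Return the index of the most likely mineral class for one sensor window."""
    window = window.astype(input_details["dtype"]).reshape(input_details["shape"])
    interpreter.set_tensor(input_details["index"], window)
    interpreter.invoke()
    scores = interpreter.get_tensor(output_details["index"])
    return int(np.argmax(scores))

# Example call with a synthetic 1 x 256-channel spectrum.
print(classify_spectrum(np.random.rand(1, 256)))
```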

Pretrained motion recognition, image classification, object detection, and sensor fusion models, built on NASA mission data from the LRO and Clementine missions together with other lunar archives of survey maps, topographical scans, and instrument measurements, will be combined with real-time data from onboard spectrometers and detection instruments to recognize thousands of different types of objects and minerals. The system will be able to perform scientific investigations of the Moon using publicly available data from the following missions (see the adaptation sketch after this list):
- Lunar Crater Observation and Sensing Satellite (LCROSS),
- Moon Mineralogy Mapper (M3),
- Lunar Reconnaissance Orbiter (LRO),
- Gravity Recovery and Interior Laboratory (GRAIL),
- Acceleration, Reconnection, Turbulence, and Electrodynamics of the Moon’s Interaction with the Sun (ARTEMIS),
- Lunar Atmosphere and Dust Environment Explorer (LADEE),
- Lunar Prospector (LP),
- Deep Impact Lunar Flyby,
- Non-U.S. missions: Kaguya, Chang’e 1, Chang’e 2, Chandrayaan-1, Chang’e 3.
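
As a hedged sketch of how such pretrained models could be adapted to lunar imagery, the example below fine-tunes a generic pretrained image backbone on labeled surface tiles. The backbone choice, tile size, class list, and dataset directory are assumptions for illustration; in practice the labels would be derived from the mission archives listed above.

```python
# Sketch: adapt a pretrained image backbone to lunar surface classes.
# The class list and dataset directory are illustrative assumptions.
import tensorflow as tf

NUM_CLASSES = 5  # e.g. regolith, basalt, anorthosite, crater rim, shadowed ice

backbone = tf.keras.applications.MobileNetV2(
    input_shape=(96, 96, 3), include_top=False, weights="imagenet")
backbone.trainable = False  # reuse generic features, train only the new head

model = tf.keras.Sequential([
    backbone,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# train_ds would be built from labeled tiles of the archived lunar imagery, e.g.:
# train_ds = tf.keras.utils.image_dataset_from_directory("lunar_tiles/", image_size=(96, 96))
# model.fit(train_ds, epochs=10)
```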

Prospector will be capable of processing 60 million images a day, analyzing one image per millisecond for inference and four images per millisecond for learning. The resulting models will be converted to run inference on Prospector’s hardware devices, specifically for locating sources of water, controlling the orbital and maneuvering thrusters, and monitoring excavation and electrolysis operations.
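A simple way to check a converted model against that per-image budget is to time repeated inference calls, as in the sketch below. The model file is the hypothetical one from the earlier conversion example; note that one image per millisecond corresponds to roughly 86.4 million images per day, so a 60-million-image daily workload fits within that rate.

```python
# Sketch: measure average per-image inference latency against the ~1 ms/image
# budget quoted above. Model file and frame contents are illustrative.
import time
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="prospector_vision.tflite")
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
frame = np.zeros(inp["shape"], dtype=inp["dtype"])  # stand-in camera frame

N = 1000
start = time.perf_counter()
for _ in range(N):
    interpreter.set_tensor(inp["index"], frame)
    interpreter.invoke()
elapsed_ms = (time.perf_counter() - start) * 1000 / N
print(f"average inference latency: {elapsed_ms:.3f} ms/image")
```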

The pretrained object detection model will pinpoint the shape of a lunar object with strict localization accuracy and semantic labeling, allowing sensor hubs to run extensive sensor-fusion algorithms that provide real-time data and analysis. The sensor hubs include data and fusion algorithms for onboard instruments such as gyroscopes, magnetometers, accelerometers, spectrometers, X-ray, IR, and UV cameras, and the actuators and motor-controlled devices.
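As a minimal sensor-fusion sketch, the example below blends gyroscope and accelerometer readings into a single pitch estimate with a complementary filter, one of the simplest fusion techniques such a hub could run. The sample rate, blending weight, and synthetic inputs are assumptions, not Prospector's actual fusion pipeline.

```python
# Minimal sensor-fusion sketch: a complementary filter blending gyroscope
# rate (smooth but drifting) with accelerometer-derived pitch (noisy but
# drift-free) into one orientation estimate. Inputs are synthetic.
import numpy as np

def fuse_pitch(gyro_rate_dps: np.ndarray, accel_pitch_deg: np.ndarray,
               dt: float = 0.01, alpha: float = 0.98) -> np.ndarray:
    """Return a fused pitch estimate (degrees) for each time step."""
    fused = np.zeros_like(accel_pitch_deg)
    estimate = accel_pitch_deg[0]
    for i in range(len(fused)):
        # trust the gyro for short-term changes, the accelerometer long-term
        estimate = alpha * (estimate + gyro_rate_dps[i] * dt) \
                   + (1 - alpha) * accel_pitch_deg[i]
        fused[i] = estimate
    return fused

# Synthetic 1-second example sampled at 100 Hz.
t = np.arange(0, 1, 0.01)
gyro = np.full_like(t, 5.0)                          # constant 5 deg/s rotation
accel = 5.0 * t + np.random.normal(0, 0.5, t.size)   # noisy absolute pitch
print(fuse_pitch(gyro, accel)[-1])
```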
