The European Space Agency ran the Mars Express Power Challenge, a machine learning competition, in which I placed 13th.
The challenge was to predict the thermal power (which supplies power to the platform units and the thermal subsystem) required by the Mars Express orbiter for one Martian year (approximately two Earth years).
My name was featured on the European Space Agency's Open Data Day, where the top 14 contestants were listed. I wish I had spent more time on the competition; I was busy with client engagements, so I only worked on it during the last month.
It was fun to play around with space-related data.
Training data was provided for three Martian years (2008 to 2014), and test data for the year 2014.
DMOP (Detailed Mission Operations Plan) data contained the commands sent to the subsystems aboard the Mars Express orbiter, timestamped with millisecond resolution.
Context data contained the solar aspect angles: the (x, y, z) angles of the satellite with respect to the Sun.
FTL contained the spacecraft pointing events.
EVTF contained the event commands sent to the satellite from ground stations, as well as eclipse events.
Another file contained various distance-related variables: the Earth–Mars distance, the Sun–Mars distance, the Sun–Mars–Earth angle, and eclipse durations.
The data contained many null values. I tried several imputation techniques, but forward fill performed best.
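As a sketch of the imputation step (the column name here is hypothetical, standing in for one of the power telemetry lines):

```python
import numpy as np
import pandas as pd

# Hypothetical power telemetry column with gaps
df = pd.DataFrame({
    "NPWD2372": [1.2, np.nan, np.nan, 1.5, np.nan],
})

# Forward fill: propagate the last observed value into each gap
df_filled = df.ffill()
print(df_filled["NPWD2372"].tolist())  # [1.2, 1.2, 1.2, 1.5, 1.5]
```

Forward fill suits telemetry like this because a sensor reading tends to stay near its last observed value until the next measurement arrives.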
From the command text I extracted multiple features such as start, value, command, dyns event, occurrence number, and trigger, and label-encoded them.
From the timestamps I extracted year, month, day, and hour.
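A minimal sketch of both steps, timestamp features plus label encoding (the command codes and column names below are made up for illustration):

```python
import pandas as pd
from sklearn.preprocessing import LabelEncoder

# Hypothetical DMOP-style rows: a timestamp plus a raw command string
df = pd.DataFrame({
    "ut_ms": pd.to_datetime(["2010-03-01 04:15:00", "2010-03-01 05:20:00"]),
    "command": ["ATTTF030A", "SXXXF010B"],  # hypothetical command codes
})

# Calendar features from the timestamp
df["year"] = df["ut_ms"].dt.year
df["month"] = df["ut_ms"].dt.month
df["day"] = df["ut_ms"].dt.day
df["hour"] = df["ut_ms"].dt.hour

# Label-encode the categorical command text into integers
le = LabelEncoder()
df["command_enc"] = le.fit_transform(df["command"])
print(df[["year", "hour", "command_enc"]].values.tolist())
```

Label encoding maps each distinct string to an integer, which is enough for tree-based models even though it imposes an arbitrary ordering.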
I also extracted the duration of each pointing period.
Other features included the length of the description text, the grouped count of eclipse events, and the distance of the satellite from Mars.
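A sketch of those two features, assuming EVTF-style event rows (the timestamps and descriptions are invented):

```python
import pandas as pd

# Hypothetical eclipse-related events with timestamps
events = pd.DataFrame({
    "ut_ms": pd.to_datetime([
        "2010-03-01 04:10", "2010-03-01 04:40", "2010-03-01 06:05",
    ]),
    "description": ["ECLIPSE_START", "ECLIPSE_END", "ECLIPSE_START"],
})

# Length of the raw description text as a simple feature
events["desc_len"] = events["description"].str.len()

# Count events per hour so they align with an hourly feature grid
hourly = events.set_index("ut_ms").resample("1h").size()
print(hourly.tolist())  # [2, 0, 1]
```

Aggregating sparse events onto a regular time grid is what lets them be joined with the other hourly features before modelling.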
I tried several models: linear models like linear regression, Lasso, and ElasticNet; tree models like the decision tree regressor; ensemble models like the random forest, gradient boosting, extra trees, and AdaBoost regressors; plus the k-nearest neighbors regressor and SVR.
XGBoost outperformed all of the other models, and I tuned its hyperparameters to improve performance further.
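The tuning step followed the usual grid-search pattern. The sketch below uses scikit-learn's GradientBoostingRegressor on toy data so it is self-contained; the same GridSearchCV call works unchanged with XGBRegressor, and the grid values here are illustrative, not the ones I actually used:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import GridSearchCV

# Toy regression data standing in for the engineered feature matrix
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = 2.0 * X[:, 0] + rng.normal(scale=0.1, size=200)

# Hypothetical hyperparameter grid
param_grid = {
    "n_estimators": [50, 100],
    "max_depth": [2, 3],
    "learning_rate": [0.1],
}

# Cross-validated search for the best combination
search = GridSearchCV(
    GradientBoostingRegressor(random_state=0),
    param_grid,
    cv=3,
    scoring="neg_root_mean_squared_error",
)
search.fit(X, y)
print(search.best_params_)
```

Cross-validated search like this is slower than tuning by hand but guards against picking hyperparameters that only look good on one train/validation split.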
In the last few days I fell from 7th to 14th on the public leaderboard, and ended up 13th on the private leaderboard.