I am Ginés Salar, an Aerospace Engineer from University Carlos III (Madrid, Spain). As this year's GSoC edition comes to an end, allow me to give a comprehensive explanation of my contributions to aerospaceresearch.net. My university's department of aerospace research has an interest in developing and testing preliminary trajectory optimizers. This has led, in recent years, to the development of MOLTO (Multi-Objective Low-Thrust Optimizer). The tool provides a two-step optimization process for one of three scenarios: IT (Interplanetary Transfers), OR (Orbit Raising) and 3BP (Three Body Problem). In this post I will not go into the details of these engines, but I strongly advise the interested reader to read [GSoC 19' | UC3M] MOLTO – Mission Designer.
My work aims to improve MOLTO-3BP, specifically its first step. The classical approach to this problem searches for a set of ballistic trajectories patched together by instantaneous impulses. It is assumed that reducing the fuel consumption of these impulses improves the initial guess. After that, the engine moves to the second step and introduces the actual control optimization to adapt the orbit to a truly low-thrust mission. It is not well established whether this procedure delivers the best result or merely a local minimum. Providing an answer to this question is what motivated the work described here.
2. Work Breakdown
Parallel work by other students has aimed to provide a database of sampled non-Keplerian periodic orbits. Initially, these efforts sought to replicate real missions, or to improve on them using their objectives as a guide. The purpose of this database is to propagate invariant manifolds from the orbits in order to find ballistic transfers that involve libration point orbits. Finally, the aforementioned patching process is carried out by selecting a suitable Poincaré section and analyzing the trajectories' intersections with this surface.
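To make the patching idea concrete, here is a minimal sketch (my own illustration, not the MOLTO-3BP code) of how a trajectory can be propagated in the planar circular restricted three-body problem and its crossings of a Poincaré section detected with scipy's event mechanism. The mass parameter, the initial state, the perturbation direction and the section location are placeholder values chosen only for illustration.

```python
import numpy as np
from scipy.integrate import solve_ivp

MU = 3.0039e-6  # Sun-Earth mass parameter (approximate, illustrative only)

def cr3bp_planar(t, s):
    """Planar CR3BP equations of motion in the rotating, nondimensional frame."""
    x, y, vx, vy = s
    r1 = np.sqrt((x + MU)**2 + y**2)       # distance to the primary
    r2 = np.sqrt((x - 1 + MU)**2 + y**2)   # distance to the secondary
    ax = 2*vy + x - (1 - MU)*(x + MU)/r1**3 - MU*(x - 1 + MU)/r2**3
    ay = -2*vx + y - (1 - MU)*y/r1**3 - MU*y/r2**3
    return [vx, vy, ax, ay]

def poincare_section(t, s):
    """Event: crossing of the plane x = 1 - MU (through the secondary)."""
    return s[0] - (1 - MU)
poincare_section.direction = -1  # count crossings in the decreasing-x direction only

# Hypothetical state on a periodic orbit, nudged along an assumed unstable direction
s_orbit = np.array([0.988, 0.0, 0.0, 0.9])      # placeholder state
unstable_dir = np.array([1.0, 0.0, 0.0, 0.0])    # placeholder eigenvector
s0 = s_orbit + 1e-6 * unstable_dir

sol = solve_ivp(cr3bp_planar, (0.0, 10.0), s0, events=poincare_section,
                rtol=1e-12, atol=1e-12, dense_output=True)
print("Section crossings (states):", sol.y_events[0])
```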
Within this environment, my main objectives were:
Generalize the capabilities of these functions.
Structure the code into a single body.
Translate the existing code to Python.
Emulate libration point orbit transfers.
Provide a new metric for the trajectories' suitability with a shape-based approach.
The first point was to isolate all constants from the rest of the code and allow easy access to and control of the studied system. Furthermore, the most interesting libration points, due to the small Jacobi constant required to access their neighbourhood, are the collinear points L1 and L2. We decided that the program should extensively cover both points and the motions around them. This way, the basic building blocks previously used to compute Halo and Lyapunov orbits were extended to apply to both L1 and L2, and tested for the Sun + Earth & Moon system and the Earth + Moon system.
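As an illustration of what becomes easy once the system constants are isolated, the sketch below (not the project code) locates the collinear points L1 and L2 by solving the force-balance condition of the CR3BP along the x-axis with scipy; the mass parameters are approximate values used only as examples.

```python
from scipy.optimize import brentq

def collinear_balance(x, mu):
    """Force balance along the x-axis of the rotating frame (y = z = 0)."""
    r1 = abs(x + mu)        # distance to the primary
    r2 = abs(x - 1 + mu)    # distance to the secondary
    return x - (1 - mu)*(x + mu)/r1**3 - mu*(x - 1 + mu)/r2**3

def collinear_points(mu):
    """Return the x-coordinates of L1 and L2 for a given mass parameter mu."""
    x_secondary = 1 - mu
    l1 = brentq(collinear_balance, 0.5, x_secondary - 1e-6, args=(mu,))
    l2 = brentq(collinear_balance, x_secondary + 1e-6, 2.0, args=(mu,))
    return l1, l2

# Approximate mass parameters, used here only as examples
for name, mu in [("Earth-Moon", 1.215e-2), ("Sun-Earth/Moon", 3.04e-6)]:
    print(name, collinear_points(mu))
```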
Building on that, it is important to remember that the purpose of these orbits is to propagate the disturbed trajectories that emanate from them. This promotes the idea of a common access point that joins orbit creation, manifold propagation and post-processing. Along those lines, I homogenized the input/output requirements of both orbit families and standardized the procedure for any future orbit family. The idea also holds for any non-periodic trajectory that becomes relevant as a source of manifolds. Figuring out these steps was crucial, as they are particularly relevant for the eventual integration of this code into MOLTO-3BP.
Another relevant point is that the original code requires a Matlab licence. This drastically reduces the code's accessibility to external users. It can be avoided by converting the code into an open-source language with similar inner workings. The obvious candidate was Python, a high-level language with flexibility similar to Matlab's. This change also introduces the possibility of using the many freely available modules, such as numpy, scipy, spiceypy and matplotlib. Proceeding this way not only reduces programming time, but also execution times. Additionally, the program can implement features not currently present in Matlab, like the explicit Runge-Kutta method of order 8 included in scipy's suite, which allows more precise computations for the most sensitive problems.
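As a small, hedged example of that last point, this is how scipy's eighth-order explicit Runge-Kutta integrator (DOP853) is selected; the dynamics function and tolerances here are placeholders, in the real code the right-hand side is the CR3BP vector field.

```python
from scipy.integrate import solve_ivp

def dynamics(t, state):
    """Placeholder right-hand side; a simple harmonic oscillator for illustration."""
    x, v = state
    return [v, -x]

# DOP853 is scipy's explicit Runge-Kutta method of order 8, useful for the most
# sensitive propagations (e.g. manifolds passing close to the primaries).
sol = solve_ivp(dynamics, (0.0, 20.0), [1.0, 0.0],
                method="DOP853", rtol=1e-12, atol=1e-12)
print(sol.t[-1], sol.y[:, -1])
```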
Following the reshaping of the code, several testing ideas and possible future developments arose. One of the most relevant was the concept of emulating complex sequential orbit transfers, both homoclinic and heteroclinic. The code was provided with the tools to discriminate which manifolds were required by the process, plus the ability to iterate both the orbit generation and the manifold propagation processes.
Finally, the end objective was to reach a better understanding of the suitability conditions in order to provide better decision metrics for future optimizers handling this problem. This part is still under development. The initial idea remains: reduce the ballistic trajectories to their complex frequencies, compare them, and deduce a figure of merit, such as the delta-v required to jump between them. Preliminary frequency analyses have been started on top of the tools developed, especially for the simpler 2D case. There are already some promising results, but much testing is still required to ensure good performance.
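The frequency-analysis idea can be sketched roughly as follows (my own minimal illustration, not the final metric): sample a coordinate of the ballistic trajectory at uniform times and extract its dominant frequencies with an FFT; two trajectories whose dominant frequencies nearly match then become natural candidates for a cheap jump.

```python
import numpy as np

def dominant_frequencies(signal, dt, n_peaks=3):
    """Return the n_peaks strongest frequencies (cycles per time unit) of a
    uniformly sampled, real-valued trajectory coordinate."""
    signal = np.asarray(signal) - np.mean(signal)   # remove the constant offset
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=dt)
    order = np.argsort(spectrum)[::-1]              # strongest bins first
    return freqs[order[:n_peaks]]

# Toy example: a signal with frequencies 0.20 and 0.05 (illustrative only)
dt = 0.1
t = np.arange(0, 200, dt)
x = np.cos(2*np.pi*0.20*t) + 0.3*np.cos(2*np.pi*0.05*t)
print(dominant_frequencies(x, dt, n_peaks=2))
```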
3. Acknowledgments
To conclude, I would like to thank Manuel Sanjurjo for his constant and agile support during this enterprise. Without his vision this process would not have been nearly as smooth. I also thank David Morante, responsible for the creation of MOLTO, for assisting along the way. On a similar note, I take this opportunity to mention Andreas Hornig, a fine and efficient manager of the community; everything has been perfectly clear right from the start. Last but not least, I thank Google for running this program and giving these kinds of opportunities to students like me. It has been a great experience!
We are almost at the GSoC deadline, and I have been working on a project called MOLTO. I have not published any blog post until now because there has been a lot of work and I have been really busy. In this post I will describe the whole process behind this project, which has kept changing since it began.
It has been an amazing experience and I have learned a lot. I appreciate the time given by my mentor David Morante, who has been really involved throughout the whole program, and I would like to thank him for all the support. Thanks for giving me the opportunity to be part of this incredible program; all the knowledge I gained from it is invaluable to me.
Now it is time to talk about the project. It started with my application, where they were asking for a student who could create or improve their user interface as well as make some improvements to the algorithm code. But before going further, I will explain briefly what MOLTO is. At first, the application was for work on the MOLTO-IT project, which is one branch of a bigger project called MOLTO. MOLTO is a mission designer created by David Morante for his doctoral thesis. He divided the project into three branches, described below:
MOLTO-IT (Multi-Objective Low-Thrust Optimizer for Interplanetary Transfers): It is a fully automated Matlab tool for the preliminary design of low-thrust, multi-gravity assist trajectories.
MOLTO-OR (Multi-Objective Low-Thrust Optimizer for Orbit Raising): It is an application for the preliminary design of low-thrust transfer trajectories between Earth-centered orbits.
MOLTO-3BP (Multi-Objective Low-Thrust Optimizer for the Three Body Problem): It is a fully automated Matlab tool for the preliminary design of low-thrust trajectories in the restricted three body problem.
My main proposal (it was specifically for MOLTO-IT, but in the end that changed) was about building a great UI without losing Matlab's efficiency and without the need to rebuild the code in another language, since Matlab is very limited for UI purposes. I proposed to create an architecture that enables communication between Matlab and external applications, using Matlab's Python engine through an API. That is why my proposal is called MOLTO-IT As A Service (MaaS). I quote myself:
The objectives of this project are to optimize MOLTO-IT, trying to reach the best performance given the heavy numerical processing this project performs, and to create an API that exposes this Matlab module as a service, where users send a POST request with the necessary inputs and receive the necessary information in return, such as orbit parameters, time, fuel consumed, or even graphs. This will allow the team to create a better and more attractive graphical interface without losing Matlab's efficiency, creating in this way the possibility to use this service wherever you want, such as in mobile, web and desktop applications.
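To make the idea concrete, a request to such a service could look roughly like the sketch below; the endpoint and field names are hypothetical, chosen only to illustrate the shape of the interaction.

```python
import requests

# Hypothetical endpoint and payload; the real field names are defined by MOLTO-IT.
payload = {
    "departure_body": "Earth",
    "arrival_body": "Mars",
    "launch_window": ["2022-01-01", "2023-01-01"],
    "max_flight_time_years": 5,
    "spacecraft": {"initial_mass_kg": 1000, "thrust_N": 0.2},
}
response = requests.post("http://localhost:5000/api/molto-it", json=payload)
print(response.json())  # e.g. orbit parameters, flight time, fuel consumed
```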
Once we started talking about the project during the community bonding phase, we realized that we could improve it by thinking bigger and making some changes to the initial proposal. So we started thinking about working on MOLTO instead of only MOLTO-IT. Clearly, this new way of seeing the application changed some things, such as the primary design, but the main goals stayed mostly the same.
Main Goals:
Extend the capabilities of MOLTO (originally just MOLTO-IT).
Create the API in a selected programming language (JS or Python).
If possible, create an MVP for MOLTO (originally just MOLTO-IT).
The proof of concept that I created for my application was somewhat different from how MOLTO looks right now. At the URL below you can see my proof of concept of the API and UI from the GSoC application:
And this one is the rebuilt design after we started working on MOLTO. The flow was made for creating a new mission in MOLTO-IT, so click the start button under MOLTO-IT and just go through the flow:
As you can see, there were some big changes: in my GSoC application I built a mobile application, and I ended up creating a web application. But this iterative product development allowed us to reach the main goal for MOLTO, which was always to create an application available to anyone.
MOLTO IT
I will explain how MOLTO-IT works here to avoid extra explanations below. MOLTO-IT is the only branch that is completely finished right now; MOLTO-OR and MOLTO-3BP are under development. We have focused on making this service work, since OR and 3BP will work in much the same way.
MOLTO-IT is a fully automated Matlab tool for the preliminary design of low-thrust, multi-gravity assist trajectories. That means it can tell us which is the best trajectory for an interplanetary mission. Quoting its main goal:
The purpose of MOLTO-IT is to provide a fast and robust mission design environment that allows the user to quickly and inexpensively perform trade studies of various mission configurations and conduct low-fidelity analysis.
All of this is achieved through an outer loop that provides multi-objective optimization via a genetic algorithm (NSGA-II) and an inner loop that supplies gradient-based optimization (fmincon) of a shape-based low-thrust trajectory parameterization. The mission designer needs to input a series of parameters, such as the spacecraft's departure body, its final destination and some hardware characteristics (launch vehicle, mass, propulsion), as well as the range of launch dates, flight times and a list of planets available for flybys. The software tool then uses these data to automatically compute the set of low-thrust trajectories, including the number, sequence and configuration of flybys, that accomplish the mission most efficiently. Candidate trajectories are evaluated and compared in terms of total flight time and propellant mass consumed. This comparison forms a Pareto front, which looks like this in the Matlab plot:
After the whole process is finished, we can see the last generation, which contains the Pareto points. Every point is a best fit for some mission designer's purpose: if you want to go to Mars and arrive in less than one year, you know you will sacrifice most of your fuel, but if you can afford a long journey, say 5 years, you will save a lot of fuel. Whichever point you select in the last generation, you can be confident it is an optimal solution. By the way, once you are at this part of the process, you can select the most convenient Pareto point for your mission, and that lets the tool create the trajectory.
The trajectory is created by other functions that only need the mission configuration and the selected Pareto point. After that you will be able to see the trajectory, which includes everything a mission designer should know, such as the number of flybys, time, where to apply the impulses, and other parameters. I attach an image below of how the plot looks.
FIRST EVALUATION
The whole process described above took place during the community bonding phase and maybe the first week before the first evaluation. During the first evaluation I was mainly focused on the API, since I really needed to double-check that everything would work. As you can imagine, if something goes wrong with the communication between Matlab and the API, anything could happen.
The API was created in Python using Flask, the Matlab Engine for Python, Redis, Celery, Socket.IO, and Google Drive (gspread). Why Google Drive? That is something I will talk about later!
The UI was created using React.js, Redux, the Socket.IO client, Recharts, and some other libraries, completely built using Hooks, even for Redux!
During my regular meetings with David, we started thinking about what we would need to change in the Matlab code in order to call the main function from the API. We quickly realized that we should change the main function so that it receives a JSON document; until then it received a struct from an examples file. After that, a route was created in Flask to receive the data from the UI, process it, and finally send it to Matlab. The main advantage of using the Matlab Engine for Python is that we can call Matlab functions from Python, and the main function, "molto_it.m", was the only one we needed to call in order to trigger the whole process. Up to this point we were happy because everything was working like a charm. So I started working on the UI, which ended up looking somewhat different, since I made some changes on the fly: we decided to implement a slider on the home page instead of the images, and a typing animation within the slider.
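A minimal sketch of that bridge is shown below. It assumes the Matlab Engine for Python is installed and that molto_it accepts a JSON string (which is how we adapted it); the path and the argument handling are simplified placeholders.

```python
import json
import matlab.engine

def run_molto_it(mission_config: dict) -> None:
    """Start a Matlab session and trigger the whole MOLTO-IT process.

    mission_config is the dictionary built from the UI form; it is serialized
    to JSON because the main Matlab function was changed to accept JSON input.
    """
    eng = matlab.engine.start_matlab()
    eng.addpath(eng.genpath("path/to/MOLTO-IT"), nargout=0)  # placeholder path
    eng.molto_it(json.dumps(mission_config), nargout=0)      # assumed signature
    eng.quit()
```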
As we advanced in the UI, we also realized that it would be a problem if all the content were static. We also thought about implementing a user architecture to let users save their missions; we knew a database would be needed, but we were trying to avoid it, at least for GSoC purposes, while keeping that feature in mind for the near future. That is where Google Drive appears, at least temporarily: I proposed a feature where all the content comes from a spreadsheet located in Google Drive, so every time we want to change something, it is as easy as opening the spreadsheet and editing the content. The same goes for the collaborators component, which refreshes the list of collaborators every so often. I would like to clarify that this feature will change; it was made just for MVP purposes. So I finished my first evaluation implementing this feature, and it actually works effectively.
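A hedged sketch of the spreadsheet-backed content idea is below; the credentials file, spreadsheet name and worksheet layout are placeholders, only the gspread calls are the real pattern.

```python
import gspread

def load_site_content():
    """Fetch editable UI content (slider texts, collaborators, ...) from a
    Google Sheet, so changing the site is as easy as editing the spreadsheet."""
    gc = gspread.service_account(filename="credentials.json")  # placeholder file
    sheet = gc.open("molto-site-content")                       # placeholder name
    sliders = sheet.worksheet("sliders").get_all_records()
    collaborators = sheet.worksheet("collaborators").get_all_records()
    return {"sliders": sliders, "collaborators": collaborators}
```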
SECOND EVALUATION
At this point, we needed to worry about Matlab's response, since the process is composed of two main tasks: the Pareto front and the trajectory. The real problem was that both of them plot their results based on real-time data. So our options were either to send an image of the final plot or to find a way to send the data in real time through the API to the frontend. But there was another problem: once the process starts, it returns the generations in real time, which was an issue because the API was making a POST request that waits for a single response, so we were only able to receive the first generation.
We were having problems due to the synchronous nature of Python. In this case, the requirement was to constantly send data to the UI, at least every time the software creates a new generation, in order to display the data in real time. Quite a task! I thought about the feasibility of using sockets for the communication, but it was tricky because I would need a trigger to let me know when a generation is finished, and obviously all of this should run in parallel to the request. Talking with David, we agreed that the best approach was to create a new file every time a generation is completed, so that I could create a socket handler constantly looking for files in a temporary directory. So that is what we did: every user that creates a mission gets a temporary UUID that maps to a temporary directory on the server, and the Matlab function redirects all the created files to this directory. Once the first generation exists, you can see the plot in real time directly in the UI. All the directories are deleted at night on the same day.
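The file-watching socket can be sketched roughly like this; the directory layout, event name and timing are illustrative, not the exact production code.

```python
import os
from flask import Flask
from flask_socketio import SocketIO

app = Flask(__name__)
socketio = SocketIO(app)
RESULTS_ROOT = "/tmp/molto"  # placeholder location for the per-mission directories

def watch_generations(mission_uuid: str) -> None:
    """Poll the mission's temporary directory and push every new generation
    file to the UI as soon as Matlab writes it."""
    mission_dir = os.path.join(RESULTS_ROOT, mission_uuid)
    seen = set()
    while True:  # the real code also stops once the Matlab process reports completion
        for name in sorted(os.listdir(mission_dir)):
            if name not in seen:
                seen.add(name)
                with open(os.path.join(mission_dir, name)) as f:
                    socketio.emit("generation", {"mission": mission_uuid,
                                                 "data": f.read()})
        socketio.sleep(1)  # cooperative sleep, plays nicely with eventlet

# Started for each mission, e.g.:
#   socketio.start_background_task(watch_generations, mission_uuid)
```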
At that moment we had almost everything needed to plot the Pareto front, but in the UI we needed to capture all the data and store it the right way. We should be able to get the data at any moment, and this data should be available across almost the entire application. That is why I decided to use Redux. Redux is one of the best tools for data management if you are using React, so I implemented the Redux architecture to handle all the data from the API. In the end, the store looks like this.
All the data comes from a kind of form, where the user fills in all the inputs that are sent to the API. This allows me to send a single POST request with all the data from the store once the user has finished the whole flow. It also allows me to remember the user's selections, so once you select something, you can go back and your selection will still be there.
FINAL EVALUATION
In the final evaluation we were finishing minor details, such as design details. For example, we had been testing the whole application with a dropdown for selecting planets, but of course there should be a better way to do this. That was one of the big new features I got to work on: I used a library to display the planets in a nicer way. You already saw the planets feature in the GIFs I included before, but I will leave a static image of the feature here.
Of course, that was not all. At that point we could plot the Pareto front, but the last part requires plotting the trajectory. Given the time constraints, we chose to just display the trajectory plot from Matlab in the UI, at least for GSoC purposes. How did we do it? As I explained before, once you get the final Pareto front, the user can select one point, the optimal point for their mission in terms of mass and time. So we call the API again at the same route, but this time with a flag. This flag means that you already have the Pareto point that fits your mission design, so the Matlab function can detect it and just create the trajectory instead of calling the genetic algorithm. Something cool is that you can go back, select another Pareto point, and just call the API again; this creates the trajectory for the newly selected point and lets users iterate between different configurations for the same mission almost instantly. By the way, it finally looks like this: it is the last view, where you can share or download the preliminary results of your mission or create a new one.
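In sketch form, the dispatch works roughly as below; the route name, the flag field and the two helpers are hypothetical stand-ins for the real Matlab calls.

```python
from flask import Flask, request, jsonify

app = Flask(__name__)

def run_genetic_algorithm(data):   # hypothetical placeholder for the Matlab call
    return {"status": "optimizing", "mission": data}

def build_trajectory(data):        # hypothetical placeholder for the Matlab call
    return {"status": "trajectory", "point": data["pareto_point"]}

@app.route("/api/molto-it", methods=["POST"])  # placeholder route name
def molto_it_endpoint():
    data = request.get_json()
    if data.get("pareto_point") is not None:
        # The user already picked a point from the final Pareto front:
        # skip the genetic algorithm and only build the trajectory.
        return jsonify(build_trajectory(data))
    # First call: run the whole NSGA-II optimization for this mission.
    return jsonify(run_genetic_algorithm(data))
```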
The last feature I implemented is related to something we saw in the second evaluation. As you know, this Python setup runs synchronously and every task is sequential, and, as you also know by now, every request lasts as long as the number of generations the user requests. So if the user requests a genetic algorithm run with 200 generations, the server will be busy for a long time. The problem is that if 3 users design a mission at the same time, they will probably have issues, because 2 of them will wait much longer than a normal request. In order to avoid this, I started using threads and parallel tasks. How did I do it? Using Celery, Redis, and Eventlet (a small sketch follows the short descriptions below). This lets me manage many requests and run them in the background, so the server is always available to new users without affecting the running times.
Celery is an asynchronous task queue/job queue based on distributed message passing. It is focused on real-time operation, but supports scheduling as well. The execution units, called tasks, are executed concurrently on one or more worker servers using multiprocessing or Eventlet.
Eventlet is a concurrent networking library for Python that allows you to change how you run your code, not how you write it.
Redis is an open source (BSD licensed), in-memory data structure store, used as a database, cache and message broker.
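A minimal sketch of how these pieces fit together is below; the task name, broker URL and the Matlab call are placeholders, only the Celery/Redis wiring is the real pattern.

```python
from celery import Celery

# Redis acts as the message broker (and here also as the result backend).
celery_app = Celery("molto_tasks",
                    broker="redis://localhost:6379/0",
                    backend="redis://localhost:6379/0")

@celery_app.task
def run_mission(mission_config: dict, mission_uuid: str) -> str:
    """Long-running job: trigger the Matlab optimization in a worker process,
    so the Flask request can return immediately."""
    # ... call the Matlab engine here, writing its results into the mission's
    # temporary directory (picked up by the file-watching socket shown above) ...
    return mission_uuid

# From the Flask route, the job is queued instead of executed inline:
#   run_mission.delay(mission_config, mission_uuid)
# The worker is started with the eventlet pool, e.g. (assuming this file is tasks.py):
#   celery -A tasks worker --pool=eventlet
```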
Something cool that could become a new feature is the fact that the user could keep the UUID created with their mission. I mean, if we could send it to the user by email, they could create the mission, go grab lunch or whatever, come back, enter the UUID, and the application would find the created directory and display the final plots in just seconds! This is possible thanks to the sockets we use to search for files and directories.
I would like to say that there are a lot of features I did not cover in this blog post, just to keep it short, since there is a lot of information; I only went through the most important ones. But if you have questions about any other component or anything else, please do not hesitate to let me know. One last thing: last month we had some problems with the servers, which is why there is no production application right now. Hopefully on Monday, 26 August, everything will work again, and then I will push the application to production. Once it works, I will edit this post to share it. Right now, everything is running in a development environment.
What’s next?
As I wrote before, there are some TO-DOs that I will be working on for the rest of the year. We are open to anyone who wants to contribute to this project.
1.- Create and implement an architecture to save users and missions (MongoDB).
2.- Send an email with the UUID so users can come back after they create the mission.
3.- CMS for sliders and collaborators.
4.- Improve the application's responsiveness.
Repositories:
You will find the documentation for every project in its README.