HOSTmi Team successfully delivered first miGUIDE
October 12, 2021 - Pouya Haschemi
Over the past months, the HOSTmi team has been building a dedicated enterprise solution based on the same software that powers our miSMART Platform. This software is called miGUIDE.
What is miGUIDE?
miGUIDE is a web-based requirements and asset management tool that enables direct interaction between payload-owning customers and companies offering launch and hosted payload services (carriers), without the need for intermediaries as required in miSMART, for instance. This cloud-based, white-label Software-as-a-Service solution opens the “door” to a carrier's own customers, enabling efficient and targeted interactions. Each customer has a private account and can log in to a dedicated, closed instance.
Designed for the space industry, the solution streamlines all mission-related interactions (inquiries, requirements specification, file exchange, life cycle management) between a dedicated carrier and its customers, and in addition automates many aspects of processes that are currently carried out manually across different platforms and software tools. The aim of miGUIDE is to further lower mission management costs and to help companies scale their service offerings through the software's matchmaking capabilities.
The first rollout
Within the SLOTD4.0 project, in which HOSTmi is responsible for all digitization aspects, we have built a dedicated version of the miGUIDE SaaS to enable German Orbital Systems (GOS) and RWTH Aachen to interact smoothly with customers, both internally and externally. Both organizations are building the decoupled satellite system, which can host a wide range of payloads.
On the one hand, the software allows the providing party (in this case GOS and RWTH) to digitally specify all required systems engineering aspects of the offered hosted payload solutions. This enables a transparent, data-based display of all available solutions to their customers.
On the other hand, their customers are able to specify the systems engineering aspects of their desired mission and of the payload to be flown (e.g. an IOD payload) through a customized questionnaire tailored to the needs of GOS and RWTH.
Commercial rollout begins
Having demonstrated the high-quality functionality of our white-label SaaS, we are now offering the miGUIDE solution to the entire space industry. It allows a high degree of customization of the configuration as well as of the service offering and the customer relationship management between space carriers (launch and/or hosted payload providers) and their own customers.
To be clear: the tool can easily be customized to a desired color scheme and hosted on a domain or subdomain of your choice.
During the development phase of the tool we already involved multiple European and US launch providers as well as satellite manufacturers, so the entire software is built by space engineers for space engineers!
Contact us at email@example.com or download the miGUIDE User Manual and request a free demo to get a first impression of this next-generation asset and requirements management web application, which you can purchase, customize, and host directly on your website to take the customer journey of your clients to the next level.
Spotlight on Space Talents presents: Überflieger winner EXCISS
June 22, 2021 - Guest Author: Tamara Koch, Oliver Christ, and Philomena-Theresa Genzel (Publisher: Shahrokh Khodabakhshi)
With our format Spotlight on Space Talents, we want to open a platform for young and talented groups, students, NGOs, universities and other people from the so-called "space generation". In this edition, we would like to present one of the three winners of the first batch of the Überflieger Program, the team from the Goethe University Frankfurt with the experiment EXCISS:
We are a group of geoscience students (bachelor, master, and PhD students), a physics student, two professors, members of the Hackerspace Frankfurt e.V. (a maker club), and further employees of the Goethe University, including the members of the workshop.
Since the majority of our team has a background in geoscience, we are interested in the evolution of terrestrial planets like the Earth, and especially in the building processes that led to these planets. Meteorites represent material from outer space, which bears information about how planets like our Earth have formed. Being geoscientists gives us the opportunity to hold this kind of material in our own hands after it has travelled millions of kilometers through space. With a variety of modern instruments, we are able to analyze meteorites thoroughly to unravel more and more puzzle pieces of our Solar System. When we heard about the Überflieger competition, we were fascinated by the opportunity to develop and carry out an experiment aboard the ISS and to learn more about planet formation in an environment of zero gravity.
Dust melting under micro-gravity conditions
Meteorites originate mainly from asteroids, which represent one of the steps of planetary formation. A major part of these meteorites contains small spherules less than one millimeter in size, which are called chondrules. Chondrules probably formed more than 4.6 billion years ago, when our Solar System consisted of a dense nebula of gas and dust particles. An event of unknown nature melted these dust particles into tiny spherules. Over time, these spherules aggregated into larger bodies like asteroids, which subsequently formed planetesimals and eventually planets through collisional processes. Therefore, chondrules are also called the building blocks of our Solar System. However, even after decades of research dedicated to the origin of these building blocks, the exact formation conditions and the underlying mechanism remain mysterious and strongly debated in the scientific community.
When we decided to join the DLR competition, we quickly agreed that chondrule formation might be a very interesting topic to study under long-term microgravity conditions, because an experiment like this had never been carried out before. Within the scientific community, a variety of hypotheses regarding the formation of chondrules have been postulated, including the formation of chondrules by lightning discharges in the early Solar Nebula. To prove whether or not this hypothesis, and of course all the others, are realistic, it is necessary to create an environment similar to the early stage of our Solar System. Besides other crucial parameters, this requires long-term microgravity, and the International Space Station (ISS) is the only place where this requirement can be met. Thus, we decided to develop an experiment which simulates dust melting by electric arc discharges under microgravity conditions on the ISS. In this experiment, dust particles consisting of a mineral typical for natural chondrules in meteorites were floating freely while being exposed to arc discharges. In order to obtain preliminary results, the experiment was filmed so we could analyze whether particles melted and fused into chondrule analogues.
After we were chosen as one of the winning teams, we immediately started with the set-up of the experiment. Our main challenge was to develop a completely new experiment, in a very short time, that was compatible with the available power, the limited space, the data transfer possibilities and NASA safety requirements (for example, electronics that induce arc discharges and Li-Ion batteries had to follow especially strict rules). We received great help from the members of Hackerspace Ffm e.V. and friends, who supported us with the development of the software and hardware; without their help the experiment would not have been feasible. In the end, we were happy that our electronics and data transfer worked almost as planned, and despite all difficulties we performed the very first experiment of this kind aboard the ISS. With the return of the experiment, probably the most interesting part (from the scientific point of view) began: the analysis of the transformed material in our laboratories at the Goethe University.
The EXCISS experiment has already yielded new scientific findings, and the data will be published soon. Furthermore, current planning involves a successor experiment to EXCISS, but no official commitment has been made yet. Stay tuned or reach out to the team for more information if you are interested.
Spotlight on Space Talents presents: Überflieger winner PAPELL
June 16, 2021 - Guest Author: Christopher Behrmann, Daniel Bölke, Nicolas Heinz, Saskia Sütterlin, Felix Schäfer, Franziska Hild, and Manfred Ehresmann (Publisher: Shahrokh Khodabakhshi)
With our format Spotlight on Space Talents, we want to open a platform for young and talented groups, students, NGOs, universities and other people from the so-called "space generation". In this edition, we would like to present one of the three winners of the first batch of the Überflieger Program, the team from the University of Stuttgart with the experiment PAPELL:
PAPELL is an experiment investigating ferrofluid manipulation in microgravity on the International Space Station, for applications such as a mechanically free pump. The experiment was built by a team of 35 students from the Small Satellite Student Society University of Stuttgart (KSat e.V.), starting back in May 2017, and was financed through funding from the German Space Agency (DLR) and donations from industry supporters. With this article, Christopher Behrmann, Daniel Bölke, Nicolas Heinz, Saskia Sütterlin, Felix Schäfer, Franziska Hild, and Manfred Ehresmann are helping us to better understand the experiment PAPELL and the experiences of the team during these exciting times.
Pump Application using Pulsed Electromagnets for Liquid relocation - PAPELL
Space is hard; making components work reliably in space is even harder. With the PAPELL project we explored a novel principle for operating mechanisms in space. For this we combined magnetisable liquids (ferrofluids) with magnetic fields. In this way, we were able to show that it is possible to intentionally manipulate such fluids in space in order to replace conventional mechanisms. All this was possible thanks to a passionate team of more than 35 students from the University of Stuttgart.
The idea of using magnetisable liquids and magnetic fields to achieve a new type of mechanism came spontaneously, inspired by the early achievements of Steve Papell. It was initially presented to a small team and subsequently developed into an experiment conducted on board the International Space Station.
The small team consisted of members of KSat e.V., which brings together students from various technical fields with a passion for space and thus combines a variety of talents. By utilising experience from previous experiments and space projects on sounding rockets and CubeSats, the team was able to submit a winning proposal in the DLR's Überflieger competition, in which a slot for an experiment on the ISS was awarded.
Experiments in space are exciting but challenging. Conventional mechanics are generally simplified or avoided entirely due to the high costs stemming from demanding qualification requirements. Removing these mechanics and replacing them with fluidic mechanisms allows us, in principle, to execute more experiments in space at lower cost and risk. Ideally, this technology will be used in the future, when humans explore space beyond Earth orbit and highly reliable, maintenance-free components based on magnetic liquid manipulation will be an important utility, for example in highly reliable attitude control systems or maintenance-free pumps.
Challenges and Achievements
The main challenge of the participation in the Überflieger competition was the limited time. The whole experiment had to be designed from scratch in less than one year. This included the full engineering process for the mechanical and electronical design, part selection, procurement, software development, testing, and documentation for ISS compliance. All this had to be performed while still studying full-time at the University of Stuttgart.
Apart from the time constraints, once the experiment was installed on the ISS, the operation was a challenge in itself. A time limit of 1-3 hours per day (late evenings due to the US time shift) was given for interacting with the experiment. Occurring problems had to be resolved quickly and efficiently, as the number of experiment days was very limited. New data had to be analysed immediately and new experiment plans had to be developed for the next operation window. Many days concluded with more new unknowns than success stories, and for many days the state of the experiment hardware was uncertain due to in-orbit debugging.
Solving all these challenges led to the greatest achievement for the PAPELL team: seeing the videos of the ferrofluid movement in space for the first time. This success was especially heartfelt, as the endurance during the challenging, fast design and development process finally paid off in the behaviour of the magnetisable liquid manipulation we had hoped to see in space.
We as the PAPELL team are very proud to have formed a team capable of designing an experiment for the ISS from scratch in a very short time, as part of the mission of Alexander Gerst. Moreover, we finally succeeded in performing our experiment, which as a bonus even survived the transport back to Earth, so we were able to analyse it further beyond its orbital stay.
The main supporter of our experiment was the German Aerospace Center, which ran the Überflieger competition, awarded us an experiment slot on the International Space Station, and provided funding for the development of the experiment.
DreamUp / Nanoracks gave valuable advice and support on the design, which helped the experiment fit into their experiment rack on the ISS, as well as technical advice on space qualification requirements.
KSat e.V. provided us with a platform to assemble a team of passionate and dedicated students. Additionally, KSat e.V. aided in acquiring additional funding to offset travel costs and in finding sponsors for specific components.
Hereby we would like to thank again all supporters and hope for further exciting collaboration opportunities.
What the Future holds
Persistent hard work is necessary for success, but having fun and great morale in the team is equally important. Communication is key. The project does not end with the launch or the end of the experimenting phase, but only when the last analysis is done and finalised.
Currently within KSat e.V. several component development activities based on magnetisable liquid manipulation are on-going. Oscillators, pumps, switches, and reaction wheel prototypes have been developed and the teams intend to test these prototypes on a REXUS sounding rocket flight.
If you are interested in supporting these activities feel free to contact the team. Donations for the Small Satellite Student Society of the University are always welcome.
Spotlight on Space Talents presents: Überflieger winner ARISE
June 08, 2021 - Guest Author: Felix Jungmann (Publisher: Shahrokh Khodabakhshi)
With our format Spotlight on Space Talents, we want to open a platform for young and talented groups, students, NGOs, universities and other people from the so-called "space generation". In this edition, we would like to present one of the three winners of the first batch of the Überflieger Program, the team from the University of Duisburg-Essen with the experiment ARISE:
ARISE – Exploring the origin of planet formation on the ISS. This experiment was built by a team of 5 students from the University of Duisburg-Essen starting back in May 2017 and was financed through funding from the German Space Agency (DLR). In this article Felix Jungmann is helping us to better understand the experiment ARISE and the experiences of the team during some challenging but exciting times.
Planets form in disks of gas and micrometre-sized dust around new-born stars. At the beginning, the dust can grow by hit-and-stick collisions, but once the aggregates reach sizes of centimetres this growth stops because particles only bounce off each other. We built an experiment to investigate how it is possible that aggregates grow beyond that size to eventually become a planet. Our idea is that electrostatic charge, acquired in previous collisions, can be a powerful glue holding the dust together. Within the Überflieger project, our experiment ARISE mimics these conditions in the microgravity of the International Space Station (ISS) and shall test whether charge can induce clustering.
Journey to Space
How planets form in detail is still an unsolved mystery. Knowing from telescopes scanning the sky that planets are no exception, many researchers are trying to design models to understand how such formations are possible. The results are always the same: the beginning and the end of the growth phases are well understood, but what happens in between is unclear. At first, dust around a new-born star can grow just by hit and stick, and later, gravity holds things together. But what happens with centimetre- or decimetre-sized bodies? Bearing in mind that billiard balls do not tend to stick to each other, this is a question that cannot be answered easily.
We are a group of PhD students from the University of Duisburg-Essen who have been doing research on exactly this question for years now. Since the places of planet formation are very far away, we cannot observe this growth phase with a telescope. Therefore, we design experiments on Earth that mimic space conditions and try to observe growth in the critical phases. Since weightlessness is omnipresent in space, we need to use special platforms that provide microgravity for our experiments, for example parabolic flights, sounding rockets, or the drop tower in Bremen. A big disadvantage of all of these is that their microgravity time lasts only for seconds. To test our idea, we needed much more time, and therefore we applied to be part of the Überflieger program of the DLR.
Luckily, in May 2017 our idea was accepted to start its journey into space. The schedule was quite tough, so we only had 9 months to design and build a space-ready experiment that meets all the challenging requirements. The most difficult point was that its volume was limited to 1.5 litres and the whole setup needed to be placed in a small aluminium box. But also, the power supply and the weight were limited to a minimum. We already had a lot of experience with other platforms of microgravity, but this was definitely a different story.
So, Tobias Steinpilz (project lead) designed the electronics, Gregor Musiolik and Felix Jungmann the software, and Maximilian Kruss and Tunahan Demirci the mechanics. We were supported by many people, such as Prof. Gerhard Wurm and Jens Teiser, who gave advice on scientific issues, Manfred Aderholz, who helped with technical issues, and all other members of the astrophysics group in Duisburg. Without the help of Softwareentwicklung Recktor, who designed the software frame for an experiment on the ISS, we would not have been able to acquire that much data.
All in all, the construction of the experiment was a very exhausting time, but when all the work was done in March 2018, it was a great moment to pack our experiment into a box for shipping. After a lot of tests by NASA, our experiment was ready to fly to the ISS on a Falcon 9 on 29 June 2018. For this, we were invited to watch the launch at Cape Canaveral. It was a great feeling to know that something we had built ourselves was now on its way into space. What made us even more excited was the first image from the inside of our experiment chamber in July. We are proud to say that everything worked as planned, all motors are moving, and we could download many more videos than expected. The interpretation of all the acquired data is still a big task ahead of us.
More answers ahead of us
Sending an experiment to space is a tough job! There are plenty of technical as well as organizational challenges that are not easy to solve. Nevertheless, it was worth the effort and a unique opportunity. We have learned a lot about processes in space, but also about how to construct further experiments that are even more efficient in exploring planet formation. The results of ARISE gave us abundant input for the follow-up experiments on a stratospheric balloon (BEXUS29 – REXUS/BEXUS) and on a sounding rocket (SubOrbital Express 3 – M15) provided by ESA. ARISE has opened up further research topics, and these will help to provide even more answers to the question of how planets form.
Up until now, we have published two papers concerning the setup and the first results, which you can find here:
Ground segment is a term used in satellite communication that encompasses all components installed on the ground that are needed for a spacecraft mission. The main purpose of the ground segment is to control and monitor the spacecraft as well as to distribute and process the spacecraft data. As shown in the conceptual picture below, the ground segment can consist of multiple Ground Stations, which provide communication with the spacecraft. Another main element is the Mission Control Center, from which the spacecraft is managed and operated. The work conducted inside the Mission Control Center is called Mission Operations. The Mission Control Center also serves as a network hub to distribute the data to the relevant users. There are several different users, such as the Operator, Simulation Officer, and Scientist, and they all work with the data provided through the ground segment.
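The data flow described above can be sketched as a minimal model: ground stations receive telemetry, and the Mission Control Center collects it and distributes it to the registered users. All class and field names below are illustrative, not taken from any real ground segment software:

```python
from dataclasses import dataclass, field

@dataclass
class GroundStation:
    """Receives raw downlink data and turns it into packets."""
    name: str

    def receive_telemetry(self, raw: bytes) -> dict:
        # In reality this step involves demodulation, decoding,
        # and frame extraction; here we just tag the source station.
        return {"station": self.name, "payload": raw.decode()}

@dataclass
class MissionControlCenter:
    """Hub role: monitors the spacecraft and routes data to users."""
    users: list = field(default_factory=list)    # e.g. Operator, Scientist
    archive: list = field(default_factory=list)  # received packets

    def ingest(self, packet: dict) -> None:
        self.archive.append(packet)              # monitoring & storage

    def distribute(self) -> dict:
        # Every registered user gets access to the archived data.
        return {user: list(self.archive) for user in self.users}

station = GroundStation("Weilheim")
mcc = MissionControlCenter(users=["Operator", "Scientist"])
mcc.ingest(station.receive_telemetry(b"BAT_V=12.1"))
routed = mcc.distribute()
```

In a real ground segment each of these steps is a substantial system of its own; the sketch only mirrors the hub-and-spoke shape of the architecture described above.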
Ground Station as a Service
The ground station, as the link between the control center and the spacecraft, is a key element in every space mission. A lot of different types of data need to be transmitted to and from Earth or between individual spacecraft: so-called telemetry for spacecraft monitoring, commands for control, and payload data.
As described in a previous article, “Benefits of Cloud Computing”, cloud computing is a growing and powerful technology. Some cloud providers, with offerings like AWS Ground Station or Azure Orbital, now provide fully managed Ground Station as a Service, which lets you control satellite communications, process data, and scale your operations without worrying about building or managing your own ground station infrastructure. Because these services are closely connected to the cloud infrastructure, they can benefit from big data analysis, fast data processing, and cloud computing for high-performance AI applications.
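One small piece of what such a managed service automates is contact scheduling: picking a ground station pass long enough to be useful. The sketch below is a simplified, hypothetical stand-in for that reservation step, not code for any provider's actual API:

```python
from datetime import datetime, timedelta

def next_contact(windows, earliest, min_duration=timedelta(minutes=5)):
    """Return the first visibility window starting at or after `earliest`
    that is long enough for a useful pass, or None if there is none."""
    for start, end in sorted(windows):
        if start >= earliest and end - start >= min_duration:
            return (start, end)
    return None

# Hypothetical pass predictions for one satellite over one station:
windows = [
    (datetime(2021, 6, 1, 9, 0), datetime(2021, 6, 1, 9, 4)),    # too short
    (datetime(2021, 6, 1, 11, 0), datetime(2021, 6, 1, 11, 12)), # usable
]
contact = next_contact(windows, earliest=datetime(2021, 6, 1, 8, 0))
```

A real GSaaS platform layers antenna allocation, licensing, and data delivery on top of this; the point here is only that the scheduling decision itself is simple once the infrastructure is managed for you.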
“Houston, we have a problem!”
The success of a space mission depends not only on a properly designed and built space segment and the successful launch via a launch segment. It also depends on the successful Mission Operations, carried out by a team of experts using the infrastructure and processes of the mission’s ground segment.
A failure on a space mission is always possible, and the damage can be very high, especially on human exploration missions. Therefore, an important task of the operator in mission control is to act quickly and appropriately if a problem occurs. On the famous Apollo 13 mission, for example, an oxygen tank in the service module (SM) failed two days into the mission. Multiple options had to be considered and actions performed to save the astronauts' lives. One example was a special task force that improvised an adapter for the ventilation system with the few resources available onboard Apollo 13. This example illustrates how complex and dangerous a space mission can end up being, and that not everything can be predicted in advance.
We hope that this short article has given you a brief insight into some of the ground segment elements needed for a space mission. Let us know if we can support you on your next successful space mission!
Überflieger - Student Experiment on the ISS
May 25, 2021 - Interviewee: Johannes Weppler / Interviewer: Shahrokh Khodabakhshi
Read this interview to find out how the Überflieger program can possibly help you launch your ideas into space. We at HOSTmi have interviewed Mr. Johannes Weppler from the DLR Space Administration about this particular program to help you better understand it and to hopefully initiate some great upcoming proposals. Stay tuned for more articles regarding this program and to read more about its previous winners in the upcoming weeks!
DLR (Deutsches Zentrum für Luft- und Raumfahrt/German Aerospace Center) is the Federal Republic of Germany's research centre for aeronautics and space. It conducts research and development activities in the fields of aeronautics, space, energy, transport, security and digitalisation. The Überflieger program is organized by the German Space Agency at DLR which plans and implements the national space program on behalf of the federal government. The department for Research and Exploration within the German Space Agency is responsible for Überflieger.
Überflieger is a national competition for students at German universities to design and build experiments for being operated on the International Space Station (ISS).
Could you tell us what this program is?
The Überflieger competition was started in December 2016. Students from all German universities were invited to submit their ideas for ISS experiments. The most important prerequisites were that the experiments had to fit into a container of 10 x 10 x 15 cm size and that they would not pose any harm to the ISS. The submitted experiment ideas were reviewed and evaluated by an expert panel and in the end three winning teams were selected.
The three winning teams were supported financially and technically to complete the design of their experiments, to build them, and to operate them remotely on the ISS. In the course of 2018, all three experiments were sent to the ISS and ran for at least 30 days, producing valuable scientific data for the student teams.
In addition to the educational benefits of being part of the program, the winning teams got the chance to witness the launch of their experiments to the ISS in person at the launch sites in the USA.
Why was this program introduced?
Besides its work in research and technology, one of DLR's goals is to foster the education of the next generation of scientists, engineers and explorers. DLR is targeting different age groups with its educational activities, from primary schools to universities. Überflieger offers students the unique opportunity to gain hands-on experience in implementing a microgravity ISS experiment, something that is very rare in their regular curriculum. With its constraints in size, mass and power, Überflieger challenges the participants to find innovative solutions to maximize the benefits of their experiment.
Who can participate, and how?
Überflieger 2, the next round in the program is currently under preparation. We expect the publication of the call to submit proposals sometime in summer or autumn 2021.
For Überflieger 2 DLR will cooperate with the Luxembourg Space Agency (LSA) thus enabling university students from Luxembourg to participate as well. The program will be open for students (undergraduate, graduate and PhD) from all universities in Germany and Luxembourg. Once the call is published a dedicated website will provide all necessary information for how to participate.
What characterizes exceptional ideas in your opinion?
The size, mass and power available to the students for their experiments are very limited. Ideas that use innovative solutions to maximize the output of the experiments are what we are looking for.
What support do you provide to the winner teams?
The winning teams will get partial financial support for expenses related to their project. Technical experts will assist them in how to design and build an experiment that is suitable for the ISS and feasible to be implemented by the student teams.
How do you see the program evolving over the next 5 to 10 years?
It took five years from the start of Überflieger to the start of Überflieger 2. In future we would like to have a faster sequence of competition rounds thus enabling more students to participate in the program. We are also open for extending the program to other cooperation partners both within and outside the space community.
Who should students and readers contact in case of questions?
Questions can be sent to firstname.lastname@example.org. All relevant information will be posted on the dedicated Überflieger website once it is launched.
Last but not least, what would you tell student teams to motivate them to apply to this program?
Think about what makes the environment on the ISS so unique and how it could help to solve problems in science and technology. Assemble an interdisciplinary team (not only engineers or physicists) and start brainstorming ideas. Being one of the winning teams will cost you a lot of spare time over the next one and a half years, but it will reward you with hands-on experience that few universities can offer and that employers are looking for, and with the once-in-a-lifetime experience of seeing your experiment blast off on a rocket and feeling the powerful waves caused by its engines.
Asset Administration Shell - The architectural standard to enable interoperable Digital Twins
May 18, 2021 - Pouya Haschemi
The digital twin has gained a lot of attention in recent years. The space industry is also becoming increasingly interested in Industrie 4.0 and digital twins. ESA alone has launched several projects in recent years, with more in the pipeline, with the common goal of facilitating communication between satellite manufacturers, their suppliers and ESA. The focus is on the standardization of engineering models, the development of common systems engineering standards, and the creation of collaboration software. All these initiatives lead towards a holistic concept of a digital twin in line with Industrie 4.0-compliant processes.
Digital Twin – the term
The digital twin (DT) represents a new paradigm. DTs are connected to a real-world counterpart and use various technologies and paradigms such as simulation, artificial intelligence, and augmented reality to optimize the manufacturing and operation processes of their counterparts. DTs are being built at an accelerating pace in both industry and academia, and the term digital twin has reached multiple degrees of acknowledgement. The term has evolved since its first appearance in 2003, when Grieves proposed that a digital twin consists of three parts: the physical product, the virtual product, and their connections. In 2012, NASA defined it as “a multi-physics, multiscale, probabilistic, ultra-fidelity simulation that reflects, in a timely manner, the state of a corresponding twin based on the historical data, real-time sensor data, and physical model”. In 2019, the Industrial Internet Consortium (IIC) proposed the following standard definition, which will also serve as the basis for this article:
“Digital representation, sufficient to meet the requirements of a set of use cases”.
Digital Twin silo problem
To date, different organizations across the globe have started implementing the digital twin with different approaches and proprietary structures. While trying to solve the issues of interoperability and digitization by applying digital twins, these approaches themselves generate a totally new problem: the digital twin silo problem, meaning information silos that inhibit rather than foster cross-system information sharing with partners and enterprises. Therefore, a common conceptual basis for the digital twin, as well as common information and interaction models, are essential to enable interoperable digital twins. This is why, in recent years, the German associations BITKOM, VDMA and ZVEI, together with German industry, have developed a standard that is intended to enable interoperable digital twins in accordance with the Industrie 4.0 standard: the Asset Administration Shell (AAS).
The Asset Administration Shell
The Asset Administration Shell (AAS) is a central concept of the initiative Plattform Industrie 4.0 to enable interoperability. The AAS enables the integration of a physical asset into the Industrie 4.0 network. Thus, the AAS is the standardized digital representation of the asset, enabling interoperability between applications and organizations. The AAS information is continuously updated throughout the lifecycle of an asset. In addition, multiple assets and their associated AAS can form a thematic unit and share a common AAS (system-subsystem relationship). One of the main aspects here is the concrete lifecycle-phase-dependent data formats (such as XML, JSON, RDF, OPC UA and AutomationML) that enable interoperability. To make the AAS usable offline as well, the package format AASX was introduced specifically for this purpose.
In general, the AAS is composed of three elementary aspects, namely data, models, and service interfaces.
Data in AAS: An AAS contains data about its real counterpart (asset) collected throughout its lifecycle. These are design data, manufacturing data, and operational data.
Models in AAS: An AAS contains a variety of computational and visualization models. These are:
Models related to natural laws
Geometric and material dependent models
Visualization models and simulations
Service interfaces (APIs) in AAS: An AAS contains service interfaces for software applications to access its data, execute commands, or retrieve models. The service interfaces take the form of RESTful APIs. These APIs enable interactions among different AAS, interaction between an AAS and applications, as well as interoperability across enterprise boundaries.
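To make the three aspects above concrete, here is a heavily simplified, illustrative sketch of what a serialized AAS could look like in JSON, one of the formats the standard names. All identifiers, property names, and values below are hypothetical, not taken from the official AAS metamodel:

```json
{
  "idShort": "Satellite_X_AAS",
  "id": "https://example.com/aas/satellite-x",
  "assetInformation": {
    "assetKind": "Instance",
    "globalAssetId": "urn:example:asset:satellite-x"
  },
  "submodels": [
    {
      "idShort": "TechnicalData",
      "submodelElements": [
        { "idShort": "Mass", "valueType": "xs:double", "value": "120.5", "unit": "kg" },
        { "idShort": "PowerBudget", "valueType": "xs:double", "value": "450", "unit": "W" }
      ]
    }
  ]
}
```

A RESTful service interface would then expose such submodels to applications, for example via a GET request on a submodel endpoint, so that partner systems can query asset data without a proprietary integration.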
In summary, the AAS can be described as the architectural implementation of the digital twin for Industrie 4.0. The first German companies (e.g. SAP, FESTO) have already demonstrated prototype implementations of the AAS and are expanding this strategy to extend their product portfolios in this area. We at HOSTmi are also pursuing the long-term goal of using this deeply embedded standard to simplify and centralize interoperability along the entire value chain and lifecycle of a satellite and its mission using platform-based applications.
The foundations of what we now refer to as Design Thinking were introduced by Herbert Simon in his landmark 1969 book, The Sciences of the Artificial. Since then, several further studies have been published that describe the principles of Design Thinking and how they apply to various business models.
“The most secure source of new ideas that have true competitive advantage, and hence, higher margins, is customers’ unarticulated needs,” — Jeanne Liedtka.
Design Thinking is a collection of protocols and five distinct phases, commonly referred to as the Design Methodology. These can be applied to a variety of industries and offer a solution-based approach to complex problems that are not well defined.
It is a human-centric approach to innovation that is primarily concerned with creating new concepts capable of transforming complex issues into better alternatives.
In this process, all decisions are based on what customers actually need rather than on historical facts and figures.
Design thinking is more than a process. It's a never-ending investigation. Many of the challenges we face are complex and fundamentally human, and we live and work in a world of interconnected systems.
Phases of Design Thinking
Empathise
This step entails interviewing experts to learn more about a topic of interest; studying, communicating, and interacting with people to better understand their views and motives; and immersing yourself in the physical world to gain a deeper personal understanding. Most business models aim to comprehend their ideal client's needs and desires, but only a few do so in a relational sense. This is at the heart of what Simon Sinek discusses in his book Start With Why: How Great Leaders Inspire Others to Act. People aren't interested in what you do; they're interested in why you do it.
Depending on time limits, a significant amount of data is collected at this phase to be included in the next phase and to create the best possible interpretation of the consumers, their concerns, and the difficulties that accompany the creation of the specific product.
Define
In this phase, the facts generated and collected during the Empathise phase are brought together. This is where you synthesize and analyze your findings in order to identify the key issues in a human-centred manner.
It's essential to define the challenge after we've recognized the unstated requirements of others through empathy. “One should aim to characterize the issue as a problem statement in a human-centred manner,” say Rikke Dam and Teo Yu Siang.
Ideate
In this third phase of the design thinking process, designers are able to start developing innovative ideas. With this foundation in place, the team can begin to "think outside the box". It is important to start the Ideation process with as many suggestions or problem solutions as possible. Towards the end of the Ideation phase, you may choose additional Ideation methods to help you investigate and test your concepts in order to determine the best possible way to solve the problem. This phase of coming up with more effective solutions to the consumers' implicit needs can only begin once those needs have been understood through empathy and the problem has been defined.
Prototype
In terms of how we build concepts to solve problems, prototyping often extends to non-physical solutions. Physical prototyping is absolutely essential at points, but the overall goal of prototyping is to trial solutions in a safe environment.
The design team creates a range of low-cost, scaled-down models of the product, or of particular features contained within it, in order to explore the problem solutions created in the previous stage. Prototypes may be shared and reviewed within the design team, in other divisions, or with a selected group of individuals who are not part of the design team. The aim is to find the best solution for each of the problems identified in the previous three phases. The ideas are incorporated into the designs individually and, based on users' feedback, are either approved, modified and re-examined, or rejected.
By the conclusion of this stage, the design team will have a better understanding of the product's challenges and issues, as well as a greater sense of how potential consumers will respond, perceive, and react when dealing with the final product.
Test
Since Design Thinking doesn't always flow in a straight line, there are instances when prototyping leads back to ideation, and when defining the problem requires more time to be spent on empathizing to reassess the customer's needs.
Because of this recursive nature, by the time we reach the end of the Design Thinking process, testing may only confirm the final step of our solution. At times it can even restart the whole process from the beginning. One of the defining characteristics of Design Thinking is the ability to pass fluidly through all five stages. The results derived during the testing phase are also used to redefine one or more issues and to inform the understanding of the users, the requirements of usability, and how people think, act, and feel, and to empathise again in an iterative process.
Terraform and Ansible, frameworks used to automate cloud infrastructure
April 26, 2021 - Muhammad Faiz Usmani
The Why’s and What’s
Before the concept of DevOps, the deployment of apps into various environments was a daunting task requiring time, effort, and a high level of skill. A separate operations team with extensive knowledge and experience of various servers, virtual machines, and host OSs (Windows, Linux) used to manually access the server and configure it according to the requirements of the app. In small organizations, the app developers themselves were usually in charge of this activity. Well, not a good sight to imagine.
Now, thanks to the improvements in the fields of cloud applications and DevOps in recent years, these tasks are becoming less complex. Virtual servers in the cloud and onsite physical servers can be provisioned and configured in minutes just by executing a script written in a user-friendly language (mostly YAML). This is made possible by a concept known as Infrastructure as Code (IaC), which allows us to deal with servers, networks, security groups, databases, etc. as if they were part of a software system. The leading IaC solutions on the market nowadays are Terraform, Ansible, Chef, Puppet, and a few more.
Let’s look at some of the advantages offered by Infrastructure as Code:
Code once, iterate multiple times: The configuration files for creating and configuring an app's environment are coded once. These files can then be used to spin up multiple environments (for example testing, production, staging, etc.) and can also be reused to deploy similar environments for apps with the same OS and package requirements. They can also be tweaked a little if the configurations change and “Voilà!”: a new environment is configured and ready to use.
Immune to human error: When infrastructure was configured manually by different people, errors known as infrastructure deviations used to occur: minor differences in the configuration of environments caused by the manual deployment process across multiple servers. Because provisioning and configuration are now scripted, the process is standardized and far less prone to human error.
Easy scaling: When there is high incoming traffic in your app at a particular time of the day or year, you can create new environments with just a few clicks and commands and deploy multiple instances of your app. Similarly, you can bring down environments when there is less traffic leading to cost efficiency. Scaling environments up and down is as easy as it gets.
Now let’s understand a bit about Terraform and Ansible and how they can be used in conjunction with each other.
Terraform – The Maestro Provisioner
Think of Terraform as a master civil engineer/builder to whom you describe your needs, and who then goes ahead and builds them for you in a matter of minutes. Furthermore, before building the actual thing, it informs you of the plan it is going to implement and asks for your consent.
In the language of software development: It is a tool for creating, modifying, and versioning infrastructures safely, swiftly, and efficiently.
The General structure of Terraform:
main.tf – For simpler projects, this is the file that contains all the code to build up the infrastructure, including all the modules and data sources that are needed. For bigger, more complex projects, the code is logically split into several files based on functionality.
variables.tf – It stores the declarations for the variables which are referenced in main.tf.
terraform.tfvars – It is used to assign values to the declared variables.
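The three-file structure above can be sketched as follows. This is a minimal, illustrative example only: the provider, resource type, and all names and values are assumptions, not a definitive configuration:

```hcl
# --- main.tf (illustrative) ---
provider "aws" {
  region = var.region
}

# A single virtual machine as the whole "infrastructure"
resource "aws_instance" "app_server" {
  ami           = var.ami_id
  instance_type = var.instance_type
}

# --- variables.tf (illustrative) ---
variable "region"        { type = string }
variable "ami_id"        { type = string }
variable "instance_type" { type = string }

# --- terraform.tfvars (illustrative) ---
# region        = "eu-central-1"
# ami_id        = "ami-0123456789abcdef0"
# instance_type = "t3.micro"
```

Splitting declarations (variables.tf) from values (terraform.tfvars) is what makes the "code once, iterate multiple times" advantage possible: the same main.tf can spin up testing, staging, and production simply by swapping the .tfvars file.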
The configuration files, written in the HashiCorp Configuration Language (HCL), describe the resources and components required to run an app. Terraform then produces an execution plan describing the steps to reach the desired state, and executes it. If changes are applied to the configuration files, Terraform detects what has changed and creates incremental execution plans based on this.
Terraform is supported by multiple cloud providers and the infrastructure that can be provisioned by it includes app instances, networking resources, databases, SaaS features, DNS items, and more.
A key difference between Terraform and other IaC tools is that it does not re-provision resources that were already successfully provisioned when it encounters them in a future execution of the configuration script. Instead, it focuses on new additions and on resources that failed in previous executions.
Understanding Terraform Workflow
A basic workflow for using Terraform to build cloud infrastructure follows the steps shown in the block diagram below.
Scope: Identify the resources for the project. Some examples of resources are virtual machines, resource groups, network security groups, load balancer, etc.
Author: In this step, we need to create the configurations of identified resources using HashiCorp Configuration Language (HCL).
Git Repository: After creating the Terraform configuration for the given project, these configuration files are added into the Git repository of the project (or any other version control system which is being used to store and manage the code).
Cloud: Then the repository is linked with the cloud provider for example AWS or Azure to run the scripts in their particular environments to generate the infrastructure.
Initialize: The terraform init command is executed in the project directory containing the Terraform configuration files to download the appropriate provider plug-ins and initialize the working directory.
Plan & Apply: Executing the terraform plan command creates the execution plan to verify the infrastructure creation process. Terraform also performs a refresh in order to determine the actions necessary to achieve the expected state of the infrastructure. This command lets users verify in advance that the infrastructure being created matches the expected infrastructure.
After successful verification of the execution plan, the terraform apply command is executed to build the real infrastructure with the specified resources. It also creates a state file that stores the current state of the infrastructure, against which modified configuration files are compared on subsequent runs.
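The Initialize and Plan & Apply steps above boil down to a short CLI session. This is an illustrative sketch of the standard Terraform commands, run inside the directory holding the .tf files, not a transcript of a real deployment:

```shell
terraform init       # download provider plug-ins, initialize the working directory
terraform plan       # build and display the execution plan for review
terraform apply      # execute the plan; records the result in the state file
terraform destroy    # optional: tear the provisioned environment down again
```

Because apply compares the configuration against the recorded state, re-running this session after editing a .tf file only touches the resources that changed.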
Ansible – The Configuration Expert
Now, after setting up the basic infrastructure, you find out that it requires some modifications and touch-ups. For this, Ansible is your go-to guy. Think of it as an experienced craftsman.
Although Ansible can do a lot more than configure existing infrastructure, it is primarily known for configuration tasks because it is so handy for them. A whole network of computers can be configured at once using Ansible.
The primary architecture of Ansible consists of two kinds of nodes: Control Nodes and Managed Nodes.
A Control Node is the main system on which Ansible is installed, while the Managed Nodes are the clients, connected to the Control Node via SSH or another authentication technique. The Control Node has a host inventory file containing the IP addresses of the Managed Nodes.
The Control Node sends configuration programs, known as modules, to the Managed Nodes. These are invoked from Ansible playbooks (the terminology for Ansible-specific configuration files written in YAML) stored on the Control Node. The modules compare the states of the various Managed Nodes to what is declared in the playbook, find the mismatches, and then update the state of the Managed Nodes.
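The inventory-plus-playbook setup described above can be sketched as follows. The host group, IP addresses, and the choice of nginx are illustrative assumptions; the modules used (ansible.builtin.apt, ansible.builtin.service) are standard Ansible built-ins:

```yaml
# inventory (illustrative) – Managed Nodes reachable over SSH:
#   [webservers]
#   192.0.2.10
#   192.0.2.11

# playbook.yml – declare the desired state; Ansible fixes any mismatch
- name: Configure web servers
  hosts: webservers
  become: true
  tasks:
    - name: Install nginx
      ansible.builtin.apt:
        name: nginx
        state: present
    - name: Ensure nginx is running
      ansible.builtin.service:
        name: nginx
        state: started
```

Note that the playbook describes a desired state ("nginx present and running") rather than a sequence of imperative commands; on a node where nginx is already installed and running, the run changes nothing.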
Why use Ansible for Configuration Management
Simple and easy to use – Ansible's syntax is written in YAML, a human-readable data serialization language. That makes it very user-friendly, and the learning time required to understand it is comparatively short. Also, the tasks executed by Ansible run synchronously by default (no asynchronicity involved, phew!). That is why people who are not into software development can also understand the logic written in Ansible playbooks.
Community Support – Ansible has good community support and has 1400+ modules on its official site for certain tasks which can involve network management, databases, security groups, web servers, containerization, and monitoring.
No special needs – Ansible is designed to work on push technology, and it does not require any additional software to be installed on the client side (Managed Nodes). An SSH connection is all it needs to start managing remote clients.
Extensible and flexible – Ansible modules communicate via JSON, which means modules can be written in, and tweaked with, any language that can emit JSON; in that sense, Ansible is extensible via any programming language. Further, Ansible follows the ‘batteries included’ approach, which means there is no need for a custom plugin or code for integration with various cloud platforms. It aims to provide the best out-of-the-box functionality by itself.
We hope that with this short explanation on Terraform and Ansible, you know a little more on the frameworks used to automate cloud infrastructures. Let us know if you have something to add!
Model Based Systems Engineering
April 05, 2021 - Hilal Karaca
Model Based Systems Engineering (MBSE)
Model-based systems engineering (MBSE) is a method of systems engineering that is not based solely on documents but instead focuses on models for the exchange of information and for the design, analysis and implementation of systems during their development. MBSE offers certain advantages over document-based engineering. Switching to digital modelling removes the manual authoring and maintenance of documents and gives engineers faster distribution of, and reaction to, design changes, enabling better communication and analysis of a system design before it is built. This furthers the automation process, increases the ability to manage highly complex systems, improves quality and knowledge preservation, and ultimately reduces costs.
Systems Modelling Language (SysML) and Unified Modelling Language (UML)
Languages, such as Systems Modelling Language (SysML) and Unified Modelling Language (UML), and 3D CAD (Computer Aided Design) model tools are commonly used means for MBSE. UML is considered a standard for software development, but it is not suitable for system modelling by itself. This resulted in the development and use of SysML, which is an extended subset of UML.
SysML is a graphical modelling language utilized for the specification, analysis, design, verification and validation of intricate systems. Subsystems or elements of systems can be any hardware, software, data, personnel, processes and assets. Hence, with SysML, systems information can be collected, properly managed and modelled. SysML and UML visualize models with diagrams, in which blocks represent parts of a system. These blocks can contain any information and stand in relation to other blocks. Prominent examples of SysML and UML are Requirement Diagrams, Use Case Diagrams, Class Diagrams, Activity Diagrams and Sequence Diagrams.
How we make use of UML and SysML
One of our core capabilities is to develop and operate online business-to-business (B2B) platforms. miSMART is such a B2B platform for payload owners and space mission providers. While we build and operate web applications, we also offer our knowledge and software solutions as Software-as-a-Service (SaaS) for specific, tailored applications to various clients. SysML and UML are key factors in the development and management of these systems, particularly during the initial phases of a project. Furthermore, they are valuable instruments for requirements management, which preserves the quality and value of the requirements from their formulation until the end of the project. MBSE enables instantaneous communication and information exchange for automated processes and makes it possible to create Industry 4.0-conform processes and systems from the very beginning. With Industry 4.0 being the new technological directive, it is our mission to establish ourselves on the international aerospace market as a provider of flexible systems with optimized production, efficiency, costs and overall quality.
Want to talk about MBSE? Hit us up!
Benefits of Cloud Computing
March 23, 2021 - Sushma Kaushik
What is Cloud Computing?
Over the past few years, cloud computing has gained widespread importance. It has significantly changed the way IT resources are utilized, and more and more businesses are shifting towards cloud solutions. Some organizations have adapted well to its dynamic scalability and to using virtualized resources as services. Cloud computing provides resources such as databases, servers, networks, storage, infrastructure, and applications, which are accessible via phones, tablets, laptops, or desktops from any geographical location, provided there is internet access, as shown in the figure below. Startups in particular can benefit from cloud services, as they do not need to invest in physical data centres, storage servers, and other IT equipment, nor in the IT professionals needed to set up the physical infrastructure. They can manage, store, and process data via web interfaces, a great advantage for companies with a comparatively low IT budget. The service provider takes care of the storage, servers and other IT aspects, which makes business models in certain ways more scalable and sustainable.
What are Cloud Computing Models and their types?
Cloud computing provides three main service models, namely Software as a Service (SaaS), Infrastructure as a Service (IaaS) and Platform as a Service (PaaS).
Three different types of cloud setup exist: Public cloud, Private cloud and Hybrid cloud.
Public clouds are ideal for applications that are expected to scale over time. They provide the flexibility to change server resources at any time in the public cloud environment, and the underlying infrastructure is shared among multiple users.
Private clouds work similarly to public clouds, except that the cloud is not shared with any other user.
Hybrid clouds combine the strengths of both public and private clouds. They incorporate the scalable server resources of public clouds with the reduced shared infrastructure and increased processing power of a dedicated server.
A company will ultimately decide on the type of cloud solution based on the requirements of its business model.
What are Cloud Applications?
Cloud applications are software that is primarily accessible via internet browsers and APIs (Application Programming Interfaces); they provide the functionality of native applications but use the cloud as their infrastructure. They are loosely coupled, small, and independent services. Some examples of cloud applications are Google Docs, Office 365 and Dropbox. Modern cloud applications use APIs and technologies that are native to the web. They have a more intuitive user experience and offer competitive functionality when compared to native applications. Most importantly, cloud applications can interact not only through web interfaces but also via APIs. This aspect is especially important because integration and automation are key benefits of the cloud. The application and its data can be controlled virtually via scripts written against the APIs, and various cloud apps can be integrated with each other with the help of APIs to generate user-specific workflows. A simple example would be integrating Teams, Jira, and GitHub with one another.
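As a toy illustration of this kind of API-based integration, the sketch below translates a GitHub-style push-event payload into a simple chat-notification message of the kind one might post to Teams. The field names are simplified for illustration and are not the exact schemas of either real API:

```python
def github_push_to_teams_message(payload: dict) -> dict:
    """Turn a push-event payload into a simple chat notification.

    `payload` mimics (in simplified form) the kind of JSON a webhook
    delivers; the returned dict mimics a chat message body.
    """
    repo = payload["repository"]["name"]
    pusher = payload["pusher"]["name"]
    commits = payload.get("commits", [])
    summary = f"{pusher} pushed {len(commits)} commit(s) to {repo}"
    return {"title": "Repository update", "text": summary}


if __name__ == "__main__":
    # A hypothetical webhook event arriving from one cloud app...
    event = {
        "repository": {"name": "miSMART"},
        "pusher": {"name": "alice"},
        "commits": [{"id": "abc123"}, {"id": "def456"}],
    }
    # ...transformed into a message another cloud app could display.
    print(github_push_to_teams_message(event)["text"])
```

In a real workflow, a small service (or an integration platform) would receive the webhook over HTTPS, apply a transformation like this, and forward the result to the target app's API, which is exactly the kind of glue the pay-per-use cloud model makes cheap to run.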
There are several advantages to developing cloud apps and moving to cloud solutions. They resolve the issue of large-scale distributed software deployments. Since cloud apps are accessible via browsers, they are easy to update. They are also fine-grained and provide centralized user and data control. Admins are provided with an extensive access management system via dedicated control interfaces, which greatly reduces the complexity of business software management. Because cloud apps are accessed via browsers, they pose little threat of malware installation on the client, and vulnerabilities can be fixed as soon as they are discovered and delivered to users the moment they hit the refresh button. Cloud apps also give companies insight into who uses the various resources of the application and how, which again provides new insights. In terms of licensing, cloud solutions mostly follow the pay-per-use model, which reduces costs from several points of view. A less complex IT infrastructure means a smaller IT team is required to run the software, which reduces costs from a business perspective, and overall software costs drop further because the business is not required to own the infrastructure.
If you are interested in learning more about cloud computing or have comments on this article, feel free to hit us up!
Givemefive/HOSTmi Space Science Contest for Kids!
March 18, 2021 - Pouya Haschemi
We are celebrating the 60-year anniversary of Yuri Gagarin's first human spaceflight. Therefore, HOSTmi, in association with its partner Givemefive, has launched a unique space competition for kids and teens to give them the chance to launch their work on a sub-orbital rocket. The launches will take place on micro launchers from Europe and the US, carrying Givemefive students' payloads on a sub-orbital trajectory. The competition is open to students aged 8-18 years all around the world. With this, any kid can go from being a space enthusiast to becoming a space explorer. Shri V. Gnana Gandhi, ex-ISRO program director, is part of the jury along with Givemefive.ai co-founder Ashwini Ramesh and HOSTmi co-founder Pouya Haschemi.
For this activity, HOSTmi is in charge of organizing and managing the launch activity and assures its overall success. HOSTmi makes use of its global launch providers, especially those from Europe, in order to accomplish this mission.
Givemefive has developed a proprietary online platform that helps kids learn, build and launch their very own space science products. Givemefive offers courses on space science created specifically with kids aged 8-18 years in mind. Givemefive helps students across the globe harness their curiosity and learn concepts from basic to advanced level through a combination of one-on-one and group live classes.
SLOTD4.0 Market Study
February 09, 2021 - Shahrokh Khodabakhshi
SLOTD4.0 Market Study
SLOTD4.0, short for “Structure and lightweight construction optimized standard payload module and satellite bus connection for the transfer of technology demonstrations into Space and Industry 4.0 compliant processes”, is a joint project between the German industry and university funded by the Federal Ministry for Economic Affairs and Energy and supervised by the German Aerospace Center (DLR) Space Administration.
Within SLOTD4.0 three main areas are in focus:
Modularity: The applied system architecture enables high flexibility through all project phases.
Standardization: The satellite bus, payload module and coupling interface enable considerable cost reductions in manufacturing and integration.
Digitization: A software-based end-to-end service enables the centralized management of all customer and mission-relevant information.
In order to better assess the development activities and project achievements, a market analysis was carried out to identify the needs of potential payload owners. In the following we discuss some of the main findings of a survey that was carried out.
The survey was split into 3 sections to identify requirements within the areas of Mission and Launch Requirements, Spacecraft Technical Requirements and Service Requirements. For the purpose of this article, we would like to point out only some of the results which we find very interesting.
Participants were asked about the purpose of their payload to get a rough understanding of the market. Almost half of the participants stated that their payload's purpose is technology demonstration (IOD/IOV), while another third of the participants' payloads are intended to realize an Earth observation mission. In two-thirds of the cases a mission duration of 1-5 years is desired, while only one-third require a shorter or longer mission duration.
The specifications of the payloads themselves gave us an indication of how a potentially suitable spacecraft must be developed. 75% of all payloads have a mass of 20 kg or less, equally distributed across the mass categories of 1-5 kg, 5-10 kg and 10-20 kg. Further, a significant trend towards bigger payloads can be identified if we look at the payload volumes rated as useful by the payload owners.
According to Moore’s Law, computer chip performance, and with it the processing power of technologies, doubles roughly every 18 months. Spacecraft are no exception, creating the need for increased onboard data processing. This is also reflected in the survey, in which over 85% of the participants stated that onboard data processing is of high importance for their applications.
While payload technologies tend to be more complex and powerful than before, you would expect companies to be more conservative when it comes to risk taking. Contrary to this expectation, however, the survey results show a trend towards a willingness to take higher risks for space missions when a potentially higher risk results in lower costs.
In recent years, we have seen a strong trend towards miniaturisation in the space industry through the intensive development of CubeSat technologies. This development trend was mainly driven by the high launch costs. However, launch costs have also fallen drastically in recent years and development trends indicate that they will fall even further. The question now is whether further miniaturisation of satellite technologies is still necessary to the same extent as before. The results of this survey may indicate that the trend of miniaturisation is decreasing, and payloads are again increasing in size and mass.
However, the majority of participants in this survey were European and therefore this thesis may only be applicable to this market.
What is your opinion on the market development regarding payload size, payload mass, processing power and launch costs? We would be very happy to receive your feedback and open the discussion!
With our format Spotlight on Space Talents, we want to open a platform for young and talented groups, students, NGOs, universities and other people from the so-called "space generation". In this edition, we would like to present the educational platform Givemefive:
Background of Givemefive
The Space Science vertical at Givemefive is headed by Ms. Ashwini Ramesh, an Aerospace engineer and the academic advisor is Mr. Vasudevan Gnana Gandhi, a Pioneer of Indian Cryogenic Rockets for PSLV, GSLV and ex-Project director and program director at Indian Space Research Organization (ISRO).
Space is one of the most exciting yet challenging sectors for students. Thus, Givemefive aims at enlightening young people about the wide variety of options available to pursue their dreams in the space sector. We have three learning tracks, namely:
Next Astronaut: Students are trained mentally, physically, and technically to be the future astronauts
Space Architect: Students learn about building colonies on Mars or other habitable planets
Space Explorer: Students discover the mysteries of our universe, from its evolution to exploring unexplored celestial objects
Before starting Givemefive, we analyzed the bottlenecks students face in learning about space science and programming by talking to 1000+ students across different countries and analyzing their feedback. The main thing that came out of the analysis was that there is no structured way for students to learn about space science, even though there is huge demand.
Givemefive was started in 2018 by Ramya Bhaskar, an academician and entrepreneur, teaching coding to young minds. Within a year, Givemefive received the best innovative and promising education startup award at The Education Growth Summit 2019. Ramya was introduced to Ashwini by a family friend, and both came up with the idea to educate the future generation about the mysteries of flight, rockets, space, the universe, and more.
Most kids are drawn to the stars and dream of reaching them one day.
Not everyone fulfills this childhood dream, mainly due to a lack of knowledge and proper guidance. At Givemefive we bridge this gap and provide the insight that helps kids choose what they wish for. With space industries booming around the world and the future pointing toward ever more space exploration and missions, this is the right time to give kids a structured learning path. Kids of this era are talented and are already doing marvelous things in every field, yet hardly any achievements in the space sector are seen. We are therefore determined to boost their interest by providing them with all the necessary tools.
So far, we have interacted with kids from nine different countries through webinars, one-on-one classes, and batch classes. Every kid comes with high motivation, expectations, curiosity, and doubts; we are proud to say that we meet their spirit, and this marks our true success. The most challenging part is convincing parents that their kids are already at the level to understand the concepts behind flight, rockets, trajectory design, and more. We overcome this difficulty by having the kids explain what they have learned in class to their parents, or by exhibiting their projects. Through this, parents tell us that they not only feel proud of their kid but also learn many concepts they themselves were unaware of, and that their own childhood dream is being fulfilled now. The biggest achievement for Givemefive so far is when a student says they do not want the classes to end and their urge to gain deeper knowledge shoots up. We are proud that what we teach reaches the kids' hearts, where it will remain forever.
All our course content is created by qualified aerospace engineers and evaluated by ex-ISRO scientist Mr. Vasudevan Gnana Gandhi. With this background and a partnership with HOSTmi, we look forward to developing a CubeSat built by young minds from various parts of the world for the betterment of humanity.
Every kid is unique and has the potential to make an impact on this world. Kids should be encouraged to gain practical experience rather than mere theoretical knowledge. Through more and more interaction with a diverse set of students across the globe, we are realizing both the depth of their curiosity and the lack of tools and software available for teaching students about space and rocketry. This is helping us build our platform and courses, which are already customized for different age groups. You can book a free class on http://www.givemefive.ai and experience our classes.
Givemefive has developed a proprietary platform and courses on space science, created specifically with kids aged 8 to 18 in mind. Givemefive helps students across the globe harness their curiosity and learn concepts from basic to advanced level through a combination of one-on-one and group live classes. Our students are working on building CubeSats, UAVs, and more. With our partnership with HOSTmi, we have gained a great partner who connects us with the world's best space service providers and makes it easy to take our students' work literally out of this world!
Connecting the Last Mile delivery with the First Mile delivery!
December 16, 2020 - Pouya Haschemi
HOSTmi’s goal is to reduce space mission costs with its solutions for optimized Inventory and Requirements Management, Central Administration, and Matchmaking between space mission carriers and space payloads!
HOSTmi offers and uses its B2B platform technology to streamline the requirements management and matchmaking process between payloads and diverse categories of payload carriers (such as satellite platforms or launch vehicles). After the rebranding of the B2B platform to miSMART, we are delighted to announce the listing of Momentus - a last-mile delivery provider for launches and hosted payloads with future capabilities up to 1,000 km.
With its first in-space transportation vehicle, Vigoride, Momentus is planning to take up to 300 kg to Low Earth Orbits. Vigoride vehicles are designed to be able to perform altitude changes of up to 1000 km, inclination adjustments of up to 5 degrees and LTAN repositioning, as well as being capable of hosting payloads for Science and Technology Demonstration purposes.
Momentus is sharing our vision of digital assistance solutions and enables us to offer our customers both flexible software solutions as well as responsive realization options. Find all hosting and launch options of Momentus ONLINE! Register now, specify your mission and payload characteristics, and get matched for flights in 2021 and 2022: https://dashboard.hostmi.space/auth/register
Spotlight on Space Talents presents: TU Darmstadt Space Technology
September 16, 2020 - Guest Authors: Robert Bruns, Wiebke Retagne, Hanjo Schnellbächer, Fabian Burger, Roman Frels (Publisher: Pouya Haschemi)
With our new format Spotlight on Space Talents, we want to open a platform for young and talented groups, students, NGOs, universities and other people from the so-called "space generation". In this edition, we would like to present the student organization TUDSaT from the Technical University of Darmstadt:
TU Darmstadt Space Technology (TUDSaT) was born in August 2016, when a group of 10 people recognized the growing interest in all things space-related among students of the Technical University in Darmstadt, Germany. This intuition has proven accurate over the last four years: the organization now has over 120 members from more than 10 different disciplines, all sharing the same interest in space flight. The association is organized into three sections, most notably the CubeSat and Rocket teams, where all things technical are discussed. The third section is the PR team, whose goal is to ignite passion for space among the general population.
“As resources are redistributed, we will seize this opportunity to get involved in space travel,” says Robert Bruns, First Chair of TUDSaT. “The stars have never been so close, the enthusiasm for space travel is immense, and our students are eager to prove themselves in the disciplines of astronautics.”
Space safety and sustainability are emerging challenges in the space industry, with ESA contributing 400 million euros of its budget to research in this area. TUDSaT is at the forefront of this development, conducting research on optical identification methods that will help locate objects in space and track their movement.
Fabian Burger, the CubeSat Project Manager, explains: “For this purpose we utilize passive retroreflectors and reflective foils, not unlike those cyclists use on their bicycles and vests, to broadcast their attitude and movement to any observer with a headlight and thus simplify collision avoidance.” Besides localization, identification after multi-satellite launches is especially difficult. The team therefore plans to demonstrate an active optical identification payload that tackles this challenge. “With our mission, we will prove that space safety can be increased for the benefit of everyone without great cost.”
Although the previously mentioned payload already provides a satisfying mission return, another objective is to contribute to the growing open-source community by creating an open-source CubeSat platform. “By doing so, we give other teams the opportunity to build on existing development without having to start from scratch,” adds Wiebke Retagne, Second Chair of TUDSaT. This frees up time for other relevant tasks, like working on the mission.
Besides the development of a CubeSat, TUDSaT has started developing its own sounding rockets. Solid and hybrid rocket engines are common among student groups and a great way to get started with rocketry. “Their professional significance, however, is minor, and we are striving for a system that can contribute to current scientific questions,” declares Roman Frels, Rocket Project Manager of TUDSaT. “Therefore, we decided to build a liquid-fuelled rocket.” Additionally, a liquid-fuelled system has the potential not only to break the European apogee record for student teams, but also to raise the bar a lot higher for times to come.
Right from the start, it was clear to the team that the engine would be the most challenging part of the rocket, regarding both development effort and raising the necessary funds. Development started by dividing the system into smaller subsystems, e.g. igniter, injector, and ignition chamber. Test benches for the igniter and injector will be implemented soon, and the subsystems will subsequently undergo several design iterations. “When deemed adequate, the subsystems will be integrated into an engine test bench, which will be used to design the final engine,” explains Roman.
The Way Forward
TUDSaT is currently financed via donations. Hanjo Schnellbächer, the Treasurer of TUDSaT, discloses the associated issues: “One big current challenge for us is fundraising. We hope to attract more sponsors, as sufficient funding becomes crucial in order to progress through the upcoming stages of development in each project respectively."
At the same time Robert is enthusiastic about the current situation: “We are proud to host so many talented young minds. It is an honour to see how ideas come to life.”
Admittedly, the team also struggles with everyday obstacles, like collaboration, a lack of workspaces, and the ever so gruesome exam period. Despite that, the team is iterating fast and continuously; part after part, the projects are being pushed further.
“Thanks to our supporters, our alma mater, and most of all our talented engineering staff, we are on the verge of being able to start building,” adds Wiebke. “But we lack space and resources. As we lack professional workspaces, our work depends on your support.”
Continue the story behind the young Space Talents of TUDSaT on their social media as well as their Website.
HOSTmi offers services of Agnikul to significantly enhance its launch portfolio
August 18, 2020 - Pouya Haschemi
HOSTmi’s goal is to reduce space mission costs with its solutions for optimized Inventory and Requirements Management, Central Administration and Matchmaking between space mission carriers and space payloads!
HOSTmi offers and uses its B2B platform technology to streamline the requirements management and matchmaking process between payloads and diverse categories of payload carriers (such as satellite platforms or launch vehicles). To increase the flexibility of the offerings we are pleased to announce the listing of Agnikul - a flexible smallsat launcher from India with launch capabilities up to 700km. Agnikul offers an innovative approach to launch flexibly from different locations around the world.
Agnikul builds launch vehicles that can take up to 100 kg to Low Earth Orbit. Agnikul’s mini launch vehicle, Agnibaan, is propelled by single-piece, 3D-printed LOX/kerosene engines. Agnibaan’s design allows for launch access anywhere, anytime, with a completely modular approach to configuring a launch vehicle.
This partnership further underscores the need for digital assistance solutions to aggregate, manage, and administer diversified offerings and solutions in a targeted manner.
We are pleased to have found in Agnikul another partner who not only shares our vision but also enables us to offer our customers both flexible software solutions and responsive realization options.
IOD/IOV – Learnings from the past
July 23, 2020 - Shahrokh Khodabakhshi
Currently there are many In-Orbit Demonstration and Validation (IOD/IOV) initiatives across the space industry landscape. These initiatives are important and necessary! But it would be careless to neglect activities and projects already carried out in this direction and not build on them. Therefore, we want to briefly showcase the following three activities, carried out in 2015/16 by three different consortia.
PLUGIN
Despite heavy EU investment programs, new technologies hardly find a flight program to gain flight heritage, especially in GEO orbits. PLUGIN aimed to develop an open standard interface for hosted payloads on board commercial GEO satellites, together with associated procedures and a business model. The goal was to enable fast-growing technologies in the European landscape by creating a multi-platform IOD/IOV service using hosted payload slots on GEO satellites. To ease the process of “matching” GEO host opportunities with hosted payload or IOD/IOV candidates, the consortium leader AIRBUS DEFENCE AND SPACE SAS maintained a comprehensive and regularly updated database.
IN-orbit Validation of European Space Technologies – INVEST
The goal of this project led by ISIS - INNOVATIVE SOLUTIONS IN SPACE BV was to offer a multi-platform IOD/IOV community to enable a more rapid maturation of European space technologies and components. Since program managers of space missions rarely select unproven technologies and at the same time these technology providers look for flight opportunities to prove their technology, there is a technological valley of death. INVEST addressed this gap. One of the activities was to provide a “Matchmaking” between technologies in need of IOD/IOV and service providers offering (near) flight-ready platforms. For this matchmaking, a database of various options was created to enable a rapid and cost-effective match.
IODISPlay
Within the project IODISPlay, the consortium led by GMV AEROSPACE AND DEFENCE SA assessed the European IOD/IOV situation, both in terms of needs (technologies in need of flight heritage) and capabilities (carriers and launchers). The consortium identified over 150 technologies across Europe in need of an IOD/IOV mission. Based on the ESA Technology Tree, the project found that the greatest demand for IOD/IOV lay in the domains of “propulsion technologies”, “Space Systems Control”, “Onboard data systems”, and “RF Systems, payloads and Technologies”. To handle the large number of potential missions and match them with available carriers, a mission configuration tool called MITO was developed. MITO can intelligently analyze a database of carriers and technologies to intuitively propose a number of IOD/IOV missions. Due to the limited project duration, the consortium further suggested repeating the exercise of identifying the market need on a regular basis and updating the database accordingly, as it is a powerful tool, especially for policy makers and analysts.
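The core of such a matchmaking pass can be illustrated with a short sketch. Note that MITO's actual data model and scoring logic are not public; the class names, fields, and threshold checks below are purely illustrative assumptions.

```python
# Hypothetical sketch of a MITO-style matchmaking pass over a database of
# technologies (needing flight heritage) and carriers (offering capacity).
# All field names and example values are assumptions, not the real tool.
from dataclasses import dataclass


@dataclass
class Technology:
    name: str
    mass_kg: float
    power_w: float
    target_orbit_km: float


@dataclass
class Carrier:
    name: str
    max_payload_mass_kg: float
    max_payload_power_w: float
    orbit_range_km: tuple  # (min_altitude, max_altitude)


def match_missions(technologies, carriers):
    """Pair each technology with every carrier whose capacity envelope
    (mass, power, reachable orbit) covers the technology's requirements."""
    matches = []
    for tech in technologies:
        for carrier in carriers:
            lo, hi = carrier.orbit_range_km
            if (tech.mass_kg <= carrier.max_payload_mass_kg
                    and tech.power_w <= carrier.max_payload_power_w
                    and lo <= tech.target_orbit_km <= hi):
                matches.append((tech.name, carrier.name))
    return matches


techs = [Technology("e-thruster", 4.0, 60.0, 550.0)]
hosts = [Carrier("16U-bus", 8.0, 100.0, (400.0, 700.0)),
         Carrier("micro-bus", 2.0, 20.0, (400.0, 700.0))]
print(match_missions(techs, hosts))  # the e-thruster only fits the 16U bus
```

A production tool would layer scoring, schedule constraints, and cost models on top of this filter, but the envelope check above is the essential first step of any such database-driven match.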
All three activities were carried out under the program H2020-EU.2.1.6.1. - Enabling European competitiveness, non-dependence and innovation of the European space sector. Through close cooperation between these activities within the "Workshop on IOD opportunities and priorities in Europe", partial results of all individual projects were collected and streamlined. A core element of all projects was a database with associated software solutions providing "matching functionalities". Taking this result together with the other important outcomes of these and other IOD-related activities, it is obvious that upcoming IOD/IOV activities and solutions must include innovative and regularly updated software products, besides standard interfaces and procedures, to automate and scale new concepts.
We would also love to hear more about your learnings and experience when it comes to scalable IOD/IOV solutions. Reach out to us and tell us your story and experience!
May 27, 2020 - The SLOTD4.0 consortium
Modularity, standardization, and flexibility in space technology have increased significantly in recent years. Coupled with novel processes and digital approaches, the possibilities are almost limitless. How about replacing payloads for space missions shortly before launch? Or booking a satellite bus as easily as booking freight capacity on airplanes? This is exactly what the newly formed consortium of RWTH Aachen University (SLA), German Orbital Systems and HOSTmi wants to realize.
The joint project “Structure and lightweight construction optimized standard payload module and satellite bus connection for the transfer of technology demonstrations into space and industry 4.0 compliant processes – SLOTD4.0” is funded by the German Federal Ministry for Economic Affairs and Energy and administrated by the German Aerospace Center (DLR) Space Administration.
The main objectives of SLOTD4.0 are the introduction of modular end-to-end processes, the acceleration of development and production, the shortening of delivery times, the simplifying of ordering procedures, and the standardization of interfaces and engineering activities.
Within this project, a 16U CubeSat, consisting of an 8U satellite-bus-block and an 8U payload-block will be developed by RWTH Aachen University and German Orbital Systems. The blocks are equipped with an intelligent Space System Interface (iSSI), heritage of the former iBOSS project, a standardized multifunctional coupling device transferring mechanical loads, electrical power, and data. The patented iSSI interface allows coupling and decoupling individual payload-blocks, developed by RWTH Aachen University, to a standardized satellite bus, developed by the project partner German Orbital Systems. The functional separation of satellite-bus and payload, enabled by the defined standards of the interface, allows an independent AIT of bus and payload. This leads to increased mission and commercial flexibility regarding late loading issues or payload replacement needs.
Industry 4.0 and digitalization
In addition, based on Industry 4.0 technologies and processes in combination with innovation strategies, HOSTmi will develop a service architecture with different procedures to map the product life cycle phases of the modular satellite bus, from ordering and acquisition to manufacturing and operation. The focus here is on the "Customer Journey", plotting the phases from payload requirements management to AIT. Such an environment supports central administration of all user- and mission-relevant information and processes all technical specifications and requirements in an automated, simple, and standardized manner.
The primary structure must be lightweight and integration-friendly with regard to working space, offer predefined mounting points for a wide variety of components and solutions for harness issues, and, in order to meet current and future economic demands, be optimized for large-scale production. This approach will open up new use cases beyond SLOTD4.0, such as the in-orbit interchange of payloads or distributed apertures and sensors, which underline the potential of the technology. In addition, advanced algorithms could automatically determine the compatibility of different payloads with the characteristics of hosting satellite buses, or identify missing or incompatible properties for payload customers, thereby generating feasibility awareness at an early stage.
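Such an automated compatibility check could, in its simplest form, compare a payload's declared properties against a bus specification and report everything that is missing or incompatible. The sketch below is only an illustration of that idea; the property names and limits are invented for the example and do not come from the SLOTD4.0 specification.

```python
# Illustrative rule-based compatibility check between a payload description
# and a satellite bus specification. All field names and limit values are
# hypothetical; the real SLOTD4.0 data model is not reproduced here.

def check_compatibility(payload: dict, bus: dict) -> list:
    """Return a list of human-readable issues; an empty list means the
    payload passes every rule the checker knows about (feasible so far)."""
    issues = []

    # Missing properties block any automated assessment, so flag them first.
    for key in ("mass_kg", "peak_power_w", "data_interface"):
        if key not in payload:
            issues.append(f"missing payload property: {key}")

    # Numeric envelope checks against the bus capacity.
    if "mass_kg" in payload and payload["mass_kg"] > bus["max_payload_mass_kg"]:
        issues.append("payload mass exceeds bus capacity")
    if "peak_power_w" in payload and payload["peak_power_w"] > bus["max_power_w"]:
        issues.append("peak power exceeds bus budget")

    # Discrete property checks, e.g. the data interface must be supported.
    if ("data_interface" in payload
            and payload["data_interface"] not in bus["supported_interfaces"]):
        issues.append("incompatible data interface")

    return issues


bus_spec = {"max_payload_mass_kg": 12.0, "max_power_w": 80.0,
            "supported_interfaces": ["CAN", "SpaceWire"]}
print(check_compatibility({"mass_kg": 5.0, "peak_power_w": 95.0}, bus_spec))
# ['missing payload property: data_interface', 'peak power exceeds bus budget']
```

Reporting the full list of issues, rather than a single yes/no answer, is what gives payload customers the early feasibility awareness described above: they can see exactly which properties to supply or change before engaging in a detailed engineering dialogue.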
SLOTD4.0 is funded by the Space Administration of the German Aerospace Center (DLR) with grants from the Federal Ministry of Economics and Energy (BMWi) based on a resolution of the German Bundestag. Funding reference numbers: 50RA2000, 50RA2003, 50RA2004
COVID-19: An opportunity to drastically improve the customer journey
April 30, 2020 - Pouya Haschemi
If the year 2020 so far had to be summed up in one word, it would be “gloomy”. Trade shows, conferences, and various other events have all been cancelled, and no further events are likely to take place this year except for smaller business meetings. More than 500 exhibitions have now been cancelled worldwide. But what does this really mean for the space industry and the many face-to-face business talks that are now cancelled?
First of all: do not panic! The problem affects everyone, supplier and buyer alike; both sides must now progressively learn to compensate, not only because of the restrictions on personal meetings, but also because of the significant increase in home office usage. Searching for products, comparing offers, and purchasing items and services has inevitably shifted to the Internet, even for essential goods. Your customers have realized that physical contact is not possible at the moment and are looking for adequate alternatives. In addition, your B2B contacts have very likely never been as open-minded about exploring new technologies and digital tools as they are now. This, in fact, must be the foundation for coming out of this crisis even stronger. The current situation forms the basis not only for thinking about increased digitalization strategies and online presence in the overwhelmingly B2B-oriented space industry, but also for finally putting them into practice. Instead of sticking your head in the sand, it is now time to take decisive action so that you are constantly two steps ahead. Not just now, but also for the future.
Building the digital bridge
Due to the lack of personal interactions with your customers, the time has finally come to create digital touchpoints for your clients. This gives you the chance to showcase your products by making them accessible throughout the year worldwide instead of exhibiting them at a locally limited fair for just a few days. With a strategically oriented web presence, you become increasingly independent of physical meetings and create a coherent customer journey across all touchpoints.
In order to master this challenge, the following aspects need to be clarified first: for most marketing experts, the key element of events is the distribution of some form of content, such as information about new products, new business strategies, or the chance to establish a pioneering role. The objective now is to deliver precisely this information via new channels. Digital touchpoints in particular are ideally suited for summarizing content into target-group-oriented packages and tailoring it to the needs of that target group. Targeting groups does not necessarily mean addressing customers personally; it does, however, mean offering valuable and beneficial content in line with actual needs.
The spread of the coronavirus will boost the use of digital tools across the globe, as people spend more time at home and communicate less in person. Once customers have become accustomed to a digital process, they will continue to use online services even after the Coronavirus has disappeared. Despite all the difficulties that the pandemic brings with it, it also provides the motivation for decisions that many companies have been putting off for years.
HOSTmi is committed not only to supporting the space industry in a time of unprecedented crisis, but also to facilitating the next phase. With our online solutions, we have always pursued objectives that help implement a long-term digital strategy. In a nutshell: this virus could become the biggest driver of digitalization within our industry. This becomes obvious if we look at the Mandarin word for crisis, 危机 (wēijī), where the character 危 (wēi) means danger while 机 (jī) also means opportunity.
Vote now for HOSTmi!
February 04, 2020 - HOSTmi
We are nominated for this year's connect breakthrough 2020 award.
And you can help us to make it to the next round!
You think that HOSTmi deserves a place in the final in Dresden? Then vote here NOW!
The start of an easy way to access space
October 19, 2019 - Shahrokh Khodabakhshi
The recently agreed collaboration between Deployables-Cubed and HOSTmi marks the start of a new way to find flight opportunities for space payloads. Read more about Deployables-Cubed and the envisioned goals of this collaboration in this article.
In space, the trend is toward small, standardized satellites, so-called CubeSats, which allow economic access to space. These satellites have an edge length of 10 cm for a single unit (1U), which greatly limits the missions a satellite can fulfill. A way around this limitation is the use of deployable structures such as appendages, antennas, or sails, which are unfolded once in orbit and allow new high-performance applications that are currently only possible with larger satellites. To tackle this problem, the Munich-based new space company Deployables Cubed develops miniature actuators and CubeSat deployables, with the goal of achieving Europe’s non-dependence on these miniature actuators and the CubeSat deployables they enable.
Products: CubeSat Actuators (Pin Puller and Release Nut) & CubeSat Deployables (booms, antennas, etc.)
Roadmap: Qualified actuator by the beginning of 2020, first deployables developed by mid-2020
“Deployables-Cubed sees in HOSTmi a partner that will enable us to perform fast and economic in-orbit demonstration flights of our actuators and deployables,” says Thomas Sinn, CEO of Deployables-Cubed. “These in-orbit demonstrations and verifications are essential to raise the Technology Readiness Level (TRL) to the level necessary to have our products space-qualified.”
Breaking up conventional value chains and creating a single value network is one of the core elements HOSTmi is aiming for. The collaboration with Deployables-Cubed lays an important cornerstone for HOSTmi to prove its services and run through all processes based on real cases. This pilot project aims to showcase the advantages of online services in the space industry and will improve the user experience during the procurement phase.