A month, a partner: UCit

Benjamin Depardon, CTO of UCit, has a dual background as engineer and IT researcher: after training as an engineer at INSA Lyon, he obtained his Master's degree and then his Doctorate from the École Normale Supérieure de Lyon in the field of HPC (High Performance Computing) and the management of distributed, heterogeneous computing environments such as Grids and Clouds. He started his career as technical director of SysFera, a start-up with ties to INRIA. This start-up shared the same vision of HPC as UCit: HPC must be delivered simply and transparently in order to democratize its use.

UCit positions itself as an expert on the challenges arising from the constant changes and evolutions of the HPC world. UCit offers a set of solutions that make it possible:

  • To simplify access to and use of HPC infrastructures so as to overcome their complexity (moving towards an “HPC as a service” concept) and enable their democratization
  • To optimize the use of supercomputers in order to reduce costs and energy consumption and improve availability
  • To integrate Cloud Computing and the flexibility it brings into new workflows, notably for Deep Learning and data processing.

A common vision

The AQMO project is fully in line with UCit's vision: AQMO aims to deliver air quality data processing services that are fully transparent to their users. This necessarily requires completely hiding the underlying infrastructure that performs the processing and simulations.

UCit works on these aspects on a daily basis and develops innovative solutions for presenting computing resources and applications as services that users can consume, for example through a web portal such as AmpliSIM's.

AQMO allows us to expand our solutions, in particular by integrating end-to-end use cases, from “Edge/Fog Computing” to hybrid HPC (Cloud and Supercomputer).

The partners with whom we have worked most on these technical aspects are IDRIS on infrastructure topics, Ryax Technologies on workflow aspects (data flow and automation of data processing at different levels), and the University of Rennes 1, which has set up the sensors and embedded Edge platforms on the buses.

UCit within AQMO

The first step in the project was the abstraction of computing resources. With CCME (Cloud Cluster Made Easy) and the EnginFrame portal, we have deployed tools that provide access to computational resources in the cloud, or at IDRIS, in a transparent way through unified interfaces. Each computing infrastructure has its own internal tools, and we provide an abstraction over these different resource providers in order to hide from end users the “pipework” needed for the proper functioning of the AQMO platform.
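
To illustrate the idea (this is a minimal sketch, not actual CCME or EnginFrame code; the class and method names are hypothetical), such a provider abstraction can be pictured as a single interface that the platform talks to, regardless of where the computation actually runs:

```python
from abc import ABC, abstractmethod

class ResourceProvider(ABC):
    """Unified interface hiding the specifics of each computing infrastructure."""

    @abstractmethod
    def provision(self, nodes: int) -> str:
        """Reserve resources and return a cluster or session identifier."""

    @abstractmethod
    def submit(self, cluster_id: str, job_script: str) -> str:
        """Submit a job and return its identifier."""

class CloudProvider(ResourceProvider):
    def provision(self, nodes: int) -> str:
        # e.g. start an elastic cluster in the cloud (details hidden from the user)
        return "cloud-cluster-001"

    def submit(self, cluster_id: str, job_script: str) -> str:
        return "cloud-job-42"

class IdrisProvider(ResourceProvider):
    def provision(self, nodes: int) -> str:
        # e.g. map the request onto an existing allocation at the computing centre
        return "idris-allocation"

    def submit(self, cluster_id: str, job_script: str) -> str:
        return "idris-job-42"

def run_processing(provider: ResourceProvider, job_script: str, nodes: int) -> str:
    """The platform only talks to the abstract interface, never to a specific provider."""
    cluster = provider.provision(nodes)
    return provider.submit(cluster, job_script)
```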

These users are not HPC experts and do not necessarily understand the complexity hidden behind a request for data processing or simulation: the right types and numbers of computing resources have to be chosen, as well as the target infrastructure. In order to simplify these choices, we worked on analysing and forecasting the needs of the calculations: we have developed machine learning tools and algorithms capable of answering the following questions when users submit calculations: “How many resources does my calculation require?”, “How long until I get my results?”, “How much will it cost?”, etc. The goal is to hide the complexity of HPC from the user and, on the administrator's side, to optimise the load on the resources by limiting operating errors.
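
Purely as an illustration of the principle (the features, figures and model below are assumptions, not Predict-IT's actual internals), such a forecast can be learned from the records of past executions:

```python
# Illustrative only: learn a runtime estimate from historical job records.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Hypothetical history: (input size in MB, simulated hours, number of cores) -> runtime in minutes
X_history = np.array([
    [500, 24, 16],
    [500, 24, 32],
    [2000, 48, 32],
    [2000, 48, 64],
    [8000, 72, 64],
])
runtime_minutes = np.array([55, 31, 140, 78, 260])

model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X_history, runtime_minutes)

# For a new submission, estimate the duration and derive a simple cost figure.
new_job = np.array([[4000, 48, 64]])
estimated_minutes = model.predict(new_job)[0]
cost_per_core_hour = 0.05  # assumed price, in euros
estimated_cost = estimated_minutes / 60 * 64 * cost_per_core_hour
print(f"~{estimated_minutes:.0f} min, ~{estimated_cost:.2f} EUR")
```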

UCit solutions

Our solutions are at the heart of the AQMO architecture, where they manage the execution of data processing on HPC environments. We interact very closely with the tools of Ryax, which are responsible for managing the overall data workflow and the processing operations. We sit behind this workflow manager to start up computing resources in the cloud, and we give Ryax a single entry point to use them, as well as those of the IDRIS computing centre, so that they can be integrated into its workflows.

Ahead of the execution of each data processing operation, our prediction tool, Predict-IT, is queried by Ryax to identify the target platform and the correct execution parameters to be used according to criteria chosen by the user.
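
The selection logic can be pictured as follows (a hypothetical sketch: the platform names, estimates and interface are assumptions, not Predict-IT's real API):

```python
from dataclasses import dataclass

@dataclass
class Estimate:
    platform: str          # e.g. "cloud" or "idris"
    runtime_minutes: float
    cost_euros: float

def choose_platform(estimates: list[Estimate], criterion: str) -> Estimate:
    """Pick the execution target according to the criterion chosen by the user."""
    if criterion == "fastest":
        return min(estimates, key=lambda e: e.runtime_minutes)
    if criterion == "cheapest":
        return min(estimates, key=lambda e: e.cost_euros)
    # Default trade-off: cheapest option that stays under a two-hour time budget.
    within_budget = [e for e in estimates if e.runtime_minutes <= 120]
    return min(within_budget or estimates, key=lambda e: e.cost_euros)

estimates = [
    Estimate("cloud", runtime_minutes=45, cost_euros=12.0),
    Estimate("idris", runtime_minutes=90, cost_euros=4.0),
]
print(choose_platform(estimates, "cheapest").platform)  # -> idris
```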

Finally, we have also developed a tool for analysing the costs of running these workloads, in order to plan data analysis campaigns and to arbitrate between different price/performance scenarios (turnaround times of the analyses).
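
As a toy example of that kind of arbitration (all figures and scenario names below are invented for illustration), planning a campaign essentially scales per-run estimates up to the whole campaign:

```python
# Illustrative campaign planning: per-run estimates scaled up to a full campaign.
scenarios = {
    # scenario name: (cost per run in euros, turnaround per run in hours)
    "cloud, many small instances": (12.0, 0.75),
    "cloud, few large instances": (18.0, 0.50),
    "IDRIS allocation": (4.0, 1.50),
}

runs_in_campaign = 200
for name, (cost_per_run, hours_per_run) in scenarios.items():
    total_cost = cost_per_run * runs_in_campaign
    print(f"{name}: ~{total_cost:.0f} EUR total, ~{hours_per_run:.2f} h per analysis")
```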

AQMO as an innovation in data processing and usage

The data collection system is innovative in itself, even before the data are processed. Instead of relying on data from stationary sensors, the AQMO project uses mobile sensors installed on buses, making it possible to cover areas in real time in a broader and more dynamic manner than fixed sensors allow.

The second major innovation is that the platform we are implementing analyses and processes data at different levels and scales. The data can be processed at the source, in the buses, where a first level of computation is carried out to understand what is going on around the bus; then at the central server, where further filters and operations are applied; and finally, for the simulations needed to predict how the dispersion of pollutants will evolve, on HPC infrastructure at IDRIS or in the Cloud.
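
As a simple picture of that first, on-board level (the thresholds and field names are hypothetical, not the project's actual edge code), the bus-side step can reduce raw sensor readings to a compact aggregated record before sending it to the central server:

```python
from statistics import mean

def aggregate_on_bus(readings: list[dict]) -> dict:
    """First processing level on the bus: discard invalid samples and aggregate.

    Each reading is assumed to look like {"pm10": 21.3, "lat": 48.11, "lon": -1.68}.
    """
    valid = [r for r in readings if 0.0 <= r["pm10"] < 1000.0]  # drop sensor glitches
    if not valid:
        return {}
    return {
        "pm10_avg": mean(r["pm10"] for r in valid),
        "pm10_max": max(r["pm10"] for r in valid),
        "lat": valid[-1]["lat"],   # last known position of the bus
        "lon": valid[-1]["lon"],
        "samples": len(valid),
    }

# Only this compact record is forwarded; the heavy simulations run later on HPC.
record = aggregate_on_bus([
    {"pm10": 18.2, "lat": 48.110, "lon": -1.680},
    {"pm10": 23.9, "lat": 48.111, "lon": -1.679},
    {"pm10": -1.0, "lat": 48.111, "lon": -1.679},  # glitch, filtered out
])
print(record)
```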

These simulations will make it possible to answer everyday questions, for instance where a future preschool should best be located depending on the traffic and pollution levels of an area, but also to respond to emergency scenarios following incidents, in order to contain the problem, draw up recommendations, forecast potential spillovers, etc.