GIGABYTE Servers Become Part of the German Aerospace Center’s Data Center

When the German Aerospace Center (Deutsches Zentrum für Luft- und Raumfahrt, abbreviated as DLR) wanted to expand their data center, they searched for servers that could operate smoothly in an ambient temperature of 40°C without the need for air conditioning. GIGABYTE’s server team provided a solution from its High Density Server product line that combined computing, storage, and connectivity in a single system. The High Density Servers are equipped with liquid-cooling technology to run without a hitch in high-temperature environments, enabling the data center to efficiently process an enormous amount of space-related research data in the limited space available.
On the 50th Anniversary of the Moon Landing, Space Research is Booming
Fifty years ago, American astronaut Neil Armstrong stepped off the lunar lander and became the first human being to set foot on the Moon. He uttered the famous quote: “That's one small step for a man, one giant leap for mankind.” This was a new milestone in mankind’s exploration of space. 

Humanity has always been curious not only about the Moon, but also about the other regions of outer space. Scientists are constantly seeking new insights into the formation and evolution of our Solar System. The exploration of Mars is one area of intense interest, even though, at our current level of technological development, it will be far more difficult for humans to set foot on that planet. In June 2019, Japan, France, and Germany began working together on the Martian Moons eXploration (MMX) program, which aims to launch an AI-enabled robotic space probe in 2024. The probe will carry a rover and go into orbit around Mars, so that through robotic observation and sample collection we may learn more about the evolution of Earth's red neighbor and the origin of its moons.
Giving the Aerospace Center a Helping Hand for Large, Complex Computing Workloads
Scientists at the DLR have been studying the Earth and Solar System for decades. In the process, they have designed robotic probes for space missions to Mars and cooperated with other international organizations to develop space shuttles and rockets. These highly precise technologies are often developed using large and complex data sets, such as images returned from outer space or trajectories and altitudes from spacecraft flight recorders. Powerful servers can rapidly analyze this data, enabling further research and development.

In response to the growing number of research programs, the massive amount of data requiring storage, and increasingly complex computing needs, the DLR launched a new project to expand their data center. 《Glossary: What is a Data Center?》
Overcoming the Project's Challenges
In recent years, environmental protection has become a critical issue due to the effects of climate change and global warming. Data centers, which traditionally consume a huge amount of electricity, are being designed to better meet the goals of energy efficiency and carbon reduction. The DLR wanted servers that met the following requirements:
● Capable of operating in a data center with an ambient temperature of 40°C and with no air conditioning equipment
● Outfitted with the most suitable cooling method for servers without changing the existing mechanical and electrical infrastructure within the data center
● The temperature of the liquid used for heat exchange should not exceed 60°C
With more than 20 years of experience in the server industry, GIGABYTE teamed up with a local systems integrator to take on this project. Our H261-Z60 High Density Servers were more than equal to the project's challenges: they can provide up to twice the maximum computing power of competing products on the market, and they reduced the servers' footprint by 50%, meeting the requirements for high-speed computing and storage in the limited space available. To provide the best cooling solution for the data center, the servers were also equipped with a liquid cooling system by CoolIT Systems. 《Recommended for you: About GIGABYTE's High Density Server Products》
The Future Trend of Data Centers: Energy Conservation & Smaller Footprint
As data centers offer increasingly higher computing capabilities, they consume more and more energy for cooling. At the same time, Europe has experienced its share of heat waves in recent years, and average annual temperatures are climbing. Therefore, for the data center's expansion project, it was decided to adopt the more energy-efficient liquid cooling technology instead of using air conditioning to dissipate heat.

Currently, there are two main liquid cooling technologies commonly available on the market: Immersion Cooling and Direct Liquid Cooling. Since the available space was limited, and because the DLR did not wish to modify the data center's existing mechanical and electrical infrastructure, Direct Liquid Cooling was chosen for the cooling solution. At the same time, it was feared that if the temperature of the liquid discharged from the heat-dissipation pipes was too high, it could cause excessive wear and tear on the system's components in the long run. Therefore, it was requested that the temperature of the fluid used for direct liquid cooling should not exceed 60°C. 《Learn More: How Does Direct Liquid Cooling Technology Help Solve the Heat Dissipation Problem of High Density Servers in a Limited Space?》
An Immersion Cooling system uses a non-conductive dielectric fluid to assist the server with heat dissipation.
A Direct Liquid Cooling system uses liquid in heat-dissipation pipes to remove the heat from the server.
GIGABYTE’s Senior Product Marketing Manager Andie Yen talks about her experience when she first began working on the project: “I brought all the technical information and data that the client gave me back to the R&D department for communication and discussion.” She recalls listening to the R&D team explain the fundamentals to her for more than two hours before she was finally able to calculate the thermal energy consumption of all the new servers required for the data center. After that, she continued to work with R&D members to repeatedly study, re-plan, and re-verify various server configurations, in order to present the most suitable customized design to the client.

Direct Liquid Cooling technology involves a dense configuration of liquid cooling pipes installed within the server chassis, which assist with heat dissipation via liquid circulation. You can imagine it as something similar to the water heater in your house, which turns cold water into hot water as it runs through the pipeline. Here, however, the heating of the liquid is a means to an end: cold liquid is pumped into the server chassis through cooling pipes, which run a circular route inside the server, allowing the liquid to absorb the heat emitted by the CPU, GPU, and memory and become hot in the process. The hot liquid then flows back out of the server to a regulator at the top of the server rack, where it is converted back into cold liquid and reused. Through the efforts of GIGABYTE's technical team, the temperature of the hot liquid exiting the server was capped at 58°C, successfully meeting the DLR's requirement that the cooling fluid's temperature not exceed 60°C.
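The temperature cap described above can be sanity-checked with a simple heat balance: the coolant's outlet temperature follows from Q = ṁ·c·ΔT. The sketch below is purely illustrative; the heat load, flow rate, and inlet temperature are assumed values, not DLR's actual figures.

```python
# Rough heat-balance estimate for a direct-liquid-cooling loop.
# All numbers here are illustrative assumptions, not DLR's real data.

def coolant_outlet_temp(heat_load_w, flow_rate_lpm, inlet_temp_c,
                        specific_heat=4186.0, density=1000.0):
    """Outlet temperature from Q = m_dot * c * dT (water-like coolant).

    heat_load_w   -- heat absorbed by the liquid, in watts
    flow_rate_lpm -- coolant flow rate, in liters per minute
    inlet_temp_c  -- coolant temperature entering the server, in Celsius
    """
    m_dot = flow_rate_lpm / 60.0 * density / 1000.0  # mass flow in kg/s
    delta_t = heat_load_w / (m_dot * specific_heat)  # temperature rise in K
    return inlet_temp_c + delta_t

# Example: ~1.5 kW of CPU/GPU/memory heat per node, 1.5 L/min of coolant,
# liquid entering at the 40 degC ambient temperature (hypothetical values)
print(round(coolant_outlet_temp(1500, 1.5, 40.0), 1))  # → 54.3
```

With these assumed numbers the outlet stays comfortably below the 60°C limit, which illustrates why tuning flow rate against heat load was central to hitting the 58°C figure.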
Efficiency Reaches New Heights with More Scientists Working in Tandem
The GIGABYTE server product team strove to optimize the data center's new server configuration to help the client improve their efficiency. Each R281-Z94 (the server that handles user connections and data management) was equipped with two brand-new RTX 5000 graphics cards, which can support up to twice as many user connections as other graphics cards. As a result, each management server can now allow up to 64 different scientists to connect to the computing cluster simultaneously to perform calculations, greatly reducing unnecessary waiting time. 《Recommended for you: Learn More About GIGABYTE's R-Series Rack Mount Server Products》
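The capacity math above (two cards per server, 64 concurrent users) can be sketched as a small sizing helper. The per-card session limit and the user count in the example are illustrative assumptions, not specifications from the project.

```python
# Illustrative sizing sketch: how many management servers are needed for a
# given number of concurrent users, assuming the per-card session limit is
# the binding constraint (assumed numbers, not the project's actual specs).
import math

def servers_needed(total_users, cards_per_server=2, sessions_per_card=32):
    """Ceiling division of users over per-server capacity."""
    users_per_server = cards_per_server * sessions_per_card  # 64 in this setup
    return math.ceil(total_users / users_per_server)

print(servers_needed(200))  # → 4 servers for 200 concurrent scientists
```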
GIGABYTE Commits to Helping the Aerospace Center Save Energy
Even more importantly, in order to help the client reduce operating expenses, GIGABYTE's server product team optimized the servers' power supplies, resulting in a 15% reduction in power demand compared with competing products. Reduced energy consumption means much lower energy bills, which brings additional benefits to the DLR. As Andie Yen emphasizes: "We often tell our clients, don't just look at how much money our proposal is saving you on the books, but at how much long-term expenditure, such as electricity and water fees, will also be saved by using GIGABYTE's server products."
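As a back-of-the-envelope illustration of what a 15% power reduction can mean for the electricity bill: the baseline draw and energy price below are hypothetical placeholders, not figures from the DLR project.

```python
# Hypothetical savings estimate from a 15% reduction in average power draw.
# Baseline load and electricity price are illustrative assumptions.

def annual_savings_eur(baseline_kw, reduction=0.15, hours_per_year=8760,
                       price_eur_per_kwh=0.30):
    """Yearly cost saved when average draw drops by `reduction`."""
    saved_kwh = baseline_kw * reduction * hours_per_year
    return saved_kwh * price_eur_per_kwh

# e.g. a hypothetical 100 kW average draw, running around the clock
print(round(annual_savings_eur(100.0)))  # → 39420 (EUR per year)
```

Even with modest assumptions, the recurring savings compound year after year, which is the point of Yen's remark about looking beyond the purchase price.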

As countries enact energy conservation and carbon reduction policies, data centers that traditionally consumed huge amounts of energy are being redesigned to be more environmentally friendly and energy-saving. In this project, GIGABYTE's server team not only fulfilled the client's target of obtaining maximum computing capability in a limited space, but also resolved their main concerns about equipment temperature tolerance and energy conservation by delivering excellent heat dissipation and power consumption results. In doing so, GIGABYTE not only earned opportunities for further cooperation with the DLR, it also did its part to contribute to environmental protection.