How Online Services, Supercomputers And Satellites Recast Weather Forecasting
Jul 25, 2019
Modern hydrometeorology began when physicists formulated the laws of thermodynamics and hydrodynamics.
On August 1, 1861, the English newspaper The Times published the first-ever weather forecast, compiled by Robert FitzRoy. It was the first attempt to fill the public's gap in "weather knowledge." FitzRoy's predictions were often ridiculed by journalists, as his data was slow to gather and often outdated by the time it went to press.
Until the late 1950s, weather forecasting relied mostly on the traditional synoptic method: the construction and analysis of synoptic maps depicting atmospheric conditions at a specific point in time.
In the middle of the last century, meteorologists concluded that extrapolating from historical patterns alone could not accurately predict future weather. Modern meteorological centers began analyzing far larger volumes of numerical and statistical data, using rapid modern data collection and processing.
Sea buoys, air probes, satellites — all these devices help to track in real-time the location and movement of cloud formations and zones of intense precipitation, as well as record the development and movement of dangerous phenomena such as thunderstorms, hail, or gales.
It is one thing to gather knowledge of the physical world around us, and quite another to understand how it evolves and changes over time, to connect the dots and predict what will happen next. Advanced satellites, satellite imagery, and next-generation computers promise huge strides in understanding our world.
Why satellites are important in forecasting
Thanks to the military rocket programs of the 1940s, scientists hit upon the idea of also sending cameras into orbit to observe Earth's weather. Soon after the first satellites launched, meteorologists began observing Earth's atmosphere from space. In April 1960 the USA launched TIROS-1, the first successful meteorological satellite. Its success demonstrated the suitability of satellites for weather observation.
Collection of weather data from orbit has the advantage of being more expansive and accurate than ground devices. Given that 71% of the earth’s surface is covered in ocean, plus the amount of inaccessible land due to deserts, mountains, polar regions and so forth, satellites are really the only viable option for reliable, consistent weather information.
The equipment installed on satellites makes it possible to obtain detailed data about the state of clouds and snow cover, ice conditions, thunderstorm activity, precipitation and its phase state, hail, the temperature of the earth's surface and water, wind conditions, sea waves, and more.
Thanks to satellite observations, it is possible to see the big picture of weather phenomena at once. For example, NASA's AIRS (Atmospheric InfraRed Sounder) instrument aboard the Aqua satellite creates three-dimensional maps of air and surface temperature, water vapour, and cloud properties; matching its coverage with ground stations would take several hundred of them.
Beyond improving forecasts, AIRS also monitors volcanic emissions and smoke from forest fires, and measures harmful atmospheric compounds such as ammonia. When you hear that the ozone layer over Antarctica has begun to recover, that recovery was measured partly thanks to AIRS.
There are other devices observing the weather from space. The scatterometry method remotely monitors wind speed and direction over the oceans. A scatterometer is a microwave radar that transmits pulses toward the ocean surface and measures the backscattered signal, the effective scattering cross-section. This backscatter measurement reveals the speed and direction of surface winds. The launch of the American SeaSat in 1978 proved for the first time that such a device could accurately measure wind speed from orbit.
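The idea behind scatterometry is that rougher, wind-driven seas scatter more radar energy back to the instrument, so a measured backscatter value can be inverted into a wind estimate. The toy sketch below illustrates that inversion; the power-law model and its coefficients are invented for illustration, while real geophysical model functions (such as CMOD5 for C-band radars) are far richer and also depend on incidence angle and wind direction.

```python
# Toy illustration of scatterometry: radar backscatter (sigma0) grows with
# wind speed, so a measured sigma0 can be inverted to estimate the wind.
# Coefficients A and B are hypothetical, chosen only for this sketch.

A, B = 0.004, 1.5  # hypothetical model coefficients

def sigma0_from_wind(u):
    """Forward model: backscatter cross-section for wind speed u (m/s)."""
    return A * u ** B

def wind_from_sigma0(s):
    """Inverse model: estimated wind speed from a measured backscatter."""
    return (s / A) ** (1.0 / B)

measured = sigma0_from_wind(12.0)      # simulate an observation at 12 m/s
estimate = wind_from_sigma0(measured)  # invert it back to a wind speed
print(round(estimate, 2))              # recovers 12.0
```

A real retrieval solves this inversion against a lookup table of the model function, resolving direction ambiguities from multiple viewing angles.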
Since global politics are tense, and countries do not fully trust one another even in weather forecasting, each maintains its own network of weather satellites: Meteosat from Europe, GOES from the USA, MTSAT from Japan, Fengyun from China, GOMS from Russia, and Kalpana from India. Each reports some analytical information about weather conditions above its territory to other countries, but even then, the technologies aren't perfect.
For instance, NASA is developing a new class of lightweight satellites dubbed RainCube. Each RainCube satellite is several times smaller than any comparable existing satellite. According to NASA engineers, one satellite is no larger than a shoebox and can "easily fit in a backpack."
RainCube scans the atmosphere using a principle similar to echolocation: the satellite's antenna sends radar pulses that reflect off water droplets and ice particles in the atmosphere and return an echo to the satellite. By analyzing this echo, the device can "see from the inside" everything that happens in a cloud or storm (which can't be done with ground equipment). A swarm of such spacecraft is extremely easy to scale: it can widen as needed or partition into smaller groups.
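The echo principle reduces to simple arithmetic: the radar measures how long a pulse takes to bounce back, and halving the round-trip travel time (times the speed of light) gives the distance to the reflecting layer. A minimal sketch, with a purely illustrative timing value:

```python
# Minimal sketch of radar ranging, the principle behind a precipitation
# radar like RainCube: the round-trip time of an echo gives the distance
# to the reflecting layer in a cloud.

C = 299_792_458.0  # speed of light, m/s

def echo_range_km(round_trip_s):
    """Distance to the reflector, from the echo's round-trip time."""
    return C * round_trip_s / 2 / 1000.0

# An echo returning 33.4 microseconds after the pulse left the antenna
# corresponds to a reflector roughly 5 km away.
print(round(echo_range_km(33.4e-6), 2))
```

By timing many echoes per pulse, the radar builds a vertical profile of precipitation through the entire depth of a storm.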
Creating a comprehensive constellation of satellite scanners will enable more efficient forecasting of sea storms, the study of ocean circulation, and the interaction between the atmosphere and the oceans, as well as their impact on weather and global climate.
Weather forecasters are literal lifesavers when it comes to storms, typhoons and hurricanes. Not to mention the billions of dollars saved by having advance warning of impending storms. It may well be that, in all the history of meteorological science, there has not been a tool to compare to the satellite.
Supercomputers in weather forecasting
Each day, each hour, ground-based meteorological stations, meteo-probes, ocean buoys and meteorological satellites collect massive amounts of data. This data is then streamed to meteorological information processing centers equipped with the most up-to-date computers. A forecast for tomorrow is needed now, not tomorrow or next week, and any lesser machine would simply not be up to the task of churning through this vast amount of data in the time available.
As early as 1922, English mathematician Lewis Fry Richardson estimated that 64,000 human computers would be needed to calculate a weather forecast faster than the weather itself evolves.
The Top500 list ranks the most powerful computing systems in the world. In the November 2016 edition, 23 of the supercomputers were weather forecasting systems. Although these 23 systems represent less than five percent of the total number of supercomputers on the list, they made up over seven percent of the combined performance of the list.
Currently, the UK Met Office's Cray XC40 is the most powerful weather forecasting computer, with 7 petaflops of performance, ranking number 11 in the Top500. The second most powerful is Cheyenne, installed at the National Center for Atmospheric Research (NCAR); it currently ranks 22nd on the list, delivering 4.8 petaflops.
A petaflop is a unit of computing speed equal to one quadrillion (10^15) floating-point operations per second.
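To put these figures side by side, here is a back-of-envelope comparison of a 7-petaflops machine with Richardson's imagined 64,000-person forecast factory. The assumption that one human "computer" completes roughly one arithmetic operation per second is generous and purely illustrative:

```python
# Back-of-envelope comparison of the figures above: a 7-petaflops
# supercomputer versus Richardson's 64,000 human computers, assuming
# (purely for illustration) one operation per person per second.

PETA = 1e15
human_ops_per_s = 1.0                      # assumed, for illustration
richardson_factory = 64_000 * human_ops_per_s
cray_xc40 = 7 * PETA                       # 7 petaflops

speedup = cray_xc40 / richardson_factory
print(f"{speedup:.1e}")                    # ~10^11 times faster
```

Even under such crude assumptions, the machine outpaces the human factory by eleven orders of magnitude, which is why numerical weather prediction only became practical with supercomputers.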
In 2017 the Weather Company, a subsidiary of IBM, announced plans to improve the quality of global weather forecasts by means of a collaboration with the University Corporation for Atmospheric Research (UCAR) and the National Center for Atmospheric Research (NCAR).
The alliance will bring together IBM's next-generation supercomputing technologies, advances in world-class meteorological science via The Weather Company, the expertise of IBM Research, OpenPOWER-based supercomputer systems, and NCAR's existing community weather model. By capitalizing on advanced science, they expect to improve the quality of long-term forecasts and create reliably accurate predictions weeks or even months in advance.
Current technologies make it possible to predict large-scale meteorological changes that affect regional weather, such as blizzards and hurricanes. By maximizing computational power, the new model can enhance weather and climate forecasting at a local level by taking into account small-scale phenomena, such as thunderstorms, which shape the local weather.
Satellite services that calculate, analyse and conclude
A massive computer spanning several floors of a building and worth a few hundred million dollars is a rare treat even for the most famous researchers. So how can educational institutions or individual researchers continue studying weather indicators without spending their life savings? Geographic information systems (GIS) are the answer!
A geographic information system (GIS) is a system for collecting, storing, analyzing and visualizing data tied to specific positions on the Earth's surface.
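Because every GIS record is tied to a position, one of the most basic operations such a system performs is computing the distance between two coordinates, for example between two weather stations. A minimal sketch using the standard haversine formula (assuming a spherical Earth):

```python
# Great-circle distance between two (lat, lon) points, the kind of basic
# spatial operation a GIS performs constantly. Spherical Earth assumed.
import math

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Distance in km between two points given in decimal degrees."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

# London to Paris: roughly 344 km along the great circle.
print(round(haversine_km(51.5074, -0.1278, 48.8566, 2.3522), 1))
```

Production GIS software layers far more on top of this, such as projections, ellipsoidal datums and raster analysis, but distance queries like this one sit at the core.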
EOS (Earth Observation System) Data Analytics is one of the GIS services which changed how satellite-based data services work.
EOS was launched by Noosphere Ventures and Max Polyakov in 2015 and currently analyzes man-made and natural objects via satellite photos, forecasts weather for a specific territory, and even provides climate impact analysis. The company has developed and implemented a platform for the collection and analysis of GIS data, with a toolkit covering spectral analysis of satellite images of the Earth's surface, automated image processing, visualization, and data analysis and storage.
Climate change is a major global threat that requires satellite monitoring tools. Rising ocean temperatures are disastrous for the Arctic, which consists mostly of sea ice. The Antarctic, by comparison, holds mostly land-based ice, which melts much more slowly.
By using remote monitoring, scientists are able to better observe and track the iceberg calving stages, right from rift detection to a full iceberg breakaway. They are also able to track the drifting bergs and measure overall ice cover.
A satellite imagery case study shows how glaciers melt and change over the years. The EOS team observed the Antarctic region using ESA's Sentinel-1 radar imagery in LandViewer and chose to study the Larsen Ice Shelf in more detail. You can read the full text about EOS research of the Antarctic glaciers melting here.
How reliable will weather forecasts be?
Readers could ask, “If we have such advanced technologies, will we be able to predict the weather with 100% accuracy? Will we be able to know precisely what the weather will be at each point of time in every country?” In short, no.
There are on the order of 10^44 molecules (a 1 followed by 44 zeros) moving randomly about in the atmosphere, creating weather. Attempting to track and display them all would be unimaginable. The simply chaotic nature of weather means that there is always the possibility of error.
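This chaos is not hand-waving: it was discovered by meteorologist Edward Lorenz, whose 1963 convection model shows that two simulations starting from almost identical states diverge completely. The sketch below integrates his equations with a simple Euler scheme (step size chosen for illustration) and perturbs one starting value by one part in a million:

```python
# Lorenz's 1963 model: the classic demonstration of weather's chaos.
# Two nearly identical initial states are integrated side by side;
# after enough steps their trajectories bear no resemblance.

def lorenz_step(x, y, z, dt=0.01, s=10.0, r=28.0, b=8.0 / 3.0):
    """One Euler step of the Lorenz system with the classic parameters."""
    return (x + dt * s * (y - x),
            y + dt * (x * (r - z) - y),
            z + dt * (x * y - b * z))

state_a = (1.0, 1.0, 1.0)
state_b = (1.0, 1.0, 1.000001)   # perturbed by one part in a million

max_diff = 0.0
for i in range(4000):            # ~40 model time units
    state_a = lorenz_step(*state_a)
    state_b = lorenz_step(*state_b)
    if i >= 3000:                # compare after the error has grown
        max_diff = max(max_diff, abs(state_a[0] - state_b[0]))

print(max_diff > 1.0)            # True: the trajectories have diverged
```

A measurement error too small for any instrument to detect eventually dominates the forecast, which is why forecast skill decays over roughly two weeks no matter how powerful the computer.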
Improved modelling will, however, allow more realistic assumptions, and more powerful supercomputers will capture more and more detail, just as high-resolution satellite imagery allows zooming in on a single case. As far as weather forecasting goes, there will always be a need for assumptions. The future looks promising, but how close we come to the perfect forecast is still a mystery.