Thanks, Budget Cuts, and an Inside Look

I would like to begin this blog with a heartfelt thanks to all of you who have contributed to the UW weather prediction research fund noted in the upper right of this blog and here. Over one hundred of you have contributed amounts both small and large, and these funds (several thousand dollars so far) will help keep our regional weather research and our real-time local weather prediction models running. Others contributed to the student scholarship fund, which is extraordinarily important as tuition zooms and financial pressure on our students increases.

It is really impressive that there is a community of Northwest residents so committed to keeping the collection of regional weather data, state-of-the-art weather modeling, and local forecasting research going in this period of cutbacks and retrenchment.

And the support is truly needed. This month, for example, I learned that the funding I had received from the NWS for over a decade is ending because they are cutting their university funding in half. This was the money I was going to use to assimilate the new coastal radar into our local weather prediction efforts! Anyway, with your support I will work to keep our efforts going.

But since you are investors in our regional modeling, let me give you a little glimpse "behind the curtain" to see what you are supporting.

The local weather prediction computing facility is in the atmospheric sciences building on the UW campus. I have about two hundred processors, most of them in dual quad-core servers (8 processors per board, or node), and over 300 terabytes (trillions of bytes) of disk storage. And to keep the whole thing running when the lights flicker, there are uninterruptible power supplies (UPSs). Some of these processors can communicate through high-speed (40 gigabit per second) interconnects. All this gear is packed into several clusters in very special racks (more on this later). Here is an example of one of the clusters:


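To put those numbers in perspective, here is some quick back-of-the-envelope arithmetic using only the approximate figures quoted above (so treat the results as rough):

```python
# Rough arithmetic on the cluster described above -- approximate figures only.
processors = 200      # "about two hundred processors"
per_node = 8          # dual quad-core servers: 8 processors per board/node
storage_tb = 300      # total disk storage, in terabytes

nodes = processors / per_node
print(f"~{nodes:.0f} nodes")                               # ~25 nodes
print(f"~{storage_tb / nodes:.0f} TB of disk per node")    # ~12 TB on average
```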
This computer facility is one of the greenest around! The UW didn't have enough funds to give us a decent air conditioning system, so we came up with our own approach... we blow all the hot air outside and bring in cool air from outside to make up for it! To make this work we got fancy enclosures with BIG fans inside them; they pull air across the processors and then out big ducts to the exterior of the building. Here are two pictures of our Rube Goldberg setup:
OK, it is not pretty, and the UW AC guys didn't like the looks of it, but it really works! It is as if all the heat from the computers doesn't exist. We have a big intake in one of the computer rooms that brings in the cool outside air. Only when the outside air is hot (a VERY, VERY short period here in Seattle!) do we need AC, and then the weak units the university gave us, plus an auxiliary unit we bought, do the trick. Why use energy to cool hot air when you can simply get rid of it? If we were really clever we would redirect the computer air into the heating ducts of the building and no external heat would be needed. Is anyone designing buildings to do this? They should.
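Rough numbers suggest how much this saves. Purely for illustration, assume about 400 watts per node and an air conditioner with a coefficient of performance of 3 (both numbers are my assumptions, not measurements):

```python
# Illustrative energy arithmetic for the free-air cooling approach.
# The 400 W/node draw and the COP of 3 are assumed values, not measurements.
nodes = 25
watts_per_node = 400
heat_kw = nodes * watts_per_node / 1000          # ~10 kW of heat, continuously
cop = 3.0                                        # assumed AC efficiency
ac_kw = heat_kw / cop                            # power the AC would draw
print(f"~{ac_kw * 24 * 365:,.0f} kWh/year to air-condition that heat away")
```

Big exhaust fans pushing the hot air outdoors consume only a fraction of that.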

You know the WRF model I am always showing on this blog? Generally we run it on 64 processors simultaneously; in other words, the code is parallelized so that the problem can be split efficiently across many processors. And fast communication between the processors speeds this up immensely.
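To give a flavor of what "parallelized" means here: the model grid is split into tiles, one per processor, and neighboring tiles swap boundary ("halo") values every time step. Below is a minimal toy sketch of that pattern in Python with mpi4py; it is purely illustrative and is not WRF code (the variable names and the simple smoothing update are mine):

```python
# Toy 1-D domain decomposition with halo exchange -- the general pattern
# behind parallel weather models, NOT actual WRF code.
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

n_global = 1024                  # total grid points (assume size divides this)
n_local = n_global // size       # points owned by this processor
u = np.zeros(n_local + 2)        # local slab plus one halo point on each side
u[1:-1] = rank                   # dummy initial condition

left = rank - 1 if rank > 0 else MPI.PROC_NULL
right = rank + 1 if rank < size - 1 else MPI.PROC_NULL

for step in range(10):
    # Swap halo points with neighbors: this is where the 40 Gb/s
    # interconnects earn their keep.
    comm.Sendrecv(u[1:2], dest=left, recvbuf=u[-1:], source=right)
    comm.Sendrecv(u[-2:-1], dest=right, recvbuf=u[0:1], source=left)
    # Simple smoothing update on interior points
    u[1:-1] = 0.5 * u[1:-1] + 0.25 * (u[:-2] + u[2:])
```

Launched with something like "mpirun -np 64 python halo_demo.py" (a hypothetical file name), each of the 64 copies works on its own slice of the grid and only the thin boundaries travel over the network.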

These computers not only run the forecast models; they also handle our web servers and other needs. This is not an inexpensive enterprise, and that is why your help is so valuable. Disks fail constantly, UPS batteries die, backup tapes are continually needed, and we find that we have to replace the processors roughly every five years. Our system programmers, Dave Warren and Harry Edmon, are marvelous at keeping the system going, and it rarely fails (at least 99% availability). And the department charges us per processor for support.
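For reference, here is what "at least 99% availability" permits in downtime terms:

```python
# What "at least 99% availability" allows in downtime per year.
availability = 0.99
downtime_hours = (1 - availability) * 24 * 365
print(f"up to {downtime_hours:.0f} hours (~{downtime_hours / 24:.1f} days) per year")
# -> up to ~88 hours, or ~3.7 days
```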

The amount of data moving through the system is amazing. Every day we bring in hundreds of gigabytes of weather data (NWS models, satellite and radar data, surface observations), and we acquire data from over 70 local weather networks... all in real time. The models we run produce hundreds of gigabytes more each day. And the graphics you see on the web... we produce over 50,000 images a day! A tape and removable-disk backup system allows us to save key observations and data... again, more expense.
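Spread over a day, those totals become steady rates. A quick illustration, taking 300 GB/day as an assumed midpoint for "hundreds of gigabytes":

```python
# Back-of-the-envelope rates for the daily data flow described above.
seconds_per_day = 24 * 3600

images_per_day = 50_000
print(f"{images_per_day / seconds_per_day:.2f} images/second, around the clock")

inbound_gb_per_day = 300      # assumed midpoint of "hundreds of gigabytes"
mbit_s = inbound_gb_per_day * 8 * 1000 / seconds_per_day
print(f"~{mbit_s:.0f} Mbit/s of sustained inbound data")
```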

And then there are the people... right now I have three staff members who spend much of their time developing improved weather prediction systems and maintaining this enterprise. And of course there are the students who are doing their theses on understanding weather systems and developing future technologies, like figuring out how to get the maximum benefit from the new coastal radar.

This all started in 1995 with a single-processor computer.

Again, thanks for all your help...cliff