Whither the Weather
A Raspberry Pi directs data collection from a network of weather stations in the Caribbean, processes the data, and makes it available on a web page.
Climate, weather, and hydrological data collection and archiving is an integral part of the services provided by the Caribbean Institute for Meteorology and Hydrology (CIMH) [1], located in Barbados. Data collected in these archives is integrated into global databases that are used to assess and monitor global and regional climate change. These data are also important for designing national and community-based climate adaptation strategies, so easy access to the data is critical.
Apart from training technical personnel for positions in their respective National Meteorological and Hydrological Services (NMHSs) across the Caribbean region, I am frequently asked to assist with the installation of meteorological and hydrological equipment throughout the region. These requests can come directly from an NMHS, but most come from projects funded by regional and international development agencies. Regardless of the source of the request, a common question is, "Can I access the data?"
Under the Enhancing Resilience to Reduce Vulnerability in the Caribbean (ERC) Project [2], funded by the Government of Italy from 2009 to 2013 and executed by the United Nations Development Programme Office for Barbados and the Organization for Eastern Caribbean States, 14 full Sutron automatic weather stations (AWSs) [3] were installed at locations in the Eastern Caribbean, including the CIMH campus. For each AWS system, a Windows machine was purchased to run the Sutron software, XConnect [4], which downloads and archives the data on the local machine on each island.
One of the objectives of the ERC project was to establish an AWS network to assist in disaster risk reduction and to support the region's climate and hydrometeorological early warning activities. This was done through the implementation of a Linux-based online database. Because I needed to interact with this Linux-based database platform, I installed Cygwin [5] on the test machine to create a Linux environment in Windows; therefore, native Linux commands could be used to transfer the formatted dataset (discussed later). XConnect was configured to store the data every 10 minutes in ASCII text files.
A Python 2.7 script read the data and extracted the readings. Through the use of cron jobs, this script was also run at 10-minute intervals, with a 1-minute offset (see the flow diagram in Figure 1). Once collected, the script also formatted the dataset and sent it to the disaster risk reduction database platform by SFTP. Initial tests of the Cygwin/Windows interactions were done on a Windows XP machine, which proved not only successful but, most importantly, stable.
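The 10-minute cadence with a 1-minute offset can be expressed as a single crontab entry along these lines (the interpreter and script path are hypothetical; only the schedule itself comes from the setup described above):

```shell
# Run the collection script at minutes 1, 11, 21, 31, 41, and 51 of every
# hour, i.e., every 10 minutes with a 1-minute offset so it reads the files
# XConnect has just written. The script path is an assumption.
1-51/10 * * * * /usr/bin/python /home/collector/process_aws.py
```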
Once again I was asked about data access. Although the data could be downloaded locally from the XLite 9210B datalogger [6] and the local collection machine, this dataset was not immediately user friendly without some level of processing, so I decided to find a cost-effective yet robust way to make the data easily accessible.
At the time of the project procurement, which was some months after the testing phase, only Windows 7 machines were available for purchase. However, the XConnect software was not fully compatible with Windows 7; as a result, the collection systems were no longer stable: The software constantly had to be restarted and, in some cases, the system needed a complete reboot. Given the importance of the collected data, a more reliable approach had to be devised. At this time, I began to develop a Linux-based system to collect the data directly from the network of AWSs.
Initially, I tried to emulate in software the signals that needed to be sent to the AWS datalogger to request data, but that proved to be difficult because a proprietary protocol was used between XConnect and the 9210B datalogger. With further research, I discovered that the datalogger implemented a terminal console within its firmware with a built-in command set. Through this feature, I was able to communicate directly and download the current dataset.
Communication with the AWS is through a radio frequency (RF) link [7] at 900MHz implemented over an RS-232 serial line. In the initial tests, I used the minicom serial communication program [8] on a Linux Mint 15 desktop computer to test the connection with the datalogger (Figure 2). Once the structure of the dataset was determined, a Python script [9], making use of the Python serial module, established the serial connection with the datalogger. The show /tag /c command, when issued to the 9210B datalogger, stored the returned dataset (Figure 3) in a buffer. The script then parsed the data and stored it in ASCII files.
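A minimal sketch of that polling-and-parsing step could look like the following. The serial port name, baud rate, timeout, and line framing are assumptions; only the show /tag /c command and the record layout (date, time, tag, value, quality flag) come from the setup described here, so the real settings should be taken from the 9210B documentation:

```python
import re

# One datalogger record, as shown later in the article, e.g.:
#   06Aug2016 15:00:02 AT 31.1 G
RECORD_RE = re.compile(
    r"(?P<date>\d{2}[A-Za-z]{3}\d{4})\s+"
    r"(?P<time>\d{2}:\d{2}:\d{2})\s+"
    r"(?P<tag>\w+)\s+"
    r"(?P<value>-?\d+(?:\.\d+)?)\s+"
    r"(?P<flag>\w)"
)

def parse_record(line):
    """Parse one record into (date, time, tag, value, flag), or None."""
    m = RECORD_RE.match(line.strip())
    if m is None:
        return None
    return (m.group("date"), m.group("time"), m.group("tag"),
            float(m.group("value")), m.group("flag"))

def poll_datalogger(port="/dev/ttyUSB0", baud=9600, timeout=5):
    """Issue 'show /tag /c' over the RS-232 link and return the raw reply.

    Requires the third-party pyserial package; port and baud rate here
    are placeholders, not the 9210B's documented settings.
    """
    import serial  # pyserial; imported here so parsing works without it
    with serial.Serial(port, baudrate=baud, timeout=timeout) as conn:
        conn.write(b"show /tag /c\r\n")
        return conn.read(4096).decode("ascii", "replace")
```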
Although the datalogger was configured to collect averaged data every 10 minutes, data was requested every two minutes and stored on the local collection machine on each island, which yielded running average values within that given 10-minute period:
06Aug2016 15:00:02 AT 31.1 G
06Aug2016 15:02:02 AT 31.1 G
06Aug2016 15:04:02 AT 31.0 G
06Aug2016 15:06:02 AT 31.0 G
06Aug2016 15:08:02 AT 30.6 G
These parameters were then stored in hourly text files (Figure 4), with a <ddmmmyyyy_hh>.txt naming format (e.g., 06Aug2016_15.txt). Each variable or measurement parameter of the station was assigned its own folder; for example, data for the air temperature variable is stored in the AT folder (Figure 1, far left).
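Given that layout, the destination file for a reading follows directly from its variable tag and timestamp. A sketch, assuming a hypothetical base directory (the <ddmmmyyyy_hh>.txt pattern and per-variable folders are from the scheme above):

```python
import os
from datetime import datetime

def hourly_path(tag, when, base="."):
    """Return the hourly file path for a variable and a reading time,
    e.g., AT/06Aug2016_15.txt for an air-temperature reading at 15:xx.

    Note: strftime's %b is locale dependent; the C/English locale
    produces the 'Aug'-style month abbreviations used here.
    """
    return os.path.join(base, tag, when.strftime("%d%b%Y_%H") + ".txt")
```

For example, hourly_path("AT", datetime(2016, 8, 6, 15, 0, 2)) yields the AT folder's 06Aug2016_15.txt file, matching the example above.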
Only the 10-minute intervals are formatted separately and sent to the central database at CIMH via SFTP. Using this Linux-based approach, the system proved to be very reliable and stable, and, more importantly, it avoided many of the challenges associated with the Cygwin/Windows 7 setup.
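Selecting the on-the-10-minute records for upload might be sketched as follows; the filter follows the scheme above, while everything around the actual SFTP transfer (host, paths, credentials) is left out as site specific:

```python
def is_ten_minute_record(time_str):
    """True for timestamps on a 10-minute boundary, e.g. '15:00:02' or
    '15:10:05' (the datalogger stamps a few seconds past the minute)."""
    minute = int(time_str.split(":")[1])
    return minute % 10 == 0

def select_for_upload(lines):
    """Keep only the records that fall on the 10-minute marks."""
    kept = []
    for line in lines:
        parts = line.split()
        if len(parts) >= 2 and is_ten_minute_record(parts[1]):
            kept.append(line)
    return kept
```

The filtered records can then be written to a staging file and pushed with a non-interactive sftp batch or a library such as Paramiko; either way, key-based authentication keeps the cron job unattended.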