Sensor Networks for Arctic Environmental Monitoring

By Andrew Rettig

“Top of the World,” as Point Barrow, Alaska is known by the local Inupiat Eskimo, is the northernmost point on the North American continent. Just a few miles to the southwest of the point is Barrow, Alaska, home to four thousand people. In March 2010, Barrow was highlighted in a Smithsonian article entitled “Barrow, Alaska: Ground Zero for Climate Change.” Scientists from all over the world travel to Barrow every year to research the Arctic and to learn from the Eskimo elders and hunters. It was for just these reasons that the Arctic Climatology Sensor Network Prototype (ACSNP) was developed. The University of Cincinnati excels at Arctic research, and Dr. Richard Beck wanted to implement an Arctic sensor network that would stream data in near-real-time from the Arctic to Cincinnati and eventually into the cloud.

Sensor test site 1 on top of the Barrow Arctic Research Center.

The project began out of necessity. Arctic research takes place in very extreme environments, which limits access. Sensors that store data for extended periods can easily be damaged by the harsh environmental conditions, or by an adventurous polar bear or Arctic fox, often recognized by their tracks and teeth marks. Even physically accessing sensors in extreme environments can cause problems, such as cracked cables or unsealed access points. With no external monitoring of the sensors, there is also no way to know when a data feed has stopped. Lastly, reliable Arctic scientific sensors for extended monitoring are very costly. Our approach was to experiment with less expensive sensors, deploy more of them and stream the data in near-real-time.

The goal for this network was to make it interoperable, scalable and extensible. To create an interoperable network, standardized sensors that communicated over TCP/IP were needed to collect the data. With this standardization in place, the sensors could be easily connected to embedded Linux devices for FTP connectivity. The ACSNP has both meteorological sensors and cameras at each of its three test sites. The meteorological sensor is the Davis Vantage Pro II; for imagery, each of the three test sites has five Q-See QSDS148DH weatherproof CCD cameras with heaters and blowers. Each site also has one StarDot NetCam XL, a standalone network camera with a built-in Web server, email and FTP client; the StarDot can also record and stream meteorological data. All of our sensors have survived the Arctic winters and are continually being tested for durability.
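As a rough illustration of that FTP pattern, the sketch below polls a sensor site's FTP drop and downloads the newest file. The host, credentials and the convention that files are named by timestamp are assumptions for illustration, not details from the project.

```python
import ftplib
import io

def pick_latest(names):
    """Choose the newest file, assuming names sort chronologically
    (e.g. timestamp-named files like 20110801-1230.csv)."""
    return max(names) if names else None

def fetch_latest(host, user, password, remote_dir):
    """Download the newest file from a sensor site's FTP drop.
    Returns (filename, bytes), or (None, b"") if the drop is empty."""
    with ftplib.FTP(host) as ftp:
        ftp.login(user, password)
        ftp.cwd(remote_dir)
        latest = pick_latest(ftp.nlst())
        if latest is None:
            return None, b""
        buf = io.BytesIO()
        ftp.retrbinary(f"RETR {latest}", buf.write)
        return latest, buf.getvalue()
```

In the actual network this role was filled by the embedded Linux devices and, later, by GoAnywhere Director; the sketch only shows the shape of the pull.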

Arctic tundra sensor site.

Connecting to these sensors on the tundra was an additional challenge. The wireless broadband network is a 700 MHz WipLL system augmented with Iridium OpenPort units, which allow for global connectivity. The 700 MHz system has a radius of 16 kilometers, encompassing the entire infrastructure of the city of Barrow as well as open tundra and ice. The Iridium units, on the other hand, use the only commercial communication satellite constellation with complete global coverage. With global wireless access, the only current limitation on ACSNP sensor installation is the availability of electricity.

ACSNP was designed using the best of current technology, adding scalability to the network. To accomplish this, partnerships were formed with an information technology (IT) company and a data transfer company. The IT company, INTRUST Group Inc. of Cincinnati, Ohio, incorporated high-availability virtualization into the project with Avance by Stratus. Avance's mirrored server configuration accommodated the remoteness of our Arctic FTP servers: the Arctic servers created a local buffer that helped assure data integrity even when the connection to the “lower 48” went down. The data transfer company, Linoma Software, donated its GoAnywhere Director software, which automated the FTP transfers and the parsing of data into the SQL database. With these partnerships, our network became an elegant solution with user-friendly interfaces, self-updating software and an easily duplicated method for universities, businesses and government agencies.
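The local-buffer idea can be stated generically: keep every reading on disk at the Arctic site and move a file out of the buffer only after a confirmed transfer, so an outage never loses data. The sketch below is a minimal stand-in for what the project's commercial tools provided; the directory layout and the injectable `send` callable are hypothetical.

```python
import os
import shutil

def flush_buffer(buffer_dir, send, sent_dir):
    """Attempt to send each buffered file, oldest first.
    Files whose transfer fails stay in buffer_dir for the next pass,
    so a dropped link never loses readings. Returns the names shipped."""
    shipped = []
    for name in sorted(os.listdir(buffer_dir)):
        path = os.path.join(buffer_dir, name)
        try:
            send(path)  # e.g. an FTP upload; raises OSError on failure
        except OSError:
            continue  # link down: leave the file buffered
        shutil.move(path, os.path.join(sent_dir, name))
        shipped.append(name)
    return shipped
```

Run on a schedule, this gives store-and-forward behavior: each pass ships whatever it can and silently retries the rest next time.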

With the network infrastructure in place, the ACSNP could connect through international standards to sensors in Barrow or farther out on the tundra. The data are transferred through FTP and parsed into a Microsoft SQL Server database in Cincinnati. The ACSNP IPs are constantly monitored and the transfer logs are readily available. Once the data are parsed into SQL, the process gets much simpler: the ACSNP stores each new reading in a current-reading table and moves the old record into a history table. Using a unique ID for each sensor, the sensor table is joined to the spatial data table for geocoding. The ACSNP uses ArcGIS Server to manage the spatial data, with Open Geospatial Consortium (OGC) specifications incorporated into the data storage to enable interoperability at the database level. The server can also publish the information as a Web application or as standardized Web services such as KML and WMS, among others. These standards are interoperable with common geobrowsers, such as Google Earth, creating the desired “common operational picture” (COP) for decision-making processes.
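The current/history pattern described above is straightforward in SQL. The sketch below uses SQLite as a stand-in for the project's Microsoft SQL Server, and the table and column names are invented for illustration; the real schema was not published.

```python
import sqlite3

SCHEMA = """
CREATE TABLE current (sensor_id TEXT PRIMARY KEY, ts TEXT, temp_c REAL);
CREATE TABLE history (sensor_id TEXT, ts TEXT, temp_c REAL);
CREATE TABLE sites   (sensor_id TEXT PRIMARY KEY, lat REAL, lon REAL);
"""

def store_reading(conn, sensor_id, ts, temp_c):
    """Archive the previous 'current' row into history, then store the new one."""
    conn.execute("INSERT INTO history SELECT * FROM current WHERE sensor_id = ?",
                 (sensor_id,))
    conn.execute("DELETE FROM current WHERE sensor_id = ?", (sensor_id,))
    conn.execute("INSERT INTO current VALUES (?, ?, ?)", (sensor_id, ts, temp_c))
    conn.commit()

def geocoded_current(conn):
    """Join the latest readings to site coordinates via the shared sensor ID."""
    return conn.execute(
        "SELECT c.sensor_id, c.ts, c.temp_c, s.lat, s.lon "
        "FROM current c JOIN sites s ON s.sensor_id = c.sensor_id").fetchall()
```

The join on the unique sensor ID is what attaches coordinates to each reading, which is all a downstream service needs to publish the data spatially.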

Unfortunately, streaming live video was not an option due to throughput limitations. Generally, the ACSNP cameras record images every 10 minutes, though the capture interval depends on the event being recorded. The ACSNP cameras were placed on the Meade River to record its freezing and subsequent breakup, possible indicators of global warming. Intervals longer than 10 minutes would have diminished the results, missing upstream dam breaks and the physical processes involved in the breakup. Again, the ACSNP used GoAnywhere Director to manage and automate the movement of these pictures. A small amount of code was written in ArcGIS to associate each image with its spatial data for easy Web server publication and visualization.
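Associating an image with its site for a geobrowser can be as simple as emitting a KML placemark whose balloon shows the picture. The function below is a generic sketch of that step (the project did it with ArcGIS); the site name, coordinates and image URL are placeholders.

```python
def image_placemark(site_name, lat, lon, image_url):
    """Build a KML Placemark whose pop-up balloon shows the camera image.
    Note KML coordinate order: longitude,latitude,altitude."""
    return (
        "<Placemark>\n"
        f"  <name>{site_name}</name>\n"
        f"  <description><![CDATA[<img src=\"{image_url}\"/>]]></description>\n"
        f"  <Point><coordinates>{lon},{lat},0</coordinates></Point>\n"
        "</Placemark>"
    )
```

Wrapped in a KML document and served alongside the imagery, placemarks like this let Google Earth or any OGC-aware geobrowser display each camera's latest frame in place.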

The ACSNP solution fills the gap between taking a reading with a sensor and automating that reading into the cloud to be published as standardized geospatial services. The solution has already been implemented on other University of Cincinnati research projects. The Department of Defense and the university were working on a barge in Lake Erie, streaming data from an aerostat (an unmanned blimp). Cameras and environmental sensors attached to the barge and the aerostat streamed data to an FTP server, and an undergraduate student was able to easily adapt the ACSNP to add those sensors to the network. The user interfaces and flexibility of the system enabled the inclusion of the Lake Erie research. Our hope is for the ACSNP to continue to assist with research at the university while providing a working prototype for educating others on the possibilities of filling the infrastructure gap between the sensor reading and the utilization of standardized Web services.

Aerostat, or unmanned blimp.
The aerostat in flight.

Published Monday, August 1st, 2011

© 2016 Directions Media. All Rights Reserved.