Monitor river water levels with "County Rivers" Android app

Published on September 19, 2010 1 Comment

Monitor rising water levels in the rivers in your county with the County Rivers Android app.

The app currently monitors river gage data from the USGS and NOAA in Washington State for Clallam, Grays Harbor, King, Lewis, Mason, Pacific, Pierce, Skagit, Snohomish, Thurston and Whatcom counties.

Last fall I wrote a Twitter bot that monitors information from the USGS and NOAA on river water levels in King County, Washington. It tweets hourly updates when water levels reach the alert thresholds set for the individual river gages. It wasn't long after I implemented the @kingrivers account that I received requests for rivers in other counties. By the end of fall 2009 I had Twitter accounts for ten counties, monitoring water levels on 43 rivers. The system works well, but the tweets can become overwhelming when several of the rivers start hitting alert thresholds.

With La Niña almost upon us and climatologists predicting above-average rainfall, increased snowpack and lower temperatures from late September until spring, we are likely to see an increased chance of flooding. For this reason, porting the Twitter application over to Android seemed like a good next move, and hopefully more people can make use of it.

If you are in a county I do not have listed or want to know if a particular river can be added to the watch list please let me know. If you use the application and have suggestions or feedback, drop me a line.

Real-time Pacific Northwest earthquake monitoring with Twitter

Published on October 30, 2009 0 Comments

The PNWQuakes Twitter bot monitors information from USGS and the University of Washington's Pacific Northwest Seismic Network (PNSN) on earthquakes in the Pacific Northwest. It will tweet any magnitude 1+ quakes.

According to PNSN's web site, when an earthquake has a preliminary magnitude exceeding 1.7, an automatic, computer-generated epicenter is shown if it is recorded by at least 10 seismic stations and is less than 80 km deep. Other earthquakes will be posted after being reviewed by an analyst and may be tweeted several hours after the quake was initially recorded.
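Expressed as code, those auto-posting rules boil down to a simple filter. The sketch below only illustrates the criteria described above; the quake record and its field names are hypothetical, not PNSN's or the bot's actual code.

# A minimal sketch of the auto-posting criteria described above.
# The quake dictionary and its field names are hypothetical.
def auto_postable(quake):
    return (quake["magnitude"] > 1.7
            and quake["stations"] >= 10
            and quake["depth_km"] < 80)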

A sample tweet is shown below,

M 2.2 occurred on Thu Oct 29 at 12:51:12 AM, 20 km E from Mansfield, WA at a depth of 0.50 km. http://bit.ly/30yWa8

The bit.ly address provided at the end of the tweet is a direct link back to more detailed information about the earthquake on the USGS Earthquake Hazards Program web site.

If you have any comments or suggestions please feel free to post them here or contact me on Twitter at @platoscave or @pnwquakes.

Real-time King County river monitoring with Twitter

Published on October 15, 2009 2 Comments

Let me start by saying I do not work for King County.

The idea for this script originally came during the flooding Washington State endured in late 2008 and early 2009. I made a rough draft, wrote some of the code and documented how I thought it would work. It didn't make much sense to activate it last January because, well, we all knew there was flooding all over the place and the last thing I would want to see is a Twitter feed telling me something I already knew.

So with the rain now beginning in earnest, winter around the corner and flood preparation underway for the Green River Valley, now seemed like a good time to launch the Twitter account, @KingRivers.

The Twitter bot monitors information from USGS and NOAA on river water levels in King County, Washington. It will begin tweeting hourly updates when water levels reach the alert thresholds set for the individual river gages.

A sample tweet is shown below,

Green River near Auburn: At 2009-10-14 21:45 river height was 53.41 ft. http://bit.ly/iYtjl

The bit.ly address provided at the end of the alert is a direct link back to more detailed river flooding information on King County's Flood Warning System website. King County's individual river pages provide USGS gage data, a map, alert phases, and recent high flows.
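Under the hood, each hourly pass amounts to reading the latest gage height, comparing it to that gage's alert threshold, and formatting a status line like the sample above. The sketch below is illustrative only; the gage record, threshold and link are placeholders rather than the bot's actual code.

# A minimal sketch of the hourly alert check; the gage dictionary
# and its fields are placeholders, not the bot's actual code.
def build_alert(gage):
    if gage["height_ft"] < gage["alert_threshold_ft"]:
        return None # below the alert phase, stay quiet
    return "%s: At %s river height was %.2f ft. %s" % (
        gage["name"], gage["timestamp"], gage["height_ft"], gage["link"])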

The river gages I am currently monitoring with this script are:

  • Green River near Auburn
  • Skykomish River near Gold Bar
  • Tolt River near Carnation
  • Cedar River near Landsburg
  • Cedar River at Renton
  • White River above Boise Creek at Buckley
  • Issaquah Creek near Issaquah
  • Snoqualmie River near Snoqualmie
  • Snoqualmie River near Carnation

In most cases I am displaying the river height in feet rather than flow in cubic feet per second (cfs) because I think people can visualize height more easily. Where height information is not available I fall back to displaying a river flow measurement.
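Here is a minimal sketch of that fallback, assuming a hypothetical reading dictionary:

# Prefer gage height in feet; fall back to flow when height is missing.
# The reading dictionary and its fields are hypothetical.
def format_level(reading):
    if reading.get("height_ft") is not None:
        return "river height was %.2f ft" % reading["height_ft"]
    return "river flow was %d cfs" % reading["flow_cfs"]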

And now, a quick disclaimer. In no event will I be liable for any loss or damage including without limitation, indirect or consequential loss or damage, or any loss or damage whatsoever arising from loss of data or profits or property arising out of, or in connection with, the use of this service. Every effort is made to keep the service up and running smoothly. However, I take no responsibility for, and will not be liable for, the service being temporarily unavailable due to technical issues beyond my control.

If you have any comments or suggestions please feel free to post them here or contact me on Twitter at @platoscave or @KingRivers.

Calculate distance between latitude longitude pairs with Python

Published on October 05, 2009 0 Comments

The Haversine formula is an equation that can be used to find great-circle distances between two points on a sphere from their longitudes and latitudes. When this formula is applied to the Earth the results are an approximation because the Earth is not a perfect sphere. The currently accepted (WGS84) radius is 6378.137 km at the equator and 6356.752 km at the poles. For aviation purposes the FAI uses a mean radius of 6371.0 km.

#!/usr/bin/env python

# Haversine formula example in Python
# Author: Wayne Dyck

import math

def distance(origin, destination):
    lat1, lon1 = origin
    lat2, lon2 = destination
    radius = 6371 # km

    dlat = math.radians(lat2-lat1)
    dlon = math.radians(lon2-lon1)
    a = math.sin(dlat/2) * math.sin(dlat/2) + math.cos(math.radians(lat1)) \
        * math.cos(math.radians(lat2)) * math.sin(dlon/2) * math.sin(dlon/2)
    c = 2 * math.atan2(math.sqrt(a), math.sqrt(1-a))
    d = radius * c

    return d

After saving the above script as haversine.py, you can also use it interactively within the Python shell like this,

>>> import haversine
>>> seattle = [47.621800, -122.350326]
>>> olympia = [47.041917, -122.893766]
>>> haversine.distance(seattle, olympia)
76.386615799548693
>>>

DOSBox with Ubuntu 9.04

Published on August 30, 2009 0 Comments

I have been feeling a little nostalgic this weekend. Yesterday morning I installed ZSNES, a Super Nintendo emulator, fired up a ROM of Earthworm Jim and played that with my son for a while. Ah, the memories. After cleaning out the garage that afternoon I came across a box of old DOS games including two of my all-time favorites from LucasArts, X-Wing and TIE Fighter.

DOSBox is a DOS emulator that uses the SDL library, emulates a 286/386 in real and protected mode, and has excellent sound compatibility with older games. The last time I installed a DOS emulator it was DOSEMU, and the experience was far from memorable. So, with fairly low expectations, I decided to try DOSBox this time around. The experience was great, and getting it to run under Ubuntu 9.04 is super simple.

Open a terminal window in Ubuntu and install DOSBox:

sudo apt-get install dosbox

Next, create a folder to install your DOS games to:

mkdir dosgames

Because Ubuntu changed its scancode handling starting with 8.10, some keys, specifically the arrow keys, don't work. According to the DOSBox website, the workaround for English/American keyboard layouts is to enter the following from the open terminal window:

echo -e "[sdl]\nusescancodes=false\n" >>~/.dosboxrc 
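If the command worked, ~/.dosboxrc should now end with a small [sdl] section like this:

[sdl]
usescancodes=false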

Now, when we start DOSBox the arrow keys will function correctly.

dosbox

This will start a DOSBox window which resembles an old DOS command prompt. You then need to mount the dosgames folder you created earlier so DOSBox knows where to find your games. At the DOSBox Z:\> prompt enter:

mount c /home/username/dosgames

If you need your CD drive to install or play your game, place the CD in the drive first. Ubuntu will typically auto-mount it (under /media/cdrom or /media/disk, depending on the disc), and you can mount it within DOSBox with:

mount d /media/cdrom -t cdrom

Once you have your game running, you can add CPU cycles to increase performance by pressing Ctrl+F12, or decrease cycles by pressing Ctrl+F11.

Scenes from LucasArts' "X-Wing"

Seattle real-time traffic data using Python

Published on August 01, 2009 3 Comments

The Washington State Department of Transportation (WSDOT) operates Traffic Management Centers (TMCs) around the state to monitor traffic and identify problems. The TMCs use cameras located on the highway system and collect data from traffic detectors in the highways to get a real-time picture of traffic conditions. The most common detector WSDOT uses is the induction loop, a simple low-voltage wire coil buried in the roadway that sends an electrical pulse when a vehicle passes over it.

The collected traffic data is provided in a proprietary, yet simple, binary format from a WSDOT anonymous ftp site, ftp://webflow.wsdot.wa.gov/. The two main files we will be using are the WEBFLOW.DAT and WEBFLOW.STA files.

  • WEBFLOW.DAT is a small, 6 KB binary file that contains the latest volume and occupancy data for the Seattle area TMS.
  • WEBFLOW.STA is a text file database that contains the current list of detection locations for the Seattle area TMS. It is updated as new detection equipment is installed in the system.

The Python code below is heavily documented and includes the format of the binary and text files. The program retrieves the data files, parses them, and returns an XML stream containing the latest volume and occupancy information.

#!/usr/bin/env python

# Python WebFLOW data decoder example
# Author: Wayne Dyck

import datetime
import re
import struct
import urllib2
from xml.dom import minidom

class WebFLOW(object):
    """Returns traffic information from WSDOT's Traffic Management System.

    WEBFLOW.STA is a text file database that contains the current list of vehicle
    detection locations for the Seattle area TMS. It is updated as new detection
    equipment is installed in the system. Each detection location (also called
    a station) consists of one or more loops in a single roadway direction. The
    file is broken down as follows:
   
      Line 1: Version number
      Line 2: Station count (# of records)
      Line 3: Start of station info (records)
   
    Each record line contains a 16-byte record name, followed by the freeway
    and cross-street where the detector station is located. The record position
    is used by WEBFLOW.EXE to synchronize WEBFLOW.DAT with station definitions
    contained in the webflow maps. The file is in the same order as the data file.
    This order must be used to decipher the data file, WEBFLOW.DAT. The location
    names also appear in WebFLOW when the user clicks on a map segment.
    
    Last Line: END
   
    WEBFLOW.MSG is a text file that contains the latest text based messages from
    the TMS. These messages include current incident information, as well as
    bulletins that warn of upcoming construction closures. There are from one to
    four parts to this file:
   
    1st Byte = 0x02 Hexadecimal
      Body of a message containing Date, Heading and Message fields. The reader
      should disassemble the file with a good text editor to learn more about the
      message body format. The format of the message body does not change.
      
    Byte following message = 0x03 Hexadecimal
    
    The file can contain up to four messages in the format above, concatenated
    together.
   
    WEBFLOW.DAT is a binary file that contains the latest volume and occupancy
    data for the Seattle area TMS. Numbers given are zero based and in
    hexadecimal.
   
    Byte 0 is the version number of the WEBFLOW.STA file needed to decode it.
    Byte 1 has directional data for the I-5 and I-90 reversible express lanes.
    
      0x00 = Both Rev CLOSED
      0x01 = I-90 Rev CLOSED, I-5 Rev North
      0x02 = I-90 Rev CLOSED, I-5 Rev South
      0x10 = I-90 Rev East, I-5 Rev CLOSED
      0x20 = I-90 Rev West, I-5 Rev CLOSED
      0x11 = I-90 Rev East, I-5 Rev North
      0x12 = I-90 Rev East, I-5 Rev South
      0x21 = I-90 Rev West, I-5 Rev North
      0x22 = I-90 Rev West, I-5 Rev South
    
    Bytes 2 and 3 are reserved.
    
    The data is 1-minute data, for the time period ending on the time stamp
    given in bytes 4 through 9.
    
    Byte 4 = Year of time stamp, in hex (i.e.: 0x5F = 95)
    Byte 5 = Month of time stamp, in hex (i.e.: 0x07 = July)
    Byte 6 = Day of time stamp, in hex (i.e.: 0x19 = 25th)
    
      0x5F0719 = July 25, 95
    
    Byte 7 = Hour of time stamp, in hex (i.e.: 0x0D = 13)
    Byte 8 = Minute of time stamp, in hex (i.e.: 0x32 = 50)
    Byte 9 = Second of time stamp, in hex (i.e.: 0x2F = 47)
    
      0x0D322F = 13:50:47 hrs. Note: Clock is 24 hr
   
    Starting with byte 10, the data is in blocks of 5 bytes, the number of
    blocks is equal to the station count (see Line 2, WEBFLOW.STA).
   
    Each 5-byte block contains bit-mapped data in the following format:
    
    Bit positions are increasing right to left Byte positions are left to
    right, i.e., 1st data byte is first byte encountered in data file, etc.
   
    1st data byte:
      bit 6-7: Incident (0=No incident, 1=Tentative, 2=Occurred, 3=Continuing)
        (Note: at this time, the incident information is not active)
      bit 0-5: # of 20-second periods in data sample (3 = 1 minute, standard)
   
    2nd data byte:
      bit 7: Data validity flag (0=data NOT usable, 1=data OK)
      bit 4-6: # of loops in station (allows for several lanes per station, 0-7)
      bit 3: Reserved
      bit 0-2: High 3 bits of Scan Count (% Occupancy = Scan Count / 12)
   
    3rd data byte:
      bit 0-7: Low 8 bits of Scan Count
   
    4th data byte:
      bit 0-7: High 8 bits of Volume data (Volume is total count of vehicles)
   
    5th data byte:
      bit 0-7: Low 8 bits of Volume data
    
    The last four bytes of WEBFLOW.DAT are a 4-byte arithmetic checksum, low
    byte is first, which is the sum of all the other bytes in this file.
    
    """
    
    def getSeattle(self):
        """ Returns traffic volume, occupancy and speed data for the Seattle area. """
        
        sta = "WEBFLOW.STA"
        dat = "WEBFLOW.DAT"
        xmldata = self._getDataFiles(sta, dat)
        return xmldata

    def _getDataFiles(self, sta, dat):
        """ Retrieves the data files, parses and returns an xml stream. """
      
        doc = minidom.Document()
        stn = re.compile(r'_stn', re.IGNORECASE)
        url = "ftp://webflow.wsdot.wa.gov/"
        warning = """This program is using computer data originating from the Washington 
State Department of Transportation (WSDOT). WSDOT provides free software for public and 
private use of this data. The software is available at WWW.WSDOT.WA.GOV via the Internet. 
WSDOT may, at any time, change data formats, location, update rates or other features or 
aspects of the data. WSDOT changes to the data will adversely affect this software program. 
WSDOT is under no obligation to notify you or your software vendor of any changes to the 
data source."""

        webflow = doc.createElement("webflow")
        doc.appendChild(webflow)
        disclaimer = doc.createComment(warning)
        webflow.appendChild(disclaimer)
        
        request_sta = urllib2.Request(url + sta)
        request_dat = urllib2.Request(url + dat)

        try:
            response_sta = urllib2.urlopen(request_sta)
            response_dat = urllib2.urlopen(request_dat)
        except IOError, err:
            xml_error_message = self._getExceptionInfo(err)
            return xml_error_message
            
        lines = response_sta.read().splitlines()
        tokens = lines[0].split()
        version_sta = tokens[2]
        tokens = lines[1].split()
        stations = tokens[2]
        version_dat = ord(response_dat.read(1))
        express_lanes = ord(response_dat.read(1))
        response_dat.read(2) # Bytes 2 and 3 are reserved. Skip them.
        year = ord(response_dat.read(1)) + 2000
        month = ord(response_dat.read(1))
        day = ord(response_dat.read(1))
        hour = ord(response_dat.read(1))
        minute = ord(response_dat.read(1))
        second = ord(response_dat.read(1))
        timestamp = datetime.datetime(year, month, day, hour, minute, second)
        
        info = doc.createElement("info")
        info.setAttribute("version", str(version_dat))
        info.setAttribute("date", str(timestamp))
        webflow.appendChild(info)
        express = doc.createElement("express_lanes")
        express.setAttribute("status", str(express_lanes))
        webflow.appendChild(express)
        
        station_list = lines[2:] # skip the first two lines
        
        for i in range(int(stations)):
            tokens = station_list[i].split(',')
            byte0 = ord(response_dat.read(1))
            word0 = self._readWord(response_dat)
            word1 = self._readWord(response_dat)
            periods = byte0 & 0x3f
            incident = (byte0 & 0xc0) >> 6
            scans = word0 & 0x7ff
            loops = (word0 & 0x7000) >> 12
            valid = (word0 & 0x8000) != 0
            volume = word1 & 0xffff
            occupancy = scans / 12.0 # % occupancy = scan count / 12

            # Skip non-station records; the bytes above must still be read
            # to stay aligned with the data stream.
            if not stn.search(tokens[0]): continue

            sta = doc.createElement("station")
            sta.setAttribute("id", tokens[0])
            sta.setAttribute("valid", str(valid))
            sta.setAttribute("occupancy", str(occupancy))
            sta.setAttribute("volume", str(volume))
            webflow.appendChild(sta)

        # Close resources
        response_sta.close()
        response_dat.close()

        return doc.toprettyxml(indent="  ")

    def _getExceptionInfo(self, err):
        doc = minidom.Document()
        webflow = doc.createElement("webflow")
        doc.appendChild(webflow)
        error = doc.createElement("error")
        webflow.appendChild(error)
        error_message = doc.createTextNode(str(err.strerror))
        error.appendChild(error_message)
        return doc.toxml("utf-8")
    
    def _readWord(self, stream):
        word = stream.read(2)     
        return struct.unpack('>H', word)[0]


def main():
    webflow = WebFLOW()
    xml = webflow.getSeattle()
    print xml


if __name__ == "__main__":
    main()
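Run from the command line, the script prints the generated XML to standard output. Based on the code above, the output has roughly this shape (the station IDs and attribute values here are invented for illustration):

<webflow>
  <!-- WSDOT data-usage warning -->
  <info version="40" date="2009-08-01 14:35:00"/>
  <express_lanes status="18"/>
  <station id="005es16940_stn" valid="True" occupancy="4.25" volume="112"/>
  <station id="005es17000_stn" valid="True" occupancy="2.5" volume="87"/>
</webflow>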

Google Search Appliance - How to add Omniture's SiteCatalyst code

Published on January 31, 2009 0 Comments

I originally posted this on Google's Search Appliance Group for work; however, there was an issue with Google's HTML editor that kept mangling the post and refusing to display it. One of the group admins was finally able to post it, but they had to leave out some of the code to do so. It's presented here in its entirety.

In order to add Omniture's SiteCatalyst code to a Google Search Appliance you need to modify the raw XSLT stylesheet. The following customizations will insert the tracking code after the opening <body> tag on the search results page. Per Omniture's implementation requirements, the code will also dynamically set s.prop1 to the search terms and s.prop2 to the number of results returned. If zero results are returned we populate s.prop2 with "zero" rather than 0.

Before you implement these changes be sure to make a backup of your existing stylesheet.

Implementation

Step 1:

Add the following code to the end of the section where Google recommends you make customizations. Replace the "INSERT-DOMAIN-AND-PATH-TO-CODE" with the location where you uploaded the s_code.js file:

<!-- **********************************************************************
 Add Omniture SiteCatalyst code (can be customized)
     ********************************************************************** -->

<xsl:template name="sitecatalyst">
  <xsl:param name="query"/>
  <xsl:param name="matches"/>
  <xsl:comment>
    SiteCatalyst code version: H.16.
    Copyright 1997-2008 Omniture, Inc. More info available at
    http://www.omniture.com
  </xsl:comment>
  <script language="JavaScript" type="text/javascript" src="INSERT-DOMAIN-AND-PATH-TO-CODE/s_code.js"></script>

  <script language="JavaScript" type="text/javascript">
    <xsl:comment>
      s.prop1="<xsl:value-of select='$query'/>"
      s.prop2="<xsl:value-of select='$matches'/>"
      var s_code=s.t();if(s_code)document.write(s_code)
  //</xsl:comment>
  </script>

  <script language="JavaScript" type="text/javascript">
    <xsl:comment>
      if(navigator.appVersion.indexOf('MSIE')>=0)document.write(unescape('%3C')+'\!-'+'-')
  //</xsl:comment>
  </script>
  <xsl:comment>

    End SiteCatalyst code version: H.16.
  </xsl:comment>
</xsl:template>


Step 2:

Next, locate the XSL template named search_results and place the following code after the opening <body> tag:


<!-- *** Add Omniture SiteCatalyst code *** -->
<xsl:choose>

  <xsl:when test="RES">
    <xsl:call-template name="sitecatalyst">
      <xsl:with-param name="query" select="Q"/>
      <xsl:with-param name="matches" select="RES/M"/>
    </xsl:call-template>
  </xsl:when>

  <xsl:otherwise>
    <xsl:call-template name="sitecatalyst">
      <xsl:with-param name="query" select="Q"/>
      <xsl:with-param name="matches" select="'zero'"/>
    </xsl:call-template>
  </xsl:otherwise>

</xsl:choose>
    

Conclusion

Save your changes and you should be in business. The Google Search Protocol Reference states the XSL stylesheet cache is updated approximately every 15 minutes; I have found it can take a lot longer. To force a refresh of the stylesheet currently being requested, simply add &proxyreload=1 to the end of your search URL, as in the example below.
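For example (the hostname, collection and front-end names here are illustrative, not from the original post):

http://gsa.example.com/search?q=test&site=default_collection&client=default_frontend&proxystylesheet=default_frontend&proxyreload=1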

Get aviation METAR text with Python

Published on November 19, 2008 0 Comments

A simple command line tool written in Python to get the latest METAR text from NOAA's Aviation Weather Center.

#! /usr/bin/env python

import BeautifulSoup
import optparse
import urllib
import urllib2

class Metars(object):
    """Gets latest METAR text given 4-letter ICAO station identifier(s) and returns a list."""
    def __init__(self, stations):
        self.stations = stations    
        
    def get_metars(self):
        metars = []
        url = 'http://adds.aviationweather.gov/metars/index.php'
        user_agent = 'Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; .NET CLR 2.0.50727)'
        values = {'station_ids': self.stations}
        headers = {'User-Agent': user_agent}
        data = urllib.urlencode(values)
        req = urllib2.Request(url, data, headers)
        response = urllib2.urlopen(req)
        page = response.read()
        soup = BeautifulSoup.BeautifulSoup(page)
        fontTag = soup.find('font')
        stations = fontTag.findAllNext(text=True)
        metars = [station for station in stations if station.strip('\n')]
                
        return metars
            
def main():
    p = optparse.OptionParser(description=' Returns latest METAR text given a 4-letter ICAO station identifier',
                     usage="usage: %prog station1[,station2]",
                     version="%prog 0.1")
    
    options, arguments = p.parse_args()

    if len(arguments) == 1:
        stations = Metars(arguments[0])
        observations = stations.get_metars()
        if observations:
            for observation in observations:
                print observation
    else:
        p.print_help()
              
if __name__ == '__main__':
    main()

To use it from the command line, simply enter a comma-separated list of 4-letter ICAO station identifiers.

$ ./metars.py cyvr,cyxx
CYVR 201500Z 00000KT 20SM SCT120 OVC170 05/02 A2979 RMK AC3AC5 SLP088
CYXX 201500Z 00000KT 30SM SCT120 BKN210 07/M02 A2979 RMK AC4CS2 VIRGA SLP090

You can also use it interactively within the Python shell like this,

>>> import metars
>>> stations = metars.Metars("cyvr,cyxx")
>>> observations = stations.get_metars()
>>> observations
[u'CYVR 201500Z 00000KT 20SM SCT120 OVC170 05/02 A2979 RMK AC3AC5 SLP088',
u'CYXX 201500Z 00000KT 30SM SCT120 BKN210 07/M02 A2979 RMK AC4CS2 VIRGA SLP090']

A list of weather stations can be found in the Aviation Weather Center's station.txt file.

Follow me on Twitter

Published on October 09, 2008 0 Comments

In case you have never heard of Twitter, it's touted as "a service for friends, family, and co-workers to communicate and stay connected through the exchange of quick, frequent answers to one simple question: What are you doing?"

It's free to join, and it's fun to see what you can pack into a 140-character update.

You can follow my "tweets" here. See you online.

Source code highlighting with Python and Pygments

Published on August 13, 2008 0 Comments

I was looking for a source code highlighting solution to colorize the code I place into this blog, and found there are a number of solutions available. Many are external programs that run against the source file, such as GNU Source-highlight; others are browser-based and use JavaScript to mark up the code in real time, as Syntax Highlighter does.

Pygments is a Python-based generic syntax highlighter that supports a wide range of programming languages and markup formats.
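As a quick, minimal example of the API (assuming Pygments is installed), turning a snippet of Python source into HTML takes only a few lines:

#!/usr/bin/env python

from pygments import highlight
from pygments.lexers import PythonLexer
from pygments.formatters import HtmlFormatter

code = 'print "Hello, World!"'

# Render the snippet as HTML; the output uses CSS classes, and
# HtmlFormatter can generate the matching stylesheet via get_style_defs().
print highlight(code, PythonLexer(), HtmlFormatter())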
