Just to demonstrate the power of the system: adding 15 new locations today took just a few hours, most of that spent
looking up information and messing with support scripts. It still hasn't added appreciable load to the system.
I've added a new page to the system: Warnings. Again, it merely recaps info
available elsewhere in the system.
Incidentally, although I doubt you'll notice it, wxhtml (the component that reads in stats and observations from Environment Harper Canada)
has been tweaked to output stats for the proper day. Previously, the values were only correct after the 5am forecast, and were always
backdated.
I've put together a graphic showing the various components of the system, mostly because I'm proud of it and
what it can do. Note that while outside data is not currently received, there is no reason it couldn't be, by way of the CGI-bin interface.
Also, technically, the Query form operates through the CGI-bin interface as well. You don't see the Stats Form, as it is for entering
stats directly into the database. The Web Capture involves all the outside stats and images, including everything from Environment Harper Canada.
The web pages are refreshed every five minutes, as new local readings come in.
Some other changes recently include the breakout of precipitation data and the introduction of the Stats form itself. As mentioned, it is
available only to me, for inputting stats directly into the system. Some things are not automatically measured here at CTO, and one of them
is snowfall. I can now input my daily estimates of snowfall, making the breakout possible.
Special Weather Statements from Environment Harper Canada are now supported. They'll show up in grey at the top
of the page, alongside the other weather warnings. When there are no warnings, nothing is displayed.
The government recently changed its regional warnings graphic, which has now been restored here; I had to change how the image
link was extracted.
I've added a new page, Query, and highlighted it in the menu.
The Query Page lets you perform live queries of the database, but it takes a little learning,
as data tags are involved. It took little new coding; I adapted it easily from the existing grid code. The Query Page is
based heavily on JavaScript and sends the data by POST to the CGI-BIN server, in XML format. The queryhost waits around just
long enough to see the response file from the server, waits one second longer for any write-file/close-file issues, and feeds
it back to you. I'll soon add one to the Airport.
Just give me a couple days to get the help pages in order; I'm busy looking for work.
The power of the tagset becomes clear once you get used to the page. Try opening the
CLICK HERE FOR HELP link in a separate tab, and beware: it changes according to the interval you've selected
(and whether or not you've entered a start time).
The real star of the show is the One-Time interval, as it allows you access to the entire tagset. Each of the others is
necessarily limited.
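For the curious, here's a rough Python sketch of what the CGI side of that query round trip could look like. The file names, paths, and polling interval are illustrative only, not the actual implementation:

```python
#!/usr/bin/env python3
# A rough sketch (hypothetical file names and paths) of the CGI side of the
# query round trip: read the XML query from the POST body, drop it where the
# weather system can see it, wait for the response file to appear, allow one
# extra second for write/close issues, then feed it back to the browser.
import os
import sys
import time

QUERY_FILE = "/tmp/wx_query.xml"        # hypothetical hand-off location
RESPONSE_FILE = "/tmp/wx_response.xml"  # hypothetical response location

def main() -> None:
    length = int(os.environ.get("CONTENT_LENGTH", "0") or 0)
    query_xml = sys.stdin.read(length)

    with open(QUERY_FILE, "w") as f:
        f.write(query_xml)

    # Wait just long enough to see the response file, then one second more.
    for _ in range(50):
        if os.path.exists(RESPONSE_FILE):
            break
        time.sleep(0.1)
    time.sleep(1.0)

    print("Content-Type: text/xml")
    print()
    if os.path.exists(RESPONSE_FILE):
        with open(RESPONSE_FILE) as f:
            sys.stdout.write(f.read())

if __name__ == "__main__":
    main()
```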
Incidentally, I'm now hosting my sourcefiles on Subversion. Fairly easy to set up, and a breeze for keeping sourcefile issues
straight.
I've restarted the system a number of times this morning; the only way you'd know
is if you tracked the new Capture Accuracy stat, which I've added to the main page.
For each of the major items fed into the system from the capture system (Temperature,
Relative Humidity, Wind Speed and Bearing, Precip, Sunshine, Barometric Pressure),
it records a hit (record received intact as expected) or a miss (record received as Nil).
I've added this as a check on accuracy, and just because it's another little
nicety to have.
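Purely as an illustration (the names and structure are hypothetical, not the actual code), a hit/miss tally along those lines might look like this Python sketch:

```python
# Hypothetical sketch of a Capture Accuracy tally: each captured item scores a
# hit if the record arrived intact, a miss if it came through as Nil/None.
CAPTURE_ITEMS = ["Temperature", "Relative Humidity", "Wind Speed",
                 "Wind Bearing", "Precip", "Sunshine", "Barometric Pressure"]

tally = {item: {"hits": 0, "misses": 0} for item in CAPTURE_ITEMS}

def record_capture(item: str, value) -> None:
    """Score one captured record: hit if intact, miss if Nil."""
    if value is None:
        tally[item]["misses"] += 1
    else:
        tally[item]["hits"] += 1

def capture_accuracy(item: str) -> float:
    """Percentage of records received intact for an item."""
    counts = tally[item]
    total = counts["hits"] + counts["misses"]
    return 100.0 * counts["hits"] / total if total else 100.0
```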
At this point, the system is humming along with no remaining bugs; it's time to start
working on a scripting language.
I've reset the all-time records for the CTO location, as they were giving me grief
even though the code looked fine. It turns out the data file was corrupted; I'll
keep an eye on it for the next little while.
Precipitation has been tracked for about the past week, although not reliably,
as the sensor is heavily overgrown with delicious beans at the moment. Nonetheless,
it gives a good picture of the precip goings-on here at the CTO.
That's just about all the remaining issues. Now to move on to implementing some
niceties from my wish list.
After some hiccoughs, the system is now working excellently. Error rates have dropped well
below 1%.
I still have issues with the precipitation sensor and its spurious readings. I've been desperately
waiting for it to rain while I'm home, to see if I can do anything about it.
Data-capture from Environment Harper Canada is nigh-perfect, with just one issue remaining on
overnight forecasts. I expect I'll remedy that today.
I'll be posting new pics today of the operational elements of the system.
There've been a few changes over the past little while.
First off, the system now includes a Dr. Tech WA-1070T wireless
Solar system (though I'm still reading values off an LCD). Second,
the data-capture system now runs on the former server. Third, the
new server is a laptop of ca. 2005 vintage, with 1 GB of RAM and scads of
disk space. It hardly feels the load.
The system now records wind velocity and bearing (although, given
the proximity of trees and buildings, these must be taken with a grain
of salt), plus precipitation, which doesn't work, due to the spurious
readings the sensor is inclined to give.
The data-capture software has been upgraded and now scans the LCD
pixels quite efficiently. The input includes a sky component and
auto-calculation of Dew Point, Wind Chill and Humidex values.
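For reference, these are the commonly published formulas such an auto-calculation would typically use, shown here as a Python sketch; the capture software's exact equations may differ:

```python
# Commonly published formulas for Dew Point, Wind Chill and Humidex; not
# necessarily the exact equations in the capture software.
# Temperatures in Celsius, relative humidity in percent, wind in km/h.
import math

def dew_point(temp_c: float, rh_pct: float) -> float:
    """Magnus approximation for dew point."""
    a, b = 17.625, 243.04
    gamma = math.log(rh_pct / 100.0) + a * temp_c / (b + temp_c)
    return b * gamma / (a - gamma)

def wind_chill(temp_c: float, wind_kmh: float) -> float:
    """Wind chill index (meant for roughly T <= 10 C and wind >= 5 km/h)."""
    v = wind_kmh ** 0.16
    return 13.12 + 0.6215 * temp_c - 11.37 * v + 0.3965 * temp_c * v

def humidex(temp_c: float, rh_pct: float) -> float:
    """Humidex from temperature and the dew point derived above."""
    td_k = dew_point(temp_c, rh_pct) + 273.15
    vapour_pressure = 6.11 * math.exp(5417.7530 * (1.0 / 273.16 - 1.0 / td_k))
    return temp_c + 0.5555 * (vapour_pressure - 10.0)
```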
Environment Canada very recently changed their data format, ending their
support for RSS (XML) products. This necessitated a complete rewrite of
that capture software, which now takes its inputs directly from the
publicly available HTML.
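By way of illustration only (not the actual capture code), scraping a single value out of that public HTML might look something like the Python sketch below; the URL and the pattern are placeholders:

```python
# Illustrative sketch of pulling one observation straight from public HTML.
# The URL and the regular expression are placeholders; the real page layout
# will differ and the pattern would need to match it.
import re
import urllib.request

def current_temperature(page_url: str) -> float | None:
    """Fetch an observation page and pull out a temperature value, if found."""
    with urllib.request.urlopen(page_url) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    # Example pattern only: match something like "Temperature: -3.4".
    match = re.search(r"Temperature:\s*(-?\d+(?:\.\d+)?)", html)
    return float(match.group(1)) if match else None
```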