The weather station, the Raspberry Pi and the ESP8266

In a previous post, going back to 2012, I talked about how I used an Android netbook running Linux and wview to post data from my WS 2355 weather station to Weather Underground. That was a long time ago, and things have changed a lot since then.

First up I replaced the Linux netbook with a Raspberry Pi running Raspbian, using pywws to upload the weather info for IWELLING61. The netbook was OK but getting Linux onto it had been a kludge, so as soon as I found out about the Raspberry Pi I got one in. Cost – around $60 NZD.

Then came a second weather station for the bach. This time I was pretty confident I could get away with a cheaper variant, so I picked up a WH1082 from Trademe for about $185 and another Raspberry Pi from nicegear.

A year or two rolled by and the original WS 2355 base station stopped communicating with the sensors. I tried cables and wireless without luck, so I went hunting on Trademe for another weather station – this time a WH108x for $135 NZD from debra101 in Taranaki.

Another year on, the $185 WH1082 at the bach stopped working, so it was back to debra101 for a $135 replacement.

That leaves me with a set of spare gauges from the WS 2355 and WH1082 lying around.

In between times I started playing with the cheap (under $10 NZD on eBay) ESP8266 Internet of Things devices. So much fun to be had here! The ESP8266 has a large fan base now, so there are lots of resources around – it is a bit like a dumbed-down Arduino that you can dedicate to a single purpose. I had a choice between programming it in C using the Arduino IDE or in Lua using the nodemcu firmware. I decided on nodemcu: although I hadn’t used Lua before it looked easy, especially using the cloud firmware build service, which has APIs for a lot of common sensors and devices built in.

So what to do with it? First up I connected the ESP8266 to my MacBook Air and flashed it with a custom cloud build using esptool.py. I could then use a terminal program (ZTerm on the Mac) to talk to it at 115200 baud, 8N1 – it seemed to work better with no flow control. Next I created a small init.lua script to start the wifi and connect to my network, and transferred it to the ESP8266 with luatool.py. The ESP restarted and was on the network. All good, but I wanted a smarter way of transferring files and running ad hoc commands. After some googling I built Lua scripts for tftp and telnet servers on the ESP (thanks to whoever I stole bits of scripts from), loaded them up, unplugged, and could then access the ESP over wifi for most things I wanted to do.
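For reference, a minimal init.lua looks something like this – a sketch only, with placeholder SSID and password, using the wifi and tmr APIs as they were in the NodeMCU builds of the time:

    -- init.lua: join the wifi network on boot (SSID/password are placeholders)
    wifi.setmode(wifi.STATION)
    wifi.sta.config("MY_SSID", "MY_PASSWORD")
    wifi.sta.connect()

    -- poll once a second until we have an IP, then stop the timer
    tmr.alarm(1, 1000, 1, function()
      if wifi.sta.getip() == nil then
        print("waiting for IP...")
      else
        print("connected: " .. wifi.sta.getip())
        tmr.stop(1)
      end
    end)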

You can download copies of the scripts here. There are also a few shell scripts in there to automate copying files, restarting the ESP, etc.

The last thing I did was hook up a very cheap (and, it turns out, very inaccurate) DHT11 temperature sensor and post the readings to ThingSpeak.
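The posting side is only a few lines of Lua. A sketch along these lines – assuming the dht module was included in the cloud firmware build, and with a placeholder ThingSpeak write key:

    -- read the DHT11 on GPIO pin 2 and post the temperature to ThingSpeak
    local PIN, KEY = 2, "YOUR_WRITE_API_KEY"  -- placeholders

    local status, temp, humi = dht.read11(PIN)
    if status == dht.OK then
      local conn = net.createConnection(net.TCP, 0)
      conn:on("connection", function(sck)
        sck:send("GET /update?api_key=" .. KEY .. "&field1=" .. temp ..
                 " HTTP/1.1\r\nHost: api.thingspeak.com\r\nConnection: close\r\n\r\n")
      end)
      conn:on("receive", function(sck, data) print(data) sck:close() end)
      conn:connect(80, "api.thingspeak.com")
    else
      print("DHT11 read failed, status " .. status)
    end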

I’ve now ordered the better DHT22 (and another ESP, and a four-line LCD display, and a relay… all up less than $20 NZD delivered) for my next project – getting the ESP talking directly to my leftover rain, wind speed and wind direction gauges.

A couple of things to watch out for:

  1. You can only run one TCP and one UDP service at a time on the ESP, so if my telnet server is running the ESP can’t make the HTTP posts needed to update ThingSpeak.
  2. There are a lot of variants of the ESP around – you can buy them cheaply on eBay and probably a few cents cheaper still on Alibaba. Go for a 12E or better variant; apparently the newer ones have more flash and better wifi reception.

Google has solved the photos problem

It is a long time since I’ve written anything here – it’s been busy! I wrote about this in our June 2016 newsletter and decided it was worth expanding on.

The problem with photos (and videos) is that we take a lot of them, they take a lot of space, and they are often spread over multiple devices, so we don’t always back them up properly (understatement). The photos are also hard to find – I don’t know about you, but I never seem to get around to organising mine into logical albums all in one easy-to-find place. And the other problem: I occasionally like to show someone a photo or two, and my phone just doesn’t have enough storage to keep a decent history of photos.

The backup issue is the big one – every year or so a friend (not usually the same one) calls up to say their PC has died and they have lost their photos. Everything else you can probably replace, even if there may be a cost, but if the hard drive is gone and not recoverable and that’s your only copy of your photos, then they are gone forever. Even if you do back them up, say to a USB hard drive – how many of those do you have and where do you keep them? If you have a house fire, do you still have a backup?

Google Photos solves all these problems in a free and easy to use way. It’s brilliant!

You go to the Google Photos link and follow your nose. Download the app onto the devices where your photos and videos are (PC/Mac/phone), point it at your photos and let it go. If you have a lot of them it can take many hours, but at the end of it all your photos and videos will be uploaded to Google Photos, searchable and shareable.

There are just a couple of things to watch out for when uploading:

  • Take the option for high quality uploads (free unlimited storage) rather than original quality (limited storage). The high quality photos and videos are absolutely fine.
  • If you have a lot of photos/videos and you are not on an unlimited internet plan it may use up your data allowance during the upload – keep an eye on it. If you are on a phone do it over wifi.

At the end of the process all your photos/videos are backed up into the Google cloud – for free – and you can search them from your PC/Mac/phone. I’d still keep a copy of them on a USB drive in their original resolution just in case – in case your account gets hacked, perhaps, or you delete some by accident.

For me the backup is the boring but necessary bit. The real power of Google Photos is that it gives me access to all my photos anytime (that I have an internet connection) from any device, in particular my storage-challenged phone – and the search is freakishly good.

It takes a few days for Google to analyse the photos, but when done you can search on location, date or things like cat, car, beach, etc. You can also find people by clicking on a mug shot that Google generates. The search identified a person in a full-face motorcycle helmet photographed from 20 feet away. For the article on ANZ in the newsletter I knew I’d taken a picture of the ANZ sign being added to Hotel Intercontinental (when they should have been spending the money on their IT systems) but I couldn’t find it on my phone. A search for ANZ found nothing, but the term “hotel intercontinental” found it second in the list in Google Photos.

No problem finding photos of the late and dearly loved Mr Cat either.

And it works just as well from my phone.

So what have the Romans – sorry, what has Google – ever done for us…?

  • Search
  • Gmail
  • Chromecast
  • Now Google Photos

What’s wrong with this picture?

On the ANZ Direct Online (DLO) business banking site we see ANZ are dropping support for Chrome…

Good on ya ANZ

Wiki tells us Chrome is the most popular browser…

The NZ Herald tells us ANZ ranks bottom in the 2015 NZ banking customer satisfaction survey…

And also in the Herald we see that the top-paid CEO in New Zealand is the ANZ CEO!

So I guess there is nothing wrong with this picture – it all hangs together nicely.

Come on ANZ, lift your game – this really isn’t good enough! Your Direct Online business banking website is dreadful (and you charge $29 a month to use it!!), you have no direct bank feeds to Xero for your credit cards, and your internet banking is down. Maybe pay your CEO a bit less and spend the savings on sorting out your IT systems.

If you could sort out your IT systems you might become our favourite bank.

Disclaimer: I have nothing personal against the ANZ – some of my best friends work there :-) – it is just that their Java-requiring Direct Online site is an insult to anyone who has to use it. If you want to see how to do it properly, check out BNZ’s site.

How good is Hyper-V 2012 R2 (free)?

Very good actually!

It isn’t often I get enthusiastic about Microsoft products these days, but Hyper-V 2012 R2 (free) is a Really Useful Engine.

We are an SME with a virtualised environment, and we use free virtualisation products – in our space I can’t justify spending money to virtualise a server. Besides, I’m an IT guy and I always try the free stuff first.

Between 2007 and 2009 we virtualised our office, running remote XP PCs and an SBS 2003 server under VMware Server 1.0, then 2.0, on Ubuntu 8.04 LTS on inexpensive low-end Dell tower servers. It was 100% reliable, performed reasonably well and didn’t take much effort to manage. Once a month I’d plug a USB drive into an Ubuntu server and run a script that snapshotted each VM, copied the VMDK files to USB, then removed the snapshots. At the same time I’d apply any outstanding Windows updates to the SBS server (with the ensuing painfully long installation and reboot) and the much faster Ubuntu updates that didn’t require a reboot. Once every six months or so I’d shut everything down and restart it – it seemed to need this, as performance slowly dropped off over time.

A couple of years ago I looked at changing from VMware Server 2.0 to ESXi 5.0. The ESXi install was easy and it worked fine in my test environment, but I found the free version without vCenter quite lacking – specifically for backing up VMs, importing and exporting VMs, and accessing USB drives from the service console. All these things I could do easily on VMware Server 2.0, but they looked hard on the free ESXi (although ghettoVCB looked like a good option for backups). In the end I put it all in the too-hard basket and moved on to other things.

This year I decided to get on with it, as I was worried that VMware Server 2.0 was getting very old and would bite me soon. ESXi 5.5 was out and Hyper-V 2012 R2 had recently been released, and there was a free version of it. On paper the free Hyper-V looked to have all the functionality of the paid version apart from the GUI. What really appealed was that the Hyper-V server would let me map drives, mount USB disks and export or back up VMs easily from the command line and via scheduled tasks – all those things that were easy with VMware Server 2.0 but not with ESXi.

So I decided it was worth a test drive. I wasn’t really looking forward to installing a Windows server, but Hyper-V couldn’t have been easier: download the ISO, burn to DVD, boot, answer a few questions, and five minutes later I had my Hyper-V 2012 R2 (very free) server running. The only problem was no GUI management tools. While PowerShell lets you do everything from the command line, being new to the Hyper-V world I really wanted to start with a GUI tool and graduate to the command line once I knew it all worked.

To manage it with the native GUI tools I needed either a Windows 8.1 PC to run the RSAT tools or a free GUI tool installed on the Hyper-V server. I tried out a few free tools; the most promising was ProHVM (free for personal use). With it I was able to create, start and connect to VMs, but when I tried more complicated tasks, such as importing VMs in place, it had problems. So reluctantly I decided to upgrade my Windows 7 PC to 8.1 (it came with an upgrade license included).

Hyper-V installs in five minutes; not so Windows 8.1. First it was an upgrade to Windows 8, as my license was for Windows 8 not 8.1, then the free 8.1 upgrade from the Windows Store. I got there in the end – it wasn’t a pleasant experience, but I learnt a lot and one day I may write it up.

The next challenge was connecting to the workgroup Hyper-V server from my domain PC. I really wanted to keep the Hyper-V server standalone and not dependent on a domain controller that would (all things going well) be a guest VM on that server. Well, it is doable, and I got it working using the instructions here, but in the end it was still a bit hard, and whenever something didn’t work as expected I was unsure if it was caused by the lack of domain membership. So I joined the server to the domain and this made things much easier.
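If you do want to stay workgroup-only, the heart of it (as far as I recall) is trusting the server for remote management on the client side. Something like this on the management PC, run as administrator, with the host name as a placeholder:

    # Allow WinRM connections to the workgroup Hyper-V host ("HV01" is a placeholder)
    Set-Item WSMan:\localhost\Client\TrustedHosts -Value "HV01" -Concatenate -Force

    # Cache credentials for the host so Hyper-V Manager and the RSAT tools can authenticate
    cmdkey /add:HV01 /user:HV01\Administrator /pass:YourPasswordHere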

I used StarWind V2V Converter to turn the VMware VMDK file for a Windows XP test VM into a VHD file, then created a VM in Hyper-V and attached the disk. After a couple of reboots, an install of the Hyper-V client integration tools, and a reactivation of XP, the VM was working perfectly – performance excellent.
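Creating the VM for a converted disk is only a couple of lines of PowerShell – roughly this, with the name, memory size and path as placeholder values:

    # Create a generation 1 VM and attach the converted disk (name/path are examples)
    New-VM -Name "XP-test" -MemoryStartupBytes 1GB -VHDPath "D:\Hyper-V\xp-test.vhd"
    Start-VM -Name "XP-test"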

Next up, an Ubuntu 12.04 VM – no problems, it just worked. Once booted I enabled the inbuilt support for Hyper-V. The only thing I didn’t like was that very little info was output to the console on boot for the Linux VM. After a bit of searching and trial and error I found that editing /etc/default/grub to uncomment GRUB_TERMINAL=console, then running update-grub, resulted in a much more informative boot process.

The last VM to come across to my test system was the SBS 2003 server, all 120GB of it, using the same process as for the XP VM. It worked fine.

Time for some fun – I built a second Hyper-V server so I could test some of the more interesting features such as live share-nothing migrations, replication, and importing from another server. (Interestingly, the i3 PCs we use won’t run Hyper-V VMs under Windows 8.1 as they lack SLAT, but they will if you install the free Hyper-V 2012 R2.)

Imports, exports, share-nothing migrations, replicas – it all just worked and was really easy. As I get older I appreciate easy stuff. I moved the XP VM between Hyper-V servers while logged onto it via RDP and didn’t notice a thing – the connection stayed up while the 120GB VHDX and the running state of the VM were moved. Very impressive, and all this for free.
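For the record, a share-nothing live migration is a one-liner once the hosts are set up for it – something like this, with the host name and destination path as placeholders:

    # Move the running VM, including its storage, to the second host
    Move-VM -Name "XP" -DestinationHost "HV02" -IncludeStorage `
            -DestinationStoragePath "D:\Hyper-V\XP"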

That was enough for me – the next weekend I rebuilt our VMware Server 2.0 hosts as Hyper-V 2012 R2 servers and converted the VMs. The only hitch in the entire process was that I could no longer connect to the VMware servers using the VI Client (probably because I was on Windows 8.1, too new for the old VI Client I had), so I had to use a mix of the web interface and command line tools to start and stop the VMware VMs.

Having the domain controller as a guest on the Hyper-V server doesn’t seem to be a problem – the server caches credentials, so I can log on and do stuff even if the DC is down.

Backups – we have a belt-and-braces approach. We back up our file share within the SBS VM to IDrive and also sync it to Google Drive using Insync. At the VM level the exports are so fast and non-intrusive that I run a PowerShell scheduled task each night to export all VMs to disk, then copy them to a USB drive. We keep a week’s worth on USB and rotate the USB drives.
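The nightly task boils down to a handful of lines – a sketch, with the paths, drive letter and retention period as placeholder choices:

    # Export every VM to a dated folder, copy it to the USB drive, prune old copies
    $stamp  = Get-Date -Format "yyyy-MM-dd"
    $target = "D:\vm-exports\$stamp"          # local staging area (placeholder path)

    New-Item -ItemType Directory -Path $target -Force | Out-Null
    Get-VM | Export-VM -Path $target          # each VM lands in its own subfolder

    Copy-Item -Path $target -Destination "E:\vm-exports\" -Recurse   # E: = USB drive

    # keep a week's worth on the USB drive
    Get-ChildItem "E:\vm-exports" |
      Where-Object { $_.CreationTime -lt (Get-Date).AddDays(-7) } |
      Remove-Item -Recurse -Force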

All up a good result – very easy to manage, performance excellent (better than under the old VMware; not sure how it would compare to ESXi), and easy to back up.

Next up – do something with the SBS 2003 server VM, as it is end of life. Much as I would like to replace it with a Samba 4 server, the sensible choice looks to be Server 2012 R2 Essentials as a VM.

Update: I did it. SBS now replaced with 2012 R2 Essentials.
