Alexa on Rails – how to develop and test Alexa skills using Rails


Introduction

Alexa is awesome and I think that conversational software is the future. This post documents the technical learning challenge I set myself:

  • Host the skill locally, to allow a fast development feedback cycle prior to pushing code.
  • Find a way to automate tests (unit, functional and end-to-end), since most demos rely on manual testing.
  • Use something other than JavaScript, which most demos use.
  • Write an Alexa skill that's backed by a data store.
  • Handle conversations.

The way Alexa services interact with apps is the following:

User->Echo: “Alexa, …”
Note right of Echo: Wakes on ‘Alexa’
Echo->Amazon: Streams data spoken
Amazon->Rails: OfficeIntent
Rails->SkillsController: POST
SkillsController->Amazon: reply (text)
Amazon->Echo: reply (voice)
Echo->User: Speaks

The skill

The skill is a data retrieval one, giving information about the company’s offices and the workers there.

Alexa, Rails, git, ngrok and an Amazon account

I bought a dot and set up an Amazon account to register the skill on.

Install Rails and git for your OS. You'll also need a data store; the sqlite or mysql gems are the easiest options.

ngrok is a nifty tool that will tunnel Alexa calls into our local server.

Get the code

Fork or clone the repo for a head-start, or read along taking only pieces you need from this post.

Set up the app

  • Setting some environment variables

The database connection uses the following environment variables:

export ALEXA_DB_USERNAME=
export ALEXA_DB_PASSWORD=
  • Setting up the database
bundle
rake db:create db:migrate db:seed spec

This will create and set up the database tables, seed the development tables, and run the unit and integration tests.

  • Running tests
rake

This runs all tests except the audio tests, which I'll describe below. Make sure all tests pass.

Connecting to the real thing

When a user invokes your skill, Amazon routes the request to the endpoint listed on the Alexa site. In order for this to work, you must first configure the skill there. It's straightforward, but the configuration must be entered manually on the skill's configuration page on Amazon's site.

Intent schema

This is where you define the intents the user can express to your skill. I think of ‘intents’ as the skill’s ‘methods’, if you think of the skill as an object.
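To make this concrete, here is roughly what this skill's intent schema could look like, expressed as a Ruby hash purely for illustration (Amazon's console expects the equivalent JSON). The intent names match those handled in the controller code later in this post; the custom slot type names are assumptions.

# Illustrative only: the real schema lives on the skill's configuration page.
INTENT_SCHEMA = {
  intents: [
    { intent: 'ListOffice' },
    { intent: 'OfficeWorkers',
      slots: [{ name: 'Staff',  type: 'LIST_OF_STAFF' },
              { name: 'Office', type: 'LIST_OF_OFFICES' }] },
    { intent: 'OfficeQuery',
      slots: [{ name: 'Office', type: 'LIST_OF_OFFICES' }] },
    { intent: 'Bookit',
      slots: [{ name: 'StartDate', type: 'AMAZON.DATE' },
              { name: 'EndDate',   type: 'AMAZON.DATE' }] },
    { intent: 'AMAZON.StopIntent' }
  ]
}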

Utterances

Utterances are the phrase variations that map to an intent. For example:

Bookit for vacant rooms between {StartDate} and {EndDate}
OfficeWorkers who the {Staff} from {Office} are

Slot types

These are the slot types for our skill; they define the possible values (and synonyms) for our slots, which are the intents' parameters. If you think this is complex, please remember that I am only the messenger here…

slots
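For this skill, the custom slot types could look something like the following, again expressed as a Ruby hash purely for illustration (the office names appear in the tests later in this post; the staff values are made up):

# Illustrative slot types; the real ones live on the skill's configuration page.
CUSTOM_SLOT_TYPES = {
  'LIST_OF_OFFICES' => ['London', 'Tel Aviv', 'NY'],
  'LIST_OF_STAFF'   => ['developers', 'designers', 'everyone']
}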

Now that you have configured the skill's interfaces, we need to route communications from Amazon to the local server running Rails as we develop and debug. This is easily done using ngrok, explained below.

ngrok

ngrok is a service, with a free tier, that redirects traffic from outside your home/office's firewall into your network. Once configured, it will route traffic from Amazon to our http://localhost:3000, which is essential for the fast development cycle we're after.

Run it using:

ngrok http -hostname=endpoint.ngrok.io 3000

Your configuration may vary, depending on whether you are a paying customer or not, so change ‘endpoint’ accordingly.

You’ll see something like this once you run it:

ngrok console output

Add your endpoint to Amazon’s skill page under configuration:

endpoint

Generating a certificate

Once you’ve settled on the endpoint URL, you’ll need to create or reuse a certificate for Amazon to use when communicating with your server process.

openssl genrsa 2048 > private-key.pem
openssl req -new -key private-key.pem -out csr.pem
openssl req -new -x509 -days 365 -key private-key.pem -config cert.cnf -out certificate.pem

Copy the contents of ‘certificate.pem’ to the skill’s page on Amazon:

cert

Toggle the test switch to ‘on’, otherwise Amazon will think you’re trying to publish the skill on their Skills store:

testing

Last but not least, enable the skill on your iPhone or Android by launching the Alexa app and verifying that the skill appears in the ‘Your Skills’ tab.

Amazon recap

To recap, on Amazon’s site we:

  • Uploaded the interaction model: the ‘intent schema’, ‘custom slot types’ and ‘sample utterances’
  • Configured the endpoint
  • Uploaded the SSL certificate
  • Enabled the test flag
  • Verified that the skill is enabled, using the Alexa app on your mobile device

The moment we’ve been waiting for

Run your rails app:

rails s

Run ngrok in another terminal window:

ngrok http -hostname=alexa01.ngrok.io 3000

Say something to Alexa:

Alexa, tell Buildit to list the offices

If all goes well, you should:

  • See the request being logged in the ngrok terminal (telling you that Amazon connected and passed the request to it)
  • See that the rails controller got the request by looking at the logs
  • Hear the response from your Alexa device

If there was a problem at this stage, please contact me so I can improve the instructions.

Code walkthrough

Route to a single skills controller:

 Rails.application.routes.draw do
   # Amazon comes in with a post request
   post '/' => 'skills#root', :as => :root
 end

Set up that controller:

class SkillsController < ApplicationController
  skip_before_action :verify_authenticity_token

  def root
    case params['request']['type']
      when 'LaunchRequest'
        response = LaunchRequest.new.respond
      when 'IntentRequest'
        response = IntentRequest.new.respond(params['request']['intent'])
    end
    render json: response
  end
end

Handle the requests:

def respond intent_request
  intent_name = intent_request['name']

  Rails.logger.debug { "IntentRequest: #{intent_request.to_json}" }

  case intent_name
    when 'ListOffice'
      speech = prepare_list_office_request
    when 'OfficeWorkers'
      speech = prepare_office_workers_request(intent_request)
    when 'OfficeQuery'
      speech = prepare_office_query_request(intent_request)
    when 'Bookit'
      speech = prepare_bookit_request(intent_request)
    when 'AMAZON.StopIntent'
      speech = 'Peace, out.'
    else
      speech = 'I am going to ignore that.'
  end

  output = AlexaRubykit::Response.new
  output.add_speech(speech)
  output.build_response(true)
end

Test walkthrough

Unit tests

These are really fast: they don't touch any Alexa or controller code, just making sure that the methods create the correct responses:

 

require 'rails_helper'

RSpec.describe 'Office' do
  before :all do
    @intent_request = IntentRequest.new
  end
  describe 'Intents' do
    it 'handles no offices' do
      expect(@intent_request.handle_list_office_request([])).to match /We don't have any offices/
    end

    it 'handles a single office' do
      expect(@intent_request.handle_list_office_request(['NY'])).to match /NY is the only office./
    end

    it 'handles multiple offices' do
      expect(@intent_request.handle_list_office_request(['NY', 'London'])).to match /Our offices are in NY, and last but not least is the office in London./
    end
  end
end
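For reference, the method under test could be implemented along these lines; this is a sketch inferred from the expected responses above, and the actual code in the repo may differ:

class IntentRequest
  # Turns a list of office names into a spoken sentence.
  def handle_list_office_request(names)
    case names.length
    when 0
      "We don't have any offices."
    when 1
      "#{names.first} is the only office."
    else
      "Our offices are in #{names[0..-2].join(', ')}, " \
      "and last but not least is the office in #{names.last}."
    end
  end
end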

Integration tests

Mocking out Alexa calls, these tests ensure that the JSON coming in and going out is correct:

describe 'Intents' do
  describe 'Office IntentRequest' do
    it 'reports no offices' do
      request = JSON.parse(File.read('spec/fixtures/list_offices.json'))
      post :root, params: request, format: :json
      expect(response.body).to match /We don't have any offices/
    end

    it 'reports a single office' do
      request = JSON.parse(File.read('spec/fixtures/list_offices.json'))
      Office.create name:'London'
      post :root, params: request, format: :json
      expect(response.body).to match /London is the only office/
    end

    it 'reports multiple offices' do
      request = JSON.parse(File.read('spec/fixtures/list_offices.json'))
      Office.create [{name: 'London'}, {name: 'Tel Aviv'}]
      post :root, params: request, format: :json
      expect(response.body).to match /Our offices are in London, and last but not least is the office in Tel Aviv./
    end
  end
end
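The fixture only needs the fields the controller actually reads. Here is a minimal sketch of its shape, shown as the equivalent Ruby hash (a real Alexa request carries more, such as version and session):

# Ruby-hash equivalent of spec/fixtures/list_offices.json; only the fields the
# controller reads are required for these tests.
LIST_OFFICES_FIXTURE = {
  'request' => {
    'type'   => 'IntentRequest',
    'intent' => { 'name' => 'ListOffice', 'slots' => {} }
  }
}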

Audio tests

I was keen on finding a way to simulate what would otherwise be an end-to-end user-acceptance test, like a Selenium session for a web-based app.

The audio test I came up with has the following flow:

describe 'audio tests', :audio do
  it 'responds to ListOffice intent' do
    london = 'London'
    aviv = 'Tel Aviv'

    Office.create [{ name: london }, { name: aviv }]

    pid = play_audio 'spec/fixtures/list-office.m4a'

    client, data = start_server

    post :root, params: JSON.parse(data), format: :json
    result = (response.body =~ /(?=#{london})(?=.*#{aviv})/) > 0

    reply client, 'The list offices intent test ' + (result ? 'passed' : 'failed')
    expect(result).to be true
  end

end

Office.create: creates some offices.
play_audio: plays an audio file that asks Alexa to list the offices.
start_server: starts an HTTP server listening on port 80. Make sure that Rails is not running, but keep ngrok up to direct traffic to the test.
post :root: directs the intent request from Alexa to the controller.
response.body =~: makes sure that both office names are present in the response.
reply: replaces the response that would have been sent back to Alexa with a curt message about whether the test passed.
expect(result): relays the test status back to RSpec for auditing.
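The repo provides the play_audio, start_server and reply helpers used above. Here is a rough sketch of what such helpers could look like, assuming macOS's afplay for playback and a bare TCPServer for the single forwarded request; the actual implementations in the repo may differ:

require 'json'
require 'socket'

# Plays the pre-recorded prompt through the speakers so the Echo can hear it.
def play_audio(path)
  spawn('afplay', path)
end

# Accepts the single HTTP POST that ngrok forwards from Amazon and returns the
# client socket plus the raw JSON body. Listening on port 80 needs privileges.
def start_server(port = 80)
  server = TCPServer.new(port)
  client = server.accept
  client.gets # discard the request line
  headers = {}
  while (line = client.gets) && line != "\r\n"
    key, value = line.split(': ', 2)
    headers[key.downcase] = value.to_s.strip
  end
  [client, client.read(headers['content-length'].to_i)]
end

# Replaces the normal skill response with a short pass/fail message for Alexa to speak.
def reply(client, text)
  body = { version: '1.0',
           response: { outputSpeech: { type: 'PlainText', text: text } } }.to_json
  client.print "HTTP/1.1 200 OK\r\nContent-Type: application/json\r\n" \
               "Content-Length: #{body.bytesize}\r\n\r\n#{body}"
  client.close
end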

This is as close as I got to an end-to-end test (audio and controller). Please let me know if you have other ways of achieving the same!

Conclusion

What was technically done here?

  • We registered an Alexa skill
  • We have a mechanism to direct traffic to our server
  • We have a mechanism to unit-test, integration-test and acceptance-test our skill
  • We have a mechanism that allows for a fast development cycle, running the skill locally till we’re ready to deploy it publicly.

My main learning, however, was not a technical one (despite my thinking that the audio test is nifty!). Being an advocate for TDD and BDD, I now realise there's a new way of thinking about intents, whether the app is voice-enabled or not.

We might call it CDD: Conversation Driven Development.

The classic “As a…”, “I want to…”, “So that…” manner of describing intent seems so static compared to imagining a conversation with your product, whether it’s voice-enabled or not. In our case, try to imagine what a conversation with an office application would be like.

“Alexa, walk me through onboarding”. Through booking time, booking conference rooms, asking where office-mates are, what everyone is working on etc.

If the app happens to be a voice-enabled one, just make audio recordings of the prompts, and employ TDD using them. If it’s a classic app, use those conversations to create BDD scripts to help you implement the intents.

 

NRF51 full-screen debugging


NetBeans Debugging

Introduction

This post is a quick tutorial on how to set up a GUI for debugging NRF51 code.

Currently, CLion does not support remote debugging, but they promised to consider it if enough votes were collected – so vote here! I am sure it’s going to be seamless once they implement it, so I’m eagerly awaiting it.

Modifying Xcode to use ARM tools is a big challenge, and the supporting documentation seems to stop at Xcode 6. I intend to try writing a plugin to support toolchain switching as well as code templates, so watch this space…

I also tried configuring CodeLite, as there were rumours that it was possible. Unfortunately it never worked for me (user error, I assume), but I intend to pursue this as I’d like to support this open source effort.

Conscious of my mental well-being, I am avoiding Eclipse and will not refer to it again in this post.

So which shall we use? NetBeans. While neither the most modern nor the most flexible of the above-mentioned IDEs, it actually manages to debug cross-compiled code running on a remote device.

Here’s how

Step 1: Download the C/C++ enabled version of NetBeans from here.


Step 2: Create a new project


Follow the wizard’s path and select to open your project’s root directory.

Step 3: Create a new configuration for the ARM toolchain


Step 4: Set up the toolchain

Enter the path of your ARM toolchain, usually in /usr/local, and fill in the rest of the form.


You can access this form in the future by clicking on the “Services” tab on the left hand side of the project explorer


Then right click on the configuration name and select “Properties”


This will bring up a window similar to the one shown the first time you configured the toolchain.


At this stage, you’ve set up a configuration to use the ARM toolchain.

Step 5: Connect the config to our project.

This is done by selecting “Set Project Configuration” from the Run menu


Step 6: Verify that the project is set to use the ARM config.

This is done by selecting “File/Project Properties” and making sure that “Tool Collection” is set to “ARM”


Step 7: Set up debugging session parameters


You should now be ready!

Let’s build the application by using your makefile


To debug, we must first launch the “jlinkgdbserver” executable, as described in my previous blog.
Unfortunately, I have not found a way to do this automatically from within NetBeans, as it does not appear to have a pre-debugging hook from which we could run the server. If anyone knows of a way to do so, please alert me and I will update this post.

Open a terminal window and run the following command:

jlinkgdbserver -device nrf51822 -if swd -speed 4000 -noir -port 2331

The result should look something like this:


We can now start our debugger session by selecting “Debug/Attach Debugger” in NetBeans


This will open a dialog that you should fill in as shown here


If everything went well, you should be able to see the code in NetBeans and use its debugger fully!


I hope you found this tutorial helpful! See you next time, hopefully with a solution for CLion and XCode.
Happy hacking!

Some helpful links regarding the subject:

– ARM toolchain: GCC ARM Embedded in Launchpad

– Blog entry on setting up an NRF51 dev environment manually: Nordic NRF51 up and running | InContext, by Itamar Hassin

– Setting up an NRF51 dev environment on your mac: ihassin/fruitymesh-mac-osx · GitHub

– Setting up an NRF51 dev environment using Ansible (VirtualBox and Parallels): ihassin/fruitymesh-ubuntu-vm · GitHub

– Compiling an example using Make and CMake : ihassin/nrf51-blinky-cmake · GitHub

– FruityMesh example module: ihassin/fruitymesh-ping · GitHub

– FruityMesh example on official FruityMesh site: fruitymesh/Readme.md at master · mwaylabs/fruitymesh · GitHub

– Debugging NRF51 code using NetBeans GUI: NRF51 full-screen debugging | InContext, by Itamar Hassin

Nordic NRF51 and FruityMesh BLE Up and Running


 

Update:

There’s now also an Ansible script that runs locally if you want to use your Mac natively. Use this repo.

Enjoy!

 

A few things have been learned and implemented since the last post about the Nordic NRF51:

– I wrote an Ansible script to automate the provisioning and deployment of a complete development environment for the NRF51 using the FruityMesh framework. Please note that the environment is hosted on a headless Ubuntu, so you need some command-line fu.
The repo supports VirtualBox and Parallels running Ubuntu using Vagrant. I hope you find it a useful way to quickly start developing modules for BLE mesh experimentation, or simply to develop for the NRF51.

– I cloned the original and created this repo to exercise its mesh programming, specifically:
* Timer functions
* RSSI values
* GPIO programming

The implementation demonstrates an RGB LED that changes colour as its paired NRFs move about and their relative signal strengths change.

I hope you find these two artefacts useful, and as always, your comments are welcomed.

Some useful links:
– M-Way Labs FruityMesh implementation
– Helpers for development
– Mac OS/X setup (without FruityMesh support)

Happy hacking!

Nordic NRF51 up and running


Update:

If you want to know the insides of how to set up a development environment, read on!

If you want Ansible to do all the work for you, skip this post and check out my repos:
* For an Ubuntu VM, use this repo
* For using your Mac natively, use this repo.

Enjoy!

Introduction

There is not much documentation about the NRF51, and the tool-collection hunting and gathering process can be intimidating.
I hope this blog entry will help those that want to use and program the Nordic NRF51 development board to test out BLE functionality.

The hardware

We are using the NRF51 development board, which was purchased from here.

Basic operations

Connecting to the board

Connection is done by connecting a standard micro USB cable to your host computer. Once power is supplied to the board, it will run the current program automatically.

Communicating with the board

Flashing the device can be done using the JLinkExe program running on the host computer. JLinkExe can be downloaded from here.

Resetting the board to manufacturer settings

From a terminal window, as the device is connected and turned on, run the following command line:

prompt> JLinkExe -device nrf51822

When the JLink prompt appears, type the following:

J-Link> w4 4001e504 2 
J-Link> w4 4001e50c 1 
J-Link> r 
J-Link> g 
J-Link> exit 

This will erase all the programs that were loaded.

Programming the device

In order to program the device, you must first set up the following tools:

The Nordic SDK

The SDK can be downloaded from the Nordic website here. For our testing, we used nRF51_SDK_v9.0.0. The SDK contains a binary referred to as “SoftDevice” that supports BLE management of the chip. Please see below on how to load the SoftDevice to the board using JLink.

Compiler and Linker toolchain from GNU

The cross-compiler/linker tools that are needed to build executables for the board can be found here. We placed them under ‘/usr/local’. If you have multiple development environments, it may be easier to set an alias to run the right tools rather than modifying the path. For example:

alias gdb51="/usr/local/gcc-arm-none-eabi-4_9-2015q2/bin/arm-none-eabi-gdb"
alias jdb51="jlinkgdbserver -device nrf51822 -if swd -speed 4000 -noir -port 2331"

Loading a binary to the device

An executable image is created in the form of “.HEX” files that have to be loaded into the board’s flash memory. To load one to the device, open a terminal window and run JLinkExe, this time using the loadfile command:

prompt> JLinkExe -device nrf51822 
J-Link> loadfile path-to-binary
J-Link> r  
J-Link> g  
J-Link> exit 

When you program BLE functionality, you will need to load the chip’s firmware in order to support your programs. This is packaged as an executable and is part of the SDK. In order to load the SoftDevice, simply use the loadfile command with the correct path, such as:

J-Link> loadfile SDK_ROOT/components/softdevice/s110/hex/s110_softdevice.hex 
J-Link> r
J-Link> g
J-Link> exit

Select a different path if you want to change the version loaded (in this example, it’s S110).

Checkpoint

At this stage, you should have a connected board that has a version of the firmware loaded, and the toolchain downloaded, ready for development to begin!

BLE is hard, but blinking the board is easy

Using the toolchain, let’s compile and load the demo blink program that comes with the SDK, to make sure we have everything in place for future development.

Making Make make

Here you’ll edit the file named Makefile.posix to point to the correct toolchain for cross-development. The file is found at SDK_ROOT/components/toolchain/gcc/Makefile.posix, where SDK_ROOT is the location where you installed the Nordic SDK files.
Edit this file so it contains the path to where you installed the cross-compiler:

GNU_INSTALL_ROOT := /usr/local/gcc-arm-none-eabi-4_9-2015q2
GNU_VERSION := 4.9.3
GNU_PREFIX := arm-none-eabi

Building the blink example

Navigate to the “blink” example directory

cd SDK_ROOT/examples/peripheral/blinky

Depending on your board (the one we used was PCA10028), you might need to create a subdirectory within “blinky” by copying the one present, if your model number does not appear there:

cp -r pcaXXXXX pca10028

Edit the Makefile in the PCA10028/armgcc directory to reference BOARD_PCA10028 (via the -DBOARD_PCA10028 compiler define), if it’s not already referenced there.

The path to the makefile is: SDK_ROOT/examples/peripheral/blinky/pca10028/s110/armgcc/Makefile.

Once you have saved the modification, return to the terminal window and, from the directory where the makefile is located, invoke make to build the image:

prompt> make


Even though the LED program does not need BLE functionality, let’s load the S110 firmware prior to loading our image for illustrative purposes:

prompt> JLinkExe -device nrf51822
JLink> loadfile SDK_ROOT/components/softdevice/s110/hex/s110_softdevice.hex

And now we’ll load our blink example

JLink> loadfile _build/nrf51422_xxac.hex
JLink> r 
JLink> g

You should now see the board’s four LEDs blinking at a nice rhythm.

Debugging

Download the jlinkgdbserver debugger from here. When run, it will connect to the board via the serial cable, and wait for commands coming from the GNU debugger, which was included in the GCC download described previously.

To build with debug symbols, invoke make with the debug goal:

prompt> make clean
prompt> make debug

Run the debugger server in a terminal window or tab:

prompt> jlinkgdbserver -device nrf51822 -if swd -speed 4000 -noir -port 2331

Open another terminal window and, from the armgcc subdirectory, run the debugger on your image so that it can load the debug symbols created when building the application:

prompt> gdb51 program-name.out
(gdb) target remote localhost:2331
(gdb) gdb-command-here

This runs the debugger, which loads the debug symbols and relays instructions to the JLink server, which in turn relays them to the board.

Summary

We made sure that the hardware and the development environment were set up correctly for future application development. In order to take advantage of the hardware’s capabilities, please refer to the documentation of the board and firmware here, which contains essential links to the BLE functionality as well as a demo mesh project.

Acknowledgements

I’d like to thank Tim Kadom, my friend and colleague at ThoughtWorks, who sparked my interest by introducing me to BLE and mesh applications and was instrumental in helping me set up the environment and getting everything to work.

Arduino programming using Ruby, Cucumber & rSpec


The project

This project serves as a sanity check that all is in order with the hardware, without the need to write on-board code using the IDE or to use the AVR toolchain. What better tool than Ruby to do so?

The first thing we’ll do is make sure that the board and its built-in LED are responsive. Let’s define the behaviour we would like and implement it using Cucumber, in true BDD fashion:

Feature:
  Assure board led is responsive

  Background:
    Given the board is connected

  Scenario: Turn led on
    When I issue the led "On" command
    Then the led is "On"

  Scenario: Turn led off
    When I issue the led "Off" command
    Then the led is "Off"

The step implementation follows:

require 'driver'

Given(/^the board is connected$/) do
  @driver ||= Driver.new
end

When(/^I issue the led "([^"]*)" command$/) do |command|
  value = string_to_val command
  expect(@driver.set_led_state value).to be value
end

Then(/^the led is "([^"]*)"$/) do |state|
  expect(@driver.get_led_state).to eq string_to_val state
end

def string_to_val state
  case state.downcase
    when 'on'
      my_state = ON
    when 'off'
      my_state = OFF
  end
end

Some things to note:

  • We don’t have an assertion on @driver ||= Driver.new because the driver will simulate a connection in case the physical board is disconnected or unavailable due to disrupted communications.
  • The user communicates using the words “on” and “off”, which are translated to ON and OFF for internal use.

This test will fail, of course, as we have yet to define the Driver class, so we drop to rSpec, in TDD fashion:

require 'driver'

describe "led functions" do
  before(:each) do
    @driver = Driver.new
  end

  it "turns the led on" do
    expect(@driver.set_led_state ON).to eq ON
  end

  it "turns the led off" do
    expect(@driver.set_led_state OFF).to eq OFF
  end

  it "blinks" do
    @driver.blink 3
  end
end

This too fails, of course, and we implement Driver thus:

require 'arduino_firmata'

class Driver
  # Connect to the board; if that fails, fall back to simulation mode.
  def initialize
    @arduino ||= ArduinoFirmata.connect nil, :bps => 57600
  rescue Exception => ex
    puts "Simulating. #{ex.message}" if @arduino.nil?
  end

  def set_led_state state
    @arduino.digital_write(LED_PIN, state)
  rescue Exception => ex
    # No board available: remember the requested state and pretend it worked.
    @state = state
    state
  end

  def get_led_state
    @arduino.output_digital_read(LED_PIN)
  rescue Exception => ex
    @state
  end

  def blink num
    (0..num).each do
      set_led_state ON
      sleep 0.5
      set_led_state OFF
      sleep 0.5
    end
  end
end

 

Some things to note:

  • I am using the arduino_firmata gem, please see the Gemfile for details.
  • The initialize method catches the exception thrown when the Arduino is not connected, as the other methods do, in order to simulate the board in such circumstances. The simulation always succeeds, by the way, and was coded to allow development without the board connected.
  • arduino.output_digital_read is a monkey-patch to the gem, as I could not find a way to ask the board whether an output pin was on or off:
module ArduinoFirmata
  class Arduino
    def output_digital_read(pin)
      raise ArgumentError, "invalid pin number (#{pin})" if pin.class != Fixnum or pin < 0
      (@digital_output_data[pin >> 3] >> (pin & 0x07)) & 0x01 > 0 ? ON : OFF
    end
  end
end
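One thing the snippets above take for granted is the LED_PIN, ON and OFF constants. Here is a minimal sketch of how they might be defined (pin 13 is the Uno's built-in LED; the repo's actual definitions may differ):

# Assumed constants used by the driver and specs; values are illustrative.
LED_PIN = 13
ON  = true
OFF = false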

All green

Having implemented the code, the tests should now pass and running rake again will run both Cucumber and rSpec, yielding:

~/Documents/projects/arduino (master)$ rake
/Users/ThoughtWorks/.rvm/rubies/ruby-2.2.1/bin/ruby -I/Users/ThoughtWorks/.rvm/gems/ruby-2.2.1/gems/rspec-support-3.3.0/lib:/Users/ThoughtWorks/.rvm/gems/ruby-2.2.1/gems/rspec-core-3.3.1/lib /Users/ThoughtWorks/.rvm/gems/ruby-2.2.1/gems/rspec-core-3.3.1/exe/rspec --pattern spec/\*\*\{,/\*/\*\*\}/\*_spec.rb
...

Finished in 7.56 seconds (files took 0.27749 seconds to load)
3 examples, 0 failures

/Users/ThoughtWorks/.rvm/rubies/ruby-2.2.1/bin/ruby -S bundle exec cucumber 
Feature: 
  Assure board led is responsive

  Background:                    # features/initial.feature:4
    Given the board is connected # features/step_definitions/initial_steps.rb:3

  Scenario: Turn led on               # features/initial.feature:7
    When I issue the led "On" command # features/step_definitions/initial_steps.rb:7
    Then the led is "On"              # features/step_definitions/initial_steps.rb:12

  Scenario: Turn led off               # features/initial.feature:11
    When I issue the led "Off" command # features/step_definitions/initial_steps.rb:7
    Then the led is "Off"              # features/step_definitions/initial_steps.rb:12

2 scenarios (2 passed)
6 steps (6 passed)
0m4.579s
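For reference, a Rakefile along the following lines (a sketch; the repo's actual one may differ) is enough to wire both suites into the default task:

require 'rspec/core/rake_task'
require 'cucumber/rake/task'

RSpec::Core::RakeTask.new(:spec)
Cucumber::Rake::Task.new(:features)

# Running plain `rake` runs the specs first, then the Cucumber features.
task default: [:spec, :features]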

 

Make this better!

The project is here. Please feel free to fork and contribute.

Conclusion

How much is “good enough”? Notice that the assertions are implemented using the data structure exposed by arduino_firmata, not with a call to the board itself. This is always a tradeoff in testing. How far should we go? For this project, testing via the data structure is “good enough”. For a medical application, or something that flies a plane, it’s obviously not good enough, and we would have to assert on an electric current flowing to the LED. And again, who is to assure us that the LED is actually emitting light?

There’s not much else we can do with a standalone Arduino without any peripherals connected, but it’s enough to make sure that everything is set up correctly for future development.

Disclaimer

This installment was meant to show a quick-and-dirty sanity check without bothering to flash the device.

Afterword

The testing and writing of this installment were done while flying to Barcelona, in the hope that fellow passengers would not freak out at the sight of wires and blinking lights mid-flight.

Happy Arduinoing!

Infrastructure as code using Vagrant, Ansible, Cucumber and ServerSpec


Designing and developing VMs as code is at last mainstream. This post is in fact a presentation I give to highlight that we can treat infrastructure code just as we would regular code.

We use TDD/BDD and monitors to spec, implement, test and monitor the resulting VM, keeping its code close to the app’s code and as an integral part of it.
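To give a flavour of the testing side, a ServerSpec file like the one below (the resources named here are purely illustrative) asserts against the provisioned VM in the same way RSpec asserts against application code:

require 'serverspec'

set :backend, :exec  # use :ssh when testing a remote VM

describe package('nginx') do
  it { should be_installed }
end

describe service('nginx') do
  it { should be_enabled }
  it { should be_running }
end

describe port(80) do
  it { should be_listening }
end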

infrastructure as code