Build with Watson

Last night I played around with the IBM Watson Internet of Things and Bluemix cloud services platforms using their developer tools. IBM changed its developer pricing policy so that free Lite accounts no longer expire, which is great for unlimited prototyping. I experimented with training Watson Conversation dialogs and Natural Language Understanding using their visual tooling and the GitHub sample code that is part of their machine learning development platform, and later ran an IoT simulator using a sample Node.js app uploaded to their cloud with Cloud Foundry as part of their IoT development platform. A feature I liked but haven't played with yet is IBM recipes for IoT using Node-RED, a flow-based programming tool that was originally created by IBM engineers but is now part of the JS Foundation. I plan to run a real prototype with my Raspberry Pis and Arduinos soon, and potentially extend it with Watson for an office prototype project.

Check out the IBM Watson developer program and IBM Watson for IoT.

Google AIY

Setting up Google Voice AIY was an easy challenge, but getting Google Vision AIY working took a toll on me: successfully soldering the smaller Raspberry Pi Zero W GPIO pins. I messed up two Pis along the way, but the improvement to my soldering skills outweighs the cost of two $5 Pi Zeros (excluding the gas and time lost on multiple trips to the store). I proudly think I have perfected the art of soldering (personal bias). Nevertheless, the multifaceted benefits of Raspberry Pis, along with Arduinos, for the Internet of Things multiplied by artificial intelligence are amazing. A couple of years back I won a hackathon thanks to a RasPi idea; now I have one for retro games and, more recently, a couple for AI proof-of-concept projects. If you have never used a Raspberry Pi, go get one! Any one!

More details about Google AIY are at


Amazon Alexa Getting Started

Alexa development

#### Getting started with the Alexa service

Code to install the sample Alexa app

Code to train Alexa with a new skill

Skills development

  • Develop a custom skill using AWS Lambda or a web service with HTTPS
  • A full device is needed for complete testing, but you can use the Service Simulator for basic testing

#### Custom Skill

A custom skill consists of:

  • A set of intents that represent actions users can perform with your skill
  • A set of sample utterances that map to the intents and create the interaction model
  • An invocation name that identifies the skill and initiates the conversation
  • A cloud-based service that accepts the intents and is accessible via the internet; an endpoint needs to be provided for the skill
  • A configuration that ties all the information above together so Alexa can route requests

Example:

User: "Alexa, get high tide for Seattle from Tide Pooler"

"Get high tide" comes from the sample utterances, and the invocation name is "Tide Pooler". Sample utterances include:

OneshotTideIntent get high tide
OneshotTideIntent get high tide for {City}
OneshotTideIntent tide information for {City}
OneshotTideIntent when is high tide in {City}
… (many more sample utterances)

To deploy the skill:

  • Create a Lambda Function for a Skill
  • Deploying a Sample Custom Skill to AWS Lambda
  • Hosting a Custom Skill as a Web Service
  • Deploying a Sample Custom Skill as a Web Service

#### Steps to Build a Custom Skill

Step 1: Design the voice user interface
Step 2: Set up the skill
Step 3: Write and test the code
Step 4: Submit the skill

Defining the Voice Interface

There are two main inputs:

  • Intent schema: JSON structure for the set of intents
  • Spoken input data: sample utterances and the custom values needed for custom slots
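As an illustration, the Tide Pooler interaction model sketched earlier might use an intent schema like the following. This is a sketch in the classic Alexa Skills Kit intent-schema format; the choice of the built-in AMAZON.US_CITY slot type for the {City} slot is an assumption here:

```json
{
  "intents": [
    {
      "intent": "OneshotTideIntent",
      "slots": [
        {
          "name": "City",
          "type": "AMAZON.US_CITY"
        }
      ]
    },
    {
      "intent": "AMAZON.HelpIntent"
    }
  ]
}
```

Each entry maps an intent name to the slots it can carry; the sample utterances then reference those slot names in curly braces.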

Custom intent development:

Integrating with AWS Lambda

Developing using Node.js