Sunday, April 17, 2016

Phone controlled ArduinoBot - USB I/O

Note - as this is a work in progress, the GitHub code will keep changing as I write the blog. For ease of following, I will tag each revision when I publish a blog post, so it stays consistent with that post. This will also give you a glimpse into the evolution of the design, which can be helpful for getting a sense of the thought process if you're just getting started developing software.

All the code for this blog post can be found at https://github.com/nirsoffer/ArduinoBot/tree/v1.0

So for our robot, we will use the phone as a USB host, as opposed to a USB device. In USB terminology, a host is the side that sends commands to the devices; the Arduino is a device, and the PC you normally hook the Arduino up to is a host.

We will make the phone, essentially, act like a PC; it will control the Arduino through the serial interface.

This is a demo of what this stage ends up looking like - assuming you're already done assembling your robot, and, well, it works.


It doesn't look like much now - but believe me - this is the foundation of something wonderful.

Compile your Arduino sketch and upload it to the board, then debug your robot by hooking it up to a PC and sending the "f", "r", and "l" characters through the serial monitor to see if it works. Remember to switch on the external power; otherwise you'd be powering the motors from your PC's USB port, which is not a good thing and could, if you have a shoddily made PC, potentially fry your motherboard. I take no responsibility.

The next step is to make the phone a USB Host and talk to the Arduino device as a serial device:
To do so we will use UsbSerial, which can be found at https://github.com/felHR85/UsbSerial. UsbSerial in turn relies on the Android USB host classes and functionality (see the documentation here: http://developer.android.com/intl/es/guide/topics/connectivity/usb/host.html).

You will also need what's called a "USB On-the-Go adapter", commonly known as "USB OTG"; here's one that I found worked well in practice. Note that the USB OTG standard typically does not allow powering the host device through a USB OTG cable. I've heard of mixed success with cables such as these; I personally bought one and it did not power my S2, but your mileage with other devices may vary.

Let's start with the first version of both our Arduino code and the Android code to control it.

Arduino Side:

The first revision we will have is kind of dumb. It will use Serial.available() and Serial.read() to identify the opcode sent to it. At first it will only support a limited subset of the protocol I have in mind, to wit: "f" for going forward a preset number of milliseconds, "r" for right, and "l" for left.

This is to ensure you have the USB to serial interface done right. Otherwise debugging will be hell later.
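If it helps to see the shape of it, the Arduino side boils down to a switch on the incoming byte. Here is that dispatch logic sketched as a tiny stand-alone Java program so it can run anywhere (the names are mine; the real thing is an Arduino sketch that reads from Serial and drives the motor shield instead of returning strings):

```java
public class OpcodeDispatch {
    // Maps a single-character opcode to the action the sketch would take.
    // 'f' = forward for a preset number of milliseconds, 'r' = right, 'l' = left.
    static String dispatch(char opcode) {
        switch (opcode) {
            case 'f': return "forward";
            case 'r': return "right";
            case 'l': return "left";
            default:  return "ignored"; // unknown bytes are simply dropped
        }
    }

    public static void main(String[] args) {
        // Simulates a stream of bytes arriving on the serial port.
        for (char c : "frlx".toCharArray()) {
            System.out.println(c + " -> " + dispatch(c));
        }
    }
}
```

The point of keeping it this dumb is exactly the debugging argument above: with a one-byte protocol, you can drive the robot from any serial monitor by hand.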

Android Side (also known as the "ArduinoRobot" class):
The ArduinoRobot class is essentially a wrapper for the UsbService class (from https://github.com/felHR85/UsbSerial/blob/master/example/src/main/java/com/felhr/serialportexample/ ). I've had to modify the class slightly to look for Arduino devices specifically, rather than any device that comes along, and to work around some Android bugs and some design issues in UsbService itself (I will send those suggestions to the author soon).

The ArduinoRobot class is meant to abstract the serial device and the robot commands away from the main activity. Due to the quirks of the Android platform, it's designed as a singleton that owns a static Context variable. I dislike the design intensely, but the Android platform requires that a Context be available if I want to bind to any service or register any broadcast receivers; the only ways to do it are either extending a Service class (which may yet happen in later iterations) or passing a Context object to a singleton. I went with the latter. Android-savvy programmers: if you have any better ideas, let me know in the comments.

The key methods currently defined in the ArduinoRobot class are:

engage(): begin talking to the serial interface, register receivers, and do all that good stuff. Meant to be called from the main activity's onCreate(). Due to an Android bug in my old phone, I also have to use onNewIntent() and re-engage the robot there. That bug is pretty gnarly: essentially, the USB-attached intents are never received by the receiver, so I had to define the intent filter in the manifest, look for that specific intent in onNewIntent(), and rebroadcast a different intent (with the same extras) so that UsbService knows something actually happened. If you're using an Android version newer than my 4.1.2, this probably won't happen to you, but it did to me.

disengage(): unregister receivers; unbind from service. Meant to be called from onDestroy()

Another important class member to note is the RobotCallback interface, which currently has only three methods to implement: handleData(), onServiceDisconnected(), and onServiceConnected(). The final two are currently not even used, but are there for later. handleData() is called whenever new data is received.
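For reference, here is a minimal sketch of what such a callback interface might look like. The signatures are my guesses from the description above, not the real ones; check the ArduinoBot repository for the actual code:

```java
public class CallbackDemo {
    // Hypothetical shape of the RobotCallback interface described above;
    // the real signatures live in the ArduinoBot repository.
    interface RobotCallback {
        void handleData(byte[] data);  // new data arrived from the Arduino
        void onServiceConnected();     // placeholder, unused for now
        void onServiceDisconnected();  // placeholder, unused for now
    }

    public static void main(String[] args) {
        // An activity would implement this and hand it to the singleton.
        RobotCallback cb = new RobotCallback() {
            @Override public void handleData(byte[] data) {
                System.out.println("got " + data.length + " bytes");
            }
            @Override public void onServiceConnected() { }
            @Override public void onServiceDisconnected() { }
        };
        cb.handleData(new byte[] {'o', 'k'});
    }
}
```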

Later revisions will see this structure change significantly: remember, we are only at the stage of making sure the USB bridge works, so a lot of these hooks were inserted for debugging purposes. This is code in evolution, people, not a finished product.

The callback and context for the Singleton are set with setCallback() and setContext(). No, I'm not happy with it. Yes, this is a hack. We'll refactor this once we're happy with functionality.  Honest. I swear.

There are of course a few more members - for now - mostly those that instruct the robot to go, turn left, and turn right.


The rest is quite simply there to test the functionality: press "forward" and the robot should go forward, etc. See the video above for a quick demo.

A few important notes: The application manifest specifically declares an intent filter:
   <meta-data android:name="android.hardware.usb.action.USB_DEVICE_ATTACHED"  
         android:resource="@xml/device_filter" />  

that would signal the application when it is time to act. In addition, the code in ArduinoRobot that defines the device being sought after:
    int[] paramarray = {0x2341, 67}; // my Arduino's VID, PID (in that order)  
     bindingIntent.putExtra(UsbService.BIND_PARAMETERS_EXTRA, paramarray);  

is defined in startService(). You should find out the VID and PID of your own brand of Arduino and modify these appropriately, as they may be different from mine.
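For illustration, the check UsbService ends up performing boils down to comparing a device's vendor and product IDs against that pair. A stand-alone sketch of the idea (plain Java; the helper name is mine, and the CH340 IDs in the example are just a common clone chip, not from the post):

```java
public class DeviceFilter {
    // Returns true when a device's vendor/product IDs match the pair
    // passed in via BIND_PARAMETERS_EXTRA ({vid, pid}).
    static boolean matches(int vid, int pid, int[] wanted) {
        return wanted.length == 2 && vid == wanted[0] && pid == wanted[1];
    }

    public static void main(String[] args) {
        int[] wanted = {0x2341, 67};                         // Arduino's VID, the Uno's PID
        System.out.println(matches(0x2341, 67, wanted));     // true  - a genuine Uno
        System.out.println(matches(0x1a86, 0x7523, wanted)); // false - a CH340-based clone
    }
}
```

Clones are exactly why you should check your own board's IDs: many cheap Arduinos use a CH340 or FTDI serial chip and report that chip's VID/PID instead of Arduino's.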

Things to do with an old Android phone and an Arduino

A quick guide to the posts in this series (updated as the series progresses):

1. Arduino/Android Serial interface

A few years ago I participated in the "Color Run", a 5k race in which people throw colorful corn starch at each other. Apparently, it's something people consider to be "fun", something I am told I "don't have enough of", and that I should "get out of the house and meet more people, really, robots and cats aren't real socializing". Weirdos.

I didn't run, of course. I'm not exactly capable of running, but trudge along the path I did. On my person, in a bag, was my old trusty Galaxy SII. And somehow, when I returned from the "race", it emerged that the entire screen was cracked.

I was considering fixing it, but then realized I was incapable of staying sane for more than 3 hours without a phone, so I immediately bought a new Galaxy S4 and put the S2 in the drawer, soon to be forgotten.

Fast forward a few years, and I'm cleaning out my drawers. I find the S2. The screen is still broken, but no matter, a few eBay and Amazon clicks later I got myself a brand new screen digitizer assembly and fixed the phone.

Now I had an almost fully functional Android phone (I didn't glue the new digitizer in properly) and nothing to do with it.

So now came the question: Can I use the USB OTG ("On-the-Go") features of the Galaxy S2 to control an Arduino driven robot (because, of course it's going to be a robot)? And what sensors and algorithms can I use on the S2 to make the robot do something awesome?

The answers are, in turn, "yes" and "a crapload".

The general idea is this:

1. Create an Arduino-driven robot using a kit, such as this (the exact kit I'm using was part of the Spark.io Kickstarter kit, the name of which is lost to the ravages of time, at least to me; if you know what it is, holler!). You will of course need the motor shield as well; my code depends on the L298P shield.

2. The Arduino code will accept commands (e.g. "f" for forward, "r" for right, etc. - protocol still a work in progress) over the serial link and activate the motors. Essentially, the Arduino will be a fairly dumb serial-to-motor bridge. I might throw in a couple of PIR sensors like these to activate the robot upon motion detection. We'll find out.

3. Have all the "smarts" of the robot run on the S2 - current thoughts include:

a. Use OpenCV and the camera for object detection, identification and collision avoidance (side note: I had a ping sensor connected to a previous version of the robot, and it drove the cat freaking wild; ultrasonic frequencies and pets do not mix. Don't try that at home, kids)
b. Some sort of gyroscope/vibration detection, maybe?
c. Use of the speaker to make noises that terrify the cat.
d. Have it receive commands via Bluetooth/Wifi/The web.

This is the sketch, in general terms. As the project progresses I will update the blog and push whatever code ends up being created to GitHub.

Happy hacking!

Thursday, April 7, 2016

Last but not least - the app for the mouse robot (part 4 of 4)

Looking for some context on the robot? Here are: part 1, part 2, and part 3 of the series!

We are creating a simple app: one that uses a seekbar to set the position of the backend using the "setpos" function. The code for the entire thing can be found on my GitHub, at https://github.com/nirsoffer/RobotController/ - and this blog post will serve as documentation for how it's done and why it was done that way.


I used the basic wizard of Android Studio to create a Fullscreen Activity app, and populated it. Download Android Studio here.

This being my first ever Android app, I probably messed it up to no end, but it functions and is a good starter. Don't take my coding practices as sound, though.  I haven't been coding professionally for nearly a decade.

There are three key files (at least at the time of writing this blog post):

FullscreenActivity.java: initializes the activity, the request queue, and the ParticleIORequestor. Change this call:

final ParticleIORequestor pioreq = new ParticleIORequestor(getApplicationContext(),
new ResponseListen(), 
new ErrorListen(), 
"your device id here", 
"your token here", 
"setpos");  

to contain your token and your device ID from the Particle.IO builder. Without this - things won't work.

FullscreenActivity.java initializes a seekbar in the middle and a three-button RadioGroup (left, center, right) at the bottom. The buttons are meant as a quick shortcut for setting the seekbar to 0, 50, and 100.

The seekbar ends up calling the "turn" function in the ParticleIORequestor.

Speaking of:

ParticleIORequestor.java is the "secret sauce" of the app; it's what ends up sending requests to the backend, relying heavily on Volley to do so. At its core, it:

1. Initializes a singleton request queue object.
2. Enables sending requests through turn(int).

turn(int): cancels ALL pending requests (as an exercise to the reader, ask yourself why), and queues a new one. The callback functions for onResponse and onError are set in FullscreenActivity. Is it ugly? Yes, but I didn't want to bother with more complicated patterns, and the response and error callbacks needed access to the UI context. If you have a more elegant way of doing this, please let me know.
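(Spoiler for the exercise: only the newest seekbar position matters, and a backlog of stale setpos requests would replay old positions and make the robot jitter.) The policy can be sketched with a plain queue, no Volley involved; the class and method names here are mine, chosen to mirror turn(int):

```java
import java.util.ArrayDeque;
import java.util.Deque;

public class LatestOnlyQueue {
    private final Deque<Integer> pending = new ArrayDeque<>();

    // Mirrors turn(int): drop every pending request, then queue the new one,
    // so at most one position is ever waiting to be sent.
    void turn(int pos) {
        pending.clear();  // cancel ALL pending requests
        pending.add(pos); // enqueue the fresh one
    }

    int pendingCount() { return pending.size(); }

    // Peeks at the next position to send (assumes the queue is non-empty).
    int next() { return pending.peek(); }

    public static void main(String[] args) {
        LatestOnlyQueue q = new LatestOnlyQueue();
        q.turn(10);
        q.turn(40);
        q.turn(75); // the user drags the seekbar quickly; only 75 survives
        System.out.println(q.pendingCount() + " request(s), pos=" + q.next());
    }
}
```

In the real app, Volley's RequestQueue plays the part of this queue, and the cancellation happens by tag before the new StringRequest is added.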

The tricky GoogleFu here is the POST request, which is poorly documented by the Volley developers. The trick is to override getParams() and getHeaders(). I would have used a JsonRequest object rather than a StringRequest, but I strongly suspect that JsonRequest gets the POST parameters wrong and encodes them as a JSON document in the body of the HTTP request. I was too lazy to sniff and verify, but StringRequest works fine too.
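If it helps demystify the override: what getParams() ultimately produces is an ordinary form-urlencoded body. Roughly, Volley URL-encodes each key/value pair from the map and joins the pairs with '&'. A stand-alone sketch of that encoding (plain Java, no Volley; the class name is mine, and this is an approximation of what Volley does internally, not its actual code):

```java
import java.io.UnsupportedEncodingException;
import java.net.URLEncoder;
import java.util.LinkedHashMap;
import java.util.Map;

public class FormBody {
    // Roughly what Volley does with the map returned by getParams():
    // URL-encode each key and value, and join the pairs with '&'.
    static String encode(Map<String, String> params) {
        try {
            StringBuilder body = new StringBuilder();
            for (Map.Entry<String, String> e : params.entrySet()) {
                if (body.length() > 0) body.append('&');
                body.append(URLEncoder.encode(e.getKey(), "UTF-8"))
                    .append('=')
                    .append(URLEncoder.encode(e.getValue(), "UTF-8"));
            }
            return body.toString();
        } catch (UnsupportedEncodingException e) {
            throw new AssertionError("UTF-8 is always supported", e);
        }
    }

    public static void main(String[] args) {
        Map<String, String> params = new LinkedHashMap<>();
        params.put("access_token", "your-token-here"); // placeholder, not a real token
        params.put("arg", "50");
        System.out.println(encode(params));
        // prints: access_token=your-token-here&arg=50
    }
}
```

This matches the body the curl example in part 3 sends with its -d flags, which is why the two are interchangeable for testing.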

SingletonRequestQueue.java is an auxiliary class copied nearly verbatim from the Volley docs; it supplies a singleton request queue so the app only ever creates one.


There you have it. Any questions? Post in the comments below!



Sunday, April 3, 2016

Web Connected Mouse Robot - Part 3 - the backend

(Looking for the previous parts?  Here's part 1 of the series,  And here's part 2 )

The latest version of the code can always be found at: https://github.com/nirsoffer/robot-backend - I'd personally just copy and paste it into the Particle.io IDE, because the IDE doesn't seem to have very good integration with GitHub (if at all).

Let's break it down, bit by bit - first the set up code:

 int leftPin = D5;   
 int middlePin = D6;  
 int rightPin = D7;  
 int pos = 0;  
 int dutyCycleDelay = 10;  
 void setup() {  
  pinMode(leftPin, OUTPUT);  
  pinMode(middlePin, OUTPUT);  
  pinMode(rightPin, OUTPUT);  
  Spark.function("setpos", setPosition);  
  Spark.function("getpos", getPosition);  
  Spark.function("right", rightDummy);  
  Spark.function("left", leftDummy);  
 }  

We are defining the left, middle, and right pins to correspond to where we soldered the sensor pins earlier: D5, D6, and D7.

pos is the variable that controls the turning of the robot: 0 corresponds to all the way left, 100 to all the way right, and 50 to the middle. This is done to work more easily with the Android seekbar but, of course, feel free to make it whatever makes sense to you :)

We set up pos to start at 0, which makes the robot spin in place in a consistent loop; in my case that's better than going forward in my limited kitchen space. You can set it to 50 if you want it to start by going forward.

At any rate, keep in mind that it takes about 20 seconds for the Particle core to get a grip on reality and start executing code, so there will be some unexpected results until it's up and running.

The next bit of code sets the I/O mode of the pins and registers the internal functions as web-enabled functions.

On to the functions:
 // Next we have the loop function, the other essential part of a microcontroller program.  
 // This routine gets repeated over and over, as quickly as possible and as many times as possible, after the setup function is called.  
 // Note: Code that blocks for too long (like more than 5 seconds), can make weird things happen (like dropping the network connection). The built-in delay function shown below safely interleaves required background activity, so arbitrarily long delays can safely be done if you need them.  
 int resetPins() {  
   digitalWrite(leftPin, LOW);  
   digitalWrite(middlePin, LOW);  
   digitalWrite(rightPin, LOW);  
   return 0;  
 }  
 void pinBlink(int pin) {  
   resetPins();  
   digitalWrite(pin, HIGH);  
 }  
 void right() {   
   pinBlink(leftPin);  
 }  
 int rightDummy(String dummy) {  
   right();  
   return 1;  
 }  
 int leftDummy(String dummy) {  
   left();  
   return 1;  
 }  
 void left() {  
   pinBlink(rightPin);  
 }  
 void forward() {  
   digitalWrite(rightPin, LOW);  
   digitalWrite(leftPin, LOW);  
   digitalWrite(middlePin, HIGH);  
 }  
 int setPosition(String posValue) {  
   pos = posValue.toInt();  
   return pos;  
 }  
 int getPosition(String dummy) {  
   return pos;  
 }  


resetPins() sets all pins LOW; left() and right() just turn on the corresponding pins. setPosition() changes the global int pos, and getPosition() returns the current position. forward() simply turns on the middle pin and turns the rest off (yes, it could be refactored to use resetPins() and pinBlink(), but I am lazy).

rightDummy() and leftDummy() are wrappers for right() and left() that exist to accept an argument on behalf of the web backend. All web-enabled functions except setpos are there for debugging purposes only.

Finally, let's look at the main loop:
 void loop() {  
  // Pos goes from 0 to 100, meaning that 50 is middle so we...  
  int newpos = pos - 50;  
  // now -50 is full on left, 50 is full on right, 0 is straight ahead. we need to have a way of "blinking" the sensor on on a duty cycle.   
  // We'll use a stochastic duty cycle for smoothness -  
  // by that what I mean is that we generate a random number between 1-50, if it's below abs(pos) then we blink the pin high for 10ms, if it's above we keep it low  
   for (int i=0; i<50; i++) {  
     if ( random(50)<abs(newpos)) {  
        if (newpos > 0) left();  
        if (newpos < 0) right();  
        if (newpos == 0)   
        {   
          forward();  
        }  
     }  
     else {   
         forward();  
     }  
     delay(dutyCycleDelay);  
   }  
  // And repeat!  
 }  

What we're doing here is a sort of stochastic PWM: we're flipping the corresponding sensor on, on average, for abs(pos-50)/50 of the time. If the position is 25, for instance, we will flip the "left" bit on, hopefully, 50% of the time, which makes the robot turn at a 50% duty cycle. We do this rather than the alternatives (for instance, keeping the pin on for 25 ticks and then off for 25 ticks) because randomizing gives a certain smoothness. Keep in mind that we can't actually stop, so we want to interleave the turning and the going forward as much as we can, to give the illusion of a half turn rather than a full turn 50% of the time followed by stopping for the rest.

Not sure if that makes sense, but even if it doesn't, trust me ;)
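If you'd rather see it than trust me, the loop's decision can be simulated off-board. A plain-Java sketch (the names are mine; random(50) becomes rng.nextInt(50), and the generator is seeded for repeatability):

```java
import java.util.Random;

public class StochasticDuty {
    // Simulates the loop's per-tick decision for a given pos (0..100).
    // Returns the fraction of ticks spent turning (left or right) instead
    // of going forward; the expected value is abs(pos - 50) / 50.
    static double turnFraction(int pos, int ticks, Random rng) {
        int newpos = pos - 50; // -50 = full left, 0 = straight, 50 = full right
        int turning = 0;
        for (int i = 0; i < ticks; i++) {
            if (rng.nextInt(50) < Math.abs(newpos)) {
                turning++; // left() or right(), depending on the sign of newpos
            }
            // else: forward()
        }
        return (double) turning / ticks;
    }

    public static void main(String[] args) {
        Random rng = new Random(42);
        // pos=25 -> |newpos|=25 -> about half the ticks should be turns
        System.out.printf("pos=25 turn fraction ~ %.2f%n", turnFraction(25, 100_000, rng));
        // pos=50 -> straight ahead, never turns
        System.out.printf("pos=50 turn fraction = %.2f%n", turnFraction(50, 100_000, rng));
    }
}
```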

Now that you have this, you can compile the backend, flash your Particle, and begin your testing! Using curl you can quite simply see if setpos returns the correct results:

 curl https://api.spark.io/v1/devices/[your device id]/setpos -d access_token=[your access token] -d arg="0"  
This should return something like this:
 {  
  "id": "[your id]",  
  "last_app": "",  
  "connected": true,  
  "return_value": 0  
 }  

You can now play with the robot to your heart's content using curl and different arguments to setpos. Make sure that this part works well before proceeding to the next part - your android controller app!


If you're looking for the previous related posts - here's part 2 and here's part 1. Enjoy!



Friday, April 1, 2016

The Mouse Robot, Part 2 - THE HARDWARE

(Looking for the introduction to the project? Here it is!)

The robot mouse is a simple affair: essentially three photo sensors connected to motors. If a sensor sees white, it goes off; if it sees black, it does not. Think of the motors as directly connected to the photo sensors (they are not, of course; there are transistors, ICs, and a whole bunch of stuff along the way), but it helps to think of it this way.

If the photosensor sees white (while the others see black) it simply turns on the corresponding motor. Turning on the left motor turns the mouse right, so that way it aligns itself to see more black. It's quite elegant, really.

And also, perfect for hacking. Simply hijack the signal from the photoreceptors to the logic board, and wham! Robot mouse does your bidding. They even conveniently have the photoreceptors and sensor board completely apart from the main logic and motor control board and connect them with a ribbon cable, easily connectable to boards of your own choosing.

A few notes on the Mouse -

1. It is horrible on carpeted surfaces. You could possibly wrap rubber bands around the wheels or something, but I didn't bother. You can certainly hack the code and use something beefier, like this robot kit; if you have any luck, post about your adventures and misadventures in the comments below!

2. My personal mouse won't stop, no matter what combination the photoreceptors see. It seems to have only "forward", "right", and "left". When it doesn't have a valid response from the photoreceptors, it chooses "right" (i.e., it activates the left wheel). I have a feeling that is a deliberate design choice, so that if it can't find anything it starts rotating, looking for the black line again. But I can't be sure. Maybe I just soldered something wrong :)

In other words, the logic table of the mouse seems to be (I didn't bother testing combinations like 110 and 011; maybe that would be a good idea when I'm bored):

 Left sensor | Middle sensor | Right sensor | Expected left motor | Expected right motor | Real left motor | Real right motor
      0      |       0       |      0      |          0          |           0          |        1        |        0
      0      |       0       |      1      |          1          |           0          |        1        |        0
      0      |       1       |      0      |          1          |           1          |        1        |        1
      1      |       0       |      0      |          1          |           0          |        1        |        0

The next component I have is a Particle.io board; I still have the first Kickstarter version, back when it was named "Spark", so your mileage may vary with your specific board. With Particle.io, anybody can build a web-connected device. So can you.

Begin by assembling your mouse robot according to the instructions - make sure it works by testing it on a black line and seeing that it properly follows. Relish in your newfound soldering ability. Cry profusely at the times you accidentally soldered incorrectly and struggled valiantly with a blob of solder that threatened to short out all your board. Wipe the sweat off your brow and have a beer. You've earned it, my friend.

Once you're done with the mouse, take one of the small PCB prototyping boards and solder on two female header strips the length of your Photon board, spaced the width of the board apart. Additionally, solder on a 5-pin male header strip. This is what mine looked like:




You can see I marked + and - on the male headers where the positive and negative go, as well as the USB side of the Photon board, so I make sure to insert it correctly every time.

On the back, solder a wire connecting one side (the + side) of the male header to the VIN pin of the Photon, another connecting the - side to the GND pin, and the remaining three to pins D5, D6, and D7, in order (the one closest to the - side goes to D5, the middle one to D6, and the one closest to the + side to D7).

This is how the back of mine looks:




I never said I was any good at soldering.

You can now slot your photon in the headers:


And you're done! You can plug the ribbon cable into your new "sensor board" (ensuring correct polarity; in my case the yellow wire of the ribbon was the DC voltage) and, hopefully, everything will light up.

In the next part, we will go over the backend code and test it with curl; the final part will cover the Android app. All source code will be available on GitHub, of course.

Here's the link to the backend part!

The Easiest Web Connected Robot you Will Have Ever Made - Part 1

(Note - this part 1 out of a 4 part series - if you want to skip ahead:

Part 2 - is the hardware
Part 3 - is the backend
Part 4 - is the Android app )


Sometimes, we get bored.

And in these times, we want to find the cheapest, easiest way to make a web connected robot that harasses our cats.

The times are upon us. The end is nigh. We will have justice for all those times the cat woke us up at 3 in the morning.

The inspiration for the project came from shopping on Amazon.com one day and coming across The Elenco Line Tracking Robot Mouse. It seemed like the perfect Christmas gift for myself.

So I bought one. And then I hooked some stuff up to it and wrote some code.

This is the end result:




Want to see how it's done? Read on!

The project is going to be in three parts:

The hardware:
A combination of the Mouse and the Particle.io chip.

The backend:
We'll be using the Particle.io web backend for ease of development. Future versions may swap in a WiFi-enabled Arduino, for which we will have to write our own backend.

The client
We'll make an Android app using the Google Volley RPC framework. We won't be fancy, we'll just make it work. We'll also use curl for debugging.

As for the list of ingredients:

The hardware:

1. A Particle Photon board - mine is still a Spark from the Kickstarter, but the one in the link should be compatible.
2. The Elenco Line Tracking Robot Mouse as specified before.
3. A soldering iron
4. A small soldering ready PCB (one of these should do nicely).
5. Some pin headers, male and female (like these)

The software
1. Android Studio
2. An Android Phone


Are you ready to get started? Then move on to part 2 - hardware!