Using Bluebrain-powered Cannybots to demonstrate Braitenberg Vehicles and group behaviours.
This could be a good introduction to sensors and robotics.
Initially, I'm planning to use light sensors (or ideally twin colour sensors, for versatility), with the Bluebrain interpreting how this input should be used to drive the wheels. Programming/monitoring via a Bluetooth link to a Raspberry Pi or mobile device would be ideal, e.g. to adjust the style of connection / the response graphs.
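To make the sensor-to-wheel idea concrete, here is a minimal sketch of the two classic Braitenberg wirings. Everything here is illustrative (the function name and value ranges are assumptions, not the Cannybots API): sensor readings are normalised light levels in [0.0, 1.0], and the function returns left/right motor speeds in the same range.

```python
def braitenberg(left_sensor, right_sensor, crossed=True):
    """Map two light-sensor readings to (left_motor, right_motor) speeds.

    Uncrossed wiring (crossed=False) is Braitenberg's Vehicle 2a: the
    brighter side drives its own wheel harder, so the bot turns away
    from the light ("fear"). Crossed wiring (crossed=True) is Vehicle
    2b: it turns towards the light ("aggression").
    """
    if crossed:
        left_motor, right_motor = right_sensor, left_sensor
    else:
        left_motor, right_motor = left_sensor, right_sensor
    return left_motor, right_motor
```

For example, with the light brighter on the right (`braitenberg(0.2, 0.8, crossed=True)`), the left motor runs faster and the bot turns towards the light.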
The Cannybot would also be fitted with a configurable RGB LED, allowing it to act as a target for others to respond to (e.g. follow / avoid); this could be set via a simple function. It would thus be possible to have a number of similar bots interact in interesting ways.
A further development could be a "charging station" (either literal or simulated) for the Vehicles to seek out. Simulated "energy" may lend itself better to demonstrations with short lifespans, and would allow the "metabolism" of the Vehicles to be configured.
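One way the simulated energy and metabolism could be modelled is sketched below. All the class and parameter names are hypothetical: energy drains each tick (faster when the motors work harder, to give the "metabolism" a behavioural cost), and tops up while the bot sits at the charging station.

```python
class EnergyModel:
    """Hypothetical simulated-energy model for a Braitenberg Vehicle."""

    def __init__(self, capacity=100.0, metabolism=0.5, charge_rate=5.0):
        self.capacity = capacity        # maximum stored energy
        self.metabolism = metabolism    # baseline drain per tick
        self.charge_rate = charge_rate  # gain per tick while at the station
        self.energy = capacity

    def tick(self, motor_effort=0.0, at_station=False):
        """Advance one time step; motor_effort in [0, 1] scales the drain."""
        self.energy -= self.metabolism * (1.0 + motor_effort)
        if at_station:
            self.energy += self.charge_rate
        # Clamp to the physical range.
        self.energy = max(0.0, min(self.capacity, self.energy))
        return self.energy

    @property
    def alive(self):
        return self.energy > 0.0
```

Tuning `metabolism` and `capacity` would let a demonstration run for a predictable lifespan, which a literal battery could not.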
This is going to be a really cool project. Can't wait to see autonomous behaviours emerging in Cannybots in real time, based on very simple logic. Since the BlueBrain can seamlessly talk to the Pi (over Bluetooth), the logic can be implemented on the Pi.
More about Braitenberg Vehicles here: http://en.wikipedia.org/wiki/Braitenberg_vehicle
@Stuart - I have sent you a pair of Cannybots to try this out. Hopefully you will get them by Saturday, in time for the Berlin Pi Jam. Fingers crossed.
Sorry for the delay in updates here - work and holiday have been taking up my time.
This is a picture of the first-cut, very rough photocell-based Vehicle. I took it along to the Raspberry Jam Berlin event yesterday and have been developing it further since.
I've created a code repository (CannybotsBraitenbergVehicles) on GitHub for my BV-related work. I've got local USB serial feedback of the sensors for testing (in the local repo), and will be adding Bluetooth monitoring soon. The current version allows response curves between sensor inputs and motor outputs to be defined; my plan is to allow these to be configured over Bluetooth. I'm also keen to allow virtual sensors to be defined and fed from Bluetooth data.
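As a rough illustration of what a configurable response curve could look like (this is a sketch of the idea, not the repo's actual code), a curve can be defined as a sorted list of (sensor, motor) points and interpolated piecewise-linearly between them; a list of pairs like this would also be easy to send over Bluetooth later.

```python
def make_curve(points):
    """Build a response curve from sorted (sensor_value, motor_value) pairs.

    Inputs below the first point or above the last are clamped to the
    end values; anything in between is linearly interpolated.
    """
    def curve(x):
        if x <= points[0][0]:
            return points[0][1]
        if x >= points[-1][0]:
            return points[-1][1]
        for (x0, y0), (x1, y1) in zip(points, points[1:]):
            if x0 <= x <= x1:
                return y0 + (y1 - y0) * (x - x0) / (x1 - x0)
    return curve

# Example: a dead zone below 0.2, then a linear ramp up to full speed.
throttle = make_curve([(0.0, 0.0), (0.2, 0.0), (1.0, 1.0)])
```

Swapping the point list changes the vehicle's "temperament" without touching the control loop, which is what makes over-the-air configuration attractive.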
Admittedly, I don't know how soon I'll run into problems with larger wiring graphs once logic units are added, but this could be offloaded to the monitoring software.
Can you recommend any software for documenting the physical wiring of the sensors? It's just a simple photocell + 10 kΩ resistor, measuring the voltage across the resistor, as described in https://learn.adafruit.com/photocells/using-a-photocell. I also have a couple of I2C colour detectors for a future version.
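For anyone following along, the maths behind that circuit is the standard voltage divider: the photocell and the fixed 10 kΩ resistor divide the supply voltage, and the ADC reads the voltage across the fixed resistor. A small sketch (the function names and the 5 V supply are assumptions for illustration):

```python
def divider_voltage(r_photocell, r_fixed=10_000.0, vcc=5.0):
    """Voltage across the fixed resistor for a given photocell resistance.

    Brighter light -> lower photocell resistance -> higher reading.
    """
    return vcc * r_fixed / (r_photocell + r_fixed)

def photocell_resistance(v_measured, r_fixed=10_000.0, vcc=5.0):
    """Invert the divider to recover the photocell's resistance."""
    return r_fixed * (vcc - v_measured) / v_measured
```

With the photocell at 10 kΩ the reading sits at half the supply, which is why a 10 kΩ fixed resistor is a reasonable default for mid-range light levels.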
Looking really cool.
For diagramming circuits I've used Fritzing before; it seems to be popular and is used a lot in Adafruit tutorials.
Hi Wayne - thanks - it looks great!
Do you already have images of the pin-outs of the original and new Bluebrain boards in SVG format? (I'd be happy to convert these into parts for use with Fritzing.)
That would be awesome, thanks!
We might be able to get hold of SVG versions; I will ask Steve, our electronics guy.
In the meantime, would this PDF of the new one help get you started?
BlueBrain_3D_Render_Top.PDF (1.6 MB)