DIY OSD update
Filed under: Development, FPV, Hardware, Robots, Video
Late update, but video is working as expected and GPS is working to some extent, though because of the local weather I haven't really tested it outdoors… I'm considering changing the video transmitter and power system, however. I'm currently running a 10mW transmitter from a 1S 160mAh battery through a step-up regulator to 5V, which gives really lousy battery time for the video system. I've also got too many batteries flying around at the moment, I think; limiting myself to a few fewer might be a good idea. Currently there's one battery for the video link/OSD, one built into the camera, one for the motor/RC receiver/servos, and one in the RC transmitter: four batteries in total, of various sizes, and a pain to charge them all. Hooking the video link and OSD up to the ESC/BEC is a trivial change and would remove a lot of cruft (the step-up module etc.), but it would also add electrical noise (EMI) from the motor/ESC for the video link to handle, as it's a completely separate system today.
I've also upgraded to a Mobius ActionCam. All in all this means the system will draw more power and weigh more as well. Removing some batteries and running from the main power system would be really nice.
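To show why the battery time is so lousy, here's a back-of-the-envelope runtime estimate. The 150mA load and 85% step-up efficiency are assumptions for illustration, not measurements:

```python
# Rough runtime estimate for the video-system battery.
# Load current and converter efficiency are assumed values, not measured.

def runtime_hours(capacity_ah, cell_voltage, load_current_a, load_voltage, efficiency):
    """Estimate runtime of a battery feeding a load through a step-up regulator."""
    load_power = load_voltage * load_current_a                   # W drawn at the 5V rail
    battery_current = load_power / (efficiency * cell_voltage)   # A drawn from the cell
    return capacity_ah / battery_current

# 1S 160mAh cell at ~3.7V nominal, assumed 150mA load at 5V, ~85% efficiency
hours = runtime_hours(0.160, 3.7, 0.150, 5.0, 0.85)
print(f"Estimated runtime: {hours * 60:.0f} minutes")   # roughly 40 minutes
```

Under those assumptions you get well under an hour per charge, which matches the "really lousy battery time" experience.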
DIY OSD
I've built a DIY OSD based on Dennis Frie's hardware and software, released on the RCGroups forum. I'm fairly happy with the results: the OSD actually works. I do have some issues, but nothing serious. GPS and the voltage sensors are working as they should so far, and I'm planning to add a current sensor and possibly RSSI as well.
The top side, not much to talk about: it's an Arduino Pro Mini 5V unit.
The GPS in the picture is not the one I'm using so far, but I hope to move over to it in a few days, if I find the time. The main problem is that the GPS requires 3.3V, and I'm running everything on 5V at the moment. I'm currently using a u-blox NEO-7M from Banggood. The picture shows approximately how I hope to mount the GPS, though, depending on how I plan to mount the OSD relative to the video link and the camera… The GPS and video link should be mounted as far away from each other as possible, but the OSD needs to be close to the video cable.
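One common way around the 5V/3.3V mismatch on the serial side (a general technique, not necessarily what I'll end up doing, and the resistor values are just examples) is a resistor divider on the Arduino-TX-to-GPS-RX line; the GPS supply itself still needs a proper 3.3V regulator, and the GPS-TX-to-Arduino-RX direction usually needs no shifting since 3.3V reads as logic high on a 5V AVR:

```python
# Resistor divider to drop a 5V logic signal to ~3.3V for a GPS RX pin.
# Example values only; the supply rail still needs a real 3.3V regulator.

def divider_out(vin, r_top, r_bottom):
    """Output voltage of a simple two-resistor divider."""
    return vin * r_bottom / (r_top + r_bottom)

v = divider_out(5.0, 1000, 2000)   # 1k on top, 2k to ground
print(f"{v:.2f} V")                # ~3.33 V, safely inside the 3.3V logic range
```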
The bottom side. Most of the components have been sandwiched between the PCBs. Some of the routing is a bit stupid, but I'll see if I can fix it so I can fit the current sensor somewhere on the bottom as well.
Saleae Logic16 100M 16 Channel Logic Analyzer
Filed under: Development, Hardware, Linux, Robots, Ubuntu
I got a clone of the Saleae Logic16 100M 16-channel logic analyzer a few days ago to test some of the Arduino stuff I've been working on for the last few months.
At first I was really annoyed because the software simply refused to work. I tried it on four different computers (Ubuntu 13.10 64-bit, Windows 7 32-bit and 64-bit) and got the same error on all of them, something like "logic too slow for speed, try reducing…". My searches came up with very little info.
So in the end I found the sigrok package, compiled it myself (the old version in Ubuntu lacked Saleae support) and was pleasantly surprised! It works and actually seems to be an OK package! There are a few non-obvious things to do first, though; for example, you need to extract the firmware from the Saleae software suite. I've also found a few bugs in the graphical interface so far. I'll try to get them reported ASAP and possibly do some work on them. I'll get back to this topic a bit later, as it wasn't completely obvious how to get it running at all points; for now, however, it works ;).
Raspberry Pi + 2x Arduino Nano V3.0
Filed under: Development, General, Hardware, Linux, Robots
A quick update: during my evenings I've been working with one of the Raspberry Pis I won in a local contest a few months ago, and it has generally derailed into some kind of "let's put as much stuff on it as possible" project. I currently have my Raspberry Pi hooked up with:
- Slice of Pi
- Adafruit PWM driver
- Raspicam on a simple pitch/yaw servo gimbal that my 1.5-year-old and I put together in 10 minutes. Controlled via PWM.
- MPU-9150 SparkFun breakout board
- 2x Arduino Nano V3.0
The two Arduino Nanos have split functionality: one will provide me with data that needs to be gathered frequently, and the other is used for slow processes such as reading out 1-wire temperature sensors.
The first nano will have the following functions hooked up to it:
- 3x HC-SR04 ultrasound distance sensors
- Voltage measurement 0-20V circuit
- Control of 2 relays
- 3x line tracking sensors
- Reed switch
- 2x motor controls via L298P chip
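As a sketch of the arithmetic the first Nano will be doing for two of those sensors (the resistor values and readings here are illustrative, not my actual circuit):

```python
# The conversions behind the HC-SR04 and the 0-20V measurement, sketched in
# Python. Resistor values and example readings are illustrative assumptions.

SPEED_OF_SOUND_CM_PER_US = 0.0343  # at roughly 20 degrees C

def hcsr04_distance_cm(echo_pulse_us):
    """HC-SR04: the echo pulse covers the round trip, so halve it."""
    return echo_pulse_us * SPEED_OF_SOUND_CM_PER_US / 2

def battery_voltage(adc_reading, r_top=30000, r_bottom=10000, vref=5.0, adc_max=1023):
    """Scale a 10-bit ADC reading back up through a divider sized for 0-20V."""
    v_at_pin = adc_reading * vref / adc_max        # voltage actually at the ADC pin
    return v_at_pin * (r_top + r_bottom) / r_bottom

print(f"{hcsr04_distance_cm(583):.1f} cm")   # ~10.0 cm
print(f"{battery_voltage(614):.1f} V")       # ~12.0 V
```

A 30k/10k divider maps 20V down to 5V at the pin, which is why the full 0-20V range fits the Arduino's ADC.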
The second nano has the following hooked up to it:
- MPX2200GP pressure sensor (will use something else soon-ish)
- 2x 1-wire DS18B20 temperature sensors.
- Others?
The general idea is to move the timing-critical stuff off the Raspberry Pi to the Nanos, letting the first one deal with quick sensor readouts while the second Nano gets the relatively slow sensors (the DS18B20 takes a very long time to read out, for example). The two Nanos will talk to the Raspberry Pi via SPI, I think, or possibly serial ports, but the latter is less likely as it would either require me to use one USB serial driver plus the Raspberry Pi UART, or to get a two-port USB hub of some kind and talk via USB UARTs.
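To make the serial-port option concrete, here is a minimal line-based framing the Nanos could use. This is a hypothetical protocol sketch of my own, not implemented yet; on the Raspberry Pi the bytes would come in via a serial port rather than the literal string used here:

```python
# Hypothetical line-based frame format for the Pi<->Nano serial link:
#   $name=value,name=value*CK\n   where CK is an XOR checksum of the payload.

def encode_frame(sensor_values):
    """Encode a {'name': value} dict as '$name=value,...*CK' with an XOR checksum."""
    payload = ",".join(f"{k}={v}" for k, v in sorted(sensor_values.items()))
    checksum = 0
    for byte in payload.encode("ascii"):
        checksum ^= byte
    return f"${payload}*{checksum:02X}\n"

def decode_frame(frame):
    """Return the sensor dict, or None if the frame is malformed or corrupt."""
    body = frame.strip()
    if not body.startswith("$") or "*" not in body:
        return None
    payload, _, ck = body[1:].rpartition("*")
    checksum = 0
    for byte in payload.encode("ascii"):
        checksum ^= byte
    if f"{checksum:02X}" != ck:
        return None                              # checksum mismatch: drop the frame
    return {k: float(v) for k, v in (item.split("=") for item in payload.split(","))}

frame = encode_frame({"temp": 21.5, "volt": 12.1})
print(decode_frame(frame))   # {'temp': 21.5, 'volt': 12.1}
```

The checksum matters here because a noisy UART line will occasionally mangle a byte, and it's much nicer to drop a frame than to act on a garbage sensor value.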
Meanwhile, I've also played around with Eagle CAD for the first time in my life, making some electrical drawings of the hookups. I'm considering making a PCB layout of everything when I get there; not sure if there is any interest in this out there except for "the lols". The image is still very raw and missing a lot of stuff that I know needs to be added at some point.
During Christmas I spent some time doing OpenCV Haar cascade training for clementine detection and generally fooling around with it. I think I'm leaning towards making a robot (chassis on the way) which will travel around the room looking for objects… I guess some of the sensors are a little overboard for this, but it's all fun and games while it's not work… 😉
Small project (continuous rotation servo)
For a very long time I've been a little bit obsessed with continuous-rotation servos and have searched quite a bit for them online, but never found any cheap ones available.
That's probably just me being a lousy Google user, but anyway, I started looking up some mods the other day, and today I finally spent two hours modifying some old crappy servos to see how it works. To my surprise, it actually works, even though I used lousy 5% 1/8W resistors (0.1% recommended) and had my 1.5-year-old kid sitting in my lap while doing most of the modifications.
As always when it comes to Adafruit, I highly recommend their tutorial: http://learn.adafruit.com/modifying-servos-for-continuous-rotation/overview.
1 hour Raspberry Pi camera controller
Kids really chew into your spare time. The other day I got some time to have fun with a Raspberry Pi, a Raspicam, an Adafruit PWM servo driver, two servos, double-sided tape, regular tape and cardboard. In 60 minutes I managed to make the "thing" in the video. OK, some stuff was pre-assembled :-).
Also, I've just finished setting the construction up to run on battery power: 2x Zippy Compact 3300mAh 4S (14.4V nominal) LiPos in parallel, via a 5V 5A BEC, powering both the servos and the Raspberry Pi hardware. Running everything, including servos, for 4 hours, my measurement is that I've used less than 10% of the battery capacity.
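As a quick sanity check on those numbers (pack capacities from above, the rest is simple arithmetic):

```python
# Sanity check: what does "less than 10% of capacity over 4 hours" imply?

capacity_mah = 2 * 3300      # two 3300mAh packs in parallel
used_fraction = 0.10         # "less than 10%" used over the run
runtime_h = 4

used_mah = capacity_mah * used_fraction
avg_current_ma = used_mah / runtime_h
avg_power_w = (avg_current_ma / 1000) * 14.4   # at nominal pack voltage

print(f"average draw <= {avg_current_ma:.0f} mA, about {avg_power_w:.1f} W")
```

That works out to an average of at most about 165mA from the pack, around 2.4W, which is a plausible ballpark for a mostly idle Pi plus lightly loaded servos through the BEC.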
The servos are currently controlled using the up, down, left and right keys, and the application is just a modified example from Adafruit. However, it should be very easy to integrate it with the MPU-9150 breakout unit I've also installed and make a simple gimbal of sorts.
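For reference, this is roughly how a key press becomes a servo pulse on the PCA9685 chip the Adafruit driver is built around. The tick math is a property of the chip; the library calls sketched in the comment are from the legacy Adafruit Python library and the channel assignment is an assumption:

```python
# At 50Hz the PCA9685's 12-bit counter gives 4096 ticks per 20ms frame,
# so a standard 1.0-2.0ms servo pulse maps to roughly 205-410 ticks.

def angle_to_ticks(angle_deg, freq_hz=50, counter_max=4096):
    """Convert a 0-180 degree servo angle to a PCA9685 'off' tick count."""
    pulse_us = 1000 + (angle_deg / 180.0) * 1000     # 1.0ms .. 2.0ms
    frame_us = 1_000_000 / freq_hz                   # 20000us at 50Hz
    return round(pulse_us / frame_us * counter_max)

# With the legacy Adafruit library it would then be along the lines of:
#   pwm = PWM(0x40); pwm.setPWMFreq(50)
#   pwm.setPWM(pan_channel, 0, angle_to_ticks(pan_angle))

print(angle_to_ticks(90))   # ~307 ticks for a centered 1.5ms pulse
```

The arrow keys then just nudge `pan_angle`/`tilt_angle` up or down before recomputing the tick count.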
But first I'm planning to test integrating the code with the OpenCV Haar object recognition stuff I did during the Christmas holidays, to track clementines for example.
Raspicam and OpenCV instructions
Filed under: Development, Hardware, Linux, Robots
I had previously gotten OpenCV and the Python bindings to work via the OpenCV 2.3 system, and facial recognition did work, but that setup is buggy and I could only get a 64x64px image size.
I followed these instructions to get the Raspicam to work with OpenCV and SimpleCV:
http://tothinkornottothink.com/post/59305587476/raspberry-pi-simplecv-opencv-raspicam-csi-camera
However, there were some minor details wrong with it (or possibly I didn't read the instructions well enough), so here is a short list of updates to the installation instructions on that page. That said, very good info, thank you!
Here are the issues I had:
The opencv_2.4.5.sh script pulls in libopencv-dev, which in turn pulls in the entire libopencv 2.3 from the Raspbian repositories, which meant that SimpleCV was still using libopencv 2.3. Remove it with:
apt-get remove libopencv-core2.3
Once that is removed, python-opencv is also removed and SimpleCV will not even run anymore. The following adds the path to where the Python cv bindings are:
export PYTHONPATH=/usr/local/lib/python2.7/site-packages/
And finally, set LD_PRELOAD so that the correct uv4l libraries are actually used:
export LD_PRELOAD=/usr/lib/uv4l/uv4lext/armv6l/libuv4lext.so
Once this is done, you should no longer get the munmap errors etc, and large resolution should work:
SimpleCV:1> cam = Camera(0, {"width":640, "height":480})
SimpleCV:2> cam.live()
etc.
Off-work robot fun
Lately I've been having loads of fun with an old robot of mine, the Robby RP5. My biggest complaint has always been that it has a horrible 8-bit processor with "some kind of" BASIC interpreter/compiler that I never quite figured out, because it is so boring and, well, let's face it, you will never be able to do anything "wow" in a language that is more or less assembler, with 4K of flash and 256 bytes of RAM of which only some 60 bytes are actually available.
Lately we've also been having some fun with ZigBee modules at work, and I figured out a way to have fun with my old Robby again. Robby has a serial port, to which I'm connecting one ZigBee module; on the other end I've got a ZigBee module connected to my computer via USB. The Robby processor runs a very simple program that speaks a protocol over the ZigBee connection, implementing the commands sent in packets. There are three packet types: TrackData, SensorData and RequestData. A TrackData packet sent from the computer sets the speed of each track individually; RequestData is sent from the computer to Robby and contains a request for a packet back (the request can be either TrackData or SensorData); SensorData contains data from all supported sensors (currently only the IR range sensors).
My first demonstration program on the computer is connected to a joystick and simply transforms the joystick input and sends it to the robot. Pushing button 0 requests SensorData, and button 1 TrackData.
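A sketch of the three packet types in Python, to show the shape of the protocol. The byte layout here is my own guess for illustration; the real format lives in the Robby-side program:

```python
# Hypothetical byte layout for the TrackData/SensorData/RequestData packets:
# a one-byte type tag followed by a type-specific payload.

import struct

TRACK_DATA, SENSOR_DATA, REQUEST_DATA = 0x01, 0x02, 0x03

def pack_track_data(left_speed, right_speed):
    """Signed per-track speeds, -127..127."""
    return struct.pack("<Bbb", TRACK_DATA, left_speed, right_speed)

def pack_request(requested_type):
    """Ask Robby to send back a TrackData or SensorData packet."""
    return struct.pack("<BB", REQUEST_DATA, requested_type)

def unpack_packet(data):
    ptype = data[0]
    if ptype == TRACK_DATA:
        _, left, right = struct.unpack("<Bbb", data)
        return ("TrackData", left, right)
    if ptype == REQUEST_DATA:
        return ("RequestData", data[1])
    if ptype == SENSOR_DATA:
        return ("SensorData", list(data[1:]))   # one IR range reading per byte
    raise ValueError("unknown packet type")

print(unpack_packet(pack_track_data(100, -100)))   # ('TrackData', 100, -100)
```

On the wire the packets would go out over the ZigBee serial link; the joystick demo essentially just maps stick axes to the two signed track speeds.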
Right now I'm looking at porting my robot drivers into the Player/Stage project, which I've been looking at heavily as of late and which seems damn cool. I've been testing some of the example robots in the Stage simulator, and if I port my setup into that project, I should be able to use the available robot "behavioural modules" straight on my robot, and/or test my new modules in a simulator before actually running them in the real world. In all honesty, I think Player/Stage is the best thing I've found since sliced bread; it simply opens up for sooo much fun 🙂 . Combine this with a couple of ZigBee modules and you can build very simple and cheap robots that are extremely powerful: a 60USD robot chassis, a 5USD processor, 10USD of junk and 30USD for two ZigBee modules, add some sensors, and you've got as much as you can ask for. Robby, for example, is around 110USD (probably much lower now), and a pair of ZigBee modules is 30USD.
And yes, I will open-source this once I feel that I'm closer to finished :-).