Hi, I just created a series of OpenCV demo applications that run on the MK802 and communicate with an Arduino board to control a servo-based PTZ camera.
Here are some videos, have a look :-)
A realtime facetracker based on a MK802 minipc http://www.tudou.com/programs/view/wvnv7vbJnGU/
A light spot tracker http://www.tudou.com/programs/view/eMv3zLORfDw/
Performing a background extraction algorithm on the MK802 http://www.tudou.com/programs/view/OCmPiTiK_48/
Display network bandwidth on a VFD http://www.tudou.com/programs/view/oubxw-PhL7s/
Source code is on GitHub: https://github.com/cheaven/mk802_demo
For more details, here is my article: http://www.csksoft.net/blog/post/mk802_dev_intro.html (in Chinese)
Google Translated Version: http://translate.google.com/translate?hl=en&twu=1&u=http://www.csksoft.net/blog/post/mk802_dev_intro.html
Hope you enjoy them. :-)
This is remarkable. 8 FPS on an MK802 is very good. I've worked with the OpenCV face detection code before; did you use the Viola-Jones algorithm or the other one (can't remember its name offhand)?
In reply to CSK
Hi patwood, I just used OpenCV's CascadeClassifier to do the detection job. It seems this interface uses the Viola-Jones algorithm. To achieve 8 fps, I used an armhf rootfs and recompiled the OpenCV code to use NEON instructions.
In reply to patwood
Awesome work... you are legendary. I've got an A10 Hackberry and I'm doing the same, trying to run OpenCV on it. :)
In reply to CSK
Brilliant, CSK. I'm running craze's Gentoo image: I just grabbed OpenCV from Portage, grabbed your code from git, made a small mod to remove the VFD display code and actuator control code, and my MK802 II is running face detect from a PS2 EyeToy, compiled on the MK802 itself :). Great examples to get me going.

The face detect runs with a loop time of ~180 ms when it's not recognising a face, and I think faster when it does. Unfortunately it doesn't recognise my face very well :(, but that's all down to the classifiers.

My intent is to use OpenCV to help my MK802 understand its surroundings by identifying known features in the rooms, and provide a robot with a method for multi-room navigation, initially to find its charger. I figure that if I create easy-to-identify features and place them around the room, it should be possible to navigate by the size of a feature (giving distance) and the bearing of a feature (by turning until it's central). Thanks again for this inspiration :).
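The navigate-by-feature-size idea above boils down to the pinhole camera model: a feature of known real-world width that appears a certain number of pixels wide tells you its distance, and its horizontal offset from the image centre gives its bearing. A small sketch of that arithmetic, with purely hypothetical numbers (marker width, focal length in pixels, image size):

```python
import math

def estimate_distance(real_width_m, focal_px, pixel_width):
    """Pinhole-camera distance estimate: a feature of known real width
    that spans pixel_width pixels lies at real_width * f / pixel_width."""
    return real_width_m * focal_px / pixel_width

def estimate_bearing(x_center_px, image_width_px, focal_px):
    """Horizontal angle (radians) from the optical axis to the feature;
    turn the robot until this is ~0 to centre the feature."""
    return math.atan((x_center_px - image_width_px / 2) / focal_px)

# Hypothetical numbers: a 0.20 m wide marker seen 100 px wide by a
# camera with a 500 px focal length sits about 1.0 m away.
print(estimate_distance(0.20, 500, 100))                  # 1.0
print(math.degrees(estimate_bearing(420, 640, 500)))      # ~11.3 degrees
```

The focal length in pixels would need to be calibrated once per camera (e.g. by imaging a marker at a known distance); after that, distance plus bearing to two or more known markers is enough to fix the robot's position in a room.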
Slightly more detail. A link I've gathered inspiration from: http://raspberrytank.ianrenton.com/. I've got two projects on the go. One is very similar to the tank: a tracked vehicle based on a RoboRover base like this: http://evmakers.wordpress.com/2012/12/22/glitch-a-hacked-wowwee-roborover-laser-cat-toy/ using a PIC and UBW firmware modified for H-bridge PWM control. The other is an iCybie (http://en.wikipedia.org/wiki/I-Cybie) modified into a 'Super i-Cybie', which will be commandable via serial (USB-to-serial from the MK802), the MK802 giving it more of a brain, and vision if I can squeeze the camera(s) on board. Battery power may be my biggest issue :), so I've ordered five DC-DC converters for peanuts to obtain efficient 3.3/5/6 V from the various NiMH packs I have lying around.
In reply to CSK