[Dprglist] (no subject)

David Anderson davida at smu.edu
Fri May 1 13:36:14 PDT 2020


Robert,

Thanks for the link.  I don't have a Google login, so I'll have to
figure out how to download it.

The gyro/IMU I've been using is the CH6 from CH Robotics, but it has
some temperature drift I haven't been able to compensate for, so I'm
in the process of switching to the BNO055, which guys in the group
have had good success with.  Basically the robot runs normal odometry
using the wheel encoders but substitutes the IMU heading for the
theta value in the sin() and cos() calculations.

The jBot robot uses a pretty pricey 9DOF IMU but runs basically the
same software.  Here's a link with some code snippets for that robot
that describe the basic technique.  It's the same link Murray already
posted:

http://www.geology.smu.edu/~dpa-www/robo/Encoder/imu_odo/


I've never written up the perimeter following behavior, but here's
the technique in general.  The perimeter following for the outdoor
jBot uses the outer two Polaroid sensors, but on the smaller nBot
balancer it uses Sharp IR distance sensors.  These are mounted on
each side and angled forward about 45 degrees and down about 15
degrees.

The technique is to use the robot's normal obstacle avoidance sensors
and behaviors to push the robot away from the perimeter, in
conjunction with the angled distance sensors to follow it.  The
angled sensor readings are divided into four ranges:

  1) too close: turn gently away from the wall
  2) deadzone: just go straight
  3) too far: turn gently toward the wall
  4) way too far (no detection): turn sharply toward the wall
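A minimal C sketch of that four-range logic might look like the
following.  The threshold values and the function name are
illustrative only; the real breakpoints depend on the sensors and
their mounting.

```c
/* Hypothetical four-range perimeter-following rule, wall on the right.
   Returns a turn command: positive = away from wall (left),
   negative = toward wall (right), 0 = go straight. */
#define TOO_CLOSE_IN 10  /* inches; illustrative thresholds */
#define DEADZONE_IN  20
#define TOO_FAR_IN   40  /* beyond this: no detection */

int perimeter_turn(int range_in)
{
    if (range_in < TOO_CLOSE_IN) return  1;  /* 1) gently away from wall  */
    if (range_in < DEADZONE_IN)  return  0;  /* 2) deadzone: go straight  */
    if (range_in < TOO_FAR_IN)   return -1;  /* 3) gently toward wall     */
    return -2;                               /* 4) no detect: turn sharply */
}
```

The deadzone in the middle is what keeps the robot from hunting back
and forth along a straight stretch of wall.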

In addition, the normal obstacle avoidance behaviors (detect on left:
turn right; detect on right: turn left; detect in center: keep
turning whichever way you're already turning) are modified so that
center detections always turn away from the wall.  The "bumper"
behaviors on both bots, actually derived from the IMU rather than
physical bumpers, also turn away from the wall on center detects.
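The modified center-detect rule might be sketched like so.  Names and
sign conventions are hypothetical (positive = turn right, negative =
turn left), not from the actual robot code.

```c
#include <stdbool.h>

/* Avoidance turn rules as described above. The normal rule would keep
   a center detection turning whichever way it was already turning;
   while perimeter following, a center detection instead always turns
   away from the wall being followed. */
typedef enum { DETECT_LEFT, DETECT_CENTER, DETECT_RIGHT } Detect;

int avoid_turn(Detect d, int current_turn, bool following_right_wall)
{
    switch (d) {
    case DETECT_LEFT:  return  1;  /* obstacle on left: turn right */
    case DETECT_RIGHT: return -1;  /* obstacle on right: turn left */
    case DETECT_CENTER:
        /* modified rule: always away from the followed wall */
        return following_right_wall ? -1 : 1;
    }
    return current_turn;  /* fallback, not reached for valid input */
}
```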

The normal obstacle avoidance behaviors are higher priority than
perimeter following and so subsume it, and the bumper behaviors are
the highest priority of all.
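That priority scheme is a fixed-priority (subsumption-style) arbiter
and can be sketched as below.  Struct and function names are
illustrative, not from the actual robot code.

```c
#include <stdbool.h>

/* Index 0 is the highest-priority layer (bumper), then obstacle
   avoidance, then perimeter following. The first active behavior
   wins; lower-priority requests are subsumed. */
typedef struct {
    bool active;  /* does this behavior want control this cycle? */
    int  turn;    /* its requested turn command */
} Request;

int arbitrate(const Request reqs[], int n)
{
    for (int i = 0; i < n; i++)
        if (reqs[i].active)
            return reqs[i].turn;  /* higher layer subsumes the rest */
    return 0;                     /* nothing active: go straight */
}
```

Run once per control cycle, this is what lets the robot drop out of
perimeter following the instant an avoidance or bumper behavior fires.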

Here's a video of the jBot robot navigating toward a waypoint on the
far side of a large building.  It gets trapped in a cul-de-sac,
switches to perimeter following to trace the outline of the building,
and switches back to waypoint navigation once clear of it.

http://www.geology.smu.edu/dpa-www/robo/jbot/jbot2/jbot_ti2_m1.mpg


Hope this is helpful,
dpa




On 05/01/2020 10:15 AM, Robert Zeiler wrote:
> Looks very interesting. What sensors did you use to follow the wall 
> and how does the gyro work?
>
> Attached is a video showing Herbert's arm and gripper in use. Herbert 
> understands verbal instruction. The action shown in the video is the 
> result of telling Herbert to "take this" and then "release".
> IMG_0212 (1).MOV 
> <https://drive.google.com/file/d/1u1sMsd262GLKUGZ_7DbKNtMeVt1UKFLJ/view?usp=drive_web>
>
>
> On Thu, Apr 30, 2020 at 1:36 PM David Anderson <davida at smu.edu 
> <mailto:davida at smu.edu>> wrote:
>
>     Cool.  Especially the force sensing gripper.  Any pics or video?
>
>     I played with Roborealm some (many) years ago but didn't pursue it
>     seriously at the time.  A couple of the guys in the group use
>     lidar for localization for some of the robot contest courses
>     though I'm not sure they use it for more real-world environments. 
>     Contest courses are easy :)
>
>     I've had pretty good success with a generalized version of wall
>     following which I call perimeter following (i.e., doesn't require
>     a nice flat wall) to do room navigation. The robot also runs
>     location calculations at 20Hz using gyro corrected odometry while
>     perimeter following so it knows where it is and can, for example,
>     stop when it gets back to the starting point, as you suggest.
>
>     Here's a video of the two-wheel balancing robot nbot doing some
>     perimeter following in the basement of the Heroy building at SMU
>     where I work, which is a pretty challenging environment:
>
>     http://www.geology.smu.edu/dpa-www/robo/nbot/20120614_nbot_05b.mpg
>
>     regards
>
>     dpa
>
>
>
>     On 04/30/2020 12:58 PM, Robert Zeiler wrote:
>>     Thanks I've been thinking about the left or right wall rule. Yes,
>>     Herbert started out as a LEAF robot. I've gone further than the
>>     group though. The last addition to Herbert was an arm with
>>     several DOF and a force sensing gripper. Had to learn a lot about
>>     torque and gear and chain drives.
>>     Have you had any success using LIDAR or programs like Roborealm?
>>     (Herbert uses Roborealm for some things).
>>
>>     Robert
>>
>>     On Thu, Apr 30, 2020 at 10:16 AM David Anderson via DPRGlist
>>     <dprglist at lists.dprg.org <mailto:dprglist at lists.dprg.org>> wrote:
>>
>>         Robert,
>>
>>         Sounds like what you are looking for is SLAM. Though from
>>         your description perimeter following would probably work and
>>         be much more robust.  Is Herbert one of the LEAF robots?
>>
>>         regards
>>
>>         dpa
>>
>>
>>
>>         On 04/30/2020 11:50 AM, Robert Zeiler via DPRGlist wrote:
>>>         Hi all
>>>         Thanks for the reply. I have also done odometry on my robots
>>>         as well as ultrasound and IR.
>>>         But, for this application, I was looking for experience with
>>>         either visual or lidar mapping techniques. Basically the
>>>         idea is for the robot to enter a room, scan the environment
>>>         for obstacles (will also have onboard sonar for collision
>>>         avoidance), make a map from the readouts and then enter the
>>>         room using the info to establish a path through the room.  I
>>>         want to hit all areas of the room. The robot will return to
>>>         the starting point and stop.
>>>
>>>         Robert
>>>
>>>         On Wed, Apr 29, 2020 at 5:22 PM Murray Altheim via DPRGlist
>>>         <dprglist at lists.dprg.org <mailto:dprglist at lists.dprg.org>>
>>>         wrote:
>>>
>>>             Hi Robert,
>>>
>>>             I'm also keen to understand how to perform some of the
>>>             tricks David has
>>>             perfected, and it's worth mentioning that he has a
>>>             helpful page on
>>>             odometry at:
>>>
>>>             http://www.geology.smu.edu/~dpa-www/robo/Encoder/imu_odo/
>>>
>>>             As my robots are all targeted at indoor use, GPS is unavailable.
>>>
>>>             I've put together the beginnings of a page on the NZPRG
>>>             wiki on the
>>>             subject at:
>>>
>>>             https://service.robots.org.nz/wiki/Wiki.jsp?page=Odometry
>>>
>>>             but it's not had much love (yet) as I'm still getting my
>>>             PID controller
>>>             to the point of functionality (and not being sidetracked
>>>             by every other
>>>             whim that comes my way, such as Firmata).
>>>
>>>             I've also considered having my robot perform repeated
>>>             scans of the signal
>>>             strength of all the WiFi signals it can see (dozens, in
>>>             a suburban
>>>             neighborhood) from the four corners of my house, storing
>>>             that information,
>>>             and using that info, along with compass heading (from a
>>>             BNO055) to get an
>>>             idea where in my house the robot is. Since the Raspberry
>>>             Pi has WiFi built
>>>             in, this is a free exercise (no additional sensors
>>>             required). There's a
>>>             lot of noise, you'll need both a blacklist (because cell
>>>             phones move
>>>             around) and a whitelist (to prioritise known sources as
>>>             if they were
>>>             beacons, or actually use a few older Pis as beacons),
>>>             but I'm still
>>>             thinking it might work...
>>>
>>>             Cheers,
>>>
>>>             Murray
>>>
>>>             On 30/04/20 12:07 pm, David Anderson via DPRGlist wrote:
>>>             > Robert (and Herbert)
>>>             >
>>>             > I've been doing autonomous robot navigation
>>>             successfully for some years now using location
>>>             information gathered from wheel encoders and gyros on a
>>>             number of my robots. I'd be happy to answer any
>>>             questions you might have.  You might start
>>>             > out by looking at the navigation writeups associated
>>>             with my outdoor jBot robot:
>>>             >
>>>             > http://www.geology.smu.edu/dpa-www/robo/jbot
>>>             >
>>>             > Here's a video of that robot navigating through the
>>>             woods to a waypoint 500 feet away and returning to
>>>             within a few inches of the starting point:
>>>             >
>>>             >
>>>             http://www.geology.smu.edu/~dpa-www/robo/jbot/jbot_hatrick2_2.mpg
>>>             >
>>>             > The jBot robot has an onboard GPS but that is not used
>>>             or required for these navigation tasks.
>>>             >
>>>             > best regards,
>>>             >
>>>             > dpa
>>>             >
>>>             > On 04/29/2020 06:37 PM, Robert Zeiler via DPRGlist wrote:
>>>             >> Has anybody had any success using any kind of mapping
>>>             system or device for robot autonomous navigation?
>>>             >>
>>>             >> Robert and Herbert (the robot)
>>>             ...........................................................................
>>>             Murray Altheim <murray18 at altheim dot com>           
>>>                        = =  ===
>>>             http://www.altheim.com/murray/                          
>>>                      ===  ===
>>>                                   = =  ===
>>>                  In the evening
>>>                  The rice leaves in the garden
>>>                  Rustle in the autumn wind
>>>                  That blows through my reed hut.
>>>                         -- Minamoto no Tsunenobu
>>>
>>>             _______________________________________________
>>>             DPRGlist mailing list
>>>             DPRGlist at lists.dprg.org <mailto:DPRGlist at lists.dprg.org>
>>>             http://lists.dprg.org/listinfo.cgi/dprglist-dprg.org
>>>
>>>
>>>
>>
>>
>
