[Dprglist] KR01 Infrared Behaviours

Murray Altheim murray18 at altheim.com
Tue Aug 31 04:19:44 PDT 2021


On 31/08/21 6:35 am, David P. Anderson via DPRGlist wrote:
> Hi Murray,
> 
> You might want to look at some of the video and slides from the June
> DPRG navigation talk that perhaps address some of your questions:
> IR avoidance behavior slides 26-34 from the talk at 00:50 thru 01:00,
> and Perimeter Following behavior slides 43-48 from the talk at 01:08
> thru 01:14.

Hi David,

Yes, I will admit my design used to be different but has recently
been modified based on your influence. The one pronounced difference is
that I'm using 150cm analog IRs rather than digital IRs, and my Cunning
Plan is, like my Roam Behaviour, to add a lambda function into the
motor control to alter the target velocity as a function of distance.

I.e., the center IR used for the Roam Behaviour isn't a yes/no
"we've seen something ahead"; instead, the robot gradually slows
to zero at a configured minimum distance if it hasn't swerved away
by that point.
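
As a rough sketch of what I have in mind (the names and thresholds
below are placeholders, not my actual code):

    # Scale the target velocity by the center IR distance, reaching
    # zero at a configured minimum distance (values are placeholders).
    MIN_CM = 20.0    # configured minimum distance
    MAX_CM = 150.0   # range of the analog IR

    # lambda returning a multiplier in [0.0, 1.0] for the target velocity
    velocity_scale = lambda d_cm: max(0.0, min(1.0,
            (d_cm - MIN_CM) / (MAX_CM - MIN_CM)))

    def scaled_target_velocity(target, center_ir_cm):
        # slow linearly as the obstacle nears, stopping at MIN_CM
        return target * velocity_scale(center_ir_cm)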

The two oblique IRs, which you're calling your LEFT and RIGHT, are
positioned on my robot roughly the same as yours, but as with the Roam
Behaviour I'm planning to have the *amount* of swerve away from the
obstacle be a function of distance. The relative difference between
the functions of the LEFT and RIGHT motors would then determine the
trajectory.

So if both LEFT and RIGHT were reading the same distance, the effect
would be the same as Roam, since the LEFT and RIGHT would balance
each other out. If LEFT were closer, the robot would steer away from
the left in an arc inversely proportional to the distance. This is
similar to the idea bandied about in the group of having a PID (or
P or PD) controller for steering, except I'm using it for swerving
away from an obstacle.
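
In sketch form (again, hypothetical names and gains, nothing I've
actually written yet):

    # Each oblique IR contributes a swerve bias that grows as the
    # obstacle gets closer; the difference between the two biases
    # determines the trajectory.
    def swerve_bias(d_cm, max_cm=150.0, gain=1.0):
        # 0.0 at or beyond sensor range, rising toward 'gain' up close
        if d_cm >= max_cm:
            return 0.0
        return gain * (1.0 - d_cm / max_cm)

    def motor_targets(base_velocity, left_cm, right_cm):
        left_bias  = swerve_bias(left_cm)   # obstacle to the left pushes right
        right_bias = swerve_bias(right_cm)  # obstacle to the right pushes left
        # equal biases cancel, so the robot runs straight, as in Roam;
        # a closer LEFT reading speeds the left motor and slows the
        # right, arcing away from the obstacle
        left  = base_velocity + (left_bias - right_bias) * base_velocity
        right = base_velocity + (right_bias - left_bias) * base_velocity
        return left, right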

> Your hardware is more like that on the nBot balancing robot (i.e.,
> IR distance rather than proximity detectors) but the code for the
> avoidance and wall following behaviors are basically the same as
> covered in the talk, and as illustrated in the video of nBot around
> 01:12.

And of course the RCAT just before that as well. This is what I believe
you're calling a Perimeter Following behaviour, and would correspond to
what I called Wall Follow. You had said that it was the balancing act
between the pull of the long-range analog Sharp IRs and the push of
the short-range IRs and ultrasonics.
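
If I've understood that balance correctly, it amounts to something like
this (a sketch only, with invented names, setpoint, and gain):

    # The long-range IR "pulls" the robot toward the wall when too far,
    # the short-range sensors "push" it away when too close; taken
    # together it's effectively a P controller on the wall distance.
    def perimeter_correction(wall_cm, setpoint_cm=40.0, k_p=0.02):
        error = wall_cm - setpoint_cm    # positive means too far away
        return k_p * error               # steering bias toward the wall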

I've got the hardware for the ultrasonics as well as the VL53L1X ToF
LIDAR, the latter mounted on a servo, but I've not begun porting all
of that over into the new OS yet. So I'm focused on just getting these
proximity/obstacle avoidance behaviours working first.

Your intuition about the interplay between the different sensors is
what I'm trying to work out, e.g., what you've called "out of band
signaling" is basically where I'm at right now.

Thanks always for your patient attempts to guide me towards some kind
of reasonable solution... slowly getting there...

[I've now watched all or parts of that video multiple times. There's
a lot there to digest...]

Cheers,

Murray

FYI note: Session 2: Waypoint Navigation begins at 1:15:40
...........................................................................
Murray Altheim <murray18 at altheim dot com>                       = =  ===
http://www.altheim.com/murray/                                     ===  ===
                                                                    = =  ===
     In the evening
     The rice leaves in the garden
     Rustle in the autumn wind
     That blows through my reed hut.
            -- Minamoto no Tsunenobu


