[Dprglist] Sample Retrieval vacuum cleaner

Doug Paradis paradug at gmail.com
Sun Jan 20 17:07:47 PST 2019


John,
    If you placed your robot in the contest measurement box at 45 degrees
and your robot was roughly round, you could have a max outside dimension of
about 24 inches for your receivers.

In regard to the clarification you pointed out, yes, an individual
competitor's entry could remember previous runs, kinda like MicroMouse
running the maze several times. That competitor just couldn't share the
mapping data with another competitor.

I will have to take several of the points that have come out in this
discussion and capture them on the rule clarifications web page.

Regards,
Doug P.

On Sun, Jan 20, 2019 at 1:05 PM John Swindle <swindle at compuserve.com> wrote:

> Doug,
>
> People with hearing aids can't figure out who is talking to them because
> hearing aids are in, or on, their ears (behind-the-ear has a piece in the
> ear), changing the ear canal that the wearer grew up with. The ones that
> aren't in-the-ear also mess up the pinna. Have you wondered how we can know
> sound is in front of us, behind us, or above us, when we only have two
> ears? It's the pinna and the ear canal. The pinna and the ear canal screw
> up the sound. They are not flat transducers of the sound. We learn how to
> hear as we grow up. We learn what the screwups sound like. If the pinna is
> cut off or interfered with, and the ear canal is plugged with an aid, the
> wearer loses all but left-right info about where sounds come from, until
> they re-learn how to hear. A few decades ago, head models were used for
> recordings and head-related transfer functions (HRTFs) were created. After 30 or
> so years, just a couple years ago, the Audio Engineering Society published
> a standard for measuring and using HRTFs. The only use I've seen for them
> so far is virtual reality games.
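The left-right localization described above comes from the interaural time difference between two receivers. A minimal sketch of that geometry, assuming a far-field source and made-up names (`itd_azimuth`, a 0.20 m baseline), would be:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at ~20 C


def itd_azimuth(delta_t, mic_spacing):
    """Estimate azimuth (radians) from interaural time difference.

    delta_t: arrival-time difference between the two receivers (s),
             positive when the sound reaches the right receiver first.
    mic_spacing: distance between the receivers (m).
    Far-field approximation: path difference = spacing * sin(theta).
    Note the front/back ambiguity a bare two-receiver array cannot
    resolve -- exactly the information the pinna normally supplies.
    """
    s = SPEED_OF_SOUND * delta_t / mic_spacing
    s = max(-1.0, min(1.0, s))  # clamp against measurement noise
    return math.asin(s)


# A source 30 degrees to the right of a 0.20 m baseline:
dt = 0.20 * math.sin(math.radians(30)) / SPEED_OF_SOUND
print(round(math.degrees(itd_azimuth(dt, 0.20)), 1))  # 30.0
```

With only this, up/down and front/back are indistinguishable, which matches the point about hearing-aid wearers losing all but left-right information.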
>
> Although not proven, it is strongly suspected that bats likewise use their
> screwed-up ear shape to determine where the wing beats and echoes are
> coming from. To accommodate that, the ping is a chirp, so each frequency is
> messed up differently. Chirps are traditionally considered to be a way to
> compress energy, and this may be a reason, but I adhere to the opinion that
> bats chirp to determine direction.
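A chirp of the kind described, where each frequency is "messed up differently", is just a sweep of instantaneous frequency over the ping. A minimal sketch (the sweep range and duration are illustrative guesses, not measured bat parameters):

```python
import math


def linear_chirp(f0, f1, duration, sample_rate):
    """Generate a linear chirp sweeping f0 -> f1 Hz over `duration` s."""
    n = int(duration * sample_rate)
    k = (f1 - f0) / duration  # sweep rate, Hz per second
    samples = []
    for i in range(n):
        t = i / sample_rate
        # Instantaneous phase of a linear chirp: 2*pi*(f0*t + k*t^2/2)
        phase = 2.0 * math.pi * (f0 * t + 0.5 * k * t * t)
        samples.append(math.sin(phase))
    return samples


# A 5 ms downward sweep, 80 kHz -> 40 kHz, sampled at 500 kHz:
ping = linear_chirp(80_000.0, 40_000.0, 0.005, 500_000)
print(len(ping))  # 2500
```

Because every instant of the ping carries a different frequency, each frequency's echo is filtered differently by the ear shape, which is the direction-finding mechanism suggested above.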
>
> Still doesn't answer your underlying question about why my gadget is so
> big. The bats are pointing their ears and they are flying and they are
> smart. I don't like to make stuff that moves, except to move air. So, if
> the sensors can't move, the gadget has to be omnidirectional. The benefit
> is that big chunks of the arena are mapped before the robot moves.
>
> I disagree about ears being integral parts. To me, ears certainly look
> like things hanging about, slapped on, and they are big in relation to many
> other body features. Likewise on bats.
>
> My understanding is that bats use passive sonar to listen for insect wing
> beats and then use active sonar for targeting. Some insects listen for the
> active sonar and quit flying, freefalling. That tactic works if the bat has
> not acquired the moth, sort of like your radar detector alerting you to the
> bear. But if the bat has targeted, it might still catch the freefalling
> moth. I do not believe bats can detect moths that are not flying, ones that
> have lit on a branch.
>
> The challenge is severe. What I have done so far is nowhere good enough.
> With distributed beacons, I've done rudimentary mapping of a room. But if
> the robot is carrying the emitters, I need quite a bit of separation
> between them to get resolution. Raising the frequency for better resolution
> destroys the omnidirectionality. (Kinda like LiDAR when it can only look at
> one spot at a time, or one plane at a time.) I am going to have to come up
> with something much better in order to identify shapes. I imagine the
> gadget mapping the room and identifying obstacles without determining what
> the obstacles are, and then collecting all the obstacles that can be moved
> (Ron's strain gauge). Bats eat anything that's in the air.
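The resolution-versus-omnidirectionality tradeoff above follows from the wavelength: resolvable detail scales with wavelength, while a transducer becomes more directional as its aperture grows relative to that wavelength. A quick back-of-the-envelope check (frequencies chosen for illustration):

```python
SPEED_OF_SOUND = 343.0  # m/s in air


def wavelength(freq_hz):
    """Wavelength in metres of a sound wave in air."""
    return SPEED_OF_SOUND / freq_hz


# Higher frequency buys finer detail, but a fixed-size transducer
# spans more wavelengths and so beams instead of radiating broadly.
for f in (40_000, 80_000, 200_000):
    print(f"{f} Hz -> {wavelength(f) * 1000:.1f} mm")
```

At 40 kHz the wavelength is roughly 8.6 mm, so features much smaller than a centimetre are out of reach without going up in frequency and losing the wide-open field of view.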
>
> Thank you for offering bases for me to use. Very gracious of you. I will
> look at those and let you know. In the meantime, if someone wanted to treat
> the gadget just as another sensor hosted on their robot, that would be
> great.
>
> To clarify: An individual competitor's entry could remember previous runs,
> kinda like MicroMouse running the maze several times. That competitor just
> couldn't share the mapping data with another competitor. Right?
>
> Thanks!
>
> John Swindle
>
>
>
> -----Original Message-----
> From: Doug Paradis <paradug at gmail.com>
> To: John Swindle <swindle at compuserve.com>
> Cc: Ron Grant <deltagraph at aol.com>; DPRG <dprglist at lists.dprg.org>
> Sent: Sun, Jan 20, 2019 10:43 am
> Subject: Re: [Dprglist] Sample Retrieval vacuum cleaner
>
> John,
>     I am still working through your message, however I have some thoughts
> and possible answers.
> First a few questions, I know bats have large ears, but in the scheme of
> things they are small animals, so their sensors (i.e., ears) are no larger
> than 2x4 inches each. They use a high pitch or ultrasonic squeak (emitter).
> They also carry their emitter and receivers on their body without any
> non-integral body parts hanging about. With this equipment they can easily
> detect insects flying about. The question: is this simply obstacle
> detection (i.e., I look in the air, I see an obstacle, it moves, therefore
> it is food) or target determination (i.e., I look in the air, I see a
> moth, not a mosquito)? What if the moth is on a tree branch? Can a bat
> still find the moth?
>
> Do bats move their ears while listening to form a scan of the reflection?
>
> The contest challenge is to find specific objects (samples). The
> environment will be a room with lots of stuff and people in it. Could a
> sonic system have enough discrimination to identify the target objects? Or
> would it only be able to detect obstacles? What object characteristics or
> features could it determine (height, width, density, surface curvature,
> profile, distance, etc...)?
>
> How would you return to home base? I assume that you will detect the cone
> at home base as just another unique object and drive to it.
>
> Now I have some answers.
>
> If the robot base is holding you back, I will offer use of my robot
> "Falcon". You can find a picture of it at
> https://www.dprg.org/fall-indoor-competition-results/. It is the white
> robot in front of me in the quick trip picture.  It is a club robot base
> with a gated scoop in front. I will make provisions to mount a ~5x6.5 inch
> deck (or multiple decks) on it for your sensors and electronics.
> Communication between your electronics and the base will be via a serial
> port (115200 baud). The robot is capable of dead reckoning navigation. We
> can work out an interface protocol. My thought is that your electronics would
> communicate these basic commands: turn right, turn left, go straight, stop,
> or drive to x,y location. The gate closes when an object trips a sensor
> inside the scoop, but you could also command gate position. Currently the
> robot has a low mounted ultrasonic sensor, and a pixyCAM-1. I can remove
> the pixyCAM and the ultrasonic sensor can be unplugged. The wheel base is
> 9.75 inches. So your task would be to construct your gear and place it on
> the deck. I can laser cut a deck, with mounting holes and mail it to you,
> or give to you at next meeting. The deck will be made of Masonite. I will
> provide the robot base software. Let me know if this is acceptable.
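The command set sketched above (turn right, turn left, go straight, stop, drive to x,y) could be framed over the 115200-baud serial link in many ways. A minimal sketch, where the verbs, the line format, and millimetre coordinates are all assumptions for discussion rather than Falcon's actual protocol:

```python
def frame_command(cmd, *args):
    """Frame a drive command as one newline-terminated ASCII line.

    Hypothetical verbs for illustration: "TURN_LEFT", "TURN_RIGHT",
    "STRAIGHT", "STOP", "GOTO <x> <y>" (coordinates in millimetres),
    "GATE <OPEN|CLOSE>". A line-per-command ASCII framing keeps the
    protocol easy to debug with a bare terminal program.
    """
    parts = [cmd] + [str(a) for a in args]
    return (" ".join(parts) + "\n").encode("ascii")


print(frame_command("GOTO", 1200, 450))  # b'GOTO 1200 450\n'
print(frame_command("STOP"))             # b'STOP\n'
```

Over the real link these bytes would simply be written to the port (e.g. with pySerial's `Serial.write` at 115200 baud), with the base echoing status lines back the same way.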
>
> An alternative robot base is the club's turtle robot. It is based on a
> Roomba. It would be much more work but would provide more deck space. I
> believe it is a ROS-capable robot, but I haven't looked at it in a while so I
> don't know the state of its electronics.
>
> Programming the robot with object locations is a no go. Note that the
> contest's objects are available for practice and generating learning sets
> at RBNO every Tuesday.
>
> Sharing information from a competitor's robot run is, I believe, also a no
> go. It seems to me that it would be unfair to other competitors.
>
> Note the rules say that the robot must carry all the sensors, but the
> computations do not need to be done on the robot; you can use an external
> computer with some type of radio to communicate with the robot. Most folks
> use WiFi.
> Depending on the object features extracted by the sonic system, I think
> that AI could be useful.
> Regards,
> Doug P.
>
>
>