[Dprglist] Robotics - Capability Maturity models

Thalanayar Muthukumar tnkumar at gmail.com
Thu Oct 28 19:21:38 PDT 2021


When I looked at the capability levels Chris proposed, I found them useful from my personal perspective: as a list, they help define the details of each capability I could pursue.

Thanks Chris.

I did not look at it as something to pursue strictly in sequence, but rather as guidance from more experienced people on what to pursue and on the relative complexity of the different capabilities.

Regards.
- Kumar

Sent from my iPhone

> On Oct 27, 2021, at 3:54 PM, dprglist-request at lists.dprg.org wrote:
> 
> Send DPRGlist mailing list submissions to
>    dprglist at lists.dprg.org
> 
> To subscribe or unsubscribe via the World Wide Web, visit
>    http://lists.dprg.org/listinfo.cgi/dprglist-dprg.org
> or, via email, send a message with subject or body 'help' to
>    dprglist-request at lists.dprg.org
> 
> You can reach the person managing the list at
>    dprglist-owner at lists.dprg.org
> 
> When replying, please edit your Subject line so it is more specific
> than "Re: Contents of DPRGlist digest..."
> 
> 
> Today's Topics:
> 
>   1. seriously off topic rant about tonight's discussion (Karim Virani)
>   2. Special interrupts/ timers on STM32 for interfacing with
>      encoders (Thalanayar Muthukumar)
>   3. Robotics "Capability Maturity Model" - Thoughts? (Chris N)
>   4. Re: CircuitPython for NUCLEO STM32L476RG (Doug Paradis)
>   5. Re: CircuitPython for NUCLEO STM32L476RG (Thalanayar Muthukumar)
>   6. Re: Robotics "Capability Maturity Model" - Thoughts? (Carl Ott)
>   7. Re: CircuitPython for NUCLEO STM32L476RG (David P. Anderson)
>   8. Re: Robotics "Capability Maturity Model" - Thoughts? (Iron Reign)
>   9. Re: seriously off topic rant about tonight's discussion
>      (Karim Virani)
>  10. Re: NUCLEO STM32 L476RG - need suggestion of development
>      environment to use (Thalanayar Muthukumar)
>  11. Re: Robotics "Capability Maturity Model" - Thoughts? (Jim Merkle)
>  12. Re: seriously off topic rant about tonight's discussion (Carl Ott)
>  13. Re: Robotics "Capability Maturity Model" - Thoughts?
>      (Murray Altheim)
>  14. Re: Robotics "Capability Maturity Model" - Thoughts?
>      (David P. Anderson)
>  15. Re: seriously off topic rant about tonight's discussion
>      (David P. Anderson)
>  16. Fwd: [Webinar] Using Time-of-Flight Range Sensing to Make
>      Appliances Better (Doug Paradis)
>  17. Re: Fwd: [Webinar] Using Time-of-Flight Range Sensing to Make
>      Appliances Better (David P. Anderson)
>  18. Bees (David P. Anderson)
> 
> 
> ----------------------------------------------------------------------
> 
> Message: 1
> Date: Wed, 27 Oct 2021 03:15:11 -0500
> From: Karim Virani <pondersome64 at gmail.com>
> To: DPRG <dprglist at lists.dprg.org>
> Subject: [Dprglist] seriously off topic rant about tonight's
>    discussion
> Message-ID:
>    <CAKtnkiz-KZ422_x4Czx0AUEdODaoJA4U6SJ5Uc43BAGaQjkORA at mail.gmail.com>
> Content-Type: text/plain; charset="utf-8"
> 
> First, the Nature special on bees was just fantastic. I went ahead and
> watched it after the conversation tonight.
> https://video.kera.org/video/my-garden-of-a-thousand-bees-trjhzt/
> 
> And then ... there's the Donald Hoffman TED interview ...
> 
> OMG David!!!
> 
> You were totally fun'in us. You meant to provoke! DH is just a Deepak
> Chopra wannabe. I resist giving credence to these peddlers of soft-shoe
> quantum theory tincture in pursuit of monetizable wishful thinking.
> 
> Granted this was only one interview on a platform that often caters to the
> intellectual mystics among us (I used to be a fan of TED talks), but this
> dude outed himself completely.
> 
> First he completely mis-characterizes the field of modern cognitive science
> (if that's what he considers to be his colleagues) and paints it in the
> light of 70's era progress. As if he was the first to consider fitness as
> the basis for how evolutionary development works. Almost nobody thinks
> sensory evolution is driven to create accurate or truthful interpretations
> of reality. He can't claim that as his unique insight. It's like he's
> saying his peers all have a 5th grade understanding of evolution.
> 
> But then he goes totally bonkers:
> 
> 1. Consciousness is hard to describe and investigate - ok so far
> 2. So let's throw traditional "reality" out the window and assume the
> universe is fundamentally made up of a network of multi-level conscious
> entities
> 3. For those entities bundled up as humans, the network has decided to give
> them an "interface" that creates time, space, particles, neurons, etc. as a
> useful fiction. (ie. the software is real and the hardware is the story)
> 4. Oh, and I have some math, so it's not really BS
> 5. Oh, and I may or may not believe this, but I'm brave for going out on a
> limb and daring to shake up the field because hard problems need
> disruptions to solve. (this is my get out of jail free card, maybe)
> 
> I agree with step 1, but step 2, that's a doozy. The rest is a sophomoric
> attempt to confound interesting modern explorations into the foundations of
> physics with 70s era pop quantum psychology like in the Dancing Wu Li
> Masters or the Tao of Physics. My bet, he'd point to those books as his
> influences. They were fun reads when I was a pup. But they are truly works
> of fiction. So is this dude.
> 
> Now I'm going to tell you how I really feel...
> ... Oh, and I'm actually very open minded about this guy...
> 
> David, thanks for riling my sensibilities - haven't had a good rant
> defending honest inquiry in ages.
> 
> Again, this is effectively a slam of a single interview but if you wish to
> point to a place where he offers a shred of evidence toward his theory, or
> can correct my interpretation of this interview, well please share. We can
> continue the fun :)
> 
> Cheers,
> 
> Karim
> ps. dear reader, this acerbic review is a choice in creative expression. If
> it offends your viewpoint or sensibilities, I'm sorry, it was not meant to
> do so
> 
> ------------------------------
> 
> Message: 2
> Date: Wed, 27 Oct 2021 07:07:26 -0500
> From: Thalanayar Muthukumar <tnkumar at gmail.com>
> To: DPRG <dprglist at lists.dprg.org>
> Subject: [Dprglist] Special interrupts/ timers on STM32 for
>    interfacing with encoders
> Message-ID: <A5FE7167-1A2C-4406-9E17-C8F6FED14EAB at gmail.com>
> Content-Type: text/plain; charset=us-ascii
> 
> Thanks all for the inputs yesterday.
> 
> There was a mention yesterday on the call of some special timers / interrupts on the STM32 that make it easy to interface with encoders. 
> 
> Can someone point me to any links?
> 
> Regards.
> - Kumar
> 
> Sent from my iPhone
> 
> ------------------------------
> 
> Message: 3
> Date: Wed, 27 Oct 2021 11:03:40 -0400
> From: Chris N <netterchris at gmail.com>
> To: DPRG <dprglist at lists.dprg.org>
> Subject: [Dprglist] Robotics "Capability Maturity Model" - Thoughts?
> Message-ID:
>    <CA++ApaW+M66=kPeR+A3p8Z1c7-rMa+zCUgDJZ4bGb59J+D4ojQ at mail.gmail.com>
> Content-Type: text/plain; charset="utf-8"
> 
> So, in part because of the "where do I get started" questions that we
> occasionally get from Kumar, and from others before him, I thought it
> might help to have some sort of capability model in mind that can be
> referenced.
> 
> That way one can ask "OK - what level are you at currently?  What level are
> you trying to reach in the near term?"
> 
> Below is what I mean.  For now this is focused on software and locomotion
> (but starting with Level 8,  perception comes into the picture)
> 
> I have more explanation to go along with each item but wanted to keep it
> brief in this e-mail.
> 
> The idea is not to explain how to do these things.  This just represents
> milestones along the journey.  This is really more about having an agreed
> upon vocabulary.
> 
> Thoughts?   Is something like this helpful to have written down?  Is it
> already written down somewhere ? (I am sure that in some ways, this is
> captured in some of the material that David Anderson has published over the
> years)
> 
> Level 0: I have an API through which I can control the speed and direction
> of the individual wheels. My robot can move!
> 
> Level 1: I have an API through which I can reliably get the incremental
> encoder counts for each wheel.  When motors are commanded with a certain
> duty cycle, I can measure what that translates to in terms of encoder
> counts per time unit.
> 
> Level 2: I am keeping track of my Robot's X, Y and Theta via dead-reckoning
> / odometry
> 
> Level 3: I have taken at least basic steps to calibrate my robot's odometry
> calculations.
> 
> Level 4: I can command the robot to move, but using more abstract units
> such as "meters/second" or at least "encoder ticks per time unit"
> 
> Level 5: My robot actually moves at the linear and angular velocity I tell
> it to, even when battery voltage is lower or surface friction is higher.
> And because I have completed Level 3, it can go in a somewhat straight line
> when I ask it to.
> 
> Level 6: My robot moves in a fairly smooth fashion, i.e. it changes speed
> and direction somewhat gradually. Movement is not "robotic"
> 
> Level 7:  I can command my robot to go to a certain X,Y coordinate aka
> waypoint relative to its current location,  and the robot more or less
> reaches that location provided nothing is in the way
> 
> Level 8:  My robot can reach the target location, even if there are
> obstacles in the way.
> 
> Level 9:  My robot can come back after reaching its destination.
> 
> Level 10: My robot can do all this, even if the obstacles are moving around
> quite a bit or if there are other sources of possible confusion
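> 
> As a rough illustration of what Levels 2 and 3 involve, a minimal
> dead-reckoning update for a differential-drive robot might look something
> like the C sketch below; TICKS_PER_METER and WHEEL_BASE_M are placeholder
> calibration constants, not values from any particular robot.
> 
>   #include <math.h>
> 
>   #define TICKS_PER_METER  10000.0   /* assumed encoder calibration     */
>   #define WHEEL_BASE_M     0.20      /* assumed wheel separation (m)    */
> 
>   static double x = 0.0, y = 0.0, theta = 0.0;  /* pose: meters, radians */
> 
>   /* Call periodically with the encoder ticks accumulated since last call. */
>   void odometry_update(long dticks_left, long dticks_right)
>   {
>       double dl = dticks_left  / TICKS_PER_METER;   /* left wheel travel  */
>       double dr = dticks_right / TICKS_PER_METER;   /* right wheel travel */
>       double dc = (dl + dr) / 2.0;                  /* robot center travel */
> 
>       theta += (dr - dl) / WHEEL_BASE_M;            /* heading change     */
>       x     += dc * cos(theta);
>       y     += dc * sin(theta);
>   }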
> 
> ------------------------------
> 
> Message: 4
> Date: Wed, 27 Oct 2021 10:33:38 -0500
> From: Doug Paradis <paradug at gmail.com>
> To: Thalanayar Muthukumar <tnkumar at gmail.com>
> Cc: DPRG <dprglist at lists.dprg.org>
> Subject: Re: [Dprglist] CircuitPython for NUCLEO STM32L476RG
> Message-ID:
>    <CAOdUW+Y6nvXqFiww8VOWcPp1DJvdh8Q-pFS1VAUDu+RaRmNs6A at mail.gmail.com>
> Content-Type: text/plain; charset="utf-8"
> 
> Kumar,
>         See STM32L475RG datasheet sections 3.24.2 and 3.24.4 to see which
> timers support the encoder mode. Here are some links to get you started:
> https://deepbluembedded.com/stm32-timer-encoder-mode-stm32-rotary-encoder-interfacing/
> and
> https://www.st.com/resource/en/application_note/dm00042534-stm32-crossseries-timer-overview-stmicroelectronics.pdf
> and
> https://www.st.com/content/ccc/resource/training/technical/product_training/group0/2f/ec/a2/2a/74/48/4c/67/STM32G4-WDG_TIMERS-General_Purpose_Timer_GPTIM/files/STM32G4-WDG_TIMERS-General_Purpose_Timer_GPTIM.pdf/jcr:content/translations/en.STM32G4-WDG_TIMERS-General_Purpose_Timer_GPTIM.pdf
> .
> 
>      There are several YouTube videos also. Use search terms: STM32,
> Timers, Quadrature Encoders, Encoder Mode.
> 
>       STM32CUBEMX should be a good tool to help set up the timers. David
> Ackley and David Anderson most likely have example code that they may share.
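> 
> Once a timer has been configured in encoder mode (for example via
> STM32CubeMX), reading it from application code can be as simple as the
> following HAL-based sketch; the timer instance (TIM2), handle name and pin
> wiring here are assumptions:
> 
>   #include "stm32l4xx_hal.h"
> 
>   /* Assumes CubeMX generated MX_TIM2_Init() with TIM2 in Encoder Mode
>      (TI1 and TI2) and the encoder channels wired to the TIM2 CH1/CH2 pins. */
>   extern TIM_HandleTypeDef htim2;
> 
>   void encoder_start(void)
>   {
>       /* The timer hardware counts quadrature edges; no interrupts needed. */
>       HAL_TIM_Encoder_Start(&htim2, TIM_CHANNEL_ALL);
>   }
> 
>   int16_t encoder_read(void)
>   {
>       /* Counter runs up/down in hardware; the cast yields a signed count. */
>       return (int16_t)__HAL_TIM_GET_COUNTER(&htim2);
>   }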
> 
> Regards,
> Doug P.
> 
>> On Mon, Oct 25, 2021 at 9:59 AM Thalanayar Muthukumar via DPRGlist <
>> dprglist at lists.dprg.org> wrote:
>> 
>> Have any of you worked with CircuitPython (not Micropython) on any of the
>> STM32 boards? The reason I ask is that for some of the peripheral boards
>> like motor controllers from Adafruit, they are supported on CircuitPython
>> and not MicroPython.
>> 
>> These (https://circuitpython.org/downloads?q=STM32) are the STM32 boards
>> supported by CircuitPython and
>> here (https://circuitpython.readthedocs.io/en/7.0.x/ports/stm/README.html)
>> is guidance on how to support new STM32 boards in CircuitPython.
>> 
>> During the weekend, I was able to get my NUCLEO board working with I2C
>> OLED and BNO055, thanks to help from Jim Merkle and other folks from the
>> Personal Robotics and Adafruit Discord Servers.
>> 
>> Regards.
>> - Kumar
>> _______________________________________________
>> DPRGlist mailing list
>> DPRGlist at lists.dprg.org
>> http://lists.dprg.org/listinfo.cgi/dprglist-dprg.org
>> 
> 
> ------------------------------
> 
> Message: 5
> Date: Wed, 27 Oct 2021 10:36:02 -0500
> From: Thalanayar Muthukumar <tnkumar at gmail.com>
> To: Doug Paradis <paradug at gmail.com>
> Cc: DPRG <dprglist at lists.dprg.org>
> Subject: Re: [Dprglist] CircuitPython for NUCLEO STM32L476RG
> Message-ID:
>    <CAAwP+LZoVZ-3VBUspV=RmkUUD+LjDKzQwJxaOyWk3jap8MN_bg at mail.gmail.com>
> Content-Type: text/plain; charset="utf-8"
> 
> Thanks Doug P. I will check these out.
> I have never worked with encoders before and should be getting my first
> (motors with encoders) in the coming weeks.
> 
> Regards.
> - Kumar
> 
>> On Wed, Oct 27, 2021 at 10:33 AM Doug Paradis <paradug at gmail.com> wrote:
>> 
>> Kumar,
>>         See STM32L475RG datasheet sections 3.24.2 and 3.24.4 to see which
>> timers support the encoder mode. Here are some links to get you started:
>> https://deepbluembedded.com/stm32-timer-encoder-mode-stm32-rotary-encoder-interfacing/
>> and
>> https://www.st.com/resource/en/application_note/dm00042534-stm32-crossseries-timer-overview-stmicroelectronics.pdf
>> and
>> https://www.st.com/content/ccc/resource/training/technical/product_training/group0/2f/ec/a2/2a/74/48/4c/67/STM32G4-WDG_TIMERS-General_Purpose_Timer_GPTIM/files/STM32G4-WDG_TIMERS-General_Purpose_Timer_GPTIM.pdf/jcr:content/translations/en.STM32G4-WDG_TIMERS-General_Purpose_Timer_GPTIM.pdf
>> .
>> 
>>      There are several YouTube videos also. Use search terms: STM32,
>> Timers, Quadrature Encoders, Encoder Mode.
>> 
>>       STM32CUBEMX should be a good tool to help set up the timers. David
>> Ackley and David Anderson most likely have example code that they may share.
>> 
>> Regards,
>> Doug P.
>> 
>> On Mon, Oct 25, 2021 at 9:59 AM Thalanayar Muthukumar via DPRGlist <
>> dprglist at lists.dprg.org> wrote:
>> 
>>> Have any of you worked with CircuitPython (not Micropython) on any of the
>>> STM32 boards? The reason I ask is that for some of the peripheral boards
>>> like motor controllers from Adafruit, they are supported on CircuitPython
>>> and not MicroPython.
>>> 
>>> These (https://circuitpython.org/downloads?q=STM32) are the STM32 boards
>>> supported by CircuitPython and
>>> here (https://circuitpython.readthedocs.io/en/7.0.x/ports/stm/README.html)
>>> is guidance on how to support new STM32 boards in CircuitPython.
>>> 
>>> During the weekend, I was able to get my NUCLEO board working with I2C
>>> OLED and BNO055, thanks to help from Jim Merkle and other folks from the
>>> Personal Robotics and Adafruit Discord Servers.
>>> 
>>> Regards.
>>> - Kumar
>>> _______________________________________________
>>> DPRGlist mailing list
>>> DPRGlist at lists.dprg.org
>>> http://lists.dprg.org/listinfo.cgi/dprglist-dprg.org
>>> 
>> 
> 
> ------------------------------
> 
> Message: 6
> Date: Wed, 27 Oct 2021 10:44:27 -0500
> From: Carl Ott <carl.ott.jr at gmail.com>
> To: DPRG <dprglist at lists.dprg.org>
> Cc: Chris N <netterchris at gmail.com>, Carl Ott <carl.ott.jr at gmail.com>
> Subject: Re: [Dprglist] Robotics "Capability Maturity Model" -
>    Thoughts?
> Message-ID:
>    <CA+XqQZRP6cVgsNghPuWu7qZLuQU=gS=cGnUAT6AJQMxy+xNfOw at mail.gmail.com>
> Content-Type: text/plain; charset="utf-8"
> 
> Chris,
> 
> What a really great idea!  Borrow from the CMM - I love it.
> https://en.wikipedia.org/wiki/Capability_Maturity_Model
> 
> Will take a little time to respond with better suggestions -
> but as quick feedback -
> That maturity path leans heavily on odometry to measure progress - and
> presumes a certain set of objectives.
> But I can show that my robot can reach a destination and return 'more or
> less' to the start completely without odometry - using only timed motor
> commands - or a little more accurately by using motor velocity control loops.
> I think the CMM is, or should be, more about range of skills, ability to
> produce consistent results, and data-driven process and performance
> improvement...
> 
> So I'm going to chew on this, and come back with something a little more
> abstracted, along the lines of
> 
> Level 0
> no technical background
> never wrote a program
> never built mechatronics
> 
> Level 1
> scattered technical skills or some experience
> * changing an existing program or writing a program from scratch
> * building a kit with software / mechanics and electronics
> 
> Levels 2..10 <needs more thought> ...
> 
> 
> Level 11
> claims personal responsibility for developing best-in-class robots and
> robot technologies from scratch at Boston Dynamics / Tesla / CMU Robotics /
> etc...
> 
> why 11 levels?  simple: This Is Spinal Tap
> 
> 
> - Carl
> 
> On Wed, Oct 27, 2021 at 10:04 AM Chris N via DPRGlist <
> dprglist at lists.dprg.org> wrote:
> 
>> So in part because of questions along the line of "where do I get started"
>> from Kumar, and others before him, that we occasionally get, I thought it
>> might help to have some sort of capability model in mind that can be
>> referenced.
>> 
>> That way one can ask "OK - what level are you at currently?  What level
>> are you trying to reach in the near term?"
>> 
>> Below is what I mean.  For now this is focused on software and locomotion
>> (but starting with Level 8,  perception comes into the picture)
>> 
>> I have more explanation to go along with each item but wanted to keep it
>> brief in this e-mail.
>> 
>> The idea is not to explain how to do these things.  This just represents
>> milestones along the journey.  This is really more about having an agreed
>> upon vocabulary.
>> 
>> Thoughts?   Is something like this helpful to have written down?  Is it
>> already written down somewhere ? (I am sure that in some ways, this is
>> captured in some of the material that David Anderson has published over the
>> years)
>> 
>> Level 0: I have an API through which I can control the speed and direction
>> of the individual wheels. My robot can move!
>> 
>> Level 1: I have an API through which I can reliably get the incremental
>> encoder counts for each wheel.  When motors are commanded with a certain
>> duty cycle, I can measure what that translates to in terms of encoder
>> counts per time unit.
>> 
>> Level 2: I am keeping track of my Robot's X, Y and Theta via
>> dead-reckoning / odometry
>> 
>> Level 3: I have taken at least basic steps to calibrate my robot's
>> odometry calculations.
>> 
>> Level 4: I can command the robot to move, but using more abstract units
>> such as "meters/second" or at least "encoder ticks per time unit"
>> 
>> Level 5: My robot actually moves at the linear and angular velocity I tell
>> it to, even when battery voltage is lower or surface friction is higher.
>> And because I have completed Level 3, it can go in a somewhat straight
>> line when I ask it to.
>> 
>> Level 6: My robot moves in a fairly smooth fashion, i.e. it changes speed
>> and direction somewhat gradually. Movement is not "robotic"
>> 
>> Level 7:  I can command my robot to go to a certain X,Y coordinate aka
>> waypoint relative to its current location,  and the robot more or less
>> reaches that location provided nothing is in the way
>> 
>> Level 8:  My robot can reach the target location, even if there are
>> obstacles in the way.
>> 
>> Level 9:  My robot can come back after reaching its destination.
>> 
>> Level 10: My robot can do all this, even if the obstacles are moving
>> around quite a bit or if there are other sources of possible confusion
>> 
>> 
>> 
>> 
>> _______________________________________________
>> DPRGlist mailing list
>> DPRGlist at lists.dprg.org
>> http://lists.dprg.org/listinfo.cgi/dprglist-dprg.org
>> 
> 
> ------------------------------
> 
> Message: 7
> Date: Wed, 27 Oct 2021 11:06:25 -0500
> From: "David P. Anderson" <davida at smu.edu>
> To: <dprglist at lists.dprg.org>
> Subject: Re: [Dprglist] CircuitPython for NUCLEO STM32L476RG
> Message-ID: <3a747cf1-1cba-9833-6c5a-1137ec874e2f at smu.edu>
> Content-Type: text/plain; charset="utf-8"; Format="flowed"
> 
> Kumar,
> 
> My encoder code was basically cloned from this example:
> 
> https://petoknm.wordpress.com/2015/01/05/rotary-encoder-and-stm32/
> 
> dpa
> 
> 
>> On 10/27/21 10:33 AM, Doug Paradis via DPRGlist wrote:
>> 
>> Kumar,
>>          See STM32L475RG datasheet sections 3.24.2 and 3.24.4 to see
>> which timers support the encoder mode. Here are some links to get you
>> started:
>> https://deepbluembedded.com/stm32-timer-encoder-mode-stm32-rotary-encoder-interfacing/
>> and
>> https://www.st.com/resource/en/application_note/dm00042534-stm32-crossseries-timer-overview-stmicroelectronics.pdf
>> and
>> https://www.st.com/content/ccc/resource/training/technical/product_training/group0/2f/ec/a2/2a/74/48/4c/67/STM32G4-WDG_TIMERS-General_Purpose_Timer_GPTIM/files/STM32G4-WDG_TIMERS-General_Purpose_Timer_GPTIM.pdf/jcr:content/translations/en.STM32G4-WDG_TIMERS-General_Purpose_Timer_GPTIM.pdf
>> .
>> 
>>      There are several YouTube videos also. Use search terms: STM32, 
>> Timers, Quadrature Encoders, Encoder Mode.
>> 
>>       STM32CUBEMX should be a good tool to help set up the timers. 
>> David Ackley and David Anderson most likely have example code that 
>> they may share.
>> 
>> Regards,
>> Doug P.
>> 
>> On Mon, Oct 25, 2021 at 9:59 AM Thalanayar Muthukumar via DPRGlist 
>> <dprglist at lists.dprg.org <mailto:dprglist at lists.dprg.org>> wrote:
>> 
>>    Have any of you worked with CircuitPython (not Micropython) on any
>>    of the STM32 boards? The reason I ask is that for some of the
>>    peripheral boards like motor controllers from Adafruit, they are
>>    supported on CircuitPython and not MicroPython.
>> 
>>    These (https://circuitpython.org/downloads?q=STM32) are the STM32
>>    boards supported by CircuitPython and here
>>    (https://circuitpython.readthedocs.io/en/7.0.x/ports/stm/README.html)
>>    is guidance on how to support new STM32 boards in CircuitPython.
>> 
>>    During the weekend, I was able to get my NUCLEO board working with
>>    I2C OLED and BNO055, thanks to help from Jim Merkle and other
>>    folks from the Personal Robotics and Adafruit Discord Servers.
>> 
>>    Regards.
>>    - Kumar
>>    _______________________________________________
>>    DPRGlist mailing list
>>    DPRGlist at lists.dprg.org <mailto:DPRGlist at lists.dprg.org>
>>    http://lists.dprg.org/listinfo.cgi/dprglist-dprg.org
>>    <http://lists.dprg.org/listinfo.cgi/dprglist-dprg.org>
>> 
>> 
>> _______________________________________________
>> DPRGlist mailing list
>> DPRGlist at lists.dprg.org
>> http://lists.dprg.org/listinfo.cgi/dprglist-dprg.org
> 
> ------------------------------
> 
> Message: 8
> Date: Wed, 27 Oct 2021 11:34:55 -0500
> From: Iron Reign <ironreignrobotics at gmail.com>
> To: Carl Ott <carl.ott.jr at gmail.com>
> Cc: DPRG <dprglist at lists.dprg.org>, Chris N <netterchris at gmail.com>
> Subject: Re: [Dprglist] Robotics "Capability Maturity Model" -
>    Thoughts?
> Message-ID:
>    <CAAZmubrAZ_xtgWXJcChbMYv5tgq2x4LLVgCq9mrCdUzVxGt5eg at mail.gmail.com>
> Content-Type: text/plain; charset="utf-8"
> 
> Chris,
> 
> I think you've described a rational and typical path most of us have taken.
> You've kept the language simple and clean. Easy for beginners to follow.
> Nice job of creating a framework for thinking about these things.
> 
> Maybe think about rephrasing "API" in levels 0 and 1.
> 
> Might want to point out that not everyone passes through these steps in
> this order.  For example, odometry is typical but not necessary in all
> circumstances.
> 
> And some might describe their path as more tree-like than a linear
> progression. For example, how would you insert certain other capabilities:
> 
> 
>   - I have access to a live visualization of my robot's sensor readings
>   and internal state.
>   - My robot can transform locations between different relative and
>   absolute(ish) coordinate systems.
>   - My robot can identify specific objects as targets/beacons to assist
>   with navigation.
>   - My robot can grasp, manipulate or transport certain objects from point
>   A to point B.
>   - My robot has an internal kinematic model of itself.
>   - My robot can charge itself because it fears death. :)
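> 
> For instance, the coordinate-transform capability above can come down to a
> small helper like this C sketch (names and frame conventions assumed):
> 
>   #include <math.h>
> 
>   /* Transform a point (px, py) expressed in the robot's local frame into
>      world coordinates, given the robot's world pose (x, y, theta). */
>   void robot_to_world(double x, double y, double theta,
>                       double px, double py,
>                       double *wx, double *wy)
>   {
>       *wx = x + px * cos(theta) - py * sin(theta);
>       *wy = y + px * sin(theta) + py * cos(theta);
>   }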
> 
> 
> Granted, you were probably setting out to describe a straightforward model
> for getting started in robotics and did not intend to encompass extended
> capabilities.
> 
> I just read Carl's response. There might be room for both a robot CMM and a
> roboticists CMM. "These go to 11"
> https://www.youtube.com/watch?v=KOO5S4vxi0o&ab_channel=pmw8000
> 
> On Wed, Oct 27, 2021 at 11:04 AM Carl Ott via DPRGlist <
> dprglist at lists.dprg.org> wrote:
> 
>> Chris,
>> 
>> What a really great idea!  Borrow from the CMM - I love it.
>> https://en.wikipedia.org/wiki/Capability_Maturity_Model
>> 
>> Will take a little time to respond with better suggestions -
>> but as quick feedback -
>> That maturity path leans heavily on odometry to measure progress - and
>> presumes a certain set of objectives.
>> But I can show that my robot can reach a destination and return 'more or
>> less' to start completely without odometry- using only timed motor commands
>> - or a little more accurately by using motor velocity control loops.
>> I think the CMM is/ should be more about range of skills, and ability to
>> produce consistent results and data driven process and performance
>> improvement...
>> 
>> So I'm going to chew on this, and come back with something a little more
>> abstracted, along the lines of
>> 
>> Level 0
>> no technical background
>> never wrote a program
>> never built mechatronics
>> 
>> Level 1
>> scattered technical skills or some experience
>> * changing an existing program or writing a program from scratch
>> * building a kit with software / mechanics and electronics
>> 
>> Levels 2..10 <needs more thought> ...
>> 
>> 
>> Level 11
>> claims personal responsibility for developing best-in-class robots and
>> robot technologies from scratch at Boston Dynamics / Tesla / CMU Robotics /
>> etc...
>> 
>> why 11 levels?  simple: This Is Spinal Tap
>> 
>> 
>> - Carl
>> 
>> On Wed, Oct 27, 2021 at 10:04 AM Chris N via DPRGlist <
>> dprglist at lists.dprg.org> wrote:
>> 
>>> So in part because of questions along the line of "where do I get
>>> started" from Kumar, and others before him, that we occasionally get, I
>>> thought it might help to have some sort of capability model in mind that
>>> can be referenced.
>>> 
>>> That way one can ask "OK - what level are you at currently?  What level
>>> are you trying to reach in the near term?"
>>> 
>>> Below is what I mean.  For now this is focused on software and locomotion
>>> (but starting with Level 8,  perception comes into the picture)
>>> 
>>> I have more explanation to go along with each item but wanted to keep it
>>> brief in this e-mail.
>>> 
>>> The idea is not to explain how to do these things.  This just represents
>>> milestones along the journey.  This is really more about having an agreed
>>> upon vocabulary.
>>> 
>>> Thoughts?   Is something like this helpful to have written down?  Is it
>>> already written down somewhere ? (I am sure that in some ways, this is
>>> captured in some of the material that David Anderson has published over the
>>> years)
>>> 
>>> Level 0: I have an API through which I can control the speed and
>>> direction of the individual wheels. My robot can move!
>>> 
>>> Level 1: I have an API through which I can reliably get the incremental
>>> encoder counts for each wheel.  When motors are commanded with a certain
>>> duty cycle, I can measure what that translates to in terms of encoder
>>> counts per time unit.
>>> 
>>> Level 2: I am keeping track of my Robot's X, Y and Theta via
>>> dead-reckoning / odometry
>>> 
>>> Level 3: I have taken at least basic steps to calibrate my robot's
>>> odometry calculations.
>>> 
>>> Level 4: I can command the robot to move, but using more abstract units
>>> such as "meters/second" or at least "encoder ticks per time unit"
>>> 
>>> Level 5: My robot actually moves at the linear and angular velocity I
>>> tell it to, even when battery voltage is lower or surface friction is
>>> higher.  And because I have completed Level 3, it can go in a somewhat
>>> straight line when I ask it to.
>>> 
>>> Level 6: My robot moves in a fairly smooth fashion, i.e. it changes speed
>>> and direction somewhat gradually. Movement is not "robotic"
>>> 
>>> Level 7:  I can command my robot to go to a certain X,Y coordinate aka
>>> waypoint relative to its current location,  and the robot more or less
>>> reaches that location provided nothing is in the way
>>> 
>>> Level 8:  My robot can reach the target location, even if there are
>>> obstacles in the way.
>>> 
>>> Level 9:  My robot can come back after reaching its destination.
>>> 
>>> Level 10: My robot can do all this, even if the obstacles are moving
>>> around quite a bit or if there are other sources of possible confusion
>>> 
>>> 
>>> 
>>> 
>>> _______________________________________________
>>> DPRGlist mailing list
>>> DPRGlist at lists.dprg.org
>>> http://lists.dprg.org/listinfo.cgi/dprglist-dprg.org
>>> 
>> _______________________________________________
>> DPRGlist mailing list
>> DPRGlist at lists.dprg.org
>> http://lists.dprg.org/listinfo.cgi/dprglist-dprg.org
>> 
> 
> ------------------------------
> 
> Message: 9
> Date: Wed, 27 Oct 2021 12:38:40 -0500
> From: Karim Virani <pondersome64 at gmail.com>
> To: DPRG <dprglist at lists.dprg.org>
> Subject: Re: [Dprglist] seriously off topic rant about tonight's
>    discussion
> Message-ID:
>    <CAKtnkiyUbO+5m9cW-X3Yw8EdQk2s4n+L5WAt6Ej9h5TqNBDouw at mail.gmail.com>
> Content-Type: text/plain; charset="utf-8"
> 
> pps. I don't expect David to respond to this - David was referring to the
> papers that Donald Hoffman has produced. I looked at his CV last night and
> he has plenty of the publish-or-perish normal scientific investigations
> into human perception systems.  Somebody else posted the TED interview into
> the chat from a google search. That interview is related to what I'd call
> an alternative set of publications centered around his panpsychic
> philosophy which seem to be a distinct thread in his output, but which I
> would resist calling science. Anyhow, I found that particular interview to
> be outlandish enough to warrant some hopefully entertaining intellectual
> fisticuffs.
> 
>> On Wed, Oct 27, 2021 at 3:15 AM Karim Virani <pondersome64 at gmail.com> wrote:
>> 
>> First, the Nature special on bees was just fantastic. I went ahead and
>> watched it after the conversation tonight.
>> https://video.kera.org/video/my-garden-of-a-thousand-bees-trjhzt/
>> 
>> And then ... there's the Donald Hoffman TED interview ...
>> 
>> OMG David!!!
>> 
>> You were totally fun'in us. You meant to provoke! DH is just a Deepak
>> Chopra wannabe. I resist giving credence to these peddlers of soft-shoe
>> quantum theory tincture in pursuit of monetizable wishful thinking.
>> 
>> Granted this was only one interview on a platform that often caters to the
>> intellectual mystics among us (I used to be a fan of TED talks), but this
>> dude outed himself completely.
>> 
>> First he completely mis-characterizes the field of modern cognitive
>> science (if that's what he considers to be his colleagues) and paints it in
>> the light of 70's era progress. As if he was the first to consider fitness
>> as the basis for how evolutionary development works. Almost nobody thinks
>> sensory evolution is driven to create accurate or truthful interpretations
>> of reality. He can't claim that as his unique insight. It's like he's
>> saying his peers all have a 5th grade understanding of evolution.
>> 
>> But then he goes totally bonkers:
>> 
>> 1. Consciousness is hard to describe and investigate - ok so far
>> 2. So let's throw traditional "reality" out the window and assume the
>> universe is fundamentally made up of a network of multi-level conscious
>> entities
>> 3. For those entities bundled up as humans, the network has decided to
>> give them an "interface" that creates time, space, particles, neurons, etc.
>> as a useful fiction. (ie. the software is real and the hardware is the
>> story)
>> 4. Oh, and I have some math, so it's not really BS
>> 5. Oh, and I may or may not believe this, but I'm brave for going out on a
>> limb and daring to shake up the field because hard problems need
>> disruptions to solve. (this is my get out of jail free card, maybe)
>> 
>> I agree with step 1, but step 2, that's a doozy. The rest is a sophomoric
>> attempt to confound interesting modern explorations into the foundations of
>> physics with 70s era pop quantum psychology like in the Dancing Wu Li
>> Masters or the Tao of Physics. My bet, he'd point to those books as his
>> influences. They were fun reads when I was a pup. But they are truly works
>> of fiction. So is this dude.
>> 
>> Now I'm going to tell you how I really feel...
>> ... Oh, and I'm actually very open minded about this guy...
>> 
>> David, thanks for riling my sensibilities - haven't had a good rant
>> defending honest inquiry in ages.
>> 
>> Again, this is effectively a slam of a single interview but if you wish to
>> point to a place where he offers a shred of evidence toward his theory, or
>> can correct my interpretation of this interview, well please share. We can
>> continue the fun :)
>> 
>> Cheers,
>> 
>> Karim
>> ps. dear reader, this acerbic review is a choice in creative expression.
>> If it offends your viewpoint or sensibilities, I'm sorry, it was not meant
>> to do so
>> 
> 
> ------------------------------
> 
> Message: 10
> Date: Wed, 27 Oct 2021 17:42:46 +0000
> From: Thalanayar Muthukumar <tnkumar at gmail.com>
> To: Chris N <netterchris at gmail.com>, DPRG <dprglist at lists.dprg.org>
> Subject: Re: [Dprglist] NUCLEO STM32 L476RG - need suggestion of
>    development environment to use
> Message-ID:
>    <CAAwP+LZ54kUd9z3SEtXdjXH76jhEpvM=PejMwnc6euHqxBin1Q at mail.gmail.com>
> Content-Type: text/plain; charset="utf-8"
> 
>> I just attempted VSCode + PlatformIO + Arduino Framework with my NUCLEO
>> STM32L476RG and got this error.
>> I guess there is some additional configuration I need to do to communicate
>> with my board.
>> 
>> Regards.
>> - Kumar
>> 
>> [image: image.png]
>> 
>> 
>>> On Sat, Oct 23, 2021 at 2:22 PM Chris N <netterchris at gmail.com> wrote:
>>> 
>>> 
>>> What did you use for "Blinky" ?  Was that pre-programmed?
>>> 
>>> You mentioned both micropython and STM32CubeIDE - I think you want to
>>> first decide what language you want to use - Python or C/C++.   Some IDEs
>>> are good at both (VS Code), but some IDEs are better than others at working
>>> with micropython or circuitpython boards.
>>> 
>>> I would highly recommend VS Code + Platform IO + Arduino Framework +
>>> FreeRTOS.
>>> 1) Install Visual Studio Code
>>> 2) Install the Platform IO plug-in + any other plugins you find useful
>>> 3) Create a project via the Platform IO GUI and select your board +
>>> Arduino framework
>>> 4) Add libraries (such as FreeRTOS if you want an RTOS, or for OLED
>>> display, etc. etc)
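>>> 
>>> As a rough sketch of what the resulting project configuration could look
>>> like (the board name and library entries below are examples, not
>>> requirements), platformio.ini might contain:
>>> 
>>>   ; Hypothetical PlatformIO project configuration for a NUCLEO-L476RG
>>>   [env:nucleo_l476rg]
>>>   platform  = ststm32
>>>   board     = nucleo_l476rg
>>>   framework = arduino
>>>   lib_deps  =
>>>       stm32duino/STM32duino FreeRTOS
>>>       adafruit/Adafruit SSD1306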
>>> 
>>> It's a decent trade-off between ease of use and giving you access to the
>>> features that the STM32 MCU has to offer.  Although I really like the
>>> STM32CubeIDE (see below), for me, the most compelling reason to use
>>> PlatformIO+Arduino is the fact that I can pull in just about any library
>>> that has ever been created for Arduino-compatible boards.  Plus, the VS
>>> Code editor is much better than Eclipse.
>>> 
>>> If you need more fine-grained control over how you use your STM32 MCU,
>>> then you would want to use the STM32CubeIDE.   It is slightly better at
>>> debugging and the APIs that are provided as part of this give you more
>>> control over how all the various peripherals are used.   There is a wizard
>>> sort of thing that helps you get the peripherals initialized, so you don't
>>> need to know all the ins and outs at that level, but taking advantage of
>>> the APIs that are provided is certainly harder than working inside the
>>> Arduino framework.
>>> 
>>> Chris.
>>> 
>>> On Sat, Oct 23, 2021 at 12:44 AM Thalanayar Muthukumar via DPRGlist <
>>> dprglist at lists.dprg.org> wrote:
>>> 
>>>> https://www.youtube.com/watch?v=VNPOWBemGqU
>>>> 
>>>> Got my NUCLEO L476RG today and got my first Blinky working on it.
>>>> I thought of starting with STM32CubeIDE and micropython, but could not
>>>> figure out how to use these environments.
>>>> Then, I came across mbed.org and was able to get the cpp program for
>>>> blinky working.
>>>> 
>>>> What do people use as their development environment for the STM32 boards?
>>>> Need suggestions on what is best to use to get the best experience with
>>>> the STM32.
>>>> 
>>>> Every new chip has its own development environment and startup challenges
>>>> that one needs to go through ...
>>>> 
>>>> Regards.
>>>> - Kumar
>>>> 
>>>> 
> -------------- next part --------------
> A non-text attachment was scrubbed...
> Name: image.png
> Type: image/png
> Size: 56185 bytes
> Desc: not available
> URL: <http://lists.dprg.org/pipermail/dprglist-dprg.org/attachments/20211027/b61c9bd6/attachment-0001.png>
> 
> ------------------------------
> 
> Message: 11
> Date: Wed, 27 Oct 2021 13:13:26 -0500
> From: Jim Merkle <jim at merkles.com>
> To: Chris N <netterchris at gmail.com>
> Cc: DPRG <dprglist at lists.dprg.org>
> Subject: Re: [Dprglist] Robotics "Capability Maturity Model" -
>    Thoughts?
> Message-ID: <6d5b4a2ea88e0ae7120d2f414a6ec30e at merkles.com>
> Content-Type: text/plain; charset="us-ascii"; Format="flowed"
> 
> If any of you subscribe to Reddit - Arduino group, you'll see generic 
> questions all the time, many of which are from "tire kickers", folks 
> just thinking about purchasing something but haven't committed to 
> anything yet.  I don't bother getting involved unless the user indicates 
> they have actual hardware they are working with.
> 
> Since Kumar had already purchased an STM32 NUCLEO board, (I love the 
> STM32 NUCLEO boards in general), I had to get involved.
> 
> "Which development environment" is the 2nd most asked question when it 
> comes to STM32 boards.  The most asked is "which STM32 board?".  
> Unfortunately, many folks begin STM32 exploration with a "Blue Pill".  I 
> HATE that platform!!!  Although it's cheap, those boards often have clone 
> chips and require plenty of additional hardware that is already provided 
> on a NUCLEO board.
> 
> Much of the capability model involves the individual and their talent 
> set...
> 
> What DPRG provides is mentorship...  Folks that are willing and able to 
> help others get from one _Level_ to the next.  Thanks GUYS !
> 
> The "Build More Robots" is an excellent tool / program to help bring 
> people (and their robots) from "Level 0.0.0" to "Level 5.0" (or so).
> 
> https://www.dprg.org/build-more-robots-series/
> 
> I truly appreciate DPRG !
> 
> ---
> Jim Merkle
> Carrollton, TX 75007
> jim at merkles.com
> 
>> On 2021-10-27 10:03, Chris N via DPRGlist wrote:
>> 
>> So in part because of questions along the line of "where do I get 
>> started" from Kumar, and others before him, that we occasionally get, I 
>> thought it might help to have some sort of capability model in mind 
>> that can be referenced.
>> 
>> That way one can ask "OK - what level are you at currently?  What level 
>> are you trying to reach in the near term?"
>> 
>> Below is what I mean.  For now this is focused on software and 
>> locomotion (but starting with Level 8,  perception comes into the 
>> picture)
>> 
>> I have more explanation to go along with each item but wanted to keep 
>> it brief in this e-mail.
>> 
>> The idea is not to explain how to do these things.  This just 
>> represents milestones along the journey.  This is really more about 
>> having an agreed upon vocabulary.
>> 
>> Thoughts?   Is something like this helpful to have written down?  Is it 
>> already written down somewhere ? (I am sure that in some ways, this is 
>> captured in some of the material that David Anderson has published over 
>> the years)
>> 
>> Level 0: I have an API through which I can control the speed and 
>> direction of the individual wheels. My robot can move!
>> 
>> Level 1: I have an API through which I can reliably get the incremental 
>> encoder counts for each wheel.  When motors are commanded with a 
>> certain duty cycle, I can measure what that translates to in terms of 
>> encoder counts per time unit.
>> 
>> Level 2: I am keeping track of my Robot's X, Y and Theta via 
>> dead-reckoning / odometry
>> 
>> Level 3: I have taken at least basic steps to calibrate my robot's 
>> odometry calculations.
>> 
>> Level 4: I can command the robot to move, but using more abstract units 
>> such as "meters/second" or at least "encoder ticks per time unit"
>> 
>> Level 5: My robot actually moves at the linear and angular velocity I 
>> tell it to, even when battery voltage is lower or surface friction is 
>> higher.  And because I have completed Level 3, it can go in a somewhat 
>> straight line when I ask it to.
>> 
>> Level 6: My robot moves in a fairly smooth fashion, i.e. it changes 
>> speed and direction somewhat gradually. Movement is not "robotic"
>> 
>> Level 7:  I can command my robot to go to a certain X,Y coordinate aka 
>> waypoint relative to its current location,  and the robot more or less 
>> reaches that location provided nothing is in the way
>> 
>> Level 8:  My robot can reach the target location, even if there are 
>> obstacles in the way.
>> 
>> Level 9:  My robot can come back after reaching its destination.
>> 
>> Level 10: My robot can do all this, even if the obstacles are moving 
>> around quite a bit or if there are other sources of possible confusion
>> 
>> _______________________________________________
>> DPRGlist mailing list
>> DPRGlist at lists.dprg.org
>> http://lists.dprg.org/listinfo.cgi/dprglist-dprg.org
> 
> ------------------------------
> 
> Message: 12
> Date: Wed, 27 Oct 2021 13:28:36 -0500
> From: Carl Ott <carl.ott.jr at gmail.com>
> To: Karim Virani <pondersome64 at gmail.com>
> Cc: DPRG <dprglist at lists.dprg.org>
> Subject: Re: [Dprglist] seriously off topic rant about tonight's
>    discussion
> Message-ID:
>    <CA+XqQZR0e111N2Mk8g93Q+FDTbHK4hMOAAMyR3=WpbL9rbmZsQ at mail.gmail.com>
> Content-Type: text/plain; charset="utf-8"
> 
> OK - I confess to fueling (hopefully entertaining) intellectual fisticuffs
> - Sorry - I was aiming more to include representative fodder for the chat
> records versus a fringe representation...
> 
> Now I'm even more intrigued than before to understand what this brouhaha
> is all about ;-)
> 
> 
> 
> On Wed, Oct 27, 2021 at 12:38 PM Karim Virani via DPRGlist <
> dprglist at lists.dprg.org> wrote:
> 
>> pps. I don't expect David to respond to this - David was referring to the
>> papers that Donald Hoffman has produced. I looked at his CV last night and
>> he has plenty of the publish-or-perish normal scientific investigations
>> into human perception systems.  Somebody else posted the TED interview into
>> the chat from a google search. That interview is related to what I'd call
>> an alternative set of publications centered around his panpsychic
>> philosophy which seem to be a distinct thread in his output, but which I
>> would resist calling science. Anyhow, I found that particular interview to
>> be outlandish enough to warrant some hopefully entertaining intellectual
>> fisticuffs.
>> 
>> On Wed, Oct 27, 2021 at 3:15 AM Karim Virani <pondersome64 at gmail.com>
>> wrote:
>> 
>>> First, the Nature special on bees was just fantastic. I went ahead and
>>> watched it after the conversation tonight.
>>> https://video.kera.org/video/my-garden-of-a-thousand-bees-trjhzt/
>>> 
>>> And then ... there's the Donald Hoffman TED interview ...
>>> 
>>> OMG David!!!
>>> 
>>> You were totally fun'in us. You meant to provoke! DH is just a Deepak
>>> Chopra wannabe. I resist giving credence to these peddlers of soft-shoe
>>> quantum theory tincture in pursuit of monetizable wishful thinking.
>>> 
>>> Granted this was only one interview on a platform that often caters to
>>> the intellectual mystics among us (I used to be a fan of TED talks), but
>>> this dude outed himself completely.
>>> 
>>> First he completely mis-characterizes the field of modern cognitive
>>> science (if that's what he considers to be his colleagues) and paints it in
>>> the light of 70's era progress. As if he was the first to consider fitness
>>> as the basis for how evolutionary development works. Almost nobody thinks
>>> sensory evolution is driven to create accurate or truthful interpretations
>>> of reality. He can't claim that as his unique insight. It's like he's
>>> saying his peers all have a 5th grade understanding of evolution.
>>> 
>>> But then he goes totally bonkers:
>>> 
>>> 1. Consciousness is hard to describe and investigate - ok so far
>>> 2. So let's throw traditional "reality" out the window and assume the
>>> universe is fundamentally made up of a network of multi-level conscious
>>> entities
>>> 3. For those entities bundled up as humans, the network has decided to
>>> give them an "interface" that creates time, space, particles, neurons, etc.
>>> as a useful fiction. (ie. the software is real and the hardware is the
>>> story)
>>> 4. Oh, and I have some math, so it's not really BS
>>> 5. Oh, and I may or may not believe this, but I'm brave for going out on
>>> a limb and daring to shake up the field because hard problems need
>>> disruptions to solve. (this is my get out of jail free card, maybe)
>>> 
>>> I agree with step 1, but step 2, that's a doozy. The rest is a sophomoric
>>> attempt to confound interesting modern explorations into the foundations of
>>> physics with 70s era pop quantum psychology like in the Dancing Wu Li
>>> Masters or the Tao of Physics. My bet, he'd point to those books as his
>>> influences. They were fun reads when I was a pup. But they are truly works
>>> of fiction. So is this dude.
>>> 
>>> Now I'm going to tell you how I really feel...
>>> ... Oh, and I'm actually very open minded about this guy...
>>> 
>>> David, thanks for riling my sensibilities - haven't had a good rant
>>> defending honest inquiry in ages.
>>> 
>>> Again, this is effectively a slam of a single interview but if you wish
>>> to point to a place where he offers a shred of evidence toward his theory,
>>> or can correct my interpretation of this interview, well please share. We
>>> can continue the fun :)
>>> 
>>> Cheers,
>>> 
>>> Karim
>>> ps. dear reader, this acerbic review is a choice in creative expression.
>>> If it offends your viewpoint or sensibilities, I'm sorry, it was not meant
>>> to do so
>>> 
>> _______________________________________________
>> DPRGlist mailing list
>> DPRGlist at lists.dprg.org
>> http://lists.dprg.org/listinfo.cgi/dprglist-dprg.org
>> 
> 
> ------------------------------
> 
> Message: 13
> Date: Thu, 28 Oct 2021 07:51:26 +1300
> From: Murray Altheim <murray18 at altheim.com>
> To: dprglist at lists.dprg.org
> Subject: Re: [Dprglist] Robotics "Capability Maturity Model" -
>    Thoughts?
> Message-ID: <1ba75c2c-4708-9796-097d-42258fecf449 at altheim.com>
> Content-Type: text/plain; charset=utf-8; format=flowed
> 
>> On 28/10/21 4:03 am, Chris N via DPRGlist wrote:
>> So in part because of questions along the line of "where do I get
>> started" from Kumar, and others before him, that we occasionally 
>> get, I thought it might help to have some sort of capability model
>> in mind that can be referenced.
> [...]
> 
> Hi Chris,
> 
> We get that question a fair bit on the Personal Robotics server, and
> I'd tried even earlier (when I was still actively planning on starting
> a club) to formulate some ideas to help people get started.
> 
> Here are some of my rather still-confused and disorganised thoughts on
> getting started, nowhere near as clear as what you've written out. It
> was written as an attempt to help clarify one's goals in building
> robots (maybe merely my own):
> 
>    Robot Related Goals (NZPRG)
>    https://service.robots.org.nz/wiki/Wiki.jsp?page=RobotRelatedGoals
> 
> I also captured an email from David A. on what he called a "Build
> Sequence" at:
> 
>    Build Sequence (NZPRG)
>    https://service.robots.org.nz/wiki/Wiki.jsp?page=BuildSequence
> 
> which is somewhat akin to what you've written.
> 
> I also thought about the idea of robot weight categories. As we all
> know, as you get to a certain size there's a corresponding weight,
> motor torque, amperage, battery size, etc. that goes somewhat with
> that. This was a mild attempt at categorising that:
> 
>   Micro       insect-sized
>   Mini        under 1.2kg, hamster or guinea pig-sized?
>   Small       maybe chicken-sized?
>   Medium      maybe cat or dog-sized?
>   Large       sheep-sized
>   Very Large  up to person-sized
>   Massive     bigger than a person (pronounced 'massif' as if you're French)
>   Humongous   bigger than an elephant but smaller than a planet
> 
>   Robot Weight Classes (NZPRG)
>   https://service.robots.org.nz/wiki/Wiki.jsp?page=RobotWeightClasses
> 
> This was only meant to help people think about their robot project, as
> we in the Personal Robotics server have on numerous occasions had people come in
> and want to build human-sized robots, and we'd then explain roughly how
> difficult and expensive that would be.
> 
> If you don't mind I'd like to capture your levels (with credit of course)
> as another good starting point for what I'd not call "beginners" but those
> starting into a robotics project. Everyone brings some skills and has a
> variety of goals, so the question is matching the goals and skills to an
> appropriate platform.
> 
> At some point I'd probably have a "Getting Started" page on the wiki that
> linked to all these different aspects of decision making for those
> considering a new robot project. And of course, one's goals change as we
> dig in deeper.
> 
> Cheers,
> 
> Murray
> 
> ...........................................................................
> Murray Altheim <murray18 at altheim dot com>                       = =  ===
> http://www.altheim.com/murray/                                     ===  ===
>                                                                    = =  ===
>     In the evening
>     The rice leaves in the garden
>     Rustle in the autumn wind
>     That blows through my reed hut.
>            -- Minamoto no Tsunenobu
> 
> 
> 
> ------------------------------
> 
> Message: 14
> Date: Wed, 27 Oct 2021 13:59:22 -0500
> From: "David P. Anderson" <davida at smu.edu>
> To: <dprglist at lists.dprg.org>
> Subject: Re: [Dprglist] Robotics "Capability Maturity Model" -
>    Thoughts?
> Message-ID: <f8d9cc7e-8dee-7fe1-966a-81f09f4ac0db at smu.edu>
> Content-Type: text/plain; charset="utf-8"; Format="flowed"
> 
> Hi Chris,
> 
> This is certainly a valuable exercise as these are questions that get 
> addressed all the time.  The progressive approach you outline here is as 
> good as any.  I might tend to side with Carl that odometry and even PID 
> controllers are later-stage achievements. Lots can be accomplished and 
> learned without either, and with just open loop control.  If nothing 
> else, that will teach you why you need those things, and in what 
> situations.
> 
> At least that's the way I came up.
> 
> cheers!
> 
> David
> 
> 
> 
>> On 10/27/21 10:03 AM, Chris N via DPRGlist wrote:
>> 
>> So in part because of questions along the line of "where do I get 
>> started" from Kumar, and others before him, that we occasionally get, 
>> I thought it might help to have some sort of capability model in mind 
>> that can be referenced.
>> 
>> That way one can ask "OK - what level are you at currently?  What 
>> level are you trying to reach in the near term?"
>> 
>> Below is what I mean.  For now this is focused on software and 
>> locomotion (but starting with Level 8,  perception comes into the picture)
>> 
>> I have more explanation to go along with each item but wanted to keep 
>> it brief in this e-mail.
>> 
>> The idea is not to explain how to do these things.  This just 
>> represents milestones along the journey.  This is really more about 
>> having an agreed upon vocabulary.
>> 
>> Thoughts?   Is something like this helpful to have written down?  Is 
>> it already written down somewhere ? (I am sure that in some ways, this 
>> is captured in some of the material that David Anderson has published 
>> over the years)
>> 
>> Level 0: I have an API through which I can control the speed and 
>> direction of the individual wheels. My robot can move!
>> 
>> Level 1: I have an API through which I can reliably get the 
>> incremental encoder counts for each wheel.  When motors are commanded 
>> with a certain duty cycle, I can measure what that translates to in 
>> terms of encoder counts per time unit.
>> 
>> Level 2: I am keeping track of my robot's X, Y and Theta via 
>> dead-reckoning / odometry (a rough sketch of this update follows the list).
>> 
>> Level 3: I have taken at least basic steps to calibrate my robot's 
>> odometry calculations.
>> 
>> Level 4: I can command the robot to move, but using more abstract 
>> units such as "meters/second" or at least "encoder ticks per time unit"
>> 
>> Level 5: My robot actually moves at the linear and angular velocity I 
>> tell it to, even when battery voltage is lower or surface friction is 
>> higher (see the wheel-speed sketch after the list). And because I have 
>> completed Level 3, it can go in a somewhat straight line when I ask it to.
>> 
>> Level 6: My robot moves in a fairly smooth fashion, i.e. it changes 
>> speed and direction somewhat gradually. Movement is not "robotic".
>> 
>> Level 7: I can command my robot to go to a certain X,Y coordinate (aka 
>> waypoint) relative to its current location, and the robot more or less 
>> reaches that location provided nothing is in the way (a go-to-goal 
>> sketch also follows the list).
>> 
>> Level 8: My robot can reach the target location, even if there are 
>> obstacles in the way.
>> 
>> Level 9: My robot can come back after reaching its destination.
>> 
>> Level 10: My robot can do all this, even if the obstacles are moving 
>> around quite a bit or if there are other sources of possible confusion.
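>> 
>> A minimal sketch of the Level 2 bookkeeping for a differential-drive 
>> robot (the tick and wheel-base constants are placeholders you would 
>> calibrate for your own platform, per Level 3):
>> 
>>     import math
>> 
>>     TICKS_PER_METER = 1000.0   # placeholder: encoder ticks per meter of travel
>>     WHEEL_BASE_M = 0.20        # placeholder: distance between the wheels, meters
>> 
>>     x, y, theta = 0.0, 0.0, 0.0   # robot pose, updated every control cycle
>> 
>>     def update_odometry(left_ticks, right_ticks):
>>         """Dead-reckoning update from incremental encoder counts."""
>>         global x, y, theta
>>         d_left = left_ticks / TICKS_PER_METER
>>         d_right = right_ticks / TICKS_PER_METER
>>         d_center = (d_left + d_right) / 2.0
>>         d_theta = (d_right - d_left) / WHEEL_BASE_M
>>         x += d_center * math.cos(theta + d_theta / 2.0)
>>         y += d_center * math.sin(theta + d_theta / 2.0)
>>         theta = (theta + d_theta) % (2.0 * math.pi)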
>> 
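>> One common way to get the Level 5 behavior is a per-wheel PI loop that 
>> holds a commanded speed even as battery voltage or load changes. A rough 
>> sketch (the gains are made-up starting points, not tuned values):
>> 
>>     class WheelSpeedController:
>>         """Minimal PI loop from measured to commanded ticks/second."""
>>         def __init__(self, kp=0.5, ki=0.1):
>>             self.kp, self.ki = kp, ki
>>             self.integral = 0.0
>> 
>>         def update(self, target_ticks_per_s, measured_ticks_per_s, dt):
>>             error = target_ticks_per_s - measured_ticks_per_s
>>             self.integral += error * dt
>>             duty = self.kp * error + self.ki * self.integral
>>             return max(-1.0, min(1.0, duty))   # clamp to a valid duty cycle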
>> 
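>> And a go-to-goal step in the spirit of Level 7, assuming the pose from 
>> the odometry sketch above (the arrival threshold and gains are placeholders):
>> 
>>     import math
>> 
>>     def go_to_goal_step(x, y, theta, goal_x, goal_y):
>>         """Return (linear, angular) velocity commands toward a waypoint."""
>>         dx, dy = goal_x - x, goal_y - y
>>         distance = math.hypot(dx, dy)
>>         if distance < 0.05:               # within 5 cm: call it reached
>>             return 0.0, 0.0
>>         heading_error = math.atan2(dy, dx) - theta
>>         heading_error = math.atan2(math.sin(heading_error),
>>                                    math.cos(heading_error))   # wrap to [-pi, pi]
>>         return 0.3 * distance, 1.5 * heading_error   # placeholder gains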
>> 
>> 
>> 
>> _______________________________________________
>> DPRGlist mailing list
>> DPRGlist at lists.dprg.org
>> http://lists.dprg.org/listinfo.cgi/dprglist-dprg.org
> -------------- next part --------------
> An HTML attachment was scrubbed...
> URL: <http://lists.dprg.org/pipermail/dprglist-dprg.org/attachments/20211027/76d7c2c0/attachment-0001.html>
> 
> ------------------------------
> 
> Message: 15
> Date: Wed, 27 Oct 2021 14:22:37 -0500
> From: "David P. Anderson" <davida at smu.edu>
> To: <dprglist at lists.dprg.org>
> Subject: Re: [Dprglist] seriously off topic rant about tonight's
>    discussion
> Message-ID: <0ed946c9-3eaf-520a-5181-59d0b5476aa6 at smu.edu>
> Content-Type: text/plain; charset="utf-8"; Format="flowed"
> 
> Howdy,
> 
> I don't disagree. Once the topic moves from the roboty implications of 
> a perhaps new neural understanding of how human vision works to 
> conjectures about the nature of "consciousness" itself, we are all adrift!
> 
> cheers
> 
> David
> 
> 
>> On 10/27/21 1:28 PM, Carl Ott via DPRGlist wrote:
>> 
>> */[EXTERNAL SENDER]/*
>> 
>> 
>> OK - I confess to fueling (hopefully entertaining) intellectual 
>> fisticuffs - Sorry - I was aiming more to include representative 
>> fodder for the chat records versus a fringe representation...
>> 
>> Now I'm even more intrigued than before to understand what this 
>> brouhaha is all about ;-)
>> 
>> 
>> 
>> On Wed, Oct 27, 2021 at 12:38 PM Karim Virani via DPRGlist 
>> <dprglist at lists.dprg.org <mailto:dprglist at lists.dprg.org>> wrote:
>> 
>>    pps. I don't expect David to respond to this - David was referring
>>    to the papers that Donald Hoffman has produced. I looked at his CV
>>    last night and he has plenty of the publish-or-perish normal
>>    scientific investigations into human perception systems. Somebody
>>    else posted the TED interview into the chat from a google search.
>>    That interview is related to what I'd call an alternative set of
>>    publications centered around his panpsychic philosophy, which seem
>>    to be a distinct thread in his output, but which I would resist
>>    calling science. Anyhow, I found that particular interview to be
>>    outlandish enough to warrant some hopefully entertaining
>>    intellectual fisticuffs.
>> 
>>    On Wed, Oct 27, 2021 at 3:15 AM Karim Virani
>>    <pondersome64 at gmail.com <mailto:pondersome64 at gmail.com>> wrote:
>> 
>>        First, the Nature special on bees was just fantastic. I went
>>        ahead and watched it after the conversation tonight.
>>        https://video.kera.org/video/my-garden-of-a-thousand-bees-trjhzt/
>> 
>>        And then ... there's the Donald Hoffman TED interview ...
>> 
>>        OMG David!!!
>> 
>>        You were totally fun'in us. You meant to provoke! DH is just a
>>        Deepak Chopra wannabe. I resist giving credence to these
>>        peddlers of soft-shoe quantum theory tincture in pursuit of
>>        monetizable wishful thinking.
>> 
>>        Granted this was only one interview on a platform that often
>>        caters to the intellectual mystics among us (I used to be a
>>        fan of TED talks), but this dude outed himself completely.
>> 
>>        First he completely mis-characterizes the field of modern
>>        cognitive science (if that's what he considers to be his
>>        colleagues) and paints it in the light of 70's era progress.
>>        As if he was the first to consider fitness as the basis for
>>        how evolutionary development works. Almost nobody thinks
>>        sensory evolution is driven to create accurate or truthful
>>        interpretations of reality. He can't claim that as his unique
>>        insight. It's like he's saying his peers all have a 5th grade
>>        understanding of evolution.
>> 
>>        But then he goes totally bonkers:
>> 
>>        1. Consciousness is hard to describe and investigate - ok so far
>>        2. So let's throw traditional "reality" out the window and
>>        assume the universe is fundamentally made up of a network of
>>        multi-level conscious entities
>>        3. For those entities bundled up as humans, the network has
>>        decided to give them an "interface" that creates time, space,
>>        particles, neurons, etc. as a useful fiction. (ie. the
>>        software is real and the hardware is the story)
>>        4. Oh, and I have some math, so it's not really BS
>>        5. Oh, and I may or may not believe this, but I'm brave for
>>        going out on a limb and daring to shake up the field because
>>        hard problems need disruptions to solve. (this is my get out
>>        of jail free card, maybe)
>> 
>>        I agree with step 1, but step 2, that's a doozy. The rest is a
>>        sophomoric attempt to confound interesting modern explorations
>>        into the foundations of physics with 70s era pop quantum
>>        psychology like in the Dancing Wu Li Masters or the Tao of
>>        Physics. My bet, he'd point to those books as his influences.
>>        They were fun reads when I was a pup. But they are truly works
>>        of fiction. So is this dude.
>> 
>>        Now I'm going to tell you how I really feel...
>>        ... Oh, and I'm actually very open minded about this guy...
>> 
>>        David, thanks for riling my sensibilities - haven't had a good
>>        rant defending honest inquiry in ages.
>> 
>>        Again, this is effectively a slam of a single interview but if
>>        you wish to point to a place where he offers a shred of
>>        evidence toward his theory, or can correct my interpretation
>>        of this interview, well please share. We can continue the fun :)
>> 
>>        Cheers,
>> 
>>        Karim
>>        ps. dear reader, this acerbic review is a choice in creative
>>        expression. If it offends your viewpoint or sensibilities, I'm
>>        sorry, it was not meant to do so.
>> 
>>    _______________________________________________
>>    DPRGlist mailing list
>>    DPRGlist at lists.dprg.org <mailto:DPRGlist at lists.dprg.org>
>>    http://lists.dprg.org/listinfo.cgi/dprglist-dprg.org
>>    <http://lists.dprg.org/listinfo.cgi/dprglist-dprg.org>
>> 
>> 
>> _______________________________________________
>> DPRGlist mailing list
>> DPRGlist at lists.dprg.org
>> http://lists.dprg.org/listinfo.cgi/dprglist-dprg.org
> -------------- next part --------------
> An HTML attachment was scrubbed...
> URL: <http://lists.dprg.org/pipermail/dprglist-dprg.org/attachments/20211027/f8252d96/attachment-0001.html>
> 
> ------------------------------
> 
> Message: 16
> Date: Wed, 27 Oct 2021 14:27:52 -0500
> From: Doug Paradis <paradug at gmail.com>
> To: "David P. Anderson" <davida at smu.edu>, DPRG
>    <dprglist at lists.dprg.org>
> Subject: [Dprglist] Fwd: [Webinar] Using Time-of-Flight Range Sensing
>    to Make Appliances Better
> Message-ID:
>    <CAOdUW+ZnciXC6M51gWKnjs+fh6JQqqp7dXfQwMxkMv93Dh_wqg at mail.gmail.com>
> Content-Type: text/plain; charset="utf-8"
> 
> David,
>     You expressed interest in using the VL53L5CX and I don't know if you
> are on ST's mailing list so I am sending this to you.
> 
> Regards,
> Doug P.
> 
> 
> ---------- Forwarded message ---------
> From: STMicroelectronics <event at info.st.com>
> Date: Wed, Oct 27, 2021 at 12:41 PM
> Subject: [Webinar] Using Time-of-Flight Range Sensing to Make Appliances
> Better
> To: <paradug at gmail.com>
> 
> 
> 
> *Hi Doug,*
> 
> Please join ST for a 1-hour webinar to learn how multi-zone Time-of-Flight
> (ToF) range sensing enhances functionality in applications like
> thermostats, robot vacuum cleaners and coffee machines.
> 
> You will be introduced to the VL53L5CX, a compact new device representing
> the latest generation of multi-zone Time-of-Flight ranging sensors based on
> ST's patented FlightSense™ technology.
> 
> In this webinar, we will show you how the VL53L5CX's sophisticated distance
> sensing can be used to add innovative new features to a broad spectrum of
> consumer appliances.
> 
> *You will learn:*
> 
>   - The fundamentals of Time-of-Flight sensing technology, including
>   field-of-view, multi-target detection, and distance measurement
>   - The unique features of the VL53L5CX ToF sensor that enable accurate
>   gesture recognition, hand tracking, and liquid level sensing
>   - How the sensor's capabilities can be used to enable or enhance the
>   functionality of a wide range of applications
> 
> Register now
> <http://t.info.st.com/r/?id=h1cbd069c,171e9366,172f2409&cid=stmDM50492&bid=482150044&uid=BXzEPljdf1tIhDp6AAj0hvWuF3wDZMRC>
> 
> *SPEAKER*
> [image: John Kvam - STMicroelectronics]
> <http://t.info.st.com/r/?id=h1cbd069c,171e9366,172f240a&cid=stmDM50492&bid=482150044&uid=BXzEPljdf1tIhDp6AAj0hvWuF3wDZMRC>
> 
> *John Kvam*
> 
> John is a Field Applications Engineer at STMicroelectronics. He has over 40
> years of experience in the development of software and sensors. John has
> worked in the fields of signal analysis, satellite communication, digital
> TV, and real-time applications. For the last few years John has dedicated
> his career to making Time-of-Flight a reality. He is ST's first technical
> point of contact for applications such as laser assisted autofocus, object
> detection and ranging.
> 
> *UPCOMING EVENTS & TECHNICAL SEMINARS*
> 
> Learn new skills and discover the latest technologies through our events,
> online courses and seminars as well as many other resources.
> Find one near you!
> <http://t.info.st.com/r/?id=h1cbd069c,171e9366,172f240b&cid=stmDM50492&bid=482150044&uid=BXzEPljdf1tIhDp6AAj0hvWuF3wDZMRC>
> -------------- next part --------------
> An HTML attachment was scrubbed...
> URL: <http://lists.dprg.org/pipermail/dprglist-dprg.org/attachments/20211027/dcd38010/attachment-0001.html>
> 
> ------------------------------
> 
> Message: 17
> Date: Wed, 27 Oct 2021 14:29:35 -0500
> From: "David P. Anderson" <davida at smu.edu>
> To: Doug Paradis <paradug at gmail.com>, DPRG <dprglist at lists.dprg.org>
> Subject: Re: [Dprglist] Fwd: [Webinar] Using Time-of-Flight Range
>    Sensing to Make Appliances Better
> Message-ID: <6cb2ffb7-dbec-1edc-4cb7-1dbd18e2f0a0 at smu.edu>
> Content-Type: text/plain; charset="utf-8"; Format="flowed"
> 
> Thanks!
> 
>> On 10/27/21 2:27 PM, Doug Paradis wrote:
>> 
>> */[EXTERNAL SENDER]/*
>> 
>> 
>> David,
>>      You expressed interest in using the VL53L5CX and I don't know if 
>> you are on ST's mailing list so I am sending this to you.
>> 
>> Regards,
>> Doug P.
>> 
> -------------- next part --------------
> An HTML attachment was scrubbed...
> URL: <http://lists.dprg.org/pipermail/dprglist-dprg.org/attachments/20211027/f7735521/attachment-0001.html>
> 
> ------------------------------
> 
> Message: 18
> Date: Wed, 27 Oct 2021 14:48:44 -0500
> From: "David P. Anderson" <davida at smu.edu>
> To: DPRG <dprglist at lists.dprg.org>
> Subject: [Dprglist] Bees
> Message-ID: <e067dbcf-0f17-c078-67a3-652b445640d4 at smu.edu>
> Content-Type: text/plain; charset="utf-8"; format=flowed
> 
> During discussion last evening about the magnificent honey bees and how 
> they communicate the location of nectar to each other, I forgot the 
> simplest.
> 
> The bees do their dance showing direction and distance for nectar 
> sources that are far from the hive. They do it horizontally to show the 
> angle to the sun, or vertically, using the zenith for the sun. I guess 
> that makes sense; they only really need the angle. And distance. And 
> quality of the nectar.
> 
> But if the source is near the hive, rather than at some arduous distance, 
> they instead just dance in a circle. I guess the implication is, "just 
> go out and fly around the hive, you'll find it."
> 
> Amazing.
> 
> David
> 
> 
> 
> 
> 
> ------------------------------
> 
> Subject: Digest Footer
> 
> _______________________________________________
> DPRGlist mailing list
> DPRGlist at lists.dprg.org
> http://lists.dprg.org/listinfo.cgi/dprglist-dprg.org
> 
> 
> ------------------------------
> 
> End of DPRGlist Digest, Vol 71, Issue 1
> ***************************************


More information about the DPRGlist mailing list