Posted to dev@mynewt.apache.org by Kevin Townsend <ke...@adafruit.com.INVALID> on 2018/06/29 21:02:09 UTC

Sensor API Timing Constraints

I'm working on some sensor drivers for Mynewt 1.4, and have run into an 
issue that I'm not sure has a single perfect answer, but should perhaps 
be addressed or further discussed.

Most sensors have specific timing constraints, and in certain instances 
these can change dynamically. The TSL2561 light sensor, for example, can 
have its integration time set from 13ms to 402ms, depending on the 
level of light sensitivity required. The TSL2591 driver I'm writing 
(since the TSL2561 is EOL) has a variable integration time from 100..600ms.

The 'problem' is that there is no concept of minimum time between sample 
reads in the sensor API at present (to my knowledge, feel free to 
correct me!), and I'm not sure of the best way to insert this delay between 
valid reads so that the data we get back can be considered reliable or 
fresh.

If a sensor has a 300ms delay between valid samples, for example, we can 
still request data every 10ms, but the response is undefined in the 
sense that each sensor will handle this differently. In the case of the 
TSL2561 and TSL2591 the first sample requested will likely be invalid 
since a single valid integration cycle hasn't finished, and then it will 
buffer and continue to return values until the NEXT valid sample is 
available. This is visible in the following sequence where the first IR 
reading is completely out of range, and some subsequent values are 
actually cached entries that might not reflect current light levels 
since they happen before the next integration time elapses:

011023 compat> tsl2591 r 10
011799 Full:  30
011799 IR:    61309
011799 Full:  30
011800 IR:    13
011801 Full:  30
011801 IR:    13
011801 Full:  30
011801 IR:    13
011802 Full:  30
011802 IR:    13
011802 Full:  30
011803 IR:    13
011803 Full:  30
011803 IR:    13
011804 Full:  30
011804 IR:    13
011804 Full:  30
011804 IR:    13
011805 Full:  30
011805 IR:    13

I'm not sure what the best way to handle this is, though.

Some options are:

  * Add a blocking delay in the read task to take into account the
    current minimum delay between valid samples (at the risk of causing
    problems on the I2C bus if other devices perform transactions in
    between)
  * Add a concept of 'minimum time' between sample reads at the sensor
    API level and enforce this at a higher level (a rough sketch follows
    after this list). Keep in mind that this minimum value can change
    dynamically based on sensor config or auto-ranging! Read requests
    that occur before this delay has passed could then have one of the
    following consequences:
      o Return an appropriate error value
      o Return the previous cached value with the sample still marked as
        valid
      o Return the previous cached value with the sample marked as invalid
      o Other?
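
To make the second option more concrete, here is a very rough sketch of a
caller-side guard. Nothing below exists in the sensor API today, and
sensor_get_min_read_itvl() is purely hypothetical; it's only meant to show
the shape the addition might take:

#include "os/os.h"
#include "defs/error.h"
#include "sensor/sensor.h"

/*
 * Sketch only: refuse to issue a read before the driver's advertised
 * minimum interval has passed. sensor_get_min_read_itvl() does not
 * exist today; it stands in for whatever accessor we would add.
 */
static os_time_t g_last_read;

static int
read_if_fresh(struct sensor *sensor, sensor_type_t type,
              sensor_data_func_t cb, void *cb_arg)
{
    uint32_t min_ms = 0;
    uint32_t elapsed_ms;

    /* Hypothetical: ask the driver for its current minimum interval,
     * which may change with integration time or auto-ranging. */
    sensor_get_min_read_itvl(sensor, &min_ms);

    elapsed_ms = (uint32_t)((os_time_get() - g_last_read) * 1000 /
                            OS_TICKS_PER_SEC);
    if (elapsed_ms < min_ms) {
        return SYS_EAGAIN;    /* or hand back the cached sample, etc. */
    }

    g_last_read = os_time_get();
    return sensor_read(sensor, type, cb, cb_arg, OS_TIMEOUT_NEVER);
}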

There are other solutions, but I was hoping to get some feedback on this 
to hear what other people think of the issue of the current disparity 
between the sensor API and real-world timing constraints of the sensors 
themselves. An argument could be made that the end user should know and 
work with the constraints of their HW, but couldn't this also be handled 
with some small API additions?


Re: Sensor API Timing Constraints

Posted by Kevin Townsend <ke...@adafruit.com.INVALID>.
To try to frame this in a concrete example, I've pushed out a PR with 
the initial TSL2591 driver. It would be worthwhile discussing there how 
to handle minimum delays between valid sensor reads: 
https://github.com/apache/mynewt-core/pull/1237


Re: Sensor API Timing Constraints

Posted by Kevin Townsend <ke...@adafruit.com.INVALID>.
Hi Amr,

Thanks for the relevant response and pointers. I'll dig deeper into the 
suggested functions, and the BME680 is a really useful sensor, so any 
improvements there are welcome!

The solution you're proposing is definitely valid, but it puts all of the 
timing constraint considerations solely on the shoulders of the 
developer and at the application level. That's not unfair (you should 
fully understand your HW!), but I can't help but think that there will 
be situations where the timing can change dynamically and that a small 
addition to the sensor API would be helpful to expose 'minimum sample 
time' as a value. This would be useful in situations where you are 
implementing auto-ranging like with a light sensor or accelerometer, 
where you need to dynamically toggle the range up or down to get a valid 
sample, but the end user may not be aware of these changes.

This is what Android does at a low level with its own sensor API, just 
as an example: 
https://developer.android.com/reference/android/hardware/Sensor.html#getMinDelay()
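
Just to sketch what the driver side could look like here (every name below
is made up, not real code from any Mynewt driver), the idea is that the
driver refreshes the advertised value whenever its own configuration
changes:

#include <stdint.h>
#include "sensor/sensor.h"

/* Hypothetical sketch: the driver keeps its advertised minimum sample
 * time in sync with its configuration, so auto-ranging changes made
 * inside the driver are still visible through a getMinDelay()-style
 * accessor. None of these names exist in mynewt-core. */
struct light_drv {
    struct sensor sensor;     /* standard Mynewt sensor object */
    uint32_t min_sample_ms;   /* hypothetical minimum time between fresh reads */
};

static int
light_drv_set_itime_ms(struct light_drv *drv, uint32_t itime_ms)
{
    int rc;

    rc = light_drv_write_itime_reg(drv, itime_ms);  /* hypothetical helper */
    if (rc) {
        return rc;
    }

    /* Update the value the framework or app would query before reading. */
    drv->min_sample_ms = itime_ms;
    return 0;
}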

There may be some other metadata in the Android API worth considering 
as well, but I think exposing min delay at the sensor driver level 
across all devices would be useful, and it allows for dynamic changes to 
timing constraints without the user having to be aware of all the 
conditions behind them.

Kevin


Re: Sensor API Timing Constraints

Posted by Amr Bekhit <am...@gmail.com>.
You could also have a look at the Sensor Listener API and the
sensor_set_poll_rate_ms function - if all you need is to read the
sensor at a specified interval, that API could be all that you need.
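
For the TSL2591 case that would look roughly like the listener tutorial
code below (the device name and poll rate are just placeholders for
illustration):

#include <assert.h>
#include "sensor/sensor.h"
#include "console/console.h"

#define LIGHT_DEV_NAME   "tsl2591_0"   /* placeholder device name */
#define LIGHT_POLL_MS    600           /* at or above the longest integration time */

static int
light_data_cb(struct sensor *sensor, void *arg, void *databuf,
              sensor_type_t type)
{
    /* For SENSOR_TYPE_LIGHT, databuf points at a struct sensor_light_data. */
    console_printf("light sample received\n");
    return 0;
}

static struct sensor_listener light_listener = {
    .sl_sensor_type = SENSOR_TYPE_LIGHT,
    .sl_func = light_data_cb,
    .sl_arg = NULL,
};

static void
light_poll_init(void)
{
    struct sensor *s;

    s = sensor_mgr_find_next_bydevname(LIGHT_DEV_NAME, NULL);
    assert(s != NULL);

    sensor_register_listener(s, &light_listener);

    /* Let the framework drive the timing; with the poll rate at or above
     * the integration time, each callback should carry a fresh sample. */
    sensor_set_poll_rate_ms(LIGHT_DEV_NAME, LIGHT_POLL_MS);
}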

Re: Sensor API Timing Constraints

Posted by Amr Bekhit <am...@gmail.com>.
It just so happens that I've recently completed the driver for the
Bosch BME680 environmental sensor as well as a separate module for the
Bosch BSEC sensor fusion library. I've yet to submit the patches so I
don't know if the method I chose would be accepted, but I'd like to
share my own thoughts on the discussion. First, let me give a brief
background on the sensor and the BSEC library.

The BME680 contains sensors for temperature, humidity, pressure and
gas. To take a reading including the gas sensor, you first trigger a
reading (which involves the sensor measuring the first three
parameters and then turning on the gas sensor heater for a period of
time), after which you wait a specified amount of time (approximately
200ms) and then read the results. The BSEC library requires that the
sensor is read at precise intervals (it contains profiles for 3s and
300s periods) and the readings are combined together to perform
compensation, background calibration and air quality measurement.
Bosch states that the sensor is not really meant to be used standalone,
but rather in conjunction with the BSEC library.

I decided to have the sensor driver layer only perform the
trigger-wait-read process (so my sensor_read function triggers a
reading, calls os_time_delay to wait, and then reads and returns the
result). Regarding the BSEC library, because this is a bit higher
level, I packaged that into a separate package that spawns its own
thread that takes care of the timing requirements for reading and
provides another API for reading the processed results. So in summary,
the BSEC library uses the mynewt sensor API to access the BME680, but
my application code uses a separate custom API to access the processed
data. I did it this way because I didn't feel the sensor API was meant
to operate at such a high level that it would spawn its own background
threads, so I put that in a separate package.
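
In rough outline the read function ends up looking something like this
(the bme680_* helpers below are placeholders, not the actual driver code):

#include "os/os.h"
#include "sensor/sensor.h"

/* Sketch of the trigger-wait-read pattern described above. The
 * bme680_trigger_measurement()/bme680_read_results() helpers are
 * placeholders, not functions from the real driver. */
static int
bme680_sensor_read(struct sensor *sensor, sensor_type_t type,
                   sensor_data_func_t data_func, void *data_arg,
                   uint32_t timeout)
{
    int rc;

    (void)timeout;

    /* 1. Trigger a forced-mode measurement (T/P/H, then the gas heater). */
    rc = bme680_trigger_measurement(sensor);
    if (rc) {
        return rc;
    }

    /* 2. Block this task until the conversion is done (roughly 200 ms). */
    os_time_delay((200 * OS_TICKS_PER_SEC) / 1000);

    /* 3. Read the results and hand them to the caller's callback. */
    return bme680_read_results(sensor, type, data_func, data_arg);
}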

So in your case, I would have simply implemented the sensor read as
part of the sensor API, but created a separate package that is then
responsible for regularly reading from the sensor and providing an
"always ready" result to the application.

I guess the "right way" depends on how high level the sensor API is
intended to be - I've made the assumption that the API aims to cover
simply taking a single measurement from the sensor, rather than
providing a fully processed and always-up-to-date result.
