Ensure consistent BLE scan behavior #2718
Conversation
wiring/src/spark_wiring_ble.cpp (Outdated)
```cpp
if (!delegator->filter_.allowDuplicates() && delegator->isCachedDevice(event->peer_addr)) {
    return;
}
delegator->cachedDevices_.append(event->peer_addr);
```
This should not be populated when `allowDuplicates()` is set, to avoid uncontrollable growth of the vector; otherwise it will keep filling up with duplicates.
The same thing can happen in normal operation depending on scan duration. It's highly unlikely, but should potentially be taken care of somehow as well.
```cpp
if (delegator->isCachedDevice(event->peer_addr)) {
    return;
}
delegator->cachedDevices_.append(event->peer_addr);
```
@XuGuohui Do you think it would make sense to put a hard limit here on a maximum number of results?
I don't think it would make sense, because when the vector reaches the limit the device will start reporting a large number of duplicates, which looks more like a bug and might alarm the user. Whichever way we go, we should document the limitation.
I'm talking specifically about the case of non-duplicates. E.g. a moving device with a long scan duration and capturing a large number of devices around.
What about introducing an option, `BleScanFilter.cachedDeviceSize()`, so that the user explicitly acknowledges the limitation? Otherwise the cache can grow uncontrollably.
Doubt anyone would use it :) A relatively large default (e.g. 512-1024) is probably better, but honestly it's up to you; leaving it as-is is potentially fine as well.
Cool. I'll leave it as-is; as you said, it's highly unlikely to happen, and we can react to it if customers encounter a shortage of RAM.
Cc. @rickkas7 Introduced an option
Problem
`BLE.scan()` and `BLE.scanWithFilter()` filter duplicate results on Gen3, but not on Gen4.
Solution
Make the BLE HAL report all of the scanned results and implement a filtering mechanism in the wiring layer instead.
Steps to Test
Build and run the tests/app/ble/scanner app.
References
N/A
Completeness