Add CombinedOptimizer #990
base: main
Conversation
📝 Walkthrough

This PR introduces a new optimization strategy by adding the CombinedOptimizer.
Sequence Diagram(s)

sequenceDiagram
    participant Runner as OptimizationRunner
    participant Optimizer as CombinedOptimizer
    Runner->>Optimizer: Optimize(OptimizationSnapshot)
    Note right of Optimizer: Gather active nodes and unique device IDs
    Optimizer->>Optimizer: OptimizeRxAdjRssiAndPathAbsorption()
    Optimizer->>Optimizer: OptimizeNodeAbsorptions()
    Optimizer-->>Runner: Return OptimizationResults (includes optimized values and error)
Actionable comments posted: 3
🧹 Nitpick comments (1)
src/Optimizers/OptimizationRunner.cs (1)
19-19: Consider injecting a logger into CombinedOptimizer.

Although referencing _state alone might suffice for certain aspects of optimization, it may be beneficial to pass in the logger as well to provide consistent logging throughout. If extensive logs are required during the two-step optimization, consider updating the constructor signature of CombinedOptimizer to accept a logger or reuse the one already injected here.
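As a rough illustration of that suggestion (not the PR's actual code: the logging abstraction, the State placeholder, and the method shown are assumptions), constructor injection could look like this:

```csharp
using Microsoft.Extensions.Logging;

public sealed class State { }  // placeholder for the project's shared state type

// Hypothetical shape only; the real CombinedOptimizer takes _state today, and the
// project may use a different logging abstraction (e.g. Serilog's static Log).
public class CombinedOptimizer
{
    private readonly State _state;
    private readonly ILogger<CombinedOptimizer> _logger;

    public CombinedOptimizer(State state, ILogger<CombinedOptimizer> logger)
    {
        _state = state;
        _logger = logger;
    }

    public void ReportStep(string step, double error)
    {
        // Structured, consistent logging across both optimization steps
        _logger.LogInformation("CombinedOptimizer {Step} finished with error {Error}", step, error);
    }
}
```

The alternative mentioned above, reusing the logger already injected into OptimizationRunner, simply means passing that existing instance through when the runner constructs the optimizer.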
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (2)
- src/Optimizers/CombinedOptimizer.cs (1 hunks)
- src/Optimizers/OptimizationRunner.cs (1 hunks)
🧰 Additional context used
🪛 GitHub Check: build
src/Optimizers/CombinedOptimizer.cs
[warning] 162-162: Cannot convert null literal to non-nullable reference type.
[warning] 162-162: Cannot convert null literal to non-nullable reference type.
[failure] 125-125: 'ObjectiveFunction' does not contain a definition for 'ValueAndGradient'
[failure] 125-125: 'ObjectiveFunction' does not contain a definition for 'ValueAndGradient'
🪛 GitHub Actions: Build and test
src/Optimizers/CombinedOptimizer.cs
[warning] 162-162: Cannot convert null literal to non-nullable reference type.
[warning] 162-162: Cannot convert null literal to non-nullable reference type.
[warning] 34-34: Possible null reference argument for parameter 'optimization' in '(Dictionary<string, double> RxAdjRssi, Dictionary<(string, string), double> PathAbsorption, double Error) CombinedOptimizer.OptimizeRxAdjRssiAndPathAbsorption(List allNodes, List uniqueDeviceIds, ConfigOptimization optimization)'.
[error] 125-125: 'ObjectiveFunction' does not contain a definition for 'ValueAndGradient'.
🪛 GitHub Actions: Deploy to Docker
src/Optimizers/CombinedOptimizer.cs
[warning] 162-162: Cannot convert null literal to non-nullable reference type.
[warning] 162-162: Cannot convert null literal to non-nullable reference type.
[warning] 34-34: Possible null reference argument for parameter 'optimization' in '(Dictionary<string, double> RxAdjRssi, Dictionary<(string, string), double> PathAbsorption, double Error) CombinedOptimizer.OptimizeRxAdjRssiAndPathAbsorption(List allNodes, List uniqueDeviceIds, ConfigOptimization optimization)'.
[error] 125-125: 'ObjectiveFunction' does not contain a definition for 'ValueAndGradient'.
⏰ Context from checks skipped due to timeout of 90000ms (1)
- GitHub Check: Analyze (csharp)
private double CalculateError(List<Measure> nodes, Dictionary<string, double> rxAdjRssiDict,
    Dictionary<string, double> nodeAbsorptionDict = null, Dictionary<(string, string), double> pathAbsorptionDict = null)
🛠️ Refactor suggestion
Resolve the “Cannot convert null literal to non-nullable reference type” warnings.
Since you pass null by default, consider marking these parameters as nullable or adjusting the method signature to avoid the warning, for example:

-private double CalculateError(List<Measure> nodes, Dictionary<string, double> rxAdjRssiDict,
-    Dictionary<string, double> nodeAbsorptionDict = null, Dictionary<(string, string), double> pathAbsorptionDict = null)
+private double CalculateError(
+    List<Measure> nodes,
+    Dictionary<string, double> rxAdjRssiDict,
+    Dictionary<string, double>? nodeAbsorptionDict = null,
+    Dictionary<(string, string), double>? pathAbsorptionDict = null)
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
private double CalculateError(
    List<Measure> nodes,
    Dictionary<string, double> rxAdjRssiDict,
    Dictionary<string, double>? nodeAbsorptionDict = null,
    Dictionary<(string, string), double>? pathAbsorptionDict = null)
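Once the parameters are nullable, the method body has to tolerate the null defaults. A minimal, self-contained sketch of one way to do that (the helper name, tuple shape, and default value are illustrative, not taken from the PR):

```csharp
using System.Collections.Generic;

static class NullableDefaultsSketch
{
    // Coalesce the optional dictionary once, then look values up with a fallback,
    // so the rest of the calculation never dereferences null.
    public static double LookupAbsorption(
        Dictionary<(string Tx, string Rx), double>? pathAbsorptionDict,
        (string Tx, string Rx) path,
        double defaultAbsorption = 3.0)
    {
        pathAbsorptionDict ??= new Dictionary<(string Tx, string Rx), double>();
        return pathAbsorptionDict.TryGetValue(path, out var value) ? value : defaultAbsorption;
    }
}
```

Inside CalculateError the same coalesce-then-TryGetValue pattern would apply to nodeAbsorptionDict as well.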
🧰 Tools
🪛 GitHub Check: build
[warning] 162-162: Cannot convert null literal to non-nullable reference type.
[warning] 162-162: Cannot convert null literal to non-nullable reference type.
🪛 GitHub Actions: Build and test
[warning] 162-162: Cannot convert null literal to non-nullable reference type.
[warning] 162-162: Cannot convert null literal to non-nullable reference type.
🪛 GitHub Actions: Deploy to Docker
[warning] 162-162: Cannot convert null literal to non-nullable reference type.
[warning] 162-162: Cannot convert null literal to non-nullable reference type.
try
{
    // Step 1: Optimize RxAdjRssi and path-specific absorptions
    var (rxAdjRssiDict, pathAbsorptionDict, error) = OptimizeRxAdjRssiAndPathAbsorption(allNodes, uniqueDeviceIds, optimization);
Check for null optimization before usage.

The pipeline warns about a possible null reference to optimization. Ensure you either enforce a non-null guarantee before calling OptimizeRxAdjRssiAndPathAbsorption or handle the case where _state.Config?.Optimization is null.
- var (rxAdjRssiDict, pathAbsorptionDict, error) = OptimizeRxAdjRssiAndPathAbsorption(allNodes, uniqueDeviceIds, optimization);
+ if (optimization == null)
+ {
+ Log.Warning("Optimization configuration is not available. Skipping step 1.");
+ return results;
+ }
+ var (rxAdjRssiDict, pathAbsorptionDict, error) = OptimizeRxAdjRssiAndPathAbsorption(allNodes, uniqueDeviceIds, optimization);
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
if (optimization == null)
{
    Log.Warning("Optimization configuration is not available. Skipping step 1.");
    return results;
}
var (rxAdjRssiDict, pathAbsorptionDict, error) = OptimizeRxAdjRssiAndPathAbsorption(allNodes, uniqueDeviceIds, optimization);
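The other option the comment mentions, enforcing a non-null guarantee before the call, could look roughly like this self-contained sketch (the generic helper and the exception message are illustrative; the project's ConfigOptimization type would take the place of T):

```csharp
using System;

static class ConfigGuardSketch
{
    // Fail fast instead of silently skipping step 1 when the optimization config
    // is missing. T stands in for the project's ConfigOptimization type.
    public static T RequireOptimization<T>(T? optimization) where T : class =>
        optimization ?? throw new InvalidOperationException(
            "Optimization configuration is required for CombinedOptimizer.");
}
```

The call site would then read something like var opt = ConfigGuardSketch.RequireOptimization(_state.Config?.Optimization);, making the non-null contract explicit before OptimizeRxAdjRssiAndPathAbsorption runs.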
🧰 Tools
🪛 GitHub Actions: Build and test
[warning] 34-34: Possible null reference argument for parameter 'optimization' in '(Dictionary<string, double> RxAdjRssi, Dictionary<(string, string), double> PathAbsorption, double Error) CombinedOptimizer.OptimizeRxAdjRssiAndPathAbsorption(List allNodes, List uniqueDeviceIds, ConfigOptimization optimization)'.
🪛 GitHub Actions: Deploy to Docker
[warning] 34-34: Possible null reference argument for parameter 'optimization' in '(Dictionary<string, double> RxAdjRssi, Dictionary<(string, string), double> PathAbsorption, double Error) CombinedOptimizer.OptimizeRxAdjRssiAndPathAbsorption(List allNodes, List uniqueDeviceIds, ConfigOptimization optimization)'.
var obj = ObjectiveFunction.ValueAndGradient(x =>
{
    // Objective function (error calculation) remains the same
    var value = CalculateError(allNodes, rxAdjRssiDict, pathAbsorptionDict: absorptionDict);

    // Numerically approximate the gradient
    var gradient = new double[x.Count];
    double epsilon = 1e-5;

    for (int i = 0; i < x.Count; i++)
    {
        var xPlusEps = x.Clone();
        xPlusEps[i] += epsilon;

        var errorPlusEps = CalculateError(allNodes, rxAdjRssiDict, pathAbsorptionDict: absorptionDict);
        gradient[i] = (errorPlusEps - value) / epsilon;
    }

    return (value, Vector<double>.Build.Dense(gradient));
});
Fix the compile-time error and undefined variable reference.
- The pipeline logs an error indicating 'ObjectiveFunction' does not contain a definition for 'ValueAndGradient'. Verify that you are using a Math.NET version which supports ValueAndGradient; otherwise, update to a supported function (e.g., ObjectiveFunctionGradient.Create(...)).
- The variable absorptionDict is referenced in this block but never declared or assigned. You likely meant to use nodeAbsorptionDict.
private Dictionary<string, double> OptimizeNodeAbsorptions(...){
var obj = ObjectiveFunction. /* Confirm correct Gradient variant */
- ValueAndGradient(x =>
+ Gradient(x =>
{
// ...
- var value = CalculateError(allNodes, rxAdjRssiDict, pathAbsorptionDict: absorptionDict);
+ var value = CalculateError(allNodes, rxAdjRssiDict, nodeAbsorptionDict: nodeAbsorptionDict);
// ...
});
// ...
}
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
var obj = ObjectiveFunction.Gradient(x =>
{
    // Objective function (error calculation) remains the same
    var value = CalculateError(allNodes, rxAdjRssiDict, nodeAbsorptionDict: nodeAbsorptionDict);

    // Numerically approximate the gradient
    var gradient = new double[x.Count];
    double epsilon = 1e-5;

    for (int i = 0; i < x.Count; i++)
    {
        var xPlusEps = x.Clone();
        xPlusEps[i] += epsilon;

        var errorPlusEps = CalculateError(allNodes, rxAdjRssiDict, nodeAbsorptionDict: nodeAbsorptionDict);
        gradient[i] = (errorPlusEps - value) / epsilon;
    }

    return (value, Vector<double>.Build.Dense(gradient));
});
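One caveat worth flagging about both the original block and the suggestion above: xPlusEps is built but never passed into the error evaluation, so errorPlusEps always equals value and every gradient entry comes out as zero. Below is a self-contained sketch of the finite-difference step itself; the error callback is a stand-in for CalculateError over the decoded parameters, not the PR's actual code.

```csharp
using System;
using MathNet.Numerics.LinearAlgebra;

static class FiniteDifferenceSketch
{
    // Forward-difference gradient: the perturbed vector must reach the error
    // function, otherwise the numerator is always zero.
    public static (double Value, Vector<double> Gradient) ValueAndGradient(
        Vector<double> x,
        Func<Vector<double>, double> error,
        double epsilon = 1e-5)
    {
        var value = error(x);
        var gradient = new double[x.Count];

        for (int i = 0; i < x.Count; i++)
        {
            var xPlusEps = x.Clone();
            xPlusEps[i] += epsilon;

            // Evaluate at the perturbed point (the step the PR code skips)
            gradient[i] = (error(xPlusEps) - value) / epsilon;
        }

        return (value, Vector<double>.Build.Dense(gradient));
    }
}
```

In OptimizeNodeAbsorptions the error callback would need to rebuild nodeAbsorptionDict from the perturbed vector before calling CalculateError, so that the perturbation actually changes the computed error.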
🧰 Tools
🪛 GitHub Check: build
[failure] 125-125: 'ObjectiveFunction' does not contain a definition for 'ValueAndGradient'
[failure] 125-125: 'ObjectiveFunction' does not contain a definition for 'ValueAndGradient'
🪛 GitHub Actions: Build and test
[error] 125-125: 'ObjectiveFunction' does not contain a definition for 'ValueAndGradient'.
🪛 GitHub Actions: Deploy to Docker
[error] 125-125: 'ObjectiveFunction' does not contain a definition for 'ValueAndGradient'.
Summary by CodeRabbit

New Features
- Introduced a combined optimization strategy (CombinedOptimizer) that tunes RxAdjRssi and absorption values in a two-step process.

Refactor
- Wired the new optimizer into OptimizationRunner.