I recently ran into an issue where the random node selector wasn't working because I was using the same features at two different levels of multiple model decision trees. This problem would be avoidable if Aloha handled the salts automatically.
This can be done by adding a flag, automaticSalts, which defaults to true. When it is set, Aloha will append the depth within the model decision tree to the set of features used for the hash, so the same feature set can never produce the same hash value at two different levels.
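The depth-salting idea above can be sketched as follows. This is an illustrative Python sketch, not Aloha's actual API: the function name, the `automatic_salts` parameter, and the use of MD5 and a `|` separator are all assumptions for demonstration.

```python
import hashlib


def node_selector_hash(features, depth, automatic_salts=True):
    """Hash a feature set for random node selection.

    When automatic_salts is enabled, the tree depth is appended to the
    hashed input, so identical feature sets at different depths of the
    decision tree yield different hash values.  (Hypothetical sketch;
    Aloha's real hashing scheme may differ.)
    """
    parts = list(features)
    if automatic_salts:
        # The depth acts as an automatic salt.
        parts.append("depth=%d" % depth)
    digest = hashlib.md5("|".join(parts).encode("utf-8")).hexdigest()
    return int(digest, 16)


feats = ["user_id=123", "country=US"]
# With automatic salts, the same features at depths 0 and 1 now differ:
assert node_selector_hash(feats, 0) != node_selector_hash(feats, 1)
# Without them, the collision that caused the original bug reappears:
assert (node_selector_hash(feats, 0, automatic_salts=False)
        == node_selector_hash(feats, 1, automatic_salts=False))
```

The defaulting-to-true choice matters here: users who never think about salts get distinct randomization at every level for free, while anyone relying on the old behavior can opt out explicitly.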
This feature should also be extended to CategoricalDistribution models and to any other place where randomization is used.
In order for this to work, something identifying the model should be included in the salt. One idea that came up is a hash of the model's JSON.
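One way to derive such a per-model salt is to hash a canonical serialization of the model JSON, so that two distinct models never share a salt while re-serializing the same model always yields the same one. A minimal sketch, assuming the model specification is available as a JSON object (the function name and choice of SHA-1 are hypothetical):

```python
import hashlib
import json


def model_salt(model_json):
    """Derive a per-model salt from the model's JSON specification.

    The JSON is canonicalized (sorted keys, no insignificant
    whitespace) before hashing, so key order in the source file does
    not change the salt.  (Illustrative sketch, not Aloha's API.)
    """
    canonical = json.dumps(model_json, sort_keys=True, separators=(",", ":"))
    return hashlib.sha1(canonical.encode("utf-8")).hexdigest()


model_a = {"modelType": "DecisionTree", "modelId": 1}
model_b = {"modelType": "DecisionTree", "modelId": 2}
# Different models get different salts:
assert model_salt(model_a) != model_salt(model_b)
# Key order does not affect the salt:
assert model_salt(model_a) == model_salt({"modelId": 1, "modelType": "DecisionTree"})
```

Canonicalization is the important detail: without sorted keys and fixed separators, cosmetic differences in the JSON source would change the salt and silently change the randomization.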