What does 'INFO:tensorflow:Oracle triggered exit' mean with Keras Tuner?
When I run a Keras Tuner search, the code runs for some epochs and then says: 'INFO:tensorflow:Oracle triggered exit'.
What does this mean? I am still able to extract the best hyperparameters. Is it due to early stopping? I have tried both RandomSearch and Hyperband.
Solution 1:[1]
The most likely reason is that the results directory already exists from a previous run.
Try the following steps:
- Change the directory name.
- Restart the kernel.
- Re-run all the code.
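The cleanup part of those steps can also be scripted. A minimal sketch, assuming the tuner's results live in a directory you can safely delete (the name my_tuner_dir below is only an illustration, use whatever you passed as the tuner's directory argument):

```python
import os
import shutil
import tempfile

def reset_tuner_directory(directory):
    """Delete a stale Keras Tuner results directory so the next search
    starts fresh instead of resuming a finished one and exiting early."""
    if os.path.isdir(directory):
        shutil.rmtree(directory)

# Demo with a throwaway directory standing in for the tuner's output.
demo_dir = os.path.join(tempfile.mkdtemp(), "my_tuner_dir")
os.makedirs(os.path.join(demo_dir, "trial_00"))
reset_tuner_directory(demo_dir)
print(os.path.isdir(demo_dir))  # False: a new search will start from scratch
```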
Solution 2:[2]
Try adding the directory argument where you define your tuner, or if you have already added the directory argument, try changing its value.
Note the last line in the RandomSearch example below:
tuner = RandomSearch(
    tune_rnn_model,
    objective='val_accuracy',
    seed=SEED,
    max_trials=MAX_TRIALS,
    directory='**change-this-value**',
)
Solution 3:[3]
I solved this issue by setting these two conditions in my tuner:
- overwrite=False
- a value for max_trials in the Oracle greater than the one I used before the "Oracle triggered exit" error occurred (I'm using kerastuner.oracles.BayesianOptimizationOracle).
Solution 4:[4]
I found the same issue, and the solution can be very easy: remove oracle.json and the other .json files from the directory generated by Keras Tuner, then run it again and it will work.
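A sketch of that cleanup, assuming the tuner's state files sit directly in its results directory (the demo directory and file names below are stand-ins created just for illustration):

```python
import glob
import os
import tempfile

def remove_tuner_json_files(directory):
    """Delete oracle.json and any other .json state files so the tuner
    forgets the finished search and can be run again from scratch."""
    removed = []
    for path in glob.glob(os.path.join(directory, "*.json")):
        os.remove(path)
        removed.append(os.path.basename(path))
    return sorted(removed)

# Demo: fake a tuner directory containing the state files.
demo_dir = tempfile.mkdtemp()
for name in ("oracle.json", "tuner0.json"):
    with open(os.path.join(demo_dir, name), "w") as f:
        f.write("{}")

print(remove_tuner_json_files(demo_dir))  # ['oracle.json', 'tuner0.json']
```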
Solution 5:[5]
I believe this is occurring because you are working on a small dataset, which results in a large number of collisions while performing random search.
Try reducing max_trials in your RandomSearch; that may fix the issue.
Solution 6:[6]
I had the same issue with the Hyperband search.
For me the issue was solved by removing the "Early Stopping" callback from the tuner search.
Solution 7:[7]
For me, the issue was resolved by moving hp = HyperParameters() out of the build_model function, i.e. initializing the hp variable outside of the build_model function.
Solution 8:[8]
I had this issue because I named two hyperparameters with the same names.
E.g., in the build_model(hp) function I had:
def build_model(hp):
    ...
    a = hp.Choice('embedding_dim', [32, 64])
    b = hp.Choice('embedding_dim', [128, 256])
    ...
A final note: be careful to have at least as many hyperparameter combinations as trials. My build_model example has 4 possible combinations of hyperparameters (2 * 2), so set max_trials <= 4.
I hope it helps someone.
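That check can be done up front by counting the combinations before setting max_trials. A small sketch mirroring the build_model above, with the duplicate name fixed to two distinct placeholder names of my own:

```python
from math import prod

# Hypothetical search space matching the example: two Choice
# hyperparameters with two options each (duplicate name fixed).
search_space = {
    "embedding_dim_a": [32, 64],
    "embedding_dim_b": [128, 256],
}

n_combinations = prod(len(values) for values in search_space.values())
print(n_combinations)  # 4 distinct combinations, so set max_trials <= 4
```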
Solution 9:[9]
I had the same question and didn't find what I was looking for.
If the tuner finished at a trial number lower than your max_trials parameter, the most probable reason is that it has already tried every possible combination in the hyperparameter space you defined.
Example: I had 2 parameters for the tuner to try; the first could take 8 values, the second 18. Multiplying these gives 144 combinations, and that is exactly the trial number at which the tuner stopped.
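The arithmetic from that example as a quick sanity check:

```python
# Sizes of each hyperparameter's value set, as in the example above.
param_sizes = [8, 18]

total = 1
for size in param_sizes:
    total *= size

print(total)  # 144: the tuner exhausts the search space and exits at trial 144
```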
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
