diff --git a/notebook/integrate_openai.ipynb b/notebook/integrate_openai.ipynb
index 563a631a35..b8b5ae7ecb 100644
--- a/notebook/integrate_openai.ipynb
+++ b/notebook/integrate_openai.ipynb
@@ -21,7 +21,7 @@
     "\n",
     "FLAML requires `Python>=3.7`. To run this notebook example, please install flaml with the [openai] option:\n",
     "```bash\n",
-    "pip install flaml[openai]\n",
+    "pip install flaml[openai]==1.1.3\n",
     "```"
    ]
   },
@@ -38,7 +38,7 @@
    },
    "outputs": [],
    "source": [
-    "# %pip install flaml[openai] datasets"
+    "# %pip install flaml[openai]==1.1.3 datasets"
    ]
   },
   {
@@ -447,31 +447,33 @@
     "* `optimization_budget` is the total budget allowed to perform the tuning. For example, 5 means 5 dollars are allowed in total, which translates to 250K tokens for the text Davinci model.\n",
     "* `num_sumples` is the number of different hyperparameter configurations which is allowed to try. The tuning will stop after either num_samples trials or after optimization_budget dollars spent, whichever happens first. -1 means no hard restriction in the number of trials and the actual number is decided by `optimization_budget`.\n",
     "\n",
-    "Users can specify tuning data, optimization metric, optimization mode, evaluation function, search spaces etc.. The default search space is \n",
-    "```python\n",
-    "    price1K = {\n",
-    "    \"text-ada-001\": 0.0004,\n",
-    "    \"text-babbage-001\": 0.0005,\n",
-    "    \"text-curie-001\": 0.002,\n",
-    "    \"code-cushman-001\": 0.024,\n",
-    "    \"code-davinci-002\": 0.1,\n",
-    "    \"text-davinci-002\": 0.02,\n",
-    "    \"text-davinci-003\": 0.02,\n",
-    "    }\n",
+    "Users can specify tuning data, optimization metric, optimization mode, evaluation function, search spaces, etc. The default search space is:\n",
     "\n",
-    "    default_search_space = {\n",
-    "    \"model\": tune.choice(list(price1K.keys())),\n",
-    "    \"temperature_or_top_p\": tune.choice(\n",
-    "    [\n",
-    "    {\"temperature\": tune.uniform(0, 1)},\n",
-    "    {\"top_p\": tune.uniform(0, 1)},\n",
-    "    ]\n",
-    "    ),\n",
-    "    \"max_tokens\": tune.lograndint(50, 1000),\n",
-    "    \"n\": tune.randint(1, 100),\n",
-    "    \"prompt\": \"{prompt}\",\n",
-    "    }\n",
+    "```python\n",
+    "price1K = {\n",
+    "    \"text-ada-001\": 0.0004,\n",
+    "    \"text-babbage-001\": 0.0005,\n",
+    "    \"text-curie-001\": 0.002,\n",
+    "    \"code-cushman-001\": 0.024,\n",
+    "    \"code-davinci-002\": 0.1,\n",
+    "    \"text-davinci-002\": 0.02,\n",
+    "    \"text-davinci-003\": 0.02,\n",
+    "}\n",
+    "\n",
+    "default_search_space = {\n",
+    "    \"model\": tune.choice(list(price1K.keys())),\n",
+    "    \"temperature_or_top_p\": tune.choice(\n",
+    "        [\n",
+    "            {\"temperature\": tune.uniform(0, 1)},\n",
+    "            {\"top_p\": tune.uniform(0, 1)},\n",
+    "        ]\n",
+    "    ),\n",
+    "    \"max_tokens\": tune.lograndint(50, 1000),\n",
+    "    \"n\": tune.randint(1, 100),\n",
+    "    \"prompt\": \"{prompt}\",\n",
+    "}\n",
     "```\n",
+    "\n",
     "The default search space can be overriden by users' input.\n",
     "For example, the following code specifies two choices for the model, four choices for the prompt and a fixed list of stop sequences. For hyperparameters which don't appear in users' input, the default search space will be used."
    ]