{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Time Series Forecasting with FLAML Library"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## 1. Introduction\n",
"\n",
"FLAML is a Python library (https://github.com/microsoft/FLAML) designed to automatically produce accurate machine learning models with low computational cost. It is fast and economical. The simple and lightweight design makes it easy to use and extend, such as adding new learners. FLAML can\n",
"\n",
" - serve as an economical AutoML engine,\n",
" - be used as a fast hyperparameter tuning tool, or\n",
" - be embedded in self-tuning software that requires low latency & resource in repetitive tuning tasks.\n",
"\n",
"In this notebook, we demonstrate how to use the FLAML library for time series forecasting tasks: univariate time series forecasting (only time), multivariate time series forecasting (with exogenous variables), and forecasting discrete values.\n",
"\n",
"FLAML requires Python>=3.7. To run this notebook example, please install flaml with the notebook and forecast option:\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"%pip install flaml[notebook,ts_forecast]\n",
"# avoid versions 1.0.2 to 1.0.5 for this notebook due to a bug in arima and sarimax's init config"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## 2. Forecast Problem\n",
"\n",
"### Load data and preprocess\n",
"\n",
"Import the co2 dataset from statsmodels. The dataset, “Atmospheric CO2 from Continuous Air Samples at Mauna Loa Observatory, Hawaii, U.S.A.,” contains CO2 samples collected from March 1958 to December 2001. The task is to predict monthly CO2 levels given only timestamps."
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {},
"outputs": [],
"source": [
"import statsmodels.api as sm\n",
"data = sm.datasets.co2.load_pandas().data\n",
"# data is given in weeks, but the task is to predict monthly, so use monthly averages instead\n",
"data = data['co2'].resample('MS').mean()\n",
"data = data.bfill().ffill() # makes sure there are no missing values\n",
"data = data.to_frame().reset_index()"
]
},
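As a side note on the resampling step above, `resample('MS').mean()` groups a time-indexed series into month-start bins and averages each bin. A minimal, self-contained sketch on a toy weekly series (the dates and values here are made up for illustration):

```python
import pandas as pd

# toy weekly series spanning January and February 2000
idx = pd.date_range("2000-01-01", periods=9, freq="W-SAT")
s = pd.Series(range(9), index=idx)

# 'MS' = month start: each output label is the first day of the month
monthly = s.resample("MS").mean()
print(monthly)
# 2000-01-01 -> 2.0 (mean of the 5 January weeks: 0..4)
# 2000-02-01 -> 6.5 (mean of the 4 February weeks: 5..8)
```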
{
"cell_type": "code",
"execution_count": 3,
"metadata": {},
"outputs": [],
"source": [
"# split the data into a train dataframe and X_test and y_test dataframes, where the number of samples for test is equal to\n",
"# the number of periods the user wants to predict\n",
"num_samples = data.shape[0]\n",
"time_horizon = 12\n",
"split_idx = num_samples - time_horizon\n",
"train_df = data[:split_idx] # train_df is a dataframe with two columns: timestamp and label\n",
"X_test = data[split_idx:]['index'].to_frame() # X_test is a dataframe with dates for prediction\n",
"y_test = data[split_idx:]['co2'] # y_test is a series of the values corresponding to the dates for prediction"
]
},
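The split above is purely positional: the last `time_horizon` rows are held out as the test period, and everything before them becomes training data. A toy sketch of the same idea (the column names here are illustrative, not tied to the co2 data):

```python
import pandas as pd

# toy monthly data: 10 periods, forecast horizon of 3
df = pd.DataFrame({
    "index": pd.date_range("2000-01-01", periods=10, freq="MS"),
    "y": range(10),
})
time_horizon = 3
split_idx = len(df) - time_horizon

train_df = df[:split_idx]                     # all rows before the horizon
X_test = df[split_idx:]["index"].to_frame()   # future timestamps only
y_test = df[split_idx:]["y"]                  # ground-truth values to compare against

print(len(train_df), len(X_test), len(y_test))  # 7 3 3
```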
{
"cell_type": "code",
"execution_count": 4,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"<Figure size 432x288 with 1 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
}
],
"source": [
"import matplotlib.pyplot as plt\n",
"\n",
"plt.plot(train_df['index'], train_df['co2'])\n",
"plt.xlabel('Date')\n",
"plt.ylabel('CO2 Levels')\n",
"plt.show()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Run FLAML\n",
"\n",
"In the FLAML AutoML run configuration, users can specify the task type, time budget, error metric, learner list, whether to subsample, resampling strategy type, and so on. All of these arguments have default values, which are used when users do not provide them."
]
},
{
"cell_type": "code",
"execution_count": 5,
"metadata": {},
"outputs": [],
"source": [
"# import the AutoML class from the flaml package\n",
"from flaml import AutoML\n",
"automl = AutoML()"
]
},
{
"cell_type": "code",
"execution_count": 6,
"metadata": {},
"outputs": [],
"source": [
"settings = {\n",
" \"time_budget\": 240, # total running time in seconds\n",
" \"metric\": 'mape', # primary metric for validation: 'mape' is generally used for forecast tasks\n",
" \"task\": 'ts_forecast', # task type\n",
" \"log_file_name\": 'CO2_forecast.log', # flaml log file\n",
" \"eval_method\": \"holdout\", # validation method can be chosen from ['auto', 'holdout', 'cv']\n",
" \"seed\": 7654321, # random seed\n",
"}"
]
},
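For reference, the `'mape'` metric chosen above is the mean absolute percentage error. A minimal sketch of the computation (FLAML's internal implementation may differ in details such as how zero actuals are handled):

```python
import numpy as np

def mape(y_true, y_pred):
    # mean absolute percentage error: average of |error| / |actual|
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return float(np.mean(np.abs((y_true - y_pred) / y_true)))

# errors of 10%, 10%, and 0% average to ~6.67%
print(mape([100, 200, 400], [110, 180, 400]))
```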
{
"cell_type": "code",
"execution_count": 7,
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"[flaml.automl: 07-28 21:10:44] {2478} INFO - task = ts_forecast\n",
"[flaml.automl: 07-28 21:10:44] {2480} INFO - Data split method: time\n",
"[flaml.automl: 07-28 21:10:44] {2483} INFO - Evaluation method: holdout\n",
"[flaml.automl: 07-28 21:10:44] {2552} INFO - Minimizing error metric: mape\n",
"[flaml.automl: 07-28 21:10:44] {2694} INFO - List of ML learners in AutoML Run: ['lgbm', 'rf', 'xgboost', 'extra_tree', 'xgb_limitdepth', 'prophet', 'arima', 'sarimax']\n",
"[flaml.automl: 07-28 21:10:44] {2986} INFO - iteration 0, current learner lgbm\n",
"[flaml.automl: 07-28 21:10:44] {3114} INFO - Estimated sufficient time budget=2005s. Estimated necessary time budget=2s.\n",
"[flaml.automl: 07-28 21:10:44] {3161} INFO - at 0.7s,\testimator lgbm's best error=0.0621,\tbest estimator lgbm's best error=0.0621\n",
"[flaml.automl: 07-28 21:10:44] {2986} INFO - iteration 1, current learner lgbm\n",
"[flaml.automl: 07-28 21:10:44] {3161} INFO - at 0.7s,\testimator lgbm's best error=0.0621,\tbest estimator lgbm's best error=0.0621\n",
"[flaml.automl: 07-28 21:10:44] {2986} INFO - iteration 2, current learner lgbm\n",
"[flaml.automl: 07-28 21:10:44] {3161} INFO - at 0.8s,\testimator lgbm's best error=0.0277,\tbest estimator lgbm's best error=0.0277\n",
"[flaml.automl: 07-28 21:10:44] {2986} INFO - iteration 3, current learner lgbm\n",
"[flaml.automl: 07-28 21:10:44] {3161} INFO - at 0.8s,\testimator lgbm's best error=0.0277,\tbest estimator lgbm's best error=0.0277\n",
"[flaml.automl: 07-28 21:10:44] {2986} INFO - iteration 4, current learner lgbm\n",
"[flaml.automl: 07-28 21:10:44] {3161} INFO - at 0.9s,\testimator lgbm's best error=0.0175,\tbest estimator lgbm's best error=0.0175\n",
"[flaml.automl: 07-28 21:10:44] {2986} INFO - iteration 5, current learner lgbm\n",
"[flaml.automl: 07-28 21:10:44] {3161} INFO - at 0.9s,\testimator lgbm's best error=0.0055,\tbest estimator lgbm's best error=0.0055\n",
"[flaml.automl: 07-28 21:10:44] {2986} INFO - iteration 6, current learner lgbm\n",
"[flaml.automl: 07-28 21:10:44] {3161} INFO - at 1.0s,\testimator lgbm's best error=0.0055,\tbest estimator lgbm's best error=0.0055\n",
"[flaml.automl: 07-28 21:10:44] {2986} INFO - iteration 7, current learner lgbm\n",
"[flaml.automl: 07-28 21:10:45] {3161} INFO - at 1.0s,\testimator lgbm's best error=0.0055,\tbest estimator lgbm's best error=0.0055\n",
"[flaml.automl: 07-28 21:10:45] {2986} INFO - iteration 8, current learner lgbm\n",
"[flaml.automl: 07-28 21:10:45] {3161} INFO - at 1.0s,\testimator lgbm's best error=0.0031,\tbest estimator lgbm's best error=0.0031\n",
"[flaml.automl: 07-28 21:10:45] {2986} INFO - iteration 9, current learner lgbm\n",
"[flaml.automl: 07-28 21:10:45] {3161} INFO - at 1.1s,\testimator lgbm's best error=0.0031,\tbest estimator lgbm's best error=0.0031\n",
"[flaml.automl: 07-28 21:10:45] {2986} INFO - iteration 10, current learner lgbm\n",
"[flaml.automl: 07-28 21:10:45] {3161} INFO - at 1.1s,\testimator lgbm's best error=0.0027,\tbest estimator lgbm's best error=0.0027\n",
"[flaml.automl: 07-28 21:10:45] {2986} INFO - iteration 11, current learner lgbm\n",
"[flaml.automl: 07-28 21:10:45] {3161} INFO - at 1.2s,\testimator lgbm's best error=0.0022,\tbest estimator lgbm's best error=0.0022\n",
"[flaml.automl: 07-28 21:10:45] {2986} INFO - iteration 12, current learner lgbm\n",
"[flaml.automl: 07-28 21:10:45] {3161} INFO - at 1.2s,\testimator lgbm's best error=0.0022,\tbest estimator lgbm's best error=0.0022\n",
"[flaml.automl: 07-28 21:10:45] {2986} INFO - iteration 13, current learner lgbm\n",
"[flaml.automl: 07-28 21:10:45] {3161} INFO - at 1.3s,\testimator lgbm's best error=0.0022,\tbest estimator lgbm's best error=0.0022\n",
"[flaml.automl: 07-28 21:10:45] {2986} INFO - iteration 14, current learner lgbm\n",
"[flaml.automl: 07-28 21:10:45] {3161} INFO - at 1.3s,\testimator lgbm's best error=0.0022,\tbest estimator lgbm's best error=0.0022\n",
"[flaml.automl: 07-28 21:10:45] {2986} INFO - iteration 15, current learner lgbm\n",
"[flaml.automl: 07-28 21:10:45] {3161} INFO - at 1.4s,\testimator lgbm's best error=0.0022,\tbest estimator lgbm's best error=0.0022\n",
"[flaml.automl: 07-28 21:10:45] {2986} INFO - iteration 16, current learner rf\n",
"[flaml.automl: 07-28 21:10:45] {3161} INFO - at 1.6s,\testimator rf's best error=0.0217,\tbest estimator lgbm's best error=0.0022\n",
"[flaml.automl: 07-28 21:10:45] {2986} INFO - iteration 17, current learner xgboost\n",
"[flaml.automl: 07-28 21:10:46] {3161} INFO - at 2.0s,\testimator xgboost's best error=0.6738,\tbest estimator lgbm's best error=0.0022\n",
"[flaml.automl: 07-28 21:10:46] {2986} INFO - iteration 18, current learner extra_tree\n",
"[flaml.automl: 07-28 21:10:46] {3161} INFO - at 2.1s,\testimator extra_tree's best error=0.0197,\tbest estimator lgbm's best error=0.0022\n",
"[flaml.automl: 07-28 21:10:46] {2986} INFO - iteration 19, current learner extra_tree\n",
"[flaml.automl: 07-28 21:10:46] {3161} INFO - at 2.2s,\testimator extra_tree's best error=0.0177,\tbest estimator lgbm's best error=0.0022\n",
"[flaml.automl: 07-28 21:10:46] {2986} INFO - iteration 20, current learner xgb_limitdepth\n",
"[flaml.automl: 07-28 21:10:46] {3161} INFO - at 2.2s,\testimator xgb_limitdepth's best error=0.0447,\tbest estimator lgbm's best error=0.0022\n",
"[flaml.automl: 07-28 21:10:46] {2986} INFO - iteration 21, current learner xgb_limitdepth\n",
"[flaml.automl: 07-28 21:10:46] {3161} INFO - at 2.2s,\testimator xgb_limitdepth's best error=0.0447,\tbest estimator lgbm's best error=0.0022\n",
"[flaml.automl: 07-28 21:10:46] {2986} INFO - iteration 22, current learner xgb_limitdepth\n",
"[flaml.automl: 07-28 21:10:46] {3161} INFO - at 2.3s,\testimator xgb_limitdepth's best error=0.0029,\tbest estimator lgbm's best error=0.0022\n",
"[flaml.automl: 07-28 21:10:46] {2986} INFO - iteration 23, current learner lgbm\n",
"[flaml.automl: 07-28 21:10:46] {3161} INFO - at 2.4s,\testimator lgbm's best error=0.0022,\tbest estimator lgbm's best error=0.0022\n",
"[flaml.automl: 07-28 21:10:46] {2986} INFO - iteration 24, current learner rf\n",
"[flaml.automl: 07-28 21:10:46] {3161} INFO - at 2.4s,\testimator rf's best error=0.0217,\tbest estimator lgbm's best error=0.0022\n",
"[flaml.automl: 07-28 21:10:46] {2986} INFO - iteration 25, current learner xgb_limitdepth\n",
"[flaml.automl: 07-28 21:10:46] {3161} INFO - at 2.5s,\testimator xgb_limitdepth's best error=0.0029,\tbest estimator lgbm's best error=0.0022\n",
"[flaml.automl: 07-28 21:10:46] {2986} INFO - iteration 26, current learner xgb_limitdepth\n",
"[flaml.automl: 07-28 21:10:46] {3161} INFO - at 2.6s,\testimator xgb_limitdepth's best error=0.0019,\tbest estimator xgb_limitdepth's best error=0.0019\n",
"[flaml.automl: 07-28 21:10:46] {2986} INFO - iteration 27, current learner rf\n",
"[flaml.automl: 07-28 21:10:46] {3161} INFO - at 2.7s,\testimator rf's best error=0.0216,\tbest estimator xgb_limitdepth's best error=0.0019\n",
"[flaml.automl: 07-28 21:10:46] {2986} INFO - iteration 28, current learner xgb_limitdepth\n",
"[flaml.automl: 07-28 21:10:46] {3161} INFO - at 2.8s,\testimator xgb_limitdepth's best error=0.0019,\tbest estimator xgb_limitdepth's best error=0.0019\n",
"[flaml.automl: 07-28 21:10:46] {2986} INFO - iteration 29, current learner xgb_limitdepth\n",
"[flaml.automl: 07-28 21:10:46] {3161} INFO - at 2.9s,\testimator xgb_limitdepth's best error=0.0019,\tbest estimator xgb_limitdepth's best error=0.0019\n",
"[flaml.automl: 07-28 21:10:46] {2986} INFO - iteration 30, current learner xgb_limitdepth\n",
"[flaml.automl: 07-28 21:10:46] {3161} INFO - at 2.9s,\testimator xgb_limitdepth's best error=0.0019,\tbest estimator xgb_limitdepth's best error=0.0019\n",
"[flaml.automl: 07-28 21:10:46] {2986} INFO - iteration 31, current learner lgbm\n",
"[flaml.automl: 07-28 21:10:47] {3161} INFO - at 3.0s,\testimator lgbm's best error=0.0022,\tbest estimator xgb_limitdepth's best error=0.0019\n",
"[flaml.automl: 07-28 21:10:47] {2986} INFO - iteration 32, current learner lgbm\n",
"[flaml.automl: 07-28 21:10:47] {3161} INFO - at 3.1s,\testimator lgbm's best error=0.0022,\tbest estimator xgb_limitdepth's best error=0.0019\n",
"[flaml.automl: 07-28 21:10:47] {2986} INFO - iteration 33, current learner lgbm\n",
"[flaml.automl: 07-28 21:10:47] {3161} INFO - at 3.2s,\testimator lgbm's best error=0.0022,\tbest estimator xgb_limitdepth's best error=0.0019\n",
"[flaml.automl: 07-28 21:10:47] {2986} INFO - iteration 34, current learner xgb_limitdepth\n",
"[flaml.automl: 07-28 21:10:47] {3161} INFO - at 3.3s,\testimator xgb_limitdepth's best error=0.0019,\tbest estimator xgb_limitdepth's best error=0.0019\n",
"[flaml.automl: 07-28 21:10:47] {2986} INFO - iteration 35, current learner prophet\n",
"[flaml.automl: 07-28 21:11:07] {3161} INFO - at 23.3s,\testimator prophet's best error=0.0008,\tbest estimator prophet's best error=0.0008\n",
"[flaml.automl: 07-28 21:11:07] {2986} INFO - iteration 36, current learner arima\n",
"[flaml.automl: 07-28 21:11:08] {3161} INFO - at 24.2s,\testimator arima's best error=0.0047,\tbest estimator prophet's best error=0.0008\n",
"[flaml.automl: 07-28 21:11:08] {2986} INFO - iteration 37, current learner sarimax\n",
"[flaml.automl: 07-28 21:11:09] {3161} INFO - at 25.3s,\testimator sarimax's best error=0.0047,\tbest estimator prophet's best error=0.0008\n",
"[flaml.automl: 07-28 21:11:09] {2986} INFO - iteration 38, current learner xgboost\n",
"[flaml.automl: 07-28 21:11:09] {3161} INFO - at 25.4s,\testimator xgboost's best error=0.6738,\tbest estimator prophet's best error=0.0008\n",
"[flaml.automl: 07-28 21:11:09] {2986} INFO - iteration 39, current learner extra_tree\n",
"[flaml.automl: 07-28 21:11:10] {3161} INFO - at 26.4s,\testimator extra_tree's best error=0.0177,\tbest estimator prophet's best error=0.0008\n",
"[flaml.automl: 07-28 21:11:10] {2986} INFO - iteration 40, current learner sarimax\n",
"[flaml.automl: 07-28 21:11:10] {3161} INFO - at 26.6s,\testimator sarimax's best error=0.0047,\tbest estimator prophet's best error=0.0008\n",
"[flaml.automl: 07-28 21:11:10] {2986} INFO - iteration 41, current learner xgb_limitdepth\n",
"[flaml.automl: 07-28 21:11:10] {3161} INFO - at 26.7s,\testimator xgb_limitdepth's best error=0.0019,\tbest estimator prophet's best error=0.0008\n",
"[flaml.automl: 07-28 21:11:10] {2986} INFO - iteration 42, current learner arima\n",
"[flaml.automl: 07-28 21:11:10] {3161} INFO - at 26.9s,\testimator arima's best error=0.0047,\tbest estimator prophet's best error=0.0008\n",
"[flaml.automl: 07-28 21:11:10] {2986} INFO - iteration 43, current learner xgboost\n",
"[flaml.automl: 07-28 21:11:10] {3161} INFO - at 26.9s,\testimator xgboost's best error=0.1712,\tbest estimator prophet's best error=0.0008\n",
"[flaml.automl: 07-28 21:11:10] {2986} INFO - iteration 44, current learner xgboost\n",
"[flaml.automl: 07-28 21:11:11] {3161} INFO - at 27.0s,\testimator xgboost's best error=0.0257,\tbest estimator prophet's best error=0.0008\n",
"[flaml.automl: 07-28 21:11:11] {2986} INFO - iteration 45, current learner xgboost\n",
"[flaml.automl: 07-28 21:11:11] {3161} INFO - at 27.0s,\testimator xgboost's best error=0.0257,\tbest estimator prophet's best error=0.0008\n",
"[flaml.automl: 07-28 21:11:11] {2986} INFO - iteration 46, current learner xgboost\n",
"[flaml.automl: 07-28 21:11:11] {3161} INFO - at 27.1s,\testimator xgboost's best error=0.0242,\tbest estimator prophet's best error=0.0008\n",
"[flaml.automl: 07-28 21:11:11] {2986} INFO - iteration 47, current learner arima\n",
"[flaml.automl: 07-28 21:11:11] {3161} INFO - at 28.0s,\testimator arima's best error=0.0047,\tbest estimator prophet's best error=0.0008\n",
"[flaml.automl: 07-28 21:11:11] {2986} INFO - iteration 48, current learner sarimax\n",
"[flaml.automl: 07-28 21:11:12] {3161} INFO - at 28.9s,\testimator sarimax's best error=0.0047,\tbest estimator prophet's best error=0.0008\n",
"[flaml.automl: 07-28 21:11:12] {2986} INFO - iteration 49, current learner prophet\n",
"[flaml.automl: 07-28 21:11:17] {3161} INFO - at 33.3s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:17] {2986} INFO - iteration 50, current learner xgboost\n",
"[flaml.automl: 07-28 21:11:17] {3161} INFO - at 33.3s,\testimator xgboost's best error=0.0242,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:17] {2986} INFO - iteration 51, current learner arima\n",
"[flaml.automl: 07-28 21:11:17] {3161} INFO - at 33.5s,\testimator arima's best error=0.0044,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:17] {2986} INFO - iteration 52, current learner lgbm\n",
"[flaml.automl: 07-28 21:11:17] {3161} INFO - at 33.5s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:17] {2986} INFO - iteration 53, current learner xgb_limitdepth\n",
"[flaml.automl: 07-28 21:11:17] {3161} INFO - at 33.6s,\testimator xgb_limitdepth's best error=0.0019,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:17] {2986} INFO - iteration 54, current learner sarimax\n",
"[flaml.automl: 07-28 21:11:18] {3161} INFO - at 34.4s,\testimator sarimax's best error=0.0047,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:18] {2986} INFO - iteration 55, current learner xgboost\n",
"[flaml.automl: 07-28 21:11:18] {3161} INFO - at 34.5s,\testimator xgboost's best error=0.0242,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:18] {2986} INFO - iteration 56, current learner xgboost\n",
"[flaml.automl: 07-28 21:11:18] {3161} INFO - at 34.5s,\testimator xgboost's best error=0.0191,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:18] {2986} INFO - iteration 57, current learner xgboost\n",
"[flaml.automl: 07-28 21:11:18] {3161} INFO - at 34.6s,\testimator xgboost's best error=0.0191,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:18] {2986} INFO - iteration 58, current learner lgbm\n",
"[flaml.automl: 07-28 21:11:18] {3161} INFO - at 34.6s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:18] {2986} INFO - iteration 59, current learner xgb_limitdepth\n",
"[flaml.automl: 07-28 21:11:18] {3161} INFO - at 34.6s,\testimator xgb_limitdepth's best error=0.0019,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:18] {2986} INFO - iteration 60, current learner xgboost\n",
"[flaml.automl: 07-28 21:11:18] {3161} INFO - at 34.7s,\testimator xgboost's best error=0.0103,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:18] {2986} INFO - iteration 61, current learner xgboost\n",
"[flaml.automl: 07-28 21:11:18] {3161} INFO - at 34.7s,\testimator xgboost's best error=0.0081,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:18] {2986} INFO - iteration 62, current learner xgboost\n",
"[flaml.automl: 07-28 21:11:18] {3161} INFO - at 34.8s,\testimator xgboost's best error=0.0081,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:18] {2986} INFO - iteration 63, current learner lgbm\n",
"[flaml.automl: 07-28 21:11:18] {3161} INFO - at 34.8s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:18] {2986} INFO - iteration 64, current learner xgboost\n",
"[flaml.automl: 07-28 21:11:18] {3161} INFO - at 34.8s,\testimator xgboost's best error=0.0081,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:18] {2986} INFO - iteration 65, current learner xgboost\n",
"[flaml.automl: 07-28 21:11:18] {3161} INFO - at 34.9s,\testimator xgboost's best error=0.0041,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:18] {2986} INFO - iteration 66, current learner xgboost\n",
"[flaml.automl: 07-28 21:11:18] {3161} INFO - at 35.0s,\testimator xgboost's best error=0.0041,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:19] {2986} INFO - iteration 67, current learner xgboost\n",
"[flaml.automl: 07-28 21:11:19] {3161} INFO - at 35.1s,\testimator xgboost's best error=0.0029,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:19] {2986} INFO - iteration 68, current learner xgboost\n",
"[flaml.automl: 07-28 21:11:19] {3161} INFO - at 35.2s,\testimator xgboost's best error=0.0029,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:19] {2986} INFO - iteration 69, current learner xgboost\n",
"[flaml.automl: 07-28 21:11:19] {3161} INFO - at 35.3s,\testimator xgboost's best error=0.0028,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:19] {2986} INFO - iteration 70, current learner xgb_limitdepth\n",
"[flaml.automl: 07-28 21:11:19] {3161} INFO - at 35.3s,\testimator xgb_limitdepth's best error=0.0019,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:19] {2986} INFO - iteration 71, current learner xgb_limitdepth\n",
"[flaml.automl: 07-28 21:11:19] {3161} INFO - at 35.3s,\testimator xgb_limitdepth's best error=0.0019,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:19] {2986} INFO - iteration 72, current learner lgbm\n",
"[flaml.automl: 07-28 21:11:19] {3161} INFO - at 35.4s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:19] {2986} INFO - iteration 73, current learner xgb_limitdepth\n",
"[flaml.automl: 07-28 21:11:19] {3161} INFO - at 35.4s,\testimator xgb_limitdepth's best error=0.0019,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:19] {2986} INFO - iteration 74, current learner xgb_limitdepth\n",
"[flaml.automl: 07-28 21:11:19] {3161} INFO - at 35.4s,\testimator xgb_limitdepth's best error=0.0019,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:19] {2986} INFO - iteration 75, current learner xgb_limitdepth\n",
"[flaml.automl: 07-28 21:11:19] {3161} INFO - at 35.5s,\testimator xgb_limitdepth's best error=0.0019,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:19] {2986} INFO - iteration 76, current learner sarimax\n",
"[flaml.automl: 07-28 21:11:19] {3161} INFO - at 35.6s,\testimator sarimax's best error=0.0047,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:19] {2986} INFO - iteration 77, current learner prophet\n",
"[flaml.automl: 07-28 21:11:24] {3161} INFO - at 40.9s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:24] {2986} INFO - iteration 78, current learner sarimax\n",
"[flaml.automl: 07-28 21:11:25] {3161} INFO - at 41.3s,\testimator sarimax's best error=0.0041,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:25] {2986} INFO - iteration 79, current learner lgbm\n",
"[flaml.automl: 07-28 21:11:25] {3161} INFO - at 41.3s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:25] {2986} INFO - iteration 80, current learner lgbm\n",
"[flaml.automl: 07-28 21:11:25] {3161} INFO - at 41.4s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:25] {2986} INFO - iteration 81, current learner prophet\n",
"[flaml.automl: 07-28 21:11:30] {3161} INFO - at 46.9s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:30] {2986} INFO - iteration 82, current learner xgboost\n",
"[flaml.automl: 07-28 21:11:31] {3161} INFO - at 47.1s,\testimator xgboost's best error=0.0027,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:31] {2986} INFO - iteration 83, current learner lgbm\n",
"[flaml.automl: 07-28 21:11:31] {3161} INFO - at 47.1s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:31] {2986} INFO - iteration 84, current learner arima\n",
"[flaml.automl: 07-28 21:11:31] {3161} INFO - at 47.6s,\testimator arima's best error=0.0044,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:31] {2986} INFO - iteration 85, current learner lgbm\n",
"[flaml.automl: 07-28 21:11:31] {3161} INFO - at 47.7s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:31] {2986} INFO - iteration 86, current learner prophet\n",
"[flaml.automl: 07-28 21:11:35] {3161} INFO - at 51.8s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:35] {2986} INFO - iteration 87, current learner xgb_limitdepth\n",
"[flaml.automl: 07-28 21:11:35] {3161} INFO - at 51.8s,\testimator xgb_limitdepth's best error=0.0019,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:35] {2986} INFO - iteration 88, current learner prophet\n",
"[flaml.automl: 07-28 21:11:38] {3161} INFO - at 54.9s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:38] {2986} INFO - iteration 89, current learner extra_tree\n",
"[flaml.automl: 07-28 21:11:38] {3161} INFO - at 55.0s,\testimator extra_tree's best error=0.0177,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:38] {2986} INFO - iteration 90, current learner xgb_limitdepth\n",
"[flaml.automl: 07-28 21:11:39] {3161} INFO - at 55.0s,\testimator xgb_limitdepth's best error=0.0019,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:39] {2986} INFO - iteration 91, current learner lgbm\n",
"[flaml.automl: 07-28 21:11:39] {3161} INFO - at 55.0s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:39] {2986} INFO - iteration 92, current learner lgbm\n",
"[flaml.automl: 07-28 21:11:39] {3161} INFO - at 55.1s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:39] {2986} INFO - iteration 93, current learner arima\n",
"[flaml.automl: 07-28 21:11:39] {3161} INFO - at 55.3s,\testimator arima's best error=0.0043,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:39] {2986} INFO - iteration 94, current learner xgboost\n",
"[flaml.automl: 07-28 21:11:39] {3161} INFO - at 55.3s,\testimator xgboost's best error=0.0027,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:39] {2986} INFO - iteration 95, current learner sarimax\n",
"[flaml.automl: 07-28 21:11:39] {3161} INFO - at 55.5s,\testimator sarimax's best error=0.0040,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:39] {2986} INFO - iteration 96, current learner arima\n",
"[flaml.automl: 07-28 21:11:40] {3161} INFO - at 56.3s,\testimator arima's best error=0.0033,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:40] {2986} INFO - iteration 97, current learner arima\n",
"[flaml.automl: 07-28 21:11:41] {3161} INFO - at 57.4s,\testimator arima's best error=0.0033,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:41] {2986} INFO - iteration 98, current learner lgbm\n",
"[flaml.automl: 07-28 21:11:41] {3161} INFO - at 57.4s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:41] {2986} INFO - iteration 99, current learner sarimax\n",
"[flaml.automl: 07-28 21:11:41] {3161} INFO - at 57.8s,\testimator sarimax's best error=0.0038,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:41] {2986} INFO - iteration 100, current learner extra_tree\n",
"[flaml.automl: 07-28 21:11:41] {3161} INFO - at 57.8s,\testimator extra_tree's best error=0.0089,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:41] {2986} INFO - iteration 101, current learner extra_tree\n",
"[flaml.automl: 07-28 21:11:41] {3161} INFO - at 57.8s,\testimator extra_tree's best error=0.0089,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:41] {2986} INFO - iteration 102, current learner extra_tree\n",
"[flaml.automl: 07-28 21:11:41] {3161} INFO - at 57.9s,\testimator extra_tree's best error=0.0089,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:41] {2986} INFO - iteration 103, current learner xgboost\n",
"[flaml.automl: 07-28 21:11:42] {3161} INFO - at 58.0s,\testimator xgboost's best error=0.0026,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:42] {2986} INFO - iteration 104, current learner arima\n",
"[flaml.automl: 07-28 21:11:42] {3161} INFO - at 58.3s,\testimator arima's best error=0.0033,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:42] {2986} INFO - iteration 105, current learner extra_tree\n",
"[flaml.automl: 07-28 21:11:42] {3161} INFO - at 58.4s,\testimator extra_tree's best error=0.0089,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:42] {2986} INFO - iteration 106, current learner lgbm\n",
"[flaml.automl: 07-28 21:11:42] {3161} INFO - at 58.4s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:42] {2986} INFO - iteration 107, current learner extra_tree\n",
"[flaml.automl: 07-28 21:11:42] {3161} INFO - at 58.5s,\testimator extra_tree's best error=0.0089,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:42] {2986} INFO - iteration 108, current learner xgboost\n",
"[flaml.automl: 07-28 21:11:42] {3161} INFO - at 58.6s,\testimator xgboost's best error=0.0026,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:42] {2986} INFO - iteration 109, current learner extra_tree\n",
"[flaml.automl: 07-28 21:11:42] {3161} INFO - at 58.6s,\testimator extra_tree's best error=0.0089,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:42] {2986} INFO - iteration 110, current learner arima\n",
"[flaml.automl: 07-28 21:11:43] {3161} INFO - at 59.4s,\testimator arima's best error=0.0033,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:43] {2986} INFO - iteration 111, current learner extra_tree\n",
"[flaml.automl: 07-28 21:11:43] {3161} INFO - at 59.4s,\testimator extra_tree's best error=0.0089,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:43] {2986} INFO - iteration 112, current learner prophet\n",
"[flaml.automl: 07-28 21:11:47] {3161} INFO - at 63.3s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:47] {2986} INFO - iteration 113, current learner extra_tree\n",
"[flaml.automl: 07-28 21:11:47] {3161} INFO - at 63.4s,\testimator extra_tree's best error=0.0074,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:47] {2986} INFO - iteration 114, current learner lgbm\n",
"[flaml.automl: 07-28 21:11:47] {3161} INFO - at 63.4s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:47] {2986} INFO - iteration 115, current learner sarimax\n",
"[flaml.automl: 07-28 21:11:48] {3161} INFO - at 64.6s,\testimator sarimax's best error=0.0038,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:48] {2986} INFO - iteration 116, current learner extra_tree\n",
"[flaml.automl: 07-28 21:11:48] {3161} INFO - at 64.6s,\testimator extra_tree's best error=0.0074,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:48] {2986} INFO - iteration 117, current learner sarimax\n",
"[flaml.automl: 07-28 21:11:48] {3161} INFO - at 64.8s,\testimator sarimax's best error=0.0038,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:48] {2986} INFO - iteration 118, current learner lgbm\n",
"[flaml.automl: 07-28 21:11:48] {3161} INFO - at 64.8s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:48] {2986} INFO - iteration 119, current learner lgbm\n",
"[flaml.automl: 07-28 21:11:48] {3161} INFO - at 64.8s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:48] {2986} INFO - iteration 120, current learner prophet\n",
"[flaml.automl: 07-28 21:11:52] {3161} INFO - at 68.2s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:52] {2986} INFO - iteration 121, current learner extra_tree\n",
"[flaml.automl: 07-28 21:11:52] {3161} INFO - at 68.2s,\testimator extra_tree's best error=0.0074,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:52] {2986} INFO - iteration 122, current learner lgbm\n",
"[flaml.automl: 07-28 21:11:52] {3161} INFO - at 68.2s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:52] {2986} INFO - iteration 123, current learner extra_tree\n",
"[flaml.automl: 07-28 21:11:52] {3161} INFO - at 68.3s,\testimator extra_tree's best error=0.0074,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:52] {2986} INFO - iteration 124, current learner extra_tree\n",
"[flaml.automl: 07-28 21:11:52] {3161} INFO - at 68.4s,\testimator extra_tree's best error=0.0074,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:52] {2986} INFO - iteration 125, current learner prophet\n",
"[flaml.automl: 07-28 21:11:55] {3161} INFO - at 71.3s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:55] {2986} INFO - iteration 126, current learner extra_tree\n",
"[flaml.automl: 07-28 21:11:55] {3161} INFO - at 71.3s,\testimator extra_tree's best error=0.0055,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:55] {2986} INFO - iteration 127, current learner xgboost\n",
"[flaml.automl: 07-28 21:11:55] {3161} INFO - at 71.4s,\testimator xgboost's best error=0.0026,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:55] {2986} INFO - iteration 128, current learner lgbm\n",
"[flaml.automl: 07-28 21:11:55] {3161} INFO - at 71.4s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:55] {2986} INFO - iteration 129, current learner lgbm\n",
"[flaml.automl: 07-28 21:11:55] {3161} INFO - at 71.5s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:55] {2986} INFO - iteration 130, current learner lgbm\n",
"[flaml.automl: 07-28 21:11:55] {3161} INFO - at 71.5s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:55] {2986} INFO - iteration 131, current learner extra_tree\n",
"[flaml.automl: 07-28 21:11:55] {3161} INFO - at 71.5s,\testimator extra_tree's best error=0.0055,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:55] {2986} INFO - iteration 132, current learner extra_tree\n",
"[flaml.automl: 07-28 21:11:55] {3161} INFO - at 71.6s,\testimator extra_tree's best error=0.0055,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:55] {2986} INFO - iteration 133, current learner extra_tree\n",
"[flaml.automl: 07-28 21:11:55] {3161} INFO - at 71.6s,\testimator extra_tree's best error=0.0055,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:55] {2986} INFO - iteration 134, current learner extra_tree\n",
"[flaml.automl: 07-28 21:11:55] {3161} INFO - at 71.6s,\testimator extra_tree's best error=0.0051,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:55] {2986} INFO - iteration 135, current learner extra_tree\n",
"[flaml.automl: 07-28 21:11:55] {3161} INFO - at 71.7s,\testimator extra_tree's best error=0.0051,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:55] {2986} INFO - iteration 136, current learner lgbm\n",
"[flaml.automl: 07-28 21:11:55] {3161} INFO - at 71.7s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:55] {2986} INFO - iteration 137, current learner xgboost\n",
"[flaml.automl: 07-28 21:11:55] {3161} INFO - at 71.9s,\testimator xgboost's best error=0.0026,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:55] {2986} INFO - iteration 138, current learner extra_tree\n",
"[flaml.automl: 07-28 21:11:55] {3161} INFO - at 71.9s,\testimator extra_tree's best error=0.0051,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:55] {2986} INFO - iteration 139, current learner arima\n",
"[flaml.automl: 07-28 21:11:56] {3161} INFO - at 72.8s,\testimator arima's best error=0.0033,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:56] {2986} INFO - iteration 140, current learner extra_tree\n",
"[flaml.automl: 07-28 21:11:56] {3161} INFO - at 72.8s,\testimator extra_tree's best error=0.0051,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:56] {2986} INFO - iteration 141, current learner lgbm\n",
"[flaml.automl: 07-28 21:11:56] {3161} INFO - at 72.9s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:56] {2986} INFO - iteration 142, current learner extra_tree\n",
"[flaml.automl: 07-28 21:11:56] {3161} INFO - at 72.9s,\testimator extra_tree's best error=0.0051,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:56] {2986} INFO - iteration 143, current learner lgbm\n",
"[flaml.automl: 07-28 21:11:56] {3161} INFO - at 72.9s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:56] {2986} INFO - iteration 144, current learner extra_tree\n",
"[flaml.automl: 07-28 21:11:57] {3161} INFO - at 73.0s,\testimator extra_tree's best error=0.0051,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:57] {2986} INFO - iteration 145, current learner xgboost\n",
"[flaml.automl: 07-28 21:11:57] {3161} INFO - at 73.1s,\testimator xgboost's best error=0.0026,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:11:57] {2986} INFO - iteration 146, current learner prophet\n",
"[flaml.automl: 07-28 21:12:00] {3161} INFO - at 76.1s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:00] {2986} INFO - iteration 147, current learner extra_tree\n",
"[flaml.automl: 07-28 21:12:00] {3161} INFO - at 76.2s,\testimator extra_tree's best error=0.0037,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:00] {2986} INFO - iteration 148, current learner extra_tree\n",
"[flaml.automl: 07-28 21:12:00] {3161} INFO - at 76.2s,\testimator extra_tree's best error=0.0037,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:00] {2986} INFO - iteration 149, current learner xgboost\n",
"[flaml.automl: 07-28 21:12:00] {3161} INFO - at 76.4s,\testimator xgboost's best error=0.0026,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:00] {2986} INFO - iteration 150, current learner lgbm\n",
"[flaml.automl: 07-28 21:12:00] {3161} INFO - at 76.5s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:00] {2986} INFO - iteration 151, current learner rf\n",
"[flaml.automl: 07-28 21:12:00] {3161} INFO - at 76.5s,\testimator rf's best error=0.0150,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:00] {2986} INFO - iteration 152, current learner rf\n",
"[flaml.automl: 07-28 21:12:00] {3161} INFO - at 76.6s,\testimator rf's best error=0.0150,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:00] {2986} INFO - iteration 153, current learner rf\n",
"[flaml.automl: 07-28 21:12:00] {3161} INFO - at 76.7s,\testimator rf's best error=0.0096,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:00] {2986} INFO - iteration 154, current learner rf\n",
"[flaml.automl: 07-28 21:12:00] {3161} INFO - at 76.7s,\testimator rf's best error=0.0096,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:00] {2986} INFO - iteration 155, current learner extra_tree\n",
"[flaml.automl: 07-28 21:12:00] {3161} INFO - at 76.8s,\testimator extra_tree's best error=0.0037,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:00] {2986} INFO - iteration 156, current learner rf\n",
"[flaml.automl: 07-28 21:12:00] {3161} INFO - at 76.8s,\testimator rf's best error=0.0042,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:00] {2986} INFO - iteration 157, current learner lgbm\n",
"[flaml.automl: 07-28 21:12:00] {3161} INFO - at 76.8s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:00] {2986} INFO - iteration 158, current learner rf\n",
"[flaml.automl: 07-28 21:12:00] {3161} INFO - at 76.9s,\testimator rf's best error=0.0042,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:00] {2986} INFO - iteration 159, current learner extra_tree\n",
"[flaml.automl: 07-28 21:12:00] {3161} INFO - at 76.9s,\testimator extra_tree's best error=0.0037,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:00] {2986} INFO - iteration 160, current learner rf\n",
"[flaml.automl: 07-28 21:12:01] {3161} INFO - at 77.0s,\testimator rf's best error=0.0042,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:01] {2986} INFO - iteration 161, current learner rf\n",
"[flaml.automl: 07-28 21:12:01] {3161} INFO - at 77.0s,\testimator rf's best error=0.0036,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:01] {2986} INFO - iteration 162, current learner rf\n",
"[flaml.automl: 07-28 21:12:01] {3161} INFO - at 77.1s,\testimator rf's best error=0.0036,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:01] {2986} INFO - iteration 163, current learner extra_tree\n",
"[flaml.automl: 07-28 21:12:01] {3161} INFO - at 77.1s,\testimator extra_tree's best error=0.0030,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:01] {2986} INFO - iteration 164, current learner rf\n",
"[flaml.automl: 07-28 21:12:01] {3161} INFO - at 77.2s,\testimator rf's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:01] {2986} INFO - iteration 165, current learner rf\n",
"[flaml.automl: 07-28 21:12:01] {3161} INFO - at 77.2s,\testimator rf's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:01] {2986} INFO - iteration 166, current learner extra_tree\n",
"[flaml.automl: 07-28 21:12:01] {3161} INFO - at 77.3s,\testimator extra_tree's best error=0.0027,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:01] {2986} INFO - iteration 167, current learner extra_tree\n",
"[flaml.automl: 07-28 21:12:01] {3161} INFO - at 77.4s,\testimator extra_tree's best error=0.0027,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:01] {2986} INFO - iteration 168, current learner rf\n",
"[flaml.automl: 07-28 21:12:01] {3161} INFO - at 77.4s,\testimator rf's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:01] {2986} INFO - iteration 169, current learner rf\n",
"[flaml.automl: 07-28 21:12:01] {3161} INFO - at 77.4s,\testimator rf's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:01] {2986} INFO - iteration 170, current learner rf\n",
"[flaml.automl: 07-28 21:12:01] {3161} INFO - at 77.5s,\testimator rf's best error=0.0021,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:01] {2986} INFO - iteration 171, current learner extra_tree\n",
"[flaml.automl: 07-28 21:12:01] {3161} INFO - at 77.5s,\testimator extra_tree's best error=0.0027,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:01] {2986} INFO - iteration 172, current learner rf\n",
"[flaml.automl: 07-28 21:12:01] {3161} INFO - at 77.6s,\testimator rf's best error=0.0021,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:01] {2986} INFO - iteration 173, current learner lgbm\n",
"[flaml.automl: 07-28 21:12:01] {3161} INFO - at 77.6s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:01] {2986} INFO - iteration 174, current learner lgbm\n",
"[flaml.automl: 07-28 21:12:01] {3161} INFO - at 77.6s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:01] {2986} INFO - iteration 175, current learner extra_tree\n",
"[flaml.automl: 07-28 21:12:01] {3161} INFO - at 77.7s,\testimator extra_tree's best error=0.0017,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:01] {2986} INFO - iteration 176, current learner extra_tree\n",
"[flaml.automl: 07-28 21:12:01] {3161} INFO - at 77.7s,\testimator extra_tree's best error=0.0017,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:01] {2986} INFO - iteration 177, current learner extra_tree\n",
"[flaml.automl: 07-28 21:12:01] {3161} INFO - at 77.8s,\testimator extra_tree's best error=0.0017,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:01] {2986} INFO - iteration 178, current learner extra_tree\n",
"[flaml.automl: 07-28 21:12:01] {3161} INFO - at 77.8s,\testimator extra_tree's best error=0.0017,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:01] {2986} INFO - iteration 179, current learner prophet\n",
"[flaml.automl: 07-28 21:12:05] {3161} INFO - at 81.2s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:05] {2986} INFO - iteration 180, current learner extra_tree\n",
"[flaml.automl: 07-28 21:12:05] {3161} INFO - at 81.3s,\testimator extra_tree's best error=0.0017,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:05] {2986} INFO - iteration 181, current learner extra_tree\n",
"[flaml.automl: 07-28 21:12:05] {3161} INFO - at 81.3s,\testimator extra_tree's best error=0.0017,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:05] {2986} INFO - iteration 182, current learner extra_tree\n",
"[flaml.automl: 07-28 21:12:05] {3161} INFO - at 81.4s,\testimator extra_tree's best error=0.0017,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:05] {2986} INFO - iteration 183, current learner extra_tree\n",
"[flaml.automl: 07-28 21:12:05] {3161} INFO - at 81.4s,\testimator extra_tree's best error=0.0017,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:05] {2986} INFO - iteration 184, current learner extra_tree\n",
"[flaml.automl: 07-28 21:12:05] {3161} INFO - at 81.4s,\testimator extra_tree's best error=0.0017,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:05] {2986} INFO - iteration 185, current learner extra_tree\n",
"[flaml.automl: 07-28 21:12:05] {3161} INFO - at 81.5s,\testimator extra_tree's best error=0.0017,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:05] {2986} INFO - iteration 186, current learner lgbm\n",
"[flaml.automl: 07-28 21:12:05] {3161} INFO - at 81.5s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:05] {2986} INFO - iteration 187, current learner extra_tree\n",
"[flaml.automl: 07-28 21:12:05] {3161} INFO - at 81.6s,\testimator extra_tree's best error=0.0017,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:05] {2986} INFO - iteration 188, current learner extra_tree\n",
"[flaml.automl: 07-28 21:12:05] {3161} INFO - at 81.6s,\testimator extra_tree's best error=0.0017,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:05] {2986} INFO - iteration 189, current learner rf\n",
"[flaml.automl: 07-28 21:12:05] {3161} INFO - at 81.6s,\testimator rf's best error=0.0021,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:05] {2986} INFO - iteration 190, current learner extra_tree\n",
"[flaml.automl: 07-28 21:12:05] {3161} INFO - at 81.7s,\testimator extra_tree's best error=0.0017,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:05] {2986} INFO - iteration 191, current learner lgbm\n",
"[flaml.automl: 07-28 21:12:05] {3161} INFO - at 81.7s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:05] {2986} INFO - iteration 192, current learner lgbm\n",
"[flaml.automl: 07-28 21:12:05] {3161} INFO - at 81.7s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:05] {2986} INFO - iteration 193, current learner xgb_limitdepth\n",
"[flaml.automl: 07-28 21:12:05] {3161} INFO - at 81.8s,\testimator xgb_limitdepth's best error=0.0019,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:05] {2986} INFO - iteration 194, current learner lgbm\n",
"[flaml.automl: 07-28 21:12:05] {3161} INFO - at 81.8s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:05] {2986} INFO - iteration 195, current learner xgboost\n",
"[flaml.automl: 07-28 21:12:05] {3161} INFO - at 81.9s,\testimator xgboost's best error=0.0026,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:05] {2986} INFO - iteration 196, current learner rf\n",
"[flaml.automl: 07-28 21:12:05] {3161} INFO - at 81.9s,\testimator rf's best error=0.0021,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:05] {2986} INFO - iteration 197, current learner prophet\n",
"[flaml.automl: 07-28 21:12:09] {3161} INFO - at 85.5s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:09] {2986} INFO - iteration 198, current learner lgbm\n",
"[flaml.automl: 07-28 21:12:09] {3161} INFO - at 85.5s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:09] {2986} INFO - iteration 199, current learner lgbm\n",
"[flaml.automl: 07-28 21:12:09] {3161} INFO - at 85.5s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:09] {2986} INFO - iteration 200, current learner extra_tree\n",
"[flaml.automl: 07-28 21:12:09] {3161} INFO - at 85.6s,\testimator extra_tree's best error=0.0017,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:09] {2986} INFO - iteration 201, current learner extra_tree\n",
"[flaml.automl: 07-28 21:12:09] {3161} INFO - at 85.6s,\testimator extra_tree's best error=0.0017,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:09] {2986} INFO - iteration 202, current learner prophet\n",
"[flaml.automl: 07-28 21:12:12] {3161} INFO - at 88.9s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:12] {2986} INFO - iteration 203, current learner prophet\n",
"[flaml.automl: 07-28 21:12:16] {3161} INFO - at 92.8s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:16] {2986} INFO - iteration 204, current learner lgbm\n",
"[flaml.automl: 07-28 21:12:16] {3161} INFO - at 92.8s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:16] {2986} INFO - iteration 205, current learner lgbm\n",
"[flaml.automl: 07-28 21:12:16] {3161} INFO - at 92.9s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:16] {2986} INFO - iteration 206, current learner xgboost\n",
"[flaml.automl: 07-28 21:12:17] {3161} INFO - at 93.0s,\testimator xgboost's best error=0.0026,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:17] {2986} INFO - iteration 207, current learner prophet\n",
"[flaml.automl: 07-28 21:12:20] {3161} INFO - at 96.0s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:20] {2986} INFO - iteration 208, current learner arima\n",
"[flaml.automl: 07-28 21:12:20] {3161} INFO - at 96.3s,\testimator arima's best error=0.0033,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:20] {2986} INFO - iteration 209, current learner rf\n",
"[flaml.automl: 07-28 21:12:20] {3161} INFO - at 96.4s,\testimator rf's best error=0.0019,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:20] {2986} INFO - iteration 210, current learner prophet\n",
"[flaml.automl: 07-28 21:12:26] {3161} INFO - at 102.7s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:26] {2986} INFO - iteration 211, current learner rf\n",
"[flaml.automl: 07-28 21:12:26] {3161} INFO - at 102.8s,\testimator rf's best error=0.0019,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:26] {2986} INFO - iteration 212, current learner rf\n",
"[flaml.automl: 07-28 21:12:26] {3161} INFO - at 102.9s,\testimator rf's best error=0.0019,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:26] {2986} INFO - iteration 213, current learner rf\n",
"[flaml.automl: 07-28 21:12:26] {3161} INFO - at 103.0s,\testimator rf's best error=0.0019,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:26] {2986} INFO - iteration 214, current learner rf\n",
"[flaml.automl: 07-28 21:12:27] {3161} INFO - at 103.0s,\testimator rf's best error=0.0019,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:27] {2986} INFO - iteration 215, current learner rf\n",
"[flaml.automl: 07-28 21:12:27] {3161} INFO - at 103.1s,\testimator rf's best error=0.0019,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:27] {2986} INFO - iteration 216, current learner prophet\n",
"[flaml.automl: 07-28 21:12:31] {3161} INFO - at 107.5s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:31] {2986} INFO - iteration 217, current learner prophet\n",
"[flaml.automl: 07-28 21:12:35] {3161} INFO - at 111.4s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:35] {2986} INFO - iteration 218, current learner rf\n",
"[flaml.automl: 07-28 21:12:35] {3161} INFO - at 111.5s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:35] {2986} INFO - iteration 219, current learner sarimax\n",
"[flaml.automl: 07-28 21:12:35] {3161} INFO - at 111.7s,\testimator sarimax's best error=0.0037,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:35] {2986} INFO - iteration 220, current learner lgbm\n",
"[flaml.automl: 07-28 21:12:35] {3161} INFO - at 111.7s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:35] {2986} INFO - iteration 221, current learner lgbm\n",
"[flaml.automl: 07-28 21:12:35] {3161} INFO - at 111.8s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:35] {2986} INFO - iteration 222, current learner lgbm\n",
"[flaml.automl: 07-28 21:12:35] {3161} INFO - at 111.8s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:35] {2986} INFO - iteration 223, current learner rf\n",
"[flaml.automl: 07-28 21:12:35] {3161} INFO - at 111.9s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:35] {2986} INFO - iteration 224, current learner lgbm\n",
"[flaml.automl: 07-28 21:12:35] {3161} INFO - at 111.9s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:35] {2986} INFO - iteration 225, current learner prophet\n",
"[flaml.automl: 07-28 21:12:39] {3161} INFO - at 115.3s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:39] {2986} INFO - iteration 226, current learner extra_tree\n",
"[flaml.automl: 07-28 21:12:39] {3161} INFO - at 115.4s,\testimator extra_tree's best error=0.0017,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:39] {2986} INFO - iteration 227, current learner lgbm\n",
"[flaml.automl: 07-28 21:12:39] {3161} INFO - at 115.4s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:39] {2986} INFO - iteration 228, current learner lgbm\n",
"[flaml.automl: 07-28 21:12:39] {3161} INFO - at 115.4s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:39] {2986} INFO - iteration 229, current learner rf\n",
"[flaml.automl: 07-28 21:12:39] {3161} INFO - at 115.5s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:39] {2986} INFO - iteration 230, current learner lgbm\n",
"[flaml.automl: 07-28 21:12:39] {3161} INFO - at 115.5s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:39] {2986} INFO - iteration 231, current learner extra_tree\n",
"[flaml.automl: 07-28 21:12:39] {3161} INFO - at 115.6s,\testimator extra_tree's best error=0.0017,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:39] {2986} INFO - iteration 232, current learner rf\n",
"[flaml.automl: 07-28 21:12:39] {3161} INFO - at 115.6s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:39] {2986} INFO - iteration 233, current learner prophet\n",
"[flaml.automl: 07-28 21:12:42] {3161} INFO - at 118.6s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:42] {2986} INFO - iteration 234, current learner rf\n",
"[flaml.automl: 07-28 21:12:42] {3161} INFO - at 118.6s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:42] {2986} INFO - iteration 235, current learner xgb_limitdepth\n",
"[flaml.automl: 07-28 21:12:42] {3161} INFO - at 118.7s,\testimator xgb_limitdepth's best error=0.0019,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:42] {2986} INFO - iteration 236, current learner xgb_limitdepth\n",
"[flaml.automl: 07-28 21:12:42] {3161} INFO - at 118.7s,\testimator xgb_limitdepth's best error=0.0019,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:42] {2986} INFO - iteration 237, current learner xgb_limitdepth\n",
"[flaml.automl: 07-28 21:12:42] {3161} INFO - at 118.7s,\testimator xgb_limitdepth's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:42] {2986} INFO - iteration 238, current learner xgb_limitdepth\n",
"[flaml.automl: 07-28 21:12:42] {3161} INFO - at 118.8s,\testimator xgb_limitdepth's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:42] {2986} INFO - iteration 239, current learner xgb_limitdepth\n",
"[flaml.automl: 07-28 21:12:42] {3161} INFO - at 118.8s,\testimator xgb_limitdepth's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:42] {2986} INFO - iteration 240, current learner xgb_limitdepth\n",
"[flaml.automl: 07-28 21:12:42] {3161} INFO - at 118.8s,\testimator xgb_limitdepth's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:42] {2986} INFO - iteration 241, current learner rf\n",
"[flaml.automl: 07-28 21:12:42] {3161} INFO - at 118.9s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:42] {2986} INFO - iteration 242, current learner xgboost\n",
"[flaml.automl: 07-28 21:12:42] {3161} INFO - at 119.0s,\testimator xgboost's best error=0.0026,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:42] {2986} INFO - iteration 243, current learner xgb_limitdepth\n",
"[flaml.automl: 07-28 21:12:43] {3161} INFO - at 119.0s,\testimator xgb_limitdepth's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:43] {2986} INFO - iteration 244, current learner lgbm\n",
"[flaml.automl: 07-28 21:12:43] {3161} INFO - at 119.0s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:43] {2986} INFO - iteration 245, current learner extra_tree\n",
"[flaml.automl: 07-28 21:12:43] {3161} INFO - at 119.1s,\testimator extra_tree's best error=0.0017,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:43] {2986} INFO - iteration 246, current learner rf\n",
"[flaml.automl: 07-28 21:12:43] {3161} INFO - at 119.2s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:43] {2986} INFO - iteration 247, current learner xgb_limitdepth\n",
"[flaml.automl: 07-28 21:12:43] {3161} INFO - at 119.2s,\testimator xgb_limitdepth's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:43] {2986} INFO - iteration 248, current learner rf\n",
"[flaml.automl: 07-28 21:12:43] {3161} INFO - at 119.3s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:43] {2986} INFO - iteration 249, current learner prophet\n",
"[flaml.automl: 07-28 21:12:46] {3161} INFO - at 122.4s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:46] {2986} INFO - iteration 250, current learner rf\n",
"[flaml.automl: 07-28 21:12:46] {3161} INFO - at 122.4s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:46] {2986} INFO - iteration 251, current learner lgbm\n",
"[flaml.automl: 07-28 21:12:46] {3161} INFO - at 122.5s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:46] {2986} INFO - iteration 252, current learner extra_tree\n",
"[flaml.automl: 07-28 21:12:46] {3161} INFO - at 122.5s,\testimator extra_tree's best error=0.0017,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:46] {2986} INFO - iteration 253, current learner xgboost\n",
"[flaml.automl: 07-28 21:12:46] {3161} INFO - at 122.7s,\testimator xgboost's best error=0.0025,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:46] {2986} INFO - iteration 254, current learner lgbm\n",
"[flaml.automl: 07-28 21:12:46] {3161} INFO - at 122.7s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:46] {2986} INFO - iteration 255, current learner rf\n",
"[flaml.automl: 07-28 21:12:46] {3161} INFO - at 122.8s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:46] {2986} INFO - iteration 256, current learner xgboost\n",
"[flaml.automl: 07-28 21:12:46] {3161} INFO - at 123.0s,\testimator xgboost's best error=0.0025,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:46] {2986} INFO - iteration 257, current learner prophet\n",
"[flaml.automl: 07-28 21:12:50] {3161} INFO - at 126.2s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:50] {2986} INFO - iteration 258, current learner lgbm\n",
"[flaml.automl: 07-28 21:12:50] {3161} INFO - at 126.2s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:50] {2986} INFO - iteration 259, current learner sarimax\n",
"[flaml.automl: 07-28 21:12:50] {3161} INFO - at 126.4s,\testimator sarimax's best error=0.0037,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:50] {2986} INFO - iteration 260, current learner rf\n",
"[flaml.automl: 07-28 21:12:50] {3161} INFO - at 126.4s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:50] {2986} INFO - iteration 261, current learner sarimax\n",
"[flaml.automl: 07-28 21:12:50] {3161} INFO - at 126.9s,\testimator sarimax's best error=0.0037,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:50] {2986} INFO - iteration 262, current learner rf\n",
"[flaml.automl: 07-28 21:12:50] {3161} INFO - at 126.9s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:50] {2986} INFO - iteration 263, current learner xgb_limitdepth\n",
"[flaml.automl: 07-28 21:12:50] {3161} INFO - at 127.0s,\testimator xgb_limitdepth's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:50] {2986} INFO - iteration 264, current learner rf\n",
"[flaml.automl: 07-28 21:12:51] {3161} INFO - at 127.0s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:51] {2986} INFO - iteration 265, current learner rf\n",
"[flaml.automl: 07-28 21:12:51] {3161} INFO - at 127.1s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:51] {2986} INFO - iteration 266, current learner lgbm\n",
"[flaml.automl: 07-28 21:12:51] {3161} INFO - at 127.1s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:51] {2986} INFO - iteration 267, current learner rf\n",
"[flaml.automl: 07-28 21:12:51] {3161} INFO - at 127.2s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:51] {2986} INFO - iteration 268, current learner extra_tree\n",
"[flaml.automl: 07-28 21:12:51] {3161} INFO - at 127.2s,\testimator extra_tree's best error=0.0017,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:51] {2986} INFO - iteration 269, current learner prophet\n",
"[flaml.automl: 07-28 21:12:54] {3161} INFO - at 130.5s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:54] {2986} INFO - iteration 270, current learner extra_tree\n",
"[flaml.automl: 07-28 21:12:54] {3161} INFO - at 130.6s,\testimator extra_tree's best error=0.0017,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:54] {2986} INFO - iteration 271, current learner extra_tree\n",
"[flaml.automl: 07-28 21:12:54] {3161} INFO - at 130.6s,\testimator extra_tree's best error=0.0017,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:54] {2986} INFO - iteration 272, current learner prophet\n",
"[flaml.automl: 07-28 21:12:58] {3161} INFO - at 134.1s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:58] {2986} INFO - iteration 273, current learner lgbm\n",
"[flaml.automl: 07-28 21:12:58] {3161} INFO - at 134.1s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:58] {2986} INFO - iteration 274, current learner extra_tree\n",
"[flaml.automl: 07-28 21:12:58] {3161} INFO - at 134.2s,\testimator extra_tree's best error=0.0017,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:58] {2986} INFO - iteration 275, current learner lgbm\n",
"[flaml.automl: 07-28 21:12:58] {3161} INFO - at 134.3s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:58] {2986} INFO - iteration 276, current learner lgbm\n",
"[flaml.automl: 07-28 21:12:58] {3161} INFO - at 134.3s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:58] {2986} INFO - iteration 277, current learner lgbm\n",
"[flaml.automl: 07-28 21:12:58] {3161} INFO - at 134.3s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:58] {2986} INFO - iteration 278, current learner lgbm\n",
"[flaml.automl: 07-28 21:12:58] {3161} INFO - at 134.4s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:58] {2986} INFO - iteration 279, current learner lgbm\n",
"[flaml.automl: 07-28 21:12:58] {3161} INFO - at 134.4s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:58] {2986} INFO - iteration 280, current learner xgboost\n",
"[flaml.automl: 07-28 21:12:58] {3161} INFO - at 134.5s,\testimator xgboost's best error=0.0025,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:58] {2986} INFO - iteration 281, current learner lgbm\n",
"[flaml.automl: 07-28 21:12:58] {3161} INFO - at 134.6s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:58] {2986} INFO - iteration 282, current learner sarimax\n",
"[flaml.automl: 07-28 21:12:58] {3161} INFO - at 134.6s,\testimator sarimax's best error=0.0037,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:58] {2986} INFO - iteration 283, current learner extra_tree\n",
"[flaml.automl: 07-28 21:12:58] {3161} INFO - at 134.7s,\testimator extra_tree's best error=0.0017,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:58] {2986} INFO - iteration 284, current learner lgbm\n",
"[flaml.automl: 07-28 21:12:58] {3161} INFO - at 134.7s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:58] {2986} INFO - iteration 285, current learner lgbm\n",
"[flaml.automl: 07-28 21:12:58] {3161} INFO - at 134.8s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:58] {2986} INFO - iteration 286, current learner lgbm\n",
"[flaml.automl: 07-28 21:12:58] {3161} INFO - at 134.8s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:58] {2986} INFO - iteration 287, current learner xgb_limitdepth\n",
"[flaml.automl: 07-28 21:12:58] {3161} INFO - at 134.8s,\testimator xgb_limitdepth's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:12:58] {2986} INFO - iteration 288, current learner prophet\n",
"[flaml.automl: 07-28 21:13:02] {3161} INFO - at 138.2s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:02] {2986} INFO - iteration 289, current learner prophet\n",
"[flaml.automl: 07-28 21:13:05] {3161} INFO - at 141.9s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:05] {2986} INFO - iteration 290, current learner extra_tree\n",
"[flaml.automl: 07-28 21:13:05] {3161} INFO - at 142.0s,\testimator extra_tree's best error=0.0017,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:05] {2986} INFO - iteration 291, current learner lgbm\n",
"[flaml.automl: 07-28 21:13:06] {3161} INFO - at 142.0s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:06] {2986} INFO - iteration 292, current learner lgbm\n",
"[flaml.automl: 07-28 21:13:06] {3161} INFO - at 142.1s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:06] {2986} INFO - iteration 293, current learner prophet\n",
"[flaml.automl: 07-28 21:13:08] {3161} INFO - at 144.9s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:08] {2986} INFO - iteration 294, current learner rf\n",
"[flaml.automl: 07-28 21:13:09] {3161} INFO - at 145.0s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:09] {2986} INFO - iteration 295, current learner xgb_limitdepth\n",
"[flaml.automl: 07-28 21:13:09] {3161} INFO - at 145.0s,\testimator xgb_limitdepth's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:09] {2986} INFO - iteration 296, current learner xgboost\n",
"[flaml.automl: 07-28 21:13:09] {3161} INFO - at 145.2s,\testimator xgboost's best error=0.0025,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:09] {2986} INFO - iteration 297, current learner extra_tree\n",
"[flaml.automl: 07-28 21:13:09] {3161} INFO - at 145.2s,\testimator extra_tree's best error=0.0017,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:09] {2986} INFO - iteration 298, current learner rf\n",
"[flaml.automl: 07-28 21:13:09] {3161} INFO - at 145.2s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:09] {2986} INFO - iteration 299, current learner xgb_limitdepth\n",
"[flaml.automl: 07-28 21:13:09] {3161} INFO - at 145.3s,\testimator xgb_limitdepth's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:09] {2986} INFO - iteration 300, current learner xgboost\n",
"[flaml.automl: 07-28 21:13:09] {3161} INFO - at 145.4s,\testimator xgboost's best error=0.0025,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:09] {2986} INFO - iteration 301, current learner rf\n",
"[flaml.automl: 07-28 21:13:09] {3161} INFO - at 145.5s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:09] {2986} INFO - iteration 302, current learner lgbm\n",
"[flaml.automl: 07-28 21:13:09] {3161} INFO - at 145.5s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:09] {2986} INFO - iteration 303, current learner lgbm\n",
"[flaml.automl: 07-28 21:13:09] {3161} INFO - at 145.5s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:09] {2986} INFO - iteration 304, current learner lgbm\n",
"[flaml.automl: 07-28 21:13:09] {3161} INFO - at 145.6s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:09] {2986} INFO - iteration 305, current learner lgbm\n",
"[flaml.automl: 07-28 21:13:09] {3161} INFO - at 145.6s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:09] {2986} INFO - iteration 306, current learner sarimax\n",
"[flaml.automl: 07-28 21:13:09] {3161} INFO - at 145.8s,\testimator sarimax's best error=0.0037,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:09] {2986} INFO - iteration 307, current learner lgbm\n",
"[flaml.automl: 07-28 21:13:09] {3161} INFO - at 145.8s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:09] {2986} INFO - iteration 308, current learner xgb_limitdepth\n",
"[flaml.automl: 07-28 21:13:09] {3161} INFO - at 145.8s,\testimator xgb_limitdepth's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:09] {2986} INFO - iteration 309, current learner prophet\n",
"[flaml.automl: 07-28 21:13:13] {3161} INFO - at 149.0s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:13] {2986} INFO - iteration 310, current learner extra_tree\n",
"[flaml.automl: 07-28 21:13:13] {3161} INFO - at 149.0s,\testimator extra_tree's best error=0.0017,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:13] {2986} INFO - iteration 311, current learner rf\n",
"[flaml.automl: 07-28 21:13:13] {3161} INFO - at 149.1s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:13] {2986} INFO - iteration 312, current learner lgbm\n",
"[flaml.automl: 07-28 21:13:13] {3161} INFO - at 149.2s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:13] {2986} INFO - iteration 313, current learner lgbm\n",
"[flaml.automl: 07-28 21:13:13] {3161} INFO - at 149.2s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:13] {2986} INFO - iteration 314, current learner lgbm\n",
"[flaml.automl: 07-28 21:13:13] {3161} INFO - at 149.3s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:13] {2986} INFO - iteration 315, current learner lgbm\n",
"[flaml.automl: 07-28 21:13:13] {3161} INFO - at 149.3s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:13] {2986} INFO - iteration 316, current learner rf\n",
"[flaml.automl: 07-28 21:13:13] {3161} INFO - at 149.4s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:13] {2986} INFO - iteration 317, current learner lgbm\n",
"[flaml.automl: 07-28 21:13:13] {3161} INFO - at 149.4s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:13] {2986} INFO - iteration 318, current learner prophet\n",
"[flaml.automl: 07-28 21:13:16] {3161} INFO - at 152.3s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:16] {2986} INFO - iteration 319, current learner prophet\n",
"[flaml.automl: 07-28 21:13:19] {3161} INFO - at 155.3s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:19] {2986} INFO - iteration 320, current learner lgbm\n",
"[flaml.automl: 07-28 21:13:19] {3161} INFO - at 155.3s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:19] {2986} INFO - iteration 321, current learner lgbm\n",
"[flaml.automl: 07-28 21:13:19] {3161} INFO - at 155.3s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:19] {2986} INFO - iteration 322, current learner lgbm\n",
"[flaml.automl: 07-28 21:13:19] {3161} INFO - at 155.4s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:19] {2986} INFO - iteration 323, current learner lgbm\n",
"[flaml.automl: 07-28 21:13:19] {3161} INFO - at 155.4s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:19] {2986} INFO - iteration 324, current learner extra_tree\n",
"[flaml.automl: 07-28 21:13:19] {3161} INFO - at 155.5s,\testimator extra_tree's best error=0.0017,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:19] {2986} INFO - iteration 325, current learner lgbm\n",
"[flaml.automl: 07-28 21:13:19] {3161} INFO - at 155.5s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:19] {2986} INFO - iteration 326, current learner lgbm\n",
"[flaml.automl: 07-28 21:13:19] {3161} INFO - at 155.5s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:19] {2986} INFO - iteration 327, current learner prophet\n",
"[flaml.automl: 07-28 21:13:23] {3161} INFO - at 159.2s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:23] {2986} INFO - iteration 328, current learner lgbm\n",
"[flaml.automl: 07-28 21:13:23] {3161} INFO - at 159.2s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:23] {2986} INFO - iteration 329, current learner xgboost\n",
"[flaml.automl: 07-28 21:13:23] {3161} INFO - at 159.4s,\testimator xgboost's best error=0.0025,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:23] {2986} INFO - iteration 330, current learner lgbm\n",
"[flaml.automl: 07-28 21:13:23] {3161} INFO - at 159.4s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:23] {2986} INFO - iteration 331, current learner lgbm\n",
"[flaml.automl: 07-28 21:13:23] {3161} INFO - at 159.4s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:23] {2986} INFO - iteration 332, current learner rf\n",
"[flaml.automl: 07-28 21:13:23] {3161} INFO - at 159.5s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:23] {2986} INFO - iteration 333, current learner lgbm\n",
"[flaml.automl: 07-28 21:13:23] {3161} INFO - at 159.5s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:23] {2986} INFO - iteration 334, current learner lgbm\n",
"[flaml.automl: 07-28 21:13:23] {3161} INFO - at 159.5s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:23] {2986} INFO - iteration 335, current learner xgboost\n",
"[flaml.automl: 07-28 21:13:23] {3161} INFO - at 159.6s,\testimator xgboost's best error=0.0025,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:23] {2986} INFO - iteration 336, current learner rf\n",
"[flaml.automl: 07-28 21:13:23] {3161} INFO - at 159.7s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:23] {2986} INFO - iteration 337, current learner xgboost\n",
"[flaml.automl: 07-28 21:13:23] {3161} INFO - at 159.8s,\testimator xgboost's best error=0.0024,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:23] {2986} INFO - iteration 338, current learner lgbm\n",
"[flaml.automl: 07-28 21:13:23] {3161} INFO - at 159.8s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:23] {2986} INFO - iteration 339, current learner xgboost\n",
"[flaml.automl: 07-28 21:13:23] {3161} INFO - at 160.0s,\testimator xgboost's best error=0.0024,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:23] {2986} INFO - iteration 340, current learner lgbm\n",
"[flaml.automl: 07-28 21:13:23] {3161} INFO - at 160.0s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:23] {2986} INFO - iteration 341, current learner xgboost\n",
"[flaml.automl: 07-28 21:13:24] {3161} INFO - at 160.1s,\testimator xgboost's best error=0.0024,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:24] {2986} INFO - iteration 342, current learner xgb_limitdepth\n",
"[flaml.automl: 07-28 21:13:24] {3161} INFO - at 160.1s,\testimator xgb_limitdepth's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:24] {2986} INFO - iteration 343, current learner lgbm\n",
"[flaml.automl: 07-28 21:13:24] {3161} INFO - at 160.1s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:24] {2986} INFO - iteration 344, current learner lgbm\n",
"[flaml.automl: 07-28 21:13:24] {3161} INFO - at 160.2s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:24] {2986} INFO - iteration 345, current learner prophet\n",
"[flaml.automl: 07-28 21:13:27] {3161} INFO - at 163.1s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:27] {2986} INFO - iteration 346, current learner prophet\n",
"[flaml.automl: 07-28 21:13:29] {3161} INFO - at 165.9s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:29] {2986} INFO - iteration 347, current learner xgboost\n",
"[flaml.automl: 07-28 21:13:30] {3161} INFO - at 166.0s,\testimator xgboost's best error=0.0024,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:30] {2986} INFO - iteration 348, current learner prophet\n",
"[flaml.automl: 07-28 21:13:33] {3161} INFO - at 169.4s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:33] {2986} INFO - iteration 349, current learner sarimax\n",
"[flaml.automl: 07-28 21:13:33] {3161} INFO - at 169.7s,\testimator sarimax's best error=0.0031,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:33] {2986} INFO - iteration 350, current learner sarimax\n",
"[flaml.automl: 07-28 21:13:34] {3161} INFO - at 170.4s,\testimator sarimax's best error=0.0031,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:34] {2986} INFO - iteration 351, current learner sarimax\n",
"[flaml.automl: 07-28 21:13:34] {3161} INFO - at 170.6s,\testimator sarimax's best error=0.0031,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:34] {2986} INFO - iteration 352, current learner lgbm\n",
"[flaml.automl: 07-28 21:13:34] {3161} INFO - at 170.7s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:34] {2986} INFO - iteration 353, current learner arima\n",
"[flaml.automl: 07-28 21:13:35] {3161} INFO - at 171.3s,\testimator arima's best error=0.0033,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:35] {2986} INFO - iteration 354, current learner sarimax\n",
"[flaml.automl: 07-28 21:13:35] {3161} INFO - at 171.7s,\testimator sarimax's best error=0.0031,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:35] {2986} INFO - iteration 355, current learner xgboost\n",
"[flaml.automl: 07-28 21:13:35] {3161} INFO - at 171.8s,\testimator xgboost's best error=0.0024,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:35] {2986} INFO - iteration 356, current learner xgb_limitdepth\n",
"[flaml.automl: 07-28 21:13:35] {3161} INFO - at 171.8s,\testimator xgb_limitdepth's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:35] {2986} INFO - iteration 357, current learner lgbm\n",
"[flaml.automl: 07-28 21:13:35] {3161} INFO - at 171.8s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:35] {2986} INFO - iteration 358, current learner xgb_limitdepth\n",
"[flaml.automl: 07-28 21:13:35] {3161} INFO - at 171.9s,\testimator xgb_limitdepth's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:35] {2986} INFO - iteration 359, current learner xgboost\n",
"[flaml.automl: 07-28 21:13:36] {3161} INFO - at 172.1s,\testimator xgboost's best error=0.0024,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:36] {2986} INFO - iteration 360, current learner lgbm\n",
"[flaml.automl: 07-28 21:13:36] {3161} INFO - at 172.1s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:36] {2986} INFO - iteration 361, current learner lgbm\n",
"[flaml.automl: 07-28 21:13:36] {3161} INFO - at 172.1s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:36] {2986} INFO - iteration 362, current learner xgboost\n",
"[flaml.automl: 07-28 21:13:36] {3161} INFO - at 172.2s,\testimator xgboost's best error=0.0024,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:36] {2986} INFO - iteration 363, current learner lgbm\n",
"[flaml.automl: 07-28 21:13:36] {3161} INFO - at 172.2s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:36] {2986} INFO - iteration 364, current learner xgb_limitdepth\n",
"[flaml.automl: 07-28 21:13:36] {3161} INFO - at 172.3s,\testimator xgb_limitdepth's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:36] {2986} INFO - iteration 365, current learner rf\n",
"[flaml.automl: 07-28 21:13:36] {3161} INFO - at 172.3s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:36] {2986} INFO - iteration 366, current learner xgb_limitdepth\n",
"[flaml.automl: 07-28 21:13:36] {3161} INFO - at 172.3s,\testimator xgb_limitdepth's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:36] {2986} INFO - iteration 367, current learner sarimax\n",
"[flaml.automl: 07-28 21:13:37] {3161} INFO - at 173.2s,\testimator sarimax's best error=0.0021,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:37] {2986} INFO - iteration 368, current learner sarimax\n",
"[flaml.automl: 07-28 21:13:37] {3161} INFO - at 173.6s,\testimator sarimax's best error=0.0021,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:37] {2986} INFO - iteration 369, current learner rf\n",
"[flaml.automl: 07-28 21:13:37] {3161} INFO - at 173.6s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:37] {2986} INFO - iteration 370, current learner sarimax\n",
"[flaml.automl: 07-28 21:13:39] {3161} INFO - at 175.5s,\testimator sarimax's best error=0.0019,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:39] {2986} INFO - iteration 371, current learner xgboost\n",
"[flaml.automl: 07-28 21:13:39] {3161} INFO - at 175.6s,\testimator xgboost's best error=0.0024,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:39] {2986} INFO - iteration 372, current learner sarimax\n",
"[flaml.automl: 07-28 21:13:41] {3161} INFO - at 177.5s,\testimator sarimax's best error=0.0019,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:41] {2986} INFO - iteration 373, current learner xgboost\n",
"[flaml.automl: 07-28 21:13:41] {3161} INFO - at 177.6s,\testimator xgboost's best error=0.0024,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:41] {2986} INFO - iteration 374, current learner sarimax\n",
"[flaml.automl: 07-28 21:13:43] {3161} INFO - at 179.2s,\testimator sarimax's best error=0.0019,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:43] {2986} INFO - iteration 375, current learner xgboost\n",
"[flaml.automl: 07-28 21:13:43] {3161} INFO - at 179.4s,\testimator xgboost's best error=0.0024,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:43] {2986} INFO - iteration 376, current learner xgboost\n",
"[flaml.automl: 07-28 21:13:43] {3161} INFO - at 179.5s,\testimator xgboost's best error=0.0024,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:43] {2986} INFO - iteration 377, current learner lgbm\n",
"[flaml.automl: 07-28 21:13:43] {3161} INFO - at 179.5s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:43] {2986} INFO - iteration 378, current learner lgbm\n",
"[flaml.automl: 07-28 21:13:43] {3161} INFO - at 179.5s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:43] {2986} INFO - iteration 379, current learner lgbm\n",
"[flaml.automl: 07-28 21:13:43] {3161} INFO - at 179.6s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:43] {2986} INFO - iteration 380, current learner lgbm\n",
"[flaml.automl: 07-28 21:13:43] {3161} INFO - at 179.6s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:43] {2986} INFO - iteration 381, current learner xgb_limitdepth\n",
"[flaml.automl: 07-28 21:13:43] {3161} INFO - at 179.7s,\testimator xgb_limitdepth's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:43] {2986} INFO - iteration 382, current learner lgbm\n",
"[flaml.automl: 07-28 21:13:43] {3161} INFO - at 179.7s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:43] {2986} INFO - iteration 383, current learner lgbm\n",
"[flaml.automl: 07-28 21:13:43] {3161} INFO - at 179.7s,\testimator lgbm's best error=0.0022,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:43] {2986} INFO - iteration 384, current learner sarimax\n",
"[flaml.automl: 07-28 21:13:45] {3161} INFO - at 181.5s,\testimator sarimax's best error=0.0019,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:45] {2986} INFO - iteration 385, current learner prophet\n",
"[flaml.automl: 07-28 21:13:48] {3161} INFO - at 184.8s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:48] {2986} INFO - iteration 386, current learner prophet\n",
"[flaml.automl: 07-28 21:13:52] {3161} INFO - at 188.3s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:52] {2986} INFO - iteration 387, current learner xgboost\n",
"[flaml.automl: 07-28 21:13:52] {3161} INFO - at 188.4s,\testimator xgboost's best error=0.0024,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:52] {2986} INFO - iteration 388, current learner xgboost\n",
"[flaml.automl: 07-28 21:13:52] {3161} INFO - at 188.5s,\testimator xgboost's best error=0.0024,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:52] {2986} INFO - iteration 389, current learner xgb_limitdepth\n",
"[flaml.automl: 07-28 21:13:52] {3161} INFO - at 188.5s,\testimator xgb_limitdepth's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:52] {2986} INFO - iteration 390, current learner sarimax\n",
"[flaml.automl: 07-28 21:13:53] {3161} INFO - at 189.5s,\testimator sarimax's best error=0.0019,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:53] {2986} INFO - iteration 391, current learner xgboost\n",
"[flaml.automl: 07-28 21:13:53] {3161} INFO - at 189.6s,\testimator xgboost's best error=0.0024,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:53] {2986} INFO - iteration 392, current learner xgboost\n",
"[flaml.automl: 07-28 21:13:53] {3161} INFO - at 189.7s,\testimator xgboost's best error=0.0024,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:53] {2986} INFO - iteration 393, current learner xgboost\n",
"[flaml.automl: 07-28 21:13:53] {3161} INFO - at 189.7s,\testimator xgboost's best error=0.0024,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:53] {2986} INFO - iteration 394, current learner prophet\n",
"[flaml.automl: 07-28 21:13:57] {3161} INFO - at 193.0s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:57] {2986} INFO - iteration 395, current learner xgboost\n",
"[flaml.automl: 07-28 21:13:57] {3161} INFO - at 193.1s,\testimator xgboost's best error=0.0024,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:57] {2986} INFO - iteration 396, current learner rf\n",
"[flaml.automl: 07-28 21:13:57] {3161} INFO - at 193.2s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:57] {2986} INFO - iteration 397, current learner extra_tree\n",
"[flaml.automl: 07-28 21:13:57] {3161} INFO - at 193.2s,\testimator extra_tree's best error=0.0016,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:57] {2986} INFO - iteration 398, current learner extra_tree\n",
"[flaml.automl: 07-28 21:13:57] {3161} INFO - at 193.3s,\testimator extra_tree's best error=0.0016,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:57] {2986} INFO - iteration 399, current learner extra_tree\n",
"[flaml.automl: 07-28 21:13:57] {3161} INFO - at 193.3s,\testimator extra_tree's best error=0.0016,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:57] {2986} INFO - iteration 400, current learner extra_tree\n",
"[flaml.automl: 07-28 21:13:57] {3161} INFO - at 193.4s,\testimator extra_tree's best error=0.0016,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:57] {2986} INFO - iteration 401, current learner rf\n",
"[flaml.automl: 07-28 21:13:57] {3161} INFO - at 193.4s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:57] {2986} INFO - iteration 402, current learner extra_tree\n",
"[flaml.automl: 07-28 21:13:57] {3161} INFO - at 193.5s,\testimator extra_tree's best error=0.0016,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:57] {2986} INFO - iteration 403, current learner xgb_limitdepth\n",
"[flaml.automl: 07-28 21:13:57] {3161} INFO - at 193.5s,\testimator xgb_limitdepth's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:57] {2986} INFO - iteration 404, current learner extra_tree\n",
"[flaml.automl: 07-28 21:13:57] {3161} INFO - at 193.6s,\testimator extra_tree's best error=0.0016,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:57] {2986} INFO - iteration 405, current learner extra_tree\n",
"[flaml.automl: 07-28 21:13:57] {3161} INFO - at 193.6s,\testimator extra_tree's best error=0.0016,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:57] {2986} INFO - iteration 406, current learner extra_tree\n",
"[flaml.automl: 07-28 21:13:57] {3161} INFO - at 193.7s,\testimator extra_tree's best error=0.0016,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:57] {2986} INFO - iteration 407, current learner extra_tree\n",
"[flaml.automl: 07-28 21:13:57] {3161} INFO - at 193.7s,\testimator extra_tree's best error=0.0016,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:13:57] {2986} INFO - iteration 408, current learner sarimax\n",
"[flaml.automl: 07-28 21:14:01] {3161} INFO - at 197.0s,\testimator sarimax's best error=0.0012,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:14:01] {2986} INFO - iteration 409, current learner sarimax\n",
"[flaml.automl: 07-28 21:14:02] {3161} INFO - at 198.8s,\testimator sarimax's best error=0.0012,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:14:02] {2986} INFO - iteration 410, current learner rf\n",
"[flaml.automl: 07-28 21:14:02] {3161} INFO - at 198.9s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:14:02] {2986} INFO - iteration 411, current learner sarimax\n",
"[flaml.automl: 07-28 21:14:05] {3161} INFO - at 201.0s,\testimator sarimax's best error=0.0012,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:14:05] {2986} INFO - iteration 412, current learner extra_tree\n",
"[flaml.automl: 07-28 21:14:05] {3161} INFO - at 201.1s,\testimator extra_tree's best error=0.0016,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:14:05] {2986} INFO - iteration 413, current learner sarimax\n",
"[flaml.automl: 07-28 21:14:06] {3161} INFO - at 202.3s,\testimator sarimax's best error=0.0012,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:14:06] {2986} INFO - iteration 414, current learner extra_tree\n",
"[flaml.automl: 07-28 21:14:06] {3161} INFO - at 202.3s,\testimator extra_tree's best error=0.0016,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:14:06] {2986} INFO - iteration 415, current learner extra_tree\n",
"[flaml.automl: 07-28 21:14:06] {3161} INFO - at 202.4s,\testimator extra_tree's best error=0.0016,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:14:06] {2986} INFO - iteration 416, current learner rf\n",
"[flaml.automl: 07-28 21:14:06] {3161} INFO - at 202.4s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:14:06] {2986} INFO - iteration 417, current learner extra_tree\n",
"[flaml.automl: 07-28 21:14:06] {3161} INFO - at 202.5s,\testimator extra_tree's best error=0.0016,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:14:06] {2986} INFO - iteration 418, current learner xgboost\n",
"[flaml.automl: 07-28 21:14:06] {3161} INFO - at 202.6s,\testimator xgboost's best error=0.0024,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:14:06] {2986} INFO - iteration 419, current learner prophet\n",
"[flaml.automl: 07-28 21:14:09] {3161} INFO - at 205.7s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:14:09] {2986} INFO - iteration 420, current learner sarimax\n",
"[flaml.automl: 07-28 21:14:11] {3161} INFO - at 207.4s,\testimator sarimax's best error=0.0012,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:14:11] {2986} INFO - iteration 421, current learner arima\n",
"[flaml.automl: 07-28 21:14:11] {3161} INFO - at 207.6s,\testimator arima's best error=0.0033,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:14:11] {2986} INFO - iteration 422, current learner sarimax\n",
"[flaml.automl: 07-28 21:14:12] {3161} INFO - at 208.6s,\testimator sarimax's best error=0.0010,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:14:12] {2986} INFO - iteration 423, current learner extra_tree\n",
"[flaml.automl: 07-28 21:14:12] {3161} INFO - at 208.6s,\testimator extra_tree's best error=0.0016,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:14:12] {2986} INFO - iteration 424, current learner sarimax\n",
"[flaml.automl: 07-28 21:14:13] {3161} INFO - at 209.9s,\testimator sarimax's best error=0.0007,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:14:13] {2986} INFO - iteration 425, current learner sarimax\n",
"[flaml.automl: 07-28 21:14:15] {3161} INFO - at 211.3s,\testimator sarimax's best error=0.0007,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:14:15] {2986} INFO - iteration 426, current learner sarimax\n",
"[flaml.automl: 07-28 21:14:15] {3161} INFO - at 211.8s,\testimator sarimax's best error=0.0007,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:14:15] {2986} INFO - iteration 427, current learner sarimax\n",
"[flaml.automl: 07-28 21:14:18] {3161} INFO - at 214.2s,\testimator sarimax's best error=0.0007,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:14:18] {2986} INFO - iteration 428, current learner extra_tree\n",
"[flaml.automl: 07-28 21:14:18] {3161} INFO - at 214.2s,\testimator extra_tree's best error=0.0016,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:14:18] {2986} INFO - iteration 429, current learner xgboost\n",
"[flaml.automl: 07-28 21:14:18] {3161} INFO - at 214.4s,\testimator xgboost's best error=0.0024,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:14:18] {2986} INFO - iteration 430, current learner rf\n",
"[flaml.automl: 07-28 21:14:18] {3161} INFO - at 214.4s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:14:18] {2986} INFO - iteration 431, current learner xgboost\n",
"[flaml.automl: 07-28 21:14:18] {3161} INFO - at 214.5s,\testimator xgboost's best error=0.0024,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:14:18] {2986} INFO - iteration 432, current learner sarimax\n",
"[flaml.automl: 07-28 21:14:20] {3161} INFO - at 216.7s,\testimator sarimax's best error=0.0007,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:14:20] {2986} INFO - iteration 433, current learner xgboost\n",
"[flaml.automl: 07-28 21:14:20] {3161} INFO - at 216.8s,\testimator xgboost's best error=0.0024,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:14:20] {2986} INFO - iteration 434, current learner rf\n",
"[flaml.automl: 07-28 21:14:20] {3161} INFO - at 216.9s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:14:20] {2986} INFO - iteration 435, current learner sarimax\n",
"[flaml.automl: 07-28 21:14:21] {3161} INFO - at 217.4s,\testimator sarimax's best error=0.0007,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:14:21] {2986} INFO - iteration 436, current learner prophet\n",
"[flaml.automl: 07-28 21:14:24] {3161} INFO - at 220.1s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:14:24] {2986} INFO - iteration 437, current learner sarimax\n",
"[flaml.automl: 07-28 21:14:24] {3161} INFO - at 220.6s,\testimator sarimax's best error=0.0007,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:14:24] {2986} INFO - iteration 438, current learner sarimax\n",
"[flaml.automl: 07-28 21:14:26] {3161} INFO - at 223.0s,\testimator sarimax's best error=0.0007,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:14:26] {2986} INFO - iteration 439, current learner prophet\n",
"[flaml.automl: 07-28 21:14:30] {3161} INFO - at 226.0s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:14:30] {2986} INFO - iteration 440, current learner prophet\n",
"[flaml.automl: 07-28 21:14:33] {3161} INFO - at 229.2s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:14:33] {2986} INFO - iteration 441, current learner prophet\n",
"[flaml.automl: 07-28 21:14:36] {3161} INFO - at 232.6s,\testimator prophet's best error=0.0005,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:14:36] {2986} INFO - iteration 442, current learner sarimax\n",
"[flaml.automl: 07-28 21:14:38] {3161} INFO - at 234.4s,\testimator sarimax's best error=0.0007,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:14:38] {2986} INFO - iteration 443, current learner extra_tree\n",
"[flaml.automl: 07-28 21:14:38] {3161} INFO - at 234.4s,\testimator extra_tree's best error=0.0016,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:14:38] {2986} INFO - iteration 444, current learner sarimax\n",
"[flaml.automl: 07-28 21:14:39] {3161} INFO - at 235.1s,\testimator sarimax's best error=0.0007,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:14:39] {2986} INFO - iteration 445, current learner rf\n",
"[flaml.automl: 07-28 21:14:39] {3161} INFO - at 235.1s,\testimator rf's best error=0.0018,\tbest estimator prophet's best error=0.0005\n",
"[flaml.automl: 07-28 21:14:39] {2986} INFO - iteration 446, current learner sarimax\n",
"[flaml.automl: 07-28 21:14:41] {3161} INFO - at 237.4s,\testimator sarimax's best error=0.0004,\tbest estimator sarimax's best error=0.0004\n",
"[flaml.automl: 07-28 21:14:41] {2986} INFO - iteration 447, current learner xgboost\n",
"[flaml.automl: 07-28 21:14:41] {3161} INFO - at 237.5s,\testimator xgboost's best error=0.0024,\tbest estimator sarimax's best error=0.0004\n",
"[flaml.automl: 07-28 21:14:41] {2986} INFO - iteration 448, current learner sarimax\n",
"[flaml.automl: 07-28 21:14:43] {3161} INFO - at 239.7s,\testimator sarimax's best error=0.0004,\tbest estimator sarimax's best error=0.0004\n",
"[flaml.automl: 07-28 21:14:43] {2986} INFO - iteration 449, current learner lgbm\n",
"[flaml.automl: 07-28 21:14:43] {3161} INFO - at 239.7s,\testimator lgbm's best error=0.0022,\tbest estimator sarimax's best error=0.0004\n",
"[flaml.automl: 07-28 21:14:43] {2986} INFO - iteration 450, current learner extra_tree\n",
"[flaml.automl: 07-28 21:14:43] {3161} INFO - at 239.8s,\testimator extra_tree's best error=0.0016,\tbest estimator sarimax's best error=0.0004\n",
"[flaml.automl: 07-28 21:14:43] {2986} INFO - iteration 451, current learner lgbm\n",
"[flaml.automl: 07-28 21:14:43] {3161} INFO - at 239.8s,\testimator lgbm's best error=0.0022,\tbest estimator sarimax's best error=0.0004\n",
"[flaml.automl: 07-28 21:14:43] {2986} INFO - iteration 452, current learner xgb_limitdepth\n",
"[flaml.automl: 07-28 21:14:43] {3161} INFO - at 239.8s,\testimator xgb_limitdepth's best error=0.0018,\tbest estimator sarimax's best error=0.0004\n",
"[flaml.automl: 07-28 21:14:43] {2986} INFO - iteration 453, current learner extra_tree\n",
"[flaml.automl: 07-28 21:14:43] {3161} INFO - at 239.9s,\testimator extra_tree's best error=0.0016,\tbest estimator sarimax's best error=0.0004\n",
"[flaml.automl: 07-28 21:14:43] {2986} INFO - iteration 454, current learner extra_tree\n",
"[flaml.automl: 07-28 21:14:43] {3161} INFO - at 239.9s,\testimator extra_tree's best error=0.0016,\tbest estimator sarimax's best error=0.0004\n",
"[flaml.automl: 07-28 21:14:43] {2986} INFO - iteration 455, current learner extra_tree\n",
"[flaml.automl: 07-28 21:14:43] {3161} INFO - at 239.9s,\testimator extra_tree's best error=0.0016,\tbest estimator sarimax's best error=0.0004\n",
"[flaml.automl: 07-28 21:14:43] {2986} INFO - iteration 456, current learner rf\n",
"[flaml.automl: 07-28 21:14:44] {3161} INFO - at 240.0s,\testimator rf's best error=0.0018,\tbest estimator sarimax's best error=0.0004\n",
"[flaml.automl: 07-28 21:14:44] {3425} INFO - retrain sarimax for 0.7s\n",
"[flaml.automl: 07-28 21:14:44] {3432} INFO - retrained model: <statsmodels.tsa.statespace.sarimax.SARIMAXResultsWrapper object at 0x000001E2D9979400>\n",
"[flaml.automl: 07-28 21:14:44] {2725} INFO - fit succeeded\n",
"[flaml.automl: 07-28 21:14:44] {2726} INFO - Time taken to find the best model: 237.36335611343384\n",
"[flaml.automl: 07-28 21:14:44] {2737} WARNING - Time taken to find the best model is 99% of the provided time budget and not all estimators' hyperparameter search converged. Consider increasing the time budget.\n"
]
}
],
"source": [
"'''The main flaml automl API'''\n",
"automl.fit(dataframe=train_df, # training data\n",
" label='co2', # label column\n",
" period=time_horizon, # the keyword argument 'period' must be included for forecast tasks\n",
" **settings)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Best model and metric"
]
},
{
"cell_type": "code",
"execution_count": 8,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Best ML learner: sarimax\n",
"Best hyperparameter config: {'p': 8, 'd': 0, 'q': 8, 'P': 6, 'D': 3, 'Q': 1, 's': 6}\n",
"Best mape on validation data: 0.00043466573064228554\n",
"Training duration of best run: 0.7340686321258545s\n"
]
}
],
"source": [
"''' retrieve the best config and best learner '''\n",
"print('Best ML learner:', automl.best_estimator)\n",
"print('Best hyperparameter config:', automl.best_config)\n",
"print(f'Best mape on validation data: {automl.best_loss}')\n",
"print(f'Training duration of best run: {automl.best_config_train_time}s')"
]
},
{
"cell_type": "code",
"execution_count": 9,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"<statsmodels.tsa.statespace.sarimax.SARIMAXResultsWrapper at 0x1e2d9979400>"
]
},
"execution_count": 9,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"automl.model.estimator"
]
},
{
"cell_type": "code",
"execution_count": 10,
"metadata": {},
"outputs": [],
"source": [
"''' pickle and save the automl object '''\n",
"import pickle\n",
"with open('automl.pkl', 'wb') as f:\n",
" pickle.dump(automl, f, pickle.HIGHEST_PROTOCOL)"
]
},
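{
 "cell_type": "markdown",
 "metadata": {},
 "source": [
  "The pickled object can be loaded back in a later session to make predictions without re-running the search. A minimal sketch, assuming the 'automl.pkl' file written by the cell above is available:"
 ]
},
{
 "cell_type": "code",
 "execution_count": null,
 "metadata": {},
 "outputs": [],
 "source": [
  "''' load the pickled automl object (sketch; assumes 'automl.pkl' exists) '''\n",
  "import pickle\n",
  "with open('automl.pkl', 'rb') as f:\n",
  "    automl_restored = pickle.load(f)"
 ]
},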
{
"cell_type": "code",
"execution_count": 11,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Predicted labels\n",
"2001-01-01 370.568362\n",
"2001-02-01 371.297747\n",
"2001-03-01 372.087653\n",
"2001-04-01 373.040897\n",
"2001-05-01 373.638221\n",
"2001-06-01 373.202665\n",
"2001-07-01 371.621574\n",
"2001-08-01 369.611740\n",
"2001-09-01 368.307775\n",
"2001-10-01 368.360786\n",
"2001-11-01 369.476460\n",
"2001-12-01 370.849193\n",
"Freq: MS, Name: predicted_mean, dtype: float64\n",
"True labels\n",
"514 370.175\n",
"515 371.325\n",
"516 372.060\n",
"517 372.775\n",
"518 373.800\n",
"519 373.060\n",
"520 371.300\n",
"521 369.425\n",
"522 367.880\n",
"523 368.050\n",
"524 369.375\n",
"525 371.020\n",
"Name: co2, dtype: float64\n"
]
}
],
"source": [
"''' compute predictions on the testing dataset '''\n",
"flaml_y_pred = automl.predict(X_test)\n",
"print(f\"Predicted labels\\n{flaml_y_pred}\")\n",
"print(f\"True labels\\n{y_test}\")"
]
},
{
"cell_type": "code",
"execution_count": 12,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"mape = 0.0005710586398294955\n"
]
}
],
"source": [
"''' compute metric values on the testing dataset '''\n",
"from flaml.ml import sklearn_metric_loss_score\n",
"print('mape', '=', sklearn_metric_loss_score('mape', y_true=y_test, y_predict=flaml_y_pred))"
]
},
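{
 "cell_type": "markdown",
 "metadata": {},
 "source": [
  "Other regression metrics can be computed the same way. A sketch, assuming 'rmse' and 'mae' are among the metric names accepted by sklearn_metric_loss_score:"
 ]
},
{
 "cell_type": "code",
 "execution_count": null,
 "metadata": {},
 "outputs": [],
 "source": [
  "''' sketch: additional metrics via the same helper (assumes 'rmse'/'mae' are supported names) '''\n",
  "for metric in ('rmse', 'mae'):\n",
  "    print(metric, '=', sklearn_metric_loss_score(metric, y_true=y_test, y_predict=flaml_y_pred))"
 ]
},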
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Log history"
]
},
{
"cell_type": "code",
"execution_count": 13,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"{'Current Learner': 'lgbm', 'Current Sample': 502, 'Current Hyper-parameters': {'n_estimators': 4, 'num_leaves': 4, 'min_child_samples': 20, 'learning_rate': 0.09999999999999995, 'log_max_bin': 8, 'colsample_bytree': 1.0, 'reg_alpha': 0.0009765625, 'reg_lambda': 1.0, 'optimize_for_horizon': False, 'lags': 3}, 'Best Learner': 'lgbm', 'Best Hyper-parameters': {'n_estimators': 4, 'num_leaves': 4, 'min_child_samples': 20, 'learning_rate': 0.09999999999999995, 'log_max_bin': 8, 'colsample_bytree': 1.0, 'reg_alpha': 0.0009765625, 'reg_lambda': 1.0, 'optimize_for_horizon': False, 'lags': 3}}\n",
"{'Current Learner': 'lgbm', 'Current Sample': 502, 'Current Hyper-parameters': {'n_estimators': 8, 'num_leaves': 4, 'min_child_samples': 19, 'learning_rate': 0.18686130359903158, 'log_max_bin': 9, 'colsample_bytree': 0.9311834484407709, 'reg_alpha': 0.0013872402855481538, 'reg_lambda': 0.43503398494225104, 'optimize_for_horizon': False, 'lags': 1}, 'Best Learner': 'lgbm', 'Best Hyper-parameters': {'n_estimators': 8, 'num_leaves': 4, 'min_child_samples': 19, 'learning_rate': 0.18686130359903158, 'log_max_bin': 9, 'colsample_bytree': 0.9311834484407709, 'reg_alpha': 0.0013872402855481538, 'reg_lambda': 0.43503398494225104, 'optimize_for_horizon': False, 'lags': 1}}\n",
"{'Current Learner': 'lgbm', 'Current Sample': 502, 'Current Hyper-parameters': {'n_estimators': 9, 'num_leaves': 4, 'min_child_samples': 14, 'learning_rate': 0.23100120527451992, 'log_max_bin': 8, 'colsample_bytree': 1.0, 'reg_alpha': 0.0009765625, 'reg_lambda': 0.028424597762235913, 'optimize_for_horizon': False, 'lags': 1}, 'Best Learner': 'lgbm', 'Best Hyper-parameters': {'n_estimators': 9, 'num_leaves': 4, 'min_child_samples': 14, 'learning_rate': 0.23100120527451992, 'log_max_bin': 8, 'colsample_bytree': 1.0, 'reg_alpha': 0.0009765625, 'reg_lambda': 0.028424597762235913, 'optimize_for_horizon': False, 'lags': 1}}\n",
"{'Current Learner': 'lgbm', 'Current Sample': 502, 'Current Hyper-parameters': {'n_estimators': 9, 'num_leaves': 9, 'min_child_samples': 9, 'learning_rate': 0.2917244979615619, 'log_max_bin': 7, 'colsample_bytree': 1.0, 'reg_alpha': 0.0009765625, 'reg_lambda': 0.006048554644106909, 'optimize_for_horizon': False, 'lags': 4}, 'Best Learner': 'lgbm', 'Best Hyper-parameters': {'n_estimators': 9, 'num_leaves': 9, 'min_child_samples': 9, 'learning_rate': 0.2917244979615619, 'log_max_bin': 7, 'colsample_bytree': 1.0, 'reg_alpha': 0.0009765625, 'reg_lambda': 0.006048554644106909, 'optimize_for_horizon': False, 'lags': 4}}\n",
"{'Current Learner': 'lgbm', 'Current Sample': 502, 'Current Hyper-parameters': {'n_estimators': 4, 'num_leaves': 8, 'min_child_samples': 11, 'learning_rate': 0.8116893577982964, 'log_max_bin': 8, 'colsample_bytree': 0.97502360023323, 'reg_alpha': 0.0012398377555843262, 'reg_lambda': 0.02776044509327881, 'optimize_for_horizon': False, 'lags': 4}, 'Best Learner': 'lgbm', 'Best Hyper-parameters': {'n_estimators': 4, 'num_leaves': 8, 'min_child_samples': 11, 'learning_rate': 0.8116893577982964, 'log_max_bin': 8, 'colsample_bytree': 0.97502360023323, 'reg_alpha': 0.0012398377555843262, 'reg_lambda': 0.02776044509327881, 'optimize_for_horizon': False, 'lags': 4}}\n",
"{'Current Learner': 'lgbm', 'Current Sample': 502, 'Current Hyper-parameters': {'n_estimators': 5, 'num_leaves': 16, 'min_child_samples': 7, 'learning_rate': 1.0, 'log_max_bin': 9, 'colsample_bytree': 0.9289697965752838, 'reg_alpha': 0.01291354098023607, 'reg_lambda': 0.012402833825431305, 'optimize_for_horizon': False, 'lags': 5}, 'Best Learner': 'lgbm', 'Best Hyper-parameters': {'n_estimators': 5, 'num_leaves': 16, 'min_child_samples': 7, 'learning_rate': 1.0, 'log_max_bin': 9, 'colsample_bytree': 0.9289697965752838, 'reg_alpha': 0.01291354098023607, 'reg_lambda': 0.012402833825431305, 'optimize_for_horizon': False, 'lags': 5}}\n",
"{'Current Learner': 'lgbm', 'Current Sample': 502, 'Current Hyper-parameters': {'n_estimators': 10, 'num_leaves': 13, 'min_child_samples': 8, 'learning_rate': 1.0, 'log_max_bin': 9, 'colsample_bytree': 0.915047969012756, 'reg_alpha': 0.1456985407754094, 'reg_lambda': 0.010186415963233664, 'optimize_for_horizon': False, 'lags': 9}, 'Best Learner': 'lgbm', 'Best Hyper-parameters': {'n_estimators': 10, 'num_leaves': 13, 'min_child_samples': 8, 'learning_rate': 1.0, 'log_max_bin': 9, 'colsample_bytree': 0.915047969012756, 'reg_alpha': 0.1456985407754094, 'reg_lambda': 0.010186415963233664, 'optimize_for_horizon': False, 'lags': 9}}\n",
"{'Current Learner': 'xgb', 'Current Sample': 502, 'Current Hyper-parameters': {'n_estimators': 17, 'max_depth': 6, 'min_child_weight': 1.1257301179325647, 'learning_rate': 0.3420575416463879, 'subsample': 1.0, 'colsample_bylevel': 0.8634518942394397, 'colsample_bytree': 0.8183410599521093, 'reg_alpha': 0.0031517221935712125, 'reg_lambda': 0.36563645650488746, 'optimize_for_horizon': False, 'lags': 1}, 'Best Learner': 'xgb', 'Best Hyper-parameters': {'n_estimators': 17, 'max_depth': 6, 'min_child_weight': 1.1257301179325647, 'learning_rate': 0.3420575416463879, 'subsample': 1.0, 'colsample_bylevel': 0.8634518942394397, 'colsample_bytree': 0.8183410599521093, 'reg_alpha': 0.0031517221935712125, 'reg_lambda': 0.36563645650488746, 'optimize_for_horizon': False, 'lags': 1}}\n",
"{'Current Learner': 'prophet', 'Current Sample': 502, 'Current Hyper-parameters': {'changepoint_prior_scale': 0.05, 'seasonality_prior_scale': 10.0, 'holidays_prior_scale': 10.0, 'seasonality_mode': 'multiplicative'}, 'Best Learner': 'prophet', 'Best Hyper-parameters': {'changepoint_prior_scale': 0.05, 'seasonality_prior_scale': 10.0, 'holidays_prior_scale': 10.0, 'seasonality_mode': 'multiplicative'}}\n",
"{'Current Learner': 'prophet', 'Current Sample': 502, 'Current Hyper-parameters': {'changepoint_prior_scale': 0.02574943279263944, 'seasonality_prior_scale': 10.0, 'holidays_prior_scale': 10.0, 'seasonality_mode': 'additive'}, 'Best Learner': 'prophet', 'Best Hyper-parameters': {'changepoint_prior_scale': 0.02574943279263944, 'seasonality_prior_scale': 10.0, 'holidays_prior_scale': 10.0, 'seasonality_mode': 'additive'}}\n",
"{'Current Learner': 'prophet', 'Current Sample': 502, 'Current Hyper-parameters': {'changepoint_prior_scale': 0.029044518309983725, 'seasonality_prior_scale': 10.0, 'holidays_prior_scale': 8.831739687246309, 'seasonality_mode': 'additive'}, 'Best Learner': 'prophet', 'Best Hyper-parameters': {'changepoint_prior_scale': 0.029044518309983725, 'seasonality_prior_scale': 10.0, 'holidays_prior_scale': 8.831739687246309, 'seasonality_mode': 'additive'}}\n"
]
}
],
"source": [
"from flaml.data import get_output_from_log\n",
"time_history, best_valid_loss_history, valid_loss_history, config_history, train_loss_history = \\\n",
" get_output_from_log(filename=settings['log_file_name'], time_budget=180)\n",
"\n",
"for config in config_history:\n",
" print(config)"
]
},
{
"cell_type": "code",
"execution_count": 14,
"metadata": {},
"outputs": [
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAYgAAAEWCAYAAAB8LwAVAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjAsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy8GearUAAAgAElEQVR4nO3de5xdVX338c+XMYGAwAQJFCZAUGMkAhKdBhFvoDaAKCFSBR4vjXJrhVJtY4HWW30osalWfKTmiZQqlpsgidEnEikoqQgkgxNyI2ljQJgJhaEYgjCSZPJ7/thr4OSwZ2YnzJ4zc873/XrNa85ee529f5sh53fWWnuvpYjAzMys2m61DsDMzIYnJwgzM8vlBGFmZrmcIMzMLJcThJmZ5XKCMDOzXE4QZrtA0tslrat1HGZlcoKwEUfSw5LeU8sYIuI/ImJSWceXNE3SEknPSOqSdJekD5R1PrM8ThBmOSQ11fDcZwA3A9cC44EDgc8D79+FY0mS/53bLvH/OFY3JO0m6RJJv5b0P5K+L2m/iv03S/pvSU+nb+dvqNj3HUnfkrRI0rPACaml8leSVqT33CRpj1T/XZI6Kt7fZ920/7OSHpO0UdI5kkLSa3OuQcDXgC9HxNUR8XREbI+IuyLi3FTni5L+reI9E9LxXpG2fy7pckl3A88Bl0lqqzrPpyUtTK93l/SPkh6R9LikuZLGvMw/h9UBJwirJ38OTAfeCRwM/Ba4qmL/T4CJwAHAr4Drqt5/NnA5sDfwi1T2IeAk4HDgaOBP+jl/bl1JJwGfAd4DvDbF15dJwCHALf3UKeKjwHlk1/J/gEmSJlbsPxu4Pr3+CvA64JgUXwtZi8UanBOE1ZPzgb+JiI6IeB74InBG7zfriLgmIp6p2PdGSftWvP+HEXF3+sb++1T2jYjYGBFPAT8i+xDtS191PwT8a0SsjojngC/1c4xXpd+PFb7qfN9J59sWEU8DPwTOAkiJ4vXAwtRiORf4dEQ8FRHPAH8PnPkyz291wAnC6slhwHxJmyRtAh4EeoADJTVJmp26nzYDD6f37F/x/kdzjvnfFa+fA17Zz/n7qntw1bHzztPrf9Lvg/qpU0T1Oa4nJQiy1sOClKzGAXsC91f8d7stlVuDc4KwevIocHJENFf87BERnWQfiqeRdfPsC0xI71HF+8ua2vgxssHmXof0U3cd2XV8sJ86z5J9qPf6g5w61dfyU2B/SceQJYre7qUngW7gDRX/zfaNiP4SoTUIJwgbqUZJ2qPi5xXAXOBySYcBSBon6bRUf2/gebJv6HuSdaMMle8DMyUdIWlP+unfj2z+/c8An5M0U9I+afD9bZLmpWrLgXdIOjR1kV06UAARsY1sXGMOsB9weyrfDnwb+CdJBwBIapE0bZev1uqGE4SNVIvIvvn2/nwRuBJYCPxU0jPAvcCxqf61wG+ATmBN2jckIuInwDeAnwHrgXvSruf7qH8L8GHgE8BG4HHgf5ONIxARtwM3ASuA+4EfFwzlerIW1M0pYfT66xTXvan77d/JBsutwckLBpkNLUlHAKuA3as+qM2GFbcgzIaApNMljZY0luy20h85Odhw5wRhNjTOB7qAX5PdWfWntQ3HbGDuYjIzs1xuQZiZWa5X1DqAwbT//vvHhAkTah2GmdmIcf/99z8ZEbkPRtZVgpgwYQJtbW0DVzQzMwAk/aavfe5iMjOzXE4QZmaWywnCzMxyOUGYmVmu0hKEpGskPSFpVR/7JekbktanVbjeVLHvJEnr0r5LyorRzMz6VuZdTN8Bvkk2SVqek8lW95pINqHat4Bj01rAVwHvBTqAZZIWRsSaEmM1qzsL2juZs3gdGzd1c3DzGGZNm8T0KS21DssGUdl/49ISREQskTShnyqnAdem6Y3vldQs6SCyefrXR8QGAEk3prpOEDZo6v3Dc0F7J5feupLurT0AdG7q5tJbVwLU1XU2sq
H4G9fyOYgWdlz1qiOV5ZUfSx8knUe29i6HHnro4EdpdacRPjznLF73wvX16t7aw2dvWcENSx+pUVQ2mNof2cSWnu07lHVv7WHO4nV1kSCUUxb9lOeKiHnAPIDW1lZPLGUDaoQPz85N3bnl1R8oNnL19bfc2MffflfUMkF0sOPSi+PJFkcZ3Ue52aDo6x9QPX14jm7aLfd6WprHcNP5x9UgIhtsx8++M/eLwMHNYwbtHLVMEAuBC9MYw7HA0xHxmKQuYKKkw8lW/zqTbD3hmqv3futGcXDzmNx/WPX04VndjQYwZlQTs6Z5obh6MWvapNL/xqUlCEk3AO8iWyi9A/gCMAogIuaSLRl5CtlSh88BM9O+bZIuBBYDTcA1EbG6rDgr9ZcAGqHfulEMxT+sWuv9f9JfaOrXUPyN62o9iNbW1tjVyfr+dsFKrrv3kR0GO8aMauKKGUcxfUpLn8250U27MeXQ5l2M2Grlyd89z4auZwmyloM/PK1RSbo/Ilrz9tXVbK67akF750uSA+w4cOlBv/qy/yt3Z/9X7s5px7Rw9rG++80sjxMEWROtr3ZUbwLwoJ+ZNRrPxUTftwTCiwngH844mjGjmnbYV2/91mZmldyCAJokevoYi+lNAB70M7NG4wQBfSYH2PEOpelTWpwQzKxhuIuJrBtpZ8rNzBqBEwRZN5LHF8zMduQuJl7sRvrsLSvY0rPd98WbmeEWBPDiE9RberYzumk3JwczM9yCeMkUGlt6tnsKDTMz3ILoc+rnOYvX1SgiM7PhoeETRF9TPw/mnOpmZiNRwyeIvuZOH8w51c3MRqKGTxC+xdXMLF/DD1L7Flczs3wNnyAgSxK9axF7ZlYzs0zDdzGZmVk+JwgzM8vlBGFmZrmcIMzMLJcThJmZ5XKCMDOzXE4QZmaWq9QEIekkSeskrZd0Sc7+sZLmS1ohaamkIyv2XSxplaTVkv6izDjNzOylSksQkpqAq4CTgcnAWZImV1W7DFgeEUcDHwOuTO89EjgXmAq8EThV0sSyYjUzs5cqswUxFVgfERsiYgtwI3BaVZ3JwB0AEbEWmCDpQOAI4N6IeC4itgF3AaeXGKuZmVUpM0G0AI9WbHekskoPADMAJE0FDgPGA6uAd0h6laQ9gVOAQ/JOIuk8SW2S2rq6ugb5EszMGleZCUI5ZVG1PRsYK2k5cBHQDmyLiAeBrwC3A7eRJZJteSeJiHkR0RoRrePGjRu04M3MGl2Zk/V1sOO3/vHAxsoKEbEZmAkgScBD6YeI+BfgX9K+v0/HMzOzIVJmC2IZMFHS4ZJGA2cCCysrSGpO+wDOAZakpIGkA9LvQ8m6oW4oMVYzM6tSWgsiIrZJuhBYDDQB10TEakkXpP1zyQajr5XUA6wBPllxiB9IehWwFfhURPy2rFjNzOylSl0PIiIWAYuqyuZWvL4HyL19NSLeXmZsZmbWPz9JbWZmuZwgzMwslxOEmZnlcoIwM7NcThBmZpbLCcLMzHI5QZiZWS4nCDMzy+UEYWZmuZwgzMwslxOEmZnlcoIwM7NcThBmZpbLCcLMzHINmCAk7TcUgZiZ2fBSpAVxn6SbJZ2SlgWtOwvaO2l/ZBP3PfQUx8++kwXtnbUOycys5ookiNcB84CPAusl/b2k15Ub1tBZ0N7JpbeuZEvPdgA6N3Vz6a0rnSTMrOENmCAic3tEnEW2bvTHgaWS7pJ0XOkRlmzO4nV0b+3Zoax7aw9zFq+rUURmZsPDgEuOpnWhP0LWgngcuAhYCBwD3AwcXmaAZdu4qXunys3MGkWRNanvAb4HTI+IjoryNklz+3jPiHFw8xg6c5LBwc1jahCNmdnwUWQMYlJEfLkqOQAQEV8pIaYhNWvaJMaMatqhbMyoJmZNm1SjiMzMhociCeKnkpp7NySNlbS4xJiG1PQpLVwx4yhGN2X/KVqax3DFjKOYPqWlxpGZmdVWkS6mcRGxqXcjIn4r6YASYxpy06e0cMPSRwC46fwRP+
5uZjYoirQgeiQd2rsh6TAgihxc0kmS1klaL+mSnP1jJc2XtELSUklHVuz7tKTVklZJukHSHkXOaWZmg6NIgvgb4BeSvifpe8AS4NKB3iSpCbgKOBmYDJwlaXJVtcuA5RFxNPAx4Mr03hbgz4HWiDgSaALOLHZJZmY2GAbsYoqI2yS9CXgLIODTEfFkgWNPBdZHxAYASTcCpwFrKupMBq5I51kraYKkAytiGyNpK7AnsLHgNZmZ2SAoOllfD/AE8DQwWdI7CrynBXi0YrsjlVV6AJgBIGkqcBgwPiI6gX8EHgEeA56OiJ/mnUTSeZLaJLV1dXUVvBwzMxtIkcn6ziHrVloMfCn9/mKBY+fN21Q9djEbGCtpOdkDeO3ANkljyVobhwMHA3tJ+kjeSSJiXkS0RkTruHHjCoRlZmZFFGlBXAz8IfCbiDgBmAIU+areARxSsT2eqm6iiNgcETMj4hiyMYhxwEPAe4CHIqIrIrYCtwJvLXBOMzMbJEUSxO8j4vcAknaPiLVAkafIlgETJR0uaTTZIPPCygqSmtM+yOZ5WhIRm8m6lt4iac80g+y7gQeLXZKZmQ2GIs9BdKQH5RYAt0v6LQUGjCNim6QLybqkmoBrImK1pAvS/rnAEcC1knrIBq8/mfbdJ+kW4FfANrKup3k7fXVmZrbLitzFdHp6+UVJPwP2BW4rcvCIWAQsqiqbW/H6HmBiH+/9AvCFIucxM7PB12+CkLQbsCI9i0BE3DUkUZmZWc31OwYREduBByqfpDYzs8ZQZAziIGC1pKXAs72FEfGB0qIyM7OaK5IgvlR6FGZmNuwUGaT2uIOZWQMqsuToM7z4BPRoYBTwbETsU2ZgZmZWW0VaEHtXbkuaTjYRn5mZ1bGik/W9ICIWACeWEIuZmQ0jRbqYZlRs7ga0UnDBIDMzG7mK3MX0/orX24CHyWZaNTOzOlZkDGLmUARiZmbDS5H1IL6bJuvr3R4r6ZpywzIzs1orMkh9dERs6t2IiN+SrQlhZmZ1rEiC2C2t8AaApP0oNnZhZmYjWJEP+q8Cv0zrMwTwIeDyUqMyM7OaKzJIfa2kNrJnHwTMiIg1pUdmZmY1VeQ5iLcAqyPim2l7b0nHRsR9pUc3BBa0dzJn8To6N3Uzumk3FrR3Mn1KS63DMjOruSJjEN8Cflex/WwqG/EWtHdy6a0r6dzUDcCWnu1ceutKFrR31jgyM7PaK5IgFBEvPDmdFhGqi0HqOYvX0b21Z4ey7q09zFm8rkYRmZkNH0USxAZJfy5pVPq5GNhQdmBDYWNqORQtNzNrJEUSxAXAW4FOoAM4Fji3zKCGysHNY3aq3MyskQyYICLiiYg4MyIOiIgDgU8C7yo9siEwa9okxoxq2qFszKgmZk2bVKOIzMyGj0LTfUtqknSypGuBh4APlxvW0Jg+pYUrZhzF6KbsP0NL8xiumHGU72IyM2OAwWZJ7wDOBt4HLAWOB14dEc8VObikk4ArgSbg6oiYXbV/LHAN8Brg98AnImKVpEnATRVVXw18PiK+XuiqdsL0KS3csPQRAG46/7jBPryZ2YjVZ4KQ1AE8QnZL66yIeEbSQzuRHJqAq4D3ko1dLJO0sOohu8uA5RFxuqTXp/rvjoh1wDEVx+kE5u/85ZmZ2a7qr4vpB0ALWXfS+yXtxc4tFDQVWB8RGyJiC3AjL11HYjJwB0BErAUmSDqwqs67gV9HxG924txmZvYy9ZkgIuJiYALwNeAE4D+BcZI+JOmVBY7dAjxasd2Ryio9AMwAkDQVOAwYX1XnTOCGvk4i6TxJbZLaurq6CoRlZmZF9DtIHZk7I+JcsmRxNjCdbFW5gSjvkFXbs4GxkpYDFwHtZKvWZQeQRgMfAG7uJ8Z5EdEaEa3jxo0rEJaZmRVR+InoiNgK/Aj4kaQiDwp0AIdUbI8HNlYdczMwE0CSyO6QeqiiysnAryLi8aJxmpnZ4Ch0m2u1iCjyqPEyYKKkw1NL4ExgYWUFSc1pH8
A5wJKUNHqdRT/dS2ZmVp7S5lSKiG2SLgQWk93mek1ErJZ0Qdo/FzgCuFZSD7CG7CE8ACTtSXYH1PllxWhmZn0rddK9iFgELKoqm1vx+h5gYh/vfQ54VZnxmZlZ34qsB/E6YBbZHUYv1I+IE0uMy8zMaqxIC+JmYC7wbaBngLpmZlYniiSIbRFRFwsEmZlZcUXuYvqRpD+TdJCk/Xp/So/MzMxqqkgL4uPp96yKsiCbQM/MzOrUgAkiIg4fikDMzGx4KXIX0yjgT4F3pKKfA/83PVltZmZ1qkgX07eAUcA/p+2PprJzygrKzMxqr0iC+MOIeGPF9p2SHigrIDMzGx6K3MXUI+k1vRuSXo2fhzAzq3tFWhCzgJ9J2kA2hfdhpBlYzcysfhW5i+kOSROBSWQJYm1EPF96ZGZmVlP9rUl9YkTcKWlG1a7XSCIibi05NjMzq6H+WhDvBO4E3p+zLwAnCDOzOtZngoiIL6SXfxcRlau8IckPz5mZ1bkidzH9IKfslsEOxMzMhpf+xiBeD7wB2LdqHGIfYI+yAzMzs9rqbwxiEnAq0MyO4xDPAOeWGZSZmdVef2MQPwR+KOm4tDSomZk1kCIPyrVL+hRZd9MLXUsR8YnSojIzs5orMkj9PeAPgGnAXcB4sm4mMzOrY0USxGsj4nPAsxHxXeB9wFHlhmVmZrVWJEH0rvuwSdKRwL7AhNIiMjOzYaFIgpgnaSzwOWAhsAb4hyIHl3SSpHWS1ku6JGf/WEnzJa2QtDQloN59zZJukbRW0oOSjit4TWZmNgiKTNZ3dXp5FzuxDrWkJuAq4L1AB7BM0sKIWFNR7TJgeUScnp67uAp4d9p3JXBbRJwhaTSwZ9Fzm5nZy9ffg3Kf6e+NEfG1AY49FVgfERvS8W4ETiNrgfSaDFyRjrdW0gRJBwLdZEuc/knatwXYMsD5zMxsEPXXxbR3+mklW5O6Jf1cQPbBPpAW4NGK7Y5UVukBYAaApKlka02MJ2updAH/Kqld0tWS9so7iaTzJLVJauvq6ioQlpmZFdFngoiIL0XEl4D9gTdFxF9GxF8Cbyb7EB+I8g5btT0bGCtpOXAR0A5sI2vZvAn4VkRMAZ4FXjKGkeKcFxGtEdE6bty4AmGZmVkRRR6UO5Qdu3e2UOwupg7gkIrt8cDGygoRsZm0Op0kAQ+lnz2Bjoi4L1W9hT4ShJmZlaNIgvgesFTSfLIWwOnAtQXetwyYmKYG7wTOBM6urCCpGXgujTGcAyxJSWOzpEclTYqIdWQD12swM7MhU+Qupssl/QR4eyqaGRHtBd63TdKFwGKgCbgmIlZLuiDtnwscAVwrqYcsAXyy4hAXAdelO5g24HWwzcyGVH93Me0TEZsl7Qc8nH569+0XEU8NdPCIWAQsqiqbW/H6HmBiH+9dTjZAbmZmNdBfC+J6sum+72fHwWWl7cLPRJiZ2cjT33Tfp6bfXl7UzKwB9dfF9Kb+3hgRvxr8cMzMbLjor4vpq/3sC+DEQY7FzMyGkf66mE4YykDMzGx4KfIcBGmW1cnsuKJckWchzMxshBowQUj6AvAusgSxCDgZ+AXFHpYzM7MRqsh6EGeQPcn83xExE3gjsHupUZmZWc0VSRDdEbEd2CZpH+AJ/AyEmVndKzIG0ZbmTPo22UNzvwOWlhqVmZnVXH/PQXwTuD4i/iwVzZV0G7BPRKwYkujMzKxm+mtB/BfwVUkHATcBN6T5kczMrAH0t2DQlRFxHPBO4Cmy1d0elPR5Sa8bsgjNzKwmBhykjojfRMRX0spuZ5OtB/Fg6ZGZmVlNDZggJI2S9H5J1wE/Af4T+GDpkZmZWU31N0j9XuAs4H1kdy3dCJwXEc8OUWxmZlZD/Q1SX0a2JsRfFVkcyMzM6osn6zMzs1xFnqQ2M7MG5ARhZma5nCDMzCyXE4SZmeVygjAzs1ylJghJJ0laJ2m9pEty9o+VNF/SCklL08p1vfselrRS0nJJbWXGaWZmL1VoydFdIa
kJuAp4L9ABLJO0MCLWVFS7DFgeEadLen2q/+6K/SdExJNlxWhmZn0rswUxFVgfERsiYgvZk9inVdWZDNwBEBFrgQmSDiwxJjMzK6jMBNECPFqx3ZHKKj0AzACQNBU4DBif9gXwU0n3Szqvr5NIOk9Sm6S2rq6uQQvezKzRlZkglFMWVduzgbGSlgMXAe3AtrTv+Ih4E3Ay8ClJ78g7SUTMi4jWiGgdN27cIIVuZmaljUGQtRgOqdgeD2ysrBARm4GZAJIEPJR+iIiN6fcTkuaTdVktKTFeMzOrUGYLYhkwUdLhkkYDZwILKytIak77AM4BlkTEZkl7Sdo71dkL+CNgVYmxmplZldJaEBGxTdKFwGKgCbgmIlZLuiDtnwscAVwrqQdYA3wyvf1AYH7WqOAVZGtj31ZWrGZm9lJldjEREYuARVVlcyte3wNMzHnfBuCNZcZmZmb985PUZmaWywnCzMxyOUGYmVkuJwgzM8vlBGFmZrmcIMzMLJcThJmZ5XKCMDOzXE4QZmaWywnCzMxyOUGYmVkuJwgzM8vlBGFmZrmcIMzMLJcThJmZ5XKCMDOzXE4QZmaWywnCzMxyOUGYmVkuJwgzM8vlBGFmZrmcIMzMLJcThJmZ5So1QUg6SdI6SeslXZKzf6yk+ZJWSFoq6ciq/U2S2iX9uMw4zczspUpLEJKagKuAk4HJwFmSJldVuwxYHhFHAx8DrqzafzHwYFkxmplZ38psQUwF1kfEhojYAtwInFZVZzJwB0BErAUmSDoQQNJ44H3A1SXGaGZmfSgzQbQAj1Zsd6SySg8AMwAkTQUOA8anfV8HPgts7+8kks6T1CapraurazDiNjMzyk0QyimLqu3ZwFhJy4GLgHZgm6RTgSci4v6BThIR8yKiNSJax40b97KDNjOzzCtKPHYHcEjF9nhgY2WFiNgMzASQJOCh9HMm8AFJpwB7APtI+reI+EiJ8ZqZWYUyWxDLgImSDpc0muxDf2FlBUnNaR/AOcCSiNgcEZdGxPiImJDed6eTg5nZ0CotQUTENuBCYDHZnUjfj4jVki6QdEGqdgSwWtJasrudLi4rnr4saO+k/ZFN3PfQUxw/+04WtHcOdQhmZsOSIqqHBUau1tbWaGtrK1x/QXsnl966ku6tPS+UjRnVxBUzjmL6lOrxdDOz+iPp/ohozdvX0E9Sz1m8bofkANC9tYc5i9fVKCIzs+GjoRPExk3dO1VuZtZIGjpBHNw8ZqfKzcwaSUMniFnTJjFmVNMOZWNGNTFr2qQaRWRmNnyU+RzEsNc7ED1n8To2burm4OYxzJo2yQPUZmY0eIKALEk4IZiZvVRDdzGZmVnfnCDMzCyXE4SZmeVygjAzs1xOEGZmlquu5mKS1AX8pmD1/YEnSwyn1nx9I5uvb2QbSdd3WETkLqZTVwliZ0hq62uCqnrg6xvZfH0jW71cn7uYzMwslxOEmZnlauQEMa/WAZTM1zey+fpGtrq4voYdgzAzs/41cgvCzMz64QRhZma5Gi5BSDpJ0jpJ6yVdUut4BoOkayQ9IWlVRdl+km6X9F/p99haxrirJB0i6WeSHpS0WtLFqbwurg9A0h6Slkp6IF3jl1J5PV1jk6R2ST9O23VzbQCSHpa0UtJySW2pbMRfY0MlCElNwFXAycBk4CxJk2sb1aD4DnBSVdklwB0RMRG4I22PRNuAv4yII4C3AJ9Kf7N6uT6A54ETI+KNwDHASZLeQn1d48XAgxXb9XRtvU6IiGMqnn8Y8dfYUAkCmAqsj4gNEbEFuBE4rcYxvWwRsQR4qqr4NOC76fV3gelDGtQgiYjHIuJX6fUzZB8yLdTJ9QFE5ndpc1T6CerkGiWNB94HXF1RXBfXNoARf42NliBagEcrtjtSWT06MCIeg+xDFjigxvG8bJImAFOA+6iz60tdMMuBJ4DbI6KervHrwGeB7RVl9XJtvQL4qaT7JZ2Xykb8NTbainLKKfN9viOApFcCPwD+IiI2S3
l/ypErInqAYyQ1A/MlHVnrmAaDpFOBJyLifknvqnU8JTo+IjZKOgC4XdLaWgc0GBqtBdEBHFKxPR7YWKNYyva4pIMA0u8nahzPLpM0iiw5XBcRt6biurm+ShGxCfg52ZhSPVzj8cAHJD1M1qV7oqR/oz6u7QURsTH9fgKYT9adPeKvsdESxDJgoqTDJY0GzgQW1jimsiwEPp5efxz4YQ1j2WXKmgr/AjwYEV+r2FUX1wcgaVxqOSBpDPAeYC11cI0RcWlEjI+ICWT/3u6MiI9QB9fWS9JekvbufQ38EbCKOrjGhnuSWtIpZH2iTcA1EXF5jUN62STdALyLbIrhx4EvAAuA7wOHAo8AfxwR1QPZw56ktwH/AazkxT7sy8jGIUb89QFIOppsELOJ7Evb9yPi7yS9ijq5RoDUxfRXEXFqPV2bpFeTtRog67a/PiIur4drbLgEYWZmxTRaF5OZmRXkBGFmZrmcIMzMLJcThJmZ5XKCMDOzXE4QNmJI+idJf1GxvVjS1RXbX5X0mX7e/x1JZ6TXP5f0kkXlJY2SNDvNwLkqzbJ6ctr3sKT9dyHuF87bx/6r0iygayR1p9fLJZ0haVHvMxKDSdJBvTOr9rF/tKQlkhpttgWr4ARhI8kvgbcCSNqN7LmPN1Tsfytw98s8x5eBg4AjI+JI4P3A3i/zmP2KiE9FxDHAKcCv04ygx0TELRFxSnq6erB9Bvh2PzFtIZuB9MMlnNtGCCcIG0nuJiUIssSwCnhG0lhJuwNHAO2SPi9pWWoBzFPBiZsk7QmcC1wUEc8DRMTjEfH9nLqfScdfVdWq+ZikFWlth+/lvO/LqUVR6N9eb6tF0gRJayVdnc55naT3SLo7tXampvp7KVsfZJmy9Rf6mq34g8Bt6T1vSC2l5Sn2ianOAuB/FYnT6pObjzZipMnQtkk6lCxR3EM2G+9xwNPAiojYIumbEfF3AOlD+lTgRwVO8VrgkYjY3F8lSW8GZgLHkk0AeZ+ku4AtwN+QTdz2pKT9qt73D8C+wMzYtSdUXwv8MXAe2bQxZwNvAz5A9nT59HT+OyPiE6lraqmkf4+IZyviOBz4bW8SBC4AroyI69IUNE2pfBXwh7sQp9UJtyBspOltRfQmiBJxjigAAAIXSURBVHsqtn+Z6pwg6T5JK4ET2bEbajC8DZgfEc+mdRxuBd6eznVLRDwJUDWtwueA5og4fxeTA8BDEbEyIrYDq8kWowmyaUgmpDp/BFyibOrwnwN7kE31UOkgoKti+x7gMkl/DRwWEd0p/h5gS+88Q9Z4nCBspOkdhziK7BvuvWQtiLcCd0vaA/hn4IyIOIqsn32PgsdeDxxa4AOxry4r0ff08cuAN1e3KnbS8xWvt1dsb+fF3gABH6wYxzg0IipXcgPopuK/SURcT9YK6QYWSzqxou7uwO9fRsw2gjlB2EhzN1mX0VMR0ZO+pTeTJYl7ePGD70lla0j0efdQtYh4jmzm2G+krpbeu30+UlV1CTBd0p5p9s7TySYUvAP4UJqkjapkcBswG/h/JX8jXwxc1DvuImlKTp3/5MUWR+9kcxsi4htkM5AencpfBXRFxNYS47VhzAnCRpqVZHcv3VtV9nREPJnu+Pl2KltA9s19Z/wtWffLGkmr0jEqu2NIS6B+B1hKNqvs1RHRHhGrgcuBuyQ9AHyt6n03p9gWpmm9y/BlsiVLV6T4v1xdIY1H/FrSa1PRh4FVqVvq9cC1qfwEYFFJcdoI4NlczRqQpNOBN0fE3/ZT51bg0ohYN3SR2XDiu5jMGlBEzO/tCsuTutgWODk0NrcgzMwsl8cgzMwslxOEmZnlcoIwM7NcThBmZpbLCcLMzHL9f2YCmplXeUu2AAAAAElFTkSuQmCC",
"text/plain": [
"<Figure size 432x288 with 1 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
}
],
"source": [
"import matplotlib.pyplot as plt\n",
"import numpy as np\n",
"\n",
"plt.title('Learning Curve')\n",
"plt.xlabel('Wall Clock Time (s)')\n",
"plt.ylabel('Validation Accuracy')\n",
"plt.scatter(time_history, 1 - np.array(valid_loss_history))\n",
"plt.step(time_history, 1 - np.array(best_valid_loss_history), where='post')\n",
"plt.show()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Visualize"
]
},
{
"cell_type": "code",
"execution_count": 15,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"<matplotlib.legend.Legend at 0x1e2d8e31fa0>"
]
},
"execution_count": 15,
"metadata": {},
"output_type": "execute_result"
},
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAYUAAAEHCAYAAABBW1qbAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjAsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy8GearUAAAgAElEQVR4nOzdd1hUR9vA4d8sbSkWUFAEpdgrWLCjsWtiiV0TY9eYqDG9vmm+6V9iisYYo1FfE9HYE0sSe2+oiA0VFQWxgAoISN35/jgrQlRE43KAnfu69go7c/bsQ9R99pyZeUZIKVEURVEUAIPeASiKoihFh0oKiqIoSg6VFBRFUZQcKikoiqIoOVRSUBRFUXKopKAoiqLksLXUiYUQRmAr4GB+nyVSyveFEIuAmubDygIJUsrAXK+rAhwDPpBSfpnfe5QvX176+vpaInxFUZQSa//+/fFSSve79VksKQDpQHspZbIQwg7YLoRYK6UceOsAIcRXQOI/Xvc1sLYgb+Dr60toaOgjC1hRFMUaCCHO3avPYklBaqviks1P7cyPnJVyQggBDADa52p7EjgDpFgqLkVRFOXeLDqmIISwEUKEAVeAdVLKPbm6g4HLUspT5mOdgTeADy0Zk6IoinJvFk0KUsps83iBN9BUCFEvV/dgICTX8w+Br6WUyeRDCDFWCBEqhAiNi4t79EEriqJYMUuOKeSQUiYIITYDXYEjQghboA/QONdhzYB+Qogv0AagTUKINCnltH+cayYwE6BJkyaqcJOiWFhmZiYxMTGkpaXpHYrygIxGI97e3tjZ2RX4NZacfeQOZJoTgiPQEfjc3N0RiJBSxtw6XkoZnOu1HwDJ/0wIiqIUvpiYGEqVKoWvry/aUKBSHEgpuXr1KjExMfj5+RX4dZa8feQJbBJChAP70MYUVpn7BpH31pGiKEVUWloa5cqVUwmhmBFCUK5cuQe+wrPk7KNwoOE9+obf57UfWCAkRVEekkoIxdPD/LmpFc1KsbEx4jJh0Ql6h6EoJZpKCkqxsOnEFUbNC6X/jJ2sOHhB73AUHSxfvhwhBBEREfc99ptvviE1NfWh32vu3LlMmDChwO3/hiXO+W+opKAUeVHxKUwKOUitiqVp7OPKi4vC+H5TJGrXQOsSEhJC69atWbhw4X2P/bdJwZqppKAUaSnpWYydH4rBIJj5TGPmjWxKr8BK/N9fJ3h7+WGysk16h6gUguTkZHbs2MHs2bPzJIXs7GxeffVV6tevT4MGDZg6dSrfffcdsbGxtGvXjnbt2gHg4uKS85olS5YwfPhwAP744w+aNWtGw4YN6dixI5cvXy5wTHFxcfTt25egoCCCgoLYsWMHJpMJX19fEhJu3+asVq0aly9fvuvxRVGhrFNQlIchpeS1JYeIvJLM/0Y2o7KbEwDfDAyksqsT0zZFcjExjWlPNcLFQf1VLgwf/nGUY7FJj/ScdSqV5v0edfM9ZsWKFXTt2pUaNWrg5ubGgQMHaNSoETNnzuTs2bMcPHgQW1tbrl27hpubG1OmTGHTpk2UL18+3/O2bt2a3bt3I4Rg1qxZfPHFF3z11VcFinvSpEm89NJLtG7dmvPnz9OlSxeOHz9Or169WL58OSNGjGDPnj34+vpSoUIFnnrqqbseX9Sof0lKkTVjyxnWHL7E24/XonX18pCZBjZ2CIMNr3apSaWyjry78ggDf9zFz8ODqFDaqHfIioWEhITw4osvAjBo0CBCQkJo1KgR69evZ9y4cdjaah9lbm5uD3TemJgYBg4cyMWLF8nIyHig+fzr16/n2LFjOc+TkpK4ceMGAwcOZPLkyYwYMYKFCxcycODAfI8valRSUIqkLSfj+OKvCHoEVGJMsD9Eroflz4GdIzz2JtQfwFPNquBZ1sj4Xw/QZ/pO5owIokaFUnqHXqLd7xu9JVy9epWNGzdy5MgRhBBkZ2cjhOCLL75ASlmgaZe5j8
k9b3/ixIm8/PLL9OzZk82bN/PBBx8UOC6TycSuXbtwdHTM096iRQsiIyOJi4tjxYoV/Oc//8n3+KJGjSkoRc65qylMXHCAmhVK8fmTNRHr3oVf+oJTOXB0hRXPwfTmcHgJ7aqX57dnW5CRbaLvDzvZeTpe7/CVR2zJkiUMHTqUc+fOERUVRXR0NH5+fmzfvp3OnTszY8YMsrKyALh27RoApUqVyvMtvEKFChw/fhyTycTy5ctz2hMTE/Hy8gJg3rx5DxRX586dmTbtdtGFsLAwQEtAvXv35uWXX6Z27dqUK1cu3+OLGpUUlCIlNSOLZ+fvRwjBzz3L4zT/Cdg5FZqMhLGbYOxmGPgr2NjB0lEwozX1krax/LkWVCxtZNjPe1l+MOZ+b6MUIyEhIfTu3TtPW9++fVmwYAGjR4+mSpUqNGjQgICAABYsWADA2LFj6datW85A82effUb37t1p3749np6eOef54IMP6N+/P8HBwfcdf/in7777jtDQUBo0aECdOnWYMWNGTt/AgQP55Zdfcm4d3e/4okQU52l9TZo0kWqTnZJDSsmEkIOsPXyRNe0uUSv0fTAYoOdUqNMr78EmExxdBps/hauRUKkhKa3eZNT20uw+e51XO9dgfLtqaiXuI3D8+HFq166tdxjKQ7rbn58QYr+UssndjldXCkqR8ePWM2wKP8sanxBq7XwZKtSFcdvvTAigJYv6/eD5PfDkD5B6FefFA1lg8z6vVr/Ml3+fVFNWFeUhqKSgFAlbT8ax6q8/2VTqfWpeWgVtXofhq6FslfxfaGMLgU/BhP3Q/WsMiTFMiH6JLR5TOLFvA6PmhZKcnlU4v4SilAAqKSi6Ox+fwq4FH7HM/j3cHTIRw/6A9u9oH/gFZWuvjTu8cBC6foZP9nmWOXzA8KjXeHvafC4nqb0AFKUgVFJQdJV6/RIXZ/TkDeaS5dcOw3M7wS/4/i+8FzsjNH8OJoVBxw9pbYziuxsvcvybnkQd2/foAleUEkolBUU38swWMqe1JDAzjFON38Np6GJwLvdoTm7vDK1fxO7lw1xu/DJNTIep8lsn4uYOgfjIR/MeilICqaSgFL7sLNjwX/hfL+Iz7VnR5H9U7/EKWGKmkLE0FXq8T+KzoSy074fT2b+R04JgxfNwPerRv5+iFHMqKSiFK+E8zH0ctn3J4uy2TK0+iwHdH7f423p5evHESz/wUsV5zMrqSlb4EuTUxrDqJUhUpbiLOhsbGwIDA3MeUVFRbN68me7du9/zNQEBAQwePDhP2/Dhw3FycsqzsG3SpEkIIYiP1xY+5i6edzfp6el07NiRwMBAFi1a9C9+q0fnk08+eWTnUklBKTzHVsKM1pguHeVNMYnZbq/w8YDmhbaWoIyjHVPHdOZovddpmTqFXWV7Ig/Mh+8awto3IflKocShPDhHR0fCwsJyHr6+vvkef2v18tatW0lJScnTV61aNVauXAlopSc2bdqUs6q5IA4ePEhmZiZhYWF5FqflJzs7u8DnfxgqKSjFS0Yq/DEJfhuKya0qoxy/YY1sxY/PNMa5kKubOtja8PXAQAa0C+Kp2H68UnEOmfX6w96Z8G0ArHsfUq8VakzKo7dgwQKeeeYZOnfuzO+//56nb/DgwTnf8Ddv3kyrVq1yCurdz5UrVxgyZAhhYWEEBgZy+vRpNmzYQMOGDalfvz4jR44kPT0dAF9fXyZPnkzr1q1ZvHgxf//9Ny1atKBRo0b079+f5ORkAPbt20fLli0JCAigadOm3Lhxg6ioKIKDg2nUqBGNGjVi586dAFy8eJE2bdoQGBhIvXr12LZtG2+++SY3b94kMDCQp59++l//v1MF8RTLunwMloyEuOPIlpN4Jb47m8/GMWd4Q3zLO+sSkhCCV7vUxMvVkf+sOELEzUH8b/h4yod+DTu+hX2zocV4aPE8GMvoEmORtfZNuHT40Z6zYn3o9lm+h9z60APw8/PLU7/obhYtWsS6des4ceIE06ZNy3
MbqXr16qxcuZLr168TEhLCkCFDWLt2bYFC9fDwYNasWXz55ZesWrWKtLQ0HnvsMTZs2ECNGjUYOnQoP/zwQ05FV6PRyPbt24mPj6dPnz6sX78eZ2dnPv/8c6ZMmcKbb77JwIEDWbRoEUFBQSQlJeHo6IiHhwfr1q3DaDRy6tQpBg8eTGhoKAsWLKBLly688847ZGdnk5qaSnBwMNOmTXtktZRUUlAsQ0oI/Rn+ehscSsOQZcyK9WN5+HFe61KTx2p66B0hg5tWoWIZIxN+PUDPBRnMGTGFmsEva6UztnwGe2ZAh/cgaJTeoVq9W7ePCmLfvn24u7vj4+ODt7c3I0eO5Pr167i6uuYc06dPHxYuXMiePXv48ccfHzquEydO4OfnR40aNQAYNmwY33//fU5SuHV7affu3Rw7doxWrVoBkJGRQYsWLThx4gSenp4EBQUBULp0aQBSUlKYMGECYWFh2NjYcPLkSQCCgoIYOXIkmZmZPPnkkzmJ8lFSSUF59FKvwR8vwPE/oGoH6D2D7RcNfLp2D93qVeT5x6rqHWGOdjU9WPRsC0bO3Ue/GTv5cUhjWg74H1w8BH+9A6tfAZ9W4FFL71CLhvt8oy8KQkJCiIiIyBl3SEpKYunSpYwePTrnmEGDBtGoUSOGDRuGwfDwd9HvVzvO2dk557hOnToREhKSpz88PPyuY2pff/01FSpU4NChQ5hMJoxGba+QNm3asHXrVlavXs0zzzzDa6+9xtChQx86/rtRYwrKo3VuF8wIhhNrodN/4eklRGe4MDHkANU8XPiyf0CRK1JXz6sMy8e3wrOMkWFzzFVWPQOg/zywd4HNj24QT7Esk8nE4sWLCQ8PJyoqiqioKFauXHnHh3GVKlX4+OOPef755//V+9WqVYuoqCgiI7W1L/Pnz6dt27Z3HNe8eXN27NiRc1xqaionT56kVq1axMbGsm+ftrDyxo0bZGVlkZiYiKenJwaDgfnz5+cMVJ87dw4PDw/GjBnDqFGjOHDgAAB2dnZkZmb+q9/lFpUUlEfDlA1bvtCmm9rYwai/odUL3MySPDt/P1kmyY/PNCn0geWC8irryOJxLWni48ZLiw4xdcMppJObNq5wbKV25aAUORs2bMDb2zvn8fnnn+Pl5ZVnNlGbNm04duwYFy9ezPPaZ599lqpV77xqTU1NzXPOKVOm3PP9jUYjc+bMoX///tSvXx+DwcC4cePuOM7d3Z25c+cyePBgGjRoQPPmzYmIiMDe3p5FixYxceJEAgIC6NSpE2lpaTz//PPMmzeP5s2bc/LkyZwrjs2bNxMYGEjDhg1ZunQpkyZNArRS4Q0aNHgkA82qdLby7yXFwrKxELUN6g+AJ74CY2mklLy4KIzfD8Xy87Ag2tXSfxzhfjKyTLyxNJzlBy8wKKgy/+1aGbupgVC5GTz9m97h6UKVzi7eHrR0tsW+tgkhjMBWwMH8PkuklO8LIRYBNc2HlQUSpJSBQoimwMxbLwc+kFLmP8VA0d+JtdpOaFkZWgnrgME5K5Nnbz/LyrBYXu1co1gkBAB7WwNTBgTg7erI1I2RJKRmMqPVJNjwIUTvhcpN9Q5RUSzKktfy6UB7KWWyEMIO2C6EWCulzFntIYT4Ckg0Pz0CNJFSZgkhPIFDQog/pJSq7nFRFbZASwgVG0C/OVC+Wk7Xzsh4Pl0bQde6FRnfrlo+Jyl6hBC80rkmtgYDX68/ycEWA2joPB02TIbhq/QOT1EsymJjClKTbH5qZ37k3KsS2mjjACDEfHxqrgRgzH2sUgRFroffJ4JfWxi1Lk9CiLmeyvgFB/Ar78yXA4rewHJBjWnjR1knO77fcQmCX9Fuj53ZondYuijOt5mt2cP8uVl0oFkIYSOECAOuAOuklHtydQcDl6WUp3Id30wIcRQ4DIy721WCEGKsECJUCBEaFxdnyfCVe4kNg0VDwb02DPxFK1dtdjMjO2dgeeYzjXEpogPLBeFkb8vQFr6sP36ZyMr9oL
QXbPyvtgbDihiNRq5evaoSQzEjpeTq1as501kLqlAGmoUQZYHlwEQp5RFz2w9ApJTyq7scXxuYB7SRUt5zdxQ10KyD61EwqxPYOmhXCKVvb4IupeSlRWGsPBTL7GFNaF+rgn5xPiLXUjJo+dkGujeoxJd+B2DVi/DUb1Cji96hFZrMzExiYmJIS1MbFRU3RqMRb29v7Ozs8rTrMtCcm5QyQQixGegKHBFC2AJ9gMb3OP64ECIFqAeoT/2iIvUa/NIPsjO0e+u5EgLAzzuiWBEWy8udapSIhADg5mzPoKAq/LrnHK906Iun6zfa1UK1Tto+0VbAzs4OPz8/vcNQConF/lYLIdzNVwgIIRyBjkCEubsjECGljMl1vJ85WSCE8EGboRRlqfiUB5R5ExYM1EpfD14I7jXzdO88Hc8na47TuU4FJhSzgeX7GdXaD5OE2Ttj4LG3tNo/x3+//wsVpRiy5FcdT2CTECIc2Ic2pnBr6sYgzAPMubRGm3EUhnar6XkpZbwF41MKypQNS0dDzD7o+xP4tMjTHXM9lQkLDuJbzomvBgRgMBTPgeV7qezmRI8GnoTsPU9i1SehfE3Y9In2/0VRShhLzj4Kl1I2lFI2kFLWk1JOztU3XEo54x/Hz5dS1pVSBkopG0kpV1gqNuUBSAlrX4eIVdD1M6jTK093WmY2437ZT2aWiZlDm1DKaHePExVvz7atSkpGNvP3RkO7tyH+BIRb52I2pWSzjpuiysPb/jXsmwUtJ0LzvMv3pZS8vfwwRy4k8fXAQKq6579jVXFW27M0j9V0Z86OKNKqP6Gtzdj8KWQ/mnozilJUqKSg3NuhhdpK3nr9oOPkPF0ZWSZeWxLOsgMXeLFjdTrWKRkDy/kZ17YqV1MyWLz/ArR/FxLOwcH5eoelKI+USgrK3Z3eCCvHg28wPDk9z0ybaykZDJm9hyX7Y5jUoTqTOlTXMdDC08zPjcDKZZm57QxZ/h3Auyls+T/IVFM1lZJDJQXlThfDtcVp5WvCoF+1NQlmkVeS6T19B2HRCXw7KJCXOtUotiuWH5QQgnFtqxJ97SZrjl6GDu/CjVhtMyFFKSFUUlDySjgPv/YDY2l4enGe7Sh3RMbTZ/oOktOyCBnTjF6BBd/svKToXKcC/u7OzNh8GukbDH5tYPsUSE++/4sVpRhQSUG5LfUa/NJXux0yZCmUuf2hv2DPeYb+vJeKZYysGN+Kxj5uOgaqH4NBMK5NVY5dTGLbqXhtbCElDvY+/JaOilKUqKSgaDLTYOFTWhmLwQvAQ6u/nm2S/HfVMd5efpjW1cqz9LmWVHZz0jdWnfVqWIkKpR2YseW0Vkq7ehfY8S3cTNA7NEX511RSULRFWMvGwPld0PtH8G0NQHJ6FmP/F8rs7WcZ3tKX2cNK7jqEB+Fga8Oo1n7sPH2V8JgEaP8OpCXCru/1Dk1R/jWVFKydlPDnW1rZhi6fQL0+AFxIuEm/H3ay+WQck3vV5YOedbG1UX9dbhnctAqljLba1YJngLaob/d0SFGL8JXiTf0rt3Y7v9PuhzcfDy3GAxAWnUCvaTu4cP0mPw8PYmgLX31jLIJKGe14prkPa49c4mx8CrR7BzJTYcc3eoemKP+KSgrWLHwxrHsP6vaGzh8BsDr8IgN/3IXRzsDS51vStoa7zkEWXSNa+WFnY2Dm1jNagcAGA2HvT5B08f4vVpQiSiUFa3Vmi7aVpk9r6P0jUgimbTzF+AUHqOdVhpXjW1GjQim9oyzS3Es50K+xN0v3x3AlKQ3avgGmLNh2xxYhilJsqKRgjS4dgUVDoFw1GPQr6djy8m+H+PLvkzwZWIlfRzejnIvD/c+jMDbYnyyTiZ93RIGbHzR8BvbPhevn9A5NUR6KSgrWJiFaW5xm7wJDlnA125Gnf9rD8oMXeLlTDb4eGIjRzkbvKIsN3/LOdKvnya+7z5GUlgltXgNhgK1f6B2aojwUlRSsyc3rWkLISIEhSziVVoYnp+/g8I
VEpg5uyAsdqltNyYpHaVzbqtxIz2LBnvPagr+gURAWAvGReoemKA9MJQVrkZkGC5+Gq6dh0K9sTfSgz/Sd3MwwsXBsc3oEVNI7wmKrvncZWlUrx8/bz5KelQ2tX9LqRW3+RO/QFOWBqaRgDUwmWDEOzu2A3jOYf9mHEXP34eXqyIrxLWlYxVXvCIu959pW48qNdJYfuAAuHtBsHBxZqo3fKEoxopKCNfj7P3B0OdkdJ/PB2dq8u+IIbWu4s+S5lni7WnfJikelVbVy1PMqzcytZ8g2SWj1AjiU0bbtVJRiRCWFkm7nNNj9PRmNxzLqZHPm7oxiVGs/fhraBBcHW72jKzFuldU+E5/CumOXwNEVWk6AE6vhwn69w1OUAlNJoSQ7shT+fofUqk/Q69TjbIu8ykdP1uPd7nWwMagB5UetWz1PfMo58cOWM0gpoflz4OgGGz/SOzRFKTCVFEqqs9tg+ThuVGhKx6iniUnKYN6Ipgxp7qN3ZCWWjUEwJtifQ9EJ7D5zDRxKaYPOpzdC1A69w1OUAlFJoSS6fAwWPs0NJ2/aXxiLrYMTy59vSevq5fWOrMTr19ib8i72WqE8gKDR4FJRu1qQUt/gFKUAVFIoaRIvIH/tR7K0o0vci/h6e7FifCuqeaiSFYXBaGfDiFZ+bDkZx9HYRLB3gjavwvmd2hWDohRxKimUJDcTMP3Sl/Tk6/S/8QrNGwXwy+hmuDnb6x2ZVRnSzAdnext+3HJGa2g0DMpUgY3/VVcLSpFnsaQghDAKIfYKIQ4JIY4KIT40ty8SQoSZH1FCiDBzeychxH4hxGHzf9tbKrYSKT2ZzPn9yI47xai0SXTv3Jmv+gfgYKtKVhS2Mk52PNWsCqvCY4m+lgq29vDYGxB7ECJW6x2eouTLklcK6UB7KWUAEAh0FUI0l1IOlFIGSikDgaXAMvPx8UAPKWV9YBgw34KxlSwZqaTP748hdj8vZ0/k6cHDGN+umipZoaNRrf2xMQh+2ma+WmgwSCtAuOljbTGhohRRFksKUpNsfmpnfuRcOwvtE2sAEGI+/qCUMtbcfRQwCiFUqc77yUwj7ZdB2MXs4i3TeIaMmMjj9T31jsrqVSxjpHdDL34LjeZqcjrY2MJjb8GVY3B02f1PoCg6seiYghDCxnx76AqwTkq5J1d3MHBZSnnqLi/tCxyUUqbf5ZxjhRChQojQuLg4ywReXGRlkPrr0xjPb+F9xjF49Cs08y+nd1SK2dg2VUnPMjFvZ5TWULcPeNTVVjlnZ+kam6Lci0WTgpQy23ybyBtoKoSol6t7MOarhNyEEHWBz4Fn73HOmVLKJlLKJu7uVrwrWHYWNxYMwylqPR+LMQwc85aqYVTEVPNwoVPtCszbdY6U9CwwGKD9O3DtNBy646++ohQJhTL7SEqZAGwGugIIIWyBPsCi3McJIbyB5cBQKeXpwoitWDJlkxAyilJn1vCVYQT9x71PPa8yekel3MW4x6qSeDOThfuitYaaj0OlRrDlc8i640JYUXRnydlH7kKIsuafHYGOQIS5uyMQIaWMyXV8WWA18JaUUi3/vBeTiashz1I2cgXTbYbQ+7mP1LaZRVijKq409XNj9rYzZGabQAho/x9IjIYD/9M7PEW5gyWvFDyBTUKIcGAf2pjCKnPfIO68dTQBqAa8m2vKqocF4yt+pOTyoomUO7WYn20H0mP8/+Hv7qJ3VMp9PNe2KrGJafweZp5HUbU9+LSCrf8HGan6Bqco/yBkMV5M06RJExkaGqp3GIVDSi4segmviDmE2Peh3fPTqVjWUe+olAKQUtLt222YpOTPSW0wGASc2wlzukGn/2plthWlEAkh9kspm9ytT61oLg6kJOq3N/CKmMMK+x50nPCDSgjFiBCCZ9v6c/JyMhsjrmiNPi2hagfY/jWkJekboKLkopJCMXBy8Xv4Hv+RtQ5dafvCbNxLG/UOSXlA3RtUwqus4+1CeaDNRLp5DfbM0C8wRfkHlRSKuC
O/TabGse/YaOxIy0n/w9VFrecrjuxsDIwO9iP03HVCo65pjV6NoVZ32DkVUq/pG6CimKmkUIQd+O1T6h37ih2ObWk2aQFlnFRCKM4GBlXG1cku79VCu7ch/YaWGBSlCFBJoYja+duXNDr2GaGOrWj0wm84O6qEUNw52dsyrKUv649f4eTlG1pjhbpQr692Cyn5ir4BKgoqKRRJGxd9S/OjHxHu2Iz6Ly7F0VGNIZQUw1r44miXq6w2aDWRstK1QWdF0ZlKCkWIlJLVC6bR9tj7nHRuRO0XV+DgoGYZlSSuzvYMDKrMyrALxCbc1BrLV4PAwbBvNiTF5n8CRbGwB0oKQghXIUQDSwVjzaSULFswgy4n3iXKuQHVX/gdOwcnvcNSLGB0sB8SmL397O3GNq9BdoaWGBRFR/dNCkKIzUKI0kIIN+AQMEcIMcXyoVkPk0ny6/yf6HHyHWKda+M3cRU2RrVSuaTydnWiZ0AlQvaeJyE1Q2t09YWa3eDAPFUTSdFVQa4Uykgpk9AK2M2RUjZGq12kPALZJsns+XPof/ptrjpXo/LE1RgcS+sdlmJhz7b1JzUjm/m7zt1uDBoNKXFwbKV+gSlWryBJwVYI4Ym2Ic6q+x2sFFxWtonv585jyJk3SHL2oeL4tQhHVf7aGtSqWJp2Nd2ZuzOKmxnZWqN/O213tr0z9Q1OsWoFSQqTgb+ASCnlPiGEP3C3jXGUB5CRZWLKnF8Zee4NbjpXwv35PxHOaoMcazKubVWupmSweL+5rLbBAEFjIGYfXDigb3CK1bpvUpBSLpZSNpBSPm9+fkZK2dfyoZVcaZnZfDJ7IeOiX8fk5I7buD/BxYo3DLJSTf3caFilLDO3niEr27xvc+BgsHOGfbP0DU6xWrb36hBCTCXXnsr/JKVUpR0fQmpGFpNnL+aNS69hcCqLy7NrobTaU9kaCSEY17Yqz87fz+rDF+kV6AXGMhAwEA7+Cp0/Aic3vcNUrEx+VwqhwP58HsoDupGWyVszl/LqpddxcHTGZcxqKFtZ77AUHXWqXYGq7s7M2HKGnDL2QWMgO11twqPo4p5XClLKebmfCyGcpZQplg+pZEpIzeCNn1Yw+drruBjtMY5eDW7+eoel6MxgEPgsJsgAACAASURBVDzbtiqvLwln66l42tZwhwp1wDdYW7PQciIYbPQOU7EiBVmn0EIIcQw4bn4eIISYbvHISpC0zGxemvkH7117C1cHMI78A8pX1zsspYh4MtCLiqWNzNicq1Be0zGQeB5O/qVfYIpVKsjso2+ALsBVACnlIaCNJYMqaaYv+ZPJ197Awz4D+xG/a98EFcXM3tbAqNZ+7DpzlUPRCVpjzSegVCXY95O+wSlWp0BlLqSU0f9oyrZALCXSrg0rGRkxBjf7LOyGrwTPAL1DUoqgwc2qUMrBlp+2mQvl2dhCk5FweiPEqxngSuEpSFKIFkK0BKQQwl4I8SrmW0lK/q7umEeTrSO4YeuG/XObwauR3iEpRZSLgy1PNavCmsMXib6WqjU2HgYGOzU9VSlUBUkK44DxgBcQAwSanyv3IiXZGz+h3LoX2C9qI0avw66cr95RKUXc8Fa+GIRgzo4orcHFA+r2hrAFkJ6sa2yK9ShIUhBSyqellBWklB5SyiFSyqsWj6y4ykqH5eOw2fo5i7PacO3JBXh7qnUIyv15lnGkR0AlFu07T+LNTK2x6RhIT4LwRfoGp1iNgiSFnUKIv4UQo4QQZS0eUXGWeg3m94HwhXyZ2Z+DjT7m8UAfvaNSipHRwX6kZGQTsve81uAdpI1D7f0J5D3XkirKI1OQMhfVgf8AdYEDQohVQoghFo+suLl2BmZ3Rsbs5R0xiXXlh/Jej7p6R6UUM3UrlaFVtXLM2XGWjCwTCAFNx0LccYjarnd4ihUo6OyjvVLKl4GmwDVg3n1eghDCKITYK4Q4JIQ4KoT40Ny+SAgRZn5ECSHCzO3lhBCbhBDJQohp/+J3KnzRe2FWR2
RqPJNdP2FpVgumPdUQo51adKQ8uNHB/lxOSmdVuHkXtnp9wdFVVU9VCkVBFq+VFkIME0KsBXYCF9GSw/2kA+2llAFog9NdhRDNpZQDpZSBUspAYCmwzHx8GvAu8OrD/CK6Oboc5nYHYxl+rTuLOTGV+LBnXapXKKV3ZEox9VgNd6p7uDBzq7n0hZ0jNHwGIlZD4gW9w1NKuIJcKRxC+1CfLKWsIaV8Q0p539pHUnNryoSd+ZFzU1QIIdD2aAgxH58ipdyOlhyKPim1jdYXD4dKDTnYeTHv78ygZ0AlBjRR9YyUhyeEYEywPxGXbrAj0jynI2gUSBPsn6NvcEqR8N7KI3m3c32ECpIU/KWULwHhD3pyIYSN+fbQFWCdlHJPru5g4LKUsvitzMnOhD8mwfoPoF5fEvovZvzyc3iVdeTj3vXQ8p2iPLxeDStR3sWBmbcWs7n6Qo2usH+u2q7TykVeucH83eeIu2GZvwcFSQrNH7b2kZQy23ybyBtoKoSol6t7MOarhAchhBgrhAgVQoTGxcU96Mv/vbREWDBA20s3+BVkn594fcVJ4pLTmfZUQ0oZ7Qo/JqXEcbC1YXhLH7aejOPEpRtaY9MxartOhakbIzHa2jAm2M8i5y+U2kdSygRgM9AVQAhhi7bn8wNPvpZSzpRSNpFSNnF3L+SNaRKi4eeucHYr9JwGHd7jf7uj+fvYZd7oWosG3mrGrvLoPN3MB0c7m9ulL/zbgVtVbXqqYpUiryTz+6FYhrb0oZyLg0Xew2K1j4QQ7rfWNQghHIGOQIS5uyMQIaWMeYBY9RV7EGZ1gMQYeHoJNHqGo7GJfLz6OO1reTCqtWWytmK9XJ3t6d/Em5VhF7iSlKZt19l0DMTs1f4+KlZn2sZTGG1tGBtsubL7lqx95AlsEkKEA/vQxhRWmfsGcZdbR0KIKGAKMFwIESOEKBrlRCPWwJzHwcYBRv0NVduRkp7FxAUHcXW248v+AWocQbGIUa39yDJJ5u6M0hoCzNt17lX1kKzN6TjzVUILH8oZUiy2mPFhax89f78XSSnDpZQNzfs715NSTs7VN1xKOeMur/GVUrpJKV2klN5SymMF/1UsZPcMWPgUuNeC0evBozYA7648QtTVFL4d1BA3Z3udg1RKKp9yznSpU5Ff95wnJT0LHMtq23UeWaKtoFesxrSNkTjY2jCmtQ/M7w3LxljkfQqyojn+n7WPgLctEk1RYsqGNa/Dn29ArSdg+GooVQGApftjWHbgAhPbV6e5fzmdA1VKujFt/Em8mcniUPNd3KAxkJUGB+frG5hSaM7EJbMy7AJDmleh/MmFcDFMm41mAQUaU7iLAY80iqImPRkWPg17f4QWE2DA/8DeCdD+cN5deYSmfm5MbF9N50AVa9DYx5VGVcoye8dZsk1S26TJp7VWUtuktjaxBtM2RmJva+DZIFfYMFnbrrVeX4u818MmhZJ7Az3pIsx9HE79BY9/CV0+ztkjNy0zmwkLDuJga+DbQYHY2jzs/z5FeTBj2/gTfe0mfx29pDU0HQMJ5+HU3/oGpljc2fgUVoRdYEgzH8rv+RzSkqDbF1pdLAu456eaEMLtHo9ylNSkcPkozOoI8ZEweKH2Dy+Xz9ZGcOxiEl/2D8CzjKNOQSrWqFOdiviUc7pd+qKWebtOVQ+pxJu68RT2tgaer5mkLV5sNs6iW/rm91V3PxBq/m/uRyiQYbGI9BK5HmZ3AZkNI9dCjS55uv8+eom5O6MY2cqPDrUr6BSkYq1sDIJRrf0Ii05g/7nrYGMHTUaYt+uM1Ds8xUKi4lNYGRbL000r47b5HXB2h8fetOh73jMpSCn9pJT+5v/+82G5SbJ6CJ0Dvw4AVx8YveGOfZQvJNzktSXh1PcqwxvdauoUpGLt+jX2poyj3e3FbI3Udp0l3dSNkdgaBJPK7YULodD5v2AsbdH3tO6b4iYTrHsPVr0IVdvByD+hjFeeQ7KyTUwKOUhWtompgxviYKvKYS
v6cLK35ZnmPvx97DJn41O02XB1n4SwX9V2nSXQuavaWMKoxq6U3v4RVGkBDQZa/H2tNylk3oQlw2HHt9BkJAxeBA53lrv+dsMpQs9d55M+9fEt71z4cSpKLkNb+mBnMPDzrQqZTceq7TpLqGnmq4QJLIKb1+Hx/7PY4HJu1pkUkuNgXg849jt0/giemAI2tncctiMynmmbIhnQxJtegV53OZGiFC6PUkaebFiJxfujuZ6SoW3XWbGB2q6zhDl3NYVlBy/wUv00nMLnQtBoqFi/UN7bOpPCjVht+8wB86DlxLtm3/jkdF5cFIZ/eWc+6Km21VSKjtHB/qRlmvhl97m823We26F3aMoj8v2mSGwMMDJxOji6Qbt3Cu2985uSWl8IsVsIES2EmCmEcM3Vt7dwwrMQzwCYFA51et2122SSvPLbIRJvZjLtqUY42d95FaEoeqlRoRSP1XRn3q4o0jKzoX4/tV1nCXL+aipLD1zg86rHsI/dC50+1MqbFJL8rhR+AD4A6gMnge1CiKrmvuK/aYCDyz27Zm0/w5aTcbzXvQ61PS070q8oD2NMsD/xyRmsDLtwe7vO46vUdp0lwPebIiljuEnPuBng1QQCnirU988vKbhIKf+UUiZIKb8EJgB/CiGak2tbzZLm4PnrfPHnCbrVq8jTzaroHY6i3FXLquWo41man7adxWSSarvOEiL6WipLD8TwfaW/sEmNhye+1EqmF6L83k0IIcrceiKl3AT0BeYDPpYOTA+JNzOZGHKQCqWNfNa3gSqHrRRZQgjGtPEj8koyW07Gmbfr7KK26yzmvt8USS0RTfO4JdrixEoNCz2G/JLC50Dt3A1SynCgA7DMkkHpQUrJ28sOczExjalPNaSMY/G/Q6aUbN0bVKJiaSMzt5oXs+Vs1/m7voEpDyX6WipL9kcz1TUEYSwN7d/VJY78VjQvkFLuBhBCuAghnM3t56WUlinkraOQvdGsPnyRVzvXpFEV1/u/QFF0ZmdjYEQrX3aducqRC4ng3968XacacC6Opm+OpIdhF37JB6HD++Dkpksc+d6sEkI8J4Q4D5xD24HtnBDivhvsFDcnLt3gwz+OEly9PM+2KVkVPJSSbXCzKrg42GqlLwwGbT57zF6IDdM7NOUBxFxPZU3oKT50DNFuGTUaqlss+U1J/Q/QA3hMSllOSukGtAO6mftKhJsZ2UxYcIBSRjumDAjEYFDjCErxUdpox8CgyqwKv0hswk0IfArsnGDfT3qHpjyA6ZtPM8FmOaUz47WS/Qb9yunkd6XwDNBHSnnmVoP55wGAfmnsEfvwj6NExiXzzcBA3Es56B2OojywEa18AZiz46w2n73BQDistussLi4k3ORA6G5G2qzRphZ7N9E1nnxvH0kp0+7SdhMwWSyiQvT7oVgW7ovm+ceq0rp6eb3DUZSH4u3qxOP1PQnZG01SWqY24Ky26yw2pm88xbs2cxEOLtDxA73DyTcpxAghOvyzUQjRHrhouZAKx/mrqby97DCNfVx5sWMNvcNRlH9lTLAfyelZLNobDRXqgk8rtV1nMXAh4SZJB5bQynAEQ4d3wVn/L6f5JYUXgB+FEHOFEBOFEBOEEPOAmWgL2YqtjCwTE0IOYBDw7aBA7NS2mkox18C7LM383Jiz4yyZ2aZc23Wu0zs0JR+zNhzmbZv5ZLjX06o1FwH5TUk9CtQDtgK+gL/553rmvmLrwPnrHL+YxBf9AvB2ddI7HEV5JMa28Sc2MY01hy9Cre5QylNNTy3CYhNu4hE2DU9xDfseX+k6uJzbPSu9CSGqARWklD//oz1YCBErpTxt8egspLl/OTa/1g6vsmqfZaXkaFfTA393Z37adoaeAZUQTUbCpo+17TrLV9M7POUffvtzE88bVpFSuz/OVZrrHU6O/O6bfAPcuEv7TXNfsaYSglLSGAyCMcH+HLmQxK4zV9V2nUXYxYRUGh37DJONEecnPtE7nDzySwq+5rIWeUgpQ9FuJ+
VLCGEUQuwVQhwSQhwVQnxobl8khAgzP6KEEGG5XvOWECJSCHFCCNHlIX4fRbFqvRt6Uc7ZnlnbzmrbddbppbbrLII2rZxLG8MhbrZ+A1w89A4nj/ySgjGfvoJ8zU4H2kspA4BAoKsQormUcqCUMlBKGQgsxVxHSQhRBxgE1AW6AtOFEEXjJpuiFBNGOxuGtvBlY8QVIq/cuL1d5+Hf9A5NMbsUf402Z77iktEf17bj9Q7nDvklhX1CiDtqHAkhRgH773diqbn19cTO/MgpuS20EqQDgBBzUy9goZQyXUp5FogEmhbot1AUJceQ5lVwsDVoVwuVm6rtOouYiCWT8RbxGJ748q7bAOstv6TwIjBCCLFZCPGV+bEFGA1MKsjJhRA25ttDV4B1Uso9ubqDgctSylPm515AdK7+GHPbP885VggRKoQIjYuLK0gYimJVyrk40K+xN8sOXCAuOUObnnrlmNquswiIPx9Bi4u/cLBMJzzq37EMrEjIb0rqZSllS+BDIMr8+FBK2UJKeakgJ5dSZptvE3kDTYUQ9XJ1D+b2VQLA3YoO3fHVRko5U0rZRErZxN3dvSBhKIrVGdXaj0yTifm7oqBePzCWVdNTi4CrS14mExs8+n6hdyj3dN9VW1LKTVLKqebHxod5EyllArAZbawAIYQt0AdYlOuwGKByrufeQOzDvJ+iWDt/dxc61q7A/N3nuIkDNDJv15mk/knp5XrY79RM2sFmzxF4VSm61ZgttpRXCOEuhChr/tkR6AhEmLs7AhFSyphcL/kdGCSEcBBC+AHVgb2Wik9RSroxwf5cT81kyYEYaGLerjNUbdepi8w05Jo3OCW9aNDnLb2jyZcl6zt4ApuEEOHAPrQxhVXmvkHkvXV0awX1b8Ax4E9gvJRSFW5RlIcU5OtKQOWyzN52huyyvlC9s7aHs9qus9Alb5qCW0YsG31fpYpHmfu/QEcWSwpSynApZUMpZQMpZT0p5eRcfcOllDPu8pqPpZRVpZQ1pZRrLRWbolgDIQRjgv2IuprK+uOXtemparvOwnf9HA67vmZ1djO69hyodzT3pSrBKUoJ1rVuRbxdHflp6xmo2h7c/NUGPIUsbfWbZJoEB2q/ik85Z73DuS+VFBSlBLO1MTCylR+h565zICYRgsZA9B61XWdhiVyPMXIN32c/yTOdW+kdTYGopKAoJdyAoMqUNtoya9sZtV1nYcpKJ2v1a0TJisTVG4Nv+aJ/lQAqKShKiefiYMtTzXz488glzqfaQ4MBarvOwrDre2yvn+H9zGE817GO3tEUmEoKimIFhrf0xcYg+HnHWe0WUlYaHPxF77BKrsQY5Jb/Y50Molzg4/gVk6sEUElBUaxCxTJGegRU4rfQaBJK11DbdVraX++QlZ3F5IwhTGxfXe9oHohKCopiJcYE+5Oakc2ve85D0GhIOAcn/9Q7rJLnzGY4toLp2b0ICgwsVlcJoJKColiN2p6lCa5ennk7o0iv/ji4+sHf/4GMVL1DKzmyMmDN61x38OKHzCeY0L747XinkoKiWJExwf5cuZHO74fjoOd3cO0MbP5U77BKjj0zIP4Eb6c+TbdAP/zdXfSO6IGppKAoViS4enlqVSzFrG1nkb7B2padu6bBhQN6h1b8JV2ELZ9zqmwr/soKLJZXCaCSgqJYFSEEo4P9OXH5BltPxUOnyeBSAVZO0G59KA9v3bvI7EzGXx1Aj4BKVC2GVwmgkoKiWJ2eAZXwKOWgLWZzLAtPTIErR2HHt3qHVnyd+BMOL2Znxac5leVe7GYc5aaSgqJYGXtbA8Nb+bLtVDzHYpOg1uNQtw9s/QKuRNz/BEpe18/B8mfJ8qjH+Oh29GhQiWoexfMqAVRSUBSr9HRTH5ztbfh2w0mtodsXYO8Mv09QaxceRFYGLB4O0sTsSh+SmGnDCx2K51jCLSopKIoVKuNkx7i2Vfnr6GVCo66Bizt0/Rxi9qltOx/Eunch9gBJXb7huwNZdG9QiW
oepfSO6l9RSUFRrNSoYD88SjnwyZrjSCm1mkjVOsGGyXA9Su/wir6jK7QpqM2f592T/mRkm5jUofiOJdyikoKiWCkne1te7lSDA+cT+OvoJRACun8NwgB/TAIp9Q6x6Lp6Wpux5dWEzVXGszIslvHtqhXrsYRbVFJQFCvWr7E31T1c+PzPE2Rmm6BsZej4gVaqIexXnaMrojJvwuJhYGNLaq/ZvPP7Sap5uPDcY1X1juyRUElBUayYrY2BN7vV4mx8CiF7z2uNTUZBlZbw19tw45K+ARZFf74Jlw5D7x/5ck8qsYk3+bxvfRxsbfSO7JFQSUFRrFz7Wh4093fj2/WnuJGWCQYD9JwKmWmw5lW9wytaDi2C/XOh9UuEOTZjzs6zDGnmQ2MfN70je2RUUlAUKyeE4K1utbmaksHMrWe0xvLVoN1bcPwPOLZS3wCLiisRsOpF8GlFZtu3eXNpOBVKGXm9a029I3ukVFJQFIWAymXpEVCJn7ad4XJSmtbYYiJ4BsDqV9UubRkp8NtQbS1H39nM3H6eiEs3+O+T9ShltNM7ukdKJQVFUQB4rXNNsk2Sr9eZF7TZ2ELPaZB6VSuxba2khFUvQ/xJ6DuLM+ml+HbDKR6vX5FOdSroHd0jp5KCoigAVCnnxDPNffktNJqTl29ojZ4NoPWL2kykyA36BqiXA/+D8IXw2FuYfNvy1rLDGG0NfNCzrt6RWYTFkoIQwiiE2CuEOCSEOCqE+DBX30QhxAlz+xfmNnshxBwhxGHzax6zVGyKotzdxPbVcHaw5bO1uWogtXkdylWHP16E9GT9gtPDxXBY8xr4t4M2r/JbaDR7zl7j7cdr41HKqHd0FmHJK4V0oL2UMgAIBLoKIZoLIdoBvYAGUsq6wJfm48cASCnrA52Ar4QQ6kpGUQqRq7M949tVY2PEFXaejtca7YzQaxokRmurna1FWpK2HsHJDfr8xJXkTD5ec5zm/m4MDKqsd3QWY7EPXam59bXCzvyQwHPAZ1LKdPNxV8zH1AE25GpLAJpYKj5FUe5ueEtfKpUx8tnaCEwm86rmKs2h6RitLtL53foGWBikhN8nahVQ+/0MLu588MdR0rNMfNqnAUIIvSO0GIt+ExdC2AghwoArwDop5R6gBhAshNgjhNgihAgyH34I6CWEsBVC+AGNgZKbjhWliDLa2fBK55qExySy6vDF2x0d3ocy3tqHZWaafgEWhr0/wbEV0OE98GnJ30cvsebwJSZ1qI5feWe9o7MoiyYFKWW2lDIQ8AaaCiHqAbaAK9AceA34TWhp92cgBggFvgF2Aln/PKcQYqwQIlQIERoXF2fJ8BXFaj3Z0IvanqX5v78iSM8yl9J2cIEe32izcLb+n74BWlLMfm01d42u0PIFbqRl8t7Ko9SqWIqxbfz1js7iCuWevZQyAdgMdEX74F9mvr20FzAB5aWUWVLKl6SUgVLKXkBZ4NRdzjVTStlEStnE3d29MMJXFKtjYxC8/Xgtoq/dZP6uc7c7qnWEgKdgxzfaIGxJk3pN2x+hlCc8+QMYDHzx5wku30jjs74NsLMp+cOclpx95C6EKGv+2RHoCEQAK4D25vYagD0QL4RwEkI4m9s7AVlSymOWik9RlPwFV3cnuHp5pm6MJDE183ZHl4/B0VXbkCf7jov54stkghXPwY2L0H8uOLkRGnWN+bvPMaKlH4GVy+odYaGwZNrzBDYJIcKBfWhjCqvQbhP5CyGOAAuBYVJKCXgAB4QQx4E3gGcsGJuiKAXwZrdaJKVlMn1L5O1GJzd4/Eu4eAh2TdMvuEdt11Q4+aeW9Lwbk56VzZvLDuNV1pFXOtfQO7pCY2upE0spw4GGd2nPAIbcpT0KKFlFRBSlmKtbqQy9G3oxZ0cUQ1v44lXWUeuo0wtqdYfNn2r/LV+8t6Dk3C5Y/6H2ezUdC8D0TaeJvJLMnBFBODtY7KOyyCn5N8gURflXXumsfVf76u8TtxuFgCe+AhsH+OMF7d
ZLcZUSD0tGgKuPVh1WCE5dvsH0zZH0CqxEu5oeekdYqFRSUBQlX15lHRnZyo/lBy9wNDbxdkepitqtlnM7YP8c/QL8N0zZsGyMNsDcfx4Yy2AySd5cdhgXB1ve615H7wgLnUoKiqLc13OPVaWMo13e8hcADYeAX1tY9z4kxugT3L+x7Ss4vREe/0Kr8wT8uucc+89d5z9P1KGci4POARY+lRQURbmvMo52TGxfnW2n4tl6Mtf6ICGg53cgs7VKosVpX+czm2HTJ9BgIDQaBsDFxJt8/ucJgquXp08jL33j04lKCoqiFMiQ5lWo7ObIp2sjyDbl+vB39dVW/p76Cw4v1i2+B5J0EZaOBvea0P1rEAIpJe+uOEqWycTHT9Yv0aUs8qOSgqIoBeJga8NrXWpx/GISKw5eyNvZdCx4B8HaNyC5iFcayM6CpaO0jXP6z9M2zgHWHrnE+uOXeblTDaqUc9I5SP2opKAoSoF1r+9JgHcZvvr7BGmZ2bc7DDbahjwZyfDnG/oFWBCbzIPj3b8Bj1oAJKZqpSzqeZVmZCs/nQPUl0oKiqIUmMEgeLNbbWIT05izIypvp0ctaPMaHFkKEWt0ie++Tv4N26doYwgBA3OaP117nOupGXzWpwG2VlDKIj/W/dsrivLAWlQtR4daHkzfFMm1lIy8na1eBI+6sPplSEu8+wn0khANy8dChfrQ7fOc5l2nr7JwXzSjg/2o51VGxwCLBpUUFEV5YG90q0VKRhbTNkbm7bC1h15TIfkyrHtPn+DuJitDW6CWnQUD5oGdtjI7LTObt5cfpoqbEy92sJ5SFvlRSUFRlAdWo0IpBjSpzPzdUZy/mpq306sxtBgP++fC2a26xHeH9R9AzD4tYZWrmtP83YZTnI1P4dM+9XG0t9EvviJEJQVFUR7KS51qYGsw8H+5y1/c8tjb4OoHv78AGal39hemY7/D7u+h6bNQt/ft5tgkZm49Q7/G3rSqVl7HAIsWlRQURXkoFUobGRPsxx+HYjkUnZC3095JW9R2/Sxs/kSfAAGunYGV47Wrl84f5TRnmyRvLQunjKMd7zxeW7/4iiCVFBRFeWhj21alnLM9n6w5jvznama/NtB4OOz6Hi7sL/zgMtPgt2EgDNBvjjbeYTZ3ZxSHYhJ5v2ddXJ3t8zmJ9VFJQVGUh+biYMuLHauz5+w1NkZcufOATpPBpSKsnKgN9hamv96CS+HQ+0etAqpZ9LVUvvr7BO1qutOjgWfhxlQMWE+RcEVRLGJQ0yr8vCOKz9ZG0LaGe955/sYy0H0KhAzStvBs+/qDv4HJBOlJ2hTXtAS4mXD757TEvM9z/xx/ElpNgppdc04lpeQ/K44A8FFv6y1lkR+VFBRF+VfsbAy80bUm4345wJL9MQxqWiXvATW7Qb1+sOULrRSGvUuuD/CE+3+4pyUB+RTaEwYt+RjLgmNZ7efSlaDOk3ckod8PxbLlZBzv96hze8MgJQ+VFBRF+de61K1IYx9Xpqw7Sc/ASjjZ/+OjpdvnWonq+U/e/QS2jtqH+a0P9VKe4FH7zg97Y9lcx5l/diilVWu9j2spGXz4xzECK5dlaAvff/9Ll1AqKSiK8q8JIXj78Vr0/WEXs7ad5YUO1fMe4FwexmyEmNC7f8DbWn7fgo9WHyPpZiaf9a2PjUHdNroXlRQURXkkGvu40bVuRX7ccprBTavgXuofH/RuftpDB1tPxrHswAUmtq9GrYqldYmhuFCzjxRFeWRe71qTtCwT3204pXcoOVIzsnhnxWH83Z0Z366a3uEUeSopKIryyPi7u/BU0yos2Hue03HJeocDwNfrThJ97Saf9q6P0U6Vsvj/9u48xqryjOP498fisAyLsigyLKIsso3LqJi4tFatkiqlaqI2pnaJ0bTW1tgosYmWlFQ0qUukqVi32kZj1aZpGyGujWgUwTIsKiiiMoqKC2sEZebpH+ed2wvOAJeZO3Pv5fdJbubwnvec8z453Pvc8773vG
dPnBTMrF1ddfpoenTrwi3zWpj+ooMta9jIPQvWcNHxwzlh1IDObk5ZcFIws3Y1sLqKy089Z1snyAAACRZJREFUnHkrPmTxu591Wju+amzi2seWMrC6iuvOHtdp7Sg3Tgpm1u5+fPJhDO5Txax/tzD9RQe5Z8EaXlu3iZnTJtCvZ/dOaUM5KlpSkNRD0kJJ9ZJWSPpN3rorJa1M5Tensu6SHpC0TNLrkmYUq21mVly9DujG1WeM4dX3NjB/xYcddtympmD1+i08uriBW59cxZnjD+asiZ7KohDF/EnqduC0iNgiqTuwQNITQE9gGjA5IrZLGpzqXwBURcQkSb2A1yQ9FBHvFLGNZlYk5x9bwz0L1jB73kq+deTBdG/nx1xGBOs2bmNpwwbqGzaytGEDSxs2snnbDgAO6duDmdMmtusx9wdFSwqRXTM2//yge3oFcAVwU0RsT/WaZ9EKoLekbmSJ40tgU7HaZ2bF1a1rF2ZMHceP7l/Ewwvf45I23kX8+dYvqU8f/PVrs0TwyZbtAHTvKsYd0pdzaw+ltqY/k4f144hB1fv985b3RVFvXpPUFVgMHAHMiYiXJY0BTpY0C9gGXBMRrwCPkl1BrAN6Ab+MiK+NUkm6DLgMYPjw4buuNrMS8s2xg5ky6iBue+pNph9TQ3XV3n3kbN2+g2Xvb9zpKmDtZ18A2YwWhw+q5pQxA7MEUNOPI4f09c9N20lRk0JENAJHSeoP/F3SxHTMA4EpwHHAI5JGAccDjcChaf3zkp6KiLd32edcYC5AXV1d54xgmdlekcSMs49k2pwXmPuf1Vx95tiv1dm+o5E31m3eKQG89fEWmtK7e2j/ntQO68f3TxhBbU1/Jg7tS58eHjgulg6Z5iIiNkh6DjgLaAAeT91LCyU1AQOBi4F5EfEV8LGkF4A64O1WdmtmZaB2WH/OqT2Uu59fw4XHD2fL9h3Ur03dQA0beGPdZr5sbAJgQO8DmFzTj6mThlBb059JNf0YWF38eZHs/4qWFCQNAr5KCaEncDowm2yc4TTgudSVdADwCfAecJqkv5B1H00BbitW+8ys4/zqzLHMW76Ok2Y/k7sCqK7qxsShffnhSSNz3UBD+/f0Mw46WTGvFIYAD6RxhS7AIxHxL0kHAPdKWk42mPyDiAhJc4D7gOWAgPsiYmkR22dmHWT4gF7Mmj6JFe9vZHJNf2qH9WPUwGq6eLbSkqPOurGkPdTV1cWiRYs6uxlmZmVF0uKIqGtpnX+vZWZmOU4KZmaW46RgZmY5TgpmZpbjpGBmZjlOCmZmluOkYGZmOU4KZmaWU9Y3r0laD7zbhl0MJJtio5JUYkz5Kjm+So6tWaXHWC7xjYiIQS2tKOuk0FaSFrV2V1+5qsSY8lVyfJUcW7NKj7ES4nP3kZmZ5TgpmJlZzv6eFOZ2dgOKoBJjylfJ8VVybM0qPcayj2+/HlMwM7Od7e9XCmZmlqdskoKkYZKelfS6pBWSrkrlB0l6UtKb6e+BedvMkPSWpJWSvp1XPkvSWklb9nDMYyUtS/u4Q+mRUJJOkfSqpB2Szq+guC5P5UskLZA0vi2xlWB8l0pan+JbIuknFRTbrXlxrZK0oS2xlWiMIyQ9LWmppOck1ZRhbC3WUzt+prRZRJTFi+xJbsek5T7AKmA8cDNwXSq/DpidlscD9UAVcBiwGuia1k1J+9uyh2MuBE4kexLcE8DZqXwkMBn4M3B+BcXVN6/OuWTPzK6k83YpcGcl/p/cpc6VwL2VFiPwN7InNUL2SN8HyzC2FuvRjp8pbT7nnXnwNp7QfwBnACuBIXkneWVangHMyKs/Hzhxl320egLTvt7I+/dFwF271Lm/vU9gKcSVV/5EJZ032jkplFJsu9R7ETij0mIEVgA1aVnApnKKbW/qFeMzpdBX2XQf5ZM0EjgaeBk4OCLWAaS/g1O1ocDavM0aUtneGpq22dftC1YKcUn6qaTVZN+Wfl5YBLtXCvEB56Xuh0clDSsogN
0okdiQNILsW+wzBex3r5RAjPXAeWl5OtBH0oAC9t2qDoqtLJRdUpBUDTwG/CIiNu2uagtlhfzUqq3bF6RU4oqIORFxOHAt8OsC9rv7g5ZGfP8ERkbEZOAp4IEC9tv6AUsjtmYXAo9GRGMB+93zgUsjxmuAUyX9FzgVeB/YUcC+Wz5gx8VWFsoqKUjqTnby/hoRj6fijyQNSeuHAB+n8gYg/5tgDfDBbvbdNW+gbmbaPn8ga7fbt0WJxvUw8N19iaeFNpREfBHxaURsT+V3A8e2LbLSiS3PhcBD+xpPK+0oiRgj4oOI+F5EHA1cn8o2llFs5aEz+64K7O8T2SDMbbuU38LOg0I3p+UJ7Dwo9DZpUGhP/Xp5618hGxhqHvCa2t79f6UUFzA6r845wKJKOm+kfuK0PB14qVJiS+vGAu+Q7j9qj1cpxUg22VyXtDwLmFluse2pHiUwptBpB96HE3gS2aXaUmBJek0FBgBPA2+mvwflbXM92S8EVpL3Kw2y/vIGoCn9vbGVY9YBy9M+7mx+swHHpe22Ap8CKyokrtvJBvOWAM8CEyrsvP0uxVef4htXKbGldTcCN1Xw++78dLxVwJ+AqjKMrcV6tONnSltfvqPZzMxyympMwczMistJwczMcpwUzMwsx0nBzMxynBTMzCzHScGsAJIa081IKyTVS7pa0m7fR5JGSrq4o9po1hZOCmaF+SIijoqICWSTp00FbtjDNiMBJwUrC75PwawAkrZERHXev0eR3YE7EBgBPAj0Tqt/FhEvSnoJOBJYQzbf0h3ATcA3yO6OnRMRd3VYEGa74aRgVoBdk0Iq+xwYB2wGmiJim6TRwEMRUSfpG8A1EfGdVP8yYHBE/FZSFfACcEFErOnQYMxa0K2zG2BWAZpnz+wO3CnpKKARGNNK/TOByXlP2OoHjCa7kjDrVE4KZm2Quo8ayWbSvAH4CKglG6/b1tpmwJURMb9DGmlWAA80m+0jSYOAP5I9zS3IvvGvi4gm4BKga6q6mexxj83mA1ekaZuRNEZSb8xKgK8UzArTU9ISsq6iHWQDy79P6/4APCbpArJZWLem8qXADkn1ZFMj3072i6RX00Pp19NOz64waysPNJuZWY67j8zMLMdJwczMcpwUzMwsx0nBzMxynBTMzCzHScHMzHKcFMzMLMdJwczMcv4HEXedtCkCYNYAAAAASUVORK5CYII=",
"text/plain": [
"<Figure size 432x288 with 1 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
}
],
"source": [
"import matplotlib.pyplot as plt\n",
"plt.plot(X_test, y_test, label='Actual level')\n",
"plt.plot(X_test, flaml_y_pred, label='FLAML forecast')\n",
"plt.xlabel('Date')\n",
"plt.ylabel('CO2 Levels')\n",
"plt.legend()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## 3. Forecast Problems with Exogenous Variables"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Load Data and Preprocess\n",
"\n",
    "Load the NYC energy consumption dataset. The task is to predict the average hourly energy demand for a day given information on time, temperature, and precipitation. Temperature and precipitation are both continuous values. To demonstrate FLAML's ability to handle categorical values as well, we create a column with categorical values, where 1 denotes that the daily temperature is above the monthly average and 0 that it is below."
]
},
{
"cell_type": "code",
"execution_count": 1,
"metadata": {},
"outputs": [],
"source": [
"''' multivariate time series forecasting dataset'''\n",
"import pandas as pd\n",
"# pd.set_option(\"display.max_rows\", None, \"display.max_columns\", None)\n",
"multi_df = pd.read_csv(\n",
" \"https://raw.githubusercontent.com/srivatsan88/YouTubeLI/master/dataset/nyc_energy_consumption.csv\"\n",
")\n",
"# preprocessing data\n",
"multi_df[\"timeStamp\"] = pd.to_datetime(multi_df[\"timeStamp\"])\n",
"multi_df = multi_df.set_index(\"timeStamp\")\n",
"multi_df = multi_df.resample(\"D\").mean()\n",
    "multi_df[\"temp\"] = multi_df[\"temp\"].ffill()  # forward-fill missing temperature values\n",
    "multi_df[\"precip\"] = multi_df[\"precip\"].ffill()  # forward-fill missing precipitation values\n",
"multi_df = multi_df[:-2] # last two rows are NaN for 'demand' column so remove them\n",
"multi_df = multi_df.reset_index()"
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {},
"outputs": [],
"source": [
"''' Use feature engineering to create a categorical value'''\n",
    "# Using temperature values, create a categorical feature\n",
    "# where 1 denotes the daily temperature is above the monthly average and 0 is below.\n",
"\n",
"def get_monthly_avg(data):\n",
" data[\"month\"] = data[\"timeStamp\"].dt.month\n",
" data = data[[\"month\", \"temp\"]].groupby(\"month\")\n",
" data = data.agg({\"temp\": \"mean\"})\n",
" return data\n",
"\n",
"monthly_avg = get_monthly_avg(multi_df).to_dict().get(\"temp\")\n",
"\n",
"def above_monthly_avg(date, temp):\n",
" month = date.month\n",
" if temp > monthly_avg.get(month):\n",
" return 1\n",
" else:\n",
" return 0\n",
"\n",
"multi_df[\"temp_above_monthly_avg\"] = multi_df.apply(\n",
" lambda x: above_monthly_avg(x[\"timeStamp\"], x[\"temp\"]), axis=1\n",
")\n",
"\n",
    "del multi_df[\"month\"] # remove the helper month column added by get_monthly_avg"
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {},
"outputs": [
{
"data": {
"text/html": [
"<div>\n",
"<style scoped>\n",
" .dataframe tbody tr th:only-of-type {\n",
" vertical-align: middle;\n",
" }\n",
"\n",
" .dataframe tbody tr th {\n",
" vertical-align: top;\n",
" }\n",
"\n",
" .dataframe thead th {\n",
" text-align: right;\n",
" }\n",
"</style>\n",
"<table border=\"1\" class=\"dataframe\">\n",
" <thead>\n",
" <tr style=\"text-align: right;\">\n",
" <th></th>\n",
" <th>timeStamp</th>\n",
" <th>demand</th>\n",
" <th>precip</th>\n",
" <th>temp</th>\n",
" <th>temp_above_monthly_avg</th>\n",
" </tr>\n",
" </thead>\n",
" <tbody>\n",
" <tr>\n",
" <th>0</th>\n",
" <td>2012-01-01</td>\n",
" <td>4954.833333</td>\n",
" <td>0.002487</td>\n",
" <td>46.510000</td>\n",
" <td>1</td>\n",
" </tr>\n",
" <tr>\n",
" <th>1</th>\n",
" <td>2012-01-02</td>\n",
" <td>5302.954167</td>\n",
" <td>0.000000</td>\n",
" <td>40.496667</td>\n",
" <td>1</td>\n",
" </tr>\n",
" <tr>\n",
" <th>2</th>\n",
" <td>2012-01-03</td>\n",
" <td>6095.512500</td>\n",
" <td>0.000000</td>\n",
" <td>26.672500</td>\n",
" <td>0</td>\n",
" </tr>\n",
" <tr>\n",
" <th>3</th>\n",
" <td>2012-01-04</td>\n",
" <td>6336.266667</td>\n",
" <td>0.000000</td>\n",
" <td>20.585000</td>\n",
" <td>0</td>\n",
" </tr>\n",
" <tr>\n",
" <th>4</th>\n",
" <td>2012-01-05</td>\n",
" <td>6130.245833</td>\n",
" <td>0.000000</td>\n",
" <td>33.577500</td>\n",
" <td>1</td>\n",
" </tr>\n",
" <tr>\n",
" <th>...</th>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" </tr>\n",
" <tr>\n",
" <th>1864</th>\n",
" <td>2017-02-07</td>\n",
" <td>5861.319833</td>\n",
" <td>0.011938</td>\n",
" <td>39.020417</td>\n",
" <td>1</td>\n",
" </tr>\n",
" <tr>\n",
" <th>1865</th>\n",
" <td>2017-02-08</td>\n",
" <td>5667.644708</td>\n",
" <td>0.001258</td>\n",
" <td>47.305417</td>\n",
" <td>1</td>\n",
" </tr>\n",
" <tr>\n",
" <th>1866</th>\n",
" <td>2017-02-09</td>\n",
" <td>5947.661958</td>\n",
" <td>0.027029</td>\n",
" <td>29.242500</td>\n",
" <td>0</td>\n",
" </tr>\n",
" <tr>\n",
" <th>1867</th>\n",
" <td>2017-02-10</td>\n",
" <td>6195.122500</td>\n",
" <td>0.000179</td>\n",
" <td>25.048750</td>\n",
" <td>0</td>\n",
" </tr>\n",
" <tr>\n",
" <th>1868</th>\n",
" <td>2017-02-11</td>\n",
" <td>5461.026000</td>\n",
" <td>0.000492</td>\n",
" <td>37.175000</td>\n",
" <td>1</td>\n",
" </tr>\n",
" </tbody>\n",
"</table>\n",
"<p>1869 rows × 5 columns</p>\n",
"</div>"
],
"text/plain": [
" timeStamp demand precip temp temp_above_monthly_avg\n",
"0 2012-01-01 4954.833333 0.002487 46.510000 1\n",
"1 2012-01-02 5302.954167 0.000000 40.496667 1\n",
"2 2012-01-03 6095.512500 0.000000 26.672500 0\n",
"3 2012-01-04 6336.266667 0.000000 20.585000 0\n",
"4 2012-01-05 6130.245833 0.000000 33.577500 1\n",
"... ... ... ... ... ...\n",
"1864 2017-02-07 5861.319833 0.011938 39.020417 1\n",
"1865 2017-02-08 5667.644708 0.001258 47.305417 1\n",
"1866 2017-02-09 5947.661958 0.027029 29.242500 0\n",
"1867 2017-02-10 6195.122500 0.000179 25.048750 0\n",
"1868 2017-02-11 5461.026000 0.000492 37.175000 1\n",
"\n",
"[1869 rows x 5 columns]"
]
},
"execution_count": 3,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# split data into train and test, holding out the last 180 days for testing\n",
"num_samples = multi_df.shape[0]\n",
"multi_time_horizon = 180\n",
"split_idx = num_samples - multi_time_horizon\n",
"multi_train_df = multi_df[:split_idx]\n",
"multi_test_df = multi_df[split_idx:]\n",
"\n",
"multi_X_test = multi_test_df[\n",
" [\"timeStamp\", \"precip\", \"temp\", \"temp_above_monthly_avg\"]\n",
"]  # the test dataframe must include future values of the exogenous regressors\n",
"multi_y_test = multi_test_df[\"demand\"]\n",
"\n",
"multi_train_df"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Run FLAML"
]
},
{
"cell_type": "code",
"execution_count": 5,
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"[flaml.automl: 08-13 00:54:48] {2540} INFO - task = ts_forecast\n",
"[flaml.automl: 08-13 00:54:48] {2542} INFO - Data split method: time\n",
"[flaml.automl: 08-13 00:54:48] {2545} INFO - Evaluation method: holdout\n",
"[flaml.automl: 08-13 00:54:48] {2664} INFO - Minimizing error metric: mape\n",
"[flaml.automl: 08-13 00:54:48] {2806} INFO - List of ML learners in AutoML Run: ['lgbm', 'rf', 'xgboost', 'extra_tree', 'xgb_limitdepth', 'prophet', 'arima', 'sarimax']\n",
"[flaml.automl: 08-13 00:54:48] {3108} INFO - iteration 0, current learner lgbm\n",
"[flaml.automl: 08-13 00:54:50] {3241} INFO - Estimated sufficient time budget=13549s. Estimated necessary time budget=14s.\n",
"[flaml.automl: 08-13 00:54:50] {3288} INFO - at 1.7s,\testimator lgbm's best error=0.0854,\tbest estimator lgbm's best error=0.0854\n",
"[flaml.automl: 08-13 00:54:50] {3108} INFO - iteration 1, current learner lgbm\n",
"[flaml.automl: 08-13 00:54:50] {3288} INFO - at 1.7s,\testimator lgbm's best error=0.0854,\tbest estimator lgbm's best error=0.0854\n",
"[flaml.automl: 08-13 00:54:50] {3108} INFO - iteration 2, current learner lgbm\n",
"[flaml.automl: 08-13 00:54:50] {3288} INFO - at 1.8s,\testimator lgbm's best error=0.0525,\tbest estimator lgbm's best error=0.0525\n",
"[flaml.automl: 08-13 00:54:50] {3108} INFO - iteration 3, current learner lgbm\n",
"[flaml.automl: 08-13 00:54:50] {3288} INFO - at 1.8s,\testimator lgbm's best error=0.0525,\tbest estimator lgbm's best error=0.0525\n",
"[flaml.automl: 08-13 00:54:50] {3108} INFO - iteration 4, current learner lgbm\n",
"[flaml.automl: 08-13 00:54:50] {3288} INFO - at 1.9s,\testimator lgbm's best error=0.0406,\tbest estimator lgbm's best error=0.0406\n",
"[flaml.automl: 08-13 00:54:50] {3108} INFO - iteration 5, current learner lgbm\n",
"[flaml.automl: 08-13 00:54:50] {3288} INFO - at 1.9s,\testimator lgbm's best error=0.0406,\tbest estimator lgbm's best error=0.0406\n",
"[flaml.automl: 08-13 00:54:50] {3108} INFO - iteration 6, current learner lgbm\n",
"[flaml.automl: 08-13 00:54:50] {3288} INFO - at 1.9s,\testimator lgbm's best error=0.0406,\tbest estimator lgbm's best error=0.0406\n",
"[flaml.automl: 08-13 00:54:50] {3108} INFO - iteration 7, current learner lgbm\n",
"[flaml.automl: 08-13 00:54:50] {3288} INFO - at 2.0s,\testimator lgbm's best error=0.0393,\tbest estimator lgbm's best error=0.0393\n",
"[flaml.automl: 08-13 00:54:50] {3108} INFO - iteration 8, current learner lgbm\n",
"[flaml.automl: 08-13 00:54:50] {3288} INFO - at 2.0s,\testimator lgbm's best error=0.0393,\tbest estimator lgbm's best error=0.0393\n",
"[flaml.automl: 08-13 00:54:50] {3108} INFO - iteration 9, current learner lgbm\n",
"[flaml.automl: 08-13 00:54:50] {3288} INFO - at 2.0s,\testimator lgbm's best error=0.0393,\tbest estimator lgbm's best error=0.0393\n",
"[flaml.automl: 08-13 00:54:50] {3108} INFO - iteration 10, current learner lgbm\n",
"[flaml.automl: 08-13 00:54:50] {3288} INFO - at 2.1s,\testimator lgbm's best error=0.0393,\tbest estimator lgbm's best error=0.0393\n",
"[flaml.automl: 08-13 00:54:50] {3108} INFO - iteration 11, current learner lgbm\n",
"[flaml.automl: 08-13 00:54:50] {3288} INFO - at 2.2s,\testimator lgbm's best error=0.0357,\tbest estimator lgbm's best error=0.0357\n",
"[flaml.automl: 08-13 00:54:50] {3108} INFO - iteration 12, current learner lgbm\n",
"[flaml.automl: 08-13 00:54:50] {3288} INFO - at 2.2s,\testimator lgbm's best error=0.0357,\tbest estimator lgbm's best error=0.0357\n",
"[flaml.automl: 08-13 00:54:50] {3108} INFO - iteration 13, current learner lgbm\n",
"[flaml.automl: 08-13 00:54:50] {3288} INFO - at 2.3s,\testimator lgbm's best error=0.0334,\tbest estimator lgbm's best error=0.0334\n",
"[flaml.automl: 08-13 00:54:50] {3108} INFO - iteration 14, current learner lgbm\n",
"[flaml.automl: 08-13 00:54:50] {3288} INFO - at 2.3s,\testimator lgbm's best error=0.0334,\tbest estimator lgbm's best error=0.0334\n",
"[flaml.automl: 08-13 00:54:50] {3108} INFO - iteration 15, current learner lgbm\n",
"[flaml.automl: 08-13 00:54:50] {3288} INFO - at 2.3s,\testimator lgbm's best error=0.0334,\tbest estimator lgbm's best error=0.0334\n",
"[flaml.automl: 08-13 00:54:50] {3108} INFO - iteration 16, current learner lgbm\n",
"[flaml.automl: 08-13 00:54:50] {3288} INFO - at 2.4s,\testimator lgbm's best error=0.0334,\tbest estimator lgbm's best error=0.0334\n",
"[flaml.automl: 08-13 00:54:50] {3108} INFO - iteration 17, current learner lgbm\n",
"[flaml.automl: 08-13 00:54:50] {3288} INFO - at 2.4s,\testimator lgbm's best error=0.0334,\tbest estimator lgbm's best error=0.0334\n",
"[flaml.automl: 08-13 00:54:50] {3108} INFO - iteration 18, current learner lgbm\n",
"[flaml.automl: 08-13 00:54:50] {3288} INFO - at 2.5s,\testimator lgbm's best error=0.0334,\tbest estimator lgbm's best error=0.0334\n",
"[flaml.automl: 08-13 00:54:50] {3108} INFO - iteration 19, current learner lgbm\n",
"[flaml.automl: 08-13 00:54:50] {3288} INFO - at 2.5s,\testimator lgbm's best error=0.0334,\tbest estimator lgbm's best error=0.0334\n",
"[flaml.automl: 08-13 00:54:50] {3108} INFO - iteration 20, current learner lgbm\n",
"[flaml.automl: 08-13 00:54:51] {3288} INFO - at 2.6s,\testimator lgbm's best error=0.0334,\tbest estimator lgbm's best error=0.0334\n",
"[flaml.automl: 08-13 00:54:51] {3108} INFO - iteration 21, current learner lgbm\n",
"[flaml.automl: 08-13 00:54:51] {3288} INFO - at 2.6s,\testimator lgbm's best error=0.0332,\tbest estimator lgbm's best error=0.0332\n",
"[flaml.automl: 08-13 00:54:51] {3108} INFO - iteration 22, current learner lgbm\n",
"[flaml.automl: 08-13 00:54:51] {3288} INFO - at 2.7s,\testimator lgbm's best error=0.0332,\tbest estimator lgbm's best error=0.0332\n",
"[flaml.automl: 08-13 00:54:51] {3108} INFO - iteration 23, current learner lgbm\n",
"[flaml.automl: 08-13 00:54:51] {3288} INFO - at 2.7s,\testimator lgbm's best error=0.0332,\tbest estimator lgbm's best error=0.0332\n",
"[flaml.automl: 08-13 00:54:51] {3108} INFO - iteration 24, current learner lgbm\n",
"[flaml.automl: 08-13 00:54:51] {3288} INFO - at 2.8s,\testimator lgbm's best error=0.0332,\tbest estimator lgbm's best error=0.0332\n",
"[flaml.automl: 08-13 00:54:51] {3108} INFO - iteration 25, current learner lgbm\n",
"[flaml.automl: 08-13 00:54:51] {3288} INFO - at 2.9s,\testimator lgbm's best error=0.0332,\tbest estimator lgbm's best error=0.0332\n",
"[flaml.automl: 08-13 00:54:51] {3108} INFO - iteration 26, current learner lgbm\n",
"[flaml.automl: 08-13 00:54:51] {3288} INFO - at 2.9s,\testimator lgbm's best error=0.0332,\tbest estimator lgbm's best error=0.0332\n",
"[flaml.automl: 08-13 00:54:51] {3108} INFO - iteration 27, current learner lgbm\n",
"[flaml.automl: 08-13 00:54:51] {3288} INFO - at 3.0s,\testimator lgbm's best error=0.0332,\tbest estimator lgbm's best error=0.0332\n",
"[flaml.automl: 08-13 00:54:51] {3108} INFO - iteration 28, current learner lgbm\n",
"[flaml.automl: 08-13 00:54:51] {3288} INFO - at 3.0s,\testimator lgbm's best error=0.0332,\tbest estimator lgbm's best error=0.0332\n",
"[flaml.automl: 08-13 00:54:51] {3108} INFO - iteration 29, current learner lgbm\n",
"[flaml.automl: 08-13 00:54:51] {3288} INFO - at 3.1s,\testimator lgbm's best error=0.0332,\tbest estimator lgbm's best error=0.0332\n",
"[flaml.automl: 08-13 00:54:51] {3108} INFO - iteration 30, current learner lgbm\n",
"[flaml.automl: 08-13 00:54:51] {3288} INFO - at 3.1s,\testimator lgbm's best error=0.0332,\tbest estimator lgbm's best error=0.0332\n",
"[flaml.automl: 08-13 00:54:51] {3108} INFO - iteration 31, current learner lgbm\n",
"[flaml.automl: 08-13 00:54:51] {3288} INFO - at 3.2s,\testimator lgbm's best error=0.0332,\tbest estimator lgbm's best error=0.0332\n",
"[flaml.automl: 08-13 00:54:51] {3108} INFO - iteration 32, current learner lgbm\n",
"[flaml.automl: 08-13 00:54:51] {3288} INFO - at 3.2s,\testimator lgbm's best error=0.0332,\tbest estimator lgbm's best error=0.0332\n",
"[flaml.automl: 08-13 00:54:51] {3108} INFO - iteration 33, current learner lgbm\n",
"[flaml.automl: 08-13 00:54:51] {3288} INFO - at 3.3s,\testimator lgbm's best error=0.0332,\tbest estimator lgbm's best error=0.0332\n",
"[flaml.automl: 08-13 00:54:51] {3108} INFO - iteration 34, current learner lgbm\n",
"[flaml.automl: 08-13 00:54:51] {3288} INFO - at 3.3s,\testimator lgbm's best error=0.0332,\tbest estimator lgbm's best error=0.0332\n",
"[flaml.automl: 08-13 00:54:51] {3108} INFO - iteration 35, current learner lgbm\n",
"[flaml.automl: 08-13 00:54:51] {3288} INFO - at 3.4s,\testimator lgbm's best error=0.0332,\tbest estimator lgbm's best error=0.0332\n",
"[flaml.automl: 08-13 00:54:51] {3108} INFO - iteration 36, current learner lgbm\n",
"[flaml.automl: 08-13 00:54:51] {3288} INFO - at 3.5s,\testimator lgbm's best error=0.0332,\tbest estimator lgbm's best error=0.0332\n",
"[flaml.automl: 08-13 00:54:51] {3108} INFO - iteration 37, current learner lgbm\n",
"[flaml.automl: 08-13 00:54:51] {3288} INFO - at 3.5s,\testimator lgbm's best error=0.0332,\tbest estimator lgbm's best error=0.0332\n",
"[flaml.automl: 08-13 00:54:51] {3108} INFO - iteration 38, current learner lgbm\n",
"[flaml.automl: 08-13 00:54:52] {3288} INFO - at 3.6s,\testimator lgbm's best error=0.0332,\tbest estimator lgbm's best error=0.0332\n",
"[flaml.automl: 08-13 00:54:52] {3108} INFO - iteration 39, current learner lgbm\n",
"[flaml.automl: 08-13 00:54:52] {3288} INFO - at 3.7s,\testimator lgbm's best error=0.0332,\tbest estimator lgbm's best error=0.0332\n",
"[flaml.automl: 08-13 00:54:52] {3108} INFO - iteration 40, current learner lgbm\n",
"[flaml.automl: 08-13 00:54:52] {3288} INFO - at 3.8s,\testimator lgbm's best error=0.0332,\tbest estimator lgbm's best error=0.0332\n",
"[flaml.automl: 08-13 00:54:52] {3108} INFO - iteration 41, current learner lgbm\n",
"[flaml.automl: 08-13 00:54:52] {3288} INFO - at 3.9s,\testimator lgbm's best error=0.0321,\tbest estimator lgbm's best error=0.0321\n",
"[flaml.automl: 08-13 00:54:52] {3108} INFO - iteration 42, current learner lgbm\n",
"[flaml.automl: 08-13 00:54:52] {3288} INFO - at 4.0s,\testimator lgbm's best error=0.0321,\tbest estimator lgbm's best error=0.0321\n",
"[flaml.automl: 08-13 00:54:52] {3108} INFO - iteration 43, current learner lgbm\n",
"[flaml.automl: 08-13 00:54:52] {3288} INFO - at 4.1s,\testimator lgbm's best error=0.0321,\tbest estimator lgbm's best error=0.0321\n",
"[flaml.automl: 08-13 00:54:52] {3108} INFO - iteration 44, current learner lgbm\n",
"[flaml.automl: 08-13 00:54:52] {3288} INFO - at 4.2s,\testimator lgbm's best error=0.0321,\tbest estimator lgbm's best error=0.0321\n",
"[flaml.automl: 08-13 00:54:52] {3108} INFO - iteration 45, current learner lgbm\n",
"[flaml.automl: 08-13 00:54:52] {3288} INFO - at 4.3s,\testimator lgbm's best error=0.0321,\tbest estimator lgbm's best error=0.0321\n",
"[flaml.automl: 08-13 00:54:52] {3108} INFO - iteration 46, current learner lgbm\n",
"[flaml.automl: 08-13 00:54:52] {3288} INFO - at 4.4s,\testimator lgbm's best error=0.0321,\tbest estimator lgbm's best error=0.0321\n",
"[flaml.automl: 08-13 00:54:52] {3108} INFO - iteration 47, current learner lgbm\n",
"[flaml.automl: 08-13 00:54:52] {3288} INFO - at 4.5s,\testimator lgbm's best error=0.0321,\tbest estimator lgbm's best error=0.0321\n",
"[flaml.automl: 08-13 00:54:52] {3108} INFO - iteration 48, current learner lgbm\n",
"[flaml.automl: 08-13 00:54:53] {3288} INFO - at 4.7s,\testimator lgbm's best error=0.0321,\tbest estimator lgbm's best error=0.0321\n",
"[flaml.automl: 08-13 00:54:53] {3108} INFO - iteration 49, current learner lgbm\n",
"[flaml.automl: 08-13 00:54:53] {3288} INFO - at 4.9s,\testimator lgbm's best error=0.0307,\tbest estimator lgbm's best error=0.0307\n",
"[flaml.automl: 08-13 00:54:53] {3108} INFO - iteration 50, current learner lgbm\n",
"[flaml.automl: 08-13 00:54:53] {3288} INFO - at 5.0s,\testimator lgbm's best error=0.0307,\tbest estimator lgbm's best error=0.0307\n",
"[flaml.automl: 08-13 00:54:53] {3108} INFO - iteration 51, current learner lgbm\n",
"[flaml.automl: 08-13 00:54:53] {3288} INFO - at 5.2s,\testimator lgbm's best error=0.0307,\tbest estimator lgbm's best error=0.0307\n",
"[flaml.automl: 08-13 00:54:53] {3108} INFO - iteration 52, current learner lgbm\n",
"[flaml.automl: 08-13 00:54:54] {3288} INFO - at 5.6s,\testimator lgbm's best error=0.0307,\tbest estimator lgbm's best error=0.0307\n",
"[flaml.automl: 08-13 00:54:54] {3108} INFO - iteration 53, current learner lgbm\n",
"[flaml.automl: 08-13 00:54:54] {3288} INFO - at 5.7s,\testimator lgbm's best error=0.0307,\tbest estimator lgbm's best error=0.0307\n",
"[flaml.automl: 08-13 00:54:54] {3108} INFO - iteration 54, current learner lgbm\n",
"[flaml.automl: 08-13 00:54:54] {3288} INFO - at 6.5s,\testimator lgbm's best error=0.0295,\tbest estimator lgbm's best error=0.0295\n",
"[flaml.automl: 08-13 00:54:54] {3108} INFO - iteration 55, current learner rf\n",
"[flaml.automl: 08-13 00:54:54] {3288} INFO - at 6.5s,\testimator rf's best error=0.0481,\tbest estimator lgbm's best error=0.0295\n",
"[flaml.automl: 08-13 00:54:54] {3108} INFO - iteration 56, current learner rf\n",
"[flaml.automl: 08-13 00:54:54] {3288} INFO - at 6.6s,\testimator rf's best error=0.0409,\tbest estimator lgbm's best error=0.0295\n",
"[flaml.automl: 08-13 00:54:55] {3108} INFO - iteration 57, current learner lgbm\n",
"[flaml.automl: 08-13 00:54:56] {3288} INFO - at 7.8s,\testimator lgbm's best error=0.0295,\tbest estimator lgbm's best error=0.0295\n",
"[flaml.automl: 08-13 00:54:56] {3108} INFO - iteration 58, current learner rf\n",
"[flaml.automl: 08-13 00:54:56] {3288} INFO - at 7.9s,\testimator rf's best error=0.0409,\tbest estimator lgbm's best error=0.0295\n",
"[flaml.automl: 08-13 00:54:56] {3108} INFO - iteration 59, current learner lgbm\n",
"[flaml.automl: 08-13 00:54:57] {3288} INFO - at 8.8s,\testimator lgbm's best error=0.0295,\tbest estimator lgbm's best error=0.0295\n",
"[flaml.automl: 08-13 00:54:57] {3108} INFO - iteration 60, current learner rf\n",
"[flaml.automl: 08-13 00:54:57] {3288} INFO - at 8.9s,\testimator rf's best error=0.0409,\tbest estimator lgbm's best error=0.0295\n",
"[flaml.automl: 08-13 00:54:57] {3108} INFO - iteration 61, current learner rf\n",
"[flaml.automl: 08-13 00:54:57] {3288} INFO - at 9.0s,\testimator rf's best error=0.0403,\tbest estimator lgbm's best error=0.0295\n",
"[flaml.automl: 08-13 00:54:57] {3108} INFO - iteration 62, current learner xgboost\n",
"[flaml.automl: 08-13 00:54:59] {3288} INFO - at 10.9s,\testimator xgboost's best error=0.6546,\tbest estimator lgbm's best error=0.0295\n",
"[flaml.automl: 08-13 00:55:00] {3552} INFO - retrain lgbm for 0.8s\n",
"[flaml.automl: 08-13 00:55:00] {3559} INFO - retrained model: LGBMRegressor(boosting_type='gbdt', class_weight=None, colsample_bytree=1.0,\n",
" importance_type='split', learning_rate=0.13459582172600984,\n",
" max_bin=63, max_depth=-1, min_child_samples=4,\n",
" min_child_weight=0.001, min_split_gain=0.0, n_estimators=769,\n",
" n_jobs=-1, num_leaves=14, objective=None, random_state=None,\n",
" reg_alpha=0.01620659563289605, reg_lambda=823.3392094935695,\n",
" silent=True, subsample=1.0, subsample_for_bin=200000,\n",
" subsample_freq=0, verbose=-1)\n",
"[flaml.automl: 08-13 00:55:00] {2837} INFO - fit succeeded\n",
"[flaml.automl: 08-13 00:55:00] {2838} INFO - Time taken to find the best model: 6.466601371765137\n"
]
}
],
"source": [
"from flaml import AutoML\n",
"automl = AutoML()\n",
"settings = {\n",
" \"time_budget\": 10, # total running time in seconds\n",
" \"metric\": \"mape\", # primary metric\n",
" \"task\": \"ts_forecast\", # task type\n",
" \"log_file_name\": \"energy_forecast_categorical.log\", # flaml log file\n",
" \"eval_method\": \"holdout\",\n",
" \"log_type\": \"all\",\n",
" \"label\": \"demand\",\n",
"}\n",
"# The main FLAML AutoML API\n",
"try:\n",
" import prophet\n",
"\n",
" automl.fit(dataframe=multi_train_df, **settings, period=multi_time_horizon)\n",
"except ImportError:\n",
" print(\"not using prophet due to ImportError\")\n",
" automl.fit(\n",
" dataframe=multi_train_df,\n",
" **settings,\n",
" estimator_list=[\"arima\", \"sarimax\"],\n",
" period=multi_time_horizon,\n",
" )"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Prediction and Metrics"
]
},
{
"cell_type": "code",
"execution_count": 6,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Predicted labels [5816.55214206 5905.70575447 5834.65947255 5859.27980938 5981.48416072\n",
" 5882.10376347 5551.01099266 5435.01475996 5471.23496654 5569.51972162\n",
" 5540.85105607 5396.28260123 5462.80727224 5434.07936867 5511.56573622\n",
" 5574.25340904 5447.35826516 5555.70270542 5657.09729925 5949.68245693\n",
" 5968.01898401 5791.25872055 5712.91089711 5662.27425465 5638.28515788\n",
" 5630.24299004 5910.41105978 6305.40454128 5936.47176065 6052.48858565\n",
" 6249.11522565 6472.15770449 6203.12944304 5678.46456796 5514.886543\n",
" 5479.08912968 5521.10433109 5527.11532443 5813.63023796 6112.91322944\n",
" 5661.36696892 5146.82879591 5332.76868247 5649.76768828 5741.38048438\n",
" 5412.30150321 5421.63190406 5648.53641688 5106.03841483 4982.47727835\n",
" 5403.54145745 5665.41675811 5333.27032542 5619.01917593 5406.80836839\n",
" 5103.61165688 4886.87942122 5323.86713609 5539.9843075 5518.89510834\n",
" 5327.27090772 5287.19840493 4934.96448807 5471.16766936 6062.58941089\n",
" 5431.09529611 5470.89238927 5362.88508552 5380.14421299 5071.58037314\n",
" 4853.29025196 5452.0224224 5572.37008992 5498.91273931 5597.20203709\n",
" 6070.93048894 6251.17800295 4781.82352937 5360.66527007 5792.43908038\n",
" 5328.08151132 5327.64714018 5233.11697526 4858.66192485 4732.03629459\n",
" 5259.85731069 5283.86181827 5344.58667455 5333.25709336 5391.41383181\n",
" 5114.90778764 4743.65152646 5282.94743208 5842.67839729 7218.28513354\n",
" 8102.89494987 8101.82094708 4992.07237744 4849.45536478 5339.87449948\n",
" 5644.16743177 5740.54108609 5483.18411931 5690.66143676 5473.76918472\n",
" 5057.36948366 5499.63825649 5542.8271924 5811.15446061 6387.26085946\n",
" 6019.77223695 5407.7611932 5322.47209288 5922.02023651 5531.77492267\n",
" 5635.0666545 5614.58478836 6162.10202722 6732.07200697 7615.59654474\n",
" 8643.74398328 8760.84386874 7426.89230032 6302.30839453 5677.66060946\n",
" 5877.90171155 6598.7972824 7908.18136152 7446.21819637 7413.88711491\n",
" 7392.86850738 7696.63934358 7598.28250622 6410.4477643 6583.80575997\n",
" 6807.47299577 6461.56057374 7449.91648426 8400.89774327 7197.86389915\n",
" 7341.789892 8049.63278137 7769.12525719 7580.93092426 7085.35092664\n",
" 6812.1738561 7061.31018238 6429.46178215 6973.40579451 7629.68254393\n",
" 8397.04287117 8452.93805232 6881.50754923 7008.46256787 6934.24421662\n",
" 7549.45383053 8179.00897173 8541.82708366 8793.42740263 8811.52537154\n",
" 8253.01929193 7091.51516967 6555.81711197 6578.04691356 7603.44781028\n",
" 6999.28814622 7306.04157302 5795.93681969 5934.24027018 7392.77266682\n",
" 8133.39055379 8167.0181761 7735.13694118 7802.15314154 6945.96808164\n",
" 5654.34743886 6757.66283603 6782.48752487 7613.98500771 7756.78074882]\n",
"True labels 1869 5486.409375\n",
"1870 6015.156208\n",
"1871 5972.218042\n",
"1872 5838.364167\n",
"1873 5961.476375\n",
" ... \n",
"2044 5702.361542\n",
"2045 6398.154167\n",
"2046 6471.626042\n",
"2047 6811.112167\n",
"2048 5582.297000\n",
"Name: demand, Length: 180, dtype: float64\n"
]
}
],
"source": [
"# compute predictions on the test dataset\n",
"multi_y_pred = automl.predict(multi_X_test)\n",
"print(\"Predicted labels\", multi_y_pred)\n",
"print(\"True labels\", multi_y_test)"
]
},
{
"cell_type": "code",
"execution_count": 7,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"mape = 0.04171642942259451\n"
]
}
],
"source": [
"# compute metric values on the test dataset\n",
"from flaml.ml import sklearn_metric_loss_score\n",
"print('mape', '=', sklearn_metric_loss_score('mape', y_true=multi_y_test, y_predict=multi_y_pred))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Visualize"
]
},
{
"cell_type": "code",
"execution_count": 8,
"metadata": {},
"outputs": [
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAYsAAAEGCAYAAACUzrmNAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjAsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy8GearUAAAgAElEQVR4nOy9eZxdZZ3n/37Ocs+9t5akKgkBEiABw2JYkoCIzWLjAjhuDS7gdKu0Y2Or3WozPdM4PSN2z9A/xkbbVgeVwbVbIigi2N04KCAItgiyCLJFIEDIQlKp7W5nfX5/PM+5S9XdqlKVukme9+tVr7r33HPPfc6t5HzOdxdSSgwGg8FgaIe10AswGAwGQ+9jxMJgMBgMHTFiYTAYDIaOGLEwGAwGQ0eMWBgMBoOhI85CL2C+WLp0qVy1atVCL8NgMBj2KX7961/vklIum7p9vxWLVatW8cADDyz0MgwGg2GfQgjxfLPtxg1lMBgMho4YsTAYDAZDR4xYGAwGg6Ej+23MohlhGLJlyxYqlcpCL8UwC7LZLCtXrsR13YVeisFwwHFAicWWLVsYGBhg1apVCCEWejmGGSClZGRkhC1btrB69eqFXo7BcMBxQLmhKpUKS5YsMUKxDyKEYMmSJcYqNBgWiANKLAAjFPsw5m9nMCwcB5xYGAwGw1whpeSG+1+kEsYLvZR5x4jFAnDTTTchhODJJ5/suO/nP/95SqXSrD/rm9/8Jn/2Z3/WdPuyZctYv349a9as4dxzz+UXv/jFrD9nrlm1ahW7du1a6GUYDG159KVx/uuNv+HHj21f6KXMO0YsFoCNGzdyxhln8N3vfrfjvnsqFu248MILeeihh9i0aROXXXYZF1xwAU888cS8fJbBsD+ydUzF0F7cPT//R3sJIxZ7mUKhwL333svXvva1BrGI45i//Mu/5IQTTuDEE0/ki1/8Il/4whfYunUrZ599NmeffTYA/f391fd8//vf5+KLLwbgRz/6Ea9+9atZv349b3jDG9ixY8eM1nX22WdzySWXcM011wDwzDPPcN5553HyySdz5plnVq2giy++mA9/+MOcffbZHHnkkdx111184AMf4LjjjquuBeDDH/4wp5xyCmvXruXyyy+vbl+1ahWXX345GzZs4IQTTqged2RkhHPOOYf169fzoQ99CDPB0bAvsH28DMCW0fICr2T+OaBSZ+v5mx/9lse3TszpMV956CCXv3Vt231++MMfct5553H00UczPDzMgw8+yIYNG7jmmmt47rnneOihh3Ach927dzM8PMznPvc57rzzTpYuXdr2uGeccQa//OUvEUJw7bXX8pnPfIbPfvazM1r/hg0b+OpXvwrAJZdcwle+8hXWrFnDfffdx0c+8hHuuOMOAEZHR7njjju45ZZbeOtb38q9997Ltddey6te9Soefvhh1q1bxxVXXMHw8DBxHPP617+e3/zmN5x44okALF26lAcffJCrr76aq666imuvvZa/+Zu/4YwzzuBTn/oU//qv/1oVLYOhl9k+4QPw0pgRC8Mcs3HjRj7xiU8AcNFFF7Fx40Y2bNjAT3/6U/70T/8Ux1F/kuHh4Rkdd8uWLVx44YVs27aNIAhmVYuQ3s0XCgV+8Ytf8K53vav6mu/71cdvfetbEUJwwgknsHz5ck444QQA1q5dy+bNm1m3bh033HAD11xzDVEUsW3bNh5//PGqWFxwwQUAnHzyyfzgBz8A4O67764+fvOb38zQ0NCM128w7G1qlsX+74aaV7EQQnwc+BNAAP9XSvl5IcQwcD2wCtgMvFtKOar3/yTwn4AY+JiU8v/p7ScD3wRywL8BH5d76KfoZAHMByMjI9xxxx089thjCCGI4xghBJ/5zGeQUnaVGlq/T33NwZ//+Z9z6aWX8ra3vY2f/exnfPrTn57x+h566CGOO+44kiRh8eLFPPzww0338zwPAMuyqo/T51EU8d
xzz3HVVVdx//33MzQ0xMUXX9yw1vQ9tm0TRVHTczMY9gW2T6h/11vHKiSJxLL233/D8xazEEIcjxKKU4GTgLcIIdYAlwG3SynXALfr5wghXglcBKwFzgOuFkLY+nBfBi4B1uif8+Zr3fPJ97//fd73vvfx/PPPs3nzZl588UVWr17NPffcwznnnMNXvvKV6sVz9+7dAAwMDDA5OVk9xvLly3niiSdIkoSbbrqpun18fJwVK1YA8K1vfWvGa7vrrru45ppr+JM/+RMGBwdZvXo13/ve9wBlcTzyyCNdH2tiYoK+vj4WLVrEjh07uPXWWzu+56yzzuI73/kOALfeeiujo6MzPgeDYW+zfVyJRRAn7Cz4Hfbet5nPAPdxwC+llCUpZQTcBZwPvB1Ir2bfAv5AP3478F0ppS+lfA74HXCqEOIQYFBK+e/amvh23Xv2KTZu3Mj555/fsO0d73gH1113HR/84Ac5/PDDOfHEEznppJO47rrrABU7eNOb3lQNcF955ZW85S1v4XWvex2HHHJI9Tif/vSnede73sWZZ57ZMb6Rcv3117Nu3TqOPvpo/u7v/o4bb7yR4447DoDvfOc7fO1rX+Okk05i7dq13HzzzV2f50knncT69etZu3YtH/jABzj99NM7vufyyy/n7rvvZsOGDdx2220cfvjhXX+ewbAQSCnZPlHhFQeppJP9Pcgt5ivrRAhxHHAz8BqgjLIiHgDeK6VcXLffqJRySAjxJZS4/LPe/jXgVpSr6kop5Rv09jOBv5JSvqXJZ16CskA4/PDDT37++cYZHk888UT1YmjYNzF/Q0OvMFYKWPe3P+GiVx3Gd+9/kX+8aB1vX7dioZe1xwghfi2lPGXq9nmzLKSUTwD/G/gJ8GPgESBq85Zmzj7ZZnuzz7xGSnmKlPKUZcumTQU0GAyGOSONV2w4QiVj7O8ZUfNaZyGl/JqUcoOU8ixgN7AJ2KFdS+jfL+vdtwCH1b19JbBVb1/ZZLvBYDAsGNt0vOKoZX0M5d393g01r2IhhDhI/z4cuADYCNwCvF/v8n6Uqwq9/SIhhCeEWI0KZP9KSrkNmBRCnCZUusz76t5jMBh6gP2piLLbc9mhxeLgRTlWDuV5yYjFHnGjEOJx4EfAR3WK7JXAG4UQm4A36udIKX8L3AA8jnJbfVRKmXbn+jBwLSro/QwqlmEwGHqAoh+x7m9/wh1PzqxrQC9S9CNO/l8/5aePdz6XbeMVhICDBjxWLM7t97UW81pnIaU8s8m2EeD1Lfa/AriiyfYHgOPnfIEGg2GP2TFRYbwc8uzOIq87dqFXs2c8t6vI7mLA8130etoxUWFpv4drW6wcyvGzp1/uul5qX8T0hjIYDHvEWDkE2C/adKfWQRgnHffdNl7h4MEsAMsGPCphQinY97+DVhix2MvYts26deuqP5s3b+ZnP/sZb3nLtEzgKieddBLvec97GrZdfPHF5PP5hoK9j3/84wghqq2965sONiNtU56u5X3ve98enNncsXnz5mqdiaH3GddiUd4vxELFHaIuxGLHRIWDFymxWCyKHCm2UgzaJXzu2xix2Mvkcjkefvjh6s+qVava7p9Wa999990Ui8WG117xildUi+WSJOHOO++sVnF3y4UXXlhdy7e//e2u3iOlJEk6/2eaLUYs9i3GS6llMX//JvYWaavxIO4c5H550ueg/gz89G94x13n8K+Z/zZv4wR6ASMWPc51113He9/7Xs455xxuueWWhtfe8573cP311wPws5/9jNNPP73aiHBP+NznPsfxxx/P8ccfz+c//3lAXcCPO+44PvKRj7BhwwZefPFF/v7v/55XvepVnHjiiQ1tyL/97W9XK9Hf+973Aq1bqN91111Vy2b9+vVMTk5y2WWX8fOf/5x169bxD//wD3t8Pob5ZbwcYpEcUJZFnEhGSwHHy6fgns8RekPkRIA/uf8O7Dpwu87eehlsf3Ruj3nwCfCmK9vuUi6XWbduHQCrV6
9u6O/UjOuvv56f/OQnPPXUU3zpS19qcEetWbOGm2++mdHRUTZu3Mgf/dEfddWHaerx77nnHkC5sU488US+8Y1vcN999yGl5NWvfjWvfe1rGRoa4qmnnuIb3/gGV199NbfddhubNm3iV7/6FVJK3va2t3H33XezZMkSrrjiCu69916WLl1a7XHVqoX6VVddxf/5P/+H008/nUKhQDab5corr+Sqq67iX/7lX2Z0LoaFwR/bwa+9P+XGkf8GnLDQy9kjXtQxiyhpb1mMlQKkhMPiLQDsWPMfWfXIVQSTu+d9jQvFgSsWC0TqhuqG+++/n2XLlnHEEUewcuVKPvCBDzA6OtrQvvuCCy7gu9/9Lvfdd191FsVMuPDCC/nSl75Uff6P//iPnH/++fT19VWP//Of/5y3ve1tHHHEEZx22mkA3Hbbbdx2222sX78eUG3NN23axCOPPMI73/nOan+qtNV6qxbqp59+Opdeeil/+Id/yAUXXMDKlfX1l4Z9gRXbf8KQKLCo8tJCL2WPkFJWLYsgam9Z7C4GABwcvgB2hnC5EsmoMDK/i1xADlyx6GAB9AIbN27kySefrMY1JiYmuPHGG/ngBz9Y3eeiiy5iw4YNvP/978ey9tyr2K4gKRWQdL9PfvKTfOhDH2rY5wtf+ELT1MFWLdQvu+wy3vzmN/Nv//ZvnHbaafz0pz/d43Mw7F2O3qX/ZlGl/Y49zu5iUM1mijrE5Ea0WAyVX4Dho3AHVHuhuLj/dks2MYseJUkSvve97/Gb3/yGzZs3s3nzZm6++WY2btzYsN/hhx/OFVdcwUc+8pE5+dyzzjqLH/7wh5RKJYrFIjfddBNnnjmtXIZzzz2Xr3/96xQKBQBeeuklXn75ZV7/+tdzww03MDKi7rBSN1SrFurPPPMMJ5xwAn/1V3/FKaecwpNPPjmtLbuhhym8zFEl1b5exsECL2bPqG/XEXUIcI8U1Ln2Tz4LS9eQ6V8CQFI2bijDPHP77bc3uGA++tGPsmLFiobsprPOOovHH3+cbdu2Nbx36t19SqlUajjmpZdeyqWXXtp2HRs2bODiiy/m1FNPBeCDH/wg69evZ/PmzQ37nXPOOTzxxBO85jWvAVSa7j//8z+zdu1a/vqv/5rXvva12LbN+vXr+eY3v1ltob5ixQpOO+00nnvuOQA+//nPc+edd2LbNq985St505vehGVZOI7DSSedxMUXX8xf/MVfdPj2DAvGE7dgoe7CRbRvz3N4sa4CO+gQ4N5d9HGJyEw8DydeQG6RHgtQHpvPJS4o89aifKE55ZRT5AMPPNCwzbS33vcxf8Me458u4IXnnmRJvIsfZ9/EOz4588FbvcJX7nqGK299kqX9Hr931BK+8J71Lff9/E+f5ke3/4zbvf8C538V/5XvxP5fy3j4iD/mlA98bi+ueu7Z6y3KDQbDAcDkdp6RK/BxseJ93LLYXWJx3mUo73as4N5dDDjB0w2zl64h49hM0Ift77+WhRELg8Ewa2RYYjJ28clg7Qcxi8OG8ri2RdgpZlEMWJvZrp4sWYMQggnRjxsYsdhv2F/dbgcC5m/Xg4QVSolLIB0suW+LxY6JCssHs7i26JwNVfA5ytoO/QdDdhCASTFAJhjfG0tdEA4oschms4yMjJiLzj6IlJKRkRGy2exCL8VQhwxLlPGILBcn2bfFYncxYElfBse2unJDHSG3wNI11W1FawAvmpjvZS4YB1Q21MqVK9myZQs7d+5c6KUYZkE2mzVFe71GWKZCBml7OHFIFCc49r53Dyqlat8x3J/B3S06uqF2FwOWy20wfGp1W8UZIBfvv0M8DyixcF23WjlsMBj2kCTGSgLK0gPbwyOkEiX074NiMVGJCGPJkr4Mrm1R8Ft3j00SSbFUoj8zBotqNy8VZxH5YP+tD9r3/qoGg6E3CFURWwUX4WbxRLjPzrQYTSuy8xkcS7QtyhsvhyyRuvhu4JDq9sAdpF8WINk3v4NOGLEwGAyzQ4
tFGQ/bVZZFeR8d/pO271iSg3eOf5MlQWt30kjR52C0WAweWt0eZBarB5X9M8h9QLmhDAbDHBKqiucKGRwvR4Zon7csjnvqapaPfYfnvBzw7qb7jhQCDhHTxSL2FqkH5VHID8/nchcEY1kYDIbZoRsH+mRwMlk8gr0702LzvXPm8tldDDhZPMVBv/kyAHYStt13udANA+vcUImnu0GX989mgkYsDAbD7NCWhZXJY7lZbVnspWl5O5+Cb/4HeOaOOTncSDHgUuf7yP7lAFiytViMFJVlkbh5yC6qvZBXYpGUjFgYDAZDDR2zsL0+hKMC3HvNskjv3ueocd9oKeAgaxxrpWqJZHW0LHbDwKFQ145f5FTMIizunzMtjFgYDIbZocXCyaaWxV4McAd6Hv0czdAYKQQMiApkBghFpm2B4UjBZ6U1irXo0IbtVp+KU+yv0/KMWBgMhtmhxcL1+qrZUH60l8RCf/ZcicXuok+/KIPXTyIcLNm6zuKlsQqHWKPKsqjD1W6o2FgWBoPBUEd6wXbz2F5OWRZtitnm5bPnTCwCcrIMmX5iy8VuE7PYOlpUdRaDjWKRy2aZkDniorEsDAaDoUakLtjS8XBcD1tIKsFealMezq0bqlCcxCYBr59YZHDbiEVlbDsO8TSx6PccxmU/0mRDGQwGQx1hKhY5HC8HQODvpTncqWURzs3nhSXdpiMzQGK5OEQkyfQq7oIf0efrORZTxCLv2YzRhzBiYTAYDHXo1Fnp5nFc1Q04CvaWWOgRqHNgWfhRDIGaJY/XT2K5uESETdqUvzRa5mAxvdUHQF/GoUS2trb9DCMWBoNhdui7euFkEVWxKO+lz05jFnvu9tpdDOhHHy+jxCJD1LTz7EtjpZpYDK5oeC2fsfGlC+G+PTGwFUYsDAbD7AhLVHBxHRtsD4Bob7mhgtSy2HNx2l0M6EOv2+tHWhlcIqImMy1Sy0JaDvQta3itz3OokEHEc/QdlHZDD83eMWJhMBhmR1imIj1c2wJHiUW8191Qc2NZ9InUshggsbUbqollsWWszEHWJOSXgtV4+cxnbHxcxFwE3X/9Tfj7V8AjG/f8WHOEEQuDwTA7wjJlMrhOnVjMUcC5m88G5iRmodxQjZaFJ8Km0/JeGi2zzK1Uq7Xr8RwLHw8r3kMB+8WX4EcfBxnDyDN7dqw5xIiFwdBrxCE8fgs8diO8/ORCr6YlMipTlhlcS1TFIpmlWIRxwiXffoDHXuqyvXc1dXauLAu97kw/UlsWzWZavDRWZqlTbuwJpRFCENseTrKHAvbo92DFycp6Kb68Z8eaQ0yLcoOh13j2Lrjhverx4iPgE79Z2PW0QAYlKmg3lL1nYrF9vMJtj+/g1Ucu4fgV0y/E06imzu55zGKsFNYC3F4/6JhFq2yoRXYZsgc1PVZie9h7Oovcn4RD1ykhLPTOCGhjWRgMvUaap7/yVbUeSD2IDEp1biiVDSVneaefjjHteh7GHGZDFfyIIVsfJzOAdDI6G6pRLPwo5uVJnwGKTS0LUGLhJnu4pqAAmX4VQO8hy2JexUII8RdCiN8KIR4TQmwUQmSFEMNCiJ8IITbp30N1+39SCPE7IcRTQohz67afLIR4VL/2BSHqWj0aDPsbqYulfzkke6l9xiyQYYWKzOgAd0Zvm92FcrKizrPrRoTN6iyCEnxuLTx924w+u+hHLHJ8JXi2A3amqRtq+7j6rFxcaCkWOFlsYoj34O/mT4I3AP0HHRiWhRBiBfAx4BQp5fGADVwEXAbcLqVcA9yunyOEeKV+fS1wHnC1EMLWh/sycAmwRv+cN1/rNhgWnDQt1BsEuZfmQ8yGUFkWGVtULYvZBpwLvmqvUepWLIImYlF8GSa2wPP3zOizJ/2IRbav7uYBbJeMiAimWBYvjZUBSSaabGNZ7Nn3QBIrIUzFovhyz6TPzrcbygFyQggHyANbgbcD39Kvfwv4A/347cB3pZS+lPI54HfAqUKIQ4BBKeW/Sykl8O
269xgM+x+p68kbmLNJcPNCWK6LWSjLgllmAlUtixm7oeouyr5u2bH7uRl9dqESscjyVbwCwPbINLEsin5MHh8h45ZiIXSgf9ZikZ6DNwB9B6njpNsWmHkTCynlS8BVwAvANmBcSnkbsFxKuU3vsw1II0UrgBfrDrFFb1uhH0/dPg0hxCVCiAeEEA/s3Nk75pvBMCPCIliuyjCSPSwWUZkKqRtK3VGLPRSL7mMWTeosZikWRT+iX8+yABB286K8IEoYRAt5KzeUm9PrmqVYpG1HMv3KsgAo9sa1bD7dUEMoa2E1cCjQJ4T4o3ZvabJNttk+faOU10gpT5FSnrJs2bJmuxgMvU9QgkyelyYC4j3xfc8zIlSps45dS50V8ewygdIAd/cxiyaWRWVC/R59bkaum4IfqWwobVkIJ0OGcJobKohjBoUWqU5iMdt6kwbLQl/DCr0R5J5PN9QbgOeklDullCHwA+D3gB3atYT+nX4TW4DD6t6/EuW22qIfT91uMOyfhEXI9PPbbQXl8uhRRFSmjEemroLbin3kLHzshZm4oaSsWRZhEzdUUJjR3XjBj8hTqcYshOM1DXD7YWfLwnLTmMUsU3r9tKHhQJ1lsf+LxQvAaUKIvM5eej3wBHAL8H69z/uBm/XjW4CLhBCeEGI1KpD9K+2qmhRCnKaP87669xgM+x9BEdw8hQAsZM8EOBuQEhFVVOpsXZ1Fhgg/ahOUv/Wv4JHrp22erKgAd1eWRVQBpIqTxH7t+/EnavvMwBVV8CNystRoWYiYcMrUvyBOOloWViZ1Q80yfTY9hzRmAfu/ZSGlvA/4PvAg8Kj+rGuAK4E3CiE2AW/Uz5FS/ha4AXgc+DHwUSmrt1UfBq5FBb2fAW6dr3UbDAuOdkMVQn0R7MUgd+QjkPhS11nYDgkWnmg/hzt59PtET/zrtO2T/gwsi9QFlRuurgVoDATvfrar05BSUqhEZJNy1bKwdBpwHDW61FTMIhWL6e0+AGzthkqCWbYpr49Z5JcAomfEYl4ruKWUlwOXT9nso6yMZvtfAVzRZPsDwPFzvkCDoRcJS+D2MRlosZAxPddsQbuBlGWhwoqJ7ZGJQsphzFCLt5VKZXZteYFVU7bPyA2VZovlh6GwXbl83KwSC6Hvf7sUCz9KiBJJJtHpqtQympIp1oEfJQyK9m4oWw+BCv0yXlcrmLqgupiF7SjBOADcUAaDYTYERWSmj0Kg3Tm9aFnou/tqzAJILBePsK0byiUg449M2z6jorx2loU3AItWqiB3FxT8CEFCJi7VLAtXWxZhE7GgrgamCY52QwWVWVbe65jF//zpiyozrIcK84xYGAy9RlAksLJEUv/37MUqbp2FVK3gRlsWhGryXDOkxCOkLxqb9lJhRm4ofcFOO7+mGVH+pLqIDx/ZtWVRqETk0aLgpW6otM9Vo1gEUcKQVQI3X61Yn4qbzat9K7N0Q2nL4tu/HuEdX/4FQVZbFv4kFHfN7phzhBELg6HXCEtUyBKjGhjInrQsam4oR7uhpO3hiQg/bGFZ6LTaATk5zVqaUepsalnktWWRZkT5E8qyGFrddYC74Ee1wUfasrBdJRZT+1wFUcJiq3nH2RTXU2IR+bPMhgom8XFZvXwxT26fZFOpDwo74J8ugI0Xze6Yc0RLR6gQ4ou0qGcAkFJ+bF5WZDAc6ARFSmSJ9b1cHEe9FrGoXrArdW4oaXt4hC0L62RYRqAzvEq7ob9WC1XNhgpjpJS0bf+W9s6quqHqLYsBZVmUd6uGjLlW0RNFwY/oTwcf6ZhFGuBOorBhXz+KWSRKbcUioy2LaLYBbn+SEllevXoJQ/kMT+7OstZ/AcZegEWHd37/6PMwuV21OLfn9l9NO8viAeDXQBbYAGzSP+uAHrzVMRj2E8ISBZmpikUU9aAbKhWLOjcUupitVcwiqJuiVx7b3vDaZCUi1Ye2qbd1n10Vgqkxi2XHqufbH+t4GoVKa8simVKFHURJ92Ixy/Gy0i
8wKXP0Zx3OPvYgni7kaq+FXcRBHr4Ovn5uYxrxHNFSLKSU35JSfgtV73C2lPKLUsovojKZ1s35SgwGg3LPRBUm4gyJ/u+Z9GIVdzXArVNnAZxs2wB3WKm5ZiZHtlUfB1GCHyUs6VN39B1dUdPEYoplsWKDer71wY6nUQzqLYtULNJ2642WRRAnDIjW7ckBvJwSi3iWlkVSmaQgcwxkHc4+5iCekweTCJsnvRMJS130iHrmDnX+qYtuDukmZnEoMFD3vF9vMxgMc41OCx2P3Jpl0ZNikcYsvGrqrNBzIFq5ocI6y6I4WrMsVLxC8jHnB6wUOyl1CnKnAe58oxtK+pM8sjPhRT+vXDYvdRaLyaaWhW63Hk+PWQzI9mKRy6rEhCSYXcwirkwwSY4Bz+Ho5f081n8Gb+Cr/FvhaDIE7TPjymPw0gNw1Otm9dmd6MapdSXwkBDiTv38tcCn52U1BsOBjr4QjoYusc7SSXrRDZVmQ+FWYxY4WTwx0tKyiOsuoMH4jurjQiVildjO+yrXkbO3UA7e2f6z07v23FSxmOBXYyHW4zv4TyvWd2dZ+BF9NMYsqh10pxTl+VFCP+3dUHnXoUJm1hMDZWWSoswykHURQvD7xy3nuvt8SrpCnqAI2eZpu2z+uWppP09i0dGykFJ+A3g1cJP+eY12TxkMhrlGWxYjoUssawHunkOLWkV61ZiF5Xi63Ufzu996P340WasdmPRDVgqVFvoW+5cEhd1dfXZDzCKOsMISBZlTls2hG1RQuEO6aSHtOAt18yzSbKgpFdxhTL9sM/gIyGYsfFzkbMe9+pMUyNHvqfv4d2xYyfErBll1iE4GaDc58Zk71DmsfNXsPrsD3abO2sBOYBQ4Wghx1rysxmA40NEXg12+3ZAN1XPUxSzS1Fnh5lTMokXqbFR3AZV1F/HJSsRhQlUp50RA/umbOn+2sGp32GEZAuXPL5DDD+O6uMVDbQ81WYkYcrUoeLXhR0A11TdFhCVskvaWRUZZFrPtDSWCAgWZpT+rxOLkI4b4lz8/k8FFuqakk1isPqu2/jmmo1gIIf43cC/w18B/0T9/OS+rMRgOdPRd846KTaJbV8S96Iaqps5mcC21TttVRXmVlpZFTSzscgODiogAACAASURBVE0sCpWIw8ROYuHweHIES5/c2L55om6HglPXtE8Xs02SU4V9h6wDREexKPoRw7av9nX79OLSmEWjWLiRDjC3i1m4Nr50a0H4eh79fsdiQSssUCDPQLYxQiAyam1J2pV2KmMvwuhmOPL32x5/T+jGsvgD4Bgp5ZullG/VP2+btxUZDAcy+s5xe9kmn9UpnL1oWQRFYuGA5WJZyrKw3CyeaG1ZpO0zCjJLxq+5mgq+siwq+UO5MT6T/rEnYHJb02MAEJaQbo5/ekAHyeumyU3KvBKL7CAsXdMxyF3wI4bsstrfSmMvzWdzuKFOR20jFrYlCEQGEU+JWUgJN30IHvh668UkMU5cVjELb4p1oF1kQblFRlTaP2rxEa2Pv4d0IxbPAvNj1xgMhka0WEzEGRblVQpnHPdgWVNYIrDytRoLVAM+r02L8jTAvU0uIReOVrdPVkIOEzuJBg9jp9QX4napp2GZ0PL49K3PqOd1lkWBHJVUrA46DkY2tT2Ngh+pqmyvTgC0G0fEjamzXheWBUAoMoipbqg4UG1bytNbnVSpO4eploWlLYuwlVik7im933zQTTZUCXhYCHE7UP0GTAW3wTAPaDdUCY/BfASTIHuxN1RQIrCy1bRZABwPTwQtU2fTDKGd1lIOjp+pbp/0I1aKnYih0/CfTzOR2mQThSVCO0eMTSIcrKhcu9DKHIPp5/ctU5XibSj4kZpRUZ9h1GKeeDbuViw8+qZaFqlbqtJGLHR78gI5+rzGS7Od1WJRaeGG6hGxuEX/GAyG+Ub/py/JLIN5ddFLetGyCAoEVpaMU+ecqGZDNbcs0sZ8k5nlDASPqNRUJ4NfnGSpmKC8dBU++m
6+nVgEJUKhrK7IypCJ/GrF8mQa4AbV3rs8qmoTLLvpoQqVdKRqvVhoN1TSaFlkY32hbjHLIiW0PKxkimVUFYvx1m/Ugudb+cbvFbCzKq036mhZ9Ldd257QUSxMmqzBsBepsywGcsqimNqjqCcIS/gi1+CGwsniEBOEzedwS21ZVPIHQwBJcRfWokNxJl5Ubx9ehc9mtXNby6JMINQFPRIZMnUxi4LM1TrX5pcAUglG39Kmhyr4EX2iCNmDaxtbuKFyVbFob1nEVgYnHm3cmKb7tnVDqeMn7vQLvqXFIm5pWaRDk+bPsugmG2qNEOL7QojHhRDPpj/ztiKD4UBG3yGW8ch6eq5Cj7qhKla2mjYLVN03U+dApEgtAPHASgBuvOcRLrj6XuLdmwFwlqzCR7uA2hW1hSUqWiwCkVH71vn7q+1C8kvU79L0+RkpBT+iL5lSla3Pw0oazyOfdCkWdhZnynu7syyUdSS9gWkvudoNFfstUmf3ghuqmwD3N4AvAxFwNvBt4J/mbUUGw4FMUCS2PBIsMtW2Ez3ohgqL+GSnWRYAskWri3TynDOkxOLOe3/Bgy+MMbr1dwCIoVUIN02H7SAWqM/yyTRkQxXJ1gLcHcRCSknBj8gmhSluqFQsGkW6TxYJrFzHOobE9nDkFOsqPZ8uYhbCm25ZeNk+EimQrVJne0QsclLK2wEhpXxeSvlpYH7qyQ2GA52wRGirZnReRnmJe7IoLyhSEbX25EB1INDUcaRVtLVgrTyZLXIpn818lc++8hmVNis8FZDWaaud3FAlbYH4uFWxqFh5JFY1wD4m9B16C7EoBaoduhdPaaFhWUTYiKR2wY/ihH5ZJHCm3/VPJbE9XDnVstBuqMoEJC266mrBE00si2zGoUgW2aooLyioWMs8FeRBd2JREUJYwCYhxJ8JIc4HDpq3FRkMBzJBidBWd83ZjLYsenH4UVCi3CRmASCDFhf6KMCXLmsOX8EH3SvxlxzHO579H7zXvYNJ71AQAqE7vnayLEqJ+m4qiaPFYoKypUQ2jVn8w71aJFqIRdGPyOFjyXjamNRYuFh1Ae4gThgUJQK3s1hgZ3HllDhTtUhPtm4frq2GND5RT9a1KeO1ruAOivNqVUB3YvEJIA98DDgZeC/w/vlclMFwwBIUCC3livEy6i6xJ4vywmJDx1mgJhYtLvQiruDjsHIox4//+7tY/OHb4Ny/w8vmWXbMawBV2Ad0tCwKWixK0lV1FpUJSkKJRWpZbC7rY7UQi0k/YiBtIjilOV8kXOx6sYgSBikSdmFZSDeLR9BYhV5f0d3KFaXdUG5ueqPAnGtTlF5t8NO09xbnNRMKusuGul8/LAB/PK+rMRgOdMISfioWnm5o15MB7iKlrNdoWXSKN0Q+Pi6LnbpK6dd8FE69BNBV4OndcasAd5Ko4VCJEtJiUnNDFWm0LEYDdYH1Jnc2vdCV/JhBoS++3lSxcLDr4g5+pCyLKHNY83XV4+RUD6k4rM3qbhCLFkFu7b7L53LTXsq6FjvJsqidG2qhLQshxClCiJuEEA8KIX6T/szrqgyGA5WghG9l8RwL21a1AT1XZxFHEAeU8KbUWag7eStqHuAWcUCAi2NNGZlqu9URoHamQ8xCb5+IlViUYkdZMqVdTJJaFkk1eD3KAP7EzqaHqkQxg+hYwpTaibipZVEiynS2LER1hnfd9xDW1V2Ux/inf9/Me675ZcP7ZOTjS4f+7PS4g+falPAQLb5bgmKtEeI80U1R3ndQzQMfBTrMOzQYDHtEWMSnv0EsZK+5obQrpCizjRd+bVlM64uksWKfALftfO2M65EgsFoGydXFcjxSF1QfFzn5IqK0i8fcd1V386OEoh+xWw4wMNm8TXkljBkQzd1QKmZR+979KGFYFBnLtE+bBbD09xBWymTSNNsplsX23z3MoVseBE6rbo5C9f1MbfUByg1VklmcaOFiFt2IxU4ppangNhj2BkGRilhC1rWx9N120msBbt
23qVQ/fxvqLItWMYuAUGTaHjrnOfhkyLW6g9Z36GOhg+dYVMhg6SZ6v0hOrO5WDmKKfsyoHODwFjGLchAzkFoWUwPclotTN88iCJUVMuq1GDxUR5r+WymXyKQGS9QYs/i9Hf/CWnEP8Le1UwsqBDjV9uT1uLagRBY7am4lERRVNtk80o1YXC6EuBaY2hvqB/O2KoPhQCUoUbZyeK6FrcWi5ywL7TeflF5t/jbUxCJubhVYiU/YoSdp1rUJcMm1tCxSsXBZMZTDH9XHywzwq2A1OdemHMaUwphiEDHiDGJXnmt6qEqUMCBSN1SjCCTCxaH2vYeVSRyRIDsU5AE4nhILv1JnBUyxLIaCrXiEJImsdu2N/LK2LKZ/R0IIfCuLE7dyQ81/zKIbsfhj4FhU59nUDSUBIxYGw1wTFilnPTzHxtI58z1nWaRuqGRKnYXOZJpWvayxE59yB8sin7GpkGFRq0lzWiwmE5eVQ/mqWMjVZzDxqODgwQwvjZUZLQZICaNygEww2vRQlbAuZjHFYkisDHZd+mtS0hlMXmexsDNaLMpTxCLTr9ZfHmNJtAOXiCBOyOq+VVHoE0iHAa/5ZTmwcrgtxaI33FAnSSlPmNdVGAwGRVCk5HkNMQt6LRtKu6EKMkO2IXVWXSQz0ieMk0YXFWDFIbFob1lUhwd1iFmU8Fg1lFNT6YDoiN8nfkSytF+Jxc6Cev9uOYAXl9Tx0oI/jYpZlJDCrg4XSkkst6EKO0kzmHLdWxaBXxfUDsvg5lUwv7SLpckubJFQCkOyrvo7x0GFALepGwogsPJkkoUTi27qLH4phHjlvK7CYDCoLKMkoixdFbNwejVmoe6YC8mUmIW2LDzCpp1n7cQntDrELFybsnRb1mqkQlWRGVYO5ZSwAMXD1KTn4T51/F2TSixGSau4p7cqr4Q6ZuENwpSguxKLmkhL3QDQyrXvOAvgZFRWVliZKhY51Vdqx+MqtRYI6qYHJpFPiNM0wA0Q2llcGah/Jw2LjZXFMs91Ft2IxRmoeRZP6bTZR03qrMEwD2hff1m62rLQMYteEwvthhqPp9RZaMvCI6i1Ca/DkQFxB7HIajdU0qK/VOqGKuNx2FCeHyWv4ZdHfZxCn5oQt6RfWQ8jRWUVTFRbfkzPiKqEicqGyk4PWidWBocImRbWacvC6sKyyGR1NpQ/JXXWzUN2EXLHY7XNfs2CSqJABbhbuKEi3QZmWmFempbbA26o8+Z1BQaDQaFdL+XEaXRD9VyAW8cN4kxjBbftkmCRFUFTy8JJQmKns2Xh45KEFZpOoKhzQy0fzPKCWMHdy87kD3TzwCVTLAt3YBmUaVrFXQ5jFokSoolYSNslQ0icSNVZtyoWXVgWnrpoR9PcUDnwBhB1NRdhXWsUGaqixWYBboDY0WIRlBo73+6FJoLQhWUhpXweOAx4nX5c6uZ9BoNhhuiZz+XY1m4oHbztUctiIp7ihhKCxPbIEjadlufIgKiDZZHP2FRkhqRDgLssPQayDoM5l/FySNFXgpq6odKYxeASNacialJrUQljFoly06C1tFwyRISxsiwsLRZOX2exyOoK7Lh+NGxUqbmh6ojq2rmLOCCQDvlM80FNsZuKxRTLYi8MPoLuKrgvB/4K+KTe5AL/PJ+LMhgOSLSfvqgtC8fuUbFI54Qn3rQgthKLFpaFDEg6iMWinKssi1bNCLVYVMgwkHVYpMWipGdYVN1QBSW8wwcdAkBhbAfP7Gxs710Jk+kjVTXSzuASEeoOsVagxMLND7VdP0Ampy2L+nMIS0osplgmkV/bRyQtKtw1SdWymNKmfC8MPoLuLITzgbcBRQAp5Vagi9aLBoNhRugisFLs4Dl2zQ0le00s0rhBZtr4z8TOthQLV4Yktjdtez2L8xlVld0qwF0/STDrTrMsqm4obVkcuvxQAB747SZe/9m7+O3WWl+mNBuq2TAjaWXIiIhIWxZ2MEFRemS89usHGOhX4uOX6kag1ge460+nTl
DsJCASrSvcpZv2zZoysrVX3FBAIFWURwIIIbpakRDiGCHEw3U/E0KITwghhoUQPxFCbNK/h+re80khxO90MP3cuu0n68D674QQXxDt+gUYDPsqOsBdTCyyroXj9G6AWzo5JFZjzAKQTpasCJq6oTIEXYiFqwYatXRDlYmFTaQDwYtyLhMNlkUqFkp4Vy1fxDY5zNi2ZwCo/Po6uOlP1eMwpl+WptVYAOC4yrKIleg5wSQT9OE5nS+Z+YFhAIrjdXGSNHVW96Aak9r6qHNDWUlIbLUOI1fFolfdUMANQoivAouFEH8C/BT4v53eJKV8Skq5Tkq5DtXavATcBFwG3C6lXIOqCr8MQKfnXgSsRQXVrxZCpM67LwOXAGv0jwm6G/Y/dIC7FDl4de0+erHOQuq72KluKOm0sCySGIcYabd3Qy3OZfCli2hRBU5QIhRZ+j0H2xLT3FBpzGJ30ccScMRwnuflclaJ7er1zbfCb38IQCWIyNM8GwrbwyXC14FzJ5xgQuYbixBbYTuURB/BRF2cJCyrCndtWTwrlXssruuuayUBcbuixdRy6FU3lJTyKuD7wI3AMcCnpJRfnOHnvB54RgfI3w58S2//FvAH+vHbge9KKX0p5XPA74BThRCHAINSyn/XFs63695jMOw/aLEoxpa6g03vMnvNsgiUZQHgTL14ujklFlMtC31usgvLokKmZX8pwhKB5TGoaxEW5RwmKhGlQAnqYE75/BMJfZ7Dkn6PFzmEo92dLM679BeeVX2aogAZFFS9QxPLws14eEREz9wNj/0AN5xkkr5qa45O+JnFiMookbZMqpZFTjlSnpXKPRaHtcI/W7YvWrS8VCwWxg3VMXVWCLEYGANuAJ6WUraZON6Si4CN+vFyKeU2ACnlNiFEOnVvBVDfs3eL3hbqx1O3Gwz7F3V1FlnXBqEvxK3GcC4UYZFEZ+ZkprihcLJ4YpTxqZaFPjfptBeLrGsTWRnsJGi+Q1imQraaXjqYVZZFQccs8q5NzrWZ9KOq9XHWaacy+Ks7OG5xwPDYS+o4/gROoGMKTSwLz8viEnHQLz4FE88ybC9iszi87drrkdnFDJYKbB4p8YqD+msB7gFlUTyRHA42DYF8J2lfh2LpFuQyKNDwrS+0G0oIkRFCfBPYDHwV5XraLIT4uhAdGrxMOQ4qQP69Trs22SbbbG/2WZcIIR4QQjywc2eL7owGQ6+iA9wBqigPIYilQPZggDvNzJnqhhJuDo8mMYu0fUcHywJAOFnVakM2+W8elqjgVaucVy/tI04kv35+VGeQWXi6fUafLm5bvmotAK93H6tWTlMZxw61WDSxLLxsDktI+seegiRkMNxFsbtwLQDuwFIWi0k27ZjUlfmhEosjfo+vH/8t7k+OASCps6AcGbbNFksti7jSe26o/45Kkz1MSrlBxx4OR1kj/2MGn/Em4EEp5Q79fId2LaF/v6y3b0HVc6SsBLbq7SubbJ+GlPIaKeUpUspTli2b33a9BsOcoy8cAU41kJoIqyfdUHFLsciSbdbuI70o6s607bAybSbuhSVKOm0W4LXHqP/nv3hmpCoOuYxaU/qc4SMBOC36Ve04lTEyYdrvaXo6bDZbt85j3wJAyer+zj23aBlDosBTOyZr7cndHAjBJusoQu3UqXdDOTIkbhPTcb08sRTE/lSxKCqXZYd40J7STiwuAP5ESlnN/9KPP4JKp+2W91BzQQHcQm2G9/uBm+u2XySE8IQQq1GB7F9pl9WkEOI0nQX1vrr3GAz7D3G9ZaGby2H3XoA7LBLb6oI+rVlgJtc0wC1Ty6KDGwpqXVubi0WZYpKpuqEOGshy4spFxImsFrPltGVR7d46vBqANZP31Y5TmSAXTajH+eFpH5PVLTtG8qvhrV+gYA+ywz6449pTnL5hhkWRTTsKtcwuPediohKS0c0Gq99LEmORIK3WMYtsxqFChthvErPI9E3rbzXXtBOLREpZmrpRSlmghRtoKkKIPP
BGGtuZXwm8UQixSb92pT7ub1FxkceBHwMflTX7+8PAtaig9zPArd18vsGwT6EvHL508Vz1XzPGQvSgGyrSlkXGabxAWW5Ot/toXHPaJykdOdoOJ6Pv6pt1ng1LFJIMi3K1i+rZx6iwZyoW2aobSidTZvpg4BC8qO6OvDJOLtZi0cSysPU6Hx84E/qW8D9WbeTm7Ns7rr1KbpgBimzaPlari9BxnslKRH+fepyk55gmALRxQ2V1K5Q4mPK9BIV5j1dA+wC31DUQzeSqq4ibFpslU7aNoLKjmu1/BXBFk+0PAMd385kGwz6LDgLXWxYJVg/WWZSIWlgWtrYsKmHjJSIMVDNx0YUbyk2zfprUWsigzGQ8XE2RBXjdsQfxj7dvIp9Rl7PslJgFAMNHweQ2nkkO4ShrG1TG6Ysn1O1yE7FA1zQ8kH01ZwKTiUfG7eoeWaGtldGRlwkqB6tG6vrcJ8ohy/vyUASZTuOLO2eLZV2LABd76veyF9qTQ3uxWAT8mhkEmA0Gwx4QpWLhVC2LBAsheywbKigSarFwrOkBbuWGahS4KLUsunBDpV1bm1kWSVCkLA+uFt8BnLBiEUv7vaolkbqhGrq3Dq+G5+/hIbmGo9iGrIzTn0wS2w52s7vy497KFbdv4elkDaBmcE+tVm+LFqABOcnI6BiHQJ1lEbJmWR+8XOeG0qLRrg4l66hZH144xT230GIhpVw1759uMBhqpG4oXLKpZSF6LGYhZYNYTHVD4ebIihA/mCIWuuW4pf327Uh7K/mVIrc+9BIbDh/i8CVpe+4SJbxqWw8AyxJ85p0nkHNTy2JKgBtgyVEAPCmOIuEektIYiyngO4vIN/P1e/08NfRaxkvqIu5HSXcFeSk5ZVkspkC5pN1f1ZhFRJ8+R6ZYFu2C1LmM7sg7VUSD4l5xQ5nusQZDr1ANcDt1MQu7tyyLOAAZE1rN3VCpqyWe4iqJdT2B1UXMIp9TwrBzdJxPXP8w19/+S7juQiiPQlimjMdwvvGi+rpjl/Oao5THu6llseQVAIz3H0nJ6iMuj7FYFPAzredTDOVdRktqtGowS8tisShQScerpmJRDqvnWBWJVDTatHBP3VBMsywKBHaO1332ZzyxbaL7Nc4QIxYGQ68Q+STCRWJVU2clVm81EtQFYKGlRKGVWEztGpvoC5yd6RyzyOXVXffTL6l2GYNb7oSnfwwv/go7rlAhw3B/+0AwTBGLo8+D87/KrqWnMkkfcWmcxRQIM61bjg/lM4xqyyKIkmocqSt0d9ohCgTlmmXhRzF+lNCXT8UiaPzdJmbhObZuslhnWZR2w+QOxqMMz+4s8tT2yZbv31OMWBgMvULkk+i25OkFLxE9lg2lxcJvZVno0apyimUxEzdUrk+JxbPblFj0TzytXhj5HQAl6TUEuKfSNMBtu3DSRRwy1MdYkoPKGEOiQNxklkXK4rzLZCUiihP8KO6qiWDtJLQbSkwSVlLLIs9kRbkUB3KuqrWIG91Qoo1lkcvYum+WFuItD8CXXgWlXfxuye8DTIsVzSXdzLO4Sgixdt5WYDAYFLFfbSRXLcrDgl5yQ+k0UF8oUZjmx9c9o6aKRaK7q7qZzm6o/rzyvz+/Q83NPjJ+Qb2waxOgRqoO5dtfVAH6vemWwLJ+j9EkhyyPs0gUiL3W8ynSzxgrhwTxDN1Q3iBSWCwWRaJKzQ2VisVgVomFSMWi6oZqlw2lLIvqvg9fp/4eH7qb+/t+H6Bpa/i5opuzfxK4RghxnxDiT4UQnYfQGgyGmRP51UlyqctDCgurlwLcunq4YukK7mkB7uaWReqWqhbctaG/X43LKRQKOBYcY2mx0JYFbm66RVNHmhzQ700vcFvSn2FC9pFUxllMkSTbWiwW59X7x0qhdkPNQCwsC3JDDDFZG6/qZJkoqxjIQNYhFC4iVs9rlkUbsXBUzKLakTcsKwtm+Vq2jqnv2w8XUCyklNdKKU9HVU6vAn4jhLhOCHH2vK3KYDgQiW
uN5LLV1Fm7tyyL8igAJVtd0KemzqaWRbXFhSbN4HG8zjGLwQFlWXgi5M1H2gwL7fPXYuF47TN/au0+plsWw30ZJmQep7STvPCnTa6rp2pZlIKZp84C5IYZsoq18apunomKEofBnEuEW7UsUsurrVhoy8KqBsXLVXHeOq7EeEHdUAB6rsSx+mcX8AhwqRDiu/O2MoPhQCOqEOkW1V5d6mxPxSy0WBQt1XxvmhtKX7zElPROqQPcTheWRTarrBaPkLOHVNxi0l4Mk9vUMfTrrWiaDaVZ0ucxSZ5cqM6jWauPlFQsRrVlMaPUWUDkhlhqFZFBCRDgeOyYUN/LQQMekXCxdHfddAiS3SZbLOuqmEX6HsJKVZy3pZbFQrqhhBCfA54C/gPwd1LKk6WU/1tK+VZg/bytzGA40IjSsZpUJ9BJ0WNFeVosCpayLKa5ofTFK6o0dgqSUYVYCjJu52Z3QgfBswQca6npBPdZtUuNl2s/1fnwJX3kMzYHDUy3YpQbqiY2VhuxSN1QL42WKIdxY8C8G/LDDImCEko3D0KwY0KJ5vLBLLFwsBJlaURaTK02loVtCSKRwZ5iWUgpa26oBY5ZPAacKKX8kJTyV1NeO3Ue1mQwHJjEPiEZso5dncMssRH0omWhMpamu6HUxS7wG0d/ysgnwCXjdpF+qtNvPUIODZ6j6AxxX6XWeDqXb++GOmvNUh761BtZlG8Ss+jLMEFNLJy+zmLxb49uR0p49ZGt921KbojFoqBnWahz2jZeZnHenTa3o9s6lNium/URVlQcpBJR1EWQ04ZOzSHdSOXDwLFTxl6PA8/PchCSwWBoRuQTiFoTQVCps1YvuaEqY+ANUooEtiWmzeBOC8+SoNJYyBb5+LjduXIsixCXPjtkYPxZdi4+hi1bh0HrTLa/vWUhhGhZE7E4n2GCWmsMp39py+P0ew6OJfjV5t3kMzanHDFTsRhmUE5iRWXIKIHaPu5z8KAuXBQullSWRdocsFMdSmJ52Po9RGokbGpVwMJbFlejJthdgxqA9O/Ad4GnhRDnzNvKDIYDjcgnrJtlASBFj1Vwl0cht5iiH5PP1CygKtoqyIqA3cXarAahK6+7DRJHVobD+iVi55N4K45nm6z1I+3LtxeLdtiWaBh25A60FgAhBIt13OL3jlo68wB3foisrLAyeLb6veyYqHDwIl24aLnY2g2VzuJuF7MASGwPNx0MpS2LbeM1sZg2dGoO6ebsNwPr9VChk1FxiseANwCfmbeVGQwHGrHf0HEWUrHoIcuiPAq5IcpBXG0J3oC2LDxCdhVqQW47nKQgc10HibO5POcObYOwxOBRr6aSr82SGBiYPtluRmRr2f/ewJI2O6qWH1AbsjQjdMuPVfFzcMoHANg2XqlZFlZGTQSklg1lux2yxdLeUZGvYxY5to4poRnIOgtuWRyrZ00AIKV8HCUez87bqgyGA5EoUE0E3XrLwsLqNcsiu5hiENGXaeLFTi0LGi0LJyxQINf13blwcthbH1SPDzuV1UesJpR6qNGiPRMLp0+ly/rS6RgsTzOiXrtmFmJx7Fu4d9m7udD6HLzmI4RxwkjRZ/lgnWUhVQ1NrMXC6VC0KNMW71GlallsHSvjWIKVQ/kFF4unhRBfFkK8Vv9crbd5QDhvKzMYDjSiimpBPdWy6G58zN6h3rJoUsdAXSbTSLFmWThRkUnZvVjgeKonVv9yWHwEJ69eyg7UnfqiPbQsMjqoPU4/YmqAfgorh3Ics3yg1vV2JgwczM+P/M886iur6OVJHympc0NlcHT8oVqH0iFmUR0eFQdKMNwc28YrLB/Mks/Y81pn0U2A+/2oUaqfQM22uAf4S5RQmMK8fYHSbmUSz/PYRcMeEgfqbtdptCzsHnRDFcci8m6Ty4ftIoVNVgSMFGqWhRsVKLIMx+ry32Dqjln5KhCCV60aZqtcwkGMMjywZ7MbcgPKspgU/RzUYd9Pv30twR7crQ9kHQLdW2q7ji2kYi
GtDI6+35aRTyQtPLf1WFWg1mgwqqgKbm1ZrFicw7HFwlVw62K8H0kpPyulPF9K+QdSyqukbXAblgAAIABJREFUlCUpZaJHrBp6mcnt8Nlj4cl/WeiVGDoR+VSkU22EByCF0zups1JWxaLUyrIAcLLkRchInRsqExcpifz0gHgrUnfLYa8G4JWHDrKTJZTxGgYfzYa+QWWhTIjOFspg1mVpf+d+Vq0YyCpBLVQito8r6yGNWUjbxU0ti1CnFnewvKxURIMSJCGh5bHp5QIrhnKqaG+h3FB6BnbJ9IPah9n6sOo7s/WhhV6JoRNxQGWKZYHVQzGLoKgGMaVi0SzADQg3yyI3YqQuwJ2JS5StztXbVapioUq5XNvit8vexHXy3GqF9mwZHsgzIXMUrdlnVXVLWkU+WYnYrgvyamKRwSVCSqnSpnE6ioVIxaKiqhYe3FphdzHg3acchudYC+6GqgCPCiF+AlQrbaSUH5u3VRnmjpd1bsLIMwu7DkNnogoVx2mos5DCxuqVmIUuyCM3RMmPqjOvp+HkGHTimhtKSjJxkYqYgfvIyYLlwiHrqptOPPvd/Pr5N3RvnbRgSV+GF+RytjuH7tFxuiEVi4IfsWOiQsaxqsV+0vZwiYgSWSta7NayqIwBcOezk5x19DJec9QSrr//hXm1LLoRi3/VP4Z9kR2Pq9+7jVj0NEkCSURZOtWuqQAIe+4ti8LLsO0RWPPGmb2vKhaLKYWtLQvcLANhxK7UDRUUsZDE7gzE4pAT1VzpulTS844/mPOOP7jNm7pjuC/Du4NPsXb5Et65x0drz0BWCcNkJWLbeIVDFmWrYidslwyq7xRxqCyLDqnF1aK9shKL0cDmv557DKCHI81jzKKjWEgpvyWEyAGHSymfmreVGOaHl59Qv0eeVT5nE+TuTXS/n1JsN1gWWDb2XMcs7v8a3P0Z+ORL1crirmiwLCbbWhZ9dsTuNBvKV9PbIncGbp/Xf6r7fWfI0v4MJbI4nWoa5oBqzMKP2KGzlqo4Hh4RpShRNTaysxsqXXNcHsUGvGye41eoKIHnWlQWePjRW1EtP36sn68TQtwybysyzB1xCLuehuxiCIsq2G3oTXTqZCmZnjo7526o0i7V9lx3ce0aLRZhZhFBnNDXxrLIW2HNDaXFQnZoLb63GO5TAev6epb5ohazCNk+USvIA0DHLIIohjjoyg1leyruExbU30LUib3nWAs7zwL4NKph4BiAlPJhYPW8rcgwd+zaBEkIx7xJPTeuqN6lKhbWNMvCmmvLQrswZisWZUfdyeZaiYWTJScCSkFMOYhrYpGZ/4ByNyzOuViizfrnkNSyGC2FbNduqBTheFhCEgQBQge4Pbv9mlw9DyQuqb+FVdfy3XNUnYWUcq5PQ31WF/tETRoGzs9qDLPnBx+C39zQuO1lFa94sO8M9dwEuXuXqhvKwbOnVHAjSZI5/C+XupMmZigWOqhatJWF0LJlt5Mli7IqRoo++BMAWNk9bNMxR1iWYEm/R65Zncgc06/F4r5nRwjihBNW1hJL03nbQeAjurQsMtqySLTgO1Msi0RCNJf/Vuro5tt6TAjxHwFbCLEG+Bjwi3lZjWF2JDE8+j1lRZz47tr2lx9HWg7vuaOPx3MutrEsehc9V9mXDvn6i7CwsUmIpcRijuJNldSy2Dqz95VHwclSStRFrmWA2+vHS1QB2kghYGWgyrHsDq019ib/3/knsHJ4Bqm8s8RzbDK2xT2/U0OcTl1Va1yYzq4I/TLZJOgqZuF6ShxkKhbZOstCW6R+lLQdOztbujninwNrAR/YCEygqrkNvUJpRLVGmHqnuONxKoOr8cmw1TrYWBbd8vKT8PT/27ufqS0Ln0zjHbvl4BATz6llocViYhZikRui5Cu3WMsAd3YRmUi5nkaKPklFWRZurnfKtd7wyuUce/DesXT6sw6lIOaIJXkOqotZpLMrorCCSEIi4aiuuG3I6CmBQgt+Kh5AtZhzvmZadJ
MNVQL+Wv8YepHU9zz1TnHHbxkbOB6AJ4NlHLLrd12Zkgc8d/xPeOnX8J+f3HufGamCrQCnMXBsqQD3nLoWKnsgFtnFlALV/K5lgDu7CCfQYlEICCrjZAE33ztisTcZyDrsLgbT5mGklkUU+NhxQNhF7bOnLQmhXXuuV0tHTos556vWouO1QwhxNKoX1Kr6/aWUr5uXFRlmzuQO9XtiWy09trQbxl9g28HvAGCzPBh2367y+Ts0TzugkRK23F8NOO81tBsqwG20LFI3VDxHYiHlHgS4x6rV29AmQJxdhIh9PAJGigGBr8TC6zswxSLNiDp19VDD9qplEfhYSUAkOt/KedqysH0VRvZy9WKhLYuFEgvge8BXgGuhV5rUGBoo6JTY2Fd3f/nhanuPze4aAMazK3AiH4o7YWD5Qq209xnfAoUdamby3kS7oQLpVC8uACK1LOYqw8WfVC5LmHmAuzwKi4+gmFoWrQLcel7EMrfCSMEnCsfxpUtf317+TnuE9O95yqpGyyIddBSHFWwZEIvOPa9ynkcsRVUscvlmlsUCuaFQ2VBfnpdPN8wNqWUByrWQH1YVusBTYjWLcgWOWHkYbAZ/cieeEYvWvPSA+v3/t3feYXJd9d3/nOmzU3e2r3ZXvcsqyN3GxrhhG1MMJDYhwTGBQAh5SQIJvCGBF0JCAoSE0EKMASfBtBgwzRQXcLckWy6SLFldK22dstP7ef84987M7k7b1a60ku7nefaZdmfmnpnZ8z2/egrZ+sfNNRWWRcsUN5SFIuniHK0WdReUq1MtMmZiaabC0LO5bFnU6tHkUF1d+52qmWChECWGs1TNfK7hdVppc9lY1j65gn2SWBRzFEzNiIWFLFacBdV5yVFpWWgB7vQ81Vo08yv5sRDiT4QQPUKIgP43L2djMDsq3QmxIfaPxji663FoXcJgxkGb28ZAn9rwfvD4DP3U5xqDmlgU82oiPVVoMYsM1mmWhZni3AW49bTZrnVqjImxGTw3olp9ZJqzLPqcqk15MR0jIR2TxnUu8WevXsnnfnfztJ5WeuuOfDaDSeYomBqLaYvVTIbycS0t5ULHkhtqngLczYjF24EPotJld2h/2+flbAxmR3ykvK9w9AT3PH0MTuwk37WJYDxDu8tOT49qmjY60qRYTByHvT+fpxNewAxW/LQLpzBuUdAtiymps7obaq5iFnq8onOdumw2fTafUV0AnK0kc3o2VO2YBUC3PUMwkUFmYsRx4nWcm2JxXp+PK1ZN32lPb91RzGWwFHMURWOxcNrKYpGWVtzO8nPmO8DdUCyklEur/C2bl7MxmB2xYejeqF0fIhUNMmAaI+JbSzCepc1tK4lFONhky48nvgjf/j0o5OfppBcghRwM7azYYOYUikVeT5214rbNo2Whu6E616rLZuMWushoqbMmweRW6pVoYtFpzRCKZxFZJRbuc1QsaqFvoVrMZ7DIHMUm3FB2i6ksFtgmWWvzHeCuKRZCiL+quP6WKY/9w7ycjcHsiI+Avx9a2iF6Al9EtSU/5lhNMKHEwupuByAZadLtED6kAqGp0Hyd9dwRPKB2DTtZRl5U7qC+C9TtwincNVhPnZXWSZsKCZMFk5DkC3PkWtDdUJ3r1WWzlkVFx1l9/+2arcJLAe4U44kspmycmDx3Yxa1sGpuqEIug5UcRXNjsRBCkKsQi0prrVyUd+rdULdWXP/wlMde08yLCyH8QojvCyFeEkLsEUJcosU8fiWEeFm7bK04/sNCiP1CiL1CiOsr7t8qhHhBe+zz4mQb2p9NSKksC3cXeHsgNkRXXHWafVEuIZzM0uayg7WFnLCSjY0397rhw+pyJj7t00EhB195JWy78+RfSy9a7NX2UDgNbihpsU+qvhUmJRyF/BxZeLqF0L4ShHkGlkW542wqW6jfV0kTi1ZTimy+iDmnLIuadRnnKLplIXIqWN2MZQGQ07Km0tI2yVpzlGIWp94NJWpcr3a7Fv8G3C+lXANsAv
YAHwIekFKuBB7QbiOEWIcSqPUoMfqStq0rwJeBdwErtb+mxOpkOBZKzm0/nvkiGVJtPjw94OmF6BBbUk+ypzjA0yMCKVVLZoQgY/VjTkdKRVU1kbIsFvHReR/CSZGeUL70maaBViMZVJdebVOc0+CGslinbOFpUpNBYa6snHREvabdA57u5mstKsQikS3UDm6D2rjIbMNvSqqbuQQZs+ukNy0629AtC5NWwCibsCwA8rpYTHVDWU9fzELWuF7t9jSEEF7gCuBrAFLKrJQyArwe+KZ22DeBN2jXXw98W0qZkVIeAvYDFwohegCvlPIJqdop3l3xnHlhNJrmik8/xIfufX7eOjjOGXqNhUezLIL72SRf4ieFi3nmiPoHb9P2EJbOVlpFjP2jDbZOT4xBLqldb9ISOV3oK+X01F6XsyAxDghlpcGpTZ/V3ktvFKdTsizm0g3lbFWFm+6u5tvWT7Is8rWD26Be2+HDjfoNOYoJcuaF0Z58IWHRxMKcVZaFNDe317eeYpvBistWGbM4fW6oTUKIqBAiBmzUruu3z2vitZcBY8DXhRDPCiHuFEK4gC4p5RCAdtmpHb8IOFbx/EHtvkXa9an3T0MI8S4hxHYhxPaxsdm7T4Ym0kgJ390+yGd+qe33VCzAfe+DY0/P+nVnRCYGuXTj4/R/dne3sizyynf/0+JFHI+o6/qG8xZ3G34RZ99IA7EIHSpfX+huKD1gm5kDsUiOQ0sbWLUJ+5RaFmmKmHDYJ08YZbGYQzeUVgeBs7V5kU2XA9yJTJ1d8nQcPlzFBFby2MiRt8xgl7xzBc2SMOe13arNzcV0dLHICTumil5Spy3ALaU0Sym9UkqPlNKiXddvNzMqC/AK4MtSyi2o/bs/VOf4ajaqrHN/tXP+qpTyfCnl+R0d01PVmiWcVKu8jX0+vvjQAY6FkuQOPwnP3M2R//3buW3qNpVUGH76l/DpFfCTP298fKzSslDukxeLSwja+kuHtLnVj8vu6SAg4rw8Eqv/mroLCha+WMy1ZeFqL2dDndIAd4acsOKyT56ETWa1cizOlWWRVrUSgIotNPu5pcIqxmH3kszW2X9bx+HDUYjhQi1YijbDspiGJhaWnLZ4a9qy0NqEmCYfr3esTZ/GOovZMggMSimf0m5/HyUeI5prCe1ytOL4/orn9wEntPv7qtw/b0SSapJ4/zWqVcaDL40y8tT3AOgPP8X7v/Ij8oV5Kth67N9g+13gDMDBh1T8oB7xCsvC2wPATwsXT+qb367tDGZyBWgzJdjXrFg4Wxe+WKTnUCySQZVRpu0zcKoD3NP6QlEZ4J4j4dKaAQKaWEQmP57PVs8sS4WVyAhBMtucZWHLx3ALQyxqootFXhMLS3Mxi6ImKgXz5G1hzSaB1SxOX53FbJFSDgPHhBCrtbuuBnYD96EK/dAuf6Rdvw+4VQhhF0IsRQWyn9ZcVTEhxMVaFtQfVDxnXggnMrzW9ASbOy0sbXfx4J4RXIfuZ5dcihCw4sQPeWm4wYQ7W8ZfhvZVcPmfq+Bj9DhPHwrxu//xRPUVQ2wE7D61l3L/xZxYcRvfKbyKTf1qQrCYBF6nNgE5A3hljKFIgzTT8GHl0vL1nQExC82XPmeWRdtpq7PITvFBA4i5tiy0ZoBA2bKoXJD89C/gntuqPK8sMkosGlsWpkyULpsW97EvjI2PFhRa19nupHJ1Jx3dTT1ND4QXqlgidov5tGRDzQXvA/5HCPE8sBn4B+BTwLVCiJeBa7XbSCl3Ad9FCcr9wHul1Due8R5UI8P9wAFgXkuL/cd/wxds/47/8U/yqtUdRA7toDU7xDNdbyLcewW/Y/4NidQ8TSThw9C6FPrOV7ePPc0jL4/x1KFQdYGaGCxn79jdbNvwt4TwsqlP/WO3uW3lLJSWAGYKpBOR6a8z5RxiLX0cSrWcY5bFuLIsdN/xqQxw5zNkpWVSjQVUxizmMBtKd0M5/WqM+YrYWPBAaYfFSeiBcSCZzU9zl0
1DE6Jepzpv4Vg4Gx8tGEwWigj6MgeIyhZGvBuaepoeCJeW6Zs32S2m0xLgPmmklDu1GMJGKeUbpJRhKWVQSnm1lHKldhmqOP6TUsrlUsrVUsqfV9y/XUq5QXvsT+U8pyidN/g/AJh2fIObeuNczxMUpKDtFW8guvIWekQIMbZr7t9YT1ltXQLd56kUxMHtDIaVJbD7RHT6c0IHIVAuqA8l1AS3qV+5odpcFasPp2rpJZKh+lle4cPsSrbybMhK4UxInQVIRxu77OpRLKg0ZFd7acV3Si2LQob0lL5QACZNLObEsigWID3BUNbBx3+8G2nXXJWpisVDekKlS08NqFeIRaJRnQWUxGKRXf12Tbrry6CMEKUWH4/K87Bam3NDSf33aXFMe0yJxZlpWZx5jO5hRWwb37e8FqxOtj58O39iuY/fyk1ctmkNpg4VxyAyWP91ZkMyCNm4EguzFXq3wODTDIZVCuLuoSmr52JRVVoHlpbuCiWymAR0ehx0ee2l4DagutECHhkjmqqRXZNLQ+wEz8b9BKX3DHBDaROdLICWgji71wkDUrMsTkeAO6u2VK3lhirOQTaUJqzPjkrueuwQx1LWSfer6xFATrcoNfdVvlAkmy9Oc5dNw+GDfJqNHKAoBdnW5Sd//mchukvpocKmhluqltBFwjpdLBxWsyEWp4wnv0xW2PiR7/fgig8g4iM82nkbv1r/T/icVmyBAQDMsXkQCz2w3LpEXfZdAEPPMRpSFsU0yyI+rFwIU8TC32LDbBL8yatWcNuFA+XjNcuiVcQJJWu4WCJHAdibaSMovZjzyZObhOeb9JRV8WzRJ0dX22kJcBczcRLSjntaNpRmWcxFBbf2WR1NKTHcNqxZYpWfW62NkfQtVRs1EdTRqrg3ZJ/joOzBcY5ufNQIoRVh/rawsWmx0HfYE1X2XLFZTKdvW9VzimIRhl/gQdtVWNztcNkNcP4dXO7wcbl2iNPfRVpascaPz/37VxOLxz9PW+oljomVvDQco1CU5X16QwfVZYUbKpzMEnCpye7tly6Z/PqaZeEnRiiRYWl7ldx37RyOyk68ulGSGAfbAs2Tn+pC8VUtwWmMbkFVWhaVbqjoCbWh1JqbZvf6DSgmg4Sle5plUUqdLc7BBKCNcX9MfbG/PZblTVAWi0JOVcOD6jdWOjnlvmpq/20dze3Ul9rLffKic7Y9eSPMFjt75QAjBLCZmxMLoVkUZtt0sbAblsUpwmSCdz7IZ8Tt+FtspUrUSlx2C8dlO47kHLSXmEpYK4bza9bAwMVIYeZm0+OcvzhAMlvgSLBila+JxePhcvAwGM8SaKnh+6ywLILxGpaFLlj+xXT3ahnLC9kVlY6UzfKTsSyS2hhdNQLcT/8nfOdtpU2K5pxUmAncVWIWejbUHFgW2u/l2UQrHR47z+tfa7UkgUrLIj0BSLWXRWmXvOYsCxMFXiwuxWN0nK2KOO/N/Nz7ZqBOF98pmHSxsJ9FAe4zEiEYTpnxt1SvO7SYTQyLdlyp+RCLw6QdHbz1m8+rvlTuTkaXvZHbzA/yuuXqq9pV6YoKHaKAmTt+MFyq1g4ns7S6atRMOv1IhHJDJapPesXQIVLSxpqVK+jsVmUviVCNsc5Vt9eTITVRFteTckNVWBbVAtyxYZDFcv+oOcaUjhCR7mnZUCU3VC2xCB9RK/9mCB5AChPHZCd3XLaUqNRWpvrnVmmlVe6+WNHq45iWbOF1NqjLrVhk7ZJLjPbktbjuE4wsvQWgaTeUWRMLi72KZWEEuE8d2XyReCZPa63VOTBm6sSTabKnzkwIH+GE6OLxA0FeOK7+gXcM3IGFAq+NfRerWbB7qFIsDjJi6iJdMPEfv1EdU0OJHAFXjUpQkxkcPvzECNYQi9jQfo7KTi5Z3s5Av5qEh04cm37gxCB86WJ4+quzH+9ckI6UxSJTJVusWXQRaAlUD3AntKyw+Uglzmcx5RKEpXtaUZ5uWchqgjC2Dz6/BZ7/TnPvEz
pA0tlLFiuvXtNJW5vW5aCRZVGxl8X3th/D57RyybK2+u9VIRaZ9vWs7DRSZ2uxrlfVoDQds9B6SlmrioX5tG6rek4RSalJtLWGZQEQsnTizQfnPrUyfJhDBfUP/MBLanLam+3gB8XL8e3+b65qi0y2LMKHOFTsRAj49rZjjETTWsyi9rmLlgDtpgThGmJRCB3iqOxk6+JWVixdAkBwtEp8Zvtdyk0z/vKku9/xjW1866mjzY/5ZCgWlED4F6vbJ2tZOPzKBWW2gDBNCnAX9JV2ch5cctpkHcE9LcuormWx4+sqC2zo+ebeJ3iAMVsfJgGL21po83lIC3tFy5Rw+dj4dMsigotf7BrmllcswlFr/20dXSz8i7n3L26iw9NcK4tzkQ2aWDSMA2lYbMr9ZHNWi1kYbqhTht7qw1/HsohYta6k0TnsOpLPICcG2ZNWK7aHNLE4Hklxt+P3ETY3H8/8EwcGh1WNhJTI0EH25zv4na39FIqSv/7f5ykUZV2rCGeADktikhvqzkcOcsc3toGUuBKDjFl76fE5CPj9JHCQCk9xQ+XSsOMb6vpE2eoIxjM88NIoTx+aH1fNNHRxaNXFokGxYT2SWl8oHbNt0mIgGVaWZD46D3Un2mQ8IV1VekMp4Z9mWeRSsFPVAxHcry5/UqP6GlQNSuggR2Q3fa0tOKxm3A4LcVzT3VD+gSkxC3X//Qcy5AqSt1Zm2NVCF4ueTY2PPcfZ3O/nK2/byqtWN9fPrqfdr122TnvMbjGdsRXcZxz6irvehBuza2X5E3OYPhs5hkByMN9Of8DJC8cnGI2mGQwnsQf64M1fozNzlH/IfZqhYwchGURkYhyR3Vy6oo0PXr+ah/cqF8mk2oqp+BYxwPAkN9RTh0I8+NIoJ04cwy5T4F9SqvqOmfzYMlMm/133QjKI9PQiKz6D549PECBKPF6lq20mBl++DI4+OfvPaCq6OLg6weI8actCtrSXEwjM9nKAu1ikJatqR2OheXA/llbu0y0Ls7lGgHvXD7Xsr34IatbdgQfg4MMqq28qiXHIRNmT7WB5h8psc9stRCvFQrs8bOqf3LpcO7/v70pwwZJWVnY14VKyOFSd0OobGh97jiOE4DUbuktdYxvR5lWfv98zvYWK3WJkQ50ywiXLorYrJ+nUxaKKL3+2HPoNAHvkYt5xmaqbeHjvGIPhFItanbDsVQxd9vdcZNpDx92vhB/8MQBHZCd9rU7efeVy/uzVKwDo9U3PkijRfxHdxZFJVpFuZTy5fQcAnp4VpccSFj/O3ORJuPD89wk7BvjGxBYK4WOlyukXjkW4z/4Rrg/ePf19R3apbUuPPDaDD6UBJV+6Hxzek8yGCjIuPVz56YdVPYulwrJIRzCjVva6hTGn6GJRLWahTSByagX3c9+CthWw6TZVGxMfU5lsuWQ5q66SkIppPRMPsKxDNfVz2y1MyJaKmIW6fDTSpmIz8TH45UfgmOoF+uw4vHJlk92chYB3PQyb39rc8QbN075SNQ5tm17o6DDcUKeOiFas1uqqvTrPtKjurrO1LEZjaT523y7imYrV4s5vEXKtYI8c4KaNvSzyO/nsr/YyNJGmr1VN/h1XvYeb8p9mn/dSGH6RgsnG7uIS+lqV7/LPr13Fg395JRcuDdR+84FLAFiceK50VzCuJsUdO58FoHvJmtJjKasPV2HyJBwa3MtjiR6Oy3YsxUwpOHz4yCH6xDiBbBX3nO4q0T+zh/4Rfv7XDT6pBuiTnMOntZc4iQB3YpwTWbXifvpQULMs1MJBVvjvc7F5CHAnldUSYbobylyrzmL0JVh8qWo6KYuw577yYyNVWtFoW8buzXexXBMLl91CpNgyybLIYuWlTJt6zYf/AR7/d3jhexSsbgqYWdWMVWEwv7Qthw/sVY0+p2BYFqeQSEpNEPUC3A5nC0H80yyLB18a4favP83h8foVz4/vD/KNxw9z5yNaUd34y3B8O096r8PfYqPdbePLb3sFrS
02CkXJkjY1idksJjy9q/h/9r+ED+zlM1sfJGjuoEPb3EgIwbIOd/3tK7s3kjG1sCbzYuku3SUVyKpA9spV60qPZa1+3MWKSbhYxJcdAd8Aq1atBSA9fhgpJZkT6jWdhSqT9lSx2PNjlcVzMv2cdMvC4Z/Z3gxTyaUhGeRoWonyzmMRZVloAe7oeFn8ZHwexEKPWeDGaZ0as9AsjWJFZlYupbKz/APQrlmBu35QfryaWIQOUBQWBmUHS9rV4sLjsBCWLmSqHLOICTcjUvOFP/vfanFx5YfYs/R2AFZ3G2KxkLFbTGTzxXnZ4dMQiymEk1lsZtO0f9pKVGFe2zTL4vDTP+Xmgx/nji/cx+P7a2fNuAcf5lH7n/HjR7YzFsuw5/6vIIWJHxUuY7k22W/s8/OT913Of73jQl6/uVyVvKnPx4vHJygUJUcncixqdU7aLashZgujvvPYwh5S2QKZfIFYOs+mfj8DYpQxEcDvK6c95uyt+GRFt9vEKDZyJFt6WbZCWSAH9r/ESDRDV1q5P9zF2PT9PirFQkpywYNqkjyZJAFdHJyzEIvDj8Kd16hV/Z77QBa4P7EK0MSiIsAdHFGLgmHZijk1D8H7VJgiZoo27zShLxXlVVoWWksW/EuUK0ofT0u7uj3yItMIHiDuXEQBc2nx4bZbiMoWZIUbKkYLo1Jr+lfIwtY/hKs+zI98v4fdYmIgMD0Dx2DhMJ/7cBtiMYVIIoe/xVp3de62WxgstiFDk33DVx//Mm8yP8K9/BX3/vB71Z+cTXDRix+nT4xzReFxrvnsg/hevpcnxWaeGrexoqO8SYzFbOKVKzsm5V9v6veTzBY4MBZnMJwquahmNMb281ktBgkFRwgn1Ir15o09LDaPEXVMbpdRdLTiEmlyGa34LqImzoyrj/VrVUvl44f38dxghFVCPdZKnInUlCZ8mhuEiUFIjGEtqNc7uqfGNrXFIuz8FkzUaauSPgnLYtudMLgNHvtX2PFNCv4l/Cyxkk7lCNDLAAAgAElEQVSPncPBJHlhLQW4Y5plcYB+HNlQvVedHakwSfP0eAWULYtJMYvwEXXZuhjsHuW/RqpOxV0bplsWUsL4Pkati7BZTHR7VZ6+y24hSgsiE1Wfd3qCiHSVLQubB9beDMDekTgrOt3lVjMGC5LfOb+fX//FFU23DpkJhlhMIZzM1k89RYnFtuJqRPhQuc7gxE4WZ/bxM9cbMTtc3Bq9i/2jVbKCHv4UnswwIenhbf7dXG7eTa8IcU/2ciLJHMs76/dg2qjtU/HMkTDHwykW+WcuFuneizAJSfbQE4zHM1xr2s4bXngv55sP0L14zeSDtRYh8YhyvxS1iarg7cPpayct7MRGDvPzF4ZYbVKWlk/ES4kC6klFJRYWB2SijO17qvTQ9qd+W/0kD/8Wfvge+PdXwKOfq35MSlkA4ayZos3TvFjkUrDvl2CywJNfhiOPcmTxm5GYuFVLC00UzCWxyESGyUkzCc9SXPmTSM+tRSpMwuSt3snVpP5FK1NnY8PKStsWdqtgfLvWCblnoxKL8CHIVPz2HvkMjO7mWctG+issUWVZuBCyCNk4MhUhUmxhHB85YYMNt6hNtYB9wzFWG/GKBU+7286KTs/MvA1NYojFFCLJXN1MKFArsp8VLlI3dv1QXT7zTTLYeKj7DxGbfpfNYj+/fGbf5CeGDsKTX2JH4CZ+aLqGZYmdfGH5U2D3sfHVKj9+TXf9HcWWtbtY1uHiSw8fYDyemZVlYR64gKhsofvxj2Hf+XW+bP1X3KkTmDa8EdcVfzrpWOFSdR+piAryZoJKLEytA2qLTWcPLakT/GjnIOssx5EIvCLFRLwibhMdVP7/gYvVx7DrAQBywoptfDfPHK0oBtMZfgGAwqIL4NcfKwWBJ5GOIB0+XvXZ3/BCUKgCvWZ8tQceUg3zbvy0Ot5k4Tct1wJw6wX9CAGxnKnUB0rGR4kIH8LdiYvk3BdjpsJEhRtPtRYaQsuG0s
TiO9uOcs8vHyUtrbzlWwe58fOPsCfbqY7t3ghd69X10T3qctcP4MG/h42/y135G1jcVl6MuO0WJtBupyNKLGQLeSx8NPBpuO4TAEykcgxH06wy4hXnNIZYTKEZy8JlNzNCgFTPheqfMR2F57/H/VyMy9eOe+21WESRoZ2/UoGmnLYT2W8/CyYLPwzcwTb7xQhZQOz7BWx4I++4ai33/smlXL6ive57m0yCj968nqMhtceFngk1E1p9Pv4g+yFMmQlWbvsou+QShm79BdzyVejdPOlYs1uJRSaqYjD54GHVlsKjLBxXx1LWu2J8/7Z+7MUU6YCyTGKRCt++Fq94OKcmspbjT1BEIJZeyVpxhIdfqlLoNrIL6e7mExMqTz83+Oz0Y1IRinYfE6kcRxIWZQkMvwCHG6Tn7rlPua02vw2u+Rhc8UG2B230B5z0+p2s7HQTzopSgNucGiduDWDxdGpvO1L7tWdDKsxY3smyal2AS+0+VObcyyNxBkxjFHz93PPOS7hkWRs/GVKTeCKwlqhPxV1KcYuXfgbubuTrvsCRcJrFbeXfi9thmdwfKj3BhFTn8ECsr1RY97K2Z/uqLmMf7XMZQyym8H9vXMsfXrak7jF6Z9DQ4hthdBfcdT0yl+CuzDW0uWzQdyE5s5MV8W0M3f8Z+FQ/PPBxeO4eOP8Ojud9HHOuVcVkAJveihCCVwy0NmU+Xrmqg+vWqSryRbOwLAItNnbKFfx461282Hcbt2f/ikCgeq8fm0fl1WdjSixk5CiDsr1kfdnbBhgwB9nq0Cqc+1RqbipakTWkxSs+vV9t/9qb3kfY0oGlfytLTMMkEtO3i5UjL/JioZ97h5V4Rg9UiW2kI+RtakI7mtRW5V+7Dv77ltpptIUc7P0ZrL5RZTxd+qfwqg+x+0SU9T3aHgyLfEQylCyIlmyQrKMdh1995uOjc1i5DxRTYYZzLaViuUloO+XpzQLjmTyLTeO4OpdzyfI2/vqGNdydupz/7vs7rrl7mD/64ahK+9XqKkiMgq+P8ZQkmS2wuCJA7babVVEeQCqMyEwQxcXSdhcj0Uypw+zeklgYlsW5jCEWU7hqTScXNWiSpgciT/ReBwgI7id803/ynFxBu8euJqHFl/Eayw78T32GvNkJj3xWZdhc9n7i6Txupw023Qo9m6H/whmf58dfv4E/vmJZaa/tmeB1WvDYLTyf6eanfe8nbvbhrdEV1O5Vk3VeEwtzdJDjsqPsqvP1qwKuQyr2YFl6KQDZaEU22PjLJIWTg+alZKUZM0UynsXQtQEzEmd4iruukKM4+hKPxbq5busaDhW7yA8+M/3kkiGymlgcimvnb3WoDaH2/Lj64CeOqVX0kstLd52IpDg0nuC8PvVa7W57KWYRTefwywi4OvAEVDHmxNjcioVMhJiQrlL9wySmuKFi6TyLGC01T9zc7+f81Yv5yP41DE2kOT6RBneXKqgDVbnt7uRoSLkFJ7uhrEQ1S4KJ4whZZEK6WK/1KjoSVNbr88cmcNsts4qPGZw9GGIxC3TLImxugzd8Gd7+Y453K593m1bMZ115NV2EMFPg5szfk73x3+ANXwRPF9F0Do/DqnzC73pYVbvOkG6fgw/fuLb5rRgrEEKwosvN3uEYobjaLKlW9leLX1k/xWQQpMSeOM5x2Y7Pqbnq9MKgJ78IgeU4OlVVaS5RjjFEj7/E/kI3H3zNOsIWZak4OpdDt8qmao3tnfymwf2YizmGHMv425vW8YJcRst4lWZ5qQhpixLLJ7IrSK68Gd75ELQuhee/XX3wWm1Gzu5n1wkVEL/3GRWYf90mZfl4HRbS0kIxn+VYMEE7E1i83QQ6VaZYfC6ruAs5zLkYYelhRWcVsdAC3Hltp7xcMoKXeLkfFvB3N6/nDy9bwi1bFhFOZsHdWW4EGB8FVweHx9XEPzDFDTWBdlur+p7AxfpeJZpHggnimTw/fWGI69Z11a/fMTjrMcRiFuiWRSKbh823wcDFjC
eUy6JNK5BjxTUAHFp1B3sybRxf9hbY8CZArQ5Lm8Gcpn/A1V0e9o3ECCYytVuaAx6Xi7h0IFJhSIawFFIMyvZy0eLqG+CVH4Bb7oQ7foFwqrRLmSjHLFJjhzghunjL+X34u1UrE/+iVeBfQlbYCKSOTHrP8EEVn1i2/iJ8LVYO2VaplvBTC+JSIZJmtQo+QTvbLvic2mJ24+/CoUfgyONq06LKgLSWMfXz/Slu+vyj/Hr3CN/bMcjFywL0ay4an9NKVlqR+QzR8Bg2UcDq66KtS4lJdmKWMYvH/x2+ePHk3k3a+USFe9Kqv4RmWWSzKtjuSmq1Pf6yWCxtd/HRm9ezvNNNMlug4OpUIlEsag0SOzgSSmISTEqIaLGaGZEBkhY/PP9ddR6ypWRZ7DgS5ofPHieeyfO2S8rvZ3BuYojFLNBbMsQz5XRGfee5dr2JX8cqePdjDG/5cwBCifKEFUvn8JzmbSZXdXkIJ3PsHYmVz7kKboeFCG7MqRBE1KSuLAtNLJx+uPpvYeNbwN0BmlhUbqTjygax+XvxOKzY29SkYwosBZOJsLWbQG6yW+fAi0+RlWauuuwyACb8ygJhaGf5oHwWsnHi5rIf/YCeqrzxdwAJX78BfvYB2Hd/+XlabcavD6nv4333PMuRYJK3bO0vHeJ1WslhQeYzZCLKirB6u7C7WslhoaCJ1lgsw9cfO9RctezQ8yqra2zPlL0iVCaYxRWobiVqAe5MTqUie1LaZ9U6ffLWEzMy9nZlWaRCqm2Hu5OjwQQ9PuekZnUmk8Bud7Ct400lyyKKi26fg5s39XLno4f411+/zLoeL1v6Z+7uNDi7MMRiFuhuqHi63NtpXOuv1O6uWKV3b6DNo1aruphIKYln8soNdRrRg5XHQqnSnt3VsJpNRPBiyUZK7U1C1h4stYp+HD4KmDBntHTYXBo3CYotWpaX7rYKKAsj6lhER2FyNlR+6AVOWBcz0KkmKNG7kSICebwibqFNsnGhxmExCQ6MKbF4OtrKV4pv5M7ia9WhowcrnqfE4umhIm/csoiClLjtFm44r7t0iNdhJYsFChkyWqDe4e8CIYiafLgmXobjz3Dnb1/m//14d2mXwmnkM3DXDfAfV8B3/6Cc1huqPB81Dpe/RoM+LcCdyWptaLKa0Pini4W+j0nC2qb6denV8a52joSSkzKhdFx2M7/xvb60NW1UunDZLXz6zRvZ3O9nPJ7hbRcvNlxQBoZYzAan1YxJQKKiEWAwnsFhNdFim9wmJKCt2vXOrslsgaLktO9JvKq77B9vq+OGAoibPNizkVIBYszZW/tgIUiaPNiyyr2Sj6qVedGl7QEycDF4+6BNFZIlXX0skiOT2oMM5A8T11NAgb7uLvYV+yju/FZ5u8+U3nxPicXaHi/7R+PsOBLm9q8/zfd8t7Nnw18RkS5Gj1UE0DW3zwQu/vTVK/j8rVv45Bs3TNp4xuu0ksGKKGTJJ9Rk3uJVSQ859yIuzD4N/3kVrufuAihVwU/j0c/B0cfVRkrRE/CaT6n79UwloKC561rbO2t8npobSrMsvPkgeWErW3AV6HuwRC0BQJZrLVydHA+n6K+SZu22WxgpeErdYcPSjdtmwWE187W3X8DHbl7Hm7YumvY8g3MPQyxmgRACl80yqWtsMJ6lzWWftgLTA956s76YZo2cbsuiw20vZTTV3f8CiJt9OPIROPEsQ5Y+rK76LomUxYdda2seD6nVrcmjicXKa+EvdoFdiVXO049PJImGVfZUOpujkzBpV1mQlne4+ZvcHcoPf/frVYGeVqQXki5sFhNrezzsGYryrru30+mxc887L+YD169iUHZgiVY0fExHyGOmr6ON5R1uXrOhe1LvLQCf00IWC6ZCjoL2PnaPEovsLd/gbdkPc8y5htemf4qgqILKUxnZDb/9DGx4s0pi+MgIXPAOlRFXYVmExpVV1dnZU/3DNOkxixy5QhFvcYKUtbVqrEu3EEMm7fvRay1cHU
RSOfxVdlB0O6zqd/zqv+VXqz7KCdpLbtaAy8btly1tep8Fg7MbQyxmictumWRZjCeyVX3/DquZFpu5ZFnE0mqFeLo3sBdClFxRbXXcUABpqx9XPgrHd7DPvLJh0WLW5sNViCKlJBlUYmH1dVc9tujT2mtorqJYeBSLKCJbym6Z5Z1udsjV/GbL55TPf9e9JfdNsODC67CwvMNNNJ0nmy/ytdsvoNPrIOCyMSg7cCTK/aWy8RAR6eL6DTUmZzQ3lLQiKGLSGwc61QQ8sGQ5452X8pmJV7PMNMylpl2TxUJKck98hfRXr2FCOvma592MxtJqcjeZoXXJJLEIjitLaVFvjdW7UP+ihUJOFYyKGBn7dKsCynuwjDNZLNL2NrL5It4qCxS33azEoiXAdv8NOKym2i5Gg3Ma41cxS1x2s8qG0hiPZSbHKyoIuGylPSNiGd2yOL1iAeWK3HoxC4Cs1YdLxiE2xPNyeTm4XYO83Y+XOMlsgUxE+didrdUnZ3PbEvUeY2oCLW3h6i67ZXq8DhxWE48VN6iA78RgyQ01WnDhcVjZMtCK1Sz411s3l+oV7BYzw6YuPOkTpXhBJh5mQrpYWacaWQ9wA9hS4xQRYC934r3xvB5+XryIiPDy++ZfT9qiNrn3Qay/+Gu2ZZfyfven+cRDo9z4b4/w+IFxgvEMOd8SqGhAmYmOUZSC3q6u6iejBbjNFDkeThEQMfI1xEIX8eGCdq7DL4IoF95V+97cdksp9hbP5EvxOAODqRhiMUvcdsvkbKhEpqY7p81tn+aGqlUEdyrRG8O11RA5nVzF5LQjv7Rh7yzpaMUv4oSTWfJRtXJ2tVW3LOztKtBdDB0GIB1Wx1u95cnTZBKs6HSzZzgOnl7l/9fcQ6M5Jx6HhQuXBnjhY9dz9drJk27E1o2tmFbFaUAxGSZGS13Bc1jNFEzqcWdmnKRwleodQIlFTlg5MnAL15h2kIyW04T3v6ziBLHr/oWv/+Wt/OL9V+B1Wnnrfz7F1r//Nd85YEWGDpaD3akwE7jw14obaW4oM5LjkRQBYhSd1YtGrWYTHruFE3ktQyyhaiyi2u/UW2XMLnvZnZrI5Kt2vjUwAEMsZk2lG0pKqWIWNSbdNpdtmhvqdMcsAK5f381tF/aX8uprUXBotRMmC0+lFuF31rdEREsAP3EiyRwyPkpIugl4qnfT9fjbiUonYkLt0ZDTAuJ23+RJ//zFAZ45Gqbo6VFikQqDycpYxlqy0hxV9iCJOzX3jrYHhND6H1WbOCsxW1V2kDsXImWe3OZiRaebn/3ZK9lwwauxiCKEj5YeK8SVcLxizTJAbRb0o/dext/cuJabzuthb64DkUuWiubMmQhR3FhruX60ALepwrKgpXaHAb/LylhKgF37Tt0dpXbx1QTSUyEW8Uy+eudbAwMMsZg1lWIRTeXJF2VN339gklio5ywEc7/T6+Afb9lYdZKtRGptyosd60hJW0PLwuIO4BZpwtE45uQoY9JfM87hd9kZlJ3YYqrYrBhTAV9HYLIlctHSAOlckYi1E6LHlRuqJUAsk8djr30+GbeWqhs5DIApM8EErqr++0rMNiX83mKIjGW6mK7t8WL2qSC8iJfrJmQyRFaa8XnL1pjHYeWdVyzj9Zt7OSI1EdTiFtbsBHFTnZ5LJcuiyFAwilckMbtrN5sMtNhUe3jdjefqIJqqbc3qv2M9pXsh/C4NFiaGWMwSj8NCVFuxjWnxiA5PbcsimMgipaywLM6cf0qT1qY83r4RKKdo1sLlU8Hp8fERbKlxQsJfU5A8dgvHZActemVyYoycNOOZUneg7yt+JO8vu6GcrZOr4atQ9GrFdpplYclGicr6bigAi01ZFoFihJythuXlVXEYW7Jc0W1Kh5nAjcM2fbxdXgeHpSaCmljYcxMkzXXEQgtwm0WBSFBZXfXEwt9i01p+aKLk6qxrWbgdFvJFSSZfJJEpTNsD3MBAxxCLWdLrczISy5AvFFW2C9QNcG
fzRRLZAvF0HiE4o8x96V1ERloZ9KuGh/4GE603oFa14fFhnNkgMXOg5rEmk2DM3IU3MwRSYkqOEcSL1zn5s2xz21nR6WZ33K0aBQYPgDOgiUXt83F7/YSlBxk+ClJizUeVZeGs//lbNbFoFTEKFcHtyS/eRRGBI10uKrRkwkSFp2oRW7fPwXHZTlFYSmLhyEdJV7FcSgiBFKr5YiKi3sfmrVGTgfqtlfpDAbjaiWoLlGquN72TQCydN2IWBnUxxGKW9LU6KRQlQxNpBkOqgrda0ROUs42C8QzRtDL152Mnq/nC5u3kwswXecZ9JUBDN5TZpcQhGhrDkw+RsNXv4hu29WpB6DFs6XFC+Kq2vrhoaYBtIe0zHt+HdPq1avjaE1yry8Yx2U4hdBiyCcyyQMLkaVg7YLeXeyhJR426ErOVhKUVT7bcs8qWmyBhrj75t7vtSGEmYu8ptW13FWNkrA1aaZiUWOT0anJfjWpv1HcTTuTKloW7kwlt18JqrrdSn7NM3nBDGdTFEItZom86NBhOcTSUxGwS9PgdVY/Vs6SCiaxaCZ9h/5Aeh4UJ3AyGlSg2Egu9FUVr6FnsMk3GUX9Dp4hTiysED+DIhpgwVZ88L1rWxqGstsqXBXK21tL51SLgsnFMdiAjR0p9oXLWxvsy2Ozl71JUqZbWids68OWDpf5QztwEKUt1S8RsEnR47ATNWtylWMAlE+Tt9cVCCDM2k8RVUIWO+h4j1Qi02Ihn8uT1OhVXB9F0DqfVXFWAS61rMoZlYVAfQyxmSX9ArTwHw0mOhpIs8jtrZrToXV1D8axqIrgAMqFmgj4ZP7xXrWxrudtKBJYxZuvn6sTPAMg7a09uABGPtof06C5cuRBxS/XJ+RUDfoZk2UrJWL2Tzq8abVphnjk6WEq31TdMqofdUbYszC21xSLj7KSTEKmcSk9tKUTrWgrdXgcjBCA6BOkJTEiKDcQCkxmnRbnEAHDViVloVmzSqn1OWsyilttNF4toOkciWzDEwqAm8yoWQojDQogXhBA7hRDbtfsCQohfCSFe1i5bK47/sBBivxBirxDi+or7t2qvs18I8XmxALqa9ficCKEsiyOhJAOB2tub6llSoUS2odtkIaK7L/aOxPjjK5Y1DHAjBEe7Xk0fKvAr3fXFQrp7ieGC4Rdx58Mka7itur0OQsJHQUsn1Vfw9cQ34LKxXy7CVMiUutbKWjGICpyOsmVh89SOueRd3XSKsMp2kxKPjJKrM/l3eh0M5v0QH6YYV7Ufoo4YqQPMtFigDW33vzqWTkD7bkLeNWBxQscqoql8zYC+3klgLKaSNNxGgNugBqfCsrhKSrlZSnm+dvtDwANSypXAA9pthBDrgFuB9cBrgC8JIfRf7peBdwErtb/XnILzrovNYqLb62AwnOJYKDlpU5mpTHNDnWFiobudLlnWxgevX93UcxJLy1+R2VO9IE/H57KxVw7A0SexkSNdQywsZhNtHidRi1pZ67GBRm6ol4qqpQhHHleXjsZi4XCWv0+7u07MxdNNu4gyEYtDNoGNPMUaFdagBO9Q1gPFPOmRl4AmxMJkxqFZFnHhAnNtcdT3GRlyroKPDIN/QFkWNQRVtyRGoulJtw0MpnI63FCvB76pXf8m8IaK+78tpcxIKQ8B+4ELhRA9gFdK+YRUjuG7K55zWulrdfLScJRQIlvXsmixWXBYTYQSGWLpHO4zzA3V19rCv9+2ha+8bWvTfYNcyy5iRKoVdq2+UDqtLTZeLPSrvk9AvkaFMqj00zGhHo9p7ckbWRYvy0UUMcGRx4D6biUdZ4VYtPhqn4/Zp+2eN3681EFWttS2RLp9Dg5nlFjlTqjeTRZX/QQATGYcZgiIGDFzfZdVq2bFRir6VUXTuZqWhZ7Ztl/bC8QIcBvUYr7FQgK/FELsEEK8S7uvS0o5BKBd6nmAi4CK9qAMavct0q5PvX8aQoh3CSG2CyG2j42NVTtkTulrbWH3kHIN1BMLUG3Ag/
Ez07IAuHlTL75Gge0K+ttc3F+4gJS0lbZmrYW/xcpLcqB0u9hS223V5XUwJLVsq5JY1P483XYLRbOTsKOvVGthdjUWC1dL+ft0emtP5vaACs5nwsdLqa2mOmLR6bEzLLX3H9kNgM3TQCxa2ukujhIgStJc3yrSix9DFWKhYhbVv7s2t50Ni7z89HlVWHgmpXQbnFrmWywuk1K+ArgBeK8Q4oo6x1aLQ8g690+/U8qvSinPl1Ke39FR308+F/S1OkstfhqJRcBlY89wTNt/++z/h+xw2/m8uI03ZT9Gq7t6lpiOz2ktu4oA6a4tLl1eO4dzarINo5oB1vs8hRC0uqwM2paV7rM3aLEOk8WiXjaUq02JRWHiOMmIWqBY60z+3T4Hw5rYWYPKDeXw1c8WY9mVLE8+R48IkbbVP3fdZRiuaG4YTdW2LADeuKWPRFYF6A03lEEt5lUspJQntMtR4AfAhcCI5lpCu9QrmgaB/oqn9wEntPv7qtx/2qncz7hezAJgc7+fPUNRcgVJR6NsorMAIQSB1nZ2yyUNu9r2tbawT/YhtXWBxVOjAyvK538gpybb8aKyLBq17gi47BwyLQHUHtPelvriBZPFQm9PXg13hyZysSHSMeWGsntqT/5dXgfj+JCYcERVYV5LI8ti+auxygzLTUNkbbWtFlD9sXp8DvaNKLdSsSiJZfJ1G1fevKkHvezHcEMZ1GLexEII4RJC+QmEEC7gOuBF4D7g7dphbwd+pF2/D7hVCGEXQixFBbKf1lxVMSHExVoW1B9UPOe0otdatLZYG05Yn3jDBrb9zTV8948v4fcumr4l5tlIv2ZtBRpkT63p9pASDsZtvRSlwFZnsu30Ovhe4UpGX38Po9KP1SywV9u7uoI2l409Uq03VBPBxhOi16UaHxYwga12XYbZFSCDFWtihFxMZTc5vPXdaAXMJO1tmGSBCdmCz9VAvBZfRkGoc8476osFwKY+P88NqpqSWCaPlNWrt3U6PQ4uX6nO2Wj3YVCL+bQsuoBHhRDPAU8DP5VS3g98CrhWCPEycK12GynlLuC7wG7gfuC9Ukq9B/h7gDtRQe8DwM/n8bybRrcsGrmgdDo8di5cGsBZpW/Q2Ui/9vk07CVlt7C03cWufD9BPHhdzprHdnsdpHBw2H9xqWalUSZ1wGXjuawSi2iD9uQ6bpf6TpMm96T25NMQgqAIYE+NlALc7tbaYud1WHBazUTM6piIdDfsgIvdTSiwBYCiswmx6PdzJJgknMiW+pc1eo+3X7KYbq+DLm9jq8vg3GTebE4p5UFgU5X7g8DVNZ7zSeCTVe7fDmyY63M8WfRai/4mxeJc47aLBhhoc1WtHJ7Kuh4vn3jhTfSJK3lPnYmt26cms5FouulkgYDLxsNJPwWri4lM446zABabEqy02UOjeu+IpR1XZgyZCBGTTrzu6u3YQbnnurx2xmljERDBTX8Trp+J3svpCG6r255cZ1O/CoI/NxgpFVA2GvPVa7um7QViYFCJUcF9EtgsJt564QCv3Vh7i85zmTXdXt5x+dKmjl3X6+WAXMRvipvqrvy7PGWxCCWyTU38PT4H0UyRo91Xs02ubrySBzCZKGDG4mq8kh91Lmdpdh/W5DAR6W5ouSxuc3E0p2pE4iZPU33C0suuJy9NFALLGx573iIfQsBzxyZKlkUz1pSBQT0MsThJPvnG83hNnf2cDZpjXU+5+V69ic3rVDUrg+EUzxwJl1bR9TivTx3z2ZY/53P5tzQ9cZqtdlrbGmfVRQeuwUma/tDjTIjGTQrX93rZl1T2ytSNlWqxYsOF/POmn7PuwmsbHutxWFnR4ea5wUhFx1kjcG1wchhiYbAgWFexW1+9lb9y4zi4/8VhEtkCV6xsPJnrK+3H9qsAdDPWCABmG9TqOFtBYMPVxKQTezFF3FR/10GA9b0+ThRVOm6tppSxCB8AAAj7SURBVINTcVjN/N9bLm7cakVjU7+f545F6u5lYWAwEw
yxMFgQdHoctLvtmE0CV4MEgC6vg+FoGotJcOmKBjUKlFfa4WQOIWaw8VRgKXSubXjYeYu7+E1RbQyVqrc3hca6Xi/DKLHIWpsTi5myqd9PMJHlqUOqeWJTrjcDgzoYYmGwYFjX68XrsDTMbtIzdrYubm26LmBjn7IQZrSXyB89CK/8QMPDfE4rL7guBSDTxOS/ONBS6m9VaNRxdpZcu7aLgMvGvc8cxyTAbVRmG5wkhlgYLBj+8LIlvPvKxgHcbq/K8LlydfNV+pu12EbTLihQKbP10mYrSCy+hrS0EnfU74OlXlbQ0r2K7+Wv4HDrJc2fzwzo9jn4+u0X0GIz43FYz6jNtgwWJsZyw2DBcNXqTq5aXb+PFEC3T6W1NhOv0NnUr1bw8+W7X720n+ue/2cu7drAm5s4fs2iAB88+m7e4W8uW2w2bOr381/vuIgjwcS8vYfBuYMhFgZnHLdsWUTAZWV9b+P4gM6abi82s2nesoK29Ps5Kru41uVu6vj1vbOwdGbB1sWtbF3cuHGigUEjDLEwOONoddl445a+xgdWYLOYuGJVR9PV9jNldbeH/oCTNd3NpcLq2V9GSqvBmYLQ9w4+2zj//PPl9u3bT/dpGJxDSCkbBud1CkXJ5361j9suGmCRv3Z7EwODU40QYkfFZnUljGWNgcEcMZPdfs0mwQea3HXQwGAhYGRDGRgYGBg0xBALAwMDA4OGGGJhYGBgYNAQQywMDAwMDBpiiIWBgYGBQUMMsTAwMDAwaIghFgYGBgYGDTHEwsDAwMCgIWdtBbcQYgw4crrPYwa0A+On+yTmmLNpTGfTWODsGs/ZNBad0zmmxVLKaV06z1qxONMQQmyvVmJ/JnM2jelsGgucXeM5m8aisxDHZLihDAwMDAwaYoiFgYGBgUFDDLFYOHz1dJ/APHA2jelsGgucXeM5m8ais+DGZMQsDAwMDAwaYlgWBgYGBgYNMcTCwMDAwKAhhljMEiFEvxDiISHEHiHELiHE/9HuDwghfiWEeFm7bNXub9OOjwshvlDxOh4hxM6Kv3EhxL/WeM+tQogXhBD7hRCfF9puO0KId2v37xRCPCqEWHcmj6fi8TcLIaQQYkYphAtpLEKI24UQYxWv8UczGctCHJP22O8IIXZr5/KtM3UsQojPVTx/nxAiMpOxLNAxDWiv/awQ4nkhxI2zGdM0pJTG3yz+gB7gFdp1D7APWAf8M/Ah7f4PAf+kXXcBlwPvBr5Q53V3AFfUeOxp4BJAAD8HbtDu91Yc8zrg/jN5PBXn8FvgSeD8M3UswO31XvMMHdNK4FmgVbvdeaaOZcox7wPuOgu+n68C79GurwMOn+zvT0ppWBazRUo5JKV8RrseA/YAi4DXA9/UDvsm8AbtmISU8lEgXes1hRArgU7gkSqP9aBE4QmpfgV3V7x2tOJQFzDjrIWFNB6NT6D+0Wq+/hk0lpNmgY3pncAXpZRh7b1Gz+CxVHIbcM9MxrJAxyQBr3bdB5yYzZimYojFHCCEWAJsAZ4CuqSUQ6B+QKgvu1luA76jfflTWQQMVtwe1O7Tz+G9QogDqAn2z2Zy/lM53eMRQmwB+qWUP5nxyU/hdI9F402aO+D7Qoj+GbxnVRbAmFYBq4QQjwkhnhRCvGZmIyizAMain8diYCnw4AzesyoLYEwfA94mhBgEfoaymE4aQyxOEiGEG/hf4P1TVviz4VZqr2xElftKPyIp5RellMuBvwY+MtsTON3jEUKYgM8Bf3mS733ax6Jd/hhYIqXcCPya8ipzViyQMVlQrqhXoSa0O4UQ/pm++QIZS+Xzvy+lLJzMSSyQMd0GfENK2QfcCPyX9n91UhhicRIIIayoH8b/SCnv1e4e0UxE3VRsykQXQmwCLFLKHdptc0WQ6+OolUNfxVP6qG5efptZukAWyHg8wAbgYSHEYeBi4D4x8yD3QhgLUsqglDKj3f+fwNaZjGMhjkl77EdSypyU8hCwFyUeZ+JYdOpNzE2xgM
b0DuC7AFLKJwAHqjHhSWGIxSzRMg++BuyRUv5LxUP3AW/Xrr8d+FGTLznJXyqlLEgpN2t/f6eZsDEhxMXae/+B/tqab1PnJuDlM3U8UsoJKWW7lHKJlHIJKsD9Oinl9jNtLNq59FS8zutQvuwZs5DGBPwQuEo7r3aUW+rgGToWhBCrgVbgiWbHsMDHdBS4WjuvtSixGJvl0MrIOYiSn4t/qEwGCTwP7NT+bgTagAdQE/YDQKDiOYeBEBBHrQzWVTx2EFjT4D3PB14EDgBfoFyB/2/ALu0cHgLWn8njmXLMw8w8G2rBjAX4R+27eU77buq+zhkyJgH8C7AbeAG49Uwdi/bYx4BPnUXzwTrgMe03txO47mTGpv8Z7T4MDAwMDBpiuKEMDAwMDBpiiIWBgYGBQUMMsTAwMDAwaIghFgYGBgYGDTHEwsDAwMCgIYZYGBjMAUKIglYwtUsI8ZwQ4i8aVc0KIZYIId56qs7RwOBkMMTCwGBuSElVMLUeuBaVY//RBs9ZAhhiYXBGYNRZGBjMAUKIuJTSXXF7GbAN1WZhMfBfqI7AAH8qpXxcCPEksBY4hOoZ9XngU6ieS3ZUZ9f/OGWDMDCogyEWBgZzwFSx0O4LA2uAGFCUUqa11iz3SCnPF0K8CviAlPK12vHvQu0N8fdCCDuqCvctUvVfMjA4rVhO9wkYGJzF6J1BrcAXhBCbgQKql1I1rgM2CiHerN32oRr0GWJhcNoxxMLAYB7Q3FAFVJfRjwIjwCZUnLDWhjcCeJ+U8hen5CQNDGaAEeA2MJhjhBAdwFdQ22VKlIUwJKUsAr8PmLVDY6iW7Dq/AN6jtbpGCLFKCOHCwGABYFgWBgZzg1MIsRPlcsqjAtp6q+ovAf8rhHgLqvNsQrv/eSAvhHgO+Aaqe/AS4Bmt7fQYc7w9q4HBbDEC3AYGBgYGDTHcUAYGBgYGDTHEwsDAwMCgIYZYGBgYGBg0xBALAwMDA4OGGGJhYGBgYNAQQywMDAwMDBpiiIWBgYGBQUP+PxWznUd20QoPAAAAAElFTkSuQmCC",
"text/plain": [
"<Figure size 432x288 with 1 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
}
],
"source": [
"import matplotlib.pyplot as plt\n",
"plt.figure()\n",
"plt.plot(multi_X_test[\"timeStamp\"], multi_y_test, label=\"Actual Demand\")\n",
"plt.plot(multi_X_test[\"timeStamp\"], multi_y_pred, label=\"FLAML Forecast\")\n",
"plt.xlabel(\"Date\")\n",
"plt.ylabel(\"Energy Demand\")\n",
"plt.legend()\n",
"plt.show()"
]
},
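 {
  "cell_type": "markdown",
  "metadata": {},
  "source": [
   "As an optional sanity check, we can quantify the forecast error shown in the plot above. The sketch below computes the mean absolute percentage error (MAPE); it assumes `multi_y_pred` and `multi_y_test` are aligned one-to-one as produced earlier in this notebook."
  ]
 },
 {
  "cell_type": "code",
  "execution_count": null,
  "metadata": {},
  "outputs": [],
  "source": [
   "import numpy as np\n",
   "# convert to plain arrays to avoid pandas index-alignment surprises\n",
   "y_true = np.asarray(multi_y_test, dtype=float)\n",
   "y_pred = np.asarray(multi_y_pred, dtype=float)\n",
   "mape = np.mean(np.abs((y_pred - y_true) / y_true))\n",
   "print(f\"MAPE: {mape:.4f}\")"
  ]
 },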
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## 4. Forecasting Discrete Values"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Load Dataset and Preprocess\n",
"\n",
    "Import [sales data](https://hcrystalball.readthedocs.io/en/v0.1.7/api/hcrystalball.utils.get_sales_data.html) from hcrystalball. The task is to predict, for each day in a thirty-day horizon, whether daily sales will be above the historical mean."
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {},
"outputs": [],
"source": [
"from hcrystalball.utils import get_sales_data\n",
"time_horizon = 30\n",
"df = get_sales_data(n_dates=180, n_assortments=1, n_states=1, n_stores=1)\n",
"df = df[[\"Sales\", \"Open\", \"Promo\", \"Promo2\"]]\n",
"# feature engineering - create a discrete value column\n",
    "# 1 denotes above the mean and 0 denotes at or below the mean\n",
"import numpy as np\n",
"df[\"above_mean_sales\"] = np.where(df[\"Sales\"] > df[\"Sales\"].mean(), 1, 0)\n",
"df.reset_index(inplace=True)\n",
"# train-test split\n",
"discrete_train_df = df[:-time_horizon]\n",
"discrete_test_df = df[-time_horizon:]\n",
"discrete_X_train, discrete_X_test = (\n",
" discrete_train_df[[\"Date\", \"Open\", \"Promo\", \"Promo2\"]],\n",
" discrete_test_df[[\"Date\", \"Open\", \"Promo\", \"Promo2\"]],\n",
")\n",
"discrete_y_train, discrete_y_test = discrete_train_df[\"above_mean_sales\"], discrete_test_df[\"above_mean_sales\"]"
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {},
"outputs": [
{
"data": {
"text/html": [
"<div>\n",
"<style scoped>\n",
" .dataframe tbody tr th:only-of-type {\n",
" vertical-align: middle;\n",
" }\n",
"\n",
" .dataframe tbody tr th {\n",
" vertical-align: top;\n",
" }\n",
"\n",
" .dataframe thead th {\n",
" text-align: right;\n",
" }\n",
"</style>\n",
"<table border=\"1\" class=\"dataframe\">\n",
" <thead>\n",
" <tr style=\"text-align: right;\">\n",
" <th></th>\n",
" <th>Date</th>\n",
" <th>Sales</th>\n",
" <th>Open</th>\n",
" <th>Promo</th>\n",
" <th>Promo2</th>\n",
" <th>above_mean_sales</th>\n",
" </tr>\n",
" </thead>\n",
" <tbody>\n",
" <tr>\n",
" <th>0</th>\n",
" <td>2015-02-02</td>\n",
" <td>24894</td>\n",
" <td>True</td>\n",
" <td>True</td>\n",
" <td>False</td>\n",
" <td>1</td>\n",
" </tr>\n",
" <tr>\n",
" <th>1</th>\n",
" <td>2015-02-03</td>\n",
" <td>22139</td>\n",
" <td>True</td>\n",
" <td>True</td>\n",
" <td>False</td>\n",
" <td>1</td>\n",
" </tr>\n",
" <tr>\n",
" <th>2</th>\n",
" <td>2015-02-04</td>\n",
" <td>20452</td>\n",
" <td>True</td>\n",
" <td>True</td>\n",
" <td>False</td>\n",
" <td>1</td>\n",
" </tr>\n",
" <tr>\n",
" <th>3</th>\n",
" <td>2015-02-05</td>\n",
" <td>20977</td>\n",
" <td>True</td>\n",
" <td>True</td>\n",
" <td>False</td>\n",
" <td>1</td>\n",
" </tr>\n",
" <tr>\n",
" <th>4</th>\n",
" <td>2015-02-06</td>\n",
" <td>19151</td>\n",
" <td>True</td>\n",
" <td>True</td>\n",
" <td>False</td>\n",
" <td>1</td>\n",
" </tr>\n",
" <tr>\n",
" <th>...</th>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" </tr>\n",
" <tr>\n",
" <th>145</th>\n",
" <td>2015-06-27</td>\n",
" <td>13108</td>\n",
" <td>True</td>\n",
" <td>False</td>\n",
" <td>False</td>\n",
" <td>0</td>\n",
" </tr>\n",
" <tr>\n",
" <th>146</th>\n",
" <td>2015-06-28</td>\n",
" <td>0</td>\n",
" <td>False</td>\n",
" <td>False</td>\n",
" <td>False</td>\n",
" <td>0</td>\n",
" </tr>\n",
" <tr>\n",
" <th>147</th>\n",
" <td>2015-06-29</td>\n",
" <td>28456</td>\n",
" <td>True</td>\n",
" <td>True</td>\n",
" <td>False</td>\n",
" <td>1</td>\n",
" </tr>\n",
" <tr>\n",
" <th>148</th>\n",
" <td>2015-06-30</td>\n",
" <td>27140</td>\n",
" <td>True</td>\n",
" <td>True</td>\n",
" <td>False</td>\n",
" <td>1</td>\n",
" </tr>\n",
" <tr>\n",
" <th>149</th>\n",
" <td>2015-07-01</td>\n",
" <td>24957</td>\n",
" <td>True</td>\n",
" <td>True</td>\n",
" <td>False</td>\n",
" <td>1</td>\n",
" </tr>\n",
" </tbody>\n",
"</table>\n",
"<p>150 rows × 6 columns</p>\n",
"</div>"
],
"text/plain": [
" Date Sales Open Promo Promo2 above_mean_sales\n",
"0 2015-02-02 24894 True True False 1\n",
"1 2015-02-03 22139 True True False 1\n",
"2 2015-02-04 20452 True True False 1\n",
"3 2015-02-05 20977 True True False 1\n",
"4 2015-02-06 19151 True True False 1\n",
".. ... ... ... ... ... ...\n",
"145 2015-06-27 13108 True False False 0\n",
"146 2015-06-28 0 False False False 0\n",
"147 2015-06-29 28456 True True False 1\n",
"148 2015-06-30 27140 True True False 1\n",
"149 2015-07-01 24957 True True False 1\n",
"\n",
"[150 rows x 6 columns]"
]
},
"execution_count": 3,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"discrete_train_df"
]
},
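 {
  "cell_type": "markdown",
  "metadata": {},
  "source": [
   "Before training, it can be useful to check how balanced the two classes are in the training set, since a heavily skewed target would make plain accuracy misleading. This quick inspection is an optional step, not part of the original workflow."
  ]
 },
 {
  "cell_type": "code",
  "execution_count": null,
  "metadata": {},
  "outputs": [],
  "source": [
   "# fraction of training days in each class (1 = above mean sales, 0 = otherwise)\n",
   "print(discrete_y_train.value_counts(normalize=True))"
  ]
 },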
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Run FLAML"
]
},
{
"cell_type": "code",
"execution_count": 4,
"metadata": {},
"outputs": [],
"source": [
"from flaml import AutoML\n",
"automl = AutoML()"
]
},
{
"cell_type": "code",
"execution_count": 6,
"metadata": {},
"outputs": [],
"source": [
"settings = {\n",
" \"time_budget\": 15, # total running time in seconds\n",
" \"metric\": \"accuracy\", # primary metric\n",
" \"task\": \"ts_forecast_classification\", # task type\n",
" \"log_file_name\": \"sales_classification_forecast.log\", # flaml log file\n",
" \"eval_method\": \"holdout\",\n",
"}"
]
},
{
"cell_type": "code",
"execution_count": 7,
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"[flaml.automl: 08-03 20:33:26] {2520} INFO - task = ts_forecast_classification\n",
"[flaml.automl: 08-03 20:33:26] {2522} INFO - Data split method: time\n",
"[flaml.automl: 08-03 20:33:26] {2525} INFO - Evaluation method: holdout\n",
"[flaml.automl: 08-03 20:33:26] {2644} INFO - Minimizing error metric: 1-accuracy\n",
"[flaml.automl: 08-03 20:33:27] {2786} INFO - List of ML learners in AutoML Run: ['lgbm', 'rf', 'xgboost', 'extra_tree', 'xgb_limitdepth']\n",
"[flaml.automl: 08-03 20:33:27] {3088} INFO - iteration 0, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:29] {3221} INFO - Estimated sufficient time budget=11912s. Estimated necessary time budget=12s.\n",
"[flaml.automl: 08-03 20:33:29] {3268} INFO - at 2.2s,\testimator lgbm's best error=0.2667,\tbest estimator lgbm's best error=0.2667\n",
"[flaml.automl: 08-03 20:33:29] {3088} INFO - iteration 1, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:29] {3268} INFO - at 2.2s,\testimator lgbm's best error=0.2667,\tbest estimator lgbm's best error=0.2667\n",
"[flaml.automl: 08-03 20:33:29] {3088} INFO - iteration 2, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:29] {3268} INFO - at 2.2s,\testimator lgbm's best error=0.1333,\tbest estimator lgbm's best error=0.1333\n",
"[flaml.automl: 08-03 20:33:29] {3088} INFO - iteration 3, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:29] {3268} INFO - at 2.3s,\testimator lgbm's best error=0.1333,\tbest estimator lgbm's best error=0.1333\n",
"[flaml.automl: 08-03 20:33:29] {3088} INFO - iteration 4, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:29] {3268} INFO - at 2.3s,\testimator lgbm's best error=0.0667,\tbest estimator lgbm's best error=0.0667\n",
"[flaml.automl: 08-03 20:33:29] {3088} INFO - iteration 5, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:29] {3268} INFO - at 2.4s,\testimator lgbm's best error=0.0667,\tbest estimator lgbm's best error=0.0667\n",
"[flaml.automl: 08-03 20:33:29] {3088} INFO - iteration 6, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:29] {3268} INFO - at 2.5s,\testimator lgbm's best error=0.0667,\tbest estimator lgbm's best error=0.0667\n",
"[flaml.automl: 08-03 20:33:29] {3088} INFO - iteration 7, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:29] {3268} INFO - at 2.5s,\testimator lgbm's best error=0.0667,\tbest estimator lgbm's best error=0.0667\n",
"[flaml.automl: 08-03 20:33:29] {3088} INFO - iteration 8, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:29] {3268} INFO - at 2.5s,\testimator lgbm's best error=0.0667,\tbest estimator lgbm's best error=0.0667\n",
"[flaml.automl: 08-03 20:33:29] {3088} INFO - iteration 9, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:29] {3268} INFO - at 2.6s,\testimator lgbm's best error=0.0667,\tbest estimator lgbm's best error=0.0667\n",
"[flaml.automl: 08-03 20:33:29] {3088} INFO - iteration 10, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:29] {3268} INFO - at 2.6s,\testimator lgbm's best error=0.0667,\tbest estimator lgbm's best error=0.0667\n",
"[flaml.automl: 08-03 20:33:29] {3088} INFO - iteration 11, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:29] {3268} INFO - at 2.7s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:29] {3088} INFO - iteration 12, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:29] {3268} INFO - at 2.7s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:29] {3088} INFO - iteration 13, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:29] {3268} INFO - at 2.8s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:29] {3088} INFO - iteration 14, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:29] {3268} INFO - at 2.8s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:29] {3088} INFO - iteration 15, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:29] {3268} INFO - at 2.8s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:29] {3088} INFO - iteration 16, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:29] {3268} INFO - at 2.9s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:29] {3088} INFO - iteration 17, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:29] {3268} INFO - at 2.9s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:29] {3088} INFO - iteration 18, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:29] {3268} INFO - at 3.0s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:29] {3088} INFO - iteration 19, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:29] {3268} INFO - at 3.0s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:29] {3088} INFO - iteration 20, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:29] {3268} INFO - at 3.0s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:29] {3088} INFO - iteration 21, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:30] {3268} INFO - at 3.1s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:30] {3088} INFO - iteration 22, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:30] {3268} INFO - at 3.1s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:30] {3088} INFO - iteration 23, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:30] {3268} INFO - at 3.2s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:30] {3088} INFO - iteration 24, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:30] {3268} INFO - at 3.2s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:30] {3088} INFO - iteration 25, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:30] {3268} INFO - at 3.3s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:30] {3088} INFO - iteration 26, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:30] {3268} INFO - at 3.3s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:30] {3088} INFO - iteration 27, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:30] {3268} INFO - at 3.4s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:30] {3088} INFO - iteration 28, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:30] {3268} INFO - at 3.4s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:30] {3088} INFO - iteration 29, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:30] {3268} INFO - at 3.4s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:30] {3088} INFO - iteration 30, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:30] {3268} INFO - at 3.5s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:30] {3088} INFO - iteration 31, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:30] {3268} INFO - at 3.5s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:30] {3088} INFO - iteration 32, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:30] {3268} INFO - at 3.5s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:30] {3088} INFO - iteration 33, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:30] {3268} INFO - at 3.6s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:30] {3088} INFO - iteration 34, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:30] {3268} INFO - at 3.6s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:30] {3088} INFO - iteration 35, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:30] {3268} INFO - at 3.7s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:30] {3088} INFO - iteration 36, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:30] {3268} INFO - at 3.7s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:30] {3088} INFO - iteration 37, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:30] {3268} INFO - at 3.7s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:30] {3088} INFO - iteration 38, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:30] {3268} INFO - at 3.8s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:30] {3088} INFO - iteration 39, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:30] {3268} INFO - at 3.8s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:30] {3088} INFO - iteration 40, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:30] {3268} INFO - at 3.9s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:30] {3088} INFO - iteration 41, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:30] {3268} INFO - at 3.9s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:30] {3088} INFO - iteration 42, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:30] {3268} INFO - at 3.9s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:30] {3088} INFO - iteration 43, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:30] {3268} INFO - at 4.0s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:30] {3088} INFO - iteration 44, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:30] {3268} INFO - at 4.0s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:30] {3088} INFO - iteration 45, current learner rf\n",
"[flaml.automl: 08-03 20:33:31] {3268} INFO - at 4.1s,\testimator rf's best error=0.1333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:31] {3088} INFO - iteration 46, current learner rf\n",
"[flaml.automl: 08-03 20:33:31] {3268} INFO - at 4.1s,\testimator rf's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:31] {3088} INFO - iteration 47, current learner rf\n",
"[flaml.automl: 08-03 20:33:31] {3268} INFO - at 4.2s,\testimator rf's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:31] {3088} INFO - iteration 48, current learner rf\n",
"[flaml.automl: 08-03 20:33:31] {3268} INFO - at 4.2s,\testimator rf's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:31] {3088} INFO - iteration 49, current learner rf\n",
"[flaml.automl: 08-03 20:33:31] {3268} INFO - at 4.2s,\testimator rf's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:31] {3088} INFO - iteration 50, current learner rf\n",
"[flaml.automl: 08-03 20:33:31] {3268} INFO - at 4.3s,\testimator rf's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:31] {3088} INFO - iteration 51, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:31] {3268} INFO - at 4.3s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:31] {3088} INFO - iteration 52, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:31] {3268} INFO - at 4.3s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:31] {3088} INFO - iteration 53, current learner rf\n",
"[flaml.automl: 08-03 20:33:31] {3268} INFO - at 4.4s,\testimator rf's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:31] {3088} INFO - iteration 54, current learner rf\n",
"[flaml.automl: 08-03 20:33:31] {3268} INFO - at 4.5s,\testimator rf's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:31] {3088} INFO - iteration 55, current learner rf\n",
"[flaml.automl: 08-03 20:33:31] {3268} INFO - at 4.5s,\testimator rf's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:31] {3088} INFO - iteration 56, current learner rf\n",
"[flaml.automl: 08-03 20:33:31] {3268} INFO - at 4.6s,\testimator rf's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:31] {3088} INFO - iteration 57, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:31] {3268} INFO - at 4.6s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:31] {3088} INFO - iteration 58, current learner rf\n",
"[flaml.automl: 08-03 20:33:31] {3268} INFO - at 4.6s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:31] {3088} INFO - iteration 59, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:31] {3268} INFO - at 4.7s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:31] {3088} INFO - iteration 60, current learner rf\n",
"[flaml.automl: 08-03 20:33:31] {3268} INFO - at 4.7s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:31] {3088} INFO - iteration 61, current learner rf\n",
"[flaml.automl: 08-03 20:33:31] {3268} INFO - at 4.8s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:31] {3088} INFO - iteration 62, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:31] {3268} INFO - at 4.9s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:31] {3088} INFO - iteration 63, current learner rf\n",
"[flaml.automl: 08-03 20:33:31] {3268} INFO - at 4.9s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:31] {3088} INFO - iteration 64, current learner rf\n",
"[flaml.automl: 08-03 20:33:31] {3268} INFO - at 5.0s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:31] {3088} INFO - iteration 65, current learner rf\n",
"[flaml.automl: 08-03 20:33:31] {3268} INFO - at 5.0s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:31] {3088} INFO - iteration 66, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:31] {3268} INFO - at 5.0s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:31] {3088} INFO - iteration 67, current learner rf\n",
"[flaml.automl: 08-03 20:33:32] {3268} INFO - at 5.1s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:32] {3088} INFO - iteration 68, current learner rf\n",
"[flaml.automl: 08-03 20:33:32] {3268} INFO - at 5.2s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:32] {3088} INFO - iteration 69, current learner rf\n",
"[flaml.automl: 08-03 20:33:32] {3268} INFO - at 5.2s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:32] {3088} INFO - iteration 70, current learner rf\n",
"[flaml.automl: 08-03 20:33:32] {3268} INFO - at 5.2s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:32] {3088} INFO - iteration 71, current learner rf\n",
"[flaml.automl: 08-03 20:33:32] {3268} INFO - at 5.3s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:32] {3088} INFO - iteration 72, current learner rf\n",
"[flaml.automl: 08-03 20:33:32] {3268} INFO - at 5.3s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:32] {3088} INFO - iteration 73, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:32] {3268} INFO - at 5.4s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:32] {3088} INFO - iteration 74, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:32] {3268} INFO - at 5.4s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:32] {3088} INFO - iteration 75, current learner rf\n",
"[flaml.automl: 08-03 20:33:32] {3268} INFO - at 5.5s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:32] {3088} INFO - iteration 76, current learner rf\n",
"[flaml.automl: 08-03 20:33:32] {3268} INFO - at 5.5s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:32] {3088} INFO - iteration 77, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:32] {3268} INFO - at 5.5s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:32] {3088} INFO - iteration 78, current learner rf\n",
"[flaml.automl: 08-03 20:33:32] {3268} INFO - at 5.6s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:32] {3088} INFO - iteration 79, current learner rf\n",
"[flaml.automl: 08-03 20:33:32] {3268} INFO - at 5.7s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:32] {3088} INFO - iteration 80, current learner rf\n",
"[flaml.automl: 08-03 20:33:32] {3268} INFO - at 5.7s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:32] {3088} INFO - iteration 81, current learner rf\n",
"[flaml.automl: 08-03 20:33:32] {3268} INFO - at 5.8s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:32] {3088} INFO - iteration 82, current learner rf\n",
"[flaml.automl: 08-03 20:33:32] {3268} INFO - at 5.8s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:32] {3088} INFO - iteration 83, current learner rf\n",
"[flaml.automl: 08-03 20:33:32] {3268} INFO - at 5.9s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:32] {3088} INFO - iteration 84, current learner rf\n",
"[flaml.automl: 08-03 20:33:32] {3268} INFO - at 5.9s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:32] {3088} INFO - iteration 85, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:32] {3268} INFO - at 6.0s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:32] {3088} INFO - iteration 86, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:32] {3268} INFO - at 6.0s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:32] {3088} INFO - iteration 87, current learner rf\n",
"[flaml.automl: 08-03 20:33:33] {3268} INFO - at 6.1s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:33] {3088} INFO - iteration 88, current learner rf\n",
"[flaml.automl: 08-03 20:33:33] {3268} INFO - at 6.1s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:33] {3088} INFO - iteration 89, current learner rf\n",
"[flaml.automl: 08-03 20:33:33] {3268} INFO - at 6.2s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:33] {3088} INFO - iteration 90, current learner rf\n",
"[flaml.automl: 08-03 20:33:33] {3268} INFO - at 6.2s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:33] {3088} INFO - iteration 91, current learner xgboost\n",
"[flaml.automl: 08-03 20:33:34] {3268} INFO - at 7.8s,\testimator xgboost's best error=0.1333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:34] {3088} INFO - iteration 92, current learner extra_tree\n",
"[flaml.automl: 08-03 20:33:34] {3268} INFO - at 7.9s,\testimator extra_tree's best error=0.1333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:34] {3088} INFO - iteration 93, current learner extra_tree\n",
"[flaml.automl: 08-03 20:33:34] {3268} INFO - at 7.9s,\testimator extra_tree's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:34] {3088} INFO - iteration 94, current learner extra_tree\n",
"[flaml.automl: 08-03 20:33:34] {3268} INFO - at 8.0s,\testimator extra_tree's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:34] {3088} INFO - iteration 95, current learner extra_tree\n",
"[flaml.automl: 08-03 20:33:34] {3268} INFO - at 8.0s,\testimator extra_tree's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:35] {3088} INFO - iteration 96, current learner extra_tree\n",
"[flaml.automl: 08-03 20:33:35] {3268} INFO - at 8.1s,\testimator extra_tree's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:35] {3088} INFO - iteration 97, current learner extra_tree\n",
"[flaml.automl: 08-03 20:33:35] {3268} INFO - at 8.1s,\testimator extra_tree's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:35] {3088} INFO - iteration 98, current learner extra_tree\n",
"[flaml.automl: 08-03 20:33:35] {3268} INFO - at 8.2s,\testimator extra_tree's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:35] {3088} INFO - iteration 99, current learner rf\n",
"[flaml.automl: 08-03 20:33:35] {3268} INFO - at 8.2s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:35] {3088} INFO - iteration 100, current learner extra_tree\n",
"[flaml.automl: 08-03 20:33:35] {3268} INFO - at 8.3s,\testimator extra_tree's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:35] {3088} INFO - iteration 101, current learner extra_tree\n",
"[flaml.automl: 08-03 20:33:35] {3268} INFO - at 8.3s,\testimator extra_tree's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:35] {3088} INFO - iteration 102, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:35] {3268} INFO - at 8.4s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:35] {3088} INFO - iteration 103, current learner extra_tree\n",
"[flaml.automl: 08-03 20:33:35] {3268} INFO - at 8.4s,\testimator extra_tree's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:35] {3088} INFO - iteration 104, current learner rf\n",
"[flaml.automl: 08-03 20:33:35] {3268} INFO - at 8.5s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:35] {3088} INFO - iteration 105, current learner extra_tree\n",
"[flaml.automl: 08-03 20:33:35] {3268} INFO - at 8.6s,\testimator extra_tree's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:35] {3088} INFO - iteration 106, current learner extra_tree\n",
"[flaml.automl: 08-03 20:33:35] {3268} INFO - at 8.6s,\testimator extra_tree's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:35] {3088} INFO - iteration 107, current learner rf\n",
"[flaml.automl: 08-03 20:33:35] {3268} INFO - at 8.7s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:35] {3088} INFO - iteration 108, current learner extra_tree\n",
"[flaml.automl: 08-03 20:33:35] {3268} INFO - at 8.7s,\testimator extra_tree's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:35] {3088} INFO - iteration 109, current learner extra_tree\n",
"[flaml.automl: 08-03 20:33:35] {3268} INFO - at 8.8s,\testimator extra_tree's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:35] {3088} INFO - iteration 110, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:35] {3268} INFO - at 8.8s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:35] {3088} INFO - iteration 111, current learner extra_tree\n",
"[flaml.automl: 08-03 20:33:35] {3268} INFO - at 8.9s,\testimator extra_tree's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:35] {3088} INFO - iteration 112, current learner extra_tree\n",
"[flaml.automl: 08-03 20:33:35] {3268} INFO - at 9.0s,\testimator extra_tree's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:35] {3088} INFO - iteration 113, current learner extra_tree\n",
"[flaml.automl: 08-03 20:33:35] {3268} INFO - at 9.0s,\testimator extra_tree's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:35] {3088} INFO - iteration 114, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:36] {3268} INFO - at 9.1s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:36] {3088} INFO - iteration 115, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:36] {3268} INFO - at 9.1s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:36] {3088} INFO - iteration 116, current learner extra_tree\n",
"[flaml.automl: 08-03 20:33:36] {3268} INFO - at 9.2s,\testimator extra_tree's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:36] {3088} INFO - iteration 117, current learner extra_tree\n",
"[flaml.automl: 08-03 20:33:36] {3268} INFO - at 9.2s,\testimator extra_tree's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:36] {3088} INFO - iteration 118, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:36] {3268} INFO - at 9.3s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:36] {3088} INFO - iteration 119, current learner extra_tree\n",
"[flaml.automl: 08-03 20:33:36] {3268} INFO - at 9.3s,\testimator extra_tree's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:36] {3088} INFO - iteration 120, current learner extra_tree\n",
"[flaml.automl: 08-03 20:33:36] {3268} INFO - at 9.4s,\testimator extra_tree's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:36] {3088} INFO - iteration 121, current learner extra_tree\n",
"[flaml.automl: 08-03 20:33:36] {3268} INFO - at 9.4s,\testimator extra_tree's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:36] {3088} INFO - iteration 122, current learner extra_tree\n",
"[flaml.automl: 08-03 20:33:36] {3268} INFO - at 9.5s,\testimator extra_tree's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:36] {3088} INFO - iteration 123, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:36] {3268} INFO - at 9.5s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:36] {3088} INFO - iteration 124, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:36] {3268} INFO - at 9.6s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:36] {3088} INFO - iteration 125, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:36] {3268} INFO - at 9.6s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:36] {3088} INFO - iteration 126, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:36] {3268} INFO - at 9.7s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:36] {3088} INFO - iteration 127, current learner rf\n",
"[flaml.automl: 08-03 20:33:36] {3268} INFO - at 9.8s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:36] {3088} INFO - iteration 128, current learner extra_tree\n",
"[flaml.automl: 08-03 20:33:36] {3268} INFO - at 9.8s,\testimator extra_tree's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:36] {3088} INFO - iteration 129, current learner xgboost\n",
"[flaml.automl: 08-03 20:33:36] {3268} INFO - at 9.9s,\testimator xgboost's best error=0.1333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:36] {3088} INFO - iteration 130, current learner xgboost\n",
"[flaml.automl: 08-03 20:33:36] {3268} INFO - at 9.9s,\testimator xgboost's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:36] {3088} INFO - iteration 131, current learner xgboost\n",
"[flaml.automl: 08-03 20:33:36] {3268} INFO - at 10.0s,\testimator xgboost's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:36] {3088} INFO - iteration 132, current learner xgboost\n",
"[flaml.automl: 08-03 20:33:36] {3268} INFO - at 10.0s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:36] {3088} INFO - iteration 133, current learner xgboost\n",
"[flaml.automl: 08-03 20:33:37] {3268} INFO - at 10.1s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:37] {3088} INFO - iteration 134, current learner xgboost\n",
"[flaml.automl: 08-03 20:33:37] {3268} INFO - at 10.1s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:37] {3088} INFO - iteration 135, current learner extra_tree\n",
"[flaml.automl: 08-03 20:33:37] {3268} INFO - at 10.2s,\testimator extra_tree's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:37] {3088} INFO - iteration 136, current learner xgboost\n",
"[flaml.automl: 08-03 20:33:37] {3268} INFO - at 10.2s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:37] {3088} INFO - iteration 137, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:37] {3268} INFO - at 10.3s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:37] {3088} INFO - iteration 138, current learner xgboost\n",
"[flaml.automl: 08-03 20:33:37] {3268} INFO - at 10.3s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:37] {3088} INFO - iteration 139, current learner xgboost\n",
"[flaml.automl: 08-03 20:33:37] {3268} INFO - at 10.4s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:37] {3088} INFO - iteration 140, current learner xgboost\n",
"[flaml.automl: 08-03 20:33:37] {3268} INFO - at 10.4s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:37] {3088} INFO - iteration 141, current learner xgboost\n",
"[flaml.automl: 08-03 20:33:37] {3268} INFO - at 10.4s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:37] {3088} INFO - iteration 142, current learner extra_tree\n",
"[flaml.automl: 08-03 20:33:37] {3268} INFO - at 10.5s,\testimator extra_tree's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:37] {3088} INFO - iteration 143, current learner xgboost\n",
"[flaml.automl: 08-03 20:33:37] {3268} INFO - at 10.6s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:37] {3088} INFO - iteration 144, current learner xgboost\n",
"[flaml.automl: 08-03 20:33:37] {3268} INFO - at 10.6s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:37] {3088} INFO - iteration 145, current learner rf\n",
"[flaml.automl: 08-03 20:33:37] {3268} INFO - at 10.7s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:37] {3088} INFO - iteration 146, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:37] {3268} INFO - at 10.8s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:37] {3088} INFO - iteration 147, current learner rf\n",
"[flaml.automl: 08-03 20:33:37] {3268} INFO - at 10.9s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:37] {3088} INFO - iteration 148, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:37] {3268} INFO - at 10.9s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:37] {3088} INFO - iteration 149, current learner rf\n",
"[flaml.automl: 08-03 20:33:37] {3268} INFO - at 11.0s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:37] {3088} INFO - iteration 150, current learner xgboost\n",
"[flaml.automl: 08-03 20:33:37] {3268} INFO - at 11.0s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:37] {3088} INFO - iteration 151, current learner xgboost\n",
"[flaml.automl: 08-03 20:33:38] {3268} INFO - at 11.1s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:38] {3088} INFO - iteration 152, current learner xgboost\n",
"[flaml.automl: 08-03 20:33:38] {3268} INFO - at 11.1s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:38] {3088} INFO - iteration 153, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:38] {3268} INFO - at 11.2s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:38] {3088} INFO - iteration 154, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:38] {3268} INFO - at 11.2s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:38] {3088} INFO - iteration 155, current learner extra_tree\n",
"[flaml.automl: 08-03 20:33:38] {3268} INFO - at 11.3s,\testimator extra_tree's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:38] {3088} INFO - iteration 156, current learner xgboost\n",
"[flaml.automl: 08-03 20:33:38] {3268} INFO - at 11.4s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:38] {3088} INFO - iteration 157, current learner rf\n",
"[flaml.automl: 08-03 20:33:38] {3268} INFO - at 11.4s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:38] {3088} INFO - iteration 158, current learner rf\n",
"[flaml.automl: 08-03 20:33:38] {3268} INFO - at 11.5s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:38] {3088} INFO - iteration 159, current learner xgboost\n",
"[flaml.automl: 08-03 20:33:38] {3268} INFO - at 11.5s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:38] {3088} INFO - iteration 160, current learner rf\n",
"[flaml.automl: 08-03 20:33:38] {3268} INFO - at 11.6s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:38] {3088} INFO - iteration 161, current learner xgboost\n",
"[flaml.automl: 08-03 20:33:38] {3268} INFO - at 11.6s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:38] {3088} INFO - iteration 162, current learner extra_tree\n",
"[flaml.automl: 08-03 20:33:38] {3268} INFO - at 11.7s,\testimator extra_tree's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:38] {3088} INFO - iteration 163, current learner extra_tree\n",
"[flaml.automl: 08-03 20:33:38] {3268} INFO - at 11.7s,\testimator extra_tree's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:38] {3088} INFO - iteration 164, current learner rf\n",
"[flaml.automl: 08-03 20:33:38] {3268} INFO - at 11.8s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:38] {3088} INFO - iteration 165, current learner xgboost\n",
"[flaml.automl: 08-03 20:33:38] {3268} INFO - at 11.8s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:38] {3088} INFO - iteration 166, current learner xgboost\n",
"[flaml.automl: 08-03 20:33:38] {3268} INFO - at 11.9s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:38] {3088} INFO - iteration 167, current learner extra_tree\n",
"[flaml.automl: 08-03 20:33:38] {3268} INFO - at 12.0s,\testimator extra_tree's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:38] {3088} INFO - iteration 168, current learner rf\n",
"[flaml.automl: 08-03 20:33:39] {3268} INFO - at 12.1s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:39] {3088} INFO - iteration 169, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:39] {3268} INFO - at 12.1s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:39] {3088} INFO - iteration 170, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:39] {3268} INFO - at 12.2s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:39] {3088} INFO - iteration 171, current learner xgboost\n",
"[flaml.automl: 08-03 20:33:39] {3268} INFO - at 12.2s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:39] {3088} INFO - iteration 172, current learner xgboost\n",
"[flaml.automl: 08-03 20:33:39] {3268} INFO - at 12.2s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:39] {3088} INFO - iteration 173, current learner xgboost\n",
"[flaml.automl: 08-03 20:33:39] {3268} INFO - at 12.3s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:39] {3088} INFO - iteration 174, current learner xgboost\n",
"[flaml.automl: 08-03 20:33:39] {3268} INFO - at 12.3s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:39] {3088} INFO - iteration 175, current learner extra_tree\n",
"[flaml.automl: 08-03 20:33:39] {3268} INFO - at 12.4s,\testimator extra_tree's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:39] {3088} INFO - iteration 176, current learner xgboost\n",
"[flaml.automl: 08-03 20:33:39] {3268} INFO - at 12.4s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:39] {3088} INFO - iteration 177, current learner xgboost\n",
"[flaml.automl: 08-03 20:33:39] {3268} INFO - at 12.5s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:39] {3088} INFO - iteration 178, current learner xgboost\n",
"[flaml.automl: 08-03 20:33:39] {3268} INFO - at 12.5s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:39] {3088} INFO - iteration 179, current learner extra_tree\n",
"[flaml.automl: 08-03 20:33:39] {3268} INFO - at 12.6s,\testimator extra_tree's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:39] {3088} INFO - iteration 180, current learner xgboost\n",
"[flaml.automl: 08-03 20:33:39] {3268} INFO - at 12.6s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:39] {3088} INFO - iteration 181, current learner rf\n",
"[flaml.automl: 08-03 20:33:39] {3268} INFO - at 12.7s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:39] {3088} INFO - iteration 182, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:39] {3268} INFO - at 12.7s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:39] {3088} INFO - iteration 183, current learner xgboost\n",
"[flaml.automl: 08-03 20:33:39] {3268} INFO - at 12.7s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:39] {3088} INFO - iteration 184, current learner xgboost\n",
"[flaml.automl: 08-03 20:33:39] {3268} INFO - at 12.8s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:39] {3088} INFO - iteration 185, current learner rf\n",
"[flaml.automl: 08-03 20:33:39] {3268} INFO - at 12.9s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:39] {3088} INFO - iteration 186, current learner extra_tree\n",
"[flaml.automl: 08-03 20:33:39] {3268} INFO - at 12.9s,\testimator extra_tree's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:39] {3088} INFO - iteration 187, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:39] {3268} INFO - at 13.0s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:39] {3088} INFO - iteration 188, current learner rf\n",
"[flaml.automl: 08-03 20:33:40] {3268} INFO - at 13.1s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:40] {3088} INFO - iteration 189, current learner extra_tree\n",
"[flaml.automl: 08-03 20:33:40] {3268} INFO - at 13.1s,\testimator extra_tree's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:40] {3088} INFO - iteration 190, current learner rf\n",
"[flaml.automl: 08-03 20:33:40] {3268} INFO - at 13.2s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:40] {3088} INFO - iteration 191, current learner xgboost\n",
"[flaml.automl: 08-03 20:33:40] {3268} INFO - at 13.2s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:40] {3088} INFO - iteration 192, current learner xgboost\n",
"[flaml.automl: 08-03 20:33:40] {3268} INFO - at 13.3s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:40] {3088} INFO - iteration 193, current learner extra_tree\n",
"[flaml.automl: 08-03 20:33:40] {3268} INFO - at 13.3s,\testimator extra_tree's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:40] {3088} INFO - iteration 194, current learner rf\n",
"[flaml.automl: 08-03 20:33:40] {3268} INFO - at 13.4s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:40] {3088} INFO - iteration 195, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:40] {3268} INFO - at 13.4s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:40] {3088} INFO - iteration 196, current learner extra_tree\n",
"[flaml.automl: 08-03 20:33:40] {3268} INFO - at 13.5s,\testimator extra_tree's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:40] {3088} INFO - iteration 197, current learner extra_tree\n",
"[flaml.automl: 08-03 20:33:40] {3268} INFO - at 13.5s,\testimator extra_tree's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:40] {3088} INFO - iteration 198, current learner extra_tree\n",
"[flaml.automl: 08-03 20:33:40] {3268} INFO - at 13.6s,\testimator extra_tree's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:40] {3088} INFO - iteration 199, current learner extra_tree\n",
"[flaml.automl: 08-03 20:33:40] {3268} INFO - at 13.6s,\testimator extra_tree's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:40] {3088} INFO - iteration 200, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:40] {3268} INFO - at 13.7s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:40] {3088} INFO - iteration 201, current learner rf\n",
"[flaml.automl: 08-03 20:33:40] {3268} INFO - at 13.8s,\testimator rf's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:40] {3088} INFO - iteration 202, current learner xgboost\n",
"[flaml.automl: 08-03 20:33:40] {3268} INFO - at 13.8s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:40] {3088} INFO - iteration 203, current learner extra_tree\n",
"[flaml.automl: 08-03 20:33:40] {3268} INFO - at 13.9s,\testimator extra_tree's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:40] {3088} INFO - iteration 204, current learner extra_tree\n",
"[flaml.automl: 08-03 20:33:40] {3268} INFO - at 13.9s,\testimator extra_tree's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:40] {3088} INFO - iteration 205, current learner xgboost\n",
"[flaml.automl: 08-03 20:33:40] {3268} INFO - at 14.0s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:40] {3088} INFO - iteration 206, current learner extra_tree\n",
"[flaml.automl: 08-03 20:33:41] {3268} INFO - at 14.1s,\testimator extra_tree's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:41] {3088} INFO - iteration 207, current learner extra_tree\n",
"[flaml.automl: 08-03 20:33:41] {3268} INFO - at 14.1s,\testimator extra_tree's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:41] {3088} INFO - iteration 208, current learner xgboost\n",
"[flaml.automl: 08-03 20:33:41] {3268} INFO - at 14.1s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:41] {3088} INFO - iteration 209, current learner xgboost\n",
"[flaml.automl: 08-03 20:33:41] {3268} INFO - at 14.2s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:41] {3088} INFO - iteration 210, current learner xgboost\n",
"[flaml.automl: 08-03 20:33:41] {3268} INFO - at 14.2s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:41] {3088} INFO - iteration 211, current learner xgboost\n",
"[flaml.automl: 08-03 20:33:41] {3268} INFO - at 14.3s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:41] {3088} INFO - iteration 212, current learner extra_tree\n",
"[flaml.automl: 08-03 20:33:41] {3268} INFO - at 14.3s,\testimator extra_tree's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:41] {3088} INFO - iteration 213, current learner extra_tree\n",
"[flaml.automl: 08-03 20:33:41] {3268} INFO - at 14.4s,\testimator extra_tree's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:41] {3088} INFO - iteration 214, current learner xgb_limitdepth\n",
"[flaml.automl: 08-03 20:33:41] {3268} INFO - at 14.4s,\testimator xgb_limitdepth's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:41] {3088} INFO - iteration 215, current learner xgb_limitdepth\n",
"[flaml.automl: 08-03 20:33:41] {3268} INFO - at 14.5s,\testimator xgb_limitdepth's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:41] {3088} INFO - iteration 216, current learner xgb_limitdepth\n",
"[flaml.automl: 08-03 20:33:41] {3268} INFO - at 14.5s,\testimator xgb_limitdepth's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:41] {3088} INFO - iteration 217, current learner extra_tree\n",
"[flaml.automl: 08-03 20:33:41] {3268} INFO - at 14.6s,\testimator extra_tree's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:41] {3088} INFO - iteration 218, current learner xgboost\n",
"[flaml.automl: 08-03 20:33:41] {3268} INFO - at 14.6s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:41] {3088} INFO - iteration 219, current learner extra_tree\n",
"[flaml.automl: 08-03 20:33:41] {3268} INFO - at 14.7s,\testimator extra_tree's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:41] {3088} INFO - iteration 220, current learner xgb_limitdepth\n",
"[flaml.automl: 08-03 20:33:41] {3268} INFO - at 14.7s,\testimator xgb_limitdepth's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:41] {3088} INFO - iteration 221, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:41] {3268} INFO - at 14.7s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:41] {3088} INFO - iteration 222, current learner xgb_limitdepth\n",
"[flaml.automl: 08-03 20:33:41] {3268} INFO - at 14.8s,\testimator xgb_limitdepth's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:41] {3088} INFO - iteration 223, current learner xgb_limitdepth\n",
"[flaml.automl: 08-03 20:33:41] {3268} INFO - at 14.8s,\testimator xgb_limitdepth's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:41] {3088} INFO - iteration 224, current learner lgbm\n",
"[flaml.automl: 08-03 20:33:41] {3268} INFO - at 14.9s,\testimator lgbm's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:41] {3088} INFO - iteration 225, current learner xgboost\n",
"[flaml.automl: 08-03 20:33:41] {3268} INFO - at 14.9s,\testimator xgboost's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:41] {3088} INFO - iteration 226, current learner xgb_limitdepth\n",
"[flaml.automl: 08-03 20:33:41] {3268} INFO - at 14.9s,\testimator xgb_limitdepth's best error=0.0667,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:41] {3088} INFO - iteration 227, current learner extra_tree\n",
"[flaml.automl: 08-03 20:33:41] {3268} INFO - at 15.0s,\testimator extra_tree's best error=0.0333,\tbest estimator lgbm's best error=0.0333\n",
"[flaml.automl: 08-03 20:33:41] {3532} INFO - retrain lgbm for 0.0s\n",
"[flaml.automl: 08-03 20:33:41] {3539} INFO - retrained model: LGBMClassifier(boosting_type='gbdt', class_weight=None, colsample_bytree=1.0,\n",
" importance_type='split', learning_rate=0.7333523408279569,\n",
" max_bin=31, max_depth=-1, min_child_samples=8,\n",
" min_child_weight=0.001, min_split_gain=0.0, n_estimators=4,\n",
" n_jobs=-1, num_leaves=5, objective=None, random_state=None,\n",
" reg_alpha=0.0009765625, reg_lambda=7.593190995489472,\n",
" silent=True, subsample=1.0, subsample_for_bin=200000,\n",
" subsample_freq=0, verbose=-1)\n",
"[flaml.automl: 08-03 20:33:41] {2817} INFO - fit succeeded\n",
"[flaml.automl: 08-03 20:33:41] {2818} INFO - Time taken to find the best model: 2.6732513904571533\n"
]
}
],
"source": [
"\"\"\"The main flaml automl API\"\"\"\n",
"automl.fit(X_train=discrete_X_train,\n",
" y_train=discrete_y_train,\n",
" **settings,\n",
" period=time_horizon)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Best Model and Metric"
]
},
{
"cell_type": "code",
"execution_count": 8,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
    "Best ML learner: lgbm\n",
    "Best hyperparameter config: {'n_estimators': 4, 'num_leaves': 5, 'min_child_samples': 8, 'learning_rate': 0.7333523408279569, 'log_max_bin': 5, 'colsample_bytree': 1.0, 'reg_alpha': 0.0009765625, 'reg_lambda': 7.593190995489472, 'optimize_for_horizon': False, 'lags': 5}\n",
"Best mape on validation data: 0.033333333333333326\n",
"Training duration of best run: 0.017951011657714844s\n",
"LGBMClassifier(boosting_type='gbdt', class_weight=None, colsample_bytree=1.0,\n",
" importance_type='split', learning_rate=0.7333523408279569,\n",
" max_bin=31, max_depth=-1, min_child_samples=8,\n",
" min_child_weight=0.001, min_split_gain=0.0, n_estimators=4,\n",
" n_jobs=-1, num_leaves=5, objective=None, random_state=None,\n",
" reg_alpha=0.0009765625, reg_lambda=7.593190995489472,\n",
" silent=True, subsample=1.0, subsample_for_bin=200000,\n",
" subsample_freq=0, verbose=-1)\n"
]
}
],
"source": [
    "\"\"\"retrieve best config and best learner\"\"\"\n",
    "print(\"Best ML learner:\", automl.best_estimator)\n",
    "print(\"Best hyperparameter config:\", automl.best_config)\n",
"print(f\"Best mape on validation data: {automl.best_loss}\")\n",
"print(f\"Training duration of best run: {automl.best_config_train_time}s\")\n",
"print(automl.model.estimator)"
]
},
{
"cell_type": "code",
"execution_count": 9,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Predicted label [1 1 0 0 1 1 1 1 1 0 0 1 1 1 1 1 0 0 1 1 1 1 1 0 0 1 1 1 1 1]\n",
"True label 150 1\n",
"151 1\n",
"152 0\n",
"153 0\n",
"154 1\n",
"155 1\n",
"156 1\n",
"157 1\n",
"158 1\n",
"159 0\n",
"160 0\n",
"161 1\n",
"162 1\n",
"163 1\n",
"164 1\n",
"165 1\n",
"166 0\n",
"167 0\n",
"168 1\n",
"169 1\n",
"170 1\n",
"171 1\n",
"172 1\n",
"173 0\n",
"174 0\n",
"175 1\n",
"176 1\n",
"177 1\n",
"178 1\n",
"179 1\n",
"Name: above_mean_sales, dtype: int32\n"
]
}
],
"source": [
"\"\"\" compute predictions of testing dataset \"\"\"\n",
"discrete_y_pred = automl.predict(discrete_X_test)\n",
"print(\"Predicted label\", discrete_y_pred)\n",
"print(\"True label\", discrete_y_test)"
]
},
{
"cell_type": "code",
"execution_count": 10,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"accuracy = 1.0\n"
]
}
],
"source": [
"from flaml.ml import sklearn_metric_loss_score\n",
"print(\"accuracy\", \"=\", 1 - sklearn_metric_loss_score(\"accuracy\", discrete_y_test, discrete_y_pred))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## 5. Forecast Problems with Panel Datasets (Multiple Time Series)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Load data and preprocess\n",
"\n",
"Import Stallion & Co.'s beverage sales data from pytorch-forecasting, originally from Kaggle. The dataset contains about 21,000 monthly historical sales records, along with additional information such as the sales price, the location of the agency, special days such as holidays, and the volume sold across the entire industry. There are thousands of unique wholesaler-SKU/product combinations, each representing an individual time series. The task is to provide a six-month forecast of demand at the SKU level for each wholesaler."
]
},
{
"cell_type": "code",
"execution_count": 1,
"metadata": {},
"outputs": [],
"source": [
"def get_stalliion_data():\n",
" from pytorch_forecasting.data.examples import get_stallion_data\n",
"\n",
" data = get_stallion_data()\n",
" # add time index\n",
" data[\"time_idx\"] = data[\"date\"].dt.year * 12 + data[\"date\"].dt.month\n",
" data[\"time_idx\"] -= data[\"time_idx\"].min()\n",
" # add additional features\n",
" data[\"month\"] = data.date.dt.month.astype(str).astype(\n",
" \"category\"\n",
"    )  # categories have to be strings\n",
" data[\"log_volume\"] = np.log(data.volume + 1e-8)\n",
" data[\"avg_volume_by_sku\"] = data.groupby(\n",
" [\"time_idx\", \"sku\"], observed=True\n",
" ).volume.transform(\"mean\")\n",
" data[\"avg_volume_by_agency\"] = data.groupby(\n",
" [\"time_idx\", \"agency\"], observed=True\n",
" ).volume.transform(\"mean\")\n",
" # we want to encode special days as one variable and thus need to first reverse one-hot encoding\n",
" special_days = [\n",
" \"easter_day\",\n",
" \"good_friday\",\n",
" \"new_year\",\n",
" \"christmas\",\n",
" \"labor_day\",\n",
" \"independence_day\",\n",
" \"revolution_day_memorial\",\n",
" \"regional_games\",\n",
" \"beer_capital\",\n",
" \"music_fest\",\n",
" ]\n",
" data[special_days] = (\n",
" data[special_days]\n",
" .apply(lambda x: x.map({0: \"-\", 1: x.name}))\n",
" .astype(\"category\")\n",
" )\n",
" return data, special_days"
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {},
"outputs": [],
"source": [
"import numpy as np\n",
"data, special_days = get_stalliion_data()\n",
"time_horizon = 6 # predict six months\n",
"# make time steps first column\n",
"data[\"time_idx\"] = data[\"date\"].dt.year * 12 + data[\"date\"].dt.month\n",
"data[\"time_idx\"] -= data[\"time_idx\"].min()\n",
"training_cutoff = data[\"time_idx\"].max() - time_horizon\n",
"ts_col = data.pop(\"date\")\n",
"data.insert(0, \"date\", ts_col)\n",
"# FLAML does not require sorted input; we sort here so predictions can be compared against y_test\n",
"data = data.sort_values([\"agency\", \"sku\", \"date\"])\n",
"X_train = data[lambda x: x.time_idx <= training_cutoff]\n",
"X_test = data[lambda x: x.time_idx > training_cutoff]\n",
"y_train = X_train.pop(\"volume\")\n",
"y_test = X_test.pop(\"volume\")"
]
},
{
"cell_type": "code",
"execution_count": 4,
"metadata": {},
"outputs": [
{
"data": {
"text/html": [
"<div>\n",
"<style scoped>\n",
" .dataframe tbody tr th:only-of-type {\n",
" vertical-align: middle;\n",
" }\n",
"\n",
" .dataframe tbody tr th {\n",
" vertical-align: top;\n",
" }\n",
"\n",
" .dataframe thead th {\n",
" text-align: right;\n",
" }\n",
"</style>\n",
"<table border=\"1\" class=\"dataframe\">\n",
" <thead>\n",
" <tr style=\"text-align: right;\">\n",
" <th></th>\n",
" <th>date</th>\n",
" <th>agency</th>\n",
" <th>sku</th>\n",
" <th>industry_volume</th>\n",
" <th>soda_volume</th>\n",
" <th>avg_max_temp</th>\n",
" <th>price_regular</th>\n",
" <th>price_actual</th>\n",
" <th>discount</th>\n",
" <th>avg_population_2017</th>\n",
" <th>...</th>\n",
" <th>football_gold_cup</th>\n",
" <th>beer_capital</th>\n",
" <th>music_fest</th>\n",
" <th>discount_in_percent</th>\n",
" <th>timeseries</th>\n",
" <th>time_idx</th>\n",
" <th>month</th>\n",
" <th>log_volume</th>\n",
" <th>avg_volume_by_sku</th>\n",
" <th>avg_volume_by_agency</th>\n",
" </tr>\n",
" </thead>\n",
" <tbody>\n",
" <tr>\n",
" <th>25</th>\n",
" <td>2013-01-01</td>\n",
" <td>Agency_01</td>\n",
" <td>SKU_01</td>\n",
" <td>492612703</td>\n",
" <td>718394219</td>\n",
" <td>17.072000</td>\n",
" <td>1141.500000</td>\n",
" <td>1033.432731</td>\n",
" <td>108.067269</td>\n",
" <td>153733</td>\n",
" <td>...</td>\n",
" <td>0</td>\n",
" <td>-</td>\n",
" <td>-</td>\n",
" <td>9.467128</td>\n",
" <td>249</td>\n",
" <td>0</td>\n",
" <td>1</td>\n",
" <td>4.390441</td>\n",
" <td>2613.377501</td>\n",
" <td>74.829600</td>\n",
" </tr>\n",
" <tr>\n",
" <th>7183</th>\n",
" <td>2013-02-01</td>\n",
" <td>Agency_01</td>\n",
" <td>SKU_01</td>\n",
" <td>431937346</td>\n",
" <td>753938444</td>\n",
" <td>19.984000</td>\n",
" <td>1141.500000</td>\n",
" <td>1065.417195</td>\n",
" <td>76.082805</td>\n",
" <td>153733</td>\n",
" <td>...</td>\n",
" <td>0</td>\n",
" <td>-</td>\n",
" <td>-</td>\n",
" <td>6.665160</td>\n",
" <td>249</td>\n",
" <td>1</td>\n",
" <td>2</td>\n",
" <td>4.585620</td>\n",
" <td>2916.978087</td>\n",
" <td>90.036700</td>\n",
" </tr>\n",
" <tr>\n",
" <th>8928</th>\n",
" <td>2013-03-01</td>\n",
" <td>Agency_01</td>\n",
" <td>SKU_01</td>\n",
" <td>509281531</td>\n",
" <td>892192092</td>\n",
" <td>24.600000</td>\n",
" <td>1179.345820</td>\n",
" <td>1101.133633</td>\n",
" <td>78.212187</td>\n",
" <td>153733</td>\n",
" <td>...</td>\n",
" <td>0</td>\n",
" <td>-</td>\n",
" <td>music_fest</td>\n",
" <td>6.631828</td>\n",
" <td>249</td>\n",
" <td>2</td>\n",
" <td>3</td>\n",
" <td>4.895628</td>\n",
" <td>3215.061952</td>\n",
" <td>130.487150</td>\n",
" </tr>\n",
" <tr>\n",
" <th>10588</th>\n",
" <td>2013-04-01</td>\n",
" <td>Agency_01</td>\n",
" <td>SKU_01</td>\n",
" <td>532390389</td>\n",
" <td>838099501</td>\n",
" <td>27.532000</td>\n",
" <td>1226.687500</td>\n",
" <td>1138.283357</td>\n",
" <td>88.404143</td>\n",
" <td>153733</td>\n",
" <td>...</td>\n",
" <td>0</td>\n",
" <td>-</td>\n",
" <td>-</td>\n",
" <td>7.206737</td>\n",
" <td>249</td>\n",
" <td>3</td>\n",
" <td>4</td>\n",
" <td>4.992553</td>\n",
" <td>3515.822697</td>\n",
" <td>130.246150</td>\n",
" </tr>\n",
" <tr>\n",
" <th>12260</th>\n",
" <td>2013-05-01</td>\n",
" <td>Agency_01</td>\n",
" <td>SKU_01</td>\n",
" <td>551755254</td>\n",
" <td>864420003</td>\n",
" <td>29.396000</td>\n",
" <td>1230.331104</td>\n",
" <td>1148.969634</td>\n",
" <td>81.361470</td>\n",
" <td>153733</td>\n",
" <td>...</td>\n",
" <td>0</td>\n",
" <td>-</td>\n",
" <td>-</td>\n",
" <td>6.612974</td>\n",
" <td>249</td>\n",
" <td>4</td>\n",
" <td>5</td>\n",
" <td>5.168254</td>\n",
" <td>3688.107793</td>\n",
" <td>159.051550</td>\n",
" </tr>\n",
" <tr>\n",
" <th>...</th>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" <td>...</td>\n",
" </tr>\n",
" <tr>\n",
" <th>8403</th>\n",
" <td>2017-02-01</td>\n",
" <td>Agency_60</td>\n",
" <td>SKU_23</td>\n",
" <td>530252010</td>\n",
" <td>850913048</td>\n",
" <td>25.242657</td>\n",
" <td>4261.294565</td>\n",
" <td>4087.082609</td>\n",
" <td>174.211956</td>\n",
" <td>2180611</td>\n",
" <td>...</td>\n",
" <td>0</td>\n",
" <td>-</td>\n",
" <td>-</td>\n",
" <td>4.088240</td>\n",
" <td>190</td>\n",
" <td>49</td>\n",
" <td>2</td>\n",
" <td>0.924259</td>\n",
" <td>2.418750</td>\n",
" <td>2664.670179</td>\n",
" </tr>\n",
" <tr>\n",
" <th>10359</th>\n",
" <td>2017-03-01</td>\n",
" <td>Agency_60</td>\n",
" <td>SKU_23</td>\n",
" <td>613143990</td>\n",
" <td>886129111</td>\n",
" <td>25.374816</td>\n",
" <td>4259.769000</td>\n",
" <td>4126.776000</td>\n",
" <td>132.993000</td>\n",
" <td>2180611</td>\n",
" <td>...</td>\n",
" <td>0</td>\n",
" <td>-</td>\n",
" <td>music_fest</td>\n",
" <td>3.122071</td>\n",
" <td>190</td>\n",
" <td>50</td>\n",
" <td>3</td>\n",
" <td>0.536493</td>\n",
" <td>4.353750</td>\n",
" <td>2965.472829</td>\n",
" </tr>\n",
" <tr>\n",
" <th>12114</th>\n",
" <td>2017-04-01</td>\n",
" <td>Agency_60</td>\n",
" <td>SKU_23</td>\n",
" <td>589969396</td>\n",
" <td>940912941</td>\n",
" <td>27.109204</td>\n",
" <td>4261.896428</td>\n",
" <td>4115.753572</td>\n",
" <td>146.142856</td>\n",
" <td>2180611</td>\n",
" <td>...</td>\n",
" <td>0</td>\n",
" <td>-</td>\n",
" <td>-</td>\n",
" <td>3.429057</td>\n",
" <td>190</td>\n",
" <td>51</td>\n",
" <td>4</td>\n",
" <td>0.231112</td>\n",
" <td>2.396250</td>\n",
" <td>2861.802300</td>\n",
" </tr>\n",
" <tr>\n",
" <th>13884</th>\n",
" <td>2017-05-01</td>\n",
" <td>Agency_60</td>\n",
" <td>SKU_23</td>\n",
" <td>628759461</td>\n",
" <td>917412482</td>\n",
" <td>28.479272</td>\n",
" <td>0.000000</td>\n",
" <td>0.000000</td>\n",
" <td>0.000000</td>\n",
" <td>2180611</td>\n",
" <td>...</td>\n",
" <td>0</td>\n",
" <td>-</td>\n",
" <td>-</td>\n",
" <td>0.000000</td>\n",
" <td>190</td>\n",
" <td>52</td>\n",
" <td>5</td>\n",
" <td>-18.420681</td>\n",
" <td>2.182500</td>\n",
" <td>3489.190286</td>\n",
" </tr>\n",
" <tr>\n",
" <th>15669</th>\n",
" <td>2017-06-01</td>\n",
" <td>Agency_60</td>\n",
" <td>SKU_23</td>\n",
" <td>636846973</td>\n",
" <td>928366256</td>\n",
" <td>29.609259</td>\n",
" <td>4256.675000</td>\n",
" <td>4246.018750</td>\n",
" <td>10.656250</td>\n",
" <td>2180611</td>\n",
" <td>...</td>\n",
" <td>0</td>\n",
" <td>-</td>\n",
" <td>-</td>\n",
" <td>0.250342</td>\n",
" <td>190</td>\n",
" <td>53</td>\n",
" <td>6</td>\n",
" <td>0.924259</td>\n",
" <td>2.362500</td>\n",
" <td>3423.810793</td>\n",
" </tr>\n",
" </tbody>\n",
"</table>\n",
"<p>18900 rows × 30 columns</p>\n",
"</div>"
],
"text/plain": [
" date agency sku industry_volume soda_volume \\\n",
"25 2013-01-01 Agency_01 SKU_01 492612703 718394219 \n",
"7183 2013-02-01 Agency_01 SKU_01 431937346 753938444 \n",
"8928 2013-03-01 Agency_01 SKU_01 509281531 892192092 \n",
"10588 2013-04-01 Agency_01 SKU_01 532390389 838099501 \n",
"12260 2013-05-01 Agency_01 SKU_01 551755254 864420003 \n",
"... ... ... ... ... ... \n",
"8403 2017-02-01 Agency_60 SKU_23 530252010 850913048 \n",
"10359 2017-03-01 Agency_60 SKU_23 613143990 886129111 \n",
"12114 2017-04-01 Agency_60 SKU_23 589969396 940912941 \n",
"13884 2017-05-01 Agency_60 SKU_23 628759461 917412482 \n",
"15669 2017-06-01 Agency_60 SKU_23 636846973 928366256 \n",
"\n",
" avg_max_temp price_regular price_actual discount \\\n",
"25 17.072000 1141.500000 1033.432731 108.067269 \n",
"7183 19.984000 1141.500000 1065.417195 76.082805 \n",
"8928 24.600000 1179.345820 1101.133633 78.212187 \n",
"10588 27.532000 1226.687500 1138.283357 88.404143 \n",
"12260 29.396000 1230.331104 1148.969634 81.361470 \n",
"... ... ... ... ... \n",
"8403 25.242657 4261.294565 4087.082609 174.211956 \n",
"10359 25.374816 4259.769000 4126.776000 132.993000 \n",
"12114 27.109204 4261.896428 4115.753572 146.142856 \n",
"13884 28.479272 0.000000 0.000000 0.000000 \n",
"15669 29.609259 4256.675000 4246.018750 10.656250 \n",
"\n",
" avg_population_2017 ... football_gold_cup beer_capital music_fest \\\n",
"25 153733 ... 0 - - \n",
"7183 153733 ... 0 - - \n",
"8928 153733 ... 0 - music_fest \n",
"10588 153733 ... 0 - - \n",
"12260 153733 ... 0 - - \n",
"... ... ... ... ... ... \n",
"8403 2180611 ... 0 - - \n",
"10359 2180611 ... 0 - music_fest \n",
"12114 2180611 ... 0 - - \n",
"13884 2180611 ... 0 - - \n",
"15669 2180611 ... 0 - - \n",
"\n",
" discount_in_percent timeseries time_idx month log_volume \\\n",
"25 9.467128 249 0 1 4.390441 \n",
"7183 6.665160 249 1 2 4.585620 \n",
"8928 6.631828 249 2 3 4.895628 \n",
"10588 7.206737 249 3 4 4.992553 \n",
"12260 6.612974 249 4 5 5.168254 \n",
"... ... ... ... ... ... \n",
"8403 4.088240 190 49 2 0.924259 \n",
"10359 3.122071 190 50 3 0.536493 \n",
"12114 3.429057 190 51 4 0.231112 \n",
"13884 0.000000 190 52 5 -18.420681 \n",
"15669 0.250342 190 53 6 0.924259 \n",
"\n",
" avg_volume_by_sku avg_volume_by_agency \n",
"25 2613.377501 74.829600 \n",
"7183 2916.978087 90.036700 \n",
"8928 3215.061952 130.487150 \n",
"10588 3515.822697 130.246150 \n",
"12260 3688.107793 159.051550 \n",
"... ... ... \n",
"8403 2.418750 2664.670179 \n",
"10359 4.353750 2965.472829 \n",
"12114 2.396250 2861.802300 \n",
"13884 2.182500 3489.190286 \n",
"15669 2.362500 3423.810793 \n",
"\n",
"[18900 rows x 30 columns]"
]
},
"execution_count": 4,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"X_train"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Run FLAML"
]
},
{
"cell_type": "code",
"execution_count": 5,
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"Missing timestamps detected. To avoid error with estimators, set estimator list to ['prophet']. \n",
"[flaml.automl: 08-13 01:07:10] {2540} INFO - task = ts_forecast_panel\n",
"[flaml.automl: 08-13 01:07:10] {2542} INFO - Data split method: time\n",
"[flaml.automl: 08-13 01:07:10] {2545} INFO - Evaluation method: holdout\n",
"[flaml.automl: 08-13 01:07:10] {2664} INFO - Minimizing error metric: mape\n",
"[flaml.automl: 08-13 01:07:10] {2806} INFO - List of ML learners in AutoML Run: ['tft']\n",
"[flaml.automl: 08-13 01:07:10] {3108} INFO - iteration 0, current learner tft\n",
"GPU available: False, used: False\n",
"TPU available: False, using: 0 TPU cores\n",
"IPU available: False, using: 0 IPUs\n",
"\n",
" | Name | Type | Params\n",
"----------------------------------------------------------------------------------------\n",
"0 | loss | QuantileLoss | 0 \n",
"1 | logging_metrics | ModuleList | 0 \n",
"2 | input_embeddings | MultiEmbedding | 1.3 K \n",
"3 | prescalers | ModuleDict | 256 \n",
"4 | static_variable_selection | VariableSelectionNetwork | 3.4 K \n",
"5 | encoder_variable_selection | VariableSelectionNetwork | 8.0 K \n",
"6 | decoder_variable_selection | VariableSelectionNetwork | 2.7 K \n",
"7 | static_context_variable_selection | GatedResidualNetwork | 1.1 K \n",
"8 | static_context_initial_hidden_lstm | GatedResidualNetwork | 1.1 K \n",
"9 | static_context_initial_cell_lstm | GatedResidualNetwork | 1.1 K \n",
"10 | static_context_enrichment | GatedResidualNetwork | 1.1 K \n",
"11 | lstm_encoder | LSTM | 4.4 K \n",
"12 | lstm_decoder | LSTM | 4.4 K \n",
"13 | post_lstm_gate_encoder | GatedLinearUnit | 544 \n",
"14 | post_lstm_add_norm_encoder | AddNorm | 32 \n",
"15 | static_enrichment | GatedResidualNetwork | 1.4 K \n",
"16 | multihead_attn | InterpretableMultiHeadAttention | 676 \n",
"17 | post_attn_gate_norm | GateAddNorm | 576 \n",
"18 | pos_wise_ff | GatedResidualNetwork | 1.1 K \n",
"19 | pre_output_gate_norm | GateAddNorm | 576 \n",
"20 | output_layer | Linear | 119 \n",
"----------------------------------------------------------------------------------------\n",
"33.6 K Trainable params\n",
"0 Non-trainable params\n",
"33.6 K Total params\n",
"0.135 Total estimated model params size (MB)\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"Epoch 19: 100%|██████████| 129/129 [00:58<00:00, 2.20it/s, loss=49.6, v_num=0, train_loss_step=44.50, val_loss=62.50, train_loss_epoch=48.40]\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"[flaml.automl: 08-13 01:28:51] {3241} INFO - Estimated sufficient time budget=13011770s. Estimated necessary time budget=13012s.\n",
"[flaml.automl: 08-13 01:28:51] {3288} INFO - at 1301.4s,\testimator tft's best error=1513915657450885.2500,\tbest estimator tft's best error=1513915657450885.2500\n",
"GPU available: False, used: False\n",
"TPU available: False, using: 0 TPU cores\n",
"IPU available: False, using: 0 IPUs\n",
"\n",
" | Name | Type | Params\n",
"----------------------------------------------------------------------------------------\n",
"0 | loss | QuantileLoss | 0 \n",
"1 | logging_metrics | ModuleList | 0 \n",
"2 | input_embeddings | MultiEmbedding | 1.3 K \n",
"3 | prescalers | ModuleDict | 256 \n",
"4 | static_variable_selection | VariableSelectionNetwork | 3.4 K \n",
"5 | encoder_variable_selection | VariableSelectionNetwork | 8.0 K \n",
"6 | decoder_variable_selection | VariableSelectionNetwork | 2.7 K \n",
"7 | static_context_variable_selection | GatedResidualNetwork | 1.1 K \n",
"8 | static_context_initial_hidden_lstm | GatedResidualNetwork | 1.1 K \n",
"9 | static_context_initial_cell_lstm | GatedResidualNetwork | 1.1 K \n",
"10 | static_context_enrichment | GatedResidualNetwork | 1.1 K \n",
"11 | lstm_encoder | LSTM | 4.4 K \n",
"12 | lstm_decoder | LSTM | 4.4 K \n",
"13 | post_lstm_gate_encoder | GatedLinearUnit | 544 \n",
"14 | post_lstm_add_norm_encoder | AddNorm | 32 \n",
"15 | static_enrichment | GatedResidualNetwork | 1.4 K \n",
"16 | multihead_attn | InterpretableMultiHeadAttention | 676 \n",
"17 | post_attn_gate_norm | GateAddNorm | 576 \n",
"18 | pos_wise_ff | GatedResidualNetwork | 1.1 K \n",
"19 | pre_output_gate_norm | GateAddNorm | 576 \n",
"20 | output_layer | Linear | 119 \n",
"----------------------------------------------------------------------------------------\n",
"33.6 K Trainable params\n",
"0 Non-trainable params\n",
"33.6 K Total params\n",
"0.135 Total estimated model params size (MB)\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"Epoch 19: 100%|██████████| 145/145 [01:13<00:00, 1.98it/s, loss=48.3, v_num=1, train_loss_step=52.00, val_loss=71.40, train_loss_epoch=47.80]\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"[flaml.automl: 08-13 01:53:06] {3552} INFO - retrain tft for 1454.1s\n",
"[flaml.automl: 08-13 01:53:06] {3559} INFO - retrained model: TemporalFusionTransformer(\n",
" (loss): QuantileLoss()\n",
" (logging_metrics): ModuleList(\n",
" (0): SMAPE()\n",
" (1): MAE()\n",
" (2): RMSE()\n",
" (3): MAPE()\n",
" )\n",
" (input_embeddings): MultiEmbedding(\n",
" (embeddings): ModuleDict(\n",
" (agency): Embedding(58, 16)\n",
" (sku): Embedding(25, 10)\n",
" (special_days): TimeDistributedEmbeddingBag(11, 6, mode=sum)\n",
" (month): Embedding(12, 6)\n",
" )\n",
" )\n",
" (prescalers): ModuleDict(\n",
" (avg_population_2017): Linear(in_features=1, out_features=8, bias=True)\n",
" (avg_yearly_household_income_2017): Linear(in_features=1, out_features=8, bias=True)\n",
" (encoder_length): Linear(in_features=1, out_features=8, bias=True)\n",
" (y_center): Linear(in_features=1, out_features=8, bias=True)\n",
" (y_scale): Linear(in_features=1, out_features=8, bias=True)\n",
" (time_idx): Linear(in_features=1, out_features=8, bias=True)\n",
" (price_regular): Linear(in_features=1, out_features=8, bias=True)\n",
" (discount_in_percent): Linear(in_features=1, out_features=8, bias=True)\n",
" (relative_time_idx): Linear(in_features=1, out_features=8, bias=True)\n",
" (y): Linear(in_features=1, out_features=8, bias=True)\n",
" (log_volume): Linear(in_features=1, out_features=8, bias=True)\n",
" (industry_volume): Linear(in_features=1, out_features=8, bias=True)\n",
" (soda_volume): Linear(in_features=1, out_features=8, bias=True)\n",
" (avg_max_temp): Linear(in_features=1, out_features=8, bias=True)\n",
" (avg_volume_by_agency): Linear(in_features=1, out_features=8, bias=True)\n",
" (avg_volume_by_sku): Linear(in_features=1, out_features=8, bias=True)\n",
" )\n",
" (static_variable_selection): VariableSelectionNetwork(\n",
" (flattened_grn): GatedResidualNetwork(\n",
" (resample_norm): ResampleNorm(\n",
" (resample): TimeDistributedInterpolation()\n",
" (gate): Sigmoid()\n",
" (norm): LayerNorm((7,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" (fc1): Linear(in_features=66, out_features=7, bias=True)\n",
" (elu): ELU(alpha=1.0)\n",
" (fc2): Linear(in_features=7, out_features=7, bias=True)\n",
" (gate_norm): GateAddNorm(\n",
" (glu): GatedLinearUnit(\n",
" (dropout): Dropout(p=0.1, inplace=False)\n",
" (fc): Linear(in_features=7, out_features=14, bias=True)\n",
" )\n",
" (add_norm): AddNorm(\n",
" (norm): LayerNorm((7,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" )\n",
" )\n",
" (single_variable_grns): ModuleDict(\n",
" (agency): ResampleNorm(\n",
" (gate): Sigmoid()\n",
" (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" (sku): ResampleNorm(\n",
" (resample): TimeDistributedInterpolation()\n",
" (gate): Sigmoid()\n",
" (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" (avg_population_2017): GatedResidualNetwork(\n",
" (resample_norm): ResampleNorm(\n",
" (resample): TimeDistributedInterpolation()\n",
" (gate): Sigmoid()\n",
" (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" (fc1): Linear(in_features=8, out_features=8, bias=True)\n",
" (elu): ELU(alpha=1.0)\n",
" (fc2): Linear(in_features=8, out_features=8, bias=True)\n",
" (gate_norm): GateAddNorm(\n",
" (glu): GatedLinearUnit(\n",
" (dropout): Dropout(p=0.1, inplace=False)\n",
" (fc): Linear(in_features=8, out_features=32, bias=True)\n",
" )\n",
" (add_norm): AddNorm(\n",
" (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" )\n",
" )\n",
" (avg_yearly_household_income_2017): GatedResidualNetwork(\n",
" (resample_norm): ResampleNorm(\n",
" (resample): TimeDistributedInterpolation()\n",
" (gate): Sigmoid()\n",
" (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" (fc1): Linear(in_features=8, out_features=8, bias=True)\n",
" (elu): ELU(alpha=1.0)\n",
" (fc2): Linear(in_features=8, out_features=8, bias=True)\n",
" (gate_norm): GateAddNorm(\n",
" (glu): GatedLinearUnit(\n",
" (dropout): Dropout(p=0.1, inplace=False)\n",
" (fc): Linear(in_features=8, out_features=32, bias=True)\n",
" )\n",
" (add_norm): AddNorm(\n",
" (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" )\n",
" )\n",
" (encoder_length): GatedResidualNetwork(\n",
" (resample_norm): ResampleNorm(\n",
" (resample): TimeDistributedInterpolation()\n",
" (gate): Sigmoid()\n",
" (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" (fc1): Linear(in_features=8, out_features=8, bias=True)\n",
" (elu): ELU(alpha=1.0)\n",
" (fc2): Linear(in_features=8, out_features=8, bias=True)\n",
" (gate_norm): GateAddNorm(\n",
" (glu): GatedLinearUnit(\n",
" (dropout): Dropout(p=0.1, inplace=False)\n",
" (fc): Linear(in_features=8, out_features=32, bias=True)\n",
" )\n",
" (add_norm): AddNorm(\n",
" (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" )\n",
" )\n",
" (y_center): GatedResidualNetwork(\n",
" (resample_norm): ResampleNorm(\n",
" (resample): TimeDistributedInterpolation()\n",
" (gate): Sigmoid()\n",
" (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" (fc1): Linear(in_features=8, out_features=8, bias=True)\n",
" (elu): ELU(alpha=1.0)\n",
" (fc2): Linear(in_features=8, out_features=8, bias=True)\n",
" (gate_norm): GateAddNorm(\n",
" (glu): GatedLinearUnit(\n",
" (dropout): Dropout(p=0.1, inplace=False)\n",
" (fc): Linear(in_features=8, out_features=32, bias=True)\n",
" )\n",
" (add_norm): AddNorm(\n",
" (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" )\n",
" )\n",
" (y_scale): GatedResidualNetwork(\n",
" (resample_norm): ResampleNorm(\n",
" (resample): TimeDistributedInterpolation()\n",
" (gate): Sigmoid()\n",
" (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" (fc1): Linear(in_features=8, out_features=8, bias=True)\n",
" (elu): ELU(alpha=1.0)\n",
" (fc2): Linear(in_features=8, out_features=8, bias=True)\n",
" (gate_norm): GateAddNorm(\n",
" (glu): GatedLinearUnit(\n",
" (dropout): Dropout(p=0.1, inplace=False)\n",
" (fc): Linear(in_features=8, out_features=32, bias=True)\n",
" )\n",
" (add_norm): AddNorm(\n",
" (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" )\n",
" )\n",
" )\n",
" (prescalers): ModuleDict(\n",
" (avg_population_2017): Linear(in_features=1, out_features=8, bias=True)\n",
" (avg_yearly_household_income_2017): Linear(in_features=1, out_features=8, bias=True)\n",
" (encoder_length): Linear(in_features=1, out_features=8, bias=True)\n",
" (y_center): Linear(in_features=1, out_features=8, bias=True)\n",
" (y_scale): Linear(in_features=1, out_features=8, bias=True)\n",
" )\n",
" (softmax): Softmax(dim=-1)\n",
" )\n",
" (encoder_variable_selection): VariableSelectionNetwork(\n",
" (flattened_grn): GatedResidualNetwork(\n",
" (resample_norm): ResampleNorm(\n",
" (resample): TimeDistributedInterpolation()\n",
" (gate): Sigmoid()\n",
" (norm): LayerNorm((13,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" (fc1): Linear(in_features=100, out_features=13, bias=True)\n",
" (elu): ELU(alpha=1.0)\n",
" (context): Linear(in_features=16, out_features=13, bias=False)\n",
" (fc2): Linear(in_features=13, out_features=13, bias=True)\n",
" (gate_norm): GateAddNorm(\n",
" (glu): GatedLinearUnit(\n",
" (dropout): Dropout(p=0.1, inplace=False)\n",
" (fc): Linear(in_features=13, out_features=26, bias=True)\n",
" )\n",
" (add_norm): AddNorm(\n",
" (norm): LayerNorm((13,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" )\n",
" )\n",
" (single_variable_grns): ModuleDict(\n",
" (special_days): ResampleNorm(\n",
" (resample): TimeDistributedInterpolation()\n",
" (gate): Sigmoid()\n",
" (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" (month): ResampleNorm(\n",
" (resample): TimeDistributedInterpolation()\n",
" (gate): Sigmoid()\n",
" (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" (time_idx): GatedResidualNetwork(\n",
" (resample_norm): ResampleNorm(\n",
" (resample): TimeDistributedInterpolation()\n",
" (gate): Sigmoid()\n",
" (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" (fc1): Linear(in_features=8, out_features=8, bias=True)\n",
" (elu): ELU(alpha=1.0)\n",
" (fc2): Linear(in_features=8, out_features=8, bias=True)\n",
" (gate_norm): GateAddNorm(\n",
" (glu): GatedLinearUnit(\n",
" (dropout): Dropout(p=0.1, inplace=False)\n",
" (fc): Linear(in_features=8, out_features=32, bias=True)\n",
" )\n",
" (add_norm): AddNorm(\n",
" (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" )\n",
" )\n",
" (price_regular): GatedResidualNetwork(\n",
" (resample_norm): ResampleNorm(\n",
" (resample): TimeDistributedInterpolation()\n",
" (gate): Sigmoid()\n",
" (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" (fc1): Linear(in_features=8, out_features=8, bias=True)\n",
" (elu): ELU(alpha=1.0)\n",
" (fc2): Linear(in_features=8, out_features=8, bias=True)\n",
" (gate_norm): GateAddNorm(\n",
" (glu): GatedLinearUnit(\n",
" (dropout): Dropout(p=0.1, inplace=False)\n",
" (fc): Linear(in_features=8, out_features=32, bias=True)\n",
" )\n",
" (add_norm): AddNorm(\n",
" (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" )\n",
" )\n",
" (discount_in_percent): GatedResidualNetwork(\n",
" (resample_norm): ResampleNorm(\n",
" (resample): TimeDistributedInterpolation()\n",
" (gate): Sigmoid()\n",
" (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" (fc1): Linear(in_features=8, out_features=8, bias=True)\n",
" (elu): ELU(alpha=1.0)\n",
" (fc2): Linear(in_features=8, out_features=8, bias=True)\n",
" (gate_norm): GateAddNorm(\n",
" (glu): GatedLinearUnit(\n",
" (dropout): Dropout(p=0.1, inplace=False)\n",
" (fc): Linear(in_features=8, out_features=32, bias=True)\n",
" )\n",
" (add_norm): AddNorm(\n",
" (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" )\n",
" )\n",
" (relative_time_idx): GatedResidualNetwork(\n",
" (resample_norm): ResampleNorm(\n",
" (resample): TimeDistributedInterpolation()\n",
" (gate): Sigmoid()\n",
" (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" (fc1): Linear(in_features=8, out_features=8, bias=True)\n",
" (elu): ELU(alpha=1.0)\n",
" (fc2): Linear(in_features=8, out_features=8, bias=True)\n",
" (gate_norm): GateAddNorm(\n",
" (glu): GatedLinearUnit(\n",
" (dropout): Dropout(p=0.1, inplace=False)\n",
" (fc): Linear(in_features=8, out_features=32, bias=True)\n",
" )\n",
" (add_norm): AddNorm(\n",
" (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" )\n",
" )\n",
" (y): GatedResidualNetwork(\n",
" (resample_norm): ResampleNorm(\n",
" (resample): TimeDistributedInterpolation()\n",
" (gate): Sigmoid()\n",
" (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" (fc1): Linear(in_features=8, out_features=8, bias=True)\n",
" (elu): ELU(alpha=1.0)\n",
" (fc2): Linear(in_features=8, out_features=8, bias=True)\n",
" (gate_norm): GateAddNorm(\n",
" (glu): GatedLinearUnit(\n",
" (dropout): Dropout(p=0.1, inplace=False)\n",
" (fc): Linear(in_features=8, out_features=32, bias=True)\n",
" )\n",
" (add_norm): AddNorm(\n",
" (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" )\n",
" )\n",
" (log_volume): GatedResidualNetwork(\n",
" (resample_norm): ResampleNorm(\n",
" (resample): TimeDistributedInterpolation()\n",
" (gate): Sigmoid()\n",
" (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" (fc1): Linear(in_features=8, out_features=8, bias=True)\n",
" (elu): ELU(alpha=1.0)\n",
" (fc2): Linear(in_features=8, out_features=8, bias=True)\n",
" (gate_norm): GateAddNorm(\n",
" (glu): GatedLinearUnit(\n",
" (dropout): Dropout(p=0.1, inplace=False)\n",
" (fc): Linear(in_features=8, out_features=32, bias=True)\n",
" )\n",
" (add_norm): AddNorm(\n",
" (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" )\n",
" )\n",
" (industry_volume): GatedResidualNetwork(\n",
" (resample_norm): ResampleNorm(\n",
" (resample): TimeDistributedInterpolation()\n",
" (gate): Sigmoid()\n",
" (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" (fc1): Linear(in_features=8, out_features=8, bias=True)\n",
" (elu): ELU(alpha=1.0)\n",
" (fc2): Linear(in_features=8, out_features=8, bias=True)\n",
" (gate_norm): GateAddNorm(\n",
" (glu): GatedLinearUnit(\n",
" (dropout): Dropout(p=0.1, inplace=False)\n",
" (fc): Linear(in_features=8, out_features=32, bias=True)\n",
" )\n",
" (add_norm): AddNorm(\n",
" (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" )\n",
" )\n",
" (soda_volume): GatedResidualNetwork(\n",
" (resample_norm): ResampleNorm(\n",
" (resample): TimeDistributedInterpolation()\n",
" (gate): Sigmoid()\n",
" (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" (fc1): Linear(in_features=8, out_features=8, bias=True)\n",
" (elu): ELU(alpha=1.0)\n",
" (fc2): Linear(in_features=8, out_features=8, bias=True)\n",
" (gate_norm): GateAddNorm(\n",
" (glu): GatedLinearUnit(\n",
" (dropout): Dropout(p=0.1, inplace=False)\n",
" (fc): Linear(in_features=8, out_features=32, bias=True)\n",
" )\n",
" (add_norm): AddNorm(\n",
" (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" )\n",
" )\n",
" (avg_max_temp): GatedResidualNetwork(\n",
" (resample_norm): ResampleNorm(\n",
" (resample): TimeDistributedInterpolation()\n",
" (gate): Sigmoid()\n",
" (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" (fc1): Linear(in_features=8, out_features=8, bias=True)\n",
" (elu): ELU(alpha=1.0)\n",
" (fc2): Linear(in_features=8, out_features=8, bias=True)\n",
" (gate_norm): GateAddNorm(\n",
" (glu): GatedLinearUnit(\n",
" (dropout): Dropout(p=0.1, inplace=False)\n",
" (fc): Linear(in_features=8, out_features=32, bias=True)\n",
" )\n",
" (add_norm): AddNorm(\n",
" (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" )\n",
" )\n",
" (avg_volume_by_agency): GatedResidualNetwork(\n",
" (resample_norm): ResampleNorm(\n",
" (resample): TimeDistributedInterpolation()\n",
" (gate): Sigmoid()\n",
" (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" (fc1): Linear(in_features=8, out_features=8, bias=True)\n",
" (elu): ELU(alpha=1.0)\n",
" (fc2): Linear(in_features=8, out_features=8, bias=True)\n",
" (gate_norm): GateAddNorm(\n",
" (glu): GatedLinearUnit(\n",
" (dropout): Dropout(p=0.1, inplace=False)\n",
" (fc): Linear(in_features=8, out_features=32, bias=True)\n",
" )\n",
" (add_norm): AddNorm(\n",
" (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" )\n",
" )\n",
" (avg_volume_by_sku): GatedResidualNetwork(\n",
" (resample_norm): ResampleNorm(\n",
" (resample): TimeDistributedInterpolation()\n",
" (gate): Sigmoid()\n",
" (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" (fc1): Linear(in_features=8, out_features=8, bias=True)\n",
" (elu): ELU(alpha=1.0)\n",
" (fc2): Linear(in_features=8, out_features=8, bias=True)\n",
" (gate_norm): GateAddNorm(\n",
" (glu): GatedLinearUnit(\n",
" (dropout): Dropout(p=0.1, inplace=False)\n",
" (fc): Linear(in_features=8, out_features=32, bias=True)\n",
" )\n",
" (add_norm): AddNorm(\n",
" (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" )\n",
" )\n",
" )\n",
" (prescalers): ModuleDict(\n",
" (time_idx): Linear(in_features=1, out_features=8, bias=True)\n",
" (price_regular): Linear(in_features=1, out_features=8, bias=True)\n",
" (discount_in_percent): Linear(in_features=1, out_features=8, bias=True)\n",
" (relative_time_idx): Linear(in_features=1, out_features=8, bias=True)\n",
" (y): Linear(in_features=1, out_features=8, bias=True)\n",
" (log_volume): Linear(in_features=1, out_features=8, bias=True)\n",
" (industry_volume): Linear(in_features=1, out_features=8, bias=True)\n",
" (soda_volume): Linear(in_features=1, out_features=8, bias=True)\n",
" (avg_max_temp): Linear(in_features=1, out_features=8, bias=True)\n",
" (avg_volume_by_agency): Linear(in_features=1, out_features=8, bias=True)\n",
" (avg_volume_by_sku): Linear(in_features=1, out_features=8, bias=True)\n",
" )\n",
" (softmax): Softmax(dim=-1)\n",
" )\n",
" (decoder_variable_selection): VariableSelectionNetwork(\n",
" (flattened_grn): GatedResidualNetwork(\n",
" (resample_norm): ResampleNorm(\n",
" (resample): TimeDistributedInterpolation()\n",
" (gate): Sigmoid()\n",
" (norm): LayerNorm((6,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" (fc1): Linear(in_features=44, out_features=6, bias=True)\n",
" (elu): ELU(alpha=1.0)\n",
" (context): Linear(in_features=16, out_features=6, bias=False)\n",
" (fc2): Linear(in_features=6, out_features=6, bias=True)\n",
" (gate_norm): GateAddNorm(\n",
" (glu): GatedLinearUnit(\n",
" (dropout): Dropout(p=0.1, inplace=False)\n",
" (fc): Linear(in_features=6, out_features=12, bias=True)\n",
" )\n",
" (add_norm): AddNorm(\n",
" (norm): LayerNorm((6,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" )\n",
" )\n",
" (single_variable_grns): ModuleDict(\n",
" (special_days): ResampleNorm(\n",
" (resample): TimeDistributedInterpolation()\n",
" (gate): Sigmoid()\n",
" (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" (month): ResampleNorm(\n",
" (resample): TimeDistributedInterpolation()\n",
" (gate): Sigmoid()\n",
" (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" (time_idx): GatedResidualNetwork(\n",
" (resample_norm): ResampleNorm(\n",
" (resample): TimeDistributedInterpolation()\n",
" (gate): Sigmoid()\n",
" (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" (fc1): Linear(in_features=8, out_features=8, bias=True)\n",
" (elu): ELU(alpha=1.0)\n",
" (fc2): Linear(in_features=8, out_features=8, bias=True)\n",
" (gate_norm): GateAddNorm(\n",
" (glu): GatedLinearUnit(\n",
" (dropout): Dropout(p=0.1, inplace=False)\n",
" (fc): Linear(in_features=8, out_features=32, bias=True)\n",
" )\n",
" (add_norm): AddNorm(\n",
" (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" )\n",
" )\n",
" (price_regular): GatedResidualNetwork(\n",
" (resample_norm): ResampleNorm(\n",
" (resample): TimeDistributedInterpolation()\n",
" (gate): Sigmoid()\n",
" (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" (fc1): Linear(in_features=8, out_features=8, bias=True)\n",
" (elu): ELU(alpha=1.0)\n",
" (fc2): Linear(in_features=8, out_features=8, bias=True)\n",
" (gate_norm): GateAddNorm(\n",
" (glu): GatedLinearUnit(\n",
" (dropout): Dropout(p=0.1, inplace=False)\n",
" (fc): Linear(in_features=8, out_features=32, bias=True)\n",
" )\n",
" (add_norm): AddNorm(\n",
" (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" )\n",
" )\n",
" (discount_in_percent): GatedResidualNetwork(\n",
" (resample_norm): ResampleNorm(\n",
" (resample): TimeDistributedInterpolation()\n",
" (gate): Sigmoid()\n",
" (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" (fc1): Linear(in_features=8, out_features=8, bias=True)\n",
" (elu): ELU(alpha=1.0)\n",
" (fc2): Linear(in_features=8, out_features=8, bias=True)\n",
" (gate_norm): GateAddNorm(\n",
" (glu): GatedLinearUnit(\n",
" (dropout): Dropout(p=0.1, inplace=False)\n",
" (fc): Linear(in_features=8, out_features=32, bias=True)\n",
" )\n",
" (add_norm): AddNorm(\n",
" (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" )\n",
" )\n",
" (relative_time_idx): GatedResidualNetwork(\n",
" (resample_norm): ResampleNorm(\n",
" (resample): TimeDistributedInterpolation()\n",
" (gate): Sigmoid()\n",
" (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" (fc1): Linear(in_features=8, out_features=8, bias=True)\n",
" (elu): ELU(alpha=1.0)\n",
" (fc2): Linear(in_features=8, out_features=8, bias=True)\n",
" (gate_norm): GateAddNorm(\n",
" (glu): GatedLinearUnit(\n",
" (dropout): Dropout(p=0.1, inplace=False)\n",
" (fc): Linear(in_features=8, out_features=32, bias=True)\n",
" )\n",
" (add_norm): AddNorm(\n",
" (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" )\n",
" )\n",
" )\n",
" (prescalers): ModuleDict(\n",
" (time_idx): Linear(in_features=1, out_features=8, bias=True)\n",
" (price_regular): Linear(in_features=1, out_features=8, bias=True)\n",
" (discount_in_percent): Linear(in_features=1, out_features=8, bias=True)\n",
" (relative_time_idx): Linear(in_features=1, out_features=8, bias=True)\n",
" )\n",
" (softmax): Softmax(dim=-1)\n",
" )\n",
" (static_context_variable_selection): GatedResidualNetwork(\n",
" (fc1): Linear(in_features=16, out_features=16, bias=True)\n",
" (elu): ELU(alpha=1.0)\n",
" (fc2): Linear(in_features=16, out_features=16, bias=True)\n",
" (gate_norm): GateAddNorm(\n",
" (glu): GatedLinearUnit(\n",
" (dropout): Dropout(p=0.1, inplace=False)\n",
" (fc): Linear(in_features=16, out_features=32, bias=True)\n",
" )\n",
" (add_norm): AddNorm(\n",
" (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" )\n",
" )\n",
" (static_context_initial_hidden_lstm): GatedResidualNetwork(\n",
" (fc1): Linear(in_features=16, out_features=16, bias=True)\n",
" (elu): ELU(alpha=1.0)\n",
" (fc2): Linear(in_features=16, out_features=16, bias=True)\n",
" (gate_norm): GateAddNorm(\n",
" (glu): GatedLinearUnit(\n",
" (dropout): Dropout(p=0.1, inplace=False)\n",
" (fc): Linear(in_features=16, out_features=32, bias=True)\n",
" )\n",
" (add_norm): AddNorm(\n",
" (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" )\n",
" )\n",
" (static_context_initial_cell_lstm): GatedResidualNetwork(\n",
" (fc1): Linear(in_features=16, out_features=16, bias=True)\n",
" (elu): ELU(alpha=1.0)\n",
" (fc2): Linear(in_features=16, out_features=16, bias=True)\n",
" (gate_norm): GateAddNorm(\n",
" (glu): GatedLinearUnit(\n",
" (dropout): Dropout(p=0.1, inplace=False)\n",
" (fc): Linear(in_features=16, out_features=32, bias=True)\n",
" )\n",
" (add_norm): AddNorm(\n",
" (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" )\n",
" )\n",
" (static_context_enrichment): GatedResidualNetwork(\n",
" (fc1): Linear(in_features=16, out_features=16, bias=True)\n",
" (elu): ELU(alpha=1.0)\n",
" (fc2): Linear(in_features=16, out_features=16, bias=True)\n",
" (gate_norm): GateAddNorm(\n",
" (glu): GatedLinearUnit(\n",
" (dropout): Dropout(p=0.1, inplace=False)\n",
" (fc): Linear(in_features=16, out_features=32, bias=True)\n",
" )\n",
" (add_norm): AddNorm(\n",
" (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" )\n",
" )\n",
" (lstm_encoder): LSTM(16, 16, num_layers=2, batch_first=True, dropout=0.1)\n",
" (lstm_decoder): LSTM(16, 16, num_layers=2, batch_first=True, dropout=0.1)\n",
" (post_lstm_gate_encoder): GatedLinearUnit(\n",
" (dropout): Dropout(p=0.1, inplace=False)\n",
" (fc): Linear(in_features=16, out_features=32, bias=True)\n",
" )\n",
" (post_lstm_gate_decoder): GatedLinearUnit(\n",
" (dropout): Dropout(p=0.1, inplace=False)\n",
" (fc): Linear(in_features=16, out_features=32, bias=True)\n",
" )\n",
" (post_lstm_add_norm_encoder): AddNorm(\n",
" (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" (post_lstm_add_norm_decoder): AddNorm(\n",
" (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" (static_enrichment): GatedResidualNetwork(\n",
" (fc1): Linear(in_features=16, out_features=16, bias=True)\n",
" (elu): ELU(alpha=1.0)\n",
" (context): Linear(in_features=16, out_features=16, bias=False)\n",
" (fc2): Linear(in_features=16, out_features=16, bias=True)\n",
" (gate_norm): GateAddNorm(\n",
" (glu): GatedLinearUnit(\n",
" (dropout): Dropout(p=0.1, inplace=False)\n",
" (fc): Linear(in_features=16, out_features=32, bias=True)\n",
" )\n",
" (add_norm): AddNorm(\n",
" (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" )\n",
" )\n",
" (multihead_attn): InterpretableMultiHeadAttention(\n",
" (dropout): Dropout(p=0.1, inplace=False)\n",
" (v_layer): Linear(in_features=16, out_features=4, bias=True)\n",
" (q_layers): ModuleList(\n",
" (0): Linear(in_features=16, out_features=4, bias=True)\n",
" (1): Linear(in_features=16, out_features=4, bias=True)\n",
" (2): Linear(in_features=16, out_features=4, bias=True)\n",
" (3): Linear(in_features=16, out_features=4, bias=True)\n",
" )\n",
" (k_layers): ModuleList(\n",
" (0): Linear(in_features=16, out_features=4, bias=True)\n",
" (1): Linear(in_features=16, out_features=4, bias=True)\n",
" (2): Linear(in_features=16, out_features=4, bias=True)\n",
" (3): Linear(in_features=16, out_features=4, bias=True)\n",
" )\n",
" (attention): ScaledDotProductAttention(\n",
" (softmax): Softmax(dim=2)\n",
" )\n",
" (w_h): Linear(in_features=4, out_features=16, bias=False)\n",
" )\n",
" (post_attn_gate_norm): GateAddNorm(\n",
" (glu): GatedLinearUnit(\n",
" (dropout): Dropout(p=0.1, inplace=False)\n",
" (fc): Linear(in_features=16, out_features=32, bias=True)\n",
" )\n",
" (add_norm): AddNorm(\n",
" (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" )\n",
" (pos_wise_ff): GatedResidualNetwork(\n",
" (fc1): Linear(in_features=16, out_features=16, bias=True)\n",
" (elu): ELU(alpha=1.0)\n",
" (fc2): Linear(in_features=16, out_features=16, bias=True)\n",
" (gate_norm): GateAddNorm(\n",
" (glu): GatedLinearUnit(\n",
" (dropout): Dropout(p=0.1, inplace=False)\n",
" (fc): Linear(in_features=16, out_features=32, bias=True)\n",
" )\n",
" (add_norm): AddNorm(\n",
" (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" )\n",
" )\n",
" (pre_output_gate_norm): GateAddNorm(\n",
" (glu): GatedLinearUnit(\n",
" (fc): Linear(in_features=16, out_features=32, bias=True)\n",
" )\n",
" (add_norm): AddNorm(\n",
" (norm): LayerNorm((16,), eps=1e-05, elementwise_affine=True)\n",
" )\n",
" )\n",
" (output_layer): Linear(in_features=16, out_features=7, bias=True)\n",
")\n",
"[flaml.automl: 08-13 01:53:06] {2837} INFO - fit succeeded\n",
"[flaml.automl: 08-13 01:53:06] {2838} INFO - Time taken to find the best model: 1301.3580441474915\n",
"[flaml.automl: 08-13 01:53:06] {2849} WARNING - Time taken to find the best model is 434% of the provided time budget and not all estimators' hyperparameter search converged. Consider increasing the time budget.\n"
]
}
],
"source": [
"from flaml import AutoML\n",
"automl = AutoML()\n",
"settings = {\n",
" \"time_budget\": 300, # total running time in seconds\n",
" \"metric\": \"mape\", # primary metric\n",
" \"task\": \"ts_forecast_panel\", # task type\n",
" \"log_file_name\": \"stallion_forecast.log\", # flaml log file\n",
" \"eval_method\": \"holdout\",\n",
"}\n",
"fit_kwargs_by_estimator = {\n",
" \"tft\": {\n",
" \"max_encoder_length\": 24,\n",
" \"static_categoricals\": [\"agency\", \"sku\"],\n",
" \"static_reals\": [\"avg_population_2017\", \"avg_yearly_household_income_2017\"],\n",
" \"time_varying_known_categoricals\": [\"special_days\", \"month\"],\n",
" \"variable_groups\": {\n",
" \"special_days\": special_days\n",
" }, # group of categorical variables can be treated as one variable\n",
" \"time_varying_known_reals\": [\n",
" \"time_idx\",\n",
" \"price_regular\",\n",
" \"discount_in_percent\",\n",
" ],\n",
" \"time_varying_unknown_categoricals\": [],\n",
" \"time_varying_unknown_reals\": [\n",
" \"y\", # always need a 'y' column for the target column\n",
" \"log_volume\",\n",
" \"industry_volume\",\n",
" \"soda_volume\",\n",
" \"avg_max_temp\",\n",
" \"avg_volume_by_agency\",\n",
" \"avg_volume_by_sku\",\n",
" ],\n",
" \"batch_size\": 128,\n",
" \"gpu_per_trial\": -1,\n",
" }\n",
"}\n",
"\"\"\"The main flaml automl API\"\"\"\n",
"automl.fit(\n",
" X_train=X_train,\n",
" y_train=y_train,\n",
" **settings,\n",
" period=time_horizon,\n",
" group_ids=[\"agency\", \"sku\"],\n",
" fit_kwargs_by_estimator=fit_kwargs_by_estimator,\n",
")"
]
},
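{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# A quick look at what the search found (a sketch; best_estimator, best_config\n",
"# and best_loss are attributes of FLAML's AutoML object).\n",
"print(\"best estimator:\", automl.best_estimator)\n",
"print(\"best config:\", automl.best_config)\n",
"print(\"best validation loss:\", automl.best_loss)"
]
},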
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Prediction and Metrics"
]
},
{
"cell_type": "code",
"execution_count": 8,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"17156 59.292\n",
"18946 66.420\n",
"20680 95.904\n",
"3189 52.812\n",
"4954 37.908\n",
" ... \n",
"19207 1.980\n",
"20996 1.260\n",
"3499 0.990\n",
"5248 0.090\n",
"6793 2.250\n",
"Name: volume, Length: 2100, dtype: float64\n",
"Agency_01 SKU_01 2017-07-01 61.183556\n",
" 2017-08-01 58.315655\n",
" 2017-09-01 68.994713\n",
" 2017-10-01 55.561470\n",
" 2017-11-01 63.187683\n",
" ... \n",
"Agency_60 SKU_23 2017-08-01 3.096925\n",
" 2017-09-01 2.331218\n",
" 2017-10-01 1.883857\n",
" 2017-11-01 3.282242\n",
" 2017-12-01 3.580535\n",
"Length: 2100, dtype: float32\n"
]
}
],
"source": [
"\"\"\" compute predictions of testing dataset \"\"\"\n",
"y_pred = automl.predict(X_test)\n",
"print(y_test)\n",
"print(y_pred)"
]
},
{
"cell_type": "code",
"execution_count": 9,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"mape = 2670450710119213.0\n",
"smape = 52.3\n"
]
}
],
"source": [
"\"\"\" compute different metric values on testing dataset\"\"\"\n",
"from flaml.ml import sklearn_metric_loss_score\n",
"print(\"mape\", \"=\", sklearn_metric_loss_score(\"mape\", y_pred, y_test))\n",
"\n",
"def smape(y_pred, y_test):\n",
" import numpy as np\n",
"\n",
" y_test, y_pred = np.array(y_test), np.array(y_pred)\n",
" return round(\n",
" np.mean(\n",
" np.abs(y_pred - y_test) /\n",
" ((np.abs(y_pred) + np.abs(y_test)) / 2)\n",
" ) * 100, 2\n",
" )\n",
"\n",
"print(\"smape\", \"=\", smape(y_pred, y_test))"
]
},
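{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Optional visual check (a sketch, assuming matplotlib is installed): y_pred is\n",
"# aligned row-by-row with y_test, so a scatter of predicted vs. actual volume\n",
"# shows how far the panel forecasts deviate from the diagonal.\n",
"import matplotlib.pyplot as plt\n",
"\n",
"plt.figure(figsize=(6, 6))\n",
"plt.scatter(y_test.values, y_pred.values, s=8, alpha=0.5)\n",
"plt.xlabel(\"actual volume\")\n",
"plt.ylabel(\"predicted volume\")\n",
"plt.show()"
]
},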
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## 6. Comparison with Alternatives (CO2 Dataset)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"FLAML's MAPE"
]
},
{
"cell_type": "code",
"execution_count": 33,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"flaml mape = 0.0005710586398294955\n"
]
}
],
"source": [
"from flaml.ml import sklearn_metric_loss_score\n",
"print('flaml mape', '=', sklearn_metric_loss_score('mape', flaml_y_pred, y_test))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Default Prophet"
]
},
{
"cell_type": "code",
"execution_count": 34,
"metadata": {},
"outputs": [],
"source": [
"from prophet import Prophet\n",
"prophet_model = Prophet()"
]
},
{
"cell_type": "code",
"execution_count": 35,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"<prophet.forecaster.Prophet at 0x1e2d990d7c0>"
]
},
"execution_count": 35,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"X_train_prophet = train_df.copy()\n",
"X_train_prophet = X_train_prophet.rename(columns={'index': 'ds', 'co2': 'y'})\n",
"prophet_model.fit(X_train_prophet)"
]
},
{
"cell_type": "code",
"execution_count": 36,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Predicted labels 0 370.450675\n",
"1 371.177764\n",
"2 372.229577\n",
"3 373.419835\n",
"4 373.914917\n",
"5 373.406484\n",
"6 372.053428\n",
"7 370.149037\n",
"8 368.566631\n",
"9 368.646853\n",
"10 369.863891\n",
"11 371.135959\n",
"Name: yhat, dtype: float64\n",
"True labels 514 370.175\n",
"515 371.325\n",
"516 372.060\n",
"517 372.775\n",
"518 373.800\n",
"519 373.060\n",
"520 371.300\n",
"521 369.425\n",
"522 367.880\n",
"523 368.050\n",
"524 369.375\n",
"525 371.020\n",
"Name: co2, dtype: float64\n"
]
}
],
"source": [
"X_test_prophet = X_test.copy()\n",
"X_test_prophet = X_test_prophet.rename(columns={'index': 'ds'})\n",
"prophet_y_pred = prophet_model.predict(X_test_prophet)['yhat']\n",
"print('Predicted labels', prophet_y_pred)\n",
"print('True labels', y_test)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Default Prophet MAPE"
]
},
{
"cell_type": "code",
"execution_count": 37,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"default prophet mape = 0.0011396920680673015\n"
]
}
],
"source": [
"from flaml.ml import sklearn_metric_loss_score\n",
"print('default prophet mape', '=', sklearn_metric_loss_score('mape', prophet_y_pred, y_test))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Auto ARIMA Models"
]
},
{
"cell_type": "code",
"execution_count": 38,
"metadata": {},
"outputs": [],
"source": [
"from pmdarima.arima import auto_arima\n",
"import pandas as pd\n",
"import time\n",
"\n",
"X_train_arima = train_df.copy()\n",
"X_train_arima.index = pd.to_datetime(X_train_arima['index'])\n",
"X_train_arima = X_train_arima.drop('index', axis=1)\n",
"X_train_arima = X_train_arima.rename(columns={'co2': 'y'})"
]
},
{
"cell_type": "code",
"execution_count": 39,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
" ARIMA(0,1,0)(0,0,0)[0] intercept : AIC=1638.009, Time=0.02 sec\n",
" ARIMA(0,1,1)(0,0,0)[0] intercept : AIC=1344.207, Time=0.09 sec\n",
" ARIMA(0,1,2)(0,0,0)[0] intercept : AIC=1222.286, Time=0.14 sec\n",
" ARIMA(0,1,3)(0,0,0)[0] intercept : AIC=1174.928, Time=0.20 sec\n",
" ARIMA(0,1,4)(0,0,0)[0] intercept : AIC=1188.947, Time=0.43 sec\n",
" ARIMA(0,1,5)(0,0,0)[0] intercept : AIC=1091.452, Time=0.55 sec\n",
" ARIMA(1,1,0)(0,0,0)[0] intercept : AIC=1298.693, Time=0.08 sec\n",
" ARIMA(1,1,1)(0,0,0)[0] intercept : AIC=1240.963, Time=0.12 sec\n",
" ARIMA(1,1,2)(0,0,0)[0] intercept : AIC=1196.535, Time=0.19 sec\n",
" ARIMA(1,1,3)(0,0,0)[0] intercept : AIC=1176.484, Time=0.34 sec\n",
" ARIMA(1,1,4)(0,0,0)[0] intercept : AIC=inf, Time=1.18 sec\n",
" ARIMA(2,1,0)(0,0,0)[0] intercept : AIC=1180.404, Time=0.08 sec\n",
" ARIMA(2,1,1)(0,0,0)[0] intercept : AIC=990.719, Time=0.26 sec\n",
" ARIMA(2,1,2)(0,0,0)[0] intercept : AIC=988.094, Time=0.53 sec\n",
" ARIMA(2,1,3)(0,0,0)[0] intercept : AIC=1140.469, Time=0.53 sec\n",
" ARIMA(3,1,0)(0,0,0)[0] intercept : AIC=1126.139, Time=0.21 sec\n",
" ARIMA(3,1,1)(0,0,0)[0] intercept : AIC=989.496, Time=0.51 sec\n",
" ARIMA(3,1,2)(0,0,0)[0] intercept : AIC=991.558, Time=1.17 sec\n",
" ARIMA(4,1,0)(0,0,0)[0] intercept : AIC=1125.025, Time=0.19 sec\n",
" ARIMA(4,1,1)(0,0,0)[0] intercept : AIC=988.660, Time=0.98 sec\n",
" ARIMA(5,1,0)(0,0,0)[0] intercept : AIC=1113.673, Time=0.22 sec\n",
"\n",
"Best model: ARIMA(2,1,2)(0,0,0)[0] intercept\n",
"Total fit time: 8.039 seconds\n"
]
}
],
"source": [
"# use same search space as FLAML\n",
"start_time = time.time()\n",
"arima_model = auto_arima(X_train_arima,\n",
" start_p=2, d=None, start_q=1, max_p=10, max_d=10, max_q=10,\n",
" suppress_warnings=True, stepwise=False, seasonal=False,\n",
" error_action='ignore', trace=True, n_fits=650)\n",
"autoarima_y_pred = arima_model.predict(n_periods=12)\n",
"arima_time = time.time() - start_time"
]
},
{
"cell_type": "code",
"execution_count": 40,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
" ARIMA(0,1,0)(0,0,0)[12] intercept : AIC=1638.009, Time=0.02 sec\n",
" ARIMA(0,1,0)(0,0,1)[12] intercept : AIC=1238.943, Time=0.23 sec\n",
" ARIMA(0,1,0)(0,0,2)[12] intercept : AIC=1040.890, Time=0.53 sec\n",
" ARIMA(0,1,0)(0,0,3)[12] intercept : AIC=911.545, Time=1.76 sec\n",
" ARIMA(0,1,0)(0,0,4)[12] intercept : AIC=823.103, Time=3.18 sec\n",
" ARIMA(0,1,0)(0,0,5)[12] intercept : AIC=792.850, Time=5.99 sec\n",
" ARIMA(0,1,0)(1,0,0)[12] intercept : AIC=inf, Time=0.26 sec\n",
" ARIMA(0,1,0)(1,0,1)[12] intercept : AIC=inf, Time=1.37 sec\n",
" ARIMA(0,1,0)(1,0,2)[12] intercept : AIC=inf, Time=2.60 sec\n",
" ARIMA(0,1,0)(1,0,3)[12] intercept : AIC=447.302, Time=5.94 sec\n",
" ARIMA(0,1,0)(1,0,4)[12] intercept : AIC=inf, Time=11.23 sec\n",
" ARIMA(0,1,0)(2,0,0)[12] intercept : AIC=inf, Time=1.10 sec\n",
" ARIMA(0,1,0)(2,0,1)[12] intercept : AIC=inf, Time=2.37 sec\n",
" ARIMA(0,1,0)(2,0,2)[12] intercept : AIC=inf, Time=2.75 sec\n",
" ARIMA(0,1,0)(2,0,3)[12] intercept : AIC=427.135, Time=7.49 sec\n",
" ARIMA(0,1,0)(3,0,0)[12] intercept : AIC=inf, Time=3.56 sec\n",
" ARIMA(0,1,0)(3,0,1)[12] intercept : AIC=424.286, Time=6.44 sec\n",
" ARIMA(0,1,0)(3,0,2)[12] intercept : AIC=431.435, Time=6.86 sec\n",
" ARIMA(0,1,0)(4,0,0)[12] intercept : AIC=inf, Time=8.12 sec\n",
" ARIMA(0,1,0)(4,0,1)[12] intercept : AIC=430.321, Time=11.65 sec\n",
" ARIMA(0,1,0)(5,0,0)[12] intercept : AIC=inf, Time=17.56 sec\n",
" ARIMA(0,1,1)(0,0,0)[12] intercept : AIC=1344.207, Time=0.08 sec\n",
" ARIMA(0,1,1)(0,0,1)[12] intercept : AIC=1112.274, Time=0.37 sec\n",
" ARIMA(0,1,1)(0,0,2)[12] intercept : AIC=993.565, Time=0.76 sec\n",
" ARIMA(0,1,1)(0,0,3)[12] intercept : AIC=891.683, Time=3.11 sec\n",
" ARIMA(0,1,1)(0,0,4)[12] intercept : AIC=820.025, Time=5.52 sec\n",
" ARIMA(0,1,1)(1,0,0)[12] intercept : AIC=612.811, Time=0.60 sec\n",
" ARIMA(0,1,1)(1,0,1)[12] intercept : AIC=393.876, Time=1.61 sec\n",
" ARIMA(0,1,1)(1,0,2)[12] intercept : AIC=416.358, Time=3.64 sec\n",
" ARIMA(0,1,1)(1,0,3)[12] intercept : AIC=424.837, Time=8.45 sec\n",
" ARIMA(0,1,1)(2,0,0)[12] intercept : AIC=510.637, Time=1.63 sec\n",
" ARIMA(0,1,1)(2,0,1)[12] intercept : AIC=398.093, Time=3.18 sec\n",
" ARIMA(0,1,1)(2,0,2)[12] intercept : AIC=401.837, Time=4.14 sec\n",
" ARIMA(0,1,1)(3,0,0)[12] intercept : AIC=467.985, Time=8.25 sec\n",
" ARIMA(0,1,1)(3,0,1)[12] intercept : AIC=412.757, Time=10.34 sec\n",
" ARIMA(0,1,1)(4,0,0)[12] intercept : AIC=448.948, Time=7.42 sec\n",
" ARIMA(0,1,2)(0,0,0)[12] intercept : AIC=1222.286, Time=0.14 sec\n",
" ARIMA(0,1,2)(0,0,1)[12] intercept : AIC=1046.922, Time=0.32 sec\n",
" ARIMA(0,1,2)(0,0,2)[12] intercept : AIC=947.532, Time=0.92 sec\n",
" ARIMA(0,1,2)(0,0,3)[12] intercept : AIC=867.310, Time=2.67 sec\n",
" ARIMA(0,1,2)(1,0,0)[12] intercept : AIC=608.450, Time=0.65 sec\n",
" ARIMA(0,1,2)(1,0,1)[12] intercept : AIC=389.029, Time=1.72 sec\n",
" ARIMA(0,1,2)(1,0,2)[12] intercept : AIC=421.446, Time=3.85 sec\n",
" ARIMA(0,1,2)(2,0,0)[12] intercept : AIC=507.685, Time=2.02 sec\n",
" ARIMA(0,1,2)(2,0,1)[12] intercept : AIC=408.463, Time=3.61 sec\n",
" ARIMA(0,1,2)(3,0,0)[12] intercept : AIC=460.596, Time=5.28 sec\n",
" ARIMA(0,1,3)(0,0,0)[12] intercept : AIC=1174.928, Time=0.18 sec\n",
" ARIMA(0,1,3)(0,0,1)[12] intercept : AIC=1037.324, Time=0.56 sec\n",
" ARIMA(0,1,3)(0,0,2)[12] intercept : AIC=947.471, Time=1.46 sec\n",
" ARIMA(0,1,3)(1,0,0)[12] intercept : AIC=602.141, Time=0.82 sec\n",
" ARIMA(0,1,3)(1,0,1)[12] intercept : AIC=399.084, Time=2.40 sec\n",
" ARIMA(0,1,3)(2,0,0)[12] intercept : AIC=500.296, Time=2.60 sec\n",
" ARIMA(0,1,4)(0,0,0)[12] intercept : AIC=1188.947, Time=0.42 sec\n",
" ARIMA(0,1,4)(0,0,1)[12] intercept : AIC=999.240, Time=0.87 sec\n",
" ARIMA(0,1,4)(1,0,0)[12] intercept : AIC=604.133, Time=0.99 sec\n",
" ARIMA(0,1,5)(0,0,0)[12] intercept : AIC=1091.452, Time=0.53 sec\n",
" ARIMA(1,1,0)(0,0,0)[12] intercept : AIC=1298.693, Time=0.05 sec\n",
" ARIMA(1,1,0)(0,0,1)[12] intercept : AIC=1075.553, Time=0.25 sec\n",
" ARIMA(1,1,0)(0,0,2)[12] intercept : AIC=971.074, Time=0.69 sec\n",
" ARIMA(1,1,0)(0,0,3)[12] intercept : AIC=882.846, Time=2.63 sec\n",
" ARIMA(1,1,0)(0,0,4)[12] intercept : AIC=818.711, Time=4.91 sec\n",
" ARIMA(1,1,0)(1,0,0)[12] intercept : AIC=inf, Time=0.59 sec\n",
" ARIMA(1,1,0)(1,0,1)[12] intercept : AIC=414.969, Time=1.19 sec\n",
" ARIMA(1,1,0)(1,0,2)[12] intercept : AIC=402.836, Time=3.25 sec\n",
" ARIMA(1,1,0)(1,0,3)[12] intercept : AIC=429.921, Time=6.47 sec\n",
" ARIMA(1,1,0)(2,0,0)[12] intercept : AIC=inf, Time=1.76 sec\n",
" ARIMA(1,1,0)(2,0,1)[12] intercept : AIC=419.397, Time=2.89 sec\n",
" ARIMA(1,1,0)(2,0,2)[12] intercept : AIC=409.246, Time=4.10 sec\n",
" ARIMA(1,1,0)(3,0,0)[12] intercept : AIC=inf, Time=4.96 sec\n",
" ARIMA(1,1,0)(3,0,1)[12] intercept : AIC=419.507, Time=7.41 sec\n",
" ARIMA(1,1,0)(4,0,0)[12] intercept : AIC=inf, Time=11.83 sec\n",
" ARIMA(1,1,1)(0,0,0)[12] intercept : AIC=1240.963, Time=0.11 sec\n",
" ARIMA(1,1,1)(0,0,1)[12] intercept : AIC=1069.162, Time=0.45 sec\n",
" ARIMA(1,1,1)(0,0,2)[12] intercept : AIC=973.065, Time=1.21 sec\n",
" ARIMA(1,1,1)(0,0,3)[12] intercept : AIC=884.323, Time=4.46 sec\n",
" ARIMA(1,1,1)(1,0,0)[12] intercept : AIC=588.156, Time=1.52 sec\n",
" ARIMA(1,1,1)(1,0,1)[12] intercept : AIC=399.035, Time=1.88 sec\n",
" ARIMA(1,1,1)(1,0,2)[12] intercept : AIC=409.509, Time=4.49 sec\n",
" ARIMA(1,1,1)(2,0,0)[12] intercept : AIC=503.551, Time=1.88 sec\n",
" ARIMA(1,1,1)(2,0,1)[12] intercept : AIC=399.929, Time=3.30 sec\n",
" ARIMA(1,1,1)(3,0,0)[12] intercept : AIC=457.277, Time=7.70 sec\n",
" ARIMA(1,1,2)(0,0,0)[12] intercept : AIC=1196.535, Time=0.18 sec\n",
" ARIMA(1,1,2)(0,0,1)[12] intercept : AIC=1042.432, Time=0.50 sec\n",
" ARIMA(1,1,2)(0,0,2)[12] intercept : AIC=948.444, Time=1.55 sec\n",
" ARIMA(1,1,2)(1,0,0)[12] intercept : AIC=587.318, Time=1.60 sec\n",
" ARIMA(1,1,2)(1,0,1)[12] intercept : AIC=403.282, Time=1.93 sec\n",
" ARIMA(1,1,2)(2,0,0)[12] intercept : AIC=498.922, Time=3.90 sec\n",
" ARIMA(1,1,3)(0,0,0)[12] intercept : AIC=1176.484, Time=0.29 sec\n",
" ARIMA(1,1,3)(0,0,1)[12] intercept : AIC=1039.309, Time=0.94 sec\n",
" ARIMA(1,1,3)(1,0,0)[12] intercept : AIC=604.131, Time=1.21 sec\n",
" ARIMA(1,1,4)(0,0,0)[12] intercept : AIC=inf, Time=1.19 sec\n",
" ARIMA(2,1,0)(0,0,0)[12] intercept : AIC=1180.404, Time=0.09 sec\n",
" ARIMA(2,1,0)(0,0,1)[12] intercept : AIC=1058.115, Time=0.33 sec\n",
" ARIMA(2,1,0)(0,0,2)[12] intercept : AIC=973.051, Time=0.92 sec\n",
" ARIMA(2,1,0)(0,0,3)[12] intercept : AIC=883.377, Time=2.84 sec\n",
" ARIMA(2,1,0)(1,0,0)[12] intercept : AIC=inf, Time=0.60 sec\n",
" ARIMA(2,1,0)(1,0,1)[12] intercept : AIC=416.548, Time=1.59 sec\n",
" ARIMA(2,1,0)(1,0,2)[12] intercept : AIC=420.663, Time=3.27 sec\n",
" ARIMA(2,1,0)(2,0,0)[12] intercept : AIC=inf, Time=2.23 sec\n",
" ARIMA(2,1,0)(2,0,1)[12] intercept : AIC=402.478, Time=4.16 sec\n",
" ARIMA(2,1,0)(3,0,0)[12] intercept : AIC=inf, Time=6.51 sec\n",
" ARIMA(2,1,1)(0,0,0)[12] intercept : AIC=990.719, Time=0.26 sec\n",
" ARIMA(2,1,1)(0,0,1)[12] intercept : AIC=881.526, Time=1.10 sec\n",
" ARIMA(2,1,1)(0,0,2)[12] intercept : AIC=837.402, Time=3.23 sec\n",
" ARIMA(2,1,1)(1,0,0)[12] intercept : AIC=584.045, Time=2.20 sec\n",
" ARIMA(2,1,1)(1,0,1)[12] intercept : AIC=443.982, Time=2.03 sec\n",
" ARIMA(2,1,1)(2,0,0)[12] intercept : AIC=501.152, Time=2.59 sec\n",
" ARIMA(2,1,2)(0,0,0)[12] intercept : AIC=988.094, Time=0.50 sec\n",
" ARIMA(2,1,2)(0,0,1)[12] intercept : AIC=757.710, Time=2.77 sec\n",
" ARIMA(2,1,2)(1,0,0)[12] intercept : AIC=595.703, Time=3.85 sec\n",
" ARIMA(2,1,3)(0,0,0)[12] intercept : AIC=1140.469, Time=0.95 sec\n",
" ARIMA(3,1,0)(0,0,0)[12] intercept : AIC=1126.139, Time=0.39 sec\n",
" ARIMA(3,1,0)(0,0,1)[12] intercept : AIC=996.923, Time=0.66 sec\n",
" ARIMA(3,1,0)(0,0,2)[12] intercept : AIC=918.438, Time=1.53 sec\n",
" ARIMA(3,1,0)(1,0,0)[12] intercept : AIC=inf, Time=0.88 sec\n",
" ARIMA(3,1,0)(1,0,1)[12] intercept : AIC=406.495, Time=2.17 sec\n",
" ARIMA(3,1,0)(2,0,0)[12] intercept : AIC=inf, Time=3.32 sec\n",
" ARIMA(3,1,1)(0,0,0)[12] intercept : AIC=989.496, Time=0.51 sec\n",
" ARIMA(3,1,1)(0,0,1)[12] intercept : AIC=856.486, Time=1.64 sec\n",
" ARIMA(3,1,1)(1,0,0)[12] intercept : AIC=604.951, Time=0.94 sec\n",
" ARIMA(3,1,2)(0,0,0)[12] intercept : AIC=991.558, Time=1.11 sec\n",
" ARIMA(4,1,0)(0,0,0)[12] intercept : AIC=1125.025, Time=0.18 sec\n",
" ARIMA(4,1,0)(0,0,1)[12] intercept : AIC=987.621, Time=0.50 sec\n",
" ARIMA(4,1,0)(1,0,0)[12] intercept : AIC=inf, Time=1.05 sec\n",
" ARIMA(4,1,1)(0,0,0)[12] intercept : AIC=988.660, Time=1.00 sec\n",
" ARIMA(5,1,0)(0,0,0)[12] intercept : AIC=1113.673, Time=0.22 sec\n",
"\n",
"Best model: ARIMA(0,1,2)(1,0,1)[12] intercept\n",
"Total fit time: 343.809 seconds\n"
]
}
],
"source": [
"start_time = time.time()\n",
"sarima_model = auto_arima(X_train_arima,\n",
" start_p=2, d=None, start_q=1, max_p=10, max_d=10, max_q=10,\n",
" start_P=2, D=None, start_Q=1, max_P=10, max_D=10, max_Q=10, m=12,\n",
" suppress_warnings=True, stepwise=False, seasonal=True,\n",
" error_action='ignore', trace=True, n_fits=50)\n",
"sarima_time = time.time() - start_time\n",
"autosarima_y_pred = sarima_model.predict(n_periods=12)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Auto ARIMA Models MAPE"
]
},
{
"cell_type": "code",
"execution_count": 41,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"auto arima mape = 0.0032060326207122916\n",
"auto sarima mape = 0.0007347495325972257\n"
]
}
],
"source": [
"from flaml.ml import sklearn_metric_loss_score\n",
"print('auto arima mape', '=', sklearn_metric_loss_score('mape', y_test, autoarima_y_pred))\n",
"print('auto sarima mape', '=', sklearn_metric_loss_score('mape', y_test, autosarima_y_pred))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Compare All"
]
},
{
"cell_type": "code",
"execution_count": 42,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"flaml mape = 0.0005706814258795216\n",
"default prophet mape = 0.0011396920680673015\n",
"auto arima mape = 0.0032060326207122916\n",
"auto sarima mape = 0.0007347495325972257\n"
]
}
],
"source": [
"from flaml.ml import sklearn_metric_loss_score\n",
"print('flaml mape', '=', sklearn_metric_loss_score('mape', y_test, flaml_y_pred))\n",
"print('default prophet mape', '=', sklearn_metric_loss_score('mape', prophet_y_pred, y_test))\n",
"print('auto arima mape', '=', sklearn_metric_loss_score('mape', y_test, autoarima_y_pred))\n",
"print('auto sarima mape', '=', sklearn_metric_loss_score('mape', y_test, autosarima_y_pred))"
]
},
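{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Optional summary (a sketch): collect the MAPE of every method computed above\n",
"# into a single sorted Series for an easier side-by-side comparison.\n",
"import pandas as pd\n",
"from flaml.ml import sklearn_metric_loss_score\n",
"\n",
"mapes = {\n",
"    'flaml': sklearn_metric_loss_score('mape', flaml_y_pred, y_test),\n",
"    'default prophet': sklearn_metric_loss_score('mape', prophet_y_pred, y_test),\n",
"    'auto arima': sklearn_metric_loss_score('mape', autoarima_y_pred, y_test),\n",
"    'auto sarima': sklearn_metric_loss_score('mape', autosarima_y_pred, y_test),\n",
"}\n",
"print(pd.Series(mapes).sort_values())"
]
},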
{
"cell_type": "code",
"execution_count": 43,
"metadata": {},
"outputs": [
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAYUAAAEGCAYAAACKB4k+AAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjAsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy8GearUAAAgAElEQVR4nOyde1zP1x/Hn58uVEg3l9wqkugeFSpquc81mrsw1/YzYxc2tpnNmBmb2eYec8l1wtjcc0kuRahkXZQuLl10v3+/5/dHNG0uGQl9no/H96E+533O9/VJfd+fc3sdSQiBjIyMjIwMgEpVC5CRkZGReXmQk4KMjIyMTBlyUpCRkZGRKUNOCjIyMjIyZchJQUZGRkamDLWqFvAsGBgYCGNj46qWISMjI/NKERISkiqEqPewslc6KRgbGxMcHFzVMmRkZGReKSRJin9UmTx8JCMjIyNThpwUZGRkZGTKqLSkIEmShiRJ5yRJuiRJUrgkSV/cu75VkqTQe684SZJC/1GvmSRJOZIkfVBZ2mRkZGRkHk5lzikUAm8IIXIkSVIHTkmS9IcQYvD9AEmSvgMy/1FvCfBHJeqSkXktKC4uJjExkYKCgqqWIvOSoqGhQZMmTVBXV69wnUpLCqLUVCnn3rfq915lRkuSJEnAW8AbD1zrD8QCuZWlS0bmdSExMZE6depgbGxM6Z+TjMzfCCFIS0sjMTERExOTCter1DkFSZJU7w0P3QEOCSHOPlDsCtwWQkTdi60FzAC+eEKbEyRJCpYkKTglJaWypMvIvPQUFBSgr68vJwSZhyJJEvr6+k/dk6zUpCCEUAghbIEmgKMkSZYPFA8F/B74/gtgiRAih8cghFgphGgnhGhXr95Dl9nKyFQb5IQg8zj+y+/HC9mnIITIkCQpAOgBhEmSpAZ4Am0fCHMCBkmStBDQAZSSJBUIIZa9CI0yLzcKpYIjN46gq6GLQ0OHqpYjI/PaUpmrj+pJkqRz72tNoAsQea+4CxAphEi8Hy+EcBVCGAshjIHvga/lhCADcCnlEsP2D+P94+8z9sBYph6dSmJ24pMryrwQdu3ahSRJREZGPjH2+++/Jy8v7z+/17p16/jf//5X4evPQmW0+SpQmcNHhsAxSZIuA+cpnVP4/V7ZEMoPHcnI/IvU/FRmn5rNyH2j0IxtyLtxi/G5M49LcRH08+/HsovLyC/Jr2qZ1R4/Pz9cXFzYsmXLE2OfNSnIVD6VlhSEEJeFEHZCCGshhKUQYu4DZaOFEMsfU3eOEGJRZWmTebkpUZawMWIj/XYOICYojQnhC3CI6EctUQeVBG2GXppF/4KxrLy0kn7+/TgYdxD5BMGqIScnh8DAQNasWVMuKSgUCj744AOsrKywtrbmxx9/ZOnSpSQnJ+Pu7o67uzsAtWvXLquzY8cORo8eDcDevXtxcnLCzs6OLl26cPv27QprSklJYeDAgTg4OODg4EBgYCBKpRJjY2MyMjLK4kxNTbl9+/ZD46szr7T3kczrx/lb5/km8Ftq/tUQr9szUS/QpIGJNm1HGLHjZir6kh71ruVREtKa6Y2Wcqz5Zt4//j5Ohk587PgxLXRaVPUtVAlf7A0nIjnrubbZppE2n/exeGyMv78/PXr0wMzMDD09PS5cuIC9vT0rV67k+vXrXLx4ETU1NdLT09HT02Px4sUcO3YMAwODx7br4uLCmTNnkCSJ1atXs3DhQr777rsK6Z46dSrTpk3DxcWFGzdu0L17d65evUq/fv3YtWsXY8aM4ezZsxgbG9OgQQOGDRv20PjqipwUZF4KbuXe4vvAH7l9rgjnW97UKNGkibkubXsa09hMh58DYvgpIBaAIe2a4O3UmqAd0TidHkIn+z6sv/MdA/cMZKj5UHxsfahTo04V31H1wM/Pj/feew+AIUOG4Ofnh729PYcPH2bSpEmoqZV+xOjp6T1Vu4mJiQwePJibN29SVFT0VO
vsDx8+TERERNn3WVlZZGdnM3jwYObOncuYMWPYsmULgwcPfmx8dUVOCjJVSpGiiF/Pb+bCoTjMbrrQVFkTI2s9HHo2p4GJNgCHI26z6OA1+to0wkhfix+PRpOYUcCSmW25si+eq6dv8rb+VyTZh7Dp6hr2X9/Pe/bv0c+0HypS9bD3etITfWWQlpbG0aNHCQsLQ5IkFAoFkiSxcOFChBAVWg75YMyD6+mnTJnC9OnT6du3LwEBAcyZM6fCupRKJUFBQWhqapa73qFDB6Kjo0lJScHf35/Zs2c/Nr66Uj3+YmReSo6GneLT+T+Tt74RbZJdMbYxYMinjvT2sS1LCFG3s3lvaygWjbRZOMia97u14ttB1pyJTWPY+vO07G1Ev2l2qKqqUveQFZ/lL8ekhimfnf6MkftHEpYaVsV3+fqyY8cORo0aRXx8PHFxcSQkJGBiYsKpU6fo1q0by5cvp6SkBID09HQA6tSpU+4pvEGDBly9ehWlUsmuXbvKrmdmZtK4cWMA1q9f/1S6unXrxrJlfy9cDA0ttVeTJIkBAwYwffp0Wrdujb6+/mPjqytyUpB54URExTD/6/WEL8unSVIb6tvVYOTcjvSf6IB+478nHjPzihn/azAa6qqsHNkODXVVALzaNeXXsY7czipgwM+BpGpJDPnUkXa9jEm5XIjrKW8+0VlIck4yw/YN4/PTn5OWn1ZVt/va4ufnx4ABA8pdGzhwIJs3b2bcuHE0a9YMa2trbGxs2Lx5MwATJkygZ8+eZRPNCxYsoHfv3rzxxhsYGhqWtTNnzhy8vLxwdXV94vzDP1m6dCnBwcFYW1vTpk0bli//e03L4MGD2bhxY9nQ0ZPiqyPSq7xqo127dkI+ZOfVISE6hT07TkNcHYpVCtGwLmCwVxd09f89/l+iUDJm3XnOxKbhN7497Yz1IPow1NSGpo4ARN/JYcy6c9zJKuT7wbb0tDIkLSmHgE2R3IrNwrCVNnE2QWxIXIummiY+tj4MMR+CmsrrMWp69epVWrduXdUyZF5yHvZ7IklSiBCi3cPi5aQgU6kIIUi6dpdD/hfJi5MoUM2loPVNRrzVC6P6TR5Zb96+CFadvM58TyuG2ujC/o/gUunTJs3dwf0TaOpIWk4h438N5sKNDD7uac6ETs1BQPjJJE7vikGpELToUpcdGis4fSsQUx1TPnb8GEdDxxf0E6g85KQgUxHkpCDzUiCUgriwNE7/fo2MG4XkqmeS1PwKQz170N7o8R/Iv11IZPq2S4zqYMTctoWw823IiAfX90t7CoE/QF4qtPAA908oaGDH+9svse/yTYY6NmNuPwvUVVXIuVvIyW1/EXsxBf3GtajdNYdlSd+SlJNEN6NufNDuAwxrGz5Wy8uMnBRkKoKcFGSqFKVCSfSFOwT/Ecfd5Dyya6YR0fQUXbs5MMRyMOoqj/d1D03I4K0VQbRtWoeN5kGoHl8AdQzBcyUYdSwNKsyB86sgcCnkp0PLbig7f8x34Vr8dCwG15YG/DTcHm2N0veKDU3hxJa/yM0spE1nQ662OM7aa6uRkBhnNY7RlqOpqVqzsn80zx05KchUBDkpyFQJimIlkWducuHgDbJS8smqlcJ5wz+xcGrK1HbvYqD55MnCO1kF9Fl2isZSOlvr+6KecBosPKH3EtDU+XeFwmw4txJO/wj5d8GsJwfrj8HnqILm9WqxdrQDTXS1ACjKL+GMfwxXTiRRW6cmlv0N2JS/gkPxh2hSuwkfOXyEW1O3V8p1VE4KMhVBTgoyL5TiQgXhJ5MIPXSD3MwicnVSOdnAH80WJczqMAubejYVaqegWMHQVWcwunmQ7zR9UUUBvb4Fm6HwpA/qgiw4uwKCfoSCTNKadGFiYjfi1FqwxrsdNk3/Tii3YjM5tjGS9ORcWtjXo5ZbDosiFhCbGYtzY2dmOszEuK7xM/xEXhxyUpCpCHJSkHlh3L2Vy67vLpCfXUyJYRYHdDeRVe8mU9tOxdPUE1UV1Q
q1I4Rg1tYz2IQtYLBaADRuC56rQP8pLSsKMuHMcgj6CQozCVBpz+JiT3wG96OHZcOyMEWJkouHbhC8Lw5VdRUc+xsTqh/A8su/UKAoYGSbkUy0nkgt9VpP9/4vGDkpyFSEp00K8j4Fmf+EUqHksG8EhSWFHLFfwxrjz+joaMPvA37Hy8yrwgkBYM8f+xgXMZq31I6D6wcw9sDTJwQAjbrgNgPeuwydZ9BJLZw9qh+h3DqS7fsPlJnmqaqp0K6nMUM+daRes9qc8oumzp8WbOq4nTdN3sQ3zJc+u/qwN2avbLT3BFRVVbG1tS17xcXFERAQQO/evR9Zx8bGhqFDh5a7Nnr0aLS0tMptbJs6dSqSJJGamgqUN897GIWFhXTp0gVbW1u2bt36DHf1/Pj666+rWsJTIycFmf9EyJ/x3InP5s+m66nVRIUtvbcwu/1sdDQeMvb/KJQK4vy/otfZUeioKxCj9oLHp6Ba8UPGH4qmDrh/gsp7lylxfp831K4w8OxgrvwwkJJbfxud6TTQot97drwxypz05FwOfBtF37tj+bXbBupr1eeTU5/wv6P/QymUz6bnNUZTU5PQ0NCyl7Gx8WPj7+9ePnHiBLm55Y9iNzU1Zffu3UCp9cSxY8fKdjVXhIsXL1JcXExoaGi5zWmPQ6FQVLj9/4KcFGSqBSk3sjm/7zox9S7SwFKT9T3X00a/zdM1kplEwdo+GId+y+ka7akx5QwqzV2fr1AtPdS6fkaN98M523gULe6eQmV5B4q3jYXUKKDU+qB1x0YMm9OeFvb1Ob8vjojlBSwy+5mp9lM5kXiC36J+e766qjGbN29m5MiRdOvWjT179pQrGzp0aNkTfkBAAM7OzmWGek/izp07jBgxgtDQUGxtbYmJieHIkSPY2dlhZWXF2LFjKSwsBMDY2Ji5c+fi4uLC9u3bOXjwIB06dMDe3h4vLy9yckpPBD5//jwdO3bExsYGR0dHsrOziYuLw9XVFXt7e+zt7Tl9+jQAN2/epFOnTtja2mJpacnJkyeZOXMm+fn52NraMnz48Of1I6x0Xo+tnTIvjJJiBYd8wylQz+VKq4P4uW56etO5iD2IPVMQBQV8LvkwduIsaus8fmjgWVCprU+HCUvxPzWW2wcW4R3xO2pXdyFZeUHnGaDfAi3tGnR724JW7RtyfPM1di8OpY2LM06651kSsgT3pu7oa+pXmsZn5o+ZcOvK822zoRX0XPDYkPsfegAmJibl/IsextatWzl06BDXrl1j2bJl5YaRWrZsye7du7l79y5+fn6MGDGCP/74o0JS69evz+rVq1m0aBG///47BQUFuLm5ceTIEczMzBg1ahS//PJLmaOrhoYGp06dIjU1FU9PTw4fPkytWrX45ptvWLx4MTNnzmTw4MFs3boVBwcHsrKy0NTUpH79+hw6dAgNDQ2ioqIYOnQowcHBbN68me7duzNr1iwUCgV5eXm4urqybNmyV85LSe4pyDwV5/Zc5+7NPA6bbOBTt1kVWmpaRlEu7JkC20YSLxrQu+hrug1/HyODyksID9LfxRaLUd/TnWVslHqjDN8Ny9rBrsmQXmrLbWShz9DPnLDt2oyrgTdxOTecwqIiFocsfiEaXzUeHD56UkI4f/489erVw8jICA8PDy5cuMDdu3fLxXh6erJlyxbOnj2Lq+t/7zleu3YNExMTzMzMAPD29ubEiRNl5feHl86cOUNERATOzs7Y2tqyfv164uPjuXbtGoaGhjg4lJ4Hrq2tjZqaGsXFxYwfPx4rKyu8vLzKLLcdHBzw9fVlzpw5XLlyhTp1Xl3rdrmnIFNhbkZncPHQDSLqB9K5Y1s6NelU8crJoaU7k9NiONfYm+ExHnzc2xpn06czO3tWXFoasHpyT8asM2BFTi82mp/BOHwLXN5auvy10weo65ngPNCUxmY67PvpMt7Np7EyZj79WvR7ee0xnvBE/zLg5+dHZGRk2bxDVlYWO3fuZNy4cWUxQ4YMwd
7eHm9vb1RU/vsz65MWCNSqVassrmvXrvj5lT8d+PLlyw/ds7JkyRIaNGjApUuXUCqVaGhoANCpUydOnDjBvn37GDlyJB9++CGjRo36z/qrErmnIFMhigpKOOAbRo5GOml2V5nWdlrFKiqVpbYUq7tAUR5nXX15K6Y7/doaM8bZuFI1P4qWDeqwy8cZ/YbNcA/rxqb2exCO4+HK9tKew54pcDceYysDjK30qXm5Mc1rmPHV2a8oVhRXieZXHaVSyfbt27l8+TJxcXHExcWxe/fuf30YN2vWjHnz5uHj4/NM72dubk5cXBzR0dEAbNiwgc6dO/8rrn379gQGBpbF5eXl8ddff2Fubk5ycjLnz58HIDs7m5KSEjIzMzE0NERFRYUNGzaUTVTHx8dTv359xo8fz9tvv82FCxcAUFdXp7j41fqdkZOCTIUI3BlNTlohp8y2M/+NedRQrfHkSlnJsKE/HPoMWvXgmuefjA7QxK6ZDvMGWFbp7uF6dWqyZXx7elg0ZNbhVD4tHEHJ/y5Au7FwaQv8aA97p9Kxex0URUqGZk/heuZ11oWvqzLNrxJHjhyhSZMmZa9vvvmGxo0bl1tN1KlTJyIiIrh582a5uhMnTqRFi38vSc7LyyvX5uLFjx7S09DQwNfXFy8vL6ysrFBRUWHSpEn/iqtXrx7r1q1j6NChWFtb0759eyIjI6lRowZbt25lypQp2NjY0LVrVwoKCvDx8WH9+vW0b9+ev/76q6zHERAQgK2tLXZ2duzcuZOpU6cCpVbh1tbWr9REs7x5TeaJ3IhIY+/SS1wyPEbPYfYMaDngyZUi98Hu/0FJAfRYQHqrIfT9KZBihZK9/3OhvrZG5QuvAEqlYOGBayw/HkNns3osG2ZHncLbcHIxXPgVtPQ52eg3rpy8zY3uJzic8zu7+u2iaZ2mVS1d3rwmUyHkzWsyz5WC3GIOrLvCXc1b1HUtpL9p/8dXKMqDve/BlmGg0xQmnqDYdiQ+my9wJ7uQFSPbvTQJAUBFRWJmT3Pme1pxKjoVr+VBJAt96L0YvPdCzi0cDA5QQ1ONtjE9UZPUmHd2nrypTea1RU4KMo/lmF8EBdklXLY6wKcusx8/5HPzEqzsDCG+4DwV3j4MBi356vcIzsSms8DTCtumT7G57QUy1LEZ68Y4kHQ3n/4/BXIlMROMOkCrXmiELMGhWwPuROUySecDApMCORR/qKoly8hUCnJSkHkk0RfuEBucxsXGh/ik9zS0a2g/PFCphNPLYJVHqXPpqN3QdS6o1WDLuRusD4pnvKsJnvaPPlTnZcC1ZT12+nREXVWFt1YEcTD8FrzxKRRmY8kmdBpooX6uMW10LPjm3DfkFOVUtWQZmedOpSUFSZI0JEk6J0nSJUmSwiVJ+uLe9a2SJIXee8VJkhR677rjA9cvSZJUgYFrmcoiL6uIQxsuc6fWDRx7Nce2vu3DA7NvwaaBcHAWmHWHyaehuRsAwXHpfLo7DNeWBszoYf7CtD8LZg3qsOudjpg1qM3EjSH8GqsF1oNRDV6Bc089Mm7nM0ZlOin5KfwU+lNVy5WRee5UZk+hEHhDCGED2AI9JElqL4QYLISwFULYAjuB+x4CYUC7e9d7ACskSZL3UVQBQgj2r79AcYGCO06hjLcd9/DAW1fgl44QHwS9v4fBG0FLD4DkjHwmbbxAYx1Nlg21R0311emU1q+jwZYJHXBvVZ+5eyO41XYaKBUY3fmZJua6JB4rYLDxMDZHbiYiLaKq5crIPFcq7S9VlHK/f61+71U2OyeVDk6/Bfjdi88TQpTcK9Z4MFbmxRIRlMTt8Dwumxzl8zdnPtzx9G4cbBwIahow8Ti0G1N27kFBsYKJG0IoKFawalQ76mo9o8FdFaBZQ5Uv+1sCsPySAtqORgrdgEtXDYryS2if3Bvdmrp8GfQlCmXlmqrJyLxIKvXxTZIk1XvDQ3eAQ0KIsw8UuwK3hRBRD8Q7SZIUDlwBJj2QJB5sc4IkSc
GSJAWnpKRUpvxqSXZ6Acf8rpJcJ5ohb3WjYa2G/w7KTYUNnlBSCCN+g3qtyoqEEMzYeZmw5Ey+H2xLywav7nb/xjqa9LdrzJbzN0hr9y6o1kD/6iJauzTir5MpvNd8BmFpYWz/a3tVS60y7ltnW1pa4uXlRV5e3jO3GRcXh6Wl5VPV8ff3L7Oc+CcpKSk4OTlhZ2fHyZMnn1nfs5KRkcHPP/9c1TIeSaUmBSGE4t5wUBPAUZKkB/+nh3Kvl/BA/FkhhAXgAHwsSdK/1i4KIVYKIdoJIdrVq1evMuVXO4RS8NuqIEoUJdTqlkkXY49/BxXmwKZBpRvThm2D+uXnClaciGV3aDIfdGtFlzYNXpDyymNS5xYUlihZG5oHTpMgbAdOjgWo1lBB/Vwj2hu254cLP5Can1rVUquE+95HYWFh1KhRg+XLl5crr2xr6vs8LikcOXIEc3NzLl68WGE/pcrUXa2Twn2EEBlAAKVzBdybK/AEHnoShhDiKpALPN3jgswzcebINXKuC6JbBzLdfcq/A0qKYNtIuHkZvHyhmVO54mORd/jmz0jetDbEx+0/HJLzEmJavzY9LBrya1A8We18QKMuWufm0a6nMXFX0pikP50iRRELzy+saqlVjqurK9HR0QQEBODu7s6wYcOwsrKioKCAMWPGYGVlhZ2dHceOHQNg3bp19OvXjx49etCqVSu++OKLsrYUCgXjx4/HwsKCbt26kZ+fD0BMTAw9evSgbdu2uLq6EhkZyenTp9mzZw8ffvhhmW32fUJDQ/noo4/Yv38/tra25Ofn4+fnh5WVFZaWlsyYMaMstnbt2nz22Wc4OTkRFBTExo0bcXR0xNbWlokTJ5Ylij///BN7e3tsbGzw8Ch9cDp37hwdO3bEzs6Ojh07cu3aNQDCw8PL2rC2tiYqKoqZM2cSExODra0tH374YeX+p/wHKm0iV5KkekCxECJDkiRNoAvwzb3iLkCkECLxgXgTIEEIUSJJkhHQCoirLH0y5Um/nUOw/w2SdaN5d/goNNT+0UlTKmH3OxBzFPr9BK16liuOScnh3S0Xad1Qm28HWVephcXzxsfNlD/CbrExNBMf5/fgyBdYt08iTF+DqP3ZvP3mOH658jP9TfvTsVHHKtH4zblviEyPfK5tmuuZM8NxxpMDgZKSEv744w969OgBlH5IhoWFYWJiwnfffQfAlStXiIyMpFu3bvz111/l4rS0tHBwcODNN9/EwMCAqKgo/Pz8WLVqFW+99RY7d+5kxIgRTJgwgeXLl9OyZUvOnj2Lj48PR48epW/fvvTu3ZtBgwaV02Vra8vcuXMJDg5m2bJlJCcnM2PGDEJCQtDV1aVbt274+/vTv39/cnNzsbS0ZO7cuVy9epVvvvmGwMBA1NXV8fHxYdOmTfTs2ZPx48dz4sQJTExMSE9PL/1ZmZtz4sQJ1NTUOHz4MJ988gk7d+5k+fLlTJ06leHDh1NUVIRCoWDBggWEhYW9tJbalbm6xxBYL0mSKqU9km1CiN/vlQ3hH0NHgAswU5KkYkAJ+Aghqmef/AWjVAq2Lj9JMQosBxpgpmdWPkAIODgbrmwDj8/AbkS54sz8YsavD6aGqgorR7VFq8brtWjMqkldXFsasPbUdcZOH4/G2RWoHZ9LxwHrObA6HJecXuzX3sfXZ79mZ9+d1FStWdWSXxgPnqfg6urK22+/zenTp3F0dMTExASAU6dOMWVKac/T3NwcIyOjsqTQtWtX9PVLz6nw9PTk1KlT9O/fHxMTk7J227ZtS1xcHDk5OZw+fRovL6+y979/cE5FOX/+PG5ubtwfeh4+fDgnTpygf//+qKqqMnDgQKB0yCkkJKTMOjs/P5/69etz5swZOnXqVHZvenqlq+0yMzPx9vYmKioKSZLKTPA6dOjAvHnzSExMxNPTk5YtWz6V3qqg0v56hRCXAbtHlI1+yLUNwIbK0iPzaA7sPo/yZk3SnIKZ1vYh3dnTS+HMT+A4EVymly
tSKAXvbbnIjfQ8No1zoomu1gtS/WLxcTNl6KozbL+UxsjOH8K+92nhfAlDU31C9t5gps8nTD4xkbVX1jLZdvIL11fRJ/rnzf05hX9y3ygOHm9j/c8e5f3va9b8O7GqqqqSn5+PUqlER0fnmZ6wH6dFQ0MDVVXVsjhvb2/mz59fLmbPnj0P7QV/+umnuLu7s2vXLuLi4nBzcwNg2LBhODk5sW/fPrp3787q1atp3rz5f9b/Inh1Fo/LVAqJ8alEH8wgsf5VPhgy/t+/8KF+pS6nFgOgx4KyZaf3+fbANY5dS2FOXwucmr/EJ5M9I+2b62HfTIflx2MpthkBusZIR+fi7NmC/OxiVEPr08ukF6uurCI+K76q5b5UdOrUiU2bNgHw119/cePGDVq1Kl2xdujQIdLT08nPz8ff3x9nZ+dHtqOtrY2JiQnbt5eu9hJCcOnSJQDq1KlDdnb2E7U4OTlx/PhxUlNTUSgU+Pn5PdRS28PDgx07dnDnzh0A0tPTiY+Pp0OHDhw/fpzr16+XXYfSnsJ9B9h169aVtRMbG0vz5s1599136du3L5cvX66w1qpCTgrVGEWJkh3LAylQy6ePdzt0NXXLB0QdKp1HMOkMA1bAA4eeKJWCbw9Esvx4DMOdmjGivdELVv9ikSQJHzdTkjLy2RuWCu6z4PYVGuQeplX7hoQeSWBy86loqGrw1ZmvZMO8B/Dx8UGhUGBlZcXgwYNZt25dWU/AxcWFkSNHYmtry8CBA2nX7qHGnWVs2rSJNWvWYGNjg4WFBbt37wZKD+f59ttvsbOzKzfR/E8MDQ2ZP38+7u7u2NjYYG9vT79+/f4V16ZNG7766iu6deuGtbU1Xbt25ebNm9SrV4+VK1fi6emJjY1N2QluH330ER9//DHOzs7lVi5t3boVS0tLbG1tiYyMZNSoUejr6+Ps7IylpeVLOdEsW2dXYzatP0hGkBqKbvG86zmmfGFiMKzvA/qmMHofaPzte5RbWMK0raEcjLjNEIemfNnfEvVXaMfyf0WpFPRaepISpeDgVBdUVrhCST45w0+y6YtgjCwNyOwUzldnv+Ib12/o1bxXpep51a2z161bVzYBLFN5yNbZMhXicngU6UEq3Gn6F5P7l/mMw6wAACAASURBVJ84JuUv2OQFtRvAiJ3lEkLi3TwG/nKaw1dv83mfNsz3tKoWCQFKbbYnu7Ug+k4OhyJTwONTSI+l9vVt2HUzIubCHZxVumKpb8nC8wvJKsqqaskyMk9N9fhrlilHfkEBB9deIa9mJmMm9EJd5QEbiqxk2OgJKqow8jeoXb+sKDgunX7LAknKyGfdGEfGOJu8VktPK8KbVoY009Pi54AYRMvu0MQRji/Ezq0etXRqErQzhtlOs7lbeJelF5ZWtdyXmtGjR8u9hJcQOSlUQ1av3oNmrg4tB9TGuF6zvwvyM0r9jPLvwvAdoPf3KontwQkMXXWGOhpq7PJxppNZ9dxNrqaqwsTOzbmUkMHp2HToMgeyb6J+aQ0d+jfnTnw2ajF6DDMfxrZr27iScqWqJcvIPBVyUqhmHAkKQiXMgNxWiXi6df+7oDgf/IZCahQM2QSNSteIK5SCefsi+HDHZRxN9PB/xxnT+rWrSP3LwUD7JtSrU5OfA6LB2BlMu8CpJZhZa1LfqA5B/rFMaD2Jepr1+PLMl5Qo/2XhJSPz0iInhWpESkYaIVtvkqt1l0kTHjiuQqmAnePgRhB4rig7DyG7oJhx68+z6uR1vDsYsW6MIzpaNapE+8uEhroq411NCIxOIzQho3RDX/5dpDPLcPZqSW5GIVHH05nhOIOr6VfZErmlqiXLyFQYOSlUE4QQrFyxB80CbVxHtkC7Vp37BbBvOkT+Dj2/AcvSHZ3xabl4/nyak1GpfNXfki/6VY8VRhVlmJMRdTXV+flYNBjalO7jCPqZRg2LaGFfn4sH4+mg7YpLYxd+vPgjt3NvV7VkGZkKIf+VVxM2/LELnetGaLTNoY
PdA6eoBSyAkHXg+j44TQQgKCaNfj8FkpJTyK9vO772exD+C7VrquHd0ZiDEbeJup0N7rOhpABOfkdHzxYolYKzu6/zidMnKISCb85/8+RGX1F27dqFJElERlbMe+n777+vsMX2xYsXkSSJAwcOPDauV69eZGRkVKjNp+HDDz/EwsLipdlPsG7dOpKTkyv1PeSkUA0IS7zK7T9Uya+bwWjv3n8XnF8NxxeUehm98SkAm8/eYOSasxjUrom/jzMdWxhUkeqXnzEdjdFUV+WXgBgwMAW74RC8Fm3VO9h6NOXa2VvUSKvLROuJHIo/xMnEqvfyrwz8/PxwcXFhy5aKDZM9TVK437af3z+t0koRQqBUKtm/fz86OjoV1lxRVqxYwYULF/j2228rFF9SUrnzR3JSkHlmCkoK2Lr6BBolWvSf4ID6fbO6iN2w7wMw6wG9f6BEKZizJ5xPdl3BpaUBv/l0xNig1uMbr+bo1qrBUMdm7L6UTEJ6HnSeCUgQsIC2PYzRrKNO4PYovNt407xuc+adnUdBSUFVy36u5OTkEBgYyJo1a8olhYCAAHr3/vsB5H//+x/r1q1j6dKlJCcn4+7ujru7O8AjrayFEOzYsYN169Zx8OBBCgpKf3ZxcXG0bt0aHx8f7O3tSUhIwNjYmNTUVOLi4jA3N2fcuHFYWloyfPhwDh8+jLOzMy1btuTcuXPAo62uH6Rv377k5ubi5OTE1q1biY+Px8PDA2trazw8PLhx4wZQurR2+vTpuLu7M2PGjIfaewPcvn2bAQMGYGNjg42NDadPnwagf//+tG3bFgsLC1auXAmUWoePHj0aS0tLrKysWLJkCTt27CA4OJjhw4eX2YBXBq+XnaXMv/hhmy8Nb7XC0F0N0xZNSy9eP1k6sdzUEQb5klkoeGfzeU5FpzLe1YSZPVujqlK99h/8V8Z3MmHDmThWnogtPb7TcTyc+ZkazlNx6tucgE3XuHEpg9ntZzP2wFhWXl7Ju/bvPncdt77+msKrz9c6u2Zrcxp+8sljY/z9/enRowdmZmbo6elx4cIF7O3tHxn/7rvvsnjxYo4dO4aBgcFjrawDAwMxMTGhRYsWuLm5sX//fjw9PQG4du0avr6+Dz2sJjo6mu3bt7Ny5UocHBzYvHkzp06dYs+ePXz99df4+/s/0ur6Qfbs2UPt2rXLDPj69OnDqFGj8Pb2Zu3atbz77rv4+/sDpZ5Ohw8fRlVVFQ8Pj4fae7/77rt07tyZXbt2oVAoyMkpPa147dq16OnpkZ+fj4ODAwMHDiQuLo6kpCTCwsKA0oN5dHR0WLZsGYsWLXqiHcizIPcUXmP+DDuCWmATlPVz6T/IpfTirSuwZVjpHoShW4jJVNL/50DOXk9j4SBrZr3ZRk4IT4FhXU087ZqwLTiBlOzCUhdZ9Vpw9EtaOzdCv3FtTv8WjZ2+PX1b9MU33JfYjNiqlv3c8PPzY8iQIUCp/9CjhnkexYNW1mpqamVW1k9q28jIiPbt2z+0TRMTE6ysrFBRUcHCwgIPDw8kScLKyoq4uDig1MDOy8sLS0tLpk2bRnh4+BO1BgUFMWzYMABGjhzJqVOnysq8vLxQVVUtZ+99/3CemzdvAnD06FEmTy510FVVVaVu3boALF26FBsbG9q3b09CQgJRUVE0b96c2NhYpkyZwp9//om2tjYvCrmn8JpyM+cmJzdH0UCYMHRyB1RUVeBuXOnmtJp1YMROTiYpeGdTIOqqKmwe3x4HY72qlv1KMsmtBdtDElgbeJ0ZPcyh4xQI+BqV5As4e5my5/tQLh9NZHqn6QQkBPDlmS9Z233tc90N/qQn+sogLS2No0ePEhYWhiRJKBQKJEli4cKFqKmpoVQqy2LvD/38k0d5rykUCnbu3MmePXuYN28eQgjS0tLK3EUftOb+Jw/abquoqJR9r6KiUjbm/yir66fhwf+/+3qe1t47ICCAw4cPExQUhJaWFm5ubhQUFKCrq8ulS5c4cOAAP/
30E9u2bWPt2rVPrfG/IPcUXkMUSgWLN66lUboZ1n0MMTDUhtxU2OAJJYWIETtZF1bMaN/zNNLRxP8dZzkhPAMmBrXoaWXIhqB4MvOLoYMPaBnAkS9oaq6HsbUBwX/EoVlch2ltpxF8O5i9sXurWvYzs2PHDkaNGkV8fDxxcXEkJCRgYmLCqVOnMDIyIiIigsLCQjIzMzly5EhZvQetox9lZX348GFsbGxISEggLi6O+Ph4Bg4cWDZc86w8yur6cXTs2LFs3mTTpk24uLj8K+Zx9t4eHh788ssvQGnSy8rKIjMzE11dXbS0tIiMjOTMmTMApKamolQqGThwIF9++SUXLlwAKm4R/izISeE1ZFXgOhpdtqNmsxI6d7eCwpxSg7usZIoHb+GTUyXM2RuBe6v67JjckaZ6r+fBOC+SyZ1bkFNYwsYz8aU9Mdf34fpxiA3AeaApiiIlZ/fG4tnSE5t6Niw6v4jMwsyqlv1M+Pn5MWDAgHLXBg4cyObNm2natClvvfUW1tbWDB8+HDu7v8/bmjBhAj179sTd3f2RVtaPa/t58Cir68exdOlSfH19sba2ZsOGDfzwww8PjXuUvfcPP/zAsWPHsLKyom3btoSHh9OjRw9KSkqwtrbm008/LRsSS0pKws3NDVtbW0aPHl122M/o0aOZNGlSpU40y9bZrxkXb11k5+JgGuQ3w3tOJ7TrqoLfYIg9Tnb/9Yw7Y8DZ6+n4uLXgg26tUJHnD54bo33PcTkxk8AZb6ApFcOPbaFOAxh3hJPbo7hyLJG3ZjmSppXE4N8H09+0P3M6zvnP7/eqW2fLvBhk6+xqTFZRFis37cAwqwWub5mhrVuz9JCcmKPcclvImwdqczEhg+8H2/JRD3M5ITxnfNxMSc8tYuv5G6CuAW4zISkEIn/H4U0TamiqEbgjCjNdM0a2GcnOqJ2E3nk5D2+Xqb7ISeE1QQjBvD8WYR7dCT1zdWxcjOHQp3BlGzHW0+lytCn5xQq2TmhPf7vGVS33tcTRRA8HY11WnoilqEQJNkPBwAyOfImGpgoOvU1IjLxL3JU0JttMpmGthsw9M5diZXFVS5eRKUNOCq8JWy/vQPtYG9Q0VOg31gnp9I8QtIywJkPocr4txgZa7PmfM3bNdJ/cmMx/xsfNlOTMAnaHJoGqGrwxG1KvwaUtWHZujE4DLU7vjKampMHHjh8TdTeKTRGbqlq2jEwZclJ4DbiaEsnlzWnULdTnzf6N0IzZAYc+JVTbnT7Rvell2YjtEztiWFezqqW+9ri1qkdrQ21+OR6DQimgdV9oZAcB81EVxTgPNCXjdh5hx5N4o9kbuDV14+dLP3Mz52ZVS5eRAeSk8MqTV5zH+uW/0zjTDIuE3eRPHMRf3l8QHmTC6nNtmGlblx+H2qFZQ7WqpVYLJEnCx60FsSm5HAy/BZJUaq2dmQDBvhhZ6dPEXJfz+65TkFvMx44fAzD/3PwqVi4jU4qcFF5hFJmZbPtwPk2T7GmacITmtW/SwDEXZWN17t6pyTuhu+g0ZyKx3btz87PPyfrzAIpKcJKUKU8vK0OM9e8d2SkENHcHY1c48S1SUS4uXi0pyi/h/O/XaVS7EZNtJnMs4RjHbhyraukyMpWXFCRJ0pAk6ZwkSZckSQqXJOmLe9e3SpIUeu8VJ0lS6L3rXSVJCpEk6cq9f9+oLG2vOiUpKdxZtIigPmPIy+tEnZLreLxrTWOrM2SbaPK+yxfo7D1A8/37aDB7NjVbtiRr3z6S3nuPvzp05PogL+4sXkLumTMoi4qq+nZeO1RVJCZ1bsGVpExORafe6y18DnmpcOYX9BvXprVLI8KOJ3H3Vi4j2ozAVMeU+efmk1dcMffQl4nKss5eu3YtVlZWWFtbY2lpWbbev6Ls2bOHBQsWPFWdihAZGYmtrS12dnbExMQ89/aflri4uOe2fwMoXbVSGS9AAmrf+1odOAu0/0fMd8Bn9762Axrd+9oSSHrSe7Rt21ZUJ4oSE8
XNL+aKq9Y2IsTOVSwbv1ssmbpD5EafF0VfNRbxn5kK7yW7xO3M/H/VVRYVidyQC+LOj8vE9WHDRYSFpYhoZS6u2tiK+LfHidTVa0T+1atCqVBUwZ29fhQUlwjHeYfE4BWn/764eagQXzcRIjdN5GYWihVTA8Tvy0KFEEKcv3leWK6zFBsjNlb4PSIiIp637P+El5eXcHFxEZ9//nmF4o2MjERKSspjYxISEkTz5s1FRkaGEEKI7OxsERsbW2FNxcXFFY59WubPny8+++yzCscrlUqhqMS/q2PHjok333zzkeUP+z0BgsWjPrsfVfA8X4AWcAFweuCaBCQALR8SLwFpQM3HtVtdkkJBTIxImjGz9IPc0krEzJwtvn1vq/jund9FTMhRUfBVE3HjsxZi0jJ/kZVfVKE2S7JzRNbRo+LmV/NEdK83RUQrcxHRylxc69BRJE5/X9zdsUMUJSdX8p293qw6ESOMZvwuguPSSy/cChfi87pCHJglhBAi5M84sWziEXEjIk0IIcSo/aNE9x3dRbGiYh9oL0NSyM7OFo0aNRLXrl0TrVq1Krv+zw+qd955R/j6+ooffvhBqKurC0tLS+Hm5iaEEGLz5s3C0tJSWFhYiI8++kgIIURISIiwsbERJSUl/3rPlStXinbt2glra2vh6ekpcnNzhRBCeHt7i2nTpgk3Nzcxffp04evrK955552yskmTJgk3NzdhYmIiAgICxJgxY4S5ubnw9vYua3vSpEmibdu2ok2bNg/94N+3b59o0KCBaNSoUZn+7777TlhYWAgLCwuxZMkSIYQQ169fF+bm5mLy5MnC1tZWxMXFiYULF4p27doJKyurcm2vX79eWFlZCWtrazFixAghhBB79uwRjo6OwtbWVnh4eIhbt24JIYQICAgQNjY2wsbGRtja2oqsrCzh5OQktLW1hY2NjVi8ePG/ND9tUqhUQzxJklSBEMAU+EkIcfaBYlfgthAi6iFVBwIXhRCFD2lzAjABoFmzZs9f9EtEfng4aStXkX3wIFLNmugOG4qu92h+9j1Nzbt1Me13l0b7J5NSXIPvGy9m8eg3KzyhrFq7FnXc3alzz9O++PZtck8HkXv6NLlBQWTt2wdADRMTanXoQC3njmg5OqJap06l3e/rxlDHZiw7Fs0vAdGs9naABm3AZgicWwXtfbB5oynhJ5M4tT2KwbMc8LbwZuqxqRy+cZgexj2e6r1ObvuL1ISc56rfoGltXN8ye2xMZVln9+nThwYNGmBiYoKHhweenp706dMHAE9PT8aPHw/A7NmzWbNmDVOmTAHKW1j/09Po7t27HD16lD179tCnTx8CAwNZvXo1Dg4OhIaGYmtry7x589DT00OhUODh4cHly5extrYua6NXr15MmjSJ2rVr88EHHxASEoKvry9nz55FCIGTkxOdO3dGV1e3nL33wYMHiYqK4ty5cwgh6Nu3LydOnEBfX5958+YRGBiIgYEB6enpALi4uHDmzBkkSWL16tUsXLiQ7777jkWLFvHTTz/h7OxMTk4OGhoaLFiwgEWLFvH7778/9f/xw6jUiWYhhEIIYQs0ARwlSbJ8oHgo8C+fXUmSLIBvgImPaHOlEKKdEKJdvXr1KkN2lZMXEsKN8ROIGziI3MBA9CdMwPToERp+8gk7DoZTI1EPRbso3EI+IK1YjRUm3zN/bO9nWmGk3qABOgP60/jbhbQ8eQKT3bupP3MG6k2bkLFrF4nv/I+/2ncgbshQUpYuJS84GFEsb7p6HLVqqjG6ozGHr97h2q17JmZuH4NSAce/QVVdhQ4DTElPziUi8CZuTd0w0jZifdj6R7qHvmxUlnW2qqoqf/75Jzt27MDMzIxp06YxZ84cAMLCwnB1dcXKyopNmzaVs72+b2H9MPr06VNmod2gQYNy9tr3LbW3bduGvb09dnZ2hIeHExER8Vj9p06dYsCAAdSqVYvatWvj6enJyZOlJ+w9aO998OBBDh48iJ2dHfb29k
RGRhIVFcXRo0cZNGgQBgalJxzq6ZUaUyYmJtK9e3esrKz49ttvy+7R2dmZ6dOns3TpUjIyMlBTe/7P9S/EOlsIkSFJUgDQAwiTJEkN8ATaPhgnSVITYBcwSghR9TM4LxAhBLmnTpG6YgX5wSGo6ulRb9o0dIcNLXs6Px0QTsZZVW41vcIHt5aQUazKry1/5POhPVFTfX75XZIkNFqZodHKDP3RoxFFReSFhpb1IlKXryD1519Q0dJCy8mJ+tOnUbNly+f2/q8Tozsas/JELL8ERPP9EDvQNYJ2Y+D8Guj4Li3sm2NoWpdze2Np6dCAUW1G8eWZLwm5HUK7hhU/SOVJT/SVQWVaZ0Pp76GjoyOOjo507dqVMWPGMGfOHEaPHo2/vz82NjasW7eOgICAsjoVsdR+0E77/vclJSVcv36dRYsWcf78eXR1dRk9evQjdVdE/4NahBB8/PHHTJxY/ll36dKlD7VQnzJlCtOnT6dv374EBASUJcSZM2fy5ptvsn//ftq3b8/hw4cfq++/UJmrj+pJkqRz72tNoAtwf3lCFyBSCJH4QLwOsA/4WAgRWFm6XjaEUknWgYPEDRxEwvgJFCcm0eCTTzA9chiDiRPKEkJidBoh25K5VTeG4dKPFBQLtlv8woxhvZ5rQngYUo0a1HJ0pP5772GydStmQadp/ONStPv1JT80lOuDh5D15+MPVq+u6GjVYLhTM/ZcSuZG2r0VN50+BLWacGwekiTh4tWS/OxiQv6Io2+LvujW1GV9+PqqFV4BKtM6Ozk5ucwuGiA0NBQjIyMAsrOzMTQ0pLi4mE2bnt9u8KysLGrVqkXdunW5ffs2f/zxxxPrdOrUCX9/f/Ly8sjNzWXXrl24urr+K6579+6sXbu27LS1pKQk7ty5g4eHB9u2bSMtLQ2gbPjoQWvv9ev//l2IiYnBysqKGTNm0K5dOyIjI5+7nXZlfpoYAsckSboMnAcOCSHuD3oN4d9DR/+jdO7h0weWrNavRH1ViiguJsPfn9jefUiaOhVFbg6GX32J6cED6I0aiYrm37uPs9ML2PVTMNnq6djrLUO/sIj99quY4tW9SkztVOvWRbtrVww//xwT/13UbGlK0nvvcee77xAVtCGuToxzbY6aigorTtzr/NauD+0nQ9hOuHmZ+kbatGrfkEtHEyjOgqHmQwlIDCA28+U+oa0yrbOLi4v54IMPMDc3x9bWlq1bt5ZZVX/55Zc4OTnRtWtXzM3Nn9v92NjYYGdnh4WFBWPHjsXZ2fmJdezt7Rk9ejSOjo44OTkxbty4cvd6n27dujFs2DA6dOiAlZUVgwYNIjs7GwsLC2bNmkXnzp2xsbFh+vTpAMyZMwcvLy9cXV3LhpagdDmvpaUlNjY2aGpq0rNnT6ytrVFTU8PGxoYlS5Y8889Bts5+wSgLC8nYuZP01WsoTk6mZqtWGEycQJ3u3ZEeMhZaXKjg16+Pk5VaQLbJQiZnx3DUaS2De3WtAvUPR1lUxO2v5pGxbRu1nJ1ptOhb1HRlj6UH+fi3K+wMSeTUDHfqa2tAfgb8YFN6Tvbw7WSnF7BhdhC2XZpi3kuPbju60bt578daa8vW2TIVQbbOfklR5OSStmYN0R5duD33S9Tq1aPJLz9j4r8L7V69HpoQhFKwb80F8m8riTD2ZVxONGddfV+qhACgUqMGhnO/oOGXc8k7d464QV4UXL1a1bJeKiZ1bk6JUsmaU9dLL2jqgMt7EHUQ4k9TR0+D5jYGRAQmo61Sl34t+rE3Zi+p+alVK1ym2iEnhRdAcVISsb17c+fbRWiYtaTZunUYbfGjjrv7Y8/pPbsvlqTL2QQ3282UgjNcc/+V3l26vEDlT4eulxdGGzcgSkqIGzqMzL3PZ4nc64CRfi16Wzdi45l4MvPurdpynAi1G8LhL0AIrNyaUJhbQlTwbUa2GUmxspgtkVuqVrhMtUNOCpVMyd273Bg3HmVeHkabNtJs7VpqtXd64qHt0SF3CN
kXT2S9s/RU/Z3cN37FvfPL7/yhaWODyc4daFhakPzhh9yevwBx77D06s5ktxbkFilYHxRXeqGGFnT+CBLOQNRBGpnpoNeoFpePJWKkbYR7U3e2XNtCfsmjj118lYd/ZSqf//L7ISeFSkRZUECizzsUJyXR9Oef0Grb9smVgJQb2Rz0vcKt2tepbfArrdx86eDsXslqnx9qBgYY+fqiO2IE6evXc2Ps25TcW11RnWltqI2HeX18A6+TV3QvUdqPAl0TOPIl0r3eQmpCDrdis/C28CazMJPd0Q/3/NHQ0CAtLU1ODDIPRQhBWloaGhoaT1XvqfYpSJKkCzQVQlx+qnephgiFgqQPPiA/NJTGS5ag1a5ia85zMwvxXxpMjspdIpqv4FOHpdg6uFWu2EpAUlen4exZaFpZcvOzz7k+cBBNflyKppVVVUurUnzcWzDwlyD8ziXwtosJqKqD+yz4bRyE/4aZY3+CdsVwJSCRrmPtsDaw5teIX/Ey80JVpfy8U5MmTUhMTCQlJaWK7kbmZUdDQ4MmTZo8VZ0nJoV7m8763osNBVIkSTouhJj+X0RWB4QQ3J43j5zDR2gwaxbaPbpXqF5JsYLfvj9Hfl4Bh9usYpb9dGwdXq5J5aelbr9+1DA1JXHKFOKHj6Dh55+jM9CzqmVVGW2N9HA00WP1yVhGtjeihpoKWA6EwO/h6FfUaNOP1h0NuXIsEedBpnhbePP+8fcJSAjAw8ijXFvq6uqYmJhU0Z3IvK5UZPiorhAii9IdyL5CiLaUbj6TeQRpK1dxd7Mf+uPeRm/kiArVEULw20/nyLpZzGHTjQy2cKNz+yGVrPTFoGlhgcnOnWi2tefmrFncmjsXUY0tu99xN+VmZgH+F5NKL6iolB7befc6hP2GZefGKJWC8JPJeDTzoHHtxqwLX1elmmWqDxVJCmqSJBkCbwHycpInkLHLn5QlS9Du04d60yvemdrzazApkfmcb7IPE1N1xrrNrESVLx41XV2arVqF3tix3N3sR/zoMRTfuVPVsqqETi0NsGj0wJGdAC27g35LOLcCnfpaNLPQJ/xEEiglRrUZRWhKKKF3QqtWuEy1oCJJYS5wAIgWQpyXJKk58DBn02pPzsmT3Jw9m1odO9Bo3ldIKhWbx9+76wKJQZkk6IVwq8VF5vVY9sTVSa8ikpoaDT76kMaLv6Pg6lXiBg4i78LFqpb1wpEkiXfcTbmemsufYbdKL6qogNNESAqBxGCs3BqTl1VEbGgK/U37o11D+5WwvpB59Xnip5YQYrsQwloI4XPv+1ghxMDKl/ZqkR8WTuLU96hpZkbjpUuRatSoUL3df4aSdPAW+ZoJHGm5jcXdllG7Ru1KVlu1aPfqhfGWLUgaGsR7e3N3y5Zqt4Kmu0VDmhvU4qdj0X/fu80QqKkNZ5djZKGPdj1NrhxLREtdi8GtBnPkxhFuZN2oWuEyLwU5J05U2gbRRyYFSZJ+lCRp6aNelaLmFaUoIYGEiRNR09Gh6YrlqNau2If61kP/Z++8o6Oquj78nEmb9N4TEhJIgNAhdAIoSgcL0puF9ooKlg/1pasICqivoGChSy8WEBSVXgKhhtAJ6b33ZCZzvj8mBhCEAEkmwH3WuouZc9tv4vLue/beZ++TpP18iVJVEZvqf8/ENhNp4NigitXWDNSBAdTetBHLtm1ImjGTxClT0BXf0j7jkeXvlp3nEnPYe6kse8jMGpoNg4itiPxkGnXyJPFqNqkxuQypPwRjlTErz600rHAFg5O1aROx4/9D6udfVMn17zRTCEPfIOffNgVAm5FB7CujQavF+7tvMXGpWA2/Zb8fR/PzMYp1NvzUYDmt67RgSL0hVay2ZmFka4v311/jOG4s2Zu3ED1sOJrEREPLqjaeaeaJu62ar/bcUCU++BV9v4WwZdRv546xqYrwvXE4mTvRx78PP135icyiTMOJVjAYUkpSFy4iccpULNu2xWP+/Cq5z78aBSnlihs3YNM/vj
/26AoKiB03Hk1SEl5ff42Zn99dz5FS8uW2o9j8upNsTR2O198GrkXMaj/rkYwj3A1hZITLhMfPFQAAIABJREFUxIl4LfySkshIrj3fn/yjRw0tq1owNVYxuqMfR69lEBalL5mMoz/UfRrClmJmqiOwtRuXjiZTlKdhRIMRFJUWsf7iesMKV6h2pFZL0rRppC1ciO0zz+D99VcYWf1774gH4a4xBSFEWyHEOeB82fcmQoivqkTNQ4TUaol/8y2Kzp7Fc/48LJrfWi73n+h0kjk/HqXOnpUkFbUl0/8Ux+0P8EnIJ9ia2VaD6pqLddeu+G5Yj5GtLTEvvkTGylWPRZxhUCtvHCxNb54ttB4L+SkQ8SONOntRqtFx7lAC/nb+hHiFsPbCWopLHx9X2+OOrqCAuFcnkLVxE47jx+H+8WyEiUmV3a8i6TGfA92AdAAp5WkgpMoUPQRIKUmaOYu8PXtwmzYV6woUqdOW6pi64TCtQz/jWl4/TD3TWO+8nNebv05Tl6bVoLrmY+bvj+/GDVh17kzy7NkkTJ6MrvDf6/48CliYGvNiO1/+upDCuYQc/aD/E+AUAKFf4+hhiUddO87ujUenk4wKGkVGUQa/XP3FsMIVqgVtejrRI0eRt38/bjNm4PLGG1XuUahQzqSUMvYfQ491J5W0RV+RtXEjjuPGYj/o7gvMdDrJ/609RPczH3A1ewgW9lq+8/qc9p7tGBU0quoFP0QYWVnh9eX/cH7jdXJ+2UbUkKGUxMUbWlaVMqKtL5amRny9t2y2IAS0GgMJJ8vSU73ITS8iOjyNlq4taeDYgBURK9BJ3Z0vrPBQUxIdTdTgIRRfvozXwi+xHzSwWu5bEaMQK4RoB0ghhKkQ4m3KXEmPI5kbN+r9es8+i/Mbb1TonOV/nuCFC+8RnT0EYWbBrw1XY2mp5qMOH6ESSk3CfyJUKpzGj8d78ddo4uKIev558g4+uh1abS1MGNbWh+1nEohKy9cPNhlcnp7q19QJK3szwvfEIYRgVNAoonKi2Be3z7DCFaqMwjNniBo8BF1ODrWWLcX6ieqrkFyRJ9I44FXAE4gDmpZ9f+zI3b2bpBkzsezYEfdZMys0jTtzOoyQ/aNIznmeXOlBfMeTXNCGM6fjHBzNHate9EOMVadO1N60EWMXF2JHjyH9u+8e2TjDyx1qY6xS8d2BshacZlbQbDic+xFVfjJBHT2JPZ9JZlI+T/k8hbulu1L64hEld88eokeOQmVhgc/aNVjcpr1nVVIRoyCklEOllK5SShcp5TAp5WNXB7nw9GniJ72Jul49vD7/rEKBnpzzf+GyaQTHMt4gQdMYu27FbMpbybgm42jl3qoaVD/8mPr44LtuLdZPP03KvPkkvvvuI9mfwcVazXPNPdkYFkdaXlkQuVVZeurxZTTo4IHKWBC+Nx5jlTHD6g/jePJxwlPDDStcoVLJ3LiRuFcnYObnh++6tZgZoOBhRYzCISHE70KIl4UQdlWuqAZSEhVF7LjxGDs7471kMSrLu6eC6cJWULR6ItvTZ5GFH81GuPBZ3gyC3YIZ23hsNah+dFBZWuL52QKc33id7J9+Jm7iRHSPYEG90SF+lJTqWHkoSj/g4AcB3SBsKRYWkrotXLlwOJGSIi3PBzyPtYk1K84p2eGPAlJKUr9cSNLUaVi2a4fPyhUYOzkZREtFylzUBaYAQcAJIcQ2IUTFSn8+AmjT0ogZPQaAWt9+c/f/ULpS+H0K8Zu/Y3PGHIrNnOjymh9zk6egNlIzp+OcW+riK9wdIQRO48fjOmUKeX/8Sdy48egKCgwtq1Lxd7biqfqurDwSfb0JT+uxkJ8KEVtp1NkLTVEpF48kYWliSf/A/uyK3kV83qMdiH/UkVotiVOnkrZoEbbPPov3V4sq9OJZVVQ0++hoWf+EVkAG8Fi8nujy84kdOw5tWhreSxZj6ut75xOK82D9MM79cY6fM6dRYG5B61cdePX0yyTmJ/Jpp09xsajYim
eF2+MwbCjus2eTf+QIMa+MpjQ319CSKpWxnfzIKtCw4VhZwp9fF3AKhCNf4+prjYuPNeF74pBSMrTeUFSoWH1utWFFK9w3uoICYl99lexNm3H6z3jcZ39UIdf0hosbOJZ0rEo0VWTxmo0QYqQQYgdwCEhEbxweaaRGQ9wbEym6cAHPzxZg3rjxnU/Ijkcu7cHhMAd250wgydyIui8XMz70FYxURqzssZLW7q2rR/wjjt1zz+K5YAGF4eHEjByFNiPD0JIqjRY+DrT0sefb/dfQlur06amtx0DiKYg7RuMuXmQmFRB3IRNXS1d6+vVk8+XNZBdnG1q6wj2iTU8nesRI8vcfwG3GDJxff71CySuXMy/z8dGP2XhpY5XoqshM4TT6jKNZUsoAKeVkKeUjXftISkni1GnkHziA24zpWHfufOcT4k+gXfI0v1/uzon85wk302L+3Hn+G/Y2de3qsqbXGgIdAqtF++OCTfdueC9aSPHVq0QPH4Em+dHpzTC2kz/xWYVsDy+rA9V4EJjZQuhi/Fu4YG5tQvieOABGNBhBobawyh4QClVD+RqEK1fuaQ2CVqdl2sFp2Jja8F6r96pEW0WMgp+UchJwT32ZhRBqIcRRIcRpIUSEEGJm2fh6IcSpsi1KCHGqbNxRCLFbCJEnhFh47z+l8kj94guyf/wRpwkTsH/hhTsffO4nCr8byM+Jk7hS0Ja96mKyO+1gbeRievj24Ptu3+NkbpiA0aOOVUgI3t9+gzYxkehhwyiJizO0pErhyXou+Dtb8s2+SH0KrpkVNB8O537CuDCFBu09iDqTRk5aIYEOgbTzaMea82soKX30gu+PIoVnzhA1aDC63Fx8li+7pzUIq86t4mz6WSZ5v495adWU2K+IUWhzn7WPioEnpJRN0M80ugsh2kgpB0opm0opmwKbgS1lxxcBU4G37/lXVCKZa9eSvngJdi+8gNOr//n3A6WE/fPJWjOZzRkfk1Tixy82uSQ0/56zuX/wnyb/YW7IXNTG6uoT/xhi2aoVtZYvozQnh+ihwyiOjDS0pAdGpRKMCfEjIiGHg1fKsr/Lq6cuJSjEE4Tg7D59gHlk0EhSC1P59dqvBlStUBFyd+/Wr0GwssJ37RrMm1a8xM217GssPLmQp9y7EbPejGXzw6pEY5XVPpJ68sq+mpRt5SuPhN55NgBYW3Z8vpTyAHrjYBBy//iDpA8+xKpzZ9ymT/t3/562BH56lYQdG9mU/RlFxi5sd00jsd5nFIprzO04l/FNxz+WVU8NgXnjxvisXIksLSV66DCKzp0ztKQH5plmnjhbm7FkX1npC4faENAdwpZibSPwa+LEuYMJaEtKaevelgD7AFZErHhkF/c9CmRu2HB9DcLaNXdPXLkBndQx/dB01MZq2sYOQlVcSl5dA1VJhfuvfSSEMCpzD6UAu6SUoTfs7ggkSynvqbWnEGKMECJMCBGWmpp6L6fekYITJ4l/623UjRriuWA+wtj49gfmp8OqZ7h0OJqfsj5EbW/H7jrRRHt9ipW5jqXdl9LTr2el6VKoGOrAAHxXryrr5jaKgpMPd5tPM2MjXmpfm/2X0zgbXxZEbj0WCtLg7BYadfaiOF/L5bDk8tIXV7KucDDh0S0H8rAipST1f1+SNG06lh3a39cahLUX1nIy5SSTfN8j+VgO59Q6Xulbr0r0VmntIyllaZmbyAtoJYRoeMPuwZTNEu4FKeU3UsqWUsqWzs7O93r6bSmOjCRu/HhM3Nzw/vprVBYWtz8w7TLy2ycJO+/Fruy3cPN3ILTJKU6bz8PF3I2NfdbRxLlJpWhSuHdMfX3x/WE1Rg72xLz8CvmHDxta0gMxpHUtrMyM+XZ/mUvMr7M+PTV0MR51bXHwsOTMbn16anff7rhYuCilL2oYUqMhccoU0r76CtvnnsN70b2vQYjLjeOLE1/QwaMDhX86Uygk/l29sLesWMvfe+V+ax/dwdl+K1LKLGAP0B1ACGEMPAcYvF
uIJiVF3znN2Bjv777F2MHh9gdG7qH026fZHf8coTmDqBPswr6Gv7An7yscVY34+fm1eFh5VK94hVsw8fDAd/VqTL28iB0zlty//jK0pPvG1tyEwa282XYmkdiMgrL01LGQeAoRd4xGnb1Ii80jKTIHEyMThtUfRmhiKOfTH9t6lTUKXX6+fg3C5i36NQgffXjPfRCklMw4NAOVUDHS+A1y4vIJtS7llSfqVJHqiq1oTvtn7SPg/budJ4Rw/rsshhDCHOgKXCjb3RW4IKU0aLpIaV4esWPGos3KwnvJEky9vW9/4PHlFK8czra09zif24Em3T3Y6LmQP5I2YVbQmZ/6f4eVadVkAijcO8bOzvisXIFZ/frEvfY62du2G1rSffNSh9oI4PsD1/QDTa6npwa0csXU3Lg8PfX5gOexMLZQSl/UALRpafo1CAcO4jZzZoXXIPyTzZc3E5oUyqSgtznzSwpxRqV0eMoH26JYfWyzCrjfus0DKnCMO7BbCHEGOIY+prCtbN8gbuM6EkJEAQuAUUKIOCFElXWxlyUlxL32mj5P+IsvMG8YdOtBulL47b/k/Pghm3M+I6GoLs0HujFfvEdo0hE0yc+xvO+H2FkoGUY1DSM7O2otXYpF8+YkvPMOmRs2GFrSfeFua06/pp6sPxZLZn4JmFqWp6ealqRQv607V4+nkJ9djI2pDc8HPM/OaztJyk8ytPTHlpKoKP0ahKtX8Vq0EPuBFXlc3kpSfhLzw+bT2q01LmcbUlKg5bCdjpfaesLq/rBhRCUr13O/RuGuJk9KeUZK2UxK2VhK2VBKOeuGfaOklItvc46vlNJBSmklpfSSUlZJGonU6Uj47xQKDh/B/cMPsOrY4daDivNg3VCS9+1kU86X5Etn6g+34t2U8cTmJFIQ8yL/DXmJhp6PdxvNmoyRlSXe3yzBsmMHkqZNJ33ZckNLui/GhPhRqCll9ZFo/UDwKyB1ELaUhp080ekkEfsTABhWX1+WTCl9YRgKT5/W90HIy8NnxXKsu3S5r+tIKZl1eBalspTXvSZzbn8Cx8209H/SD5vjX0PGVWg1upLV6/lXoyCEcPiXzZEKGIWaTMGRI+T88gvOkyZh98wztx6QHQdLuxMZnsGPWXMxtrLFaUg+b10eh7EwJ+vKeLrX6ciw1rWqX7zCPaEyN8d74UKsu3UjZe5cUhcueujSNgPdrOkS6MzyQ1EUaUr16amBPSBsGXYOKmoFORKxL55SrQ4PKw+e9n2aTZc3kVvyaNWFquncsgahyf0nnWyL3Mb++P283vQNLvyURYmJ4Kw9vBikgv3zoMEzUOfJSlR/nTvNFI4DYWX/3riFAQ/10knLdu3wWfMDjmNuY2njjyO/eYLT0YHsyJyMg7ctOT1PM+3cZBo4NCIv8j94WtViznONlHUIDwnC1BTP+fOwffZZ0hYuJGXuJw+dYRjbyZ/0/BI2HS8Lw/2dnhqxhUadPSnIKSHylD5Fe2TQSPI1+Wy+tNmAih8fpJRkrFqtX4NQp849r0H4J2mFacw5OodmLs1onBJCakwuO02KebFTbSz/eA+EEXSbXXk/4B/8q1GQUtaWUvqV/fvPza/KFFUTFs2b3/pQj/gR3dLe7E8fwoGsofg2ceRo8EaWXPmKZ/yfRaSMITPXhEVDmmOtvrcsAgXDIoyNcf/oQ+yHDiVj+XKSpk1Hlj48rcZb13agibcd3+6PpFQnoXYncK4HR77Gp4EDNs7mhO/WG4wgxyBaubVi9fnVaHQaAyt/tJEaDUmzZpH80UdYde6Mz4rlD9wHYXbobIq0RbzfaBqhP18jy0ZFqp0RLzmdh8u/QZf3wNazkn7BrSgNgkFfsmLfPErWj2VH3gzCs0II7OLED96f8FvcTt5s8SbumhHsu5jJ1D4NlDjCQ4pQqXCd8l8cx44la+NGEv5vMlLzcDw0hRCMC/EjOr2A3yKSrqenJp1BxB+lUSdPEq9mkxqjdxmNDBpJckEyv0X9Zm
Dljy6lOTnEjh1L1tp1OLz8El5f/u+B+yD8HvU7u6J38WqzV4nbVYKmuJRNsoAJHd1R//E+ONeH1uMq6RfcHsUoaIvhx/Hk71rI1sJFROfUpV5fW+YavcWV7Ct83uVzGlr1Y/6uS/Rq7K7EER5yhBC4TJqI85tvkrN9O3FvTERXXGxoWRXi6SA3fB0tWLL3qt791XggqPXpqfXaumNsqiJ8r3620MGzA/62/krpiyqiJDqaqEGDyT8WhvtHH+L6zjsIowdrnpVZlMlHoR8R5BhEV+O+XAxNIsbFCCNbU4ZrNkF2LPReAEZV66V4vI1Cfjqs7Ed62EE25S0mu8QZ7wGC9zLHI6VkRfcVNHXswGtrTuJlb67EER4hnMaMxnXqFPL++ovYcePQ5ecbWtJdMVIJXunox+m4bEKvZZSlp46Acz+j1qYQ0NqNS0eTKcrX6Bc7BY3kQsYFQpNC735xhQqTf/QoUQMGUpqeTq3vv8Pu+ecr5bpzj80lpySHGW1mcmD9VUxtTNhUmMu7wSpMjiyEJkPAp12l3OtOPL5GIfUifPcEMVe1bM75HGlmjfGz8fw35g38bP1Y22stgfb1eHPDKTLyS5Q4wiOIw9ChuH/8MQWhR/Vd3HJyDC3prvRv4YWjpSlL9pYVyrshPbVxZy9KNTrOHdSnp/by64Wj2lEpfVGJZG3eTMzLr2Dk6Ijvxg1YtqqcfmN7YvewPXI7YxqNoeCEmszEfE44CRysTXk28TMwtYCnZt39QpXAnVJSGwkhjgghYoUQ3wgh7G/Yd7Ra1FUVccfhu6eISGvOtoz3sXGxIubJvXwW/TFdfbqyrPsynC2cWbIvkj0XU5U4wiOM3bPP4PnZZxSePUv0Q9DFTW1ixKh2vuy+mMrFpFyw94XAnnB8GY4uxnjUtePs3nh0OompkSlD6w/lYPxBLmfeU91JhX8gS0tJ/uRTEv87BcvgYHzXrcW0VuW4knNKcvjg8AfUta/LIK/hHNt2DVt/G3ZkZDOv/hWMovbBk9PAqnJqvd2NO80UvgZmAI2AS8ABIYR/2b6H+pVZ2vtxSL7FntThuNez4c9mS9mYsJYxjccwr9M8zI3NORaVwbzfLypxhMcAm25P4/3VIkoiI4keNhxNcrKhJd2RYW18MDcx4pt9ZYXyWo+FgnQ4u5lGnb3ITS8i+qy+D8OAwAGYG5uzIkIpfXG/6PLziXvtdTKWLsV+yGC8v1mCkY1NpV1/fth80ovS+aD9B4RuvgYSdpoWUcemlI6Rn4FHM2jxYqXd727cyShYSSl3SimzpJTzgAnATiFEG27oi/AwEhcjORnTiFptbFjqNZPjmceY3WE2rzV7DZVQkZFfosQRHjOsOnak1nffok1OJnroMEpi/1ktvuZgb2nKwGBvfjoVT2J2IdQO0WelhC6mdhNHLO3MCN+t129rZsuzdZ5l+7XtpBQ8Oi1LqwtNQgJRQ4eRt2cPrlOm4DZt2r+X1b8PDiUcYsvlLYwKGoVVkhtXT6bi2MqZA4nZfOm+E5GfAr0WgOrBgtj3wp2MghBClPtMpJS7geeBVYBPVQurSrzrO1DvRXPmmL5Btiab77t9Tx//PgDodFKJIzymWAQHU2v5MnS5uUQPGUrxlSuGlvSvvNyhNhJYeuDaTempRvFHaRjiSez5TDKT9MHzYQ2GoZM61pxfY1jRDxmFp09zbcBANHFxeC9ZgsOwoZV6/XxNPjMPzcTXxpfRQWPZt+4Sdi7m/JCVSWebJOrFrIWWL4Fn80q97924k1GYC9S/cUBKeQZ4kustNB9KjiUd4+2L/8HZwpk1vdbQzKVZ+T4ljvB4Y96oEbVWrURKSfTwERRGRBha0m3xdrCgd2N31oTGkF2ogcYDQG0HoYtp0MEDlbEgfK++Xae3tTdP1nqSDZc2kK+p+VlWNYHs7duJHj4ClVqN77q1t6+P9oB8fvxzEvMT+aD9B0T8kUROaiE2HVw5FZ/JPIvlCHMHeHJqpd/3btxpRf
MaKeURACGElRDCsmw8RkpZNZWYqommzk0Z1XAUq3quwsvaq3xciSMoAKgDyrq4mauJGT6C3D//NLSk2zImxI/8klLWhMZcT089/wsWMpU6LVy4cDiRkiItAKOCRpFbksvWy1sNrLpmI6Uk9cuFJLz1NupGjfDdsB6zOpXfuyAsKYx1F9cxtP5QaosATuyMpk4LF765mMBYm8M4ZZ2Bpz8Ac/u7X6ySuWNKqhBivBAiBohG34EtWghxTw12aiImRia80fwNrE2ty8eUOILCjZj6+uK7di2mfn7EvTqB1K++qnGLwII8bOlY14mlB69RrC3Vp6ciIex7Gnf2RlNUysUj+hLajZ0b09ylOavOrUKr0xpWeA1FV1REwltvk7ZoEbbPPEOtZUv/venWA1CoLWT6oel4WXkxoekE9q27jMpYoGtiR1x8LJPkaqjVDpoMrvR7V4Q7paROAfoAnaWUjlJKB6AL0KNs3yODEkdQuB0mrq74rF6FTd8+pP3vS+InTqpxi9zGhviTmlvMjyfjwd5Hn54atgxXLxNcfKwJ3xNXbsxGBo0kIT+BP6L/MLDqmoc2NZXokSPJ2bED57fexP3j2ahMq6bd5aKTi4jJjWFmu5kkRxQQE5FOq961+fLwNT6w2oypNg96zdfHigzAnWYKw4HnpJSRfw+UfR4AVE13BwOhxBEU/g2VWo3H3Lm4vPMOubt2ETVkKCVx8YaWVU77Oo4EediwZF8kOp3UB5wLM/TpqV28yEwqIO5CJgCdvTvjY+PD8ojlNW7WY0iKLlzg2oCBFF+6jNeX/8Np9Ogq8xScTj3NqvOrGBAwgKb2zdm/4RKOnlYkuphglnSc3tpdiDbjwbXK+ovdlTu6j6SURbcZKwR0VaaomlHiCAp3QwiB48sv4b1ksT5FsX9/8kNrxvpNIQRjQvyITM3nzwsp4NsRXBpA6GLqNHfG3NqkvF2nSqgY0WAEEekRHE8+bmDlNYPcv/4iashQ0Onw/WE11l27Vtm9SkpLmHZwGi4WLkxqMYmwX6PIyyym46C6fPnnBT4xX4609oDO71aZhopwJ6MQJ4S4pYuDEOIJILHqJFUfShxB4V6w6tgR3w3rMXJwIOall8j44Yca8cbdq5E7nnbm+tIX5emp4RgnHqVBew+izqSRk1YIQF//vtib2T/2i9mklKR//72+B4KfH74bNqBuULVv54tPLyYyO5LpbadTkiY4/Ucs9dq5c6qwiFZpW6iru4boPhvMrO9+sSrkTkbhdWCJEGK5EOI1IcQEIcQK4Bv0C9keapQ4gsL9YFa7Nr7r12HVoQPJH3yo78tQYtieU8ZGKkZ3rE1YdCZhURnQ6Hp6alCIvu5+xH69y0ttrGZwvcHsidtDZHbknS77yCJLSkicMoWUT+dh3a0bPqtWYuLqUqX3PJ9+nqVnl9LPvx/tPdqzd+1FTNRGtO7nx8rfj/COySak3xP6jmoG5k4pqRFAQ2Af4Av4lX1uWLbvoUaJIyjcL0bW1nh9tQjHMWPI2riR6FEvok1LM6imAcHe2FmYsGRfpL54WouRcH4b1kZp1G7qTMSBBLQl+qZCA+sNxMzIjJURKw2q2RBoMzOJefkVsjdvwek/4/FcMB+VuXmV3lOj0zD14FTs1fa8E/wOl44mk3A5izbP+PPntTQGZ3+DWqVB9JpnsODyjdwp+6gO0EJKuVRK+ZaU8k0p5fdA8A01kB5KjkdnKnEEhQdCGBnh8uYkPBfMp+jcOa71f4HCs4Z7V7IwNWZEGx/+OJ/MlZS86+mpx76ncWcvivO1XA7T13RyUDvQz78fv1z9hbRCwxqz6qQ4MpKogYMoPH0aj08/xfn11xGqqi8UvTR8KRczLzK1zVTUpRYc3HwFF18bAtq6sW/nZp4xOoSq/URwrBmP1Tv9RT4Hbtf5u7Bs30NLoJs1L7bzVeIICg+MTc+e+K75AYQgeuhQsrdtN5iWEe18MTVS8d3+SLCrVVY9dTketc1w8LDkzO
7r6anDGwxHo9Ow8tzjMVvIO3iQqIGD0OXnU2vFcmz79K6W+17JvMLiM4vp4duDJ2o9Qegv1yjKLaHT4AC2n45mfMHXFFh6IULeqhY9FeFORsG3rKzFTUgpw9C7kx5arMyMmdK7gRJHUKgU1A0aUHvTRtSNGpLw9tukzJ9vkP7PTlZmvNDSiy0n4knJKdK3bSzMQJRVT02LzSMpUt8zwtfWl95+vfnh3A8k5SdVu9bqJHPtWmLHjMXE3Z3aG9Zj0azZ3U+qBLQ6LVMPTsXaxJp3W79LakwuZ/fE0TDEEwcvK1J/W0AdVQLqvgvApGpdWPfCnYyC+g777voLhBBqIcRRIcRpIUSEEGJm2fh6IcSpsi1KCHHqhnPeE0JcEUJcFEJ0q/jPUFAwLMaOjvgsXYrdwIGkf/sdsePHG6Rpzysd/NDodCw7FAW+HcAlCEKXEBDsgqm5cXl6KsCEZhOQSBaeXFjtOqsDqdWS9OFHJM2chVWHDvisWYOJZ9U1vP8nq8+t5mz6Wd5v/T72pvbsXXsRtZUJrfv58dvBYwwvWU+KZ1dUgTXrUXcno3BMCHFLjSMhxMtARZKci4EnpJRNgKZAdyFEGynlQCllUyllU2AzZcX1hBANgEFAENAd+EoIUX31YhUUHhBhaor7zBm4zZhO/qHDRA0cRHHktWrV4OtkSY+Gbqw+Ek1eSak+PTU5HNPko9Rv687V4ynkZ+t7UntYeTC0/lB+vvozFzMuVqvOqqY0O5vYcePJXL0ah1Gj8PpqEUZWltV2/6jsKBaeWsgT3k/Qzbcb5w4mkHwth/bP10FlZoT13qkIIXDuv6DaNFWUOxmFicCLQog9Qoj5Zdte4BXgjbtdWOrJK/tqUraVJ3ULvTN/ALC2bKgfsE5KWSylvAZcASqn152CQjViP2gQPsuWUpqVRdSAAeTt3Vut9x8b4k9ukZZ1R2Og0Qv6omqhi2nYyROdThKxP6H82FcavYKVqRWfn3iow4TlSJ2OrM2budoCQGTQAAAgAElEQVS9B/lHjuA2ayau705GGFXf+6VO6ph+aDqmRqZMaTOFonwNh3+8ikddOwJau3FoxxpCSkOJa/Iawr7mdSG4U0pqspSyHTATiCrbZkop20opK+SEFEIYlbmHUoBdUsobO4h3BJKllH/3CfQEbuxsElc29s9rjhFChAkhwlJTUysiQ0Gh2rEIDqb2po2YeHkRO2486d99V20L3Zp429HGz4HvD1yjRKWG5iPhwnbszNKpFeRAxL54SrX6ogS2ZraMaTSGA/EHCE0MvcuVazaFERFEDx5C4n+nYOrnR+1NG7EfMKDaday7sI4TKSeYHDwZZwtnDm+9iqawlJDBAWiKCqhzfBaxRt749/m/atdWEe6ajyWl3C2l/LJs++teLi6lLC1zE3kBrYQQDW/YPZjrswSA26UB3fJ/kZTyGyllSyllS2fn6ulZqqBwP5h4euK75gesu3cjZd58Et5+B11hYbXce2yIP4nZRfxyOgGCX+bv6qmNOntRkFNC5KnrL1SD6w/G3dKdBccXoJMPXwWb0qwsEmfOJKr/C5TEx+Mxdw4+q1ehrlev2rXE5cbx+YnPae/Znr7+fUm8ms35g4k0edIbRw8rLmyaiadMJqPzbISxWbXrqwhVn6QLSCmzgD3oYwUIIYyB54D1NxwWB3jf8N0LSEBB4SFGZWGB54IFOE+cSM6vvxI9dBiaxKqvEtM50JlAV2u+2ReJtPWGer3g+HJ8AiywcVLfFHA2MzLjtWavcS79HDuv7axybZWF1OnI3LiRqz16krVhI/bDh+G/41ds+/UzSKq5lJIZh2egEiqmt5mO1En2rr2Ilb0ZLXv5Upx8iXpXv2efujONO/Spdn0VpcqMghDCWQhhV/bZHOgKXCjb3RW4IKWMu+GUn4FBQggzIURtoC5QM6qOKSg8AEIInMaNxWvRIkqio7nW/wUKTpyo8nuOCfHjYnIuey6llqWnZiIiNtGosxeJV7
JJjb2+DKmXXy8C7QP538n/UVJq2LIdFaHwbARRgweTNHWa3lW0ZTNu77+PkbXh6gZtubyF0MRQ3mzxJu5W7oTviSc9Lo8OL9TF1MyItA2vUyRNUPf8uEavj6rKmYI7sFsIcQY4hj6msK1s3yBudh39XVZjA3AO2Am8KqWs/mRvBYUqwvqJLviuX4fKypLokaPI3LChSu/Xp4kH7rZqfaE8n/bg2hBCl1CvjRvGpqqbZgsqoeLNFm8SnxfP+ovr73BVw6LNzCRx+gyiXngBTXwCHp/M1buKAgMNqispP4l5YfNo5daK/gH9yc8uJvSXSGoFOeDXzJmS8C14ph9mk+0oghvVv/sFDUiVGQUp5RkpZTMpZWMpZUMp5awb9o2SUi6+zTkfSSn9pZSBUsodVaVNQcFQmNWpQ+0NG7Bs1YqkadNJmvUBUqOpknuZGqt4qX1tjkRmcDouuyw99SzqtKMEtHbj0tFkivKv37udZzvaurflmzPfkFNS/Wss7oTU6cjcsIHIHj3J2rQJhxHD9a6ivn0N/tZdqivl/QPvo5M6ZrTTu48ObrqCTivpODAAUZKHZvtkInQ+NOg7yeB670a1xBQUFBSuY2Rri/eSxTi8+CKZa9YQ8/IraDMyquReg1p5Y6025pt9kTelpzbu7EWpRldePfVvJrWYRFZxFkvDl1aJnvuhMDycqIGDSJo2XW9Ut2zB9b33DOoqupEV51ZwLOkY77V+D29rb2IvZHD5WDLNu9XCzsUCzV8fY1mcynrXSbSp42pouXdFMQoKCgZAGBvjOvn/8Jg7h8JTp4jq/wJFFy7c/cR7xFptwrA2Puw4m0hUtg5ajIIL23G0zKRWkAOn/oilpOh6z+b6jvXp7deb1edXG7z8hTYzk8Rp04kaMBBNUiIen35CrVUrUQcGGFTXjZxLP8eXJ7/kKZ+n6Offj1Ktjn1rL2HjpKZ5Nx9IjsDo6GLWaLvQp1c/Q8utEIpRUFAwILb9+uGzehVSqyVq0GASp06j4OTJSl3T8GI7X4xVKr47EAktX9YPHvuO4F61KcrTcHbvzbOFCc0moJM6Fp1aVGka7gVZWkrm+g1Edu9B1ubNOIwYgf+OHdj26VOjXC+F2kIm75uMg9qB6W2nI4Tg1B8xZCUXEDIoEGMTFaW/vEmOtGB/rVcJ9nUwtOQKoRgFBQUDY964Mb6bNmLTvTvZ27YRPXgIkT17kfbNt2iSUx74+i42ap5t5snGsDjSjF2gXm84sQI3LxNqBTlwclfMTbMFTytPhtQbws9Xf+ZS5qUHvv+9UHjmjN5VNH06ZgEB1N66Bdf33sXIyqpadVSEecfmEZ0TzewOs7E1syUnvZCw7VH4NXPGp6EjnF6LUdwRZmsG8Uq3loaWW2EUo6CgUAMwcXHBY87H1N2/D/cPP8DI3p7UBQu40qULMWPGkLNzJ7oH6PA2OsSPYq2OlYejy9NTCd/4r7OF0Y1HY2liyefHq6f8hTYzk8Sp04gaOAhtcjIen35KrZUrUAfUHFfRjeyJ3cOGSxsYFTSK1u6tATiw4TII6PBCXSjIQPf7VE4TQKp/f1r42BtYccVRjIKCQg3CyMoKu/798V3zA/47d+A4ejTFFy8RP3ESlzuGkDTrAwrPRtyze6mOixVPNXBl5eEoCtxbgWsjCF2CW20bajW4dbZga2bL6Eaj2R+/n6OJVbdcSJaWkrlund5VtGULDiNH4rfjV2z79K5RrqIbSStMY9rBadRzqMeEZvrOxJGnUrl2Oo3gXrWxdlDDXx9AQQbvFY9i4lPVv7L6QVCMgoJCDcXU1xeXSROp89efeH/7LVbt25O1aRNR/ftzrd8zpC9ffk9ZS+M6+ZFVoGFDWJw+PTUlAqL2E9z79rOFIfWHVGn5i8LTp4kaMJCkGTMxCwzE78etuL47uUa6iv5GSsmUg1Mo0BYwt+NcTI1MKSnUsm/dJRw9LWnS1RvijyPDlrGG7njUa0UTbztDy74nFKOgoFDDEU
ZGWHXsgOeC+dTdvw+36dMQZmakzJnL5ZBOxE6YQO5ff911vUMLHwda+Njz3YFraBs8B1ZusPN93GqZ33a2YGZkxoRmE4hIj+C3qN8q7fdoMzJImDJF7ypKScFj/jxqrViOWd26lXaPqmLthbUcjD/I2y3fxs/OD4DDP14lP7uYzsPqYSQkbHuTfFNH5hQ9x8SuNdP9dScUo6Cg8BBhZGuL/eDB1N64Ab9ffsZhxAgKT50m7j+vcrlzF5LnzKXo0r8Hh8eG+BGXWcivF7Oh92eQHA4HFvzrbKFXbX35iy9OfPHA5S9kaSmZa9dytUdPsn/8CYeXXsJvxw5se/Wqsa6iG7mSeYX5YfMJ8QphYOBAABKvZnN2XzyNO3vhVtsWwpZC4ilmFQ+hfVBtGnraGlj1vSOqq5xvVdCyZUsZFhZmaBkKCgZFajTk7T9A9tYt5O7eA1ot6oYNsX3uWWx79sTI7rr7QqeTdP1sL+YmRmx7rQNiy2iI2Apj9vLLBi0pMbkM/7Atpmrj8nMOxh9k3B/jmBw8mWENht1Zi1aLJikZTVwsJbGxaGLj0MTFURIXhyY6mtLsbCxat8Zt6hTM6tSpqj9JpVNSWsLg7YNJK0xjc9/NOJk7UarRsX72MTTFWgZPa42pNgO+bEmMui4hyZPY8UYI9d1tDC39tgghjkspb5sSZXy7QQUFhYcHYWKC9RNdsH6iC9qMDHK2bSNry1aSZ31AysdzsOr6JHbPPYdlu3aojIwYG+LH5M3hHLySTocen0DkHvjpPwT32Mrm+ac5uy+e5k9fb/7SzqMdbdzbsOTMEvr698WyUEdJbFzZg1//0C//nJgI2usuKIyNMfHwwNTLC3X37li2a4f10089FDODG/nixBdcyrzEoicX4WTuBMCJ36PJTMyn16uN9UZ06zSkpoBXi4bSs5F7jTUId0MxCgoKjxDGDg44jBiBw4gRFJ07R9aWreT88gu5O3Zi7OqKbb9+9Ozbl3nWZizZd5UOL7eGXgtgw3DcEpdSq8FTnPwtmgAfLaQk6N/24+KZdFVD1IU0ouZ2xKTwZjeSkYMDJt5emDdujE2vnph6eWHi5Y2ptxfGrq4I44f7MXM44TArz61kYOBAQrxCAMhIzCdsRxR1W7rg28gJLv0Gp9dyxGMkZ6+5Mu/Jhy+W8DeK+0hB4RFHV1JC3u49ZG/ZQt7+/aDTkeVfn5XWQUzoHoRLQQaa/WvRxCeSog7mqN9o/K9uxSf2DwCEmRkmXl5ctcjjvFk6/ULG4OjfQP/g9/JEZVl9vY+rm6yiLJ7/+XmsTK1Y13sd5sbmSJ1k64ITZCTkM2RGGyxKE2BJCFobb4KTJtOhvjdfDm5maOl3RHEfKSg8xqhMTbHp9jQ23Z5Gk5JCzs8/Y7x5C6+f2gSnNpEGGLs4YyKM8LS6gKtlHnH1+tH2w1FY+Hpj7OyEUKlQ58Xz7tY+pPulMqv9k4b+WVWOlJKZh2eSUZzBoq6LMDc2ByDiQAKJV7J5YkQ9LMx18P1wAL51m0l2bDFvPFnzs6juhJJ9pKDwGGHi4oLjK69Q59ftHJu6kNFP/h+ZW/6g7r59+C6ajUfjSNp3SKRYo+JqthMmri4Ilf4x4WnlyeB6g/np6k9czrx8lzs9/Px45Uf+iPmDN5q9QT0H/QK0/KxiDm+5gmegPfXausOOdyDpDJndvuSLkxr6NfWkjkvNXWdRERSjoKDwGCKEYMCAzkhvHz7cFUmpTkLQc1C/D+5np+Jdx4yTv8egKb65z9XoRqOxNLbk8xPVU/7CUETnRPPx0Y9p7daaEUEjysf3rb9Eaamk89BAxMnVcGIlssNb/F+4J1LCxK4P9ywBFKOgoPDYojYx4v+6B3IuMYctJ+JACH3Q2dSCYONvKMrTEL437qZz7NR2vNL4FfbF7eNY0jEDKa9aNDoN7+1/DxOVCR92+BCV0D8mI0+lEnkyleBevtiVXoFf34bandjm9CK7ziXz1tMB+Dg+/PEVxS
goKDzG9GnsQRMvW+b9fpHCklKwcoEen+Ce9SPe7nmc2nXrbGFIvSG4WriyIGxBpZb4riksOb2E8LRwpredjpulGwDFhVr2rb2Io5cVTTvYwobhYO5AZo+vmfHLBZp42fJS+9oGVl45KEZBQeExRqUSTOndgOScYr7dH6kfbPQCBPYkWPsJhbm3rnJWG6t5rdlrnE0/y2/RlVf+oiZwIvkE34Z/Sz//fjzt+3T5+OGtVynIKaHLkACMfh4P2XEwYAUz/0ohp0jD3P6NMTZ6NB6nj8avUFBQuG+CfR3oHuTG4r1XSckp0ruRen+Gu2UM3jbXOPl79C2zhd5+vQmwD+CL41+gKa2aHtPVTW5JLu8feB8PSw/ea/1e+XjClSwi9sXTuIs3rvFL4dIOePoj/sr34cdTCfyncx3quT2cC9Vuh2IUFBQUeLdHPTSlOhbsKqubZO0G3ecQbLyEwtvURDJSGTGpxSTi8uLYcGmDARRXPrNDZ5OUn8SckDlYmuhjA6UaHXtWX8DaQU2rxvH6kthBz5Hb5CXe33KWAFcrXu3y8JTrqAiKUVBQUMDXyZLhbXzZEBbLhaQc/WCTwbgH+eBtdoaTv0XeMlto79Ge1u6tWXJ6CbkluQZQXXnsuLaDbZHbGNtkLE2cm5SPH98ZRWZSAZ2eccL055fBsS70/ZKPd14kJbeIT/o3wdT40XqMPlq/RkFB4b55/ck6WKtN+Gj7ef2AEND7c4LtfqYwX8fZPbE3HS+EYFKLSWQWZ7Ls7DIDKK4cEvMS+eDwBzR1bsroRqPLxzMS8jm+M5q6LZzxOfMqaAph4CoOxxWzJjSGlzvUpulD1iuhIihGQUFBAQA7C1Nee6IO+y+nsediWW9oW0/c+43C2/QUJ3dcvmW2EOQYRM/aPVl1bhXJ+ckGUP1glOpKee/Ae+jQMbvjbIxV+iIPUifZvfoCJmojOjhvhLij0O9LCm3r8O6WM/g4WvDmU4EGVl81VJlREEKohRBHhRCnhRARQoiZN+x7TQhxsWz8k7IxUyHEMiFEeNk5natKm4KCwu0Z0dYXH0cLZv96Hm1pWbe1ZsMJDrxCYZERZ3eG33LOa81eo1SW8tXpr6pZ7YOzLGIZx5OP837r9/G29i4fj9gfT1JkNh3aZGNx6gtoPR4aPs+CXReJTi9gznONMTc1MqDyqqMqZwrFwBNSyiZAU6C7EKKNEKIL0A9oLKUMAuaVHT8aQErZCHgKmC+EUGYyCgrViKmxine71+NScp6+bSeAELgPfQ9vdTgnd8WhuaE7G4CXtReD6g3ixys/ciXzigFU3x8RaREsOrmIbr7d6OPXp3w8L7OYQ1uv4uVvRuDFseDdGp6axanYLL4/cI0hrWvR1t/RgMqrlip76Eo9eWVfTco2CYwH5kgpi8uOK5un0gD484axLOC2VfwUFBSqju4N3Qj2tWfBrkvkFZcZADtvgrs6Uai14Oyan245Z0yjMQ9V+YsCTQHv7n8XR3NHpraZelN/h33rLiJLJZ1VHyBMzeGF5ZRgzORNZ3CxVvNuj3oGVF71VOmbuBDCSAhxCkgBdkkpQ4EAoKMQIlQIsVcIEVx2+GmgnxDCWAhRG2gBeN/mmmOEEGFCiLDU1NSqlK+g8FgihOC/vRqQllfMkr1Xy8fdew3B2zaak2EqNCnRN51jp7bj5UYvszdu70NR/mJe2Dx9faOOH2Nrdr1l5tWTKVw7nUawTxi2eaHQfynYeLBo9xUuJucy+7mG2KhNDKi86qlSoyClLJVSNgW8gFZCiIboy3XbA22Ad4ANQm+mlwJxQBjwOXAI0N7mmt9IKVtKKVs6OztXpXwFhceWpt529G3iwbf7I0nMLtQPCkHwoLYU6mw5u3QF/KPExdD6Q3G1cOWz45/V6PIXf8X8xcZLG3mx4YsEuwWXjxcXaNi37hJODkU0zf0InpgCfp24kJTDV3uu0K+pB0/UczWg8uqhWnz2UsosYA/QHf2Df0uZe+kooAOcpJ
RaKeUkKWVTKWU/wA549OvzKijUUN7pFohOwqe/XSwfc29WDy+PAk7GNERz9Iebjlcbq5nQbALhaeH8Hv17dcutEKkFqcw4NIP6DvWZ0HTCTfsObb1KYU4JXVTTUQV2g/aT0JbqmLzpDDZqE6b3CTKQ6uqlKrOPnIUQdmWfzYGuwAXgR+CJsvEAwBRIE0JYCCEsy8afArRSynNVpU9BQeHOeDtY8GJ7X7aciOdsfHb5ePCgdhTq7Dj7417Ivnmlcx+/PtS1r8sXJ2pe+Qud1DH14FQKtYXMCZmDidF1N1DC5UzO7U+gse2fuDiXwLNfg0rF0oPXOB2XzYy+QThYmhpQffVRlTMFd2C3EOIMcAx9TGEbejeRnxDiLLAOGCn1c00X4IQQ4jwwGRhehdoUFBQqwKtd6uBgacqH28+Vu4Q8AhzwqqPmZHZPND+9fZMbyUhlxKTmk4jNjWXjpY2Gkn1b1l5Yy8GEg7wT/A5+tn7l41pNKbtXX8DaLJvWFqtgwCowtycqLZ/5v1+ia31Xejd2N6Dy6qUqs4/OSCmbSSkbSykbSilnlY2XSCmHlY01l1L+VTYeJaUMlFLWl1J2lVJG3/kOCgoKVY2N2oSJXetyJDKDP86nlI8HP9NAH1sIN4bT6246p4NnB1q7tWbx6cXkleT985IG4XLmZRaELaCzV2deCHjhpn3Hd0STlVxIZ4vPMekzG9wbo9NJJm8+g6mxio+ebXhTdtKjjrIOQEFB4Y4MblULP2dLPv71PJqyBW0edezwqmfPycIBaH6dCjmJ5ccLIZjUsqz8RYThy18UlxYzef9krE2tmdFuxk0P+PSEPE7svEaAei+12jaGZsMAWHsshtBrGUzpVR9XG7WhpBsExSgoKCjcERMjFe/3qE9kWj5rQmPKx4N716ZQa8nZ7BDY/uZNbqQgxyB61O7ByoiVpBSk3O6y1cYXJ77gcuZlPmj/AY7m1xedSZ1kz/LTmJJHhzqh0ONTABKyCvn41wu0r+PIgJa3ZMU/8ihGQUFB4a48Wd+Ftn6OfP7HJXKK9AHk8tlC8SA0F/6A8E03nfN6s9fRSi1fnTJc+YtD8YdYdW4Vg+sNpqNXx5v2nd0dRVJMMe0d1mI+dDGYqJFS8t+t4ZTqJB8/2/ixchv9jWIUFBQU7op+QVt9sgo1LNp9vZRFcO/aFBYZE2EyGna8A7nXi+J5WXsxKHAQW69s5WrW1dtdtkrJLMpkysEp+Nv682aLN2/al5dZxOEtl/A2PUXgsFFg7wvAT6cS2H0xlbe7BVLL0aLaNdcEFKOgoKBQIRp62vJcMy+WHYgiNqMAuD5bOJH5tL4m0q9v3eRGGtN4DBbGFnx+vHrLX0gpmXFoBlnFWcwNmYvaWH3Tvr2L/0SW6uj0pBZRrzsAaXnFzPwlgua17BjVzrda9dYkjA0tQEFB4eHhnW6BbA9P4JPfLvLl4GYABPeqzdb5J4gI/Jim5ydAxFZo+BwA9mp7Xm70Ml+c+IKwpDBauj14OTONTkN6YTpphWmkFaaRWpiq/1yg/5xemE5KYQpJ+Um83fJtAh1uLnF99a9jREWb085nH7Z9ppaPT/85gvziUuY+3xgj1ePnNvobxSgoKChUGDdbNWM6+vG/v67wUntfmtWyx6Nu2WzhsglB/q0w+fVtqB0Clk4ADKs/jLUX1vLZ8c9Y3XP1bf30UkoKtAWkFqSWP9jLH/aFaaQWpJJWpH/wZxZn3labvZk9ThZOOJs742vrS6B9IMMaDLvpmKL0VPZvicfZLJsmr74KKn35698ikth+JpG3ngqgrqt1Jf/VHi5ETa5Rcjdatmwpw8LCDC1DQeGxIq9YS+dP9+DjaMGmcW0RQpBwOYut80/Qvps1TcN7QP3e8MLy8nO2Xt7KtEPTeKnhS1iZWN38hl+2FWoLb7mXicoEJ3P9g97R3BFnc2ecLJzKx/4edzR3xER1l0J1Oh27Z33N+aQAXnjJGudWbQDILtTw1IK9OFqZ8f
OE9pgYPfpedSHEcSnlbadtykxBQUHhnrAyM+atpwN4b0s4O84m0bOR+/XZwqE8gnq8i8k+fYN7GvQFoK9/X344/wNLzy4FwNrEuvytvqFTw5se8M4WzjipnXC2cMbG1KbSMoDiN3/DuaT6NG2YiXOrp8rHZ28/T3p+CUtHBT8WBuFuKEZBQUHhnhnQ0pvlB6OYs+MCT9Z3wczY6HpsQfs8Td1+0q9d8O0AFg4YqYxY0WMFmUWZOJk73RT4rQ60F/9izx5LbMxzaTW6X/n4gctprA+LZVwnfxp62t7hCo8PillUUFC4Z4xUgvd71Scmo4BVh/UVacpnC7vi0PRcBIWZsOP/ys+xNLHEy9qr2g0C2fGELdtGVqknnUe1xMRM/y6cX6zl3S1n8HOyZGLXutWrqQajGAUFBYX7olOAMyEBzvzvz8tkFZQA+kykwlwNEZfsIOQdCN8IF341nEhtCekr3uFkVncCm1jg3cSzfNe83y8Sl1nInOcbozZ5NPst3w+KUVBQULhv/tuzPnnFWv73p35Bm0ddOzwD7TnxewyaVm+Aa0PYNhEKMgyiT/f7VHZf7oipWkX74c3Lx49HZ7D8UBQj2vrQqraDQbTVVBSjoKCgcN8EulkzMNibVUeiiErLB6BVb18Kc0qIOJQK/RZBfhr89n71iwvfxNndMSRrAukwuCHmVvp+CEWaUv5v0xk8bM35v+6Pdr/l+0EJNCsoKDwQk54K4OdTCczZcYHFw1vgUde+fLYQFNIWk45vwr5PIehZCOhWOTfVFkNR9vWtMAuKsm4YyyL30BaO5H9Krfr2BLS63kZz4V9XuJqaz4qXWmFlpjwC/4nyF1FQUHggXKzVjOvkz/xdlzh6LYNWtR1o1duXrfNPErEvnqad34Hz2+CXifCfw2BuB6VaKM65+UFemHXzg74o6zYP/rLPt1nTcCPy/9u78+AoyzuA499fQkISAgkhRONuSERAAklADB5Yj0EsxaOKwGjVjnbGsfWqPaxHj9GqVLSORUdsPUpbrWNHqbRjrUexWjxR5AiGgAgUE0IFjbnQQI5f/3if7CwhB5vdDbub32dmZ3ef93mPHy95f/s+z/u+T1IqK/f+Ek1O4/RLJwYua62sbeC3/9nKvGl+Tp9gY7x3x5KCMSZsV546lqdWfcLCFzay/JpTupwt+Ei5YAk8PgsemAId7bC/qfcFSjKkZR34ypvYpSzbvbzvmpZFQ3M61ds72FHVyI7az5kx7xhG5KYD0NbewU3LKhiZkcovzi0egH+V+GRJwRgTtvTUZH4y+1h+/Ox6nq+o5fypvgPPFmYdDxc+Bh+/6p0pBB3MSctyZUHfUzPhEG5aa2lupXpTHTWr6qiuqqOprgWA4aPSmDqrgCkz/YG6j76xjcraRn576TSyMwbHeMv9YUnBGBMRc4/zsfSt7dz70mZmTz7y4LOF0vlQOj+sdbS3drBrWwPVVXVUb6xjT3UTKKSmJeM7diTTZo/BX5xD1uj0A+6E3rqnmcUrtjCn5EjmlA6e8Zb7w5KCMSYikpK8MRcueWwVS9/azjVnjOtytjAm5GWqKnW1e70kUFVH7ZZ62vZ3kJQkHDF2BCecezQFxTnkFQ4nqYdHVHR0KDcvqyA9JZlfnj853DATniUFY0zEzDgml1nFeTz82lYuKi9wZwvZrH3lE0pO8zEkte+bxPY27KOmqo7qqi+o3lTHlw3ejXHZR2RQPOMoCibl4BufTWr6oR2+nnx3B6t3fMF9C6aQN3xwjbfcH5YUjDERdcucYmYvXsniFVu484ISpp9zNH+7fy2Vb9Qy5cyDxzxu3d9O7ZZ6qqvqqKmq4/Od3v0OaZkpFEwcib84h4LiHIbnhH5Ar677knte2sRpE0Yzb5qv7xmMJQVjTGSNy8vk0hPH8NSqT7h8RiHjJnhnC2te3sHkU48ieUgSe6qbXJPQF+Tx2WQAAAmESURBVOzaWk9Hm5I8JIn8cVmcPPdICopzyPVnImEMdq
Oq/HT5BgT41dySQTnecn9YUjDGRNwNZ45n+Zqd3P3PTfz+iumBs4Xl96+lcc9XtOxtBWCUL5OyM/wUTMohf1w2KYfQvHSoln1QwxtbPuOO8yfjHzk4x1vuj6glBRFJA1YCQ916lqnqbW7a9cB1QBvwgqreJCIpwOPANFf/CVW9O1rbZ4yJnlGZQ7l25jgWvbiJtz/+jBkTcjl6Si67dzRRVDoq0CSUMSLyl4Y2trRSUd3Anf/YyPSikVx2YmHE15HIonmmsA+YqarN7oD/poi8CKQD5wNlqrpPRPJc/QXAUFUtFZEMYKOIPK2q/43iNhpjouSKGUU8+c4O7nqhiuev/xpnX10W8XV8ub+NytpGKmoa2FBTT8XOBrbt8fokMocOYdG8MpIG8XjL/RG1pKDeOJ/N7muKeylwNbBIVfe5ers7ZwGGicgQvMSxH2iM1vYZY6IrLSWZm+dM5PtPr+W5NTUsKD+4kzkULa3tbNzVyIaaBi8J7Kzn493NdLgRhfOz0ij1ZXHhcT5K/dlM9WeTldHHEJ3mIFHtUxCRZOADYBywRFVXicgE4FQRWQi0ADeq6vvAMrwziF1ABvBDVT3oebsichVwFcCYMaFf92yMGTjnleWz9M3t3PfKZs4pyycj9dAOOfvbOtj8vyYqdtYHksBHnzbR5jJAbmYqZf5s5pTkU+bPotSXRd4Iu9w0EqKaFFS1HZgqItnAchEpcescCZwETAeeEZGxwAlAO3CUm/6GiKxQ1W1dlvko8ChAeXm5RnP7jTHhERF+fk4x83/3Do+t3M4N3Yxw1tbewZbdzWyoaWB9TT0bdjawaVcT+9s7AMjOSKHUl8V3J46l1JdNmT+L/Kw0u5ooSgbk6iNVrReR14FvADXAc6556T0R6QBygUuAl1S1FdgtIm8B5cC2HhZrjIkD5UU5zCk5kkdWbuWi6QU0tbS65p8GKmrqqaxtZF+blwCGDx1CiS+L75xSRKk/iyn+bPwj0y0BDKBoXn00Gmh1CSEdmAXcg9fPMBN43TUlpQKfAZ8AM0Xkz3jNRycBi6O1fcaYgXPLnImsqPqUGYteDfQBpKckU+IbwWUnFQaagIpGDbOO4cMsmmcK+cCfXL9CEvCMqv5DRFKBpSLyIV5n8uWqqiKyBPgD8CEgwB9UtSKK22eMGSCFo4axcG4plTsbKPV7TUDHjM4k2RJAzBGvFSc+lZeX6+rVqw/3ZhhjTFwRkQ9Utby7aTZGszHGmABLCsYYYwIsKRhjjAmwpGCMMSbAkoIxxpgASwrGGGMCLCkYY4wJsKRgjDEmIK5vXhORPcCOMBaRi/eIjUSSiDEFS+T4Ejm2TokeY7zEV6iqo7ubENdJIVwisrqnu/riVSLGFCyR40vk2DoleoyJEJ81HxljjAmwpGCMMSZgsCeFRw/3BkRBIsYULJHjS+TYOiV6jHEf36DuUzDGGHOgwX6mYIwxJoglBWOMMQFxkxREpEBEXhORKhGpFJEbXHmOiPxLRLa495FB89wqIh+LyGYRmR1UvlBEqkWkuY91Hi8iG9wyHhQ3UKyInCYia0SkTUTmJ1Bc33Pl60TkTRGZFE5sMRjfFSKyx8W3TkSuTKDYfhMU10ciUh9ObDEaY6GIvCoiFSLyuoj44zC2butJBI8pYVPVuHjhDe85zX0eDnwETALuBW5x5bcA97jPk4D1wFDgaGArkOymneSW19zHOt8DTsYbHvRFYI4rLwLKgCeA+QkU14igOt8EXkqw/XYF8FAi/p/sUud6YGmixQg8izd8L3jjvD8Zh7F1W48IHlPC3ueHc+Vh7tC/A2cBm4H8oJ282X2+Fbg1qP7LwMldltHjDnTL2hT0/VvAI13q/DHSOzAW4goqfzGR9hsRTgqxFFuXem8DZyVajEAl4HefBWiMp9gOpV40jimhvuKm+SiYiBQBxwGrgCNUdReAe89z1XxAddBsNa7sUPncPP2dP2SxEJeIXCsiW/F+LX0/tAh6FwvxAfNc88
MyESkIKYBexEhsiEgh3q/Yf4ew3EMSAzGuB+a5z3OB4SIyKoRl92iAYosLcZcURCQT+CvwA1Vt7K1qN2WhXH8b7vwhiZW4VHWJqh4D3Az8PITl9r7S2IjveaBIVcuAFcCfQlhuzyuMjdg6XQwsU9X2EJbb94pjI8YbgdNFZC1wOrATaAth2d2vcOBiiwtxlRREJAVv5z2lqs+54k9FJN9Nzwd2u/IaIPiXoB+o7WXZyUEddXe4+YM7snqdPxwxGtdfgAv6E0832xAT8anq56q6z5U/BhwfXmSxE1uQi4Gn+xtPD9sREzGqaq2qXqiqxwE/c2UNcRRbfDicbVchtvcJXifM4i7lv+bATqF73efJHNgptA3XKdRXu17Q9PfxOoY6O7zOjnT7XyzFBYwPqnMesDqR9huundh9ngu8myixuWnHAv/F3ZQaiVcsxYj3BNIk93khcEe8xdZXPWKgT+GwrbgfO/BreKdqFcA69zobGAW8Cmxx7zlB8/wM7wqBzQRdpYHXXl4DdLj323tYZznwoVvGQ51/bMB0N99e4HOgMkHiegCvM28d8BowOcH2290uvvUuvomJEpubdjuwKIH/7ua79X0EPA4MjcPYuq1HBI8p4b7sMRfGGGMC4qpPwRhjTHRZUjDGGBNgScEYY0yAJQVjjDEBlhSMMcYEWFIwJgQi0u5uRqoUkfUi8iMR6fXvSESKROSSgdpGY8JhScGY0HylqlNVdTLew9POBm7rY54iwJKCiQt2n4IxIRCRZlXNDPo+Fu8O3FygEHgSGOYmX6eqb4vIu0AxsB3veUsPAouAM/Dujl2iqo8MWBDG9MKSgjEh6JoUXNkXwESgCehQ1RYRGQ88rarlInIGcKOqnuvqXwXkqepdIjIUeAtYoKrbBzQYY7ox5HBvgDEJoPPpmSnAQyIyFWgHJvRQ/+tAWdAIW1nAeLwzCWMOK0sKxoTBNR+14z1J8zbgU2AKXn9dS0+zAder6ssDspHGhMA6mo3pJxEZDfwObzQ3xfvFv0tVO4BvA8muahPecI+dXgaudo9tRkQmiMgwjIkBdqZgTGjSRWQdXlNRG17H8v1u2sPAX0VkAd5TWPe68gqgTUTW4z0a+QG8K5LWuEHp9xChsSuMCZd1NBtjjAmw5iNjjDEBlhSMMcYEWFIwxhgTYEnBGGNMgCUFY4wxAZYUjDHGBFhSMMYYE/B/2speOwpDTWgAAAAASUVORK5CYII=",
"text/plain": [
"<Figure size 432x288 with 1 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
}
],
"source": [
"import matplotlib.pyplot as plt\n",
"\n",
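    "import numpy as np\n",
    "\n",
    "# Illustrative addition: summarize mean absolute error for each forecast before plotting\n",
    "# (assumes the y_test / *_y_pred variables from the cells above are still in scope)\n",
    "for name, pred in [('FLAML', flaml_y_pred), ('Prophet', prophet_y_pred),\n",
    "                   ('AutoArima', autoarima_y_pred), ('AutoSarima', autosarima_y_pred)]:\n",
    "    mae = np.mean(np.abs(np.asarray(y_test).ravel() - np.asarray(pred).ravel()))\n",
    "    print(f'{name} MAE: {mae:.3f}')\n",
    "\n",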
    "# Compare each method's forecast against the actual held-out CO2 levels\n",
    "plt.plot(X_test, y_test, label='Actual level')\n",
"plt.plot(X_test, flaml_y_pred, label='FLAML forecast')\n",
"plt.plot(X_test, prophet_y_pred, label='Prophet forecast')\n",
"plt.plot(X_test, autoarima_y_pred, label='AutoArima forecast')\n",
"plt.plot(X_test, autosarima_y_pred, label='AutoSarima forecast')\n",
"plt.xlabel('Date')\n",
"plt.ylabel('CO2 Levels')\n",
"plt.legend()\n",
"plt.show()"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python ('pytorch_forecasting')",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": ""
},
"vscode": {
"interpreter": {
"hash": "25a19fbe0a9132dfb9279d48d161753c6352f8f9478c2e74383d340069b907c3"
}
}
},
"nbformat": 4,
"nbformat_minor": 2
}