From 2f7ab702951500d776fc5ebc3e4f69126b824974 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Aur=C3=A9lien=20Geron?= Date: Mon, 6 Apr 2020 19:13:12 +1200 Subject: [PATCH] Update notebooks to latest nbformat --- 01_the_machine_learning_landscape.ipynb | 4 +- 02_end_to_end_machine_learning_project.ipynb | 8 ++-- 03_classification.ipynb | 8 ++-- 04_training_linear_models.ipynb | 4 +- 05_support_vector_machines.ipynb | 12 ++---- 06_decision_trees.ipynb | 12 ++---- 07_ensemble_learning_and_random_forests.ipynb | 8 ++-- 08_dimensionality_reduction.ipynb | 20 +++------- 09_unsupervised_learning.ipynb | 16 +++----- 10_neural_nets_with_keras.ipynb | 12 ++---- 11_training_deep_neural_networks.ipynb | 8 ++-- ..._models_and_training_with_tensorflow.ipynb | 18 +++------ 13_loading_and_preprocessing_data.ipynb | 6 +-- 14_deep_computer_vision_with_cnns.ipynb | 14 ++----- ...essing_sequences_using_rnns_and_cnns.ipynb | 12 ++---- 16_nlp_with_rnns_and_attention.ipynb | 2 +- extra_gradient_descent_comparison.ipynb | 4 +- index.ipynb | 12 ++---- math_linear_algebra.ipynb | 35 ++++++++--------- tools_matplotlib.ipynb | 21 +++++----- tools_numpy.ipynb | 38 ++++++++----------- tools_pandas.ipynb | 32 +++++----------- 22 files changed, 112 insertions(+), 194 deletions(-) diff --git a/01_the_machine_learning_landscape.ipynb b/01_the_machine_learning_landscape.ipynb index 710f2cf..24f8aaa 100644 --- a/01_the_machine_learning_landscape.ipynb +++ b/01_the_machine_learning_landscape.ipynb @@ -785,7 +785,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.7.3" + "version": "3.7.6" }, "nav_menu": {}, "toc": { @@ -806,5 +806,5 @@ } }, "nbformat": 4, - "nbformat_minor": 1 + "nbformat_minor": 4 } diff --git a/02_end_to_end_machine_learning_project.ipynb b/02_end_to_end_machine_learning_project.ipynb index 990fc3b..d983d52 100644 --- a/02_end_to_end_machine_learning_project.ipynb +++ b/02_end_to_end_machine_learning_project.ipynb @@ -1664,9 +1664,7 @@ }, { "cell_type": "markdown", - "metadata": { - "collapsed": true - }, + "metadata": {}, "source": [ "Question: Try a Support Vector Machine regressor (`sklearn.svm.SVR`), with various hyperparameters such as `kernel=\"linear\"` (with various values for the `C` hyperparameter) or `kernel=\"rbf\"` (with various values for the `C` and `gamma` hyperparameters). Don't worry about what these hyperparameters mean for now. How does the best `SVR` predictor perform?" 
] @@ -2170,7 +2168,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.7.3" + "version": "3.7.6" }, "nav_menu": { "height": "279px", @@ -2188,5 +2186,5 @@ } }, "nbformat": 4, - "nbformat_minor": 1 + "nbformat_minor": 4 } diff --git a/03_classification.ipynb b/03_classification.ipynb index 4c82d3f..e2314b1 100644 --- a/03_classification.ipynb +++ b/03_classification.ipynb @@ -1163,9 +1163,7 @@ }, { "cell_type": "markdown", - "metadata": { - "collapsed": true - }, + "metadata": {}, "source": [ "# Exercise solutions" ] @@ -2553,7 +2551,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.7.3" + "version": "3.7.6" }, "nav_menu": {}, "toc": { @@ -2567,5 +2565,5 @@ } }, "nbformat": 4, - "nbformat_minor": 1 + "nbformat_minor": 4 } diff --git a/04_training_linear_models.ipynb b/04_training_linear_models.ipynb index 58bf27b..1b113a5 100644 --- a/04_training_linear_models.ipynb +++ b/04_training_linear_models.ipynb @@ -1797,7 +1797,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.7.3" + "version": "3.7.6" }, "nav_menu": {}, "toc": { @@ -1811,5 +1811,5 @@ } }, "nbformat": 4, - "nbformat_minor": 1 + "nbformat_minor": 4 } diff --git a/05_support_vector_machines.ipynb b/05_support_vector_machines.ipynb index b3a8af5..2fa5543 100644 --- a/05_support_vector_machines.ipynb +++ b/05_support_vector_machines.ipynb @@ -403,9 +403,7 @@ }, { "cell_type": "markdown", - "metadata": { - "collapsed": true - }, + "metadata": {}, "source": [ "# Non-linear classification" ] @@ -1241,9 +1239,7 @@ }, { "cell_type": "markdown", - "metadata": { - "collapsed": true - }, + "metadata": {}, "source": [ "See appendix A." ] @@ -1834,7 +1830,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.7.3" + "version": "3.7.6" }, "nav_menu": {}, "toc": { @@ -1848,5 +1844,5 @@ } }, "nbformat": 4, - "nbformat_minor": 1 + "nbformat_minor": 4 } diff --git a/06_decision_trees.ipynb b/06_decision_trees.ipynb index 17688bf..a8237ac 100644 --- a/06_decision_trees.ipynb +++ b/06_decision_trees.ipynb @@ -465,9 +465,7 @@ }, { "cell_type": "markdown", - "metadata": { - "collapsed": true - }, + "metadata": {}, "source": [ "# Exercise solutions" ] @@ -488,9 +486,7 @@ }, { "cell_type": "markdown", - "metadata": { - "collapsed": true - }, + "metadata": {}, "source": [ "## 7." 
] @@ -733,7 +729,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.7.3" + "version": "3.7.6" }, "nav_menu": { "height": "309px", @@ -750,5 +746,5 @@ } }, "nbformat": 4, - "nbformat_minor": 1 + "nbformat_minor": 4 } diff --git a/07_ensemble_learning_and_random_forests.ipynb b/07_ensemble_learning_and_random_forests.ipynb index 005d9c4..f4f135e 100644 --- a/07_ensemble_learning_and_random_forests.ipynb +++ b/07_ensemble_learning_and_random_forests.ipynb @@ -924,9 +924,7 @@ }, { "cell_type": "markdown", - "metadata": { - "collapsed": true - }, + "metadata": {}, "source": [ "# Exercise solutions" ] @@ -1394,7 +1392,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.7.3" + "version": "3.7.6" }, "nav_menu": { "height": "252px", @@ -1411,5 +1409,5 @@ } }, "nbformat": 4, - "nbformat_minor": 1 + "nbformat_minor": 4 } diff --git a/08_dimensionality_reduction.ipynb b/08_dimensionality_reduction.ipynb index aef9dc8..1be0a56 100644 --- a/08_dimensionality_reduction.ipynb +++ b/08_dimensionality_reduction.ipynb @@ -212,9 +212,7 @@ }, { "cell_type": "markdown", - "metadata": { - "collapsed": true - }, + "metadata": {}, "source": [ "Notice that running PCA multiple times on slightly different datasets may result in different results. In general the only difference is that some axes may be flipped. In this example, PCA using Scikit-Learn gives the same projection as the one given by the SVD approach, except both axes are flipped:" ] @@ -1481,9 +1479,7 @@ }, { "cell_type": "markdown", - "metadata": { - "collapsed": true - }, + "metadata": {}, "source": [ "# Exercise solutions" ] @@ -1504,9 +1500,7 @@ }, { "cell_type": "markdown", - "metadata": { - "collapsed": true - }, + "metadata": {}, "source": [ "## 9." ] @@ -1917,9 +1911,7 @@ }, { "cell_type": "markdown", - "metadata": { - "collapsed": true - }, + "metadata": {}, "source": [ "*Exercise: Alternatively, you can write colored digits at the location of each instance, or even plot scaled-down versions of the digit images themselves (if you plot all digits, the visualization will be too cluttered, so you should either draw a random sample or plot an instance only if no other instance has already been plotted at a close distance). 
You should get a nice visualization with well-separated clusters of digits.*" ] @@ -2264,9 +2256,9 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.7.3" + "version": "3.7.6" } }, "nbformat": 4, - "nbformat_minor": 2 + "nbformat_minor": 4 } diff --git a/09_unsupervised_learning.ipynb b/09_unsupervised_learning.ipynb index 05ef094..fc9197d 100644 --- a/09_unsupervised_learning.ipynb +++ b/09_unsupervised_learning.ipynb @@ -975,9 +975,7 @@ { "cell_type": "code", "execution_count": 47, - "metadata": { - "scrolled": false - }, + "metadata": {}, "outputs": [], "source": [ "minibatch_kmeans = MiniBatchKMeans(n_clusters=10, batch_size=10, random_state=42)\n", @@ -1418,9 +1416,7 @@ { "cell_type": "code", "execution_count": 71, - "metadata": { - "scrolled": false - }, + "metadata": {}, "outputs": [], "source": [ "plt.figure(figsize=(10, 3.2))\n", @@ -1706,9 +1702,7 @@ { "cell_type": "code", "execution_count": 91, - "metadata": { - "scrolled": false - }, + "metadata": {}, "outputs": [], "source": [ "grid_clf.score(X_test, y_test)" @@ -3792,9 +3786,9 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.7.3" + "version": "3.7.6" } }, "nbformat": 4, - "nbformat_minor": 2 + "nbformat_minor": 4 } diff --git a/10_neural_nets_with_keras.ipynb b/10_neural_nets_with_keras.ipynb index f18bf46..caeac36 100644 --- a/10_neural_nets_with_keras.ipynb +++ b/10_neural_nets_with_keras.ipynb @@ -1597,9 +1597,7 @@ }, { "cell_type": "markdown", - "metadata": { - "collapsed": true - }, + "metadata": {}, "source": [ "# Exercise solutions" ] @@ -1613,9 +1611,7 @@ }, { "cell_type": "markdown", - "metadata": { - "collapsed": true - }, + "metadata": {}, "source": [ "See appendix A." ] @@ -2015,7 +2011,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.7.3" + "version": "3.7.6" }, "nav_menu": { "height": "264px", @@ -2032,5 +2028,5 @@ } }, "nbformat": 4, - "nbformat_minor": 1 + "nbformat_minor": 4 } diff --git a/11_training_deep_neural_networks.ipynb b/11_training_deep_neural_networks.ipynb index 4c4df8c..9621268 100644 --- a/11_training_deep_neural_networks.ipynb +++ b/11_training_deep_neural_networks.ipynb @@ -2103,9 +2103,7 @@ }, { "cell_type": "markdown", - "metadata": { - "collapsed": true - }, + "metadata": {}, "source": [ "# Exercises" ] @@ -2684,7 +2682,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.7.3" + "version": "3.7.6" }, "nav_menu": { "height": "360px", @@ -2701,5 +2699,5 @@ } }, "nbformat": 4, - "nbformat_minor": 1 + "nbformat_minor": 4 } diff --git a/12_custom_models_and_training_with_tensorflow.ipynb b/12_custom_models_and_training_with_tensorflow.ipynb index 879879d..b8b73e5 100644 --- a/12_custom_models_and_training_with_tensorflow.ipynb +++ b/12_custom_models_and_training_with_tensorflow.ipynb @@ -1012,9 +1012,7 @@ { "cell_type": "code", "execution_count": 78, - "metadata": { - "scrolled": false - }, + "metadata": {}, "outputs": [], "source": [ "model.fit(X_train_scaled, y_train, epochs=2,\n", @@ -1158,9 +1156,7 @@ { "cell_type": "code", "execution_count": 90, - "metadata": { - "scrolled": false - }, + "metadata": {}, "outputs": [], "source": [ "model.fit(X_train_scaled, y_train, epochs=2,\n", @@ -1856,9 +1852,7 @@ { "cell_type": "code", "execution_count": 142, - "metadata": { - "scrolled": false - }, + "metadata": {}, "outputs": [], "source": [ "model.compile(loss=\"mse\", optimizer=\"nadam\")\n", @@ -3527,9 
+3521,7 @@ { "cell_type": "code", "execution_count": 261, - "metadata": { - "scrolled": false - }, + "metadata": {}, "outputs": [], "source": [ "model = keras.models.Sequential([keras.layers.Dense(1, input_shape=[8])])\n", @@ -3914,5 +3906,5 @@ } }, "nbformat": 4, - "nbformat_minor": 2 + "nbformat_minor": 4 } diff --git a/13_loading_and_preprocessing_data.ipynb b/13_loading_and_preprocessing_data.ipynb index 7f252ce..144b216 100644 --- a/13_loading_and_preprocessing_data.ipynb +++ b/13_loading_and_preprocessing_data.ipynb @@ -466,9 +466,7 @@ { "cell_type": "code", "execution_count": 26, - "metadata": { - "scrolled": false - }, + "metadata": {}, "outputs": [], "source": [ "n_inputs = 8 # X_train.shape[-1]\n", @@ -2785,5 +2783,5 @@ } }, "nbformat": 4, - "nbformat_minor": 1 + "nbformat_minor": 4 } diff --git a/14_deep_computer_vision_with_cnns.ipynb b/14_deep_computer_vision_with_cnns.ipynb index e22bde1..860ce3a 100644 --- a/14_deep_computer_vision_with_cnns.ipynb +++ b/14_deep_computer_vision_with_cnns.ipynb @@ -1230,9 +1230,7 @@ }, { "cell_type": "markdown", - "metadata": { - "collapsed": true - }, + "metadata": {}, "source": [ "# Exercises" ] @@ -1312,9 +1310,7 @@ }, { "cell_type": "markdown", - "metadata": { - "collapsed": true - }, + "metadata": {}, "source": [ "## 10. Use transfer learning for large image classification" ] @@ -1348,9 +1344,7 @@ }, { "cell_type": "markdown", - "metadata": { - "collapsed": true - }, + "metadata": {}, "source": [ "Simply open the Colab and follow its instructions." ] @@ -1386,5 +1380,5 @@ } }, "nbformat": 4, - "nbformat_minor": 1 + "nbformat_minor": 4 } diff --git a/15_processing_sequences_using_rnns_and_cnns.ipynb b/15_processing_sequences_using_rnns_and_cnns.ipynb index 34f6def..e6b228c 100644 --- a/15_processing_sequences_using_rnns_and_cnns.ipynb +++ b/15_processing_sequences_using_rnns_and_cnns.ipynb @@ -972,9 +972,7 @@ { "cell_type": "code", "execution_count": 52, - "metadata": { - "scrolled": false - }, + "metadata": {}, "outputs": [], "source": [ "np.random.seed(42)\n", @@ -1213,9 +1211,7 @@ }, { "cell_type": "markdown", - "metadata": { - "collapsed": true - }, + "metadata": {}, "source": [ "# Exercise solutions" ] @@ -1974,9 +1970,7 @@ { "cell_type": "code", "execution_count": 100, - "metadata": { - "scrolled": false - }, + "metadata": {}, "outputs": [], "source": [ "new_chorale_v2_hot = generate_chorale_v2(model, seed_chords, 56, temperature=1.5)\n", diff --git a/16_nlp_with_rnns_and_attention.ipynb b/16_nlp_with_rnns_and_attention.ipynb index b90b5b8..8da8082 100644 --- a/16_nlp_with_rnns_and_attention.ipynb +++ b/16_nlp_with_rnns_and_attention.ipynb @@ -1248,5 +1248,5 @@ } }, "nbformat": 4, - "nbformat_minor": 2 + "nbformat_minor": 4 } diff --git a/extra_gradient_descent_comparison.ipynb b/extra_gradient_descent_comparison.ipynb index ee5ed6e..69643b4 100644 --- a/extra_gradient_descent_comparison.ipynb +++ b/extra_gradient_descent_comparison.ipynb @@ -257,9 +257,9 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.6.6" + "version": "3.7.6" } }, "nbformat": 4, - "nbformat_minor": 2 + "nbformat_minor": 4 } diff --git a/index.ipynb b/index.ipynb index 1e8c655..8ef75a7 100644 --- a/index.ipynb +++ b/index.ipynb @@ -48,9 +48,7 @@ }, { "cell_type": "markdown", - "metadata": { - "collapsed": true - }, + "metadata": {}, "source": [ "## Prerequisites\n", "### To understand\n", @@ -65,9 +63,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, 
"outputs": [], "source": [] } @@ -88,7 +84,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.6.8" + "version": "3.7.6" }, "nav_menu": {}, "toc": { @@ -102,5 +98,5 @@ } }, "nbformat": 4, - "nbformat_minor": 1 + "nbformat_minor": 4 } diff --git a/math_linear_algebra.ipynb b/math_linear_algebra.ipynb index d9e968b..501a176 100644 --- a/math_linear_algebra.ipynb +++ b/math_linear_algebra.ipynb @@ -152,7 +152,10 @@ "cell_type": "code", "execution_count": 7, "metadata": { - "collapsed": true + "collapsed": true, + "jupyter": { + "outputs_hidden": true + } }, "outputs": [], "source": [ @@ -190,9 +193,7 @@ { "cell_type": "code", "execution_count": 9, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "def plot_vector2d(vector2d, origin=[0, 0], **options):\n", @@ -232,9 +233,7 @@ { "cell_type": "code", "execution_count": 11, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "a = np.array([1, 2, 8])\n", @@ -1671,9 +1670,7 @@ }, { "cell_type": "markdown", - "metadata": { - "collapsed": true - }, + "metadata": {}, "source": [ "## Converting 1D arrays to 2D arrays in NumPy\n", "As we mentionned earlier, in NumPy (as opposed to Matlab, for example), 1D really means 1D: there is no such thing as a vertical 1D-array or a horizontal 1D-array. So you should not be surprised to see that transposing a 1D array does not do anything:" @@ -2001,7 +1998,10 @@ "cell_type": "code", "execution_count": 90, "metadata": { - "collapsed": true + "collapsed": true, + "jupyter": { + "outputs_hidden": true + } }, "outputs": [], "source": [ @@ -2739,7 +2739,10 @@ "cell_type": "code", "execution_count": 122, "metadata": { - "collapsed": true + "collapsed": true, + "jupyter": { + "outputs_hidden": true + } }, "outputs": [], "source": [ @@ -3039,9 +3042,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [] } @@ -3062,7 +3063,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.6.2" + "version": "3.7.6" }, "toc": { "toc_cell": false, @@ -3072,5 +3073,5 @@ } }, "nbformat": 4, - "nbformat_minor": 1 + "nbformat_minor": 4 } diff --git a/tools_matplotlib.ipynb b/tools_matplotlib.ipynb index 992f0cd..5d99f4e 100644 --- a/tools_matplotlib.ipynb +++ b/tools_matplotlib.ipynb @@ -554,9 +554,7 @@ { "cell_type": "code", "execution_count": 24, - "metadata": { - "scrolled": false - }, + "metadata": {}, "outputs": [], "source": [ "plt.plot(x, x**2, px, py, \"ro\")\n", @@ -1022,9 +1020,7 @@ { "cell_type": "code", "execution_count": 42, - "metadata": { - "scrolled": false - }, + "metadata": {}, "outputs": [], "source": [ "plt.imshow(img, cmap=\"hot\")\n", @@ -1065,9 +1061,7 @@ { "cell_type": "code", "execution_count": 44, - "metadata": { - "scrolled": false - }, + "metadata": {}, "outputs": [], "source": [ "plt.imshow(img, interpolation=\"nearest\")\n", @@ -1086,7 +1080,10 @@ "cell_type": "code", "execution_count": 45, "metadata": { - "collapsed": true + "collapsed": true, + "jupyter": { + "outputs_hidden": true + } }, "outputs": [], "source": [ @@ -1175,7 +1172,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.7.4" + "version": "3.7.6" }, "toc": { "toc_cell": true, @@ -1186,5 +1183,5 @@ } }, "nbformat": 4, - "nbformat_minor": 1 + "nbformat_minor": 4 } diff --git a/tools_numpy.ipynb b/tools_numpy.ipynb index 06b00f2..d64f1e0 100644 --- 
a/tools_numpy.ipynb +++ b/tools_numpy.ipynb @@ -21,9 +21,7 @@ { "cell_type": "code", "execution_count": 2, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "import numpy as np" @@ -357,9 +355,7 @@ { "cell_type": "code", "execution_count": 22, - "metadata": { - "scrolled": false - }, + "metadata": {}, "outputs": [], "source": [ "%matplotlib inline\n", @@ -500,9 +496,7 @@ { "cell_type": "code", "execution_count": 29, - "metadata": { - "scrolled": false - }, + "metadata": {}, "outputs": [], "source": [ "f = np.array([[1,2],[1000, 2000]], dtype=np.int32)\n", @@ -663,9 +657,7 @@ { "cell_type": "code", "execution_count": 38, - "metadata": { - "scrolled": false - }, + "metadata": {}, "outputs": [], "source": [ "a = np.array([14, 23, 32, 41])\n", @@ -1243,9 +1235,7 @@ { "cell_type": "code", "execution_count": 74, - "metadata": { - "scrolled": false - }, + "metadata": {}, "outputs": [], "source": [ "try:\n", @@ -1542,9 +1532,7 @@ { "cell_type": "code", "execution_count": 96, - "metadata": { - "scrolled": false - }, + "metadata": {}, "outputs": [], "source": [ "c[..., 3] # all matrices, all rows, column 3. This is equivalent to c[:, :, 3]" @@ -2700,7 +2688,10 @@ "cell_type": "code", "execution_count": 175, "metadata": { - "collapsed": true + "collapsed": true, + "jupyter": { + "outputs_hidden": true + } }, "outputs": [], "source": [ @@ -2746,7 +2737,10 @@ "cell_type": "code", "execution_count": 178, "metadata": { - "collapsed": true + "collapsed": true, + "jupyter": { + "outputs_hidden": true + } }, "outputs": [], "source": [ @@ -2839,7 +2833,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.7.4" + "version": "3.7.6" }, "toc": { "toc_cell": false, @@ -2857,5 +2851,5 @@ } }, "nbformat": 4, - "nbformat_minor": 1 + "nbformat_minor": 4 } diff --git a/tools_pandas.ipynb b/tools_pandas.ipynb index 8c11356..aab889d 100644 --- a/tools_pandas.ipynb +++ b/tools_pandas.ipynb @@ -1756,9 +1756,7 @@ { "cell_type": "code", "execution_count": 97, - "metadata": { - "scrolled": false - }, + "metadata": {}, "outputs": [], "source": [ "grades >= 5" @@ -1978,9 +1976,7 @@ { "cell_type": "code", "execution_count": 110, - "metadata": { - "scrolled": false - }, + "metadata": {}, "outputs": [], "source": [ "bonus_points.interpolate(axis=1)" @@ -2242,9 +2238,7 @@ { "cell_type": "code", "execution_count": 125, - "metadata": { - "scrolled": false - }, + "metadata": {}, "outputs": [], "source": [ "much_data = np.fromfunction(lambda x,y: (x+y*y)%17*11, (10000, 26))\n", @@ -2264,9 +2258,7 @@ { "cell_type": "code", "execution_count": 126, - "metadata": { - "scrolled": false - }, + "metadata": {}, "outputs": [], "source": [ "large_df.head()" @@ -2298,9 +2290,7 @@ { "cell_type": "code", "execution_count": 128, - "metadata": { - "scrolled": false - }, + "metadata": {}, "outputs": [], "source": [ "large_df.info()" @@ -2322,9 +2312,7 @@ { "cell_type": "code", "execution_count": 129, - "metadata": { - "scrolled": false - }, + "metadata": {}, "outputs": [], "source": [ "large_df.describe()" @@ -2775,9 +2763,7 @@ }, { "cell_type": "markdown", - "metadata": { - "collapsed": true - }, + "metadata": {}, "source": [ "# What next?\n", "As you probably noticed by now, pandas is quite a large library with *many* features. Although we went through the most important features, there is still a lot to discover. Probably the best way to learn more is to get your hands dirty with some real-life data. 
It is also a good idea to go through pandas' excellent [documentation](http://pandas.pydata.org/pandas-docs/stable/index.html), in particular the [Cookbook](http://pandas.pydata.org/pandas-docs/stable/cookbook.html)." @@ -2807,7 +2793,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.7.4" + "version": "3.7.6" }, "toc": { "toc_cell": false, @@ -2818,5 +2804,5 @@ } }, "nbformat": 4, - "nbformat_minor": 1 + "nbformat_minor": 4 }
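Every hunk in this patch is the same mechanical metadata cleanup: deprecated cell-level "collapsed" / "scrolled": false flags are dropped, cells whose outputs were hidden get that recorded the nbformat-4.x way under "jupyter": {"outputs_hidden": true}, language_info.version is refreshed to the Python 3.7.6 the notebooks were re-saved with, and nbformat_minor is bumped to 4. A change set like this is most likely produced simply by re-saving each notebook in a recent Jupyter; the sketch below (not part of the patch — the glob pattern and the decision logic are illustrative assumptions inferred from the hunks) shows how the same normalization could be scripted with the nbformat library.

# Hypothetical cleanup script, not taken from this commit: approximates the
# metadata changes shown in the hunks above using the nbformat library.
from pathlib import Path

import nbformat

for path in sorted(Path(".").glob("*.ipynb")):
    nb = nbformat.read(str(path), as_version=4)
    for cell in nb.cells:
        meta = cell.metadata
        if cell.cell_type == "markdown":
            # "collapsed" is meaningless on markdown cells; the patch drops it.
            meta.pop("collapsed", None)
        else:
            # "scrolled": false is the default, so the patch removes it.
            if meta.get("scrolled") is False:
                meta.pop("scrolled")
            # Hidden outputs are recorded under "jupyter" in nbformat 4.x.
            if meta.get("collapsed"):
                meta.setdefault("jupyter", {})["outputs_hidden"] = True
    nb.nbformat_minor = 4          # bump 1/2 -> 4, matching the patch
    nbformat.write(nb, str(path))  # rewrite the notebook in place

Run from the repository root, this rewrites the .ipynb files in place, and git diff should then show roughly the hunks above — except the language_info.version strings, which Jupyter itself refreshes from the running kernel when a notebook is re-saved.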