Rolled-back graphviz

I gave up on enabling decision-tree visualization for the moment, as the graphviz integration stopped working. I can't run `dot` in the image any more; it fails with:

    Could not load "/opt/conda/lib/graphviz/libgvplugin_pango.so.6" - file not found

As far as I was able to diagnose (with `ldd`), that's because `libiconv.so.2` is not available. But according to what I see in the mailing lists, that library should not be needed (on Debian at least)...
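For the record, the check that pointed at the missing library went roughly like this. A sketch only: the exact invocation is reconstructed from the description above, not copied from a session, and it assumes you are inside the running `handson-ml` container (e.g. via `make exec`).

```bash
# Inspect the plugin that dot fails to load and look for unresolved libraries;
# the path is the one from the error message above.
ldd /opt/conda/lib/graphviz/libgvplugin_pango.so.6 | grep 'not found'
# Symptom observed here:
#   libiconv.so.2 => not found
```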
parent 4fa5beb93a
commit fc355ca6b9
@@ -41,26 +41,18 @@ RUN mkdir -p ${HOME}/.jupyter && \
     >> ${HOME}/.jupyter/jupyter_notebook_config.py
 
-# INFO: Below - work in progress, nbdime not totally integrated, still:
-# 1. enables diffing notebooks via nbdiff after connecting to container by "make exec" (docker exec)
+# INFO: Below - work in progress, nbdime not totally integrated, still it enables diffing
+# notebooks via nbdiff after connecting to container by "make exec" (docker exec)
 # Use:
 # nbd NOTEBOOK_NAME.ipynb
 # to get nbdiff between checkpointed version and current version of the given notebook
-# 2. allows decision tree visualization in notebook
-# Use:
-# from sklearn import tree
-# from graphviz import Source
-# Source(tree.export_graphviz(tree_clf, out_file=None, feature_names=iris.feature_names[2:]))
-
 USER root
 WORKDIR /
 
 RUN conda install -y -c conda-forge nbdime
-RUN conda install -y -c conda-forge python-graphviz
-
 USER ${username}
 WORKDIR ${HOME}/handson-ml
 
 
 COPY docker/bashrc /tmp/bashrc
 RUN cat /tmp/bashrc >> ${HOME}/.bashrc
 RUN sudo rm -rf /tmp/bashrc
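A quick way to sanity-check the rollback after rebuilding; a sketch that relies on the `make`/`docker-compose` targets described in the README below, not on anything added by this commit.

```bash
# Rebuild the image without the python-graphviz line and start the server,
# then (from a second terminal, while the server is running) confirm that
# nbdime is still installed and the graphviz packages are no longer present.
make build        # or: docker-compose build
make run          # or: docker-compose up
docker-compose exec handson-ml conda list nbdime     # should list the package
docker-compose exec handson-ml conda list graphviz   # expected to be absent after the rollback
```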
@@ -1,37 +1,38 @@
 
 # Hands-on Machine Learning in Docker :-)
 
-This is the Docker configuration which allows you to run and tweak the book's notebooks without installing any dependencies on your machine!
+This is the Docker configuration which allows you to run and tweak the book's notebooks without installing any dependencies on your machine!<br/>
 OK, any except `docker`. With `docker-compose`. Well, you may also want `make` (but it is only used as a thin layer to call a few simple `docker-compose` commands).
 
 ## Prerequisites
 
 As stated, the two things you need are `docker` and `docker-compose`.
 
 Follow the instructions on [Install Docker](https://docs.docker.com/engine/installation/) and [Install Docker Compose](https://docs.docker.com/compose/install/) for your environment if you haven't got `docker` already.
 
 Some general knowledge about `docker` infrastructure might be useful (that's an interesting topic on its own) but is not strictly *required* to just run the notebooks.
 
 ## Usage
 
 ### Prepare the image (once)
 
 Switch to the `docker` directory here and run `make build` (or `docker-compose build`) to build your docker image. That may take some time, but it is only required once. Or perhaps a few times, after you tweak something in the `Dockerfile`.
 
 After the process is finished you have a `handson-ml` image that will be the base for your experiments. You can confirm that by checking the output of the `docker images` command.
 
 ### Run the notebooks
 
-Run `make run` (or just `docker-compose up`) to start the jupyter server inside the container (also named `handson-ml`, same as the image). Just point your browser to <http://localhost:8888> or the URL printed on the screen and you're ready to play with the book's code!
+Run `make run` (or just `docker-compose up`) to start the jupyter server inside the container (also named `handson-ml`, same as the image). Just point your browser to <http://localhost:8888> (empty password) or the URL printed on the screen and you're ready to play with the book's code!
 
 The server runs in the directory containing the notebooks, and the changes you make from the browser will be persisted there.
 
 You can close the server just by pressing `Ctrl-C` in the terminal window.
 
 ### Run additional commands in the container
 
 Run `make exec` (or `docker-compose exec handson-ml bash`) while the server is running to open an additional `bash` shell inside the `handson-ml` container. Now you're inside the environment prepared within the image.
 
-One of the useful things you can do there is compare versions of the notebooks using the `nbdiff` command if you haven't got `nbdime` installed locally (it is **way** better than plain `diff` for notebooks). See [Tools for diffing and merging of Jupyter notebooks]<https://github.com/jupyter/nbdime> for more details.
+One of the useful things you can do there is compare versions of the notebooks using the `nbdiff` command if you haven't got `nbdime` installed locally (it is **way** better than plain `diff` for notebooks). See [Tools for diffing and merging of Jupyter notebooks](https://github.com/jupyter/nbdime) for more details.
 
-You may also try the `nbd NOTEBOOK_NAME.ipynb` command (custom, defined in the Dockerfile) to compare one of your notebooks with its `checkpointed` version. To be precise, the output will tell you "what modifications should be re-played on the *manually saved* version of the notebook (located in the `.ipynb_checkpoints` subdirectory) to update it to the *current*, i.e. *auto-saved*, version (given as the command's argument - located in the working directory)".
+You may also try the `nbd NOTEBOOK_NAME.ipynb` command (custom, see the bashrc file) to compare one of your notebooks with its `checkpointed` version.<br/>
+To be precise, the output will tell you *what modifications should be re-played on the **manually saved** version of the notebook (located in the `.ipynb_checkpoints` subdirectory) to update it to the **current**, i.e. **auto-saved**, version (given as the command's argument - located in the working directory)*.
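As an illustration of that last paragraph, the comparison the `nbd` helper performs can also be spelled out with plain `nbdiff`. A sketch that assumes Jupyter's default checkpoint layout (`.ipynb_checkpoints/NAME-checkpoint.ipynb`); the actual definition of `nbd` lives in the bashrc file and is not shown in this diff.

```bash
# Inside the container (make exec): compare the manually saved checkpoint with
# the current, auto-saved copy of a notebook.
nbdiff ".ipynb_checkpoints/NOTEBOOK_NAME-checkpoint.ipynb" "NOTEBOOK_NAME.ipynb"
# The custom helper described above performs the same comparison from a single argument:
nbd NOTEBOOK_NAME.ipynb
```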