Earlier this year I added Jupyter notebooks to my dev toolbelt. Below are my thoughts on using them, and how to get the most out of them.
The main benefit of notebooks shows up in the early, exploratory stages of development: trying out third-party integrations, evaluating packages, or prototyping initial solutions.
Jupyter notebooks are interactive environments where you can run code snippets, inspect the output, visualise the data, and later share the notebooks with collaborators. Jupyter is open source and easy to run with Docker.
Google Colab provides the same functionality, plus access to GPUs and TPUs for machine learning.
Since a notebook records your inputs and outputs, and you can add text annotations, you can use it as a reference for future development.
For example, say you want to try out a package that parses a news article and extracts its main text. Simply install it with `!pip install news-parser-package`, then import it and use it however you like.
This lets you explore the package's API, check the available methods, and inspect the outputs. Once you're done, add a text annotation and stop the server.
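A minimal sketch of what such an exploratory cell might look like. Since `news-parser-package` above is a hypothetical name, this version uses only the standard library's `html.parser` as a toy stand-in for a real article parser:

```python
from html.parser import HTMLParser

class ArticleTextExtractor(HTMLParser):
    """Collect the text inside <p> tags -- a toy stand-in for a real article parser."""
    def __init__(self):
        super().__init__()
        self.in_p = False
        self.paragraphs = []

    def handle_starttag(self, tag, attrs):
        if tag == "p":
            self.in_p = True
            self.paragraphs.append("")

    def handle_endtag(self, tag):
        if tag == "p":
            self.in_p = False

    def handle_data(self, data):
        if self.in_p:
            self.paragraphs[-1] += data

html = "<html><body><h1>Headline</h1><p>First paragraph.</p><p>Second one.</p></body></html>"
parser = ArticleTextExtractor()
parser.feed(html)
print(parser.paragraphs)  # the cell's output is saved alongside the code
# → ['First paragraph.', 'Second one.']
```

The point isn't the parser itself but the workflow: you poke at the API in one cell, see the result immediately below it, and both stay in the notebook.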
The same goes for trying out third-party API endpoints: all requests, along with their responses, stay in the notebook.
The next time you revisit it, everything will be there: the code you ran and all of its output.
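As a sketch of that kind of cell, the snippet below calls an endpoint and prints the JSON response, which the notebook then keeps. To stay self-contained it spins up a stand-in local server; in a real notebook you would just call the actual third-party URL:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# Stand-in for a third-party API: a tiny local server that returns JSON.
class FakeAPI(BaseHTTPRequestHandler):
    def do_GET(self):
        body = json.dumps({"status": "ok", "items": [1, 2, 3]}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence request logging to keep cell output clean
        pass

server = HTTPServer(("127.0.0.1", 0), FakeAPI)  # port 0 = pick any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

url = f"http://127.0.0.1:{server.server_port}/v1/items"
with urlopen(url) as resp:
    payload = json.load(resp)

print(payload)  # the response body is preserved in the notebook output
# → {'status': 'ok', 'items': [1, 2, 3]}
server.shutdown()
```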
Plus, Colab lets you import and export notebooks to GitHub, or store them on Google Drive.
This is especially useful for projects you don't work on daily, such as side projects or hackathons.
If you only have three hours to work on a project every week or two, you will forget most of the context.
Notebooks let you recover that context quickly, since everything can be logged. You can even use one as a dev log with dates and comments, which you can later turn into a report or a blog post.