Virtual Environments in Python
Virtual environments are isolated Python environments that have their own site-packages directory. In practice, this means that each virtual environment has its own set of third-party dependencies, usually installed from PyPI.
Virtual environments are helpful if you develop multiple Python projects on the same machine. They also come in very handy when you distribute your Python code to others or deploy it to servers, because they let you reproduce the same environment you have on your development machine.
Today, we’ll learn:
- which tools exist to create isolated environments
- which tools help with package management in Python projects
Creating Virtual Environments
There are two common ways to create virtual environments in Python’s ecosystem: virtualenv and venv.
venv
venv is probably the most popular choice for creating virtual environments in Python. Since Python 3.3, venv has been part of the standard library, so it is usually available whenever Python is installed. However, Debian-based Linux distributions require you to install python3-venv separately, since their maintainers decided to unbundle this module from the core Python installation.
To create a virtual environment with venv, you can start by typing
python -m venv venv
This command creates a directory called venv inside your current folder. To use this new virtual environment, you have to activate it with this command:
source venv/bin/activate # Use this command on bash
.\venv\Scripts\activate # On Windows
You can now start your Python interpreter and type
>>> import sys
>>> sys.executable
'/Users/bas/Code/tmp/venv/bin/python'
>>> for path in sys.path:
...     print(repr(path))
...
''
'/usr/local/Cellar/python@3.9/3.9.6/Frameworks/Python.framework/Versions/3.9/lib/python39.zip'
'/usr/local/Cellar/python@3.9/3.9.6/Frameworks/Python.framework/Versions/3.9/lib/python3.9'
'/usr/local/Cellar/python@3.9/3.9.6/Frameworks/Python.framework/Versions/3.9/lib/python3.9/lib-dynload'
'/Users/bas/Code/tmp2/venv/lib/python3.9/site-packages'
>>>
As you can see, the Python interpreter you just started is located inside your virtual environment. Also, the site-packages directory (where your pip-installed packages are located) points to a path inside your virtual environment.
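To leave the virtual environment again, the activate script also defines a deactivate command that restores your previous shell environment. A quick sketch of the round trip on bash:
$ source venv/bin/activate
(venv) $ python -c "import sys; print(sys.prefix)"   # prints a path inside venv/
(venv) $ deactivate
$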
virtualenv
virtualenv works similarly. To use it, you must first install it via pip:
pip install virtualenv
You can then use it with
python -m virtualenv venv
Note that we just changed the name of the module from venv to virtualenv (the first argument). The venv destination folder remains untouched.
virtualenv will create a directory structure similar to the one venv created before. You need to activate your new virtual environment in exactly the same way as before:
source venv/bin/activate # On bash
virtualenv vs. venv
You may wonder what the difference is between the two tools.
First, virtualenv has a longer history: it was already in use in the days of Python 2, whereas official support for virtual environments was not added to Python until version 3.3, via PEP 405.
As a third-party package, virtualenv has the additional advantage of being independent of the system’s Python installation and can thus be upgraded independently.
However, the most important benefit of using virtualenv instead of venv is that it can target Python versions other than the one it runs under. If virtualenv is installed under Python 3.9, you can still use it to create a virtual environment for a Python 3.10 interpreter available on your machine (and vice versa, of course). Not only can you target any supported Python version you want, but you can also do this without root or Administrator permissions, since the environments are created inside your working directory.
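As a quick sketch (assuming a python3.10 interpreter is already installed somewhere on your machine), you can point virtualenv at a specific interpreter with the -p/--python option:
python -m virtualenv -p python3.10 venv310
source venv310/bin/activate
python --version   # should report Python 3.10.x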
On the other hand, the virtual environments created with virtualenv are more complex. This is because its cross-Python support (even Python 2 is still supported) makes the discovery of packages and internals a bit more complicated, and the bootstrapping process needs to be customized. If you are interested in these internals, Bernat Gabor’s EuroPython 2019 talk has these insights covered.
To sum up:
|  | venv | virtualenv |
|---|---|---|
| PROS | Part of the standard library since Python 3.3, so no separate installation is needed | Can target any Python interpreter installed on the machine; can be upgraded independently of the system Python |
| CONS | Can only create environments for the Python version it runs under; Debian-based distributions require the extra python3-venv package | Must be installed separately via pip; the created environments are internally more complex |
Installing Packages
Now that we have a working virtual environment created with either virtualenv or venv, we can start installing the packages we need to fulfill our project’s dependencies.
pip
pip is the de-facto standard for installing packages in Python and has been bundled with Python since version 3.4.
To install a package, you can just type
pip install django
pip takes care of finding the package on PyPI and of managing its dependencies. In our example, we can check which packages have been installed by the command above:
$ pip list
Package Version
---------- -------
asgiref 3.5.0
Django 4.0.2
pip 21.1.3
setuptools 57.0.0
sqlparse 0.4.2
As you can see, besides Django, pip has installed asgiref and sqlparse.
The most common way to share an environment built with pip is by creating a requirements.txt file that looks like this:
django==4.0.2
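Anyone else (or any server) can then recreate the same setup inside a fresh virtual environment with a single command:
pip install -r requirements.txt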
The problem with pip is that it takes care of installing dependencies, but it doesn’t take care of them afterwards. For example, if you install django with pip, it will install sqlparse and asgiref for you. However, if you uninstall django afterward, these two additional packages are kept and not removed. Over time, you can lose track of which packages are really needed for your project and which are just leftovers from previously installed packages. This especially applies when you migrate from one PyPI package to another over the lifetime of your project.
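A quick sketch of the problem, continuing the Django example from above — asgiref and sqlparse stay behind even though nothing needs them anymore:
$ pip uninstall -y django
$ pip list
Package    Version
---------- -------
asgiref    3.5.0
pip        21.1.3
setuptools 57.0.0
sqlparse   0.4.2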
pip also does not differentiate between a development and a production environment. You might want to have access to developer tools such as black or pytest during development, but installing those on a production server is unnecessary at best and harmful at worst.
Also, when two third-party packages have colliding dependencies, pip does not provide a way to resolve these.
Lastly, pip does not take care of managing your requirements.txt for you. Some developers just run pip freeze > requirements.txt whenever a new dependency has been installed. However, this is not advisable, since it includes all sub-dependencies and thus worsens the problems mentioned above.
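To illustrate, pip freeze in our example environment pins the sub-dependencies right next to Django, so they end up in requirements.txt as well:
$ pip freeze
asgiref==3.5.0
Django==4.0.2
sqlparse==0.4.2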
pipenv
pipenv is a tool created by Kenneth Reitz. The most significant difference from pip is that pipenv is designed to keep track of installed packages automatically. For that, pipenv creates two files: Pipfile and Pipfile.lock.
pipenv solves the issues with pip mentioned above:
Dependency management
Installing a package automatically updates the Pipfile and the Pipfile.lock.
When we install django with pipenv, it will install sqlparse and asgiref for us, just like pip would.
However, if we remove django from our requirements, pipenv will remove these additional dependencies as well.
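A minimal sketch of that workflow (depending on your pipenv version, pipenv clean does the final sweep of orphaned packages):
pipenv install django      # adds django to the Pipfile and locks its sub-dependencies in Pipfile.lock
pipenv uninstall django    # removes django from the Pipfile and the environment
pipenv clean               # uninstalls everything no longer referenced in Pipfile.lock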
Production/Development Dependencies
Some dependencies, such as linters or testing tools, are required in the development environment only. That’s why pipenv supports the --dev flag. Packages installed with this flag are not installed when replicating the environment on production systems.
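For example, development tools are installed with the --dev flag, while a production deployment simply leaves them out:
pipenv install --dev black pytest   # recorded under [dev-packages] in the Pipfile
pipenv install --dev                # dev machine: install default and dev packages
pipenv install                      # production: default packages only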
Virtual Environments
pipenv can also create and manage virtual environments for you. In practice, this means that you can rely solely on pipenv to create your project environments, including environments for a specific Python version (pipenv can even install missing Python versions for you if pyenv is available).
With the command
pipenv --python 3.10
you can easily create a brand new virtual environment with a specified version of Python.
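Once the environment exists, you don’t activate it by sourcing a script; pipenv brings its own commands for that:
pipenv shell            # spawn a shell inside the virtual environment
pipenv run python -V    # or run a single command inside it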
poetry
poetry is a new and very ambitious package manager for Python. Its goal is to provide a solution to all virtual environment and package management issues a developer might run into.
Interestingly, poetry – unlike pipenv – is not an “official” package under the umbrella of the Python Packaging Authority. However, it relies on a file called pyproject.toml (instead of Pipfile and Pipfile.lock as pipenv does), and the pyproject.toml specification does have an “official” status per PEP 518.
Also, poetry can be used not only to manage virtual environments and packages but also to build and publish your own Python packages.
pyproject.toml
poetry relies on the pyproject.toml file, which looks like this:
[tool.poetry]
name = "poetry_tutorial_project"
# ...
[tool.poetry.dependencies]
python = "^3.10"
loguru = "*"
psutil = "*"
[tool.poetry.dev-dependencies]
pytest = "*"
[build-system]
requires = ["poetry>=0.12"]
build-backend = "poetry.masonry.api"
[tool.poetry.scripts]
run = "wsgi:main"
This file is the only configuration file poetry uses, and it includes all information about dependencies, build instructions, and testing environments.
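From this one file, poetry can resolve and install everything; a minimal sketch:
poetry install   # creates a virtual environment if needed, resolves dependencies, writes poetry.lock
poetry lock      # re-resolves dependencies and updates poetry.lock without installing
poetry shell     # spawn a shell inside the project's virtual environment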
Virtual Environments and Skeleton creation
The poetry new projectname command creates a sensible project structure for you:
/projectname
├── README.md
├── projectname
│   └── __init__.py
├── pyproject.toml
└── tests
    ├── __init__.py
    └── test_projectname.py
Dependencies can be added with
poetry add django
The --dev flag can be used to add a dependency for the development environment only.
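For instance, with the --dev flag of poetry 1.x (newer releases use dependency groups via --group dev instead):
poetry add --dev pytest   # ends up under [tool.poetry.dev-dependencies]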
Build and publish
poetry can also take care of building and publishing your packages to PyPI. In that sense, it replaces twine.
Here is a good guide on using poetry to package Python projects.
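The two commands involved are short (publishing assumes you have configured credentials for PyPI):
poetry build     # builds an sdist and a wheel into the dist/ directory
poetry publish   # uploads the built artifacts to PyPI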
pip vs. pipenv vs. poetry
|  | pip | pipenv | poetry |
|---|---|---|---|
| PROS | De-facto standard; bundled with Python since 3.4 | Tracks dependencies automatically in Pipfile and Pipfile.lock; removes orphaned sub-dependencies; separates development and production dependencies; manages virtual environments | Uses the standardized pyproject.toml; manages virtual environments and dependencies; can build and publish packages |
| CONS | Leaves orphaned sub-dependencies behind; no development/production separation; requirements.txt has to be maintained by hand | Must be installed separately | Must be installed separately; not an official PyPA project |
Other Tools Worth Mentioning
virtualenvwrapper
virtualenvwrapper is a set of extensions to virtualenv.
The package comes with some handy CLI utilities. The most important ones are:
- mkvirtualenv: A shortcut to create a virtual environment. As opposed to venv and virtualenv, the virtual environments created by virtualenvwrapper are not placed in the working directory but in a central directory inside your $HOME directory. A virtual environment is created with mkvirtualenv projectenv.
- workon: Because virtualenvwrapper creates virtual environments at a central location, activation is done with the workon command. No matter what your current working directory is, you can run workon projectenv, which will automatically pick the right environment from your $HOME directory and activate it (see the short sketch after this list).
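A minimal sketch of the typical virtualenvwrapper workflow (assuming virtualenvwrapper.sh has been sourced in your shell startup file):
pip install virtualenvwrapper
mkvirtualenv projectenv    # creates the environment under $WORKON_HOME (~/.virtualenvs by default)
workon projectenv          # activates it from any working directory
deactivate                 # leaves it again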
pyenv
pyenv is a tool to manage Python versions. Unlike the tools we discussed so far, it helps with neither virtual environment management nor package management. However, pyenv is, of course, compatible with the other tools.
pyenv can be a convenient helper for setting up development workstations. It does not depend on Python itself, so it can be used to set up different Python installations conveniently and without root / Administrator rights.
To install a specific version of Python, type:
pyenv install 3.10
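Once a version is installed, you can select it globally or per project; a short sketch (depending on your pyenv version, you may need the fully qualified version name, e.g. 3.10.4):
pyenv versions     # list all Python versions known to pyenv
pyenv global 3.10  # use this version by default
pyenv local 3.10   # pin the version for the current directory (written to a .python-version file)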
To use pyenv in combination with virtual environment managers, the pyenv team has created plugins such as pyenv-virtualenv and pyenv-virtualenvwrapper.
Deprecated: pyvenv
pyvenv (not to be confused with pyenv) is a script to create virtual environments that used to be shipped with Python 3. It has been deprecated since Python 3.6 and replaced by python -m venv.
CLI
Sometimes you just want to use a CLI tool from PyPI. In this case, you don’t need a virtual environment for development purposes but only to manage your CLI tools. Different CLI tools you use every day have their own dependencies, so installing these tools in the system Python can lead to the known problems again. Thus, it makes sense to create a separate virtual environment for each CLI tool.
To make this process convenient and manageable, there are tools that do just that: Create an isolated environment for CLI tools and then run these CLI tools.
pipx
With pipx, you can install a package that exposes a CLI script. pipx automatically creates a separate virtual environment for each CLI tool and puts a symlink into the .local/bin directory inside your $HOME directory.
To install a CLI tool from PyPI, just type:
$ pipx install pycowsay
installed package pycowsay 0.0.0.1, Python 3.9.6
These apps are now globally available
- pycowsay
done! ✨ 🌟 ✨
When it’s installed, you magically have the CLI tool available in your shell, and it does not pollute your system’s Python packages.
$ pycowsay bas.codes
  ---------
< bas.codes >
  ---------
     \   ^__^
      \  (oo)\_______
         (__)\       )\/\
             ||----w |
             ||     ||
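pipx also keeps track of everything it has installed, so maintenance stays simple:
pipx list                 # show all installed tools and their environments
pipx upgrade pycowsay     # upgrade a single tool
pipx uninstall pycowsay   # remove the tool together with its private environment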
pip-run
pip-run serves the same purpose as pipx. The only difference is that pip-run doesn’t provide a persistent package installation but rather deletes the environment again after the tool has been executed.
$ pip-run -q pycowsay -- -m pycowsay "bas.codes"
  ---------
< bas.codes >
  ---------
     \   ^__^
      \  (oo)\_______
         (__)\       )\/\
             ||----w |
             ||     ||
If you just want to try a package, pip-run is a great tool, as it does the cleanup for you. You can even run an interactive interpreter with a package installed, which makes pip-run useful not only for CLI tools but also for investigating packages from PyPI:
$ python -m pip-run -q boto
>>> import boto