Modern Python Package Management – Part 2
In the first part of our series, we laid the groundwork for understanding Python’s packaging ecosystem. Now, we venture deeper into the modern landscape, where chaos gives way to control and ambiguity is replaced by deterministic precision. In this second part we cover dependency management, virtual environments, and publishing best practices with modern tools like Poetry, Pipenv, and pip-tools, along with practical techniques that can transform your development workflow from a source of frustration into a streamlined, reproducible, and collaborative process. We’ll dissect the tools that have emerged to solve the classic “it works on my machine” problem, explore their philosophies, and provide practical guidance on integrating them into your projects. Whether you’re building a simple script, a complex web application, or a distributable library, mastering these modern tools is no longer a luxury; it’s a necessity for professional Python development.
The Evolution from requirements.txt to Modern Dependency Management
For years, the combination of pip and requirements.txt was the de facto standard for managing Python dependencies. While simple and effective for basic use cases, this approach is fraught with subtle issues that can lead to significant problems in larger projects, especially in team environments. The core problem lies in its inherent ambiguity and lack of deterministic builds.
The Frailty of the Old Way: Pitfalls of requirements.txt
A typical requirements.txt file might contain lines like requests>=2.25.0. This simple line introduces uncertainty. Today, it might install version 2.28.0, but tomorrow it could install 2.29.0, which might contain a breaking change or a new bug. This leads to non-reproducible builds, where two developers (or a developer and a CI/CD server) can end up with different environments from the same file.
The common workaround, pip freeze > requirements.txt, attempts to solve this by pinning every single package, including transitive dependencies (the dependencies of your dependencies). However, this creates new problems:
- Lack of Clarity: It becomes impossible to distinguish your project’s direct dependencies from the dozens of sub-dependencies they pulled in. Why is chardet==4.0.0 in the project? Is it a direct need or a dependency of requests?
- Cross-Platform Issues: The output can include platform-specific packages (like pywin32 on Windows), which will cause installation to fail on other operating systems.
- Maintenance Nightmare: Upgrading a single package is a manual and error-prone process. You have to identify the package, update its version, and then hope that you’ve correctly resolved all the resulting changes in its sub-dependencies.
The Modern Paradigm: Declarative Dependencies and Lock Files
Modern tools solve this dilemma by separating the *intent* from the *result*. They introduce two distinct concepts:
- A Declarative Manifest File: This file (e.g., pyproject.toml or Pipfile) is where you declare your project’s direct dependencies in abstract terms (e.g., “I need Django version 4.x”). This file is human-readable and version-controlled. The introduction of the pyproject.toml file was a significant milestone for the community, standardizing how build-system requirements and project metadata are specified under PEP 518.
- A Lock File: This is a machine-generated file (e.g., poetry.lock, Pipfile.lock) that contains the exact, pinned versions of *every* package and sub-package required to satisfy the manifest. It’s a precise snapshot of a known-good working environment. This file is the key to deterministic and reproducible builds. When a teammate or a CI server runs an install command, the tool reads the lock file, not the manifest, ensuring everyone gets the exact same environment, every single time.
This separation is the cornerstone of modern Python package management. It provides the clarity of a high-level dependency list while guaranteeing the byte-for-byte reproducibility of a fully pinned environment.
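As a concrete sketch of this split (file contents abridged, version numbers purely illustrative), the same dependency appears once as an abstract constraint in the manifest and once as an exact pin in the generated lock file:

```toml
# pyproject.toml (the manifest): what you declare and hand-edit
[tool.poetry.dependencies]
requests = "^2.28"

# poetry.lock (generated, never hand-edited): what actually gets installed
[[package]]
name = "requests"
version = "2.28.1"
```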
Comparing the Titans: Poetry vs. Pipenv vs. pip-tools
Three main contenders have emerged in the modern packaging space, each with a different philosophy and feature set. Understanding their differences is key to choosing the right one for your project.
Poetry: The All-in-One Solution
Poetry is an opinionated, full-lifecycle tool that aims to manage everything from dependency resolution and virtual environments to project building and publishing to PyPI. It uses the standard pyproject.toml file as its single source of truth.
- Philosophy: Provide a single, cohesive, and intuitive interface for all packaging-related tasks.
- Key Files: pyproject.toml and poetry.lock.
- Core Workflow:
  - Initialize a project: poetry init
  - Add a dependency: poetry add flask
  - Install from lock file: poetry install
  - Run a command in the virtual environment: poetry run python app.py
  - Build your package: poetry build
  - Publish to PyPI: poetry publish
A sample pyproject.toml section for dependencies looks clean and declarative:

[tool.poetry.dependencies]
python = "^3.10"
requests = "^2.28.0"
fastapi = "^0.85.0"
[tool.poetry.group.dev.dependencies]
pytest = "^7.1.3"
black = {version = "^22.8.0", allow-prereleases = true}
Strengths: Poetry boasts a fast and reliable dependency resolver, a unified command-line interface, and strict adherence to modern packaging PEPs, making it an excellent choice for new projects, especially libraries intended for distribution.
Pipenv: The “Official” Recommendation from PyPA
Pipenv was created by Kenneth Reitz and is now maintained under the Python Packaging Authority (PyPA) umbrella; for several years it was promoted as the officially recommended packaging tool for applications. It focuses on clearly separating development and production dependencies.
- Philosophy: Bring the best practices from other language ecosystems (like Bundler, npm, or Yarn) to Python application development.
- Key Files: Pipfile and Pipfile.lock.
- Core Workflow:
  - Install a dependency (and create a virtualenv): pipenv install flask
  - Install a dev dependency: pipenv install pytest --dev
  - Activate the virtual environment shell: pipenv shell
  - Install from lock file: pipenv sync
A sample Pipfile clearly separates packages:
[[source]]
url = "https://pypi.org/simple"
verify_ssl = true
name = "pypi"
[packages]
requests = "*"
django = "~=4.1"
[dev-packages]
pytest = "*"
Strengths: Pipenv’s main draw is its simplicity and official backing. It automatically manages virtual environments for you, making project setup very straightforward. While it has faced criticism for slow performance in the past, recent development has addressed many of these concerns.
pip-tools: The Unbundled, Unix-like Approach
Unlike Poetry and Pipenv, pip-tools is not an all-in-one solution. It’s a small set of command-line tools that enhance the existing pip and venv workflow, following the Unix philosophy of “do one thing and do it well.”
- Philosophy: Augment, rather than replace, the standard Python tooling. Provide maximum control and flexibility to the developer.
- Key Files: You create requirements.in; it generates requirements.txt (your lock file).
- Core Workflow:
  - You manage your own virtual environment: python -m venv .venv and source .venv/bin/activate.
  - Define abstract dependencies in requirements.in.
  - Compile the lock file: pip-compile requirements.in
  - Install/sync the environment: pip-sync
A sample requirements.in is simple:
# requirements.in
flask
gunicorn
Running pip-compile generates a detailed requirements.txt with all sub-dependencies pinned and commented for clarity.
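For instance, given the requirements.in above, the compiled file might look roughly like this (the exact header and version numbers vary by pip-tools release and resolution date; shown purely for illustration):

```
#
# This file is autogenerated by pip-compile
# To update, run:
#
#    pip-compile requirements.in
#
click==8.1.3
    # via flask
flask==2.2.2
    # via -r requirements.in
gunicorn==20.1.0
    # via -r requirements.in
jinja2==3.1.2
    # via flask
markupsafe==2.1.1
    # via jinja2
werkzeug==2.2.2
    # via flask
```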
Strengths: pip-tools is lightweight, transparent, and incredibly flexible. It’s perfect for developers who want to understand and control every step of the process or for integrating deterministic builds into legacy projects without overhauling the entire workflow.

Beyond Installation: Advanced Dependency Management in Practice
Modern package managers offer more than just reproducible installs. They provide a suite of tools for maintaining a healthy, secure, and efficient development lifecycle.
Managing Development vs. Production Environments
Your production environment doesn’t need testing frameworks like pytest or code formatters like black. Including them bloats your deployment artifact, increases the attack surface, and can slow down container builds. All modern tools provide a robust way to manage this separation.
- Poetry: Uses dependency groups. You add a dev tool with poetry add black --group dev. In your CI/CD pipeline, you install only production dependencies with poetry install --only main (on Poetry versions before 1.2, the equivalent was the now-deprecated --no-dev flag).
- Pipenv: Has a built-in [dev-packages] section in the Pipfile. You use pipenv install pytest --dev. For production, you run pipenv install --deploy, which strictly installs from the Pipfile.lock and ignores dev packages.
- pip-tools: The common pattern is to have a requirements.in for production and a dev-requirements.in for development that includes the production requirements via -r requirements.in. You then compile both to separate .txt files.
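The pip-tools layering pattern can be sketched in a few shell commands. The file names follow the convention mentioned above; the pip-compile invocations are shown as comments since they require pip-tools to be installed:

```shell
# Production dependencies: abstract, human-maintained.
cat > requirements.in <<'EOF'
flask
gunicorn
EOF

# Development dependencies layer on top of production via "-r".
cat > dev-requirements.in <<'EOF'
-r requirements.in
pytest
black
EOF

# Compile each manifest into its own fully pinned lock file:
#   pip-compile requirements.in     -o requirements.txt
#   pip-compile dev-requirements.in -o dev-requirements.txt
```

Production deployments then install only from requirements.txt, while developers sync against dev-requirements.txt and automatically get the production set as well.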
Security Auditing and Vulnerability Scanning
Your application is only as secure as its weakest dependency. Manually tracking CVEs (Common Vulnerabilities and Exposures) is impossible. Fortunately, these tools integrate security scanning directly into your workflow.
- Pipenv: Has a built-in command, pipenv check, which uses the safety database to scan your dependency graph for known security vulnerabilities.
- Poetry: While it doesn’t have a native check command, it integrates seamlessly with external tools. You can run poetry export -f requirements.txt | safety check --stdin or use plugins to add this functionality.
- pip-tools: Being a minimal tool, you simply run safety check -r requirements.txt against your compiled lock file.
Integrating this check into your CI pipeline is a critical best practice that can prevent vulnerable code from ever reaching production.
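As a sketch, such a CI step might look like the following GitHub Actions fragment (this assumes a pip-tools style pinned requirements.txt; for Poetry you would export the lock file first, and for Pipenv you would run pipenv check instead):

```yaml
- name: Audit dependencies for known CVEs
  run: |
    pip install safety
    safety check -r requirements.txt --full-report
```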
Integration with CI/CD Pipelines (e.g., GitHub Actions)
Reproducibility is the primary goal of CI/CD. Modern package managers make this trivial by leveraging the lock file.
Here’s a simplified GitHub Actions workflow step for installing dependencies with each tool:
Poetry Example:
- name: Install dependencies
  run: |
    pip install poetry
    poetry config virtualenvs.in-project true
    poetry install --no-interaction --only main
Pipenv Example:

- name: Install dependencies
  run: |
    pip install pipenv
    pipenv install --deploy --system
pip-tools Example:
- name: Install dependencies
  run: |
    pip install -r requirements.txt
In all cases, the command is simple and deterministic. The CI server builds the exact same environment that you tested locally, eliminating a huge class of potential bugs.
Making the Right Choice for Your Project
With three strong contenders, which one should you choose? The answer depends on your project’s context, your team’s needs, and your personal preferences. Staying current with the latest news on packaging standards can also influence this decision as the ecosystem continues to evolve.
When to Choose Poetry
Poetry is the ideal choice for:
- New Projects and Libraries: If you are starting from scratch, especially if you plan to publish your package to PyPI, Poetry’s integrated build and publish system is unparalleled.
- All-in-One Preference: Teams that want a single, opinionated tool to handle every aspect of the packaging lifecycle will find Poetry’s cohesive experience highly productive.
- Performance-Critical Resolution: If you have a complex dependency graph, Poetry’s fast resolver is a significant quality-of-life improvement.
When to Choose Pipenv
Pipenv shines in these scenarios:
- Application Development: It was designed specifically for applications rather than libraries, and its workflow reflects that focus.
- Simplicity and Standardization: For teams that want a simple, officially-backed tool without the extra cognitive load of build systems, Pipenv is a great starting point.
- Automatic Environment Management: If you prefer a tool that transparently creates and manages virtual environments for you, Pipenv’s magic can be very appealing.
When to Choose pip-tools
Consider pip-tools when:
- You Value Control and Flexibility: If you want to compose your own workflow and understand every moving part, the unbundled nature of pip-tools is a perfect fit.
- Integrating with Existing Projects: It’s the easiest tool to adopt in a legacy project that already uses requirements.txt, providing the benefit of locked dependencies with minimal disruption.
- Minimalism is Key: If you dislike heavy, all-in-one tools and prefer small, focused utilities, pip-tools will align with your philosophy.
Conclusion: Embracing a New Era of Confidence
The journey from a simple requirements.txt to a fully-managed, deterministic environment with tools like Poetry, Pipenv, or pip-tools represents a monumental leap in Python development maturity. By separating dependency declaration from the locked, reproducible environment, these tools eliminate an entire class of frustrating “works on my machine” bugs. They provide the confidence to refactor, the safety to collaborate, and the reliability to deploy. The debate over which tool is “best” is less important than the principle they all share: that your project’s environment should be a predictable asset, not a variable liability. By adopting one of these modern workflows, you are investing in stability, security, and a more professional and less stressful development experience for yourself and your team.
