In this installment of Python Academy, we’re tackling something that trips up almost everyone starting with Python: Environment Management.
We’ll kick things off with the bread-and-butter tools—venv and pip—and then look at uv, the new kid on the block that’s rewriting the rules (and doing it incredibly fast).
The Classics: venv and pip
Why bother with virtual environments?
When you first install Python, you get one global playground. It sounds convenient until you realize that sharing your toys leads to broken toys.
- **Version Nightmares:** Project A needs `pandas` version 1.0, but Project B demands version 2.0. You can’t have both installed globally.
- **Breaking Your System:** On Linux/macOS, your OS uses Python for its own internal tools. If you start messing with global packages, you might accidentally break your terminal or system utilities.
- **Permission Headaches:** Installing globally often screams for `sudo`, which is usually a sign you’re doing something risky.
Virtual environments are the answer. They let you create isolated sandboxes for every project. Each sandbox gets its own Python executable and its own set of libraries, so they never fight over versions.
Creating and Activating
The standard way to make a sandbox is with venv.
To create one named .venv (the industry standard name):
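From your project’s root folder, the standard invocation is:

```bash
python -m venv .venv
```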
You’ll see a new .venv folder pop up. But it’s just sitting there until you activate it.
On macOS/Linux:
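Activation is done by sourcing the script the environment provides:

```bash
source .venv/bin/activate
```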
On Windows:
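Windows ships its activation scripts under `Scripts` instead of `bin`:

```bash
.venv\Scripts\activate
```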
Once you see (.venv) in your prompt, you’re in the sandbox. Anything you install here stays here.
Managing Packages with pip
pip is your tool for grabbing libraries from the internet (PyPI).
To install something:
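Using `requests` as an example package:

```bash
pip install requests
```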
To see what you’ve got:
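This prints every package installed in the active environment, with versions:

```bash
pip list
```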
requirements.txt
So you’ve built a cool app. How do you tell your friend exactly which libraries they need to run it? You hand them a requirements.txt.
You can generate this list from your current environment automatically:
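The conventional one-liner redirects the frozen list into the file:

```bash
pip freeze > requirements.txt
```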
This dumps every installed package and its exact version into a text file.
When your friend grabs your code, they just run:
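The `-r` flag tells pip to read package specifications from a file:

```bash
pip install -r requirements.txt
```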
And boom, their environment looks just like yours.
Generating requirements from code with pipreqs
pip freeze works well if you’ve been disciplined about your environment. But what if you inherit a folder of Python scripts with no documentation and no requirements.txt? Or what if you’ve just been installing packages globally for years?
In these cases, pip freeze will give you a list of everything you’ve ever installed, which is often overkill and messy.
This is where pipreqs shines. It analyzes your source code imports to determine exactly what packages are needed. It’s the perfect tool to bootstrap “good practices” for a project that started without them.
Install it:
```bash
pip install pipreqs
```

Run it from your project’s root:

```bash
pipreqs .
```
This scans your .py files and generates a clean, minimal requirements.txt based solely on the libraries you actually import. Now you have a solid foundation to create a fresh virtual environment!
The Modern Era: uv
venv and pip are reliable, but let’s be honest—they can be a bit clunky and slow. Enter uv. It’s a Python package and project manager written in Rust, and it is blazing fast. It aims to replace pip, pip-tools, and virtualenv with one single binary.
Setting up with uv
Instead of manually creating folders and files, uv creates the project skeleton for you:
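A sketch, assuming uv is already installed and using `my-app` as a placeholder project name:

```bash
uv init my-app
cd my-app
```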
This sets up a standard structure, including a pyproject.toml. This file is the modern replacement for requirements.txt and setup.py.
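As a rough illustration (the exact contents uv generates vary by version; `my-app` and the dependency pins here are placeholders), a minimal `pyproject.toml` looks something like:

```toml
[project]
name = "my-app"
version = "0.1.0"
description = "Example project managed by uv"
requires-python = ">=3.11"
dependencies = [
    "requests>=2.31",
]
```

Note that dependencies live declaratively in one table, rather than being reconstructed after the fact with `pip freeze`.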
Managing Dependencies
With uv, the days of pip install followed by remembering to run pip freeze are over. You just tell uv what you want:
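For example, to add `requests`:

```bash
uv add requests
```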
This one command does all the heavy lifting:

1. Finds the best version of requests.
2. Installs it (creating a virtual environment automatically if you don’t have one).
3. Updates your pyproject.toml.
4. Updates uv.lock, which locks down the exact versions of everything (like package-lock.json in the JS world) so your builds are 100% reproducible.
Development Dependencies
Sometimes you need tools just for development—like testing frameworks or linters—that shouldn’t be shipped with your actual app. uv handles this cleanly:
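Using the `ruff` linter as an example dev-only tool:

```bash
uv add --dev ruff
```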
Take ruff for example. It’s an incredibly fast linter and formatter. You want it while you’re coding to keep things tidy, but your users don’t need it to run your app. Adding it with --dev records it in a dedicated development group in pyproject.toml (a dev group under [dependency-groups] in current uv releases; older releases used [tool.uv.dev-dependencies]), keeping your production environment lean.
Managing Python Versions
This is where uv really shines. Need to test on Python 3.11 but you only have 3.12 installed? uv can fetch it for you:
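uv downloads and manages its own interpreter builds, so fetching an extra version is a single command:

```bash
uv python install 3.11
```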
Or you can just pin your project to a specific version, and uv will respect that:
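Pinning writes the version to a `.python-version` file in your project, which uv reads on subsequent commands:

```bash
uv python pin 3.11
```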
Packaging: Sharing Your Code
So you’ve written a masterpiece and you want to share it with the world (or just your colleagues). How do you box it up? You build a Wheel.
What’s a Wheel?
A “Wheel” (file extension .whl) is the standard way to distribute Python packages. Think of it as a zip file that’s ready to go.
In the old days, we often shared “source distributions” (.tar.gz). This meant the user’s computer had to compile any C-extensions or complex code when they installed it. This was slow and error-prone.
Wheels are built distributions. They are pre-compiled and ready to unpack. When you pip install pandas, you’re almost certainly downloading a wheel. It’s faster, safer, and easier.
The Old Guard: setuptools
Traditionally, building a package meant wrestling with a file called setup.py using a library called setuptools. This little Python script was where you’d declare everything about your package: its name, version, author, a brief description, and most importantly, its dependencies. You’d define which modules to include, any data files, and even entry points for command-line tools.
It was powerful, for sure, but also notoriously tricky to get right. Debugging setup.py could feel like trying to solve a puzzle with missing pieces. You’d write code to describe your package, manage included files, and handle dependencies. It worked, but it felt a bit like alchemy, and slight changes in Python’s packaging standards would often mean revisiting this complex script.
Here’s a super basic setup.py example for a package named my_package:
```python
# setup.py
from setuptools import setup, find_packages

setup(
    name='my_package',
    version='0.1.0',
    packages=find_packages(),
    install_requires=[
        'requests',
        'numpy',
    ],
    author='Your Name',
    author_email='your.email@example.com',
    description='A simple example package.',
    long_description=open('README.md').read(),
    long_description_content_type='text/markdown',
    url='https://github.com/yourusername/my_package',
    classifiers=[
        'Programming Language :: Python :: 3',
        'License :: OSI Approved :: MIT License',
        'Operating System :: OS Independent',
    ],
    python_requires='>=3.8',
)
```

And to build your package (both source and wheel distributions) using this file, you’d run:
The New Way: uv build
Since uv already knows everything about your project from pyproject.toml, building a package is laughably simple. You don’t need a setup.py anymore.
Just run:
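From the project root:

```bash
uv build
```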
That’s it. uv will look at your pyproject.toml, gather your code, and spit out a shiny .whl file (and a .tar.gz source distribution just in case) into a dist/ folder. You can then upload these to PyPI or share them directly.
Conclusion
| Feature | Classic (venv + pip) | Modern (uv) |
|---|---|---|
| Creation | `python -m venv .venv` | `uv init` / auto-created |
| Install | `pip install pkg` | `uv add pkg` |
| Save Deps | `pip freeze > requirements.txt` | Automatic in `pyproject.toml` |
| Dev Deps | Manual separation | `uv add --dev pkg` |
| Speed | Standard | Blazing Fast 🚀 |
Mastering venv and pip is a rite of passage, and it’s good to know what’s happening under the hood. But once you switch to uv, you probably won’t look back. It just makes the whole process feel… solved.