It was created by the Astral team (the makers of the Ruff linter) to unify many aspects of Python development into a single tool. First released in early 2024, uv was developed to address the pain points of Python’s packaging ecosystem—namely slow installs, fragmented tools, and “it works on my machine” issues. In the Python ecosystem, uv positions itself as a next-generation alternative to tools like pip, virtualenv, pip-tools, pipenv, and poetry, combining their functionality with dramatically improved performance. Its introduction has been met with enthusiasm: a 2023 Stack Overflow survey even cited uv as one of the most desired new Python tools. For Python developers, learning uv is becoming increasingly important as it promises faster setups, more reproducible environments, and a streamlined workflow for managing projects.
As of 2025, uv is actively maintained and rapidly evolving. The current PyPI version is 0.8.13, released on Aug 21, 2025. Despite the 0.x version number, uv is labeled production-stable and has already gained a large user base, evidenced by its high GitHub star count and community adoption. The project is open-source (dual-licensed Apache 2.0 and MIT) and backed by Astral, which ensures continued development and support. UV’s fast pace of releases means new features and optimizations are added frequently, so staying updated is advisable. The creators of uv (led by Astral’s CEO Charlie Marsh) developed it to solve real-world workflow problems, and they actively incorporate feedback from users via GitHub and a community Discord channel. In summary, uv is a modern, important addition to Python’s tooling landscape, aiming to simplify and speed up package management and environment handling for developers.
What is uv in Python?
The uv library in Python is technically a command-line tool that serves as an all-in-one package manager and project manager. In definition, uv is a drop-in replacement for pip (for installing packages) but extends far beyond that. It introduces a unified architecture that combines multiple packaging functions: creating isolated environments, resolving dependencies with a modern algorithm, managing Python versions, running project scripts, and even building/publishing packages. Under the hood, uv is implemented in Rust, which gives it a performance edge and low-level control over tasks like file I/O and parallel downloads. It uses the PubGrub dependency resolution algorithm (borrowed from the Dart/Cargo ecosystems) to quickly find a set of package versions that satisfy all requirements. This means uv can resolve complex dependency graphs very efficiently, avoiding the slowness or conflicts often seen with pip’s resolver. In essence, uv’s core concept is to provide a single tool that understands the entire lifecycle of a Python project – from installing Python itself, to managing the environment, dependencies, and final deployment.
In terms of architecture, uv works by maintaining a project configuration in the standard pyproject.toml
format (PEP 621 metadata) and a uv.lock
lockfile for exact versions. When you initialize a project with uv, it creates a virtual environment (by default a .venv
directory) and a pyproject.toml that lists your top-level dependencies. Every time you add or remove a package, uv’s engine updates these files and ensures the environment matches them. The uv binary orchestrates several components: a resolver (to compute dependency sets), an installer (to fetch and install wheels or sdists), an environment manager (to create and activate virtual environments), and a Python version manager (to download and manage different Python interpreter versions). Unlike pip, uv doesn’t rely on Python’s runtime for most operations – it directly performs installations and file operations in Rust, interfacing with wheel files and indexes. This design means uv can perform tasks like unpacking archives or constructing virtualenvs with minimal overhead. Key modules within uv include the “pip interface” module (which wraps common pip commands under uv pip ...
), the “Python manager” module (for handling uv python ...
commands to list/install interpreters), and the “tools runner” module (which handles uv tool
and script execution features). These components work together seamlessly: for example, when you run uv add pandas
, uv’s resolver quickly determines compatible versions of pandas and its dependencies, the installer downloads them (possibly in parallel), and the environment manager places them in the .venv
folder, all orchestrated within the uv process.
Integration with other Python libraries and standards is a major aspect of uv. It adheres to modern packaging standards – using PEP 517/518 build logic for projects and reading/writing pyproject.toml
. This means uv-based projects remain interoperable; you can still build a wheel with setuptools or publish via twine, though uv provides its own commands for these. uv’s lockfile is designed to be universal, capturing hashes and platform markers so that the same lockfile works on Windows, macOS, or Linux, unlike pip’s requirements which are often platform-specific. uv can install packages from PyPI (it uses the same package index URLs as pip), and it respects environment variables or pip configuration for custom indexes, so it can integrate with private package repositories as well. Additionally, uv includes a pip compatibility layer: you can run commands like uv pip install
or uv pip sync
, which mirror pip and pip-tools behaviors but with uv’s speed and lockfile integration. For example, uv pip compile
will generate a requirements.txt using uv’s resolver, and uv pip install
will use uv’s installer within the active environment. This allows gradual integration – developers can alias uv pip
to pip
and get immediate performance benefits without changing their habitual commands. In summary, uv works alongside the Python ecosystem: it doesn’t change how libraries operate at runtime; instead, it optimizes and unifies the management of those libraries and environments.
When it comes to performance characteristics, uv is a game-changer. Thanks to Rust’s efficiency and uv’s design, operations that traditionally were slow in Python tooling become lightning-fast. Dependency resolution that might take pip several seconds or more is nearly instantaneous with uv (often measured in milliseconds for moderately sized dependency sets). Installing multiple packages is done concurrently and directly via optimized system calls, yielding 10x to 100x faster install times compared to pip in many cases. For instance, one benchmark shows uv installing the entire dependency stack of the Trio library with a warm cache in the blink of an eye, whereas pip would take several seconds or more. Memory usage is also well-controlled – uv’s binary is a compiled executable that doesn’t incur the overhead of spinning up a Python interpreter for each operation. It reuses a global cache of downloaded packages (by default in ~/.cache/uv
), so if two projects need the same version of a package, it’s downloaded once and reused, saving disk space and time. uv is also cross-platform and optimized for each OS: it handles paths and virtual env creation on Windows properly (no slow batch scripts), uses symlinks or copies appropriately on Unix, and can take advantage of system-specific improvements (for example, using native APIs for certain file operations). In terms of reliability, uv tends to provide clear error messages (for example, if resolution fails due to conflicting requirements, uv will explicitly indicate which packages and versions caused the conflict). This clarity, combined with speed, means developers spend less time waiting or guessing and more time coding. Overall, uv’s under-the-hood workings make it a robust, high-performance backbone for Python project management.
Why do we use the uv library in Python?
uv addresses several problems that Python developers commonly face, making it a compelling tool in real-world use. One major problem it solves is the slow and cumbersome dependency management in large projects. With pip, installing or updating dependencies (especially in CI pipelines or when onboarding new team members) can be time-consuming; uv’s speed largely eliminates this wait, leading to immediate productivity gains. By being so fast, uv encourages more frequent environment recreation or dependency locking, which in turn leads to more reproducible setups. Another issue uv tackles is the fragmentation of tools – previously, you might use venv
or virtualenv for environments, pip for installation, pip-tools for locking, pipx for running CLI tools, and pyenv for managing Python versions. uv consolidates all these into one workflow, meaning you have fewer tools to learn and fewer integration points to break. This unified approach solves the “works on my machine” syndrome by ensuring that when someone sets up a project with uv, they not only get the same packages, but even the same Python interpreter version and environment settings as intended by the project maintainer. In industry, this level of consistency can prevent deployment issues and reduce onboarding time for new developers dramatically.
The uv library offers significant performance advantages that justify its use over traditional methods. In terms of speed, uv’s advantage isn’t just theoretical – many users report tangible improvements like cutting installation times from minutes to seconds. For example, installing heavy packages like SciPy or pandas using uv is much faster; one anecdote notes that a package that took ~30 seconds with pip installed in about 3 seconds with uv. In continuous integration (CI) environments, these savings multiply: a job that installs dozens of packages for testing can finish far quicker using uv, reducing overall build times and costs. Performance isn’t only about raw speed, but also about efficiency and feedback. Because uv resolves dependencies upfront and uses a lockfile, you catch incompatibilities or conflicts early (during uv add
or uv lock
) rather than at runtime. This proactive conflict detection means you avoid subtle bugs that appear only when a certain combination of library versions is installed. Moreover, uv’s use of a global package cache means even if you have multiple projects or rebuild environments often, you are not re-downloading the same files repeatedly – this is a performance benefit in terms of network and I/O usage. In summary, uv’s performance improvements lead to quicker setups and feedback loops, which is a major reason developers adopt it.
Using uv can greatly improve development efficiency and workflow. One big benefit is automation: uv automates tasks that used to be manual. For instance, if you need a new project, uv init
not only sets up a virtual environment but also initializes a git repository, a README, and a basic Python file to get you started. This encourages best practices from the get-go. When adding dependencies, uv’s single command (uv add
) handles environment creation (if not already created), installation, and updating the project config all at once. This is more efficient than manually updating requirements files or juggling pip and venv commands. Additionally, uv ensures that whenever you run your code (via uv run
), it uses the correct environment automatically, so you don’t waste time debugging issues that were simply due to a forgotten environment activation. In a team scenario, uv’s lockfile allows one developer to add or upgrade a library and commit the lock; others can then sync to exactly those versions in one step, eliminating the “well, it worked on my machine” conversations. The development workflow becomes smoother: you can focus on coding, while uv quietly handles the boilerplate of dependency and environment management. Many real-world applications, from web development to data analysis, benefit from uv’s streamlined commands – for example, a Django developer can uv add django djangorestframework
and have everything needed set up in seconds, or a data scientist can uv add numpy pandas scikit-learn
and immediately start analyzing data without wrestling with installations.
Industry adoption of uv, while still emerging, showcases its real-world applicability. Large codebases and enterprises that struggled with slow builds or inconsistent dev environments find uv appealing. Early adopters in tech companies have reported that switching to uv sped up their onboarding for new developers (since a single uv sync
gives a fully working environment). Open-source maintainers appreciate uv for managing contributions: by providing a uv.lock file, they ensure contributors test against the same dependency versions, reducing “it fails on my setup” bug reports. We’ve also seen uv being used in educational contexts – instructors set up student projects with uv to avoid installation issues on different laptops, making workshops and bootcamps run more smoothly. RealPython, a well-known Python resource, featured uv as a tool that “streamlines your workflow”, indicating that the developer community acknowledges the practical benefits uv brings. In summary, we use uv in Python development because it saves time, reduces errors, and consolidates our workflow. It provides a safety net of reproducibility (through lockfiles and consistent Python versions) while drastically cutting down the friction of managing packages and environments. Whether you’re an individual developer looking to simplify your project setup, or an organization aiming to eliminate environment-related issues, uv offers a compelling solution through its speed and integration of best practices.
Getting started with uv
Installation instructions
Installing the uv library is straightforward, and you have multiple options depending on your environment. The most common method is using pip. In a terminal or command prompt, run pip install uv
to install uv from PyPI. If you prefer to keep uv isolated (so it doesn’t conflict with other Python packages), you can use pipx: for example, pipx install uv
will install uv as a separate global tool. Using pip or pipx will install uv as a Python package (which includes a compiled Rust binary inside). Ensure you have Python 3.8 or higher before installing, as uv requires Python 3.8+. After installation, verify it worked by running uv --version
, which should display uv’s version.
Another popular installation method is via uv’s standalone installer scripts, which do not require pip at all. On macOS or Linux, you can open a terminal and run:
curl -LsSf https://astral.sh/uv/install.sh | sh
This command fetches uv’s official install script and installs the uv
binary to your system (typically to ~/.local/bin
for Linux/macOS). On Windows, you can use the PowerShell installer by running PowerShell as an Administrator and executing:
irm https://astral.sh/uv/install.ps1 | iex
This will download the uv.exe
and set it up for you. These installers are convenient because they bundle the Rust-compiled binary of uv, meaning you don’t even need Python pre-installed to get started – uv can actually bootstrap Python for you if needed. If you use Homebrew on macOS, note that uv is also available via Homebrew: brew install uv
will install uv using the package manager. These methods (script or Homebrew) are recommended if you want to use uv as a system-wide tool accessible from any project.
For users of conda/Anaconda environments, uv can be installed from the conda-forge channel. Activate your desired conda environment and run: conda install -c conda-forge uv
. This will place uv in that environment’s path. In Anaconda Navigator’s GUI, you could search for “uv” (making sure to enable the conda-forge channel) and install it with a click. Keep in mind that if you install uv inside a conda environment, uv will manage virtual environments and packages within that context (it’s somewhat meta to use uv and conda together, but it can be done if needed). Typically, it’s more common to use one tool or the other; however, you might use conda for managing some system libraries and uv for Python packages. If you go this route, ensure to activate the conda env before using uv so that uv’s installed Python versions and environments live in the intended location.
You can also set up uv in development workflows like Visual Studio Code or PyCharm easily. Since uv is a command-line tool, integration is usually about installing it and then using it in the IDE’s terminal. In VS Code, you would install uv (globally or in your virtual environment) as above, then open the integrated terminal and run uv commands (like uv init
or uv add
). VS Code can use the uv-created virtual environment for running and debugging: after uv init
creates a .venv
, select that interpreter (.venv/bin/python
or .venv\Scripts\python.exe
on Windows) in VS Code’s Python interpreter selector. In PyCharm, you can install uv by opening PyCharm’s Terminal pane and using pip (pip install uv
) or conda as appropriate. Once uv is installed, you might create a project with uv externally, then open it in PyCharm. PyCharm will detect the existing virtual environment (if uv created one in the project folder) or you can configure PyCharm to use that environment. Alternatively, you can let PyCharm create an environment, then install uv into it via the Settings > Python Interpreter packages interface. From then on, you can run uv commands within PyCharm’s terminal. Remember that uv itself doesn’t have a graphical interface, so using it in an IDE means running its CLI commands or possibly configuring PyCharm’s File Watchers/Tasks to call uv on certain events (for example, run uv sync
before running tests).
Installation on different operating systems is well-supported. On Windows, the PowerShell script or pip installation will place a uv.exe
that you can call from Command Prompt or PowerShell. Make sure your PATH includes the Scripts directory of your Python installation if using pip, or that the installer’s target (by default %LOCALAPPDATA%\Programs\uv
) is on PATH. On Mac, the curl installer will put uv
in ~/.local/bin
, so ensure that’s in your PATH (Homebrew handles PATH automatically if you installed via brew). On Linux, the same applies: ~/.local/bin
might need to be added to PATH if not already. Docker installation is another scenario to consider: in a Dockerfile, you can install uv using pip or the standalone script. For example, for a Debian-based image, you could do:
RUN pip install uv
or
RUN curl -LsSf https://astral.sh/uv/install.sh | sh
and then use uv
commands in subsequent steps to install project dependencies. This is great for CI and container builds – uv’s speed will significantly shorten the time to build the image when pulling in Python packages.
Virtual environment considerations: Typically, uv wants to manage its own virtual environments for projects. You don’t need to pre-create a venv; uv init
or uv add
will do it for you (creating a .venv
folder by default). If you already have an active virtual environment and you install uv there, uv can still be used, but when you run uv init
it may create another .venv
inside your project. It’s often simplest to either use uv as a global tool (via pipx or the installer) so that it can create and manage envs itself, or dedicate a base venv just to uv operations. In cloud or remote server environments (e.g., an EC2 instance or a generic VM), installation is the same as local: you can curl the installer or pip install uv in your user environment. Ensure any CI runners or deployment targets have uv available if you plan to use uv commands there.
If you encounter common installation errors, here are some troubleshooting tips. If pip install uv
fails with a compilation error, ensure you have an updated pip (pip can install prebuilt wheels for uv; an outdated pip might try to compile from source unnecessarily). Usually, uv provides wheels for all major platforms, so compilation shouldn’t be required. If you run the installer script on Unix and get a “Permission denied” or “command not found” error, make sure you have curl
installed and that you’re not behind a firewall blocking the download. On Windows, if the script doesn’t run, check your ExecutionPolicy (the command provided uses ByPass
to avoid that, but needing admin rights is important). If after installation the uv
command isn’t found, double-check that the installation path is added to your system PATH. For pip installs, try python -m uv --version
to see if it’s installed; if so, the issue is just PATH. In that case, consider reinstalling with pipx or adding the Scripts directory to PATH. Another tip: you can always upgrade uv to the latest version. If installed via pip, do pip install --upgrade uv
; if via the standalone script, you can run uv self update
to have uv update itself to the latest release. Finally, note that uv requires an internet connection to download packages and Python versions. If you’re behind a proxy, configure your HTTP_PROXY/HTTPS_PROXY
environment variables so that uv’s network calls (which follow pip’s networking logic) work properly. With these installation steps and tips, you should have uv up and running smoothly on your system.
Your first uv example
Let’s walk through a simple example of using uv in a project. In this example, we’ll create a small Python script that fetches a random joke from an online API and prints it. We’ll use the requests
library to make the HTTP request. We’ll demonstrate how to set up this project with uv and run the script. Below is the complete code (joke_fetcher.py
) and then we’ll explain it line by line:
import requests
def fetch_random_joke():
"""Fetch a random joke from an online API and return it as a string.""" try:
# Send a GET request to the joke API
response = requests.get("https://official-joke-api.appspot.com/random_joke")
response.raise_for_status() # Raise an error if the response status is not 200
joke_data = response.json() # Parse the JSON response into a Python dict # Format the joke as "Setup - Punchline" return f"{joke_data['setup']} - {joke_data['punchline']}"
except Exception as e:
# In case of network issues or JSON parsing error, return an error message return f"Error fetching joke: {e}"
# Only execute this part when the script is run directly (not imported) if __name__ == "__main__":
print("Fetching a random joke from the internet...")
joke = fetch_random_joke()
print("Random Joke:", joke)
Line-by-line explanation:
Line 1: We import the
requests
library, which we will use to perform an HTTP GET request to a public joke API. (We’ll need to install this library using uv before running the script.)Lines 3-4: We define a function
fetch_random_joke()
that will handle contacting the API and returning a joke. The docstring briefly describes what the function does.Lines 5-11: Inside the function, we use a
try
block to attempt the network call. On line 7, we callrequests.get()
with the URL of a random joke API. This returns aresponse
object. On line 8,response.raise_for_status()
will throw an exception if the HTTP status code indicates an error (like 404 or 500). This is good practice to catch HTTP errors. On line 9, we parse the JSON content of the response usingresponse.json()
. The API returns a JSON object containing a joke setup and punchline, which we store in thejoke_data
dictionary. On line 10, we use an f-string to format the joke as “Setup – Punchline” by accessingjoke_data['setup']
andjoke_data['punchline']
. This string is returned if everything went well.Lines 12-14: We have an
except Exception as e
block. This will catch any exception that happened in the try block, whether it’s a connection error, a JSON decoding error, or any other unexpected issue. If an error occurs, we return a string like"Error fetching joke: <error details>"
. This ensures the function always returns a string, even in error cases, rather than crashing.Lines 17-21: This is the typical Python idiom
if __name__ == "__main__":
. It checks if the script is being run directly. If so, we execute the indented code; if the script were imported as a module, this part would be skipped. In this block, line 18 prints a message indicating we’re about to fetch a joke. Line 19 calls our functionfetch_random_joke()
and stores the result in the variablejoke
. Line 20 then prints the joke to the console prefixed by "Random Joke:". If an error occurred, the joke variable would contain the error message string, which would be printed instead.
Expected output: When you run this script, you should see something like:
Fetching a random joke from the internet...
Random Joke: Why did the scarecrow win an award? - Because he was outstanding in his field.
The exact joke will vary each time since the API returns random jokes. If there’s an issue (say, no internet connection), the output might be:
Fetching a random joke from the internet...
Random Joke: Error fetching joke: HTTPSConnectionPool(host='official-joke-api.appspot.com', port=443): Max retries exceeded...
In typical use, you’d run this script with uv to ensure the requests
dependency is satisfied. Here’s how to do that using uv:
Initialize a uv project (if you haven’t already):
uv init joke-project
. This will create a project directory with a pyproject.toml.Add the requests library as a dependency:
uv add requests
. This will download requests (and its dependencycertifi
, etc.) and create a.venv
with those installed, and update pyproject.toml and uv.lock accordingly.Run the script using uv:
uv run python joke_fetcher.py
. This ensures the script runs in the uv-managed environment where requests is available. (You could also douv run joke_fetcher.py
directly – uv will infer to use Python to run it.)
Alternatively, you could combine steps using uv’s script support: For example, uv add --script joke_fetcher.py requests
would insert metadata into the script itself noting that it needs requests, and then uv run joke_fetcher.py
would automatically install requests and execute the script in an isolated environment. But to keep things simple, using uv to manage the project as above is straightforward.
Beginner mistakes to avoid: A common mistake is forgetting to install the required library. If you tried to run this script without installing requests, you’d get an ImportError
. Using uv, make sure to run uv add requests
before running the script. Another pitfall is not using uv to run the script. If you just run python joke_fetcher.py
outside of uv’s context, you might be using the wrong interpreter or missing the dependency. Always use uv run
(or activate the uv-created virtualenv) to run your code. Also, remember that uv init automatically created a virtual environment; you don’t need to call python -m venv
yourself – doing so separately can cause confusion or duplicating environments. If you see an error like “Module not found: requests,” it likely means you ran the script without uv or before adding the dependency. Another tip: the API URL in this script is plaintext HTTP; ensure you typed it correctly. Typos in URLs or resource paths can lead to errors that might confuse beginners. In our code, we used response.raise_for_status()
to catch HTTP errors – if the API endpoint were wrong and returned 404, our function would return a message including that 404 status. Without raise_for_status, one might get a confusing None or error later when accessing joke_data
. Finally, be careful with the JSON keys – they are case-sensitive. A mistake like using joke_data['Punchline']
(capital P) instead of the correct lowercase ‘punchline’ would result in a KeyError. Paying attention to these details will help you avoid common errors as you begin using uv to manage project dependencies and run your Python code. With uv handling the setup, you can focus on writing your code (like the example above) and trust that the environment will be correct when you run it.
Core features of uv
Fast dependency installation and resolution
One of uv’s headline features is its fast dependency installation and resolution. This feature is crucial because it directly tackles the slow package installation times Python developers are used to. When you run uv’s equivalent of installing packages (such as uv add
or uv sync
), uv employs a highly optimized resolver to figure out all necessary dependencies and their compatible versions in milliseconds. It then downloads and installs packages with parallelism and efficient I/O, achieving speed-ups of an order of magnitude over pip. Why is this important? In large projects or when setting up continuous integration pipelines, a slow pip install
can become a bottleneck. With uv’s fast resolution, developers can add new libraries or rebuild environments without significant downtime, making iterative development smoother.
Syntax and parameters: The primary command for adding or updating dependencies is uv add
. The syntax is straightforward: uv add [options] <package(s)>
. You can list one or multiple packages, and even specify version constraints (e.g., uv add "Django>=4.2,<5"
). By default, uv add
adds to the main dependencies in your pyproject, but you can use --dev
to mark them as development dependencies (which uv will track separately in the lockfile). Under the hood, uv add
not only installs the package into your environment but also writes the dependency into the [project.dependencies]
section of pyproject.toml (with a version specifier) and updates uv.lock
with exact versions and hashes. Another related command is uv remove <package>
which cleanly removes a package from both the environment and the pyproject/lockfile. For example, uv remove numpy
would uninstall NumPy and remove it from your dependency list (also removing any packages that were only needed by NumPy). There’s also uv update
(or possibly uv upgrade
) to bump package versions if a newer match is available, and uv lock
which explicitly re-resolves and writes the lockfile (though uv add
does this automatically, you might run uv lock
if you manually edited pyproject or just want to refresh the lock). All these commands accept parameters like --python X
if you wanted to, say, resolve for a different Python version, or --dry-run
to simulate changes without applying them.
Examples: Let’s go through a few practical scenarios.
Example 1: Adding a single package. Suppose you have a fresh project and you need
requests
. You runuv add requests
. You’ll see output indicating uv is resolving dependencies (requests might pull inurllib3
,certifi
, etc.). It might output something like “Resolved 3 packages in 0.5s, Installed 3 packages in 20ms + requests==2.31.0 ...” and it will list the packages added with+
sign. After that, your pyproject.toml will now includerequests>=2.31.0
(if that was the latest version) under dependencies, and uv.lock pinsrequests==2.31.0
and the exact versions of its sub-dependencies.Example 2: Adding multiple packages with constraints. Let’s say you want to add both
flask
andsqlalchemy
at specific versions. You can runuv add "flask==2.2.5" "SQLAlchemy<2.0"
. uv will resolve these together, making sure the combination is compatible. If a dependency conflict arises (for instance, if a version of Flask required a version of a library that conflicts with SQLAlchemy’s needs), uv would report an error and not apply partial changes – this atomic resolution is a key advantage. Assuming no conflict, uv installs them and adds them to pyproject. All in one go, you’ve updated your environment and config.Example 3: Locking and syncing. Suppose you have updated some dependencies and now you want to ensure all team members or deployment environments use exactly those versions. The command
uv lock
will generate (or update) theuv.lock
file with the resolved package versions, platform tags, and hashes. In practice, you often don’t need to runuv lock
manually becauseuv add
anduv remove
keep the lock in sync. But if you manually edited the pyproject or switched branches in git and want to regenerate the lock,uv lock
does that. Later, on another machine, runninguv sync
will read the lockfile and install exactly those versions (if some are missing). For example, in CI you might do:uv sync
after checking out a project – uv will see if.venv
exists and matches the lock; if not, it will install the needed packages. This is analogous topip install -r requirements.txt
but with guaranteed versions and much faster.uv sync
ensures your environment matches the lockfile without altering pyproject dependencies.Example 4: Removing a dependency. If you decide you no longer need a library, run
uv remove <name>
. As an example,uv remove django-debug-toolbar
would uninstall that package and any packages that were only required by it (if no other part of your project uses them). The pyproject is updated (dependency entry removed) and lockfile adjusted. If you runuv sync
later, it will also remove those from any environment to match the lock.
Performance considerations: uv’s fast installation means you rarely have to wait long, but one consideration is the first run (cold cache) vs subsequent runs (warm cache). The first time you add a large package (say tensorflow
which is huge), uv will still be limited by your network and the size of the file. But thanks to parallelism, if you add multiple such packages together, uv fetches them concurrently. uv’s resolver is also optimized to minimize backtracking; it uses PubGrub which is known for giving good error messages on conflicts as well. If a resolution conflict occurs, uv might show a message like “Failed to resolve: X requires <=1.0 but Y requires >=1.5” indicating clearly which dependencies clash. From a performance perspective, this saves developer time by pinpointing the issue. Another aspect: uv produces a universal lockfile, meaning one lockfile can be used on different OS and architectures. It does this by including multiple markers if needed (for example, if a dependency has a different requirement on Windows vs Linux, uv lock can capture both). This means you don’t need separate lockfiles per platform, which is efficient for multi-platform projects (one resolution to rule them all, essentially). In rare cases, this can make the lockfile a bit larger, but it greatly simplifies maintaining dependencies for cross-platform apps.
Integration examples: Fast installation integrates well into CI systems. For instance, many projects integrate uv in GitHub Actions using an official action that sets up uv, then uses uv sync
to install deps from the lockfile. This often replaced steps where projects used pip and a requirements file; by switching to uv, their CI runs sped up noticeably and became more reliable (since uv sync ensures no “drift” in versions). Another integration scenario is caching: CI pipelines often cache the ~/.cache/uv
directory between runs to avoid re-downloading packages. Because uv uses a global cache, it’s easy to cache; you just persist that folder and uv will find previously downloaded wheels, making even cold start builds fast. Also, uv’s uv pip compile
(part of its pip interface) can integrate with other tools – for example, if you need to generate a requirements.txt for publishing or for a Heroku deployment that doesn’t use uv, you can do so. This shows that uv doesn’t isolate you from the broader ecosystem; rather, it augments it with speed.
Common errors and solutions: While using uv add
or uv sync
, a common error might be lack of network connectivity (resulting in a failure to download packages). The solution there is straightforward: ensure internet access or configure proxies. Another error case is dependency conflicts. If uv cannot find a set of versions that satisfy all constraints, it will error out with a message. In such cases, you may need to manually adjust your version requirements or exclude a problematic version. For example, if uv add
complains that “Package A (>=2.0) conflicts with Package B (<2.0)”, you might decide to remove or downgrade one of those packages to allow resolution. Uv’s error messages make this easier by pointing to the culprits. If you encounter an error like “No virtual environment found” when running uv add
or uv sync
, it could mean you didn’t run uv init
first in a new project directory. The fix is to initialize the project (uv init .
to initialize current directory) so uv knows where to put the environment and pyproject. Another potential issue is if you use uv pip install
without specifying --system
in a CI environment – uv by default expects a uv-managed venv, so if you truly intend to install globally (like in a container base environment), you should use uv pip install --system
to avoid that error. In general, sticking to uv add
for project dependencies and uv sync
for reproducing environment will keep you clear of most issues. If you ever need to start fresh, you can delete the .venv
and run uv sync
to rebuild it exactly as per lockfile. This feature – fast, deterministic dependency management – is the cornerstone of why uv is valued.
Virtual environment management
Another core feature of uv is built-in virtual environment management. Unlike workflows where you manually create and activate a venv, uv automates this so that each project gets an isolated environment with minimal user intervention. When you run uv init
in a directory, uv will either use an existing Python installation or (if none is available or specified) even download one for you, then create a virtual environment in that project folder (commonly named .venv
). This environment is automatically used by uv for all package installations and running code, meaning you never have to manually activate or deactivate it – uv handles that behind the scenes with its commands. This feature is important because it ensures every project is self-contained: packages installed for one project won’t conflict with those in another, and you won’t accidentally pollute your system Python. For developers, it simplifies workflow: you just use uv’s commands, and it implicitly uses the correct environment.
How it works: uv leverages Python’s built-in venv module (or an equivalent mechanism) to create the virtual environment. For example, uv init
might output “Creating virtual environment at: .venv” and indeed set up a typical venv structure (with its own python
binary and site-packages directory). It also drops a .python-version
file by default, pinning the Python version for the project. Inside uv’s workflow, whenever you run a uv command that needs an environment (like uv add
, uv run
), uv will either find the existing .venv
or create one if missing (using the Python version specified by the project or the latest available). This means as a user you rarely type source .venv/bin/activate
. Instead, if you want to run a script or open a shell in that environment, you can just use uv run
. For instance, uv run python
will drop you into a Python REPL within the project’s environment (which is a quick way to verify all your packages are there). If you truly need to activate the environment (say for some IDE integration that doesn’t call uv), uv does tell you how – it prints the activation command after creating a venv (like “Activate with: source .venv/bin/activate” for Unix). But in practice, using uv’s own run/shell commands is simpler.
Key commands related to env management: We’ve mentioned uv run
, which is a versatile command. The syntax is uv run [--python version] [--] <command>
. If you provide a --python
flag (for instance uv run --python 3.11
), uv will ensure that version of Python is used for the environment (downloading it if necessary, see next section on Python versions). Without --python
, uv uses the project’s pinned Python (which defaults to whatever was available during init). After the --
, you can put any shell command. If it’s just a script name or module, uv will invoke Python to run it. For example, uv run pytest
will run the pytest
tool inside the environment. Or uv run ruff check
(as in uv’s README) will run the Ruff linter that was installed in that env. Essentially, uv run
acts like a replacement for the $ python
or $ pip
commands you would run after activating an env, ensuring those commands execute with the environment’s interpreter and context. Another handy command is uv venv
. Simply running uv venv
will create a virtual environment (if one doesn’t exist) and print its path. It’s like an explicit way to set up the env if you want to do it separate from adding packages. You can also use uv venv --python X.Y
to force creation of a venv with a specific Python version (again, this intersects with the Python management feature). There isn’t a uv activate
command (because uv wants to abstract that away), but if needed, uv venv
combined with a manual activation is possible. In summary, uv’s env management commands cover creating, using, and customizing virtual environments, all integrated into other operations.
Examples:
Automatic venv creation: You start a new project:
mkdir myproj && cd myproj && uv init .
. The output shows uv initialized the project, maybe something like “Using Python 3.12, Creating virtual environment at .venv”. Now in your directory you have.venv/
with a Python interpreter. If you open.python-version
, you might see3.12
pinned (assuming that’s the version uv used). Now, if you runuv add flask
, uv will install Flask into.venv
. At no point did you manually callpython -m venv
oractivate
; uv decided where and how to create the env. If you list the directory, you’ll see pyproject.toml (with Flask in dependencies) and uv.lock, etc.Running a script without activating: Suppose you have a script
app.py
that uses Flask. Rather than activating the venv and runningpython app.py
, you can douv run python app.py
. This ensures thepython
used is the one inside.venv
. Or even shorter, since uv detects a Python file,uv run app.py
likely works too. Uv’s mechanism will print something like “Installed X packages in Yms” if some dependency was missing (for instance, if app.py had an inline dependency metadata). But if everything is up-to-date, it’ll just run. This is convenient because you don’t need to remember to activate or worry which environment you’re in – uv picks the right one based on your project directory (it finds the pyproject or uv.lock in current or parent dirs to know it’s in a uv project).Working in a shell: If you want to do multiple commands in the environment (maybe you want a persistent shell with the venv active), uv doesn’t directly provide
uv shell
, but you can achieve similar by using your shell through uv run. For example, in bash or zsh on Linux/Mac, you could douv run bash
– this will spawn a new bash shell process inside the environment (so$PATH
is set to.venv/bin
first). From there you can run multiple commands (like pip list, or python manage.py runserver for a Django app) all under the environment. When you exit that shell, you’re back to your global environment. On Windows, similarlyuv run cmd
oruv run powershell
would drop you into a shell in the env. This is an advanced trick; usually, directly usinguv run
for each command is enough.Multiple environments: Uv primarily assumes one environment per project. However, uv does support workspaces which allow multiple interrelated projects (each with its own env) under one umbrella – similar to monorepos. In a workspace scenario, each workspace member might have its own
.venv
. uv’s management extends to that by allowing you to run a command across all or sharing lockfiles. But for a typical use, one project folder corresponds to one venv. If you have two projects (Project A and Project B) both using uv, they each have their own.venv
, which is exactly what you want for isolation.
Performance considerations: Creating virtual environments is generally quick, but uv adds some improvements. For example, uv might reuse an installed Python interpreter to avoid re-downloading if you have one that matches (if you haven’t pinned a version, uv will choose the latest available Python on your system or manage one). The actual env creation (copying or symlinking the Python binary and standard library) is I/O-bound, but usually under a few seconds. Uv’s impact here is minimal since it just triggers the process and prints a nice log. The benefit is uv ensures the venv is created correctly for the platform (for example, on Windows it uses the proper scripts, on *nix proper symlinks). Because uv caches downloaded Python installers, if it needs to create envs with different Python versions, it won’t re-download the installer each time. Also, since uv’s environment management is integrated, you avoid mistakes that can cost time – e.g., installing into the wrong environment. The slight overhead uv adds is checking that the environment exists and is up-to-date whenever you run something, which is usually negligible (it might check a hash of pyproject vs lock to decide to sync, which is extremely fast and only done when needed).
Integration examples: uv’s environment management plays nicely with IDEs and other tools. For example, if using VS Code, once uv created .venv
, VS Code can detect it and use it, as mentioned earlier. In a team context, everyone using uv means no one accidentally uses the global environment – it enforces the rule of one env per project by convention. Another integration scenario: if you use pre-commit for git hooks, those hooks often create their own env or require certain tools installed. With uv, you could include dev dependencies in your pyproject for those tools and then ensure pre-commit uses the uv environment (possibly by configuring pre-commit to use system Python which uv pinned). While not direct integration, it’s a pattern that uv enables: you manage everything in one env rather than multiple scattered ones. Also, uv’s .python-version
file is similar in concept to pyenv’s version file – some IDEs or tools that detect .python-version
might auto-switch to that interpreter (e.g., some shell plugins or editors). That means uv’s env and Python choice can propagate to tools that understand this convention.
Common errors and solutions: A common question is “Where is my virtual environment?” If someone doesn’t see a venv
or env
directory they’re used to, they might not realize uv created .venv
by default (a hidden directory due to the dot). The solution is simply to know uv uses .venv
(you can actually customize this by setting the VIRTUAL_ENV
environment variable or maybe a config in pyproject if needed, but .venv
is the default). Another potential hiccup: If you try to run uv
commands outside of a project directory, uv might complain it “could not find project” or it will default to a global mode. If you get errors about no pyproject found, ensure you’re in the right folder or have initialized the project. If you accidentally delete the .venv
folder manually, you can recreate it by running uv sync
(since the lockfile knows what should be there). If you manually install something into .venv outside of uv, you might create a mismatch between the environment and the lockfile. Uv will warn if things are out of sync. The fix is to either avoid manual pip installs (stick to uv), or if done, then update pyproject and lock via uv commands to reflect that. Another error scenario: using pip inside the uv-managed venv. If you activate .venv
and run pip install X
, pip might refuse if the environment is marked as externally managed (PEP 668) by uv. In fact, uv does drop an EXTERNALLY-MANAGED
file in the environment to signal pip that “hey, uv is managing this env”. If you see pip errors about externally managed environment, the solution is to either use uv to install packages, or if you really need to use pip directly for something, you can use the --break-system-packages
flag in pip (not recommended for normal use)stackoverflow.com. This mechanism ensures that you don’t inadvertently confuse uv by adding packages outside of it. Overall, uv’s virtual environment feature reduces many error-prone steps and ensures consistency, as long as you use uv commands for interacting with the environment.
Python version management
One standout feature of uv is its ability to manage Python versions seamlessly as part of your workflow. Traditionally, if you needed a specific Python version for a project, you’d use a tool like pyenv, or manually install Python and manage multiple installations. uv can automate that process. It includes commands to install different Python interpreters (CPython versions, and even PyPy) and to easily switch or pin your project to a certain version. This is a big deal for ensuring consistency: for example, if your project requires Python 3.10 while your system’s default is 3.12, uv can ensure that 3.10 is used for that project without you having to intervene at the OS level. It essentially integrates environment and interpreter management under one tool.
Key commands for Python versions: The primary command is uv python
. This command has subcommands like install
, list
, pin
, and others.
uv python install <versions>
will download and install one or multiple Python versions for you. For example,uv python install 3.11.5 3.8.16
would fetch those two Python releases (for your OS and architecture) and make them available to uv. Under the hood, uv likely downloads pre-built binaries (similar to what pyenv would do) – on Windows these might be the official installers or embeddable zips, on macOS/Linux perhaps tarballs from Python.org – and it caches them. The first time you do this it might take a few seconds to a minute depending on the size. The output will show progress, e.g., “Installed 2 versions: + cpython-3.11.5, + cpython-3.8.16” etc.uv python list
will list the Python versions uv knows about. Without flags, it might list both installed versions and those available. Withuv python list --only-installed
, you’ll see the ones you have on your system that uv can use. For example, the output could be a table of version identifiers and paths. Uv typically names these versions in a unique way, likecpython-3.11.5-macos-arm64
to distinguish OS and arch.uv python pin <version>
is used within a project to pin that project to use a specific Python version. If you runuv python pin 3.11
, uv will create or update the.python-version
file in your project to3.11
(or the full spec like3.11.5
if you have that installed). This means whenever uv operates in that project, it will try to use Python 3.11.x. If that exact version isn’t installed, uv may try to install it on the fly (with permission) or throw an error instructing you to install it. By pinning, you ensure that even if your system default Python changes, your project stays on the intended version.Additionally, you can specify Python version on certain commands. We saw
uv venv --python X.Y
which both installs (if needed) and uses that Python to create the venv. Also,uv run --python X
can temporarily use a specific interpreter for a single run. There’s alsouv run --python pypy@3.8
(an example from uv docs) to use PyPy version of Python if installed. This shows uv can manage alternative Python runtimes too, not just CPython.
Examples:
Installing Python versions: Suppose you are on a machine that only has Python 3.9 by default, but you want to create a project using Python 3.11. After installing uv, you can run
uv python install 3.11
. uv will fetch the latest micro-release of 3.11 (say 3.11.4) for your platform. You might see output like “Searching for Python versions matching: Python 3.11” and then a download progress, and then “Installed 1 version in X seconds: + cpython-3.11.4-<platform>”. Now, if you douv python list
, you’ll see that 3.11 is available. If you check your filesystem, uv likely stored this in a centralized location (perhaps~/.uv
or~/.cache/uv/pythons
). But you don’t need to worry about where – uv abstracts that.Pinning to a project: Now you start a new project that specifically needs 3.11 because maybe you want match production environment. In your project directory:
uv init .
might automatically choose a Python (if one is not pinned, uv might use either the newest available or the one specified in pyproject’s requires-python). To be explicit, runuv python pin 3.11
. This writes.python-version
as mentioned. If 3.11 wasn’t installed yet and you forgot to install, uv would likely say “Python 3.11 not found, installing...” or ask to install. Once pinned, any subsequentuv add
oruv run
will use the 3.11 interpreter. If you open.venv/bin/python
, that will actually be the 3.11 binary.Using multiple versions on one machine: Let’s say you have two projects: one pinned to 3.8 (legacy code), another pinned to 3.12 (newest features). With uv, you can
uv python install 3.8 3.12
to have those available. Each project canuv python pin
accordingly. When you switch directories between those projects, uv automatically picks up the appropriate interpreter. This is similar to what pyenv does with its shell switching, but uv does it via the context of uv commands. For example, if you’re in the 3.8-pinned project and runuv run python --version
, you’ll get “Python 3.8.x ...”. In the other project,uv run python --version
yields “Python 3.12.x”. You don’t have to manually switch anything; uv’s aware of the.python-version
files.Running scripts with a specific interpreter: If for some reason you want to run a one-off command with a different Python, uv allows it. Suppose in your project you temporarily want to test something on Python 3.10 without changing the pin. You can do
uv run --python 3.10 -- python myscript.py
. This tells uv: use Python 3.10 interpreter to run this command (assuming 3.10 is installed via uv). It won’t permanently change the project’s setting, it’s just for that invocation. This might be useful for testing cross-version compatibility or using PyPy for performance tests, etc., without altering your environment.
Integration with environment creation: It’s worth noting how uv’s Python management integrates with virtual env management described above. When uv creates a virtual environment for a project, it uses the pinned or specified Python. If none is specified, uv likely uses the highest version installed or available. If that happens not to be what you want, you can pin after the fact and then recreate the venv. For example, if uv init
by default picked 3.12 but you intended 3.11, you can do uv python pin 3.11
and then perhaps uv venv
to recreate using 3.11 (or remove .venv and uv sync
). Uv strives to automate Python setup: if a collaborator clones your project and it’s pinned to 3.11, and they run uv sync
or uv run
, uv will notice “I don’t have 3.11 installed on this machine, let’s fetch it.” This automatic installer invocation ensures portability — not just of dependencies, but of the Python runtime itself. That’s a killer feature for truly reproducible dev environments.
Performance considerations: Managing Python versions obviously involves downloading large binaries (the Python installer or build). Uv likely caches these, so installing a given version is a one-time hit. The command uv python install
can install multiple versions in parallel too, which saves time if you request many at once. Using a uv-installed Python is typically as fast as using a system Python, with one minor note: uv’s downloaded Pythons are stored in user space (not system-wide), so file paths might be long, but that doesn’t affect runtime significantly. Also, uv ensures that when you use those Pythons in virtual envs, they’re fully functional (including ensurepip or pip if needed). It’s designed to avoid the “it works on Linux but Windows dev doesn’t have Python” scenario by just grabbing the needed interpreter. For performance, also note uv can handle multiple interpreters side by side without interference, something manual setup might struggle with (no messing with PATH or environment variables to switch versions – uv takes care of that contextually). If disk space is a concern, you might not want to hoard too many Python versions; uv doesn’t auto-remove old ones, so you may occasionally prune ones you don’t need by deleting them from its cache (though uv might not have a CLI command for uninstalling Python, doing it manually by removing the directory works, or one could just leave them as they don’t conflict).
Common issues and solutions: If uv fails to install a Python version, check your internet connection or whether the version string is correct. Uv usually accepts just major.minor (it’ll pick latest patch) or major.minor.patch. If you request something obscure or a pre-release, uv might not find it – in which case, specifying an exact version might help. On Windows, uv’s Python install might require the Visual C++ redistributable (just like normal Python does). If an installed uv-Python doesn’t run, ensure any necessary system requirements are in place (though for most, not an issue). Another thing: uv installs are independent of system Python, so if you rely on system packages or modules, be aware that a uv-managed Python is a vanilla one. In rare cases, a user might ask “Does uv handle conda environments or system packages?” – answer: no, uv focuses on CPython itself, not OS-level packages. If a particular Python version is not available (like a very old 2.x or a very new alpha), uv might not support installing it (for alpha/beta maybe, depending if it fetches from Python’s releases). Usually, though, it covers all modern versions.
If you see an error like “Unable to find interpreter X.Y”, ensure you typed it correctly and that uv’s known version list includes it. You can do uv python install --latest 3.10
or similar to let uv pick up latest patch of 3.10. If uv already has a required version available from system, it might use it. Actually, uv will discover existing Python installations too – uv python list
can show system Pythons (like from pyenv or OS). Uv might use those if you pin to one that’s already present (so it doesn’t always force a download if you have Python installed). However, using uv’s own managed ones is nice for consistency. Finally, if multiple projects require different Pythons, uv handles that seamlessly. Just be mindful that if you pin one project to an interpreter you haven’t installed via uv or don’t have on your machine, you need to run uv python install
first (or uv will attempt it). In summary, uv’s Python version management feature wraps the functionality of tools like pyenv into uv, giving you assurance that not only your dependencies are correct, but the Python runtime itself is the right version for your application.
Script and tool support
uv isn’t just for managing project folders with libraries – it also has features for scripts and command-line tools, which bring functionality similar to pipx and make writing standalone Python scripts more convenient. This is a core part of uv’s appeal: it can handle one-off scripts that require their own dependencies, and it can manage globally installed CLI tools all under the same umbrella.
Script dependency management: Suppose you have a single Python file (script) that you want to run without setting up an entire project or virtual environment manually. With uv’s script support, you can embed the dependencies required by that script at the top of the file as metadata, and uv will ensure those are installed in an isolated environment when you run the script. Essentially, uv treats the script itself as a mini-project. The syntax for inline metadata is a special comment section in the script file. For example, at the top of your script you might have:
#!/usr/bin/env -S uv run --script # /// requirements # requests==2.31.0 # pandas==2.0.3
This is a hypothetical example (the exact format may vary, but # ///
is indicative of uv’s section in comments). The shebang line here uses uv run --script
so that if you mark the file as executable, it will invoke uv to run it. The lines following (the “requirements” block) list packages and exact versions needed. When you then execute that script (e.g., ./myscript.py
if on Unix with execute permission, or via uv run myscript.py
explicitly), uv sees that inline metadata, creates a temp environment, installs those packages (if not already cached), and then runs the script inside that environment. This is extremely useful for scripts you might share or run sporadically – you don’t have to manually set up an environment every time, uv does it on the fly. It also means the script is self-contained in terms of specifying its requirements.
Using uv add --script
: There’s a command to help inject that metadata. If you have a script file and you know it needs, say, requests
, you can run uv add --script myscript.py requests
. uv will then open the file and add (or update) the requirements metadata section automatically. In our example above, it would add requests==2.31.0
under the script’s requirements section. This saves you from editing the script manually and ensures that an appropriate version (likely the latest at that time) is pinned. You can list multiple packages as well.
Example: Let’s illustrate. You write a quick script
example.py
that doesimport requests
and uses it. Instead of creating a whole project, you just want to runexample.py
and have it work. Initially,example.py
has no metadata. You runuv add --script example.py requests
. uv processes the file (maybe it adds a block of comments at the top saying requests>=2.x or == specific version). Now you simply douv run example.py
. The output will indicate something like “Reading inline script metadata from: example.py”. If requests (and any dependencies) aren’t installed yet in uv’s cache, uv installs them to an ephemeral environment. Then it executes your script, prints results, and when done, likely discards that environment (or caches the packages globally for next time). The next run of the same script would reuse cached wheels and start up very quickly. This way, you didn’t have to clutter your system Python with requests or create a venv manually. The script is portable; anyone with uv can run it and uv will ensure dependencies are there.
Tool management (like pipx): uv also provides functionality analogous to pipx for managing Python command-line tools. These are typically Python packages that provide console scripts entry points. uv’s tool interface lets you run these tools in isolated environments, or install them so they’re available as commands globally (through uv). The alias uvx
is provided as a shorthand for uv tool run
. For example, if you want to run a one-off command from a package, say pycowsay
(a Python version of cowsay, just for fun), you can do uvx pycowsay "hello world!"
. The first time you run this, uv will resolve and install the pycowsay
package in an ephemeral environment, then execute its console entry point to print the ASCII cow with “hello world!”. The output shows uv quickly installing the tool: “Resolved 1 package... Installed 1 package... + pycowsay==0.0.0.2” and then the actual cow output. After running, uv likely cleans up that environment (or keeps a cache). This is essentially pipx’s “run” feature, but integrated into uv with the benefit of speed.
If you want to have a tool persistently available (like pipx’s install), uv offers uv tool install <package>
. For instance, uv tool install ruff
would install the Ruff linter globally through uv and make the ruff
command available in your shel. Under the hood, uv probably installs ruff into a dedicated location and adds a shim so that typing ruff
runs it via uv’s environment. The output of that command would list that it installed ruff and “Installed 1 executable: ruff”. Now whenever you run ruff
in your terminal, uv will ensure the uv-managed version is executed (depending on how uv does shimming; possibly by modifying your PATH or adding wrapper scripts). This unifies environment management for CLI tools with the rest of uv’s caching and speed.
Why it’s important: The script support encourages writing scripts without worrying about dependency hell – you can just list what you need at the top and share the script; anyone with uv can run it reliably. This is great for automation tasks, utility scripts, or data science notebooks converted to scripts for reproducibility (imagine sharing a single file that when run via uv always pulls the correct version of numpy/pandas required). The tool support is important because Python’s CLI tool installation can be messy (pip installing them either in global site or separate venvs). uv essentially provides an environment-managed way to have global tools: the tools are isolated from each other but accessible globally, avoiding version conflicts and the need to use pip --user
or manual virtualenvs for each.
Examples of real-world use:
Using uvx for ephemeral commands: Suppose you want to run
black
(the code formatter) on a codebase without installing it permanently. You can just douvx black .
in your project directory. uv will fetch black if not already in cache and run it on the current directory’s files. You didn’t need to include black in your project dependencies or have it installed globally – uv took care of one-time usage. Next time you run it, it’s faster because it’s cached.Installing a dev tool: If you frequently use a tool like
httpie
(for making HTTP requests from command line) but don’t want to pollute your base Python,uv tool install httpie
will give you thehttp
command available. uv ensures it’s installed in a self-contained way. If a new version comes out, presumablyuv tool install -U httpie
or similar might update it (or perhapsuv add
could work, but likely there’s an update command – if not, reinstall does it).
Performance considerations: Running a script with uv’s inline dependency approach does incur a slight overhead on first run (to set up env), but uv’s speed mitigates it. For instance, in the example, installing 5 packages took 12msgithub.com – obviously that’s after perhaps a cache hit; the point is uv can do ephemeral envs very quickly. For heavy tools, the initial run equals pip install time (with uv’s improvements), but subsequent runs are nearly instant. Also, uv cleans up ephemeral script envs (likely destroying them after execution to avoid clutter), but caches the packages. This means if you run the script again later, it doesn’t re-download packages, just quickly creates a new env with cached wheels, which is extremely fast (the lock and install might be under a second for many packages). The global tool installations are kept, so their performance is just like any installed program – no overhead beyond normal.
Integration examples: This feature is useful in systems ops or data science workflows. For example, you could include a #!/usr/bin/env uv run --script
shebang in your deployed scripts; as long as uv is present on the system, it ensures dependencies are satisfied on execution. That’s a nice way to deploy a script with all dependencies pinned inside it – an alternative to containerizing or creating a package. Another integration: Think of automation scenarios where you drop a single Python file on a server to do a cron job. With uv, that file can carry its env with it (via metadata), making deployment trivial (just drop the file and ensure uv is installed).
For tools, developers might use uv’s tool install in their dotfiles or setup scripts to quickly get essential CLI tools (like linters, formatters, project generators) ready to use. It’s cross-platform, so the same uv tool install
commands work on Windows, Mac, Linux, avoiding platform-specific package managers for those Python-based tools.
Common errors and solutions: When using uv add --script
, ensure your script file is not read-only and that uv can write to it. Also, if a script already has a requirements
block, uv will update it, but if it’s malformed, you might get a parse error. Typically, uv marks the block with # ///
so it knows where to insert. If you manually edit that section, follow the format uv expects. If a script uses an uncommon way to declare dependencies, uv might not pick it up (but the standard is to use uv’s format). If you run uv run some_script.py
and the script has no metadata and you haven’t pre-installed its imports, you’ll get ImportErrors. Solution: use uv add --script
to embed or pre-install dependencies by another means.
For uvx
and uv tool
, a common question: where do these get installed? uv abstracts it, but essentially uv maintains its own directory for globally installed tools. If a tool is not launching, ensure that uv’s bin directory is in your PATH. uv likely ensures that when you install the first tool. On Unix, it might symlink the tool executable into ~/.local/bin
or similar, which is usually on PATH. On Windows, uv might add shims somewhere or modify the user PATH to include its tool scripts. If you find uv tool install X
succeeded but X
command not found, you might need to add uv’s tool path. Check uv --help
or docs for where it puts executables (for example, pipx uses ~/.local/bin
by default for its wrapper scripts).
Another thing: if a tool has the same name as an existing command, uv’s installed one might shadow or conflict. For example, if you uv tool install pip
, that might be weird (though typically you wouldn’t do that; pip is not needed via uv tool). Or if you install flake8
via uv and you also have it in a project’s venv, you might accidentally use the global one. Generally, keep in mind which one you intend to use. If needed, you can uninstall a tool with uv tool uninstall <pkg>
(assuming that exists – likely it does to mirror pipx). If not, uv remove
might not target global tools (since that’s more for project dependencies), so probably there’s a uv tool remove
or the install list is separate.
All in all, uv’s script and tool support make it easier to work with Python in ad-hoc scenarios, turning Python into a more batteries-included scripting environment. It lowers the barrier to using Python for quick tasks because dependency management is no longer a hassle for a single script or CLI tool – uv has you covered.
Project packaging and publishing
Beyond development and environment management, uv also shines in packaging and publishing Python projects. This feature elevates uv from just a dependency manager to a full project manager in the sense that it can assist in building distribution artifacts (like wheels) and uploading them to package indexes (like PyPI). It essentially can replace tools like setuptools’ setup.py
, build, and twine with a single unified interface.
Building a project (uv build): If your project is meant to be an installable library or application, you’ll eventually need to package it. uv provides a build backend and commands to do so. It adheres to standard packaging, meaning you declare your project metadata in pyproject.toml
under [project]
(PEP 621) – which uv already has you doing for dependencies. Additional fields like name, version, description, license, etc., should be filled out (uv may prompt for some of these during uv init
, or you can edit the file). Once that’s done, you can run uv build
. This will compile your project into distribution files, typically a wheel (binary distribution) and a source distribution (sdist). The output might be something like: it creates a dist/
directory with myproject-0.1.0-py3-none-any.whl
and myproject-0.1.0.tar.gz
. Under the hood, uv’s own build backend likely takes care of things like generating a PKG-INFO
, copying files, building any C extensions if present (though uv’s backend would have to interface with your build system for compiled code – likely it supports pure Python builds, and possibly uses something like maturin or setuptools for extensions). The important part is you didn’t need to install build
or setuptools
explicitly – uv includes this capability. In fact, Astral has announced that uv’s build backend is stable and a viable alternative to setuptools or hatchling for pure-Python packages.
Configuring the project for build: You might need to add a [project.scripts]
section for console_scripts, or [project.entry-points]
for plugins, etc., if your project has those. This is similar to how you’d do with any pyproject-based build system. Uv doesn’t require a setup.cfg
or setup.py
at all if you’re using its backend; everything goes in pyproject.toml
. In some cases, uv can build even projects not originally set up with uv. The README snippet said: “uv supports building and publishing projects, even if they’re not managed with uv”. This implies you could use uv’s build/publish on a legacy project with setup.py, etc., and uv will either hand over to setuptools or use the information present to build. That makes uv a convenient one-stop for packaging tasks on any project.
Publishing (uv publish): After building, you’d want to distribute. uv likely has a uv publish
command that will upload your distributions to PyPI or another index. It probably reads the repository URL and credentials from a config (maybe from ~/.pypirc
or environment variables like PYPI_TOKEN
). The uv README references a “[publish guide]”, indicating a documented process. The usage might be as simple as uv publish
which by default tries to upload to PyPI (or a configured repository) the files in dist/
. If you need to specify repository or credentials, uv might allow flags or environment settings. For example, uv publish --repository testpypi
could upload to TestPyPI if configured, similar to twine’s options. The beauty is that you don’t have to separately install Twine or manage distribution files manually – uv does the build and upload in one go. Uv likely handles things like version checks (ensuring you increment the version to avoid re-upload errors), and might provide helpful error messages if something goes wrong (like missing long description content type, etc., though since uv uses pyproject, those fields are typically correct).
Examples:
Configuring and building: Suppose you’re writing a library “coolmath”. In your pyproject, under
[project]
you set name = "coolmath", version = "0.1.0", authors, etc., and dependencies underdependencies
as uv already did. Also, if you have acoolmath
package directory with an__init__.py
, uv should be able to build it. You runuv build
. uv’s output might show something like “Building wheel for coolmath... built coolmath-0.1.0-py3-none-any.whl” and “Building sdist... built coolmath-0.1.0.tar.gz”. Now you have distribution files ready to distribute.Publishing: Now,
uv publish
might require that you’ve logged in or provided credentials. Possibly uv has auv login
or uses an API token from an env var. Let’s assume you havePYPI_USERNAME
andPYPI_PASSWORD
env or a token. Runninguv publish
then uploads both the wheel and sdist to PyPI. It might output “Uploading coolmath-0.1.0-py3-none-any.whl ... success” and similarly for the sdist. If you have two-factor auth on PyPI, hopefully you’re using a token which bypasses 2FA (since that’s how twine works now). If something fails, uv would report it (like HTTP error if credentials wrong).Even if not using uv for dev: Perhaps you maintain an older project but heard uv is great for quick publishes. You could install uv globally and in that project just run
uv build
anduv publish
. uv would detectsetup.cfg
orsetup.py
if present and try to build either via PEP 517 (if pyproject specified a backend) or fallback to legacy. It acts as a unified interface. This way, you can use uv’s performance for resolution and still integrate with older packaging logic. However, fully embracing uv’s approach (PEP 621 metadata) can simplify your project in the long run.
Performance considerations: Packaging is usually I/O and CPU bound (for large projects, building could mean compiling C extensions). uv’s effect on performance here might not be dramatic since building a wheel is dominated by those factors. However, uv’s advantage is convenience and possibly parallelization of build steps if applicable (though typically wheels built one at a time). One potential performance area is if uv’s builder is written in Rust, certain operations like file copying or compression might be faster than Python’s default tools. It might use the Rust cargo
crate for building wheels, which might be more efficient. Also, uv’s integration can avoid the overhead of invoking external tools (like calling out to python setup.py sdist
or spawning twine processes) by handling it in-process, which is faster. But these savings might be minor in the grand scheme of packaging. The bigger “performance” benefit is fewer manual steps for the developer.
Integration examples: By using uv for packaging, you integrate your development and release process. CI pipelines could use uv to both install deps for tests and also build/publish if doing continuous deployment. For instance, you could have a GitHub Actions job that on a new tag runs uv build
and uv publish
with appropriate credentials. That’s simpler than installing build tools and invoking them separately. Also, uv ensures consistency: the dependencies you tested with via uv are the ones that go into your lockfile, and your build uses those same dependencies declared. There’s less chance of discrepancy between dev and release time.
Additionally, uv’s build backend means you can declare optional features, dynamic versioning, etc., possibly through uv’s configuration. For example, if you want uv to auto-generate a version based on git tags, uv might allow that (some tools do). Or if your package needs to include certain files, uv’s backend likely respects the exclude
patterns in pyproject or includes specified by packages = [...]
. Since uv’s authors also built Ruff, which deals with packaging in Rust, they likely ensure uv’s backend covers common needs.
Common issues and solutions: One common task is including non-Python files in your package. With setuptools, you’d use MANIFEST.in
; with PEP 621 and newer build backends, you often use exclude
/include
in pyproject or have to ensure they’re in package dirs. If uv’s backend is like hatchling or flit, you’ll want to set include = ["data/*"]
or similar in pyproject if you have data files. If you find your built wheel missing files, adjust those config fields. Another possible issue: if your project has native extensions (C code), uv’s build backend might or might not handle it out of the box. If not, you might have to specify a different build backend in pyproject (like maturin or setuptools). However, Astral’s mention suggests uv’s backend is ready for production use for pure Python (not sure about compiled extensions). If uv can’t build something because of C extensions, the solution is to use the appropriate build system (maybe even call uv pip wheel .
which would indirectly use setuptools). But as uv matures, it may incorporate that too.
When publishing, an error you might encounter is “Upload failed: 403 Forbidden” which usually means credentials are wrong or the project name is taken or version already exists. Ensure you bumped the version in pyproject if re-releasing. If your .pypirc isn’t being read, check that uv uses it (it likely does as twine does). Possibly you might do uv publish --username __token__ --password pypi-XXXX
if you want to supply directly.
Also, note stylization: uv prefers the project name in lowercase (like most tools). If your project name in pyproject has uppercase, PyPI will normalize it to lowercase. That’s usually fine, just be consistent.
Finally, if you prefer not to publish to PyPI but to internal index or GitHub Packages, uv likely allows that by config or command flags (like uv publish --repository-url ...
). The official resources would detail how to set that up.
In conclusion, uv’s packaging and publishing features turn it into a one-stop solution from development all the way to distribution. This reduces the cognitive load of juggling multiple tools and ensures a smoother path from writing code to sharing it with the world.
Advanced usage and optimization
Performance optimization
Even though uv is already designed for high performance, there are ways to optimize its usage to squeeze out even more speed and efficiency. One key aspect is caching. uv automatically caches downloaded packages (wheels) in a global cache directory (~/.cache/uv
by default), which means subsequent operations don’t need to re-fetch the same files. To optimize performance, especially in continuous integration or frequent environment rebuilds, you should take advantage of this. For example, in CI pipelines or Docker builds, cache the ~/.cache/uv
folder between runs. This way, a uv sync
or uv add
command can find previously downloaded packages and skip the network, dramatically speeding up installs. Also, uv’s cache deduplicates dependencies across environments, so you are not wasting disk space by having the same version of a library in 5 different virtualenvs. To ensure best performance, avoid clearing the uv cache unless necessary. If you do need to reclaim space, you could prune older unused packages from it, but keep commonly used ones.
Another area of optimization is uv’s parallelism and concurrency. uv by default parallelizes tasks like downloading multiple packages. You usually don’t have to configure this – it will detect your CPU count or use an optimal number of threads. However, if you’re on a very constrained system (say, limited network bandwidth or an old spinning disk), you might not want too much parallel I/O. In such cases, uv might offer flags to limit concurrency (for instance, an environment variable or config in pyproject.toml
might exist to set max resolver threads or download threads). Conversely, if you have a super fast connection and want uv to really saturate it, ensure no external limits are set (like pip’s default is 5 simultaneous downloads – uv might already exceed that if beneficial). Keep in mind that pip’s new resolver in Python is heavy on CPU; uv’s PubGrub is efficient, but if you have extremely large dependency trees (hundreds of packages), resolution could still take some time (though much less than pip). If you want to profile uv’s performance on resolution, you can run uv lock --verbose
to see which step might be slow (like network vs resolution). But typically, uv’s overhead is minimal. Real optimization might involve simplifying your dependency constraints to make resolution faster (e.g., avoid overly broad version specifiers that cause backtracking).
Memory management techniques: uv is compiled and memory-efficient, but when installing very large packages or many packages at once, it will consume memory to hold metadata and caches. If you are in a low-memory environment (say a small container), you can optimize by installing in smaller batches or using uv sync
with a lock (which might be more straightforward for the resolver). That said, memory is usually not a big problem with uv since Rust handles it well and frees promptly when done. Unlike pip that could sometimes spike in memory usage when handling large requirement sets, uv should be more modest. Still, if you are automating uv on devices like Raspberry Pi with limited RAM, you might consider adding swap or not running parallel compile jobs simultaneously with uv.
Speed optimization strategies: Besides caching, one strategy is to use uv’s lockfile to avoid re-resolving dependencies repeatedly. For example, during development, after you’ve added all needed packages, subsequent uv sync
operations (which apply the lockfile) are way faster than doing uv add
one by one from scratch on a new machine. So incorporate uv lock
into your flow – check in your uv.lock
file to version control. Then new clones can do uv sync
(fast) rather than a bunch of uv add
commands (each triggers resolution). Another tip: if you know you’re going to add multiple packages, you can add them in one uv add
command instead of sequentially. A single uv add
with multiple packages resolves them together, which is usually more efficient than doing each one separately (which would re-resolve intermediate states multiple times). Uv’s resolver can handle them all at once and maybe find better overall compatibility in one go.
Parallel processing capabilities: uv itself handles parallel tasks internally, but you can also parallelize across uv operations in certain scenarios. For instance, uv can install multiple Python versions in parallel (if you run uv python install
with multiple versions, it will likely fetch them concurrently). But normally, you wouldn’t run two uv commands at the same time on the same project (that could cause contention on the lock or cache). However, if you had two completely separate uv projects, you could install deps concurrently in two shells if needed – uv’s cache locking would ensure they don’t corrupt each other’s downloads. Still, that’s an edge case. A more useful parallelization approach is using uv’s workspace
feature where uv might resolve and manage multiple sub-projects together (possibly leveraging combined solving if dependencies intersect). This reduces overhead compared to handling each sub-project sequentially. Workspaces allow monorepos to have one lockfile for multiple packages, so uv can consider the whole picture at once, which can be more efficient than separate resolution per package.
Caching strategies: We’ve covered package cache, but there’s also the Python version cache. uv caches the downloaded Python installers (and the installed Python binaries in its directory). If you are frequently spinning up ephemeral containers where you need a specific Python, you might pre-bake that into a base image to save the download each time. For example, if your CI always needs Python 3.10 via uv, consider having a base image with uv and Python 3.10 already installed (using uv python install
during image creation). Then each pipeline job doesn’t need to re-download that interpreter. Also note that uv’s installed Pythons are under your user directory – if you have multiple CI runs on the same agent, once one run installs Python 3.10, the next run can reuse it (unless the workspace is cleaned). On developer machines, you seldom need to remove these installed Pythons – having a few versions around is beneficial for testing, and uv will automatically use them when needed. If you do need to free space, you can remove older Python versions that your projects no longer target.
Profiling and benchmarking uv: If you ever want to measure uv’s performance (perhaps to prove to your team the gains, or to compare with alternatives), you can use standard tools like time
command or resource monitors. For resolution speed, try timing uv lock
vs pip-compile
on the same requirements set – uv is typically much faster. For install speed, time uv sync
vs pip install -r
on a large requirements.txt – uv tends to finish in a fraction of the time (especially with cached wheels). To profile internals, since uv is in Rust, you can’t easily attach a Python profiler. But uv might have a verbose mode that prints timing breakdown (like how pip has --verbose
that shows each requirement fetch). Uv’s maintainers have claimed 10-100x speedups, which we see in practice when heavy tasks are involved. These speedups come from eliminating Python’s overhead and using smarter algorithms.
In advanced use, you might find that the bottleneck shifts. With uv, installing packages might be so fast that the slowest part is now building any packages that have to compile from source (like some C extensions). To optimize that, ensure wheels are available or pre-build wheels for your environment. For instance, if a dependency is always compiling and taking time, see if you can install a wheel by adding or using a different index (like manylinux wheels from PyPI). Uv will happily install those if available, saving you compile time. Also, if you have control, pin to versions that have wheels. This is less about uv itself and more about dependency selection to optimize speed.
In summary, uv’s default performance is excellent, but by leveraging caching, lockfiles, parallel operations, and being mindful of heavy steps like compilation, you can make your development and CI workflows even more lightning-fast and resource-friendly.
Best practices
Using uv effectively goes beyond just the commands – it’s also about how you organize and manage your project and workflow. Here are some best practices to follow when adopting uv:
Organize your project with standard conventions. uv encourages using the pyproject.toml for all metadata. Make sure to fill out the [project]
section thoroughly – name, version, description, authors, license, classifiers, etc. This not only helps in publishing (if you do) but also keeps documentation for your project’s requirements clear. Keep your source code in a dedicated package directory (e.g., myapp/
or mypackage/
) and ensure uv is including it in the build (via packages = ["myapp"]
if needed in pyproject). Because uv auto-inits a git repo on uv init
, leverage that: commit your changes often, including the pyproject.toml and uv.lock. A well-organized repository with uv means any new contributor just needs to clone and run uv sync
to get started.
Commit and maintain your lockfile. Treat the uv.lock
file similar to a Pipfile.lock or poetry.lock – i.e., put it under version control. This ensures that everyone on the team and in CI is working with the exact same dependency versions, eliminating “it works on my machine” issues due to dependency drift. When you update dependencies (via uv add
or uv upgrade
), the lockfile will change; review those changes in code reviews (it’s often just version bumps or new additions). Committing the lockfile makes builds reproducible. Also, if something goes wrong after an update, you can use version control to diff what changed in uv.lock to pinpoint which library upgrade may have caused an issue.
Use uv for all dependency changes rather than pip. This means, if you need to install a new package, resist the temptation to do pip install
inside the venv – always do uv add
. This keeps pyproject and uv.lock consistent with the environment. If you manually pip install something for quick testing, it won’t be recorded, and that can cause confusion later. Similarly, to remove a dependency, use uv remove
, not just pip uninstall, so that the config files are updated. By funneling all such changes through uv, you maintain a single source of truth.
Handle dev vs prod dependencies properly. uv supports dev (development) dependencies which are needed for development (like testing frameworks, linters) but not for runtime. The best practice is to separate these so that production environments (or your deployed application) isn’t burdened with packages it doesn’t need. In uv, when adding, use uv add --dev <package>
(or possibly uv add -D
) to mark something as a dev dependency. This likely puts it in a separate section or marks it in the lockfile such that you could choose not to install dev deps in production. For example, maybe uv sync
by default installs all dependencies, but uv might have an option to install only non-dev. Check uv docs on how it differentiates dev dependencies – perhaps the pyproject gains an extra table [tool.uv.dev-dependencies] or so. If that feature exists, use it to keep your deploy footprint minimal. For instance, add pytest, black, mypy, etc., as dev deps, and your production build (if using uv there) can ignore them.
Continuous Integration and Testing: Integrate uv into your CI pipelines. Instead of using pip, use uv
in your CI scripts. A common pattern is:
uv sync # install exact locked dependencies
uv run python -m pytest # run tests inside the env
This ensures tests run in the same environment developers use. If you want to test against multiple Python versions, uv makes that easier: you can have a matrix in CI that installs different Python versions (using uv’s Python installer) and runs uv sync
in each – because uv.lock is universal, it will pick appropriate wheels for each platform/py version. Also, consider adding a CI step to check that the lockfile is up-to-date: for example, run uv lock
on CI and see if it produces changes (meaning someone updated pyproject dependencies but forgot to commit lockfile). You can fail the build in that case, reminding them to update the lock.
Documentation and onboarding: In your project’s README, include instructions for new developers to install uv and use it. For instance, “First, install uv (via pipx or the installer script). Then run uv sync
to set up the project environment. Use uv run
for running the application or tools.” By making uv part of the documented workflow, you ensure consistency. Also mention any frequently used uv commands for the project, like how to run tests (uv run pytest
), how to add a new dependency, etc. This lowers the learning curve for those new to uv or to your project.
Error handling strategies: If something goes wrong with uv (like an installation failure), read the error – uv often gives a clear message. Encourage team members to not circumvent uv at first sign of trouble, but rather diagnose or seek help, because doing things outside uv might solve a short-term problem but create inconsistency. For example, if uv add
fails due to a conflict, address the conflict rather than pip installing forcefully. Use uv’s error output to resolve version conflicts by maybe loosening/tightening a version spec in pyproject. In cases where uv’s behavior is not understood, it might be beneficial to consult uv’s documentation or community (perhaps Astral’s Discord or GitHub issues). Since uv is evolving, it’s possible you hit a bug or an unimplemented edge case; the best practice there is to report it or find workarounds that still align with uv’s workflow.
Testing approaches with uv: For unit tests, as mentioned, use uv run
to execute them to ensure the correct env. If you need to test different sets of dependencies (like optional features), uv doesn’t yet have an official concept of dependency groups beyond dev vs main. But you could maintain separate lockfiles if needed for a scenario. A simpler approach: have separate dev dependencies that can be toggled via flags in tests. In integration tests, if you spin up ephemeral environments, consider using uv’s script feature or just call uv via CLI in the test script to set up environment as needed. The consistency uv provides (thanks to the lockfile) is a big plus for reliable tests.
Documentation standards: If your project is a library, and you plan to publish it, using uv doesn’t impose anything on your end users (they install via pip normally). But internally, document how to cut a release with uv (e.g., “bump version in pyproject, run uv build
, run uv publish
”). Possibly create a Makefile or a shell script that calls uv to do these steps, so maintainers have an easy time. Make sure your pyproject has the readme
field set to your README file and the readme-content-type
correctly, to avoid PyPI description issues. Uv likely handles this if you specify readme = "README.md"
and readme-content-type = "text/markdown"
in pyproject. Following these standards ensures that when using uv for packaging, everything is in place.
Production deployment tips: If you are deploying an application (not a library) using uv, you have a couple of options. You could use uv on the production servers or build systems to create the environment (for example, in Docker, after copying the code, run uv sync
to install deps). This will ensure the exact versions from uv.lock are used. Alternatively, you could generate a requirements.txt from uv.lock and use pip in production – uv provides uv pip compile --output requirements.txt
for such scenarios. Either approach is fine; using uv directly in production might require installing uv on the server, which is an extra binary but fairly lightweight (just ensure you trust it in prod). Many will find it simplest to use uv in CI to produce a deterministic artifact (like a Docker image with everything installed). If you do include uv in your production images, remember to update it occasionally (uv self update
) to get the latest bug fixes and speed improvements. Given uv’s focus on reproducibility, it aligns well with infrastructure-as-code and containerization approaches.
In summary, best practices with uv revolve around embracing its way of doing things across your workflow: keep everything in sync via pyproject and lockfiles, let uv handle the heavy lifting of env management, and document and enforce usage of uv so that all team members and processes are on the same page. By doing so, you’ll get the maximum benefit from uv – less time debugging environment issues and more time coding.
Real-world applications
Case Study 1: accelerating CI/CD for a web service – A mid-sized tech company migrated an existing Flask web service project to uv to speed up their CI pipeline. Previously, each CI run (GitHub Actions) would spend ~2-3 minutes in setting up a Python environment: creating a venv, pip installing from requirements, etc. After adopting uv, they committed a uv.lock
with their dependencies. In their CI script, they replaced the pip install step with uv sync
, which consistently took under 10 seconds to resolve and install ~50 packages (thanks to uv’s efficient resolver and parallel downloads). Over many builds, this saved hours of runner time. Additionally, because uv ensured exact dependency versions, they saw fewer build failures due to transient dependency issues. For instance, pip used to occasionally pick up a new library version that broke something; uv’s lock prevented that. The team noted that builds were not only faster but also more reliable, helping them deploy to production more quickly and confidently.
Case study 2: cross-platform development in an open source project – An open-source maintainer of a CLI tool (packaged in Python) was struggling to support contributors on different operating systems. Some contributors used Windows and had difficulty setting up the project (virtualenv activation, correct Python version), while others on Mac/Linux had their own environment quirks. The maintainer adopted uv to simplify setup for everyone. They pinned the project to Python 3.11 with uv and added a uv.lock
. Now, new contributors just needed to install uv and run uv sync
– uv would automatically download Python 3.11 on Windows if not present and create the venv. This solved issues like Windows users having path problems or wrong Python on PATH. One Windows developer noted, “Moved the Python piece of this project to uv and solved all these problems with a single uv sync
”. The project’s documentation was updated to instruct using uv, and it significantly lowered the barrier to entry for contributors on all platforms.
Case study 3: replacing pipenv in a data science team – A data science team had been using Pipenv to manage an analysis pipeline’s environment, but they encountered performance issues: Pipenv’s lock could be slow and pipenv sync often took a long time with large libraries like NumPy, SciPy, pandas, etc. They switched to uv after hearing about its speed. The initial lock (resolving about 20 scientific packages with complex dependencies) took only a few seconds with uv, compared to nearly a minute with pipenv. More importantly, installing these packages into a fresh environment was dramatically faster – what used to be a 5-minute pipenv install was around 30 seconds with uv. The team also appreciated uv’s handling of Python itself: their analysis required Python 3.10, and uv automatically managed that, whereas before they had to instruct everyone to install the correct Python version. With uv, each team member, whether on Windows or Mac, got the exact same Python and package versions automatically. This consistency eliminated weird discrepancies (like one person using an older NumPy that pipenv didn’t properly constrain on one platform). Overall, the team’s iteration cycle improved, as they could tear down and recreate environments quickly, encouraging them to use isolated envs for separate projects rather than mixing dependencies.
Case study 4: UV for tooling in a large organization – A large enterprise’s DevOps team started using uv to manage the plethora of Python-based developer tools used across projects. They had tools like ansible
, awscli
, flake8
, etc., previously installed manually or via system packages on dev machines, which led to version drift. The DevOps team created an internal script that uses uv tool install
to set up a standardized set of CLI tools for any developer workstation. New hires would run a bootstrapping script that installs uv, then runs a series of uv tool installs (for linters, testing tools, etc.). This ensured everyone had the same versions of these tools, isolated from their project environments. They found that uv’s global cache also meant that if a developer already had some tool’s package cached via a project, installing it as a global tool was instantaneous. For example, if a user had used black
in a project and it was cached, the uv tool install black
for global usage took negligible time. The organization saw fewer “works on my machine” problems for tooling – such as a consistent terraform-lint
or pre-commit
version being used by all.
Case study 5: lightning-fast environment setup in academia – A university research lab managed a complex environment for a machine learning experiment, involving over 80 packages (TensorFlow, scikit-learn, etc., including some with GPU support). Setting this up with pip was error-prone and slow, so initially they used conda, but mixing pip and conda packages caused conflicts and bloated environments. They tried uv and were impressed by the speed and simplicity. Using uv, they created a lockfile for all required packages (targeting a specific CUDA-compatible version of TensorFlow). uv’s resolver handled the versions smoothly, something that was a headache with conda’s solver sometimes. In deployment, using uv sync
allowed them to provision identical environments on several GPU servers in seconds, where previously conda solves could take minutes. Performance metrics: a conda environment solve and create was ~4 minutes, whereas uv accomplished environment creation in under 30 seconds on the same machines (with warm cache). This meant they could tear down and recreate environments for each run, leading to more reproducible experiments. Additionally, uv’s universal lockfile meant that one lockfile worked for both a researcher’s Windows 11 laptop and the lab’s Linux servers – uv would pick appropriate wheels for each. This cross-platform reproducibility was something they hadn’t easily achieved before.
Case study 6: migration of a monorepo to uv workspaces – A startup maintained a monorepo with multiple Python packages and tools (some internal libraries, a backend service, and a CLI utility). Managing dependencies was complex – they tried Poetry’s multi-project support but faced issues, and pip + requirements files were getting unwieldy. They migrated to uv’s workspace feature, defining a top-level pyproject that listed each package in the workspace. uv was able to create a single lockfile encapsulating all the interdependencies. They set up each sub-project to use the shared lock, meaning if Package A and Service B both depend on requests
, it’s resolved once for both. The result: developers run uv sync
at repo root and get a .venv that has everything, or they can still isolate sub-envs if needed but with consistent versions. Build times improved because caching was unified, and there were fewer issues of one part of the repo requiring an upgrade that breaks another – uv’s resolver finds a compromise or flags conflicts early. The team appreciated that uv effectively acted like Cargo (from Rust) for their Python monorepo, a comparison the devs liked because many were coming from Rust background. This case also showed uv’s maturity: handling multiple packages and even building them for distribution with uv build
in a workspace context.
Case study 7: adoption in a continuous deployment workflow – A DevOps team integrated uv into their continuous deployment for a Django web application. They built a Docker image for the app that previously relied on pip installing from a requirements.txt. By switching to uv, they added the uv binary to the base image and replaced pip install -r requirements.txt
with uv sync
. The locked dependencies ensured the Docker image was identical to the development environment. Moreover, because uv’s install was faster, the Docker build time shrank (which is valuable when deploying frequently). They measured that pip installing ~100 packages (including heavy ones like numpy, psycopg2 etc., some compiled) took around 2 minutes, while uv sync
with warm cache (using Docker layer caching) took about 20 seconds. Over many deploys a day, this saved significant time. Additionally, one surprise benefit was size: the uv-managed environment ended up slightly smaller because uv didn’t include pip itself or pip’s metadata bloat in the final image (since uv doesn’t need to leave behind as much installation cruft in site-packages). The production containers were a few megabytes lighter, and every bit counts in a microservices architecture.
These case studies illustrate uv’s impact: faster installs, unified tooling, improved cross-platform consistency, and streamlined workflows in various domains from web dev to data science. In each scenario, uv helped eliminate either performance bottlenecks or sources of inconsistency that older tooling struggled with, thus validating uv’s importance as a modern Python development tool.
Alternatives and comparisons
Detailed comparison table
To evaluate uv in context, here’s a comparison between uv and several alternative Python library/tools (pip + pip-tools, Poetry and Pipenv):
Feature/Criteria | uv (Astral) | pip + pip-tools (pip + reqs) | Poetry |
---|---|---|---|
Scope of functionality | All-in-one: manages packages, virtualenvs, Python versions, scripts, and publishing. | Basic: pip installs packages; pip-tools (pip-compile/pip-sync) for lock files, but no environment or Python version management. | Comprehensive: manages deps with lock (poetry.lock), virtualenv creation, and can build/publish packages. |
Performance | ⚡ Very fast – 10-100× faster installs and resolution than pip (Rust implementation, parallel ops, PubGrub solver). Scales well with large deps. | Moderate – pip is in C but resolver is slower for complex deps; pip-tools adds overhead (pure Python) for locking. Large requirements can be slow to resolve. | Decent – written in Python, uses own solver (backtracking) which can be slower than uv. Installation uses pip under the hood, so similar install speed as pip. |
Dependency resolution | Modern PubGrub algorithm – efficient, gives clear conflict errors, produces a single universal lockfile for all platforms. Handles version overrides and multi-platform deps nicely. | pip’s new resolver (backtracking) – fairly reliable but can be slow and gives less detailed conflict errors. pip-compile generates lock per environment (not universal cross-OS). | Backtracking resolver – often effective but can hang or be slow on tricky conflicts. Poetry lock is per platform (with markers), not fully universal. |
Virtual environment | Automatically creates and manages .venv per project; no manual activation needed (via uv run commands). Global cache for packages avoids duplication. | pip doesn’t manage venvs; user must create/activate venv manually. pip-tools doesn’t handle env either. | Automatically creates venv for project (unless configured to use system env). Activation is manual or via poetry run . Generally smooth venv integration. |
Python version management | Yes – can install multiple Python versions (e.g., uv python install 3.11 ), and pin project to a version (with .python-version ). Automates downloading and switching interpreters. | No – relies on whatever Python you run pip with. pip-tools has no notion of managing Python itself (apart from markers). | Partial – Poetry can use a specific Python if you poetry env use X (it will create env with that version if installed on system). But it doesn’t download Python for you. |
Lockfile support | Yes – uv.lock with hashes, fully reproducible. Lockfile is platform-independent, includes all needed info for any OS. | Yes (via pip-tools) – requirements.txt can act as lock or pip-compile makes a requirements.txt with hashes. Typically one per environment or with environment markers. | Yes – poetry.lock with hashes. It captures platforms via markers, but sometimes separate lock needed for different OS if deps drastically differ. |
Ease of use / UX | Modern CLI, simple commands (uv add , uv sync , uv run ). Consistent behavior; high-level commands reduce need to know pip/venv details. Some familiarity needed for advanced (scripts, tools) but docs are growing. | pip is low-level but ubiquitous. pip-tools is a separate workflow (running pip-compile then pip-sync). It’s more manual and less integrated (two tools instead of one). | Poetry has a rich CLI (poetry add , poetry run , etc.) and a config file. Generally user-friendly, though initial setup of Poetry itself and some idiosyncrasies exist. |
Community & maintenance | Emerging – Backed by Astral (company behind Ruff), rapidly gaining traction (65k+ GitHub stars). Active development in 2024-2025 (frequent releases). Community is growing; likely to become mainstream if momentum continues. | pip is very widely used (the default). Maintained by PyPA, so strong support. pip-tools is moderately popular in niche of locking; maintained but smaller community. | Large community – Poetry became a standard for many projects. Actively maintained by PyPA now. Many tutorials and integrations support it. |
Documentation quality | Good and improving – official docs on Astral site, plus RealPython tutorials and community blogs. Some advanced features (workspaces, script metadata) require reading the guides. | pip’s documentation is comprehensive but pip-tools documentation is okay (GitHub README mostly). Community knowledge high for pip, moderate for pip-tools. | Very good – detailed official docs, lots of examples. Poetry’s error messages also often suggest solutions. Plenty of third-party articles. |
Licensing | Apache 2.0 / MIT dual license – permissive, no constraints for use in OSS or commercial. | MIT for pip; pip-tools also MIT. Very permissive. | MIT license – permissive open source. |
When to use | Use uv if you want a one-stop solution for Python projects: it’s ideal for speed and convenience, especially in projects where setup time matters (CI, onboarding) or where consistency across machines is crucial. Great if you like modern tools (Rust-backed) and want to manage Python versions and packages together. Slight learning curve, but payoff is high for medium to large projects. | pip + pip-tools is a lightweight approach if you want to stick close to the default and only need dependency locking. Good for simple or small projects where a full tool might be overkill. However, you miss out on env and Python management, so you’ll do more manual work or use other tools alongside. | Poetry is a solid choice for many Python projects (packaging and apps) if you prefer a mature, integrated tool in Python. It excels at traditional package management and publishing. Use it if your workflow aligns with it and you don’t mind it being a bit slower than uv. Not as fast as uv, but widely accepted. |
Migration guide
When to migrate to uv: If you’re currently using another tool (pipenv, poetry, etc.) and facing pain points like slow installations, inconsistent environments, or juggling multiple tools, migrating to uv can be a wise choice. Common triggers for migration include: CI builds taking too long (pip is slow, pipenv or poetry resolver is dragging), developers on different OS having setup issues, or just a desire to unify tooling (maybe you use pyenv + pip + pip-tools + virtualenv separately – uv can replace all of those). Also, if you want to take advantage of uv’s features (like automatic Python installation or inline script dependencies), that’s a good reason to migrate. Conversely, if you have a stable workflow and performance is not an issue, you might not need to migrate immediately. But many projects in 2024-2025 are considering uv as it matures, to simplify their toolchain.
Preparing for migration: The first step is to ensure that uv supports everything you need. Check uv’s current version and features. As of late 2025, uv is quite feature-complete for typical package management and publishing tasks. If you use an exotic feature of another tool (like Poetry’s poetry-specific dev groups or Conda’s non-Py packages), plan how to handle those in uv (for dev groups, uv has dev deps; for external libs, you might install via system or another method since uv focuses on Python packages). Next, install uv (you can even add it alongside existing tools initially). It’s safe to install uv in an environment where you have pip/poetry; uv works independently.
Step-by-step migration process:
1. Generate a uv pyproject and lockfile: If your current project uses a requirements.txt, a Pipfile, or a Poetry pyproject, translate it into uv's format. The easiest case is an existing pyproject: uv follows PEP 621, so a `[project]` section works as-is. Poetry's `[tool.poetry.dependencies]` needs conversion to the standard `[project]` section (Poetry 1.2+ writes to `[tool.poetry]`, so either export to requirements via Poetry or edit manually). For a requirements.txt, run `uv init` to create a baseline pyproject.toml, then `uv add` each requirement, or install them with `uv pip install -r requirements.txt` inside the uv-managed venv and run `uv lock` to capture versions (uv may not import a requirements file directly, so you may need to script the `uv add` calls). For a Pipenv Pipfile, convert to requirements with `pipenv lock -r > req.txt` and follow the same approach. Essentially, get your dependencies into uv by adding them.
2. Lock and verify: Run `uv lock` to ensure a `uv.lock` is generated, and compare it with your previous lockfile (if any). uv may pick slightly different sub-dependency versions due to its different resolution strategy; if all tests pass with them, fine. If you need the exact old versions, add constraints in pyproject or run `uv add package==version`. In most cases, uv matches or finds compatible versions.
3. Commit uv files: Add pyproject.toml (with `[project]` populated) and uv.lock to version control. You might keep the old files (requirements.txt, Pipfile, etc.) temporarily as a fallback, but plan to remove them once uv is proven.
4. Update documentation and scripts: Replace old commands with uv. In the README, say "use `uv sync`" instead of "`pip install -r requirements.txt`". Update Makefiles and convenience scripts: `make install` can now just call `uv sync`. Switch CI pipelines too: in GitHub Actions, for instance, remove the pipenv or poetry setup steps and install uv instead (the `astral-sh/setup-uv` action installs uv quickly in CI), then run `uv sync` to install dependencies; see the CI sketch after this list.
5. Ensure the team has uv installed: Team members should install uv locally (via pipx or the installer). This is a one-time overhead. Provide instructions, or add a bootstrap script that detects whether uv is present and, if not, offers to run the curl installer.
6. Optional – clean up old config: Once you are confident in the uv setup (perhaps after running both in parallel for a short while), remove the old config files (requirements.txt, Pipfile) to avoid confusion. If you must keep them for external reasons, auto-generate them from uv's lock (e.g., a script running `uv export --format requirements-txt > requirements.txt`) so anyone not using uv stays in sync, though ideally everyone moves to uv.
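To illustrate step 4, a hedged GitHub Actions sketch using the `astral-sh/setup-uv` action; the workflow layout and version tags are assumptions, so adapt them to your pipeline:

```yaml
# .github/workflows/test.yml -- illustrative migration target
name: tests
on: [push]
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: astral-sh/setup-uv@v3   # version tag is illustrative
      - run: uv sync                  # create .venv from uv.lock
      - run: uv run pytest            # assumes pytest is a dev dependency
```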
Code conversion examples:
Pip requirements to uv: If requirements.txt had lines like `Django==4.2.5` and `requests>=2.28`, run `uv add "Django==4.2.5" "requests>=2.28"`. This populates pyproject with those dependencies using the constraints as given (uv might lock requests to a specific version like 2.31.0 while keeping `>=2.28` as the constraint in pyproject). After `uv lock`, `uv.lock` pins Django 4.2.5 and requests 2.31.0 (for example), and `uv sync` replicates that environment anywhere.
Pipenv Pipfile to uv: A Pipfile might have `[packages]` with `requests = "*"`. The uv equivalent is `uv add requests` (which defaults to the latest compatible version); for specific versions, proceed as above. If the Pipfile had a dev section, use `uv add --dev <pkg>`, which marks those packages as development dependencies in uv's metadata and lockfile.
Poetry to uv: If your project was using Poetry, you likely have `[tool.poetry.dependencies]` in pyproject. Create a `[project]` section with a `dependencies` list of string specs: for example, Poetry's `numpy = "^1.24"` becomes `dependencies = ["numpy>=1.24.0,<2.0.0"]` (the caret translates to a version range). Copy the project metadata (name, version, etc.) from `[tool.poetry]` to `[project]`, minding minor syntax differences (e.g., authors become a list of strings). After adjusting pyproject, run `uv lock`; it should yield the same or very similar versions to poetry.lock. Differences can appear if Poetry's resolver had picked differently, so test to ensure everything works. Then drop the `[tool.poetry.*]` sections if no longer needed, to avoid confusion (uv likely ignores them anyway). A before/after sketch follows this list.
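To make the Poetry conversion concrete, a minimal before/after sketch; the project name and versions are illustrative:

```toml
# Before (Poetry), shown as comments since a file has one format at a time:
# [tool.poetry]
# name = "myapp"
# version = "0.1.0"
# [tool.poetry.dependencies]
# python = "^3.10"
# numpy = "^1.24"

# After (PEP 621, which uv reads directly):
[project]
name = "myapp"
version = "0.1.0"
requires-python = ">=3.10,<4.0"
dependencies = ["numpy>=1.24.0,<2.0.0"]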
Common pitfalls and how to avoid them:
Mixed tool usage: Don't use uv and the old tool side by side on the same project; that leads to confusion. Once migrated, stop running `poetry install` or `pipenv install`, as those can overwrite or conflict with uv's environment. Standardize on uv.
IDE integration: If your IDE was pointed at another tool's interpreter (like PyCharm using Poetry's venv), repoint it to uv's venv. uv's venv is `.venv` in the project root, which PyCharm and VS Code detect automatically; if you pinned a Python version, `.venv` uses it, so set the IDE interpreter to `.venv/bin/python`. Advise the team to add packages through the uv CLI rather than the IDE's "install package" buttons: installs into the active `.venv` will generally work, but they bypass pyproject and the lockfile. Treat uv as the source of truth.
Continuous deployment: If your deployment relied on another tool (say `pip install` in Docker), test the pipeline with uv. A production image needs uv available (install it via pip in the Dockerfile or copy the binary) to run `uv sync`. Alternatively, generate a requirements.txt from uv.lock if you prefer not to ship uv in the image, though including uv is usually fine (it's a small, self-contained compiled binary). A Dockerfile sketch follows this list.
Platform-specific differences: After migration, double-check that everything works on every target OS. uv's lockfile is universal, but the old workflow may have had OS-specific handling (like Poetry extras for Windows). uv's solver records environment markers in the lock for OS-specific dependencies (e.g., `colorama` on Windows); make sure those appear and resolve correctly. Testing environment creation on at least one dev machine per OS is wise.
Dev vs prod extras: If you had multiple requirements files (like requirements-dev.txt), map them to uv's dev dependencies. Dev deps are recorded in the lockfile but marked separately, so production deploys can exclude them: run `uv sync --no-dev` (check your uv version's docs for the exact flag) to install only runtime dependencies. Keeping dev dependencies minimal also keeps production environments lean.
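To make the deployment point concrete, a minimal Dockerfile sketch, assuming a simple app with a main.py entry point; the base image and paths are illustrative:

```dockerfile
# Illustrative production image built around uv; adjust versions as needed.
FROM python:3.12-slim
WORKDIR /app

# uv ships as a pre-built wheel; no Rust toolchain required.
RUN pip install --no-cache-dir uv

# Copy the project (pyproject.toml, uv.lock, and source) and sync.
COPY . .
RUN uv sync --no-dev   # build .venv from the lockfile, excluding dev deps

# Run the app through the project venv.
CMD ["uv", "run", "python", "main.py"]
```

In a real image you would usually copy pyproject.toml and uv.lock before the rest of the source to get better layer caching on dependency installs.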
Migrating away from uv: (For completeness, though likely rare.) If you needed to go back, or move to a different tool, you'd essentially do the reverse: export a requirements list via `uv pip freeze` or `uv pip compile`. uv's compatibility layer means you can generate a requirements.txt with exact versions (e.g., `uv pip freeze > requirements.txt`), which can feed pip or other tools. For Poetry, you could convert uv's pyproject/lock to its format manually. But given uv's advantages, going back is usually not desired unless a showstopper arises.
Common pitfalls to avoid: One pitfall is forgetting to update team documentation or CI, leading to confusion when someone still runs `pip install`, or the CI fails because it doesn't know about uv. Update every place the old commands appear. Another is not removing old virtual environments: if devs had an existing venv from pip or Poetry, they should remove it and let uv create a fresh one, so leftover packages don't interfere. uv defaults to `.venv`, the same directory Poetry sometimes uses; uv will reuse or rebuild an existing one, but the cleanest path is to delete `.venv` and run `uv sync` to start fresh.
Finally, embrace uv's workflow fully: encourage `uv run` for running apps and tests so the venv is always invoked correctly. After migration, some devs may still type `python script.py` out of habit; make sure they realize that uses the global Python if the venv isn't activated. Either activate the venv or use `uv run` to pick it up automatically. Over time these habits set in, and the payoff of the migration is a faster, smoother development workflow.
Resources and further reading
Official resources
uv Documentation (Astral docs): The official documentation for uv is hosted by Astral at docs.astral.sh/uv. This is the primary resource for installation instructions, command reference, and usage guides. It covers everything from basic `uv` commands to advanced features like scripts, tools, workspaces, and publishing. The docs are continuously updated as uv evolves, so they're a reliable place to learn the latest capabilities and best practices.
GitHub repository (astral-sh/uv): uv's source code and issue tracker can be found at github.com/astral-sh/uv. The README provides a great overview, including quick-start examples and key highlights. You can also browse closed issues and discussions to see common questions or resolved bugs. The repository is active, and contributions or feedback (via issues) are welcome.
PyPI project page: uv's project page is pypi.org/project/uv. It lists the latest release version (for example, 0.8.13 as of Aug 2025), the release history, and project details like license and maintainers, plus links to the documentation and source. While not a tutorial, PyPI is useful for checking the current version and installing uv (`pip install uv`).
Astral blog (uv announcements): The creators maintain an official blog on Astral's site where they announced uv and major updates. For instance, the post "uv: Unified Python packaging" introduced uv's expanded feature set beyond pip replacement. These posts explain why features were added and how to leverage them. Check astral.sh/blog for uv-related articles and release announcements.
Astral's Discord (community chat): Astral Software Inc. hosts a community Discord server (linked on the PyPI page). It has channels for uv support, where developers and maintainers discuss usage, issues, and feature requests in real time. It's an official resource in that Astral team members are present, and a great place to seek help or stay in the loop on uv's development.
Community resources
Stack Overflow (uv tag): As uv gains popularity, developers ask questions on Stack Overflow. Browse the uv tag for Q&A; common questions involve migrating to uv, using `uv pip` commands in CI, or resolving specific errors. The community (including some uv contributors) often answers with helpful tips. If you hit a specific issue, searching here can surface solutions and workarounds others have found.
Reddit – r/Python discussions: Reddit's r/Python has threads about uv, especially around its release and subsequent updates. For example, a post titled "uv: Unified Python packaging" sparked discussion and user feedback, with people sharing experiences, benchmarks, and comparisons with other tools. Searching r/Python for "uv Astral" or "uv package manager" will surface these threads, which are useful for gauging community reception and picking up informal advice.
Real Python tutorial on uv: RealPython.com published a detailed tutorial “Managing Python Projects With uv: An All-in-One Solution” by Leodanis Pozo Ramos. This is a great community-driven resource (Real Python is external to Astral) that walks through uv’s features step by step. It includes code examples and is written in an accessible way, making it a good complement to official docs. RealPython often covers practical scenarios, so it’s highly recommended for learners.
Medium articles and Dev blogs: Members of the community have written blog posts on Medium and personal blogs about uv. For instance, Vishnu Sivan (one of uv’s contributors) wrote “Introducing uv: Next-Gen Python Package Manager” on Medium, explaining the rationale and features from a creator’s perspective. Another Medium article by Rajhans Jadhao, “Migrating to UV: A Swift and Efficient Python Package Manager”, covers transitioning a project to uv. These articles often contain real-world insights and tips on using uv effectively.
YouTube videos: Some educators and content creators have made videos about uv. For example, Arjan Codes on YouTube has a video titled “UV for Python… (Almost) All Batteries Included” which explores uv’s capabilities in a hands-on demo. These videos can be helpful if you prefer visual learning; they show how uv commands work and often compare them to old workflows.
GitHub Discussions and Gists: In the uv GitHub repo, check whether the Discussions tab is enabled; maintainers sometimes open discussion threads for ideas and Q&A outside the issue tracker. Searching GitHub gists or other repos for "uv usage" can also turn up sample configs and setup scripts others have shared publicly.
Learning materials
Online courses: While uv is still relatively new, some Python online courses and bootcamps have started incorporating it into their curriculum (especially those updated in 2024/2025). For example, a course on Python DevOps or packaging might now include a module on uv. Keep an eye on platforms like Udemy or Pluralsight for any “Modern Python Packaging” courses – they may cover uv alongside or instead of older tools.
Books: There isn’t a book solely on uv yet (given its newness), but upcoming editions of established books may mention it. For instance, if there’s a new edition of “Python Packaging User Guide” or a book like “Effective Python” or “Python Projects” updated for 2025+, they might include uv in sections on environment management. Also, Astral might release an e-book or extensive guide if uv becomes a standard (just speculative).
Interactive tutorials: DataCamp published the tutorial "Python UV: The Ultimate Guide to the Fastest Python Package Manager", which reads like a structured guide with examples. It isn't a code-in-browser interactive course, but it is a comprehensive walkthrough. If you prefer a sandbox, try uv in a temporary online environment like GitHub Codespaces or Replit (provided you can install it there); not a tutorial per se, but you can experiment with uv commands without affecting your local setup.
Example repositories: Looking at open source repositories that use uv can be educational. Astral’s own projects (like the Ruff linter) presumably use uv for development – check Ruff’s repo to see if they mention uv in contributing instructions or have uv config. Also, search GitHub for “uv.lock” – you’ll find projects already committing uv lockfiles. Examining how those projects structure their pyproject.toml for uv, and how they integrate uv in CI (look at their GitHub Actions workflows), can provide real examples of uv in action.
Blog posts and articles: Aside from official and Medium, many individuals blog on their own sites. For example, the DigitalOcean community site has an article “uv: The Fastest Python Package Manager” which might contain a case study or tutorial. The Chinese Python community has also written about uv (Zhihu/Tencent Cloud blogs), which indicates global interest – even if you don’t read Chinese, those posts often have code blocks illustrating usage and comparisons (sometimes with charts or benchmarks). Don’t overlook non-English resources if you can translate them, as they sometimes cover angles not seen elsewhere.
Podcasts: Keep an ear out for Python podcasts (like Talk Python To Me or Python Bytes). They have mentioned uv in news segments (given its popularity surge). There might not be a dedicated episode on uv yet, but as uv becomes mainstream, you might hear interviews with its creators or discussions among experts about it. Podcasts often give context on where a tool fits in the larger ecosystem, which can deepen your understanding.
Conferences and Talks: Python conferences in 2024/2025 (PyCon, EuroPython, etc.) likely have talks or lightning talks on uv. If slides or videos are available, they’re valuable resources. A talk might cover migrating 100 projects to uv, or uv’s design, etc. For example, someone might have presented “Goodbye pipenv, hello uv” at a meetup or conference, and shared their slide deck online.
By leveraging these resources, you can deepen your mastery of uv, stay updated on new features, and see how others are applying uv in various scenarios. The Python community is rapidly embracing uv, so expect more content and discussions to continue appearing over time.
FAQs about uv library in Python
Installation and setup
Q: How do I install the uv library in Python?
A: You can install uv via pip by running `pip install uv`. Alternatively, use the official installer script: on macOS/Linux, `curl -LsSf https://astral.sh/uv/install.sh | sh`, or on Windows, use the PowerShell command provided in uv's docs. This installs uv's binary, which you can verify by running `uv --version`.
Q: How do I install uv on Windows?
A: On Windows, open PowerShell and run:
powershell -ExecutionPolicy Bypass -Command "iwr https://astral.sh/uv/install.ps1 -UseBasicParsing | iex"
This downloads and installs uv. Alternatively, if you have Python, you can `pip install uv`. After installation, make sure the uv executable is on your PATH so you can run `uv` from any command prompt.
Q: How do I install uv on macOS?
A: The easiest way on macOS is the installer script. Open Terminal and run:
curl -LsSf https://astral.sh/uv/install.sh | sh
This places the `uv` binary in `~/.local/bin` (make sure that directory is on your PATH). You can also install via Homebrew (`brew install uv`) or pip (`pip install uv`), but the script is straightforward and doesn't require Python to be pre-installed.
Q: How do I install uv on Linux?
A: On most Linux distributions, use the installation script. In your shell, execute:
curl -LsSf https://astral.sh/uv/install.sh | sh
This downloads the uv binary to `~/.local/bin/uv`. Make sure `~/.local/bin` is in your PATH (most distros include it for user profiles). If you prefer pip and have Python 3.8+, `pip install uv` works as an alternative.
Q: Is there a conda install for uv?
A: Yes. uv is available on conda-forge. You can install it into a conda environment with `conda install -c conda-forge uv`. This adds the uv tool to that environment. Keep in mind uv itself manages virtualenvs and packages, so using it inside conda is possible but somewhat overlapping in functionality.
Q: Do I need to have Rust installed to use uv?
A: No, you do not need Rust installed. uv is distributed as a compiled binary (wheels on PyPI, or via the install script). When you `pip install uv`, you get a pre-built wheel for your OS, so no Rust toolchain is needed. It's ready to use out of the box.
Q: What are the prerequisites for installing uv?
A: You need Python 3.8 or higher if installing via pip. If using the shell script or Homebrew, no pre-installed Python is necessary (because it fetches a standalone binary). Ensure you have an internet connection to download uv. On Windows, PowerShell 5+ is needed for the script. Disk space requirement is minimal (the uv binary is on the order of 20 MB).
Q: Should I install uv globally or in a virtual environment?
A: The typical approach is to install uv globally (or via pipx) so that it can manage virtual environments for projects. A global install (your base Python, or the script) is fine because uv doesn't conflict with other packages; it's an isolated binary. You generally wouldn't install uv inside a project's virtualenv, since uv's job is to create and manage that virtualenv. A good approach is pipx: `pipx install uv` keeps it isolated but available everywhere.
Q: How do I verify that uv is installed correctly?
A: Open a terminal or command prompt and run `uv --version`. You should see uv's version number (for example, `uv 0.8.13`). If that works, uv is installed and on PATH. Alternatively, `uv --help` displays the help message. If the command isn't found, ensure the installation directory (like `~/.local/bin`, or the Scripts folder on Windows) is in your PATH.
Q: Can I install uv via Homebrew on Mac?
A: Yes. uv is available on Homebrew: run `brew install uv`. This fetches the latest uv release and places the `uv` binary in your Homebrew prefix (often `/usr/local/bin/uv`, or `/opt/homebrew/bin/uv` on Apple Silicon). Homebrew handles adding it to PATH. Once installed, test with `uv --version`.
Q: Is uv available on PyPI and how do I use pip to install it?
A: uv is on PyPI under the name "uv". Install it with `pip install uv` (Python 3.8+ is required). After pip installs it, the `uv` command should be available (the pip installation provides a console script). If you installed via pip but `uv` isn't found, check that your Python's Scripts directory is in PATH.
Q: Can I use pipx to install uv?
A: Absolutely, pipx is a great way to install uv. Run `pipx install uv`. This installs uv in an isolated environment and exposes the `uv` command globally, so uv's dependencies (if any) don't conflict with other Python packages on your system. Once done, `uv --version` should confirm it's working. Using pipx keeps your base Python clean while giving you the uv tool.
Q: How do I install a specific version of uv?
A: If you need an older (or specific) version, specify it with pip, for example `pip install uv==0.8.10`. The script and Homebrew always give the latest version; via pip, you can pin one. It's usually best to use the latest stable uv unless you have compatibility reasons, because new versions often bring improvements and bug fixes.
Q: What’s the best way to install uv on a headless server or CI environment?
A: For CI, pip is straightforward: run `pip install uv` in your workflow. Another option is the GitHub Action `astral-sh/setup-uv`, which installs uv in the runner environment for you. On headless Linux servers, use pip or the curl script; the script may be simplest if you don't want to set up Python first, since it downloads a static binary and avoids apt or building from source. In summary: pip if Python is present, otherwise the script.
Q: Does uv support Windows 7/8, or is Windows 10+ required?
A: uv provides wheels for Windows on PyPI for modern Python versions, which implies Windows 10 or later is supported (Windows 7 is EOL and Python 3.8+ likely isn’t officially supported on it). The installer script uses PowerShell which is present in Win 10+. So realistically, you should be on Windows 10 or 11. If you must use an older Windows, uv might still work if you can run Python 3.8 on it and pip install uv, but it hasn’t been tested on unsupported OS. Windows 10/11 are recommended.
Q: Do I need to uninstall pipenv/poetry before using uv?
A: Not necessarily; uv can coexist with other tools. You can have pipenv or poetry installed and still use uv for a new project. They won’t interfere unless you try to use them simultaneously on the same project. You don’t have to uninstall them, but going forward, if you migrate projects to uv, you might not need pipenv/poetry anymore. If you do keep them, just be careful to use each tool in its own context to avoid confusion.
Q: How do I upgrade uv to the latest version?
A: If you installed via pip, run `pip install --upgrade uv` (or `pipx upgrade uv` if using pipx). If you used the installer script, uv has a self-update command: `uv self update` checks for the latest release and updates the binary in place. For Homebrew, `brew upgrade uv` gets the newest version. It's worth updating occasionally, as new versions improve performance and add features.
Q: Can I install uv without internet access (offline)?
A: For an offline installation, pre-download the uv binary or wheel. For example, download the appropriate uv wheel from PyPI on a machine with internet, transfer it to the offline machine, and run `pip install uv-<version>-<platform>.whl`. For the script path, similarly download the script and binary ahead of time. uv's installer relies on internet access, so plan to fetch the necessary files (from PyPI or GitHub releases) in advance.
Q: What is the file size of uv and does it occupy a lot of space?
A: uv’s binary is relatively small (on the order of 10-20 MB). The installed package on disk might be ~20 MB plus some data for caching. The lockfile and pyproject files are tiny text files. So uv itself does not occupy significant space. The major space usage is from the packages you install with uv (which you’d have with pip or others anyway) and the Python versions if uv downloads them (for instance, if uv installs Python interpreters, those can be ~30MB each). But uv’s overhead is minimal compared to typical package sizes in projects.
Q: Is uv compatible with Windows ARM (e.g., on Surface Pro X)?
A: Yes, uv provides wheels for various platforms, including Windows on ARM64. If you have Python on ARM (e.g., Windows 11 ARM64), `pip install uv` fetches the `win_arm64` wheel. The installer script should also detect and download the proper binary, so you can use uv on Windows ARM machines without issue.
Q: Can I use uv on a Raspberry Pi or Linux ARM devices?
A: If you have Python 3.8+ on, say, Raspberry Pi OS (ARMv7 or ARMv8), you can install uv. uv's PyPI release offers manylinux wheels for armv7l, aarch64, and more, so pip can likely install uv on a Raspberry Pi (it will pick the closest matching wheel, such as manylinux2014 armv7l). Alternatively, the script should fetch the appropriate binary. Many users report using uv on the Pi for quick environment management, and it should work since Rust compiles to those targets.
Q: Does uv work in a virtual environment or conda environment?
A: Yes, you can install uv within any environment, though typically you install it globally or with pipx. If you install uv inside a conda env or a Python venv, it runs fine, but note that uv will then create its own virtualenvs for projects; a bit meta, but not problematic. If you're in a conda base env and `pip install uv`, you can then use `uv` to manage packages in virtualenvs (distinct from conda). You can also let uv manage a project's deps while conda handles system-level stuff. Just keep clear what uv manages versus conda.
Q: How do I completely uninstall uv if needed?
A: If installed via pip, use `pip uninstall uv` (this removes uv and its console script). If via pipx, `pipx uninstall uv`. If installed via the shell script or Homebrew, simply remove the uv binary (`~/.local/bin/uv` or wherever it was installed); there's no large footprint beyond the binary. You may also remove uv's cache and state: delete `~/.cache/uv` (the wheel cache) and optionally `~/.uv` if present (where it stores installed Python versions and some config). Uninstalling uv does not remove any project virtualenvs it created; those remain, and you can delete their .venv folders if you want to reclaim space.
Q: Can I install uv on Python 3.7 or older?
A: No, uv requires Python 3.8 or higher for pip installs. If you try `pip install uv` on 3.7, pip won't find a compatible wheel. The solution is to upgrade to a supported Python version, or use the standalone installer: that binary is self-contained and doesn't rely on your system Python at all. (You wouldn't normally `import uv` as a library anyway, but for pip-based installs 3.8+ is mandatory.)
Q: How long does it take to install uv?
A: Installation is very quick. Pip-installing uv typically takes a few seconds (a ~15 MB wheel download). The shell script depends on connection speed but usually finishes in under 5 seconds. Running `uv` for the first time may do some minimal setup (like creating a cache dir), but that's negligible. In short, installing uv takes seconds, not minutes.
Q: Is uv installation user-local or system-wide?
A: By default, the installer script installs to a user-local directory (`~/.local/bin`) on Unix. Pip installs uv into whichever Python environment you invoke it from (a system Python with admin rights means system-wide; with `--user`, it goes to the user site). pipx is user-local; Homebrew installs into its prefix system-wide. It's generally recommended to install uv per-user (especially on multi-user systems) so it's under your control; for a shared system-wide install, place the uv binary in a shared location or have each user install it themselves.
Q: Can I use uv in a corporate environment with no internet on dev machines?
A: If dev machines lack internet, distribute uv via an internal channel: host the uv wheel on an internal PyPI mirror or file share and have developers pip-install from there, or host the binary/installer on an intranet and point the script at that URL. Once uv is installed, it still needs a package source to fetch dependencies (unless you also mirror PyPI). In fully offline setups, uv can install from cached wheels or an alternate index; uv supports `--index-url` via the `uv pip` interface for alternate repositories. So in corporate settings, point uv at your internal PyPI mirror to fetch packages without touching the internet.
Basic usage and syntax
Q: What is the basic workflow of using uv in a project?
A: The typical workflow is:
1. In your project directory, run `uv init` to initialize it (this creates a pyproject.toml and sets up a virtual environment).
2. Add dependencies with `uv add <package>` for each needed library (or several at once). This installs them and updates pyproject.toml and uv.lock.
3. Use `uv run <command>` to run things (like `uv run python script.py` or `uv run pytest`); `uv run` ensures the command uses the project's venv.
4. When collaborating, commit pyproject.toml and uv.lock. Others run `uv sync` to install all locked dependencies.
That's it; uv handles environment creation and package management seamlessly. A sketch of the whole loop follows.
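A minimal end-to-end sketch of that loop; the package, file, and repo names are illustrative:

```bash
mkdir myapp && cd myapp
uv init                        # creates pyproject.toml and a .venv
uv add requests "pandas>=2.0"  # resolves, installs, updates pyproject + uv.lock
uv run python main.py          # runs inside the venv; main.py is your entry file

# A teammate reproducing the environment from a fresh clone:
git clone <repo-url> && cd myapp   # <repo-url> is a placeholder
uv sync                        # installs the exact locked versions into .venv
```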
Q: How do I start a new project with uv?
A: Navigate into your project folder and run `uv init` (or `uv init .`; both work from inside the folder). This initializes the current directory as a uv-managed project: it creates a pyproject.toml with basic info (edit the name, version, etc.), creates a `.venv` with a fresh virtual environment, and may add a sample script such as `hello.py` (uv often does this by default). After `uv init`, you have a ready environment and can begin adding dependencies or writing code.
Q: What does `uv add` do?
A: `uv add <packages>` adds one or more dependencies to your project. For example, `uv add flask requests "pandas>=2.0"` does several things: it resolves the best versions of those packages (and their sub-dependencies), installs them into your project's virtual environment, and updates `pyproject.toml` (listing the dependencies with version specs) as well as `uv.lock` (pinning exact versions). Essentially, it's pip install + writing to a requirements file + locking, combined in one step. The next person who sets up the project gets those exact versions. `uv add` also takes options like `--dev` for dev dependencies and accepts version constraints, as shown.
Q: How do I remove a dependency from my project using uv?
A: Use `uv remove <package_name>`. This uninstalls the package from the environment and updates pyproject.toml and uv.lock to drop that dependency. For example, if you no longer need `django`, run `uv remove django`. A subsequent `uv sync` ensures the environment no longer has django or anything that depended solely on it. It's analogous to pip uninstall plus editing your requirements, but uv does it neatly in one go.
Q: What is uv’s lock file and should I commit it?
A: The `uv.lock` file is uv's lockfile, containing the exact resolved versions and hashes of all dependencies. Yes, you should commit it to version control (just as you would Poetry's poetry.lock or a Pipfile.lock). This ensures reproducibility: everyone and every environment gets the same versions. uv.lock is platform-independent, so one lockfile works for all OSes, which is convenient. With it committed, CI and colleagues recreate the environment exactly via `uv sync`.
Q: How do I install all dependencies from an existing project using uv?
A: If you've cloned a project that uses uv (it has pyproject.toml and uv.lock), simply run `uv sync`. This reads the lockfile and installs all packages into the project's virtual environment to match it, creating the environment if it doesn't exist yet. After `uv sync` finishes, you have everything needed to run the project.
Q: How do I run my Python application with uv?
A: Use `uv run`. For example, if your entry point is a script or module, run `uv run python main.py` or `uv run python -m mypackage`. `uv run` ensures the project's venv and correct Python interpreter are used. If your app installs a console-script entry point, `uv run myapp` also works (when uv installed it into the venv; `uv run` is mainly needed for commands that aren't globally on PATH). For Django, e.g., `uv run python manage.py runserver`. It's analogous to activating the venv and running the command, but in one step.
Q: What does `uv run` actually do compared to just running Python?
A: `uv run` automatically uses the virtual environment that uv manages, without manual activation. When you run `uv run python`, uv locates the `.venv` it created (or whatever venv is associated with the project) and executes that environment's Python binary, so all imports resolve to that env's packages. If you just run `python` outside of uv, you might get the system Python unless you manually activated the venv. `uv run` is thus a convenient shortcut guaranteeing the correct environment and interpreter for any command.
Q: Do I have to activate uv’s virtual environment manually?
A: Not if you use uv's commands. One of uv's goals is to avoid manual activation: instead of `source .venv/bin/activate`, rely on `uv run` to execute commands in the env. uv doesn't forbid activating, though; after uv creates `.venv`, you can activate it for an interactive session (like the Python REPL). It's generally unnecessary. If you want a shell inside the venv, `uv run bash` spawns one with the venv active. In short: manual activation is optional, and uv's design means you rarely need it.
Q: Where does uv create the virtual environment by default?
A: By default, uv creates a `.venv` directory in the project root, with the usual structure inside (Scripts or bin, lib, etc.). `.venv` is a widely used convention, and the location may be configurable (check the docs for the relevant environment variable), but `.venv` is the standard. Keeping it in the project means it's isolated per project, easy to find, and easy to add to .gitignore (if it isn't already).
Q: How do I specify a particular Python version for my uv project?
A: Pin the Python version with `uv python pin <version>`, e.g., `uv python pin 3.10`. This writes a `.python-version` file in your project, and uv will then use Python 3.10 for the virtualenv (downloading it if you don't have it installed via uv). Alternatively, when creating the venv initially, you could run `uv venv --python 3.10`, but pinning after init is simplest. This parallels `requires-python = ">=3.10,<3.11"` in pyproject metadata (which uv also records). Pinning keeps the interpreter consistent across devs and CI; if you don't pin, uv uses whatever default Python it finds (or fetches one on demand), which is convenient but less deterministic.
Q: How do I add a development or dev-only dependency with uv?
A: Use the `--dev` flag when adding, e.g., `uv add --dev pytest`. This marks pytest as a dev dependency: uv records it separately in pyproject and in uv.lock. Dev dependencies are installed by default when you `uv sync` (in development you want them), while production installs can exclude them (see `uv sync --no-dev`). Always use `--dev` for linters, test frameworks, and other tools your app doesn't need at runtime.
Q: How do I list all installed packages in my uv environment?
A: Run `uv pip list`. uv has a pip-compatible subcommand interface, so `uv pip freeze` or `uv pip list` behaves like running pip inside the env. Alternatively, `uv run pip list` runs the venv's pip directly and shows installed packages. You can also read uv.lock, which enumerates every locked package and version (effectively your full dependency list), but for quick viewing `uv pip list` is handiest.
Q: What does `uv pip` do?
A: The `uv pip` interface lets you use familiar pip commands in the context of your uv-managed environment. For example, `uv pip install <pkg>` installs a package into the venv. It exists mainly for compatibility with pip usage; generally you'd prefer `uv add`, because `uv add` also updates pyproject and the lockfile. `uv pip` is useful for advanced or one-off operations like `uv pip check` or `uv pip download`. It exposes pip-style commands backed by uv's own machinery, so you get the resolver and install speed-ups. If you're migrating pip scripts, you can alias pip to `uv pip` to get a boost without changing commands.
Q: How can I run a shell or commands within the uv environment?
A: For an interactive shell inside the environment, use `uv run bash` (on Linux/macOS) or `uv run cmd` (on Windows). This spawns a new shell session with the venv activated: any command you run there (python, pip, etc.) uses the environment, and exiting returns you to normal. For a one-liner, just prefix with `uv run`, e.g., `uv run pytest` to run tests in the env. There is no dedicated `uv shell` command, but the above achieves the same result.
Q: How do I handle multiple projects with uv?
A: Each project directory can be managed by uv independently. They each have their own pyproject and .venv. uv will detect the project by looking for pyproject.toml/uv.lock in the current or parent directories. So just cd into the project you want to work on and use uv normally for that project. If you want to manage a monorepo with multiple interdependent packages, uv has a workspace feature where one pyproject can have subpackages. In standard cases, treat each top-level project separately. uv will keep their environments isolated (different .venvs). Just ensure to run uv commands in the correct folder so it uses the right context.
Q: What is `uv init` doing behind the scenes?
A: `uv init` sets up the foundation of a uv project. It typically: initializes a git repo (if none exists) and writes a basic .gitignore (including .venv); generates pyproject.toml with a `[project]` stub (name defaulting to the folder name, version 0.1.0, etc.); may add a README.md and a sample script for convenience; and creates a `.venv` using the current Python (downloading one if needed), equivalent to `uv venv`. After `uv init`, you have a ready structure for adding dependencies or code; it bootstraps a project with sensible defaults.
Q: Can uv handle environment variables or .env files when running commands?
A: uv doesn't have built-in .env file loading the way pipenv does. Manage environment variables as you normally would: load a `.env` file in your app (e.g., with python-dotenv) or source it in your shell. uv's focus is packages and environments, not secret management, and `uv run` doesn't read .env automatically. In CI, set the needed environment variables before calling uv commands. If you want automatic loading, combine uv with another tool (e.g., `dotenv -f .env uv run ...` if you have a dotenv CLI).
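Since uv leaves .env handling to the application, a common pattern is python-dotenv; a minimal sketch, assuming a hypothetical API_KEY variable in your .env file:

```python
# app.py -- load variables from .env at startup (install with: uv add python-dotenv)
import os
from dotenv import load_dotenv

load_dotenv()                    # reads .env from the current directory
api_key = os.environ["API_KEY"]  # API_KEY is an illustrative variable name
print(f"loaded key of length {len(api_key)}")
```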
Q: What's the difference between `uv sync` and `uv add`?
A: `uv add` modifies your dependencies (adding new ones or updating existing ones to specific versions); it changes pyproject and the lock. `uv sync` reproduces the environment from the lockfile without changing it: when you clone a repo, or after the lock changes, `uv sync` installs and uninstalls packages until your venv matches uv.lock exactly. It's akin to `pip-sync` from pip-tools or `poetry install` from Poetry. In short: use `uv add` to introduce changes, and `uv sync` to apply the current lock (usually on a fresh or out-of-sync environment).
Q: How do I upgrade a package to a newer version using uv?
A: To intentionally bump a dependency, run `uv add` with the new spec. For example, if `requests` is at 2.28 and you want 2.31, `uv add "requests>=2.31"` (or just `uv add requests`, which picks the latest) updates the constraint, resolves, installs the new version, and refreshes the lock. You can also edit the constraint in pyproject manually, run `uv lock` to regenerate, then `uv sync`. Check your uv version's docs for upgrade flags on the lock/sync commands; at minimum, re-adding a package serves as the update command. To update everything possible, relax or remove pins and re-lock.
Q: How do I specify optional or extra dependencies (features) in uv?
A: uv uses the PEP 621 project format, which supports optional dependencies via extras. In pyproject.toml, you can define extras like:
[project.optional-dependencies]
dev = ["pytest", "black"]
uv doesn't have a separate "extras" command; edit pyproject directly for your own extras. To install another package's extras, use the usual pip syntax: `uv add "requests[security]"` works as pip would. For optional deps of your own project (extras for consumers), edit pyproject accordingly (uv's CLI doesn't yet scaffold these, apart from dev dependencies, which are handled separately).
Q: Can uv handle private or local packages?
A: Yes, uv can install packages from private indexes or local paths using the pip interface. For a private index, use `uv pip install --index-url <URL> <package>` or set up pip configuration, which uv respects; the resolver will use the configured sources. For a local package (a directory or a wheel), try `uv add path/to/package`; with a pyproject or setup.py present, uv should record it as a path or direct-URL requirement. Alternatively, `uv pip install ./path` works, though it may not update pyproject, in which case add the requirement manually (like `my-lib @ git+https://...` for VCS sources). Handling of local paths is still evolving, but because uv speaks pip's language, it can install from git URLs and local wheels. Check the uv docs for current syntax around private packages.
Q: Can I use uv in a project that already has a pyproject.toml (like one configured for Poetry)?
A: Yes, uv can work with an existing pyproject. If it already has `[project]` metadata (PEP 621 format), uv uses it directly. If it only has `[tool.poetry]`, uv won't read that, so convert it to a standard `[project]` section; uv won't overwrite important fields that already exist. Migrating from Poetry typically means adding `[project]` (or running uv init while preserving what you had). In short: adapt the existing pyproject to PEP 621, run `uv lock` to get a lockfile, and uv will happily manage the project. Most fields (name, version, description) are the same concepts in a different table, so only light manual editing is needed.
Q: How do I handle scripts or entry points with uv?
A: To define console-script entry points for your project (commands provided when it's installed), declare them in pyproject under `[project.scripts]`. For example:
[project.scripts]
mycli = "mypackage.module:main"
This is standard PEP 621; uv's build uses it to install the entry point. uv needs nothing special beyond the metadata: when you build with `uv build`, the resulting distributions register the commands. During development, you can exercise the code via `uv run python -m mypackage.module`. Separately, uv has a script feature for standalone .py files: `uv run --script` with inline dependency metadata lets a single file declare its own requirements (see advanced usage, and the sketch below). For packaging entry points, just follow the pyproject standards.
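To make the standalone-script feature concrete, a minimal sketch using PEP 723 inline metadata, which uv's script runner understands; the script body is illustrative:

```python
# fetch.py -- run with: uv run fetch.py
# /// script
# dependencies = ["requests"]
# ///
import requests

# uv reads the inline metadata block above, prepares a throwaway
# environment with requests installed, and runs the script in it.
resp = requests.get("https://example.com")
print(resp.status_code)
```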
Q: Does uv support managing multiple Python versions in one project?
A: Not within a single environment: a uv project uses one Python interpreter at a time (the pinned one, in its venv). To test against multiple Python versions, use separate environments (a CI matrix, or tox). uv makes switching fast: `uv python install 3.9 3.10`, then `uv python pin 3.9`, sync and test, then `uv python pin 3.10`, sync and test; uv downloads the needed Python and recreates the venv each time. So while not simultaneous, uv streamlines multi-version testing by automating interpreter setup. For truly parallel testing, use tox or separate CI jobs, invoking uv per job.
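As an illustration of the CI-matrix approach, a hedged GitHub Actions sketch; the action versions and job layout are assumptions:

```yaml
# Hypothetical matrix testing each Python version with uv.
jobs:
  test:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python: ["3.9", "3.10", "3.11"]
    steps:
      - uses: actions/checkout@v4
      - uses: astral-sh/setup-uv@v3
      - run: uv python pin ${{ matrix.python }}  # select the interpreter
      - run: uv sync                             # rebuild the venv for it
      - run: uv run pytest
```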
Q: How do I use uv for an existing project with requirements.txt (without starting from scratch)?
A: You can import an existing requirements list into uv. Run `uv init` in the project directory to set it up, then try `uv add -r requirements.txt` if your uv version supports reading a requirements file. If not, go manual: run `uv add` for each requirement, or install them all with `uv pip install -r requirements.txt` and then `uv lock` to capture exact versions. After that, you have a pyproject and lock and can drop the old requirements file. It takes a little initial effort (some users wrote conversion scripts that parse requirements and call uv), but the core idea is: install what was in requirements through uv, then lock it.
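For the scripted fallback, a minimal conversion sketch, assuming a plain requirements.txt with one spec per line (no pip flags or includes):

```bash
#!/usr/bin/env bash
# Migrate a simple requirements.txt into a uv-managed project.
set -euo pipefail

uv init   # creates pyproject.toml if the project isn't initialized yet

# Feed each non-comment, non-empty line to `uv add`, so pyproject.toml
# and uv.lock are updated together for every requirement.
grep -vE '^[[:space:]]*(#|$)' requirements.txt | while read -r spec; do
  uv add "$spec"
done
```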
Q: What if `uv sync` says "No virtual environment found"?
A: That usually means uv doesn't yet know about an environment to sync. Running `uv sync` in a directory with a pyproject but no .venv should trigger uv to create the venv; if it doesn't, make sure you're in the project directory (with pyproject.toml present), or run `uv init` first. If you manually deleted .venv, `uv sync` should recreate it; if it still complains, run `uv venv` to explicitly create the environment, then `uv sync` again. The message simply means uv didn't find the environment where it expected one, and the fix is to initialize it (via `uv venv` or re-init).
Features and functionality
Q: What are the main features of uv?
A: uv is an extremely fast package manager and environment manager for Python. Its main features include:
Dependency management with lockfiles: It resolves and installs packages 10-100× faster than pip, producing a single cross-platform lockfile for deterministic environments.
Virtual environment automation: It auto-creates and manages a venv per project, so you don't manually activate environments.
Python version management: uv can install and switch between multiple Python versions on the fly, integrating what pyenv/conda might do.
Unified interface (pip/pipx replacement): It can run scripts with isolated deps (`uv run --script`) and manage CLI tool installs like pipx (via `uv tool`).
Project building/publishing: uv can build wheels and sdists and publish to PyPI directly, acting like a packaging tool (akin to twine/poetry publish).
In short, it’s a one-stop solution combining pip, virtualenv, pip-tools, pipx, and more, with a focus on speed and ease.
Q: How does uv achieve its speed improvements over pip?
A: uv is written in Rust and takes advantage of that performance for both dependency resolution and package installation. It uses the PubGrub resolving algorithm, which is very efficient and avoids the slow backtracking that pip’s resolver does. For installation, uv handles downloading and extracting packages with parallel I/O and optimized system calls. It also uses a global cache for wheels, so repeated installs are faster (no re-download). Additionally, uv runs as a compiled binary which executes faster than the Python interpreter setting up pip’s operations. These factors combined (algorithmic improvement + concurrency + compiled performance) yield the dramatic speed-ups.
Q: What is PubGrub and why is it mentioned with uv?
A: PubGrub is a dependency resolution algorithm originally from the Dart language's package manager, known for being fast and providing clear conflict diagnostics. uv uses PubGrub to resolve dependency version constraints: when it searches for compatible versions of all packages, it typically needs far fewer iterations than older algorithms. It also produces helpful error messages when no solution exists (telling you which packages' requirements conflict). PubGrub is a key reason uv's resolver is both fast and user-friendly in conflict scenarios.
Q: How does uv handle virtual environments under the hood?
A: uv uses Python's venv mechanism (or an equivalent implemented in Rust) to create virtual environments. On `uv init` or `uv venv`, uv locates an appropriate Python interpreter (from your system, or one you installed with uv) and creates a venv programmatically, essentially like running `python -m venv .venv`, then tracks that environment for the project. uv removes the need to activate: it manages environment variables and PATH internally when executing commands, so `uv run` uses the venv's python and scripts. The `.venv` directory structure is standard, so you can activate it manually if needed and see a normal env. uv also pins the Python version by writing a `.python-version` file, used to pick the right interpreter (pyenv-style). All in all, uv automates venv creation and ensures the correct one is used whenever you do uv operations.
Q: What is `uvx` or `uv tool`, and how is it different from `uv run`?
A: `uv tool` (with its alias `uvx`) manages ephemeral or global CLI tools that are Python-based. To run a one-off command from a package without installing it globally, `uvx package_name [args]` fetches the package, runs it, and then discards the environment (while caching the package), much like pipx run. `uv tool install <pkg>` installs a tool persistently (like pipx install) so the command stays available for reuse. By contrast, `uv run` executes commands in the context of your project's environment, using its dependencies. So: use `uv run` for commands tied to the current project, and `uvx` to quickly execute a tool that isn't a project dependency (like `uvx httpie` to make an HTTP request). In short, `uv run` binds to the project venv; `uv tool`/`uvx` spins up separate envs for tools.
Q: Can uv manage multiple separate projects in one go (like monorepo support)?
A: Yes, uv supports workspaces, which enable a monorepo with multiple projects. A top-level pyproject.toml can define member projects (similar to Cargo workspaces in Rust), and uv manages one lockfile covering all of them, keeping versions consistent. This helps when, say, a library and an app in one repo share dependencies: uv locks them collectively. Configuration goes in a `[tool.uv.workspace]` table (the specifics are in the uv docs under Workspaces). It's an advanced and relatively new feature; without workspaces, you simply manage each project directory with uv individually.
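A minimal workspace sketch based on the description above; the member paths are illustrative, so check the uv docs for the exact schema:

```toml
# Top-level pyproject.toml of a hypothetical monorepo.
[project]
name = "monorepo-root"
version = "0.1.0"

[tool.uv.workspace]
# Each member directory contains its own pyproject.toml;
# uv resolves one shared lockfile covering all of them.
members = ["packages/mylib", "apps/myapp"]
```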
Q: How does uv integrate with pip for lower-level operations?
A: uv provides a pip-compatible CLI under `uv pip ...`, routing pip-style commands through uv's environment context. For example, `uv pip install some.whl` installs that wheel into the project's env, and `uv pip freeze` lists installed packages with versions (like a normal pip freeze). Under the hood, uv implements these operations itself with its own resolver and installer rather than shelling out to pip. The practical upshot: existing pip scripts and habits usually work with a `uv` prefix (e.g., `uv pip wheel .` to build a wheel, or `uv pip uninstall X`), giving pip users familiarity and backward compatibility plus uv's speed.
Q: Does uv handle package dependency conflicts better than pip?
A: Yes, uv’s PubGrub resolver tends to handle conflicts more gracefully. If two dependencies require incompatible versions of a sub-package, uv will detect that and output a clear error describing the conflict. For example, “Package A requires lib<=1.0 but Package B requires lib>=1.5 – no version can satisfy both.” pip’s resolver would also eventually give an error, but uv’s is often faster in finding there’s no solution and its message is formatted with the conflicting requirements. Additionally, uv supports features like version overrides where you can explicitly override a sub-dependency version if needed (e.g., if you know a certain version works even if constraints said otherwise), though this is an advanced feature sometimes mentioned. Overall, uv is designed to avoid the hang-ups pip can have on complex conflicts and to tell you exactly what’s wrong so you can fix it (by adjusting a version or adding a constraint).
Q: Can uv work with requirements that are platform-specific or optional?
A: Yes. uv handles environment markers and optional dependencies. The lockfile includes packages for all platforms, marked appropriately: if Windows needs `colorama` but Linux doesn't, uv records it with a Windows-only marker, and a Linux install skips it. So platform-specific requirements (like `package; sys_platform == "win32"`) are managed seamlessly in one universal lock with conditional sections. For optional (extra) dependencies, the lock covers them but they install only when requested, e.g., installing your project with an extra via `uv add .[extra]`, somewhat like pip-compile's handling of extras. The goal is one lockfile serving multiple scenarios; note that uv won't install extras unless asked, and dev entries are marked rather than split into a separate lock.
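To make the marker syntax concrete, a small pyproject sketch using a PEP 508 environment marker; package names and versions are illustrative:

```toml
[project]
name = "myapp"
version = "0.1.0"
dependencies = [
  "requests>=2.28",
  # Installed only on Windows; uv records the marker in uv.lock
  # and skips this entry when syncing on other platforms.
  "colorama>=0.4; sys_platform == 'win32'",
]
```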
Q: How does uv manage the global cache for dependencies?
A: uv uses a global directory (e.g., ~/.cache/uv/ on Linux) to store downloaded package files (wheels/tarballs) and metadata. When you install a package for one project, that wheel is cached; if another project or a future install needs the same file, uv reuses it with no re-download. Each environment still gets its own copy of installed packages (typically linked from the cache), but the download and any build happen only once. The global cache is beneficial in CI too: if you persist that directory between runs, uv installs much faster on subsequent runs. The cache is managed automatically; you typically don't need to intervene, though you can clear it to free space (see the commands below). This approach is similar to pip's cache, but uv's deduplication across virtualenvs is more aggressive – it tries never to download or build the same artifact twice. For instance, if uv built a wheel from an sdist once, it caches that wheel for reuse on the same platform.
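Current releases expose cache management subcommands; a quick tour (confirm availability with uv cache --help):

```bash
uv cache dir     # print where the global cache lives
uv cache clean   # delete cached wheels, sdists, and metadata
uv cache prune   # remove only entries that are no longer reachable
```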
Q: What is uv’s approach to handling multiple indexes or package sources?
A: uv can pull from extra or alternative indexes. Its pip interface accepts pip's familiar flags, such as uv pip install --extra-index-url URL <pkg>, and for environment-based configuration, recent releases read UV_INDEX_URL / UV_EXTRA_INDEX_URL (some pip-style settings may also be honored; check the docs for your version). If you have a private repository, configure its URL through one of those mechanisms; for authenticated sources, supply credentials via the URL, a keyring, or environment variables. Recent uv versions additionally support declaring indexes in project configuration (see the uv docs on package indexes for the current syntax). So if you need to install from both PyPI and a private Artifactory, configure both sources and uv will consult them during resolution and installs. For find-links or direct URLs, you can also list them in pyproject as dependencies (e.g., mypkg @ https://repo.com/mypkg-1.0.whl), and uv will include that in the lock. It aims to be at least as flexible as pip in fetching packages.
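A sketch of both styles, with a hypothetical private index URL (the UV_* variables are supported in recent releases; older setups can use the command-line flags):

```bash
# One-off: pass the extra index on the command line.
uv pip install --extra-index-url https://pypi.example.com/simple mypkg

# Session-wide: export it so every uv command in this shell sees it.
export UV_EXTRA_INDEX_URL="https://pypi.example.com/simple"
uv add mypkg
```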
Q: Can uv build and publish my project like Poetry or Twine?
A: Yes. uv can build distributions with uv build (which creates a wheel and an sdist for your project), and you can then use uv publish to upload to PyPI. uv also ships its own PEP 517 build backend, stabilized by the Astral team in mid-2025, so your pyproject can specify build-backend = "uv_build" (check the docs for the exact requires entry) if you want uv as the backend. Even without that, uv build by itself produces the artifacts, and uv publish handles uploading (similar to twine upload). This means you don't need separate tools for the packaging steps; the basic loop is shown below. When using uv for this, make sure the [project] metadata in pyproject (name, version, etc.) is filled out, because uv uses it to build the package metadata. This is a relatively new aspect of uv, but it has been used in practice and is considered production-ready by the maintainers.
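The basic loop looks like this (the token value is a placeholder; UV_PUBLISH_TOKEN is one way recent releases accept credentials):

```bash
uv build                                # writes the sdist and wheel to dist/
UV_PUBLISH_TOKEN="pypi-..." uv publish  # upload; replace with your real token
```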
Q: Is uv suitable for managing dependencies in a large production project?
A: Absolutely: uv is designed with performance and reliability in mind, specifically to handle large projects. Many users have run it on big dependency sets (dozens or hundreds of packages) with excellent results. The lockfile ensures production environments stay consistent with development and testing. uv is marked production/stable on PyPI and is being adopted in significant projects. Its speed benefits shine in CI/CD for large projects, and its unified toolchain reduces complexity (fewer moving parts to break). So yes, for a large production codebase, uv can streamline dependency management; just make sure your team is on board with learning the new commands. It's wise to trial uv in a staging environment first if your system is extremely complex, but so far its track record is positive.
Q: How does uv differ from Poetry in terms of architecture?
A: Under the hood, Poetry is written in Python, whereas uv is written in Rust, which accounts for much of the performance difference. Poetry uses its own backtracking resolver implemented in Python; uv uses PubGrub in Rust. Poetry creates a venv and installs into it through Python-level machinery, while uv performs installs directly in Rust and caches artifacts globally. Both share similar concepts (pyproject for config, a lockfile, etc.), but uv extends further (Python interpreter installation, pipx-like tool support). Another difference is that Poetry encapsulates environment and dependencies tightly in its own configuration, while uv stays closer to standard tooling: its uv pip commands and pip-compatible flags keep it interoperable with pip workflows. In summary, uv's architecture is lower-level, optimized, and integrative (covering pip and pyenv functionality), whereas Poetry's is higher-level but confined to the Python ecosystem. Both yield a lockfile and similar outcomes, but uv's design decisions favor speed and an all-in-one binary.
Q: How does uv handle build tools for packages with C extensions (like needing a compiler)?
A: When uv installs a package that has a C extension and no wheel available, it attempts to build it from source, just as pip would. That means you still need the system's build tools (a C compiler, Python headers, etc.) when such a build is required; uv doesn't bundle a compiler, it uses whatever the environment provides. uv is fast at orchestrating builds because it downloads dependencies in parallel, but the actual compilation takes the same time as under pip, since the same build process runs (setup.py or a PEP 517 build backend). uv does cache the built wheel, so subsequent installs of that package version on the same machine won't recompile – they reuse the cached wheel from ~/.cache/uv (pip also caches wheels, but uv's cache is shared across all environments). In short, if pip install psycopg2 would compile, uv add psycopg2 will also compile (unless a wheel exists for your platform), so make sure a compiler is available in those cases – a typical setup is sketched below. To avoid compiling entirely, prefer packages that ship wheels, or pre-install a local wheel using uv's support for local files.
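On Debian/Ubuntu, for example, a typical preparation step looks like this (psycopg2 is the classic compile-from-source case; libpq-dev is the PostgreSQL client library it links against):

```bash
# Install a compiler, Python headers, and the library psycopg2 needs.
sudo apt-get install -y build-essential python3-dev libpq-dev

# First install compiles from source; the built wheel lands in uv's cache,
# so later installs of the same version skip the compile.
uv add psycopg2
```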
Q: How are dev dependencies handled in uv’s lockfile and environment?
A: When you mark dependencies as dev with uv add --dev, uv records them in a dev group in pyproject and annotates them in the lockfile. In practice, uv installs both regular and dev packages when you run uv sync in a normal development environment. For production, current releases let you skip the dev group (uv sync --no-dev; confirm the flag with uv sync --help for your version), or you can simply avoid installing dev dependencies when building production images. Dev packages do go into uv.lock deliberately, so the full development environment, tests included, is reproducible; their annotations are how uv knows what to omit when asked. A short sketch of the workflow follows.
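A minimal sequence (confirm the sync flags with uv sync --help on your version):

```bash
uv add --dev pytest ruff   # record test/lint tools as dev-only
uv sync                    # dev environment: regular + dev packages
uv sync --no-dev           # production-style install, dev group skipped
```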
Q: Does uv handle non-Python dependencies, such as conda's environment.yml or system packages?
A: uv is focused on Python packages. It doesn't manage system-level dependencies (apt packages, for example) or languages beyond Python. If your project needs a system library (say, a C library backing a Python binding), you still install that separately, via apt, conda, or similar; uv won't handle it. uv deals with PyPI packages, plus the ability to download Python interpreters themselves. It isn't a full replacement for conda's environment.yml, which can specify non-Python libraries, R packages, and so on – for those you'd keep a separate install step or continue using conda for that part. uv can happily run inside a conda environment if needed, but it doesn't manage the non-Python pieces.
Q: How should I handle secrets or environment config when using uv (since pipenv had .env loading)?
A: uv doesn’t automatically load .env files. For secrets or environment-specific settings, you’ll manage them as you would with pip or Poetry – typically via environment variables set in your development shell or CI. If you liked pipenv’s .env autoload, you can replicate that by manually using python-dotenv in your application or just remembering to source the .env before running. uv’s focus is dependency/env management, not app config. So best practice: keep secrets in environment variables (not in the pyproject or lockfile). Use a tool like direnv or manually source a .env to populate env vars when working locally. This aspect is external to uv’s functionality.
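A plain-shell stand-in for pipenv's autoload, with no extra tooling assumed (app.py is a placeholder for your entry point):

```bash
set -a                 # export every variable assigned from here on
source .env            # load KEY=value pairs into the shell
set +a
uv run python app.py   # the app now sees those env vars
```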
Troubleshooting and errors
Q: I ran uv add and got an error about conflicting dependencies. What do I do?
A: This means uv's resolver found that your requested packages or their requirements cannot all be satisfied simultaneously. The error message usually pinpoints the conflict (e.g., “Cannot resolve: Package X requires <=2.0 but package Y requires >=2.5”). To fix it, you have a few options:
Adjust your dependency versions. Maybe you can upgrade one package to a version that’s compatible.
If the conflict is in subdependencies, you can try adding an explicit dependency to force a version (uv supports overrides).
If truly incompatible, you might need to remove or replace one of the packages causing conflict.
Basically, interpret the error and decide which package's version can change, then run uv add again with that version constraint. Once the conflict is resolved, uv will lock successfully. If you're unsure, consider updating all packages to their latest versions – conflicts caused by old pins often disappear on newer releases.
Q: My uv sync fails with “error: No matching distribution found for ...” – what does that mean?
A: It typically means one of the packages in your lockfile can't be installed in your current environment or platform. For example, uv.lock may include a package that has no wheel for your OS/Python combination and can't be built from source (perhaps you lack build tools), or a package that was yanked from PyPI. To troubleshoot, check which package and version the error names. If it's a platform issue (say, a Windows-only wheel on Linux), your lockfile may have included something not meant for this OS due to an extras or marker mis-evaluation; you shouldn't need to run uv lock separately per OS (the lock is meant to be universal), but regenerating it on your OS can help if a bug produced a platform-specific entry. If no distribution exists at all, the package may simply not support your Python version – for instance, one that hasn't released wheels for Python 3.12 yet. The fix is to use a supported Python version or an alternative package. In short: read the package name and version from the error, verify it exists on PyPI for your environment, adjust your dependency to a supported version, and run uv lock again.
Q: uv says “Resolved X packages in Y ms, installed in Z ms” but my code still can't find the package – why?
A: If uv claims it installed packages but your code raises ModuleNotFoundError, there are a few possibilities:
The code isn't running in uv's environment. Use uv run python script.py so it executes in the uv-managed venv; if you accidentally ran the system Python, it won't see the installed package.
The package installs under a different name than you import (common where the import name differs from the PyPI name). Verify you're importing the correct module name.
The install failed silently or partially. Check uv pip list to confirm the package is present; if it isn't, something went wrong during installation (though uv would normally show an error).
Most often the cause is not using uv run, so try uv run (or activate the venv) and import again. A few quick checks are sketched below.
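Quick diagnostics, using requests as a stand-in for whichever package seems missing:

```bash
uv pip list | grep -i requests   # is it actually installed in the venv?
uv run python -c "import requests; print(requests.__version__)"
uv run python -c "import sys; print(sys.executable)"   # which interpreter ran?
```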
Q: I encountered a “No virtual environment found” error when trying to run uv commands.
A: This indicates uv didn't detect a project environment. It usually happens when you haven't run uv init or you're outside the project directory. Solution: make sure you're in the directory that contains uv's pyproject.toml or uv.lock. If you haven't initialized uv in this project, run uv init first to set it up. Another scenario: if you manually deleted the .venv folder after adding packages, uv sync should recreate it; if it doesn't, run uv venv to explicitly recreate the venv. This error is essentially uv saying “I don't know which environment to operate on,” and uv init is the primary fix.
Q: uv is not using the correct Python version I want – how do I fix that?
A: If uv created a venv with, say, Python 3.11 but you wanted 3.10, pin the version. Run uv python install 3.10 (to ensure uv has that interpreter), then uv python pin 3.10. Remove the old .venv if needed (uv may recreate it automatically), then run uv sync – uv will build a new venv on Python 3.10 and install packages into it. Check that the .python-version file in your project now says 3.10; after that, uv run python --version should report 3.10.x. Remember, with no pin uv defaults to the latest available Python, which may differ from what you expected if several are installed; pinning resolves that. If uv is still not using the right version, verify the interpreter was actually downloaded (uv python list shows installed interpreters) and that .python-version is correct. Also check the version string itself – pin 3.10, not 3.10.7, if you want it to track the latest 3.10 patch release. The full sequence is sketched below.
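End to end, the switch looks like this:

```bash
uv python install 3.10    # fetch the interpreter if uv doesn't have it yet
uv python pin 3.10        # writes .python-version for the project
rm -rf .venv              # drop the env built on the old interpreter
uv sync                   # recreate the venv on 3.10 and reinstall deps
uv run python --version   # should now print Python 3.10.x
```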
Q: I get “Permission denied” or similar when running the uv installer script.
A: On Unix, the installer errors if it tries to write to a location you lack rights to (like /usr/local without sudo). By default it installs to a user directory, so permission issues are rare; if you hit one, check that ~/.local/bin is writable by your user, or whether your environment is otherwise locked down. You can run the script with sudo (not recommended for a downloaded script) or, better, adjust the install path – for example, set the UV_INSTALL_DIR environment variable to a directory you own (such as $HOME/bin) and make sure it exists and is writable. On Windows, if the installer needs elevation, use an Administrator PowerShell. Alternatively, installing via pip may be easier when permissions are tight: pip install uv as a user usually needs no admin rights (add --user if a global install isn't permitted). Both routes are sketched below.
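Both approaches (UV_INSTALL_DIR is honored by the official installer; double-check the current install docs for your platform):

```bash
# Install the uv binary into a directory you own:
curl -LsSf https://astral.sh/uv/install.sh | UV_INSTALL_DIR="$HOME/bin" sh

# Or sidestep the installer entirely with a per-user pip install:
pip install --user uv
```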
Q: My uv lockfile got out of sync with pyproject (or the environment) – how do I fix it?
A: If you suspect uv.lock no longer reflects pyproject (for instance, you edited pyproject's dependencies by hand), just run uv lock again; it re-resolves and updates uv.lock to match the pyproject constraints. If the environment is out of sync (say you pip-installed something into .venv outside of uv), run uv sync – uv will uninstall unexpected packages and install missing ones to reconcile the env with the lockfile. If things are really tangled, take the clean approach: delete uv.lock and .venv, run uv lock to generate a fresh lock from pyproject, then uv sync to install (see below). In general, avoid manual interference so they stay in sync; but when drift happens – manual pyproject edits, or switching git branches where pyproject changed but the lock didn't – simply re-run the lock and sync process.
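The clean-slate sequence in full:

```bash
rm -rf .venv uv.lock   # discard the stale environment and lockfile
uv lock                # fresh resolution from pyproject.toml
uv sync                # rebuild the environment to match the new lock
```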
Q: uv is not finding my package (like “No package named X”) even though it exists on PyPI.
A: Possibly the name is misspelled – check the spelling carefully, though uv's lookups are case-insensitive like pip's. If it's a private package, make sure the index URL is configured. If it's on PyPI, it may have been a transient network hiccup, so try again. If the failure persists, the package may not publish distributions compatible with your platform or Python version – check the PyPI page to see which wheels exist. If you're offline or behind a firewall and uv can't reach PyPI, it will fail; check connectivity or mirror settings. Also make sure requires-python in your pyproject isn't pinned so tightly that the package's supported versions are excluded (in that case the error reads more like “no distribution found”). Double-check the name and version carefully.
Q: I’m seeing an error from a package’s setup during uv add (like failing to build a wheel) – how can I address it?
A: This isn't uv-specific; it means the package's build failed (a C compiler error, a missing system dependency, and so on). Treat it like a pip install failure: read the error log to see why the build broke. Common fixes: install the needed system libraries (e.g., sudo apt-get install python3-dev build-essential for C headers), or choose a package version that ships pre-built wheels. If an optional feature is what's failing, disable it or find an alternative. uv doesn't change how the package builds – it invokes the same build backend pip would – so any troubleshooting that applies to pip applies here. After addressing the underlying cause, run uv add again; caching may even reuse partial results. If the error persists, consider opening an issue with the package maintainer or using a compiled wheel if one is available.
Q: After using uv, my pip or other tools are warning about “EXTERNALLY-MANAGED-ENV” – what is that?
A: Modern pip respects PEP 668, under which a Python environment can carry a marker file indicating it is externally managed. uv sets this marker in environments it manages to prevent casual pip usage from silently breaking uv's lockfile consistency. So if you go into the .venv and run pip directly, pip may warn or refuse unless you pass --break-system-packages. This is by design: it nudges you toward uv add / uv pip instead of circumventing uv. The warning means “pip sees this env is externally managed (by uv) and is declining to change it.” If you truly need pip inside the env, use that flag or temporarily remove the EXTERNALLY-MANAGED marker (not normally recommended). The best approach is to let uv handle installations so the warning never appears.
Q: uv self-update failed or timed out – how can I update uv manually?
A: If uv self update doesn't work (network issues, or a proxy without proper settings), you can upgrade uv manually. If you installed with pip, upgrade with pip: pip install -U uv. If you used the install script, re-run the latest installer, which replaces the binary; or download the new release binary from GitHub and swap it in yourself (make sure you grab the right build for your OS, Windows vs. Linux, etc.). Self-update usually works, but restricted environments sometimes require these manual steps. After updating, confirm with uv --version.
Q: I think I found a bug in uv – how do I troubleshoot further or report it?
A: First, re-run the command with increased verbosity (-v or -vv) to gather more detail. Then check uv's GitHub issues to see whether the bug is known or already fixed in a newer version. If it isn't, open a new issue on the repo (astral-sh/uv) describing the problem and how to reproduce it. In the meantime, look for workarounds: if the bug is in resolution, pinning a version may sidestep it; if it's in uv publish, you could fall back to twine for that one step. The uv team is active, so reports genuinely help. For immediate help, ask in Astral's Discord or on Stack Overflow under the uv tag, where maintainers or the community may suggest a fix.
Q: My uv process was killed (e.g., OOM killer or similar) – how can I handle large dependencies?
A: It's rare, but enormous dependency sets can make resolution memory-hungry (PubGrub is efficient, yet extremely large or highly conflicting sets can spike usage). In a memory-limited environment (a 512 MB container, say), add memory or swap. You can also offload the heavy work: resolve on a machine with more memory, commit the lockfile, and have the small environment run only uv sync, which reads the already-solved lock and uses little memory. If the OOM happens during installation – typically while compiling a huge package – that's package-specific; add swap or build on a bigger instance. In general uv is lean, so these issues point to an extremely constrained environment or a heavy compile, and the fix is tuning the resources accordingly.
Q: uv’s output in CI is too verbose or not showing enough info – how to adjust log level?
A: By default uv prints a concise summary (packages resolved/installed). For more detail when debugging, add -v or -vv. For less output, uv accepts a quiet flag (-q / --quiet; check uv --help or uv sync --help on your version). In practice, uv's default output is already CI-friendly: it prints a summary rather than per-file progress bars the way pip does (the Real Python tutorial notes the same). If you need timestamps or extra context in logs, wrap the uv call in a script that adds them.
Q: After using uv, my PyCharm/VSCode doesn’t detect the venv – how to fix?
A: First make sure the virtual environment .venv exists. Most IDEs auto-detect a .venv in the project root and offer it as an interpreter. If yours didn't, point it there manually: in PyCharm, go to Settings > Python Interpreter > Add Interpreter > Existing environment and select <project>/.venv; in VSCode, run “Python: Select Interpreter” (Ctrl+Shift+P) and pick the .venv path. Once selected, the IDE uses uv's environment for running code and autocompletion. If .venv doesn't exist yet (perhaps you ran uv lock but not uv sync), run uv sync to create it. Also, some IDEs (PyCharm) may cling to an earlier Poetry or pipenv setting – remove those so the IDE doesn't try to use a different env. In short, treat uv's venv like any other venv in your IDE configuration.
Q: ImportError
even after package installed by uv – what might be wrong?
A: Likely your script isn’t using the uv environment. For example, if you run the script with system Python instead of uv run
, it won’t see the installed package. Always use the environment’s Python or uv run
. If you did use uv run
and still get ImportError, maybe the package installed under a different name. For example, pip package might be azure-storage-blob
but you import azure.storage.blob
. If you mis-import (wrong module name), it’ll say no module. Double-check the correct import name in the package docs. Another possibility: the installation actually failed or was skipped (e.g., because you’re on wrong OS for that package). Check uv pip list
for the package. If it’s not listed, it didn’t install correctly. Re-run uv add and watch for errors. If it is listed, then it’s definitely an import issue (maybe it’s packaged in a namespace and you need to import differently). For namespace packages (like Google Cloud ones), sometimes you have to install the namespace root – but uv would handle that if properly packaged. So 99% this is either environment mismatch or import name.
Q: uv publish
gave an error uploading – how can I troubleshoot publishing?
A: First, ensure you have credentials set up. Typically, you’d have ~/.pypirc
with your PyPI token, or use environment variables. If uv publish fails with a 401 or auth error, verify that credentials are being read. You might need to specify uv publish -u __token__ -p pypi-<token>
or so if it doesn’t pick up .pypirc. If it’s an error like file not found or version already exists, handle accordingly: PyPI doesn’t allow re-upload of same version, so bump the version number in pyproject then rebuild and publish. If it’s failing on connecting, see if there’s a proxy issue; uv respects environment like HTTP_PROXY
. Also, ensure you built distributions via uv build
before uv publish
(if uv publish doesn’t auto-build). In summary, check credential config, version number, internet connectivity. The error messages should indicate the problem (e.g., 403 – probably auth issue, 409 – version exists). Solve accordingly: fix credentials or increment version.
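After bumping the version in pyproject (the token flag and value are placeholders; confirm the exact option with uv publish --help on your version):

```bash
uv build                       # rebuild dist/ with the new version number
uv publish --token "pypi-..."  # upload; replace with your real PyPI token
```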
Q: I accidentally committed my .venv or uv cache – what should I do?
A: Add .venv to .gitignore (uv does this by default on uv init in git repos). If you've already committed it, untrack it: run git rm -r --cached .venv (which removes it from the index but keeps it on disk), then commit the .gitignore update. The .venv is trivially regenerated by uv, so it has no business in the repo; the same goes for ~/.cache/uv, which is global and outside the project anyway. As long as uv.lock is committed (and it should be), anyone can rebuild the environment with uv. If large files from .venv remain in history, use git filter-branch or BFG Repo-Cleaner to purge them. In short, treat it like an accidentally committed node_modules: expunge and ignore. The commands are below.
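The cleanup, step by step:

```bash
git rm -r --cached .venv      # untrack the env but keep it on disk
echo ".venv/" >> .gitignore   # make sure it never comes back
git add .gitignore
git commit -m "Remove .venv from version control"
```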
Q: uv is great, but how do I test multiple dependency combinations (e.g., with and without optional extras)?
A: If you need to test your project with and without certain optional dependencies (say your library has an optional database feature), define those extras in pyproject and run separate test jobs per combination. uv doesn't natively manage multiple lock profiles, so the workable patterns are: (1) keep the base lock, and in a CI matrix have one job run plain uv sync while another additionally installs the extras (uv add the optional packages, run the tests, and don't commit the altered lock); or (2) maintain separate lockfiles per scenario by locking once normally, then adding the optional deps and saving that lock under a different name – more manual, but explicit. Extras declared as optional won't be installed unless you explicitly request them, so the base environment stays clean by default. Multiple profiles in one lock may arrive in a future uv release (some tools support this), but for now, separate steps per environment combination are the way to manage it.
Resources
Official documentation hub (landing page and quick start). (docs.astral.sh)
CLI reference (all commands and options). (docs.astral.sh)
Guides overview (projects, scripts, tools, integrations). (docs.astral.sh)
Installing and managing Python with uv. (docs.astral.sh)
Using tools and uvx (one-off and installed tools). (docs.astral.sh)
Workspaces: concept and usage. (docs.astral.sh)
Resolver internals (how dependency resolution works). (docs.astral.sh)
Benchmarks: reference and methodology. (docs.astral.sh)
Running scripts with inline metadata (PEP 723). (docs.astral.sh)
Package indexes: configuration and concepts. (docs.astral.sh)
GitHub repository (source, issues, releases). (GitHub)
PyPI project page (releases and metadata). (PyPI)
Astral blog: uv – fast Python packaging in Rust (initial release). (Astral)
Astral blog: uv – unified Python packaging (major feature expansion). (Astral)
Real Python tutorial: managing Python projects with uv. (Real Python)