What do you do when you want to distribute a Python solution through pip but you only have a Subversion server? You can turn your code into a package and ask pip to kindly use your svn server as a trusted source. This text describes a way of doing exactly that with minimal configuration, avoiding bothering your busy build engineers.

This piece covers how to do the packaging manually; cookiecutter would be another option but seems like overkill for what I want to do. The only dependency of note is a web-browsable Subversion repository, or any web server that serves a directory index.

Why package internal tooling?

If you're extremely lucky, all your code runs on libraries contained in the base Python distribution. Congratulations: you can distribute your solution by email if you want. But perhaps you want to keep some form of versioning, or expose sensible entry points, among other things.

I arrived at this problem while developing an internal tool for a team of sound designers working on Wwise. I was virtualenv-ing my way through development, but after a couple of dependency installs I started thinking about distribution. I considered the classic requirements.txt included in the sources, asking the team to pip install -r requirements.txt, but somehow that solution feels like it belongs more in a CI/CD environment than in end-user distribution. Not to mention that you're asking your end users to sync your sources, and perhaps you don't want that.

Then there's the problem of executing the tool itself. There is a difference between:

python cli_amazing_tool -a foo -b bar -c aux

and

cli_amazing_tool.exe -a foo -b bar -c aux

And I had the added problem that my solution was bound to a specific version of an internal library, also written in Python. That library was under heavy development, and maintaining matching versions was fundamental for my sanity.

Python's packaging system can take care of all this with ease. With just one file.

setup.py: configuring a Python package

First things first: the setuptools documentation is the place to start. If you skim it for the good stuff you'll see a couple of almost ready-to-be-used configurations.

The content of an extremely basic setup.py file could look like this:

from setuptools import setup, find_packages
setup(
    name="cli_amazing_tool",
    version="1.2.3",
    packages=find_packages(),

    entry_points={
        "console_scripts": [
            "amazing_tool = cli_amazing_tool.main:main"
        ],
    },

    install_requires=["waapi-client==0.3b1"],
    author="jcbellido",
    author_email="jcbellido@jcbellido.info",
    description="A waapi-client based tool",
    keywords="wwise WAMP waapi-client",
    project_urls={
        "Documentation": "http://confluence.jcbellido.info/display/DOCS/cli+amazing+tool",
        "Source Code": "https://your.svn.server.net/svn/trunk/sources/cli-amazing-tool",
    },
)
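For reference, here is a minimal sketch of the module that the "amazing_tool = cli_amazing_tool.main:main" entry point would target. The module and flag names are just the ones from the examples in this text; the argparse usage is an assumption, not the actual tool:

```python
# Hypothetical cli_amazing_tool/main.py
# pip's generated wrapper imports this module and calls main() with no
# arguments, so argv defaults to sys.argv when not supplied explicitly.
import argparse
import sys


def main(argv=None):
    parser = argparse.ArgumentParser(prog="amazing_tool")
    parser.add_argument("-a", dest="foo")
    parser.add_argument("-b", dest="bar")
    parser.add_argument("-c", dest="aux")
    args = parser.parse_args(argv)
    print(f"foo={args.foo} bar={args.bar} aux={args.aux}")
    return 0


if __name__ == "__main__":
    sys.exit(main())
```

Returning an int from main() lets the wrapper propagate it as the process exit code.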

As you can imagine, packaging is a big problem in general; that's why we have build and release teams. But in the case of the lonely developer on a shoestring budget this approach can do perfectly. There are a couple of tricks in the previous configuration:

  • install_requires: This is the key feature for me. pip will take care of the package's dependencies through this list.
  • packages=find_packages(): this is the auto mode for setuptools packaging. As far as I understand it, it acts as a crawler and adds every package (i.e. anything with an __init__.py) to the final .tar.gz. In my case this includes the tests, but honestly I prefer it that way; it has been useful a couple of times.
  • entry_points: When defined, pip will create executable wrappers (.exe on Windows) for the listed commands. This example is overly simplistic; it should be trivial to create meta-packages that expose a suite of related commands.
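To illustrate that last point, a single package can expose several related commands just by listing more entries under console_scripts. The extra modules below are hypothetical, named only to show the pattern:

```python
# Hypothetical entry_points for a package exposing a suite of commands.
# Each "name = module:function" pair becomes its own wrapper executable
# on the user's PATH after pip install.
entry_points = {
    "console_scripts": [
        "amazing_tool = cli_amazing_tool.main:main",
        "amazing_tool-batch = cli_amazing_tool.batch:main",    # hypothetical
        "amazing_tool-report = cli_amazing_tool.report:main",  # hypothetical
    ],
}
```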

Package Generation

Once your setup file is ready, from the project root:

python setup.py sdist

This command will take the package definition contained in setup.py and pack everything into a .tar.gz file, in this case something like cli_amazing_tool-1.2.3.tar.gz. That's the file you must push to your repository.
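For orientation, the project layout this assumes is something like the following (only setup.py and the package directory come from the examples above; the rest is a typical sketch):

```
cli-amazing-tool/
├── setup.py
├── cli_amazing_tool/
│   ├── __init__.py
│   └── main.py
└── dist/
    └── cli_amazing_tool-1.2.3.tar.gz   <- created by sdist
```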

Something I observed is that the command sometimes complains about a weird dependency after a change to setup.py. Before worrying, delete the .egg-info directory and re-run the command; that fixed it for me pretty much every time.
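Concretely, that cleanup is just deleting the generated metadata directory (the name below assumes the example package) and rebuilding:

```shell
# Remove the stale build metadata; sdist regenerates it on the next run.
rm -rf cli_amazing_tool.egg-info
```

On Windows without a Unix shell, `rmdir /s /q cli_amazing_tool.egg-info` does the same.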

Installing on user machines

Once your packages are submitted to your repository, and if you're lucky, your IT department will have pre-installed Python on your users' machines. If that's not the case you can always install Chocolatey and ask your users to install the dependencies themselves; I actually tend to prefer it this way. It opens the door to even more control over the execution environment of your solutions, but that's beyond the point of this text.

Once the interpreter is installed you just need them to execute something like:

pip install cli_amazing_tool==1.2.3 --trusted-host your.svn.server.net -f http://your.svn.server.net/svn/packages/something/cli-amazing-tool

... a command that can live perfectly well in a PowerShell script.

If you pay attention you'll see --trusted-host your.svn.server.net. This can help if you don't want to use HTTPS; perhaps your local svn server isn't configured for it, or you don't want to hassle with server certificates. It's an option: not recommended, but useful. The -f (--find-links) option just adds a new package source for pip.
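If you'd rather spare your users the long command line, both flags can also live in pip's configuration file (pip.ini under %APPDATA%\pip\ on Windows, pip.conf under ~/.config/pip/ on Linux). The host and URL below are the same placeholders used in the example command:

```ini
[global]
trusted-host = your.svn.server.net
find-links = http://your.svn.server.net/svn/packages/something/cli-amazing-tool
```

With that in place, the install reduces to pip install cli_amazing_tool==1.2.3.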

Profit

Once this first loop is done and your users can painlessly install and update their tools, you'll have reached a form of parity with compiled languages. Having your code contained in a package will help you if you later decide to adopt CI, and it simply makes things clearer in the long run.

For me there's one more step to take, though: full packaging, with every dependency included in a single redistributable file. I've read about a couple of options, like shiv, that seem to do what I need. But that's material for another text.

Bellido out, good hunt out there!

/jcb