[PATCH v3 4/5] tox: Integrate tox-docker
stephen at that.guru
Sat Aug 31 21:27:39 AEST 2019
On Mon, 2019-08-26 at 14:25 +1000, Daniel Axtens wrote:
> Stephen Finucane <stephen at that.guru> writes:
> > On Thu, 2019-08-22 at 23:52 +1000, Daniel Axtens wrote:
> > > > > .. code-block:: shell
> > > > >
> > > > > - $ tox -e py27-django18
> > > > > + $ tox -e py36-django21-mysql
> > > >
> > > > So I'm trying this out (finally!) and it seems to want to install all
> > > > the dependencies locally before starting a container. I don't have the
> > > > mysql bits installed, so it fails looking for `mysql_config`. Is this
> > > > supposed to happen or am I Doing It Wrong?
> > > >
> > >
> > > Ok, so on further analysis it looks like this is the designed behaviour:
> > > that when running tox, all the python versions and local dependencies
> > > would live on my laptop directly rather than in a docker container.
> > Correct.
> > > If so, I'm not a fan. I am not primarily a web, python, or database
> > > developer and I like having all of that stuff live in an isolated docker
> > > container. I especially like that it's also consistent for everyone who
> > > wants to hack on patchwork - that they can run the full suite of tests
> > > across all the supported versions with nothing more than docker and
> > > docker-compose. tox-docker provides, afaict, no way to do this. (Also,
> > > less universally, I run Ubuntu, not Fedora and getting multiple python
> > > versions is a pain, as you can see from the dockerfiles.)
> > >
> > > What's the main problem that you're trying to solve here? Is it that you
> > > have to type 'docker-compose run web --tox -e py36-django21' rather than
> > > just 'tox -e py36-django21'?
> > Personally, I'm finding I'm having to jump through a lot of hoops to
> > get docker working as I expect, including things like needing to create
> > the '.env' file so it uses the right UID, and it also takes _forever_
> > to rebuild (which I need to do rather frequently as I tinker with
> > dependencies). Finally, it doesn't work like I'd expect a Python thing
> > to usually work, meaning whenever I go to tinker with Patchwork after a
> > break, I need to re-learn how to test things. Given that I have an
> > environment that already has most of the dependencies needed to run
> > this, I'm not really getting any of the benefits docker provides but am
> > seeing most of the costs.
> Fair enough. How does this sound:
> We copy the original tox.ini into tools/docker/
> We make the main tox file the tox file you suggested, so that on your
> laptop you can run `tox -e whatever` and things will go well.
> In entry-point.sh, we intercept `--tox` and use the saved tox.ini file
> for inside docker (tox -c tools/docker/tox.ini "$@")
> We have to keep the two files in sync, but then we have both systems
> working as expected, and we can clarify in documentation when to use
> each of them.
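For concreteness, the intercept described above would presumably look
something like this (a sketch only; the dispatch style and the
tools/docker/tox.ini path are assumptions, and a real script would exec
the command rather than echo it):

```shell
#!/bin/sh
# Sketch: entry-point.sh dispatching on its first argument. When invoked
# with --tox, strip the flag and point tox at the docker-specific config
# via -c; otherwise fall through to whatever command was given.
run_entry_point() {
    case "$1" in
        --tox)
            shift
            # echo for illustration; the real script would exec this
            echo "tox -c tools/docker/tox.ini $*"
            ;;
        *)
            echo "$*"
            ;;
    esac
}
```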
I don't think any of that is necessary. As things stand, running e.g.
'tox -e py27-django111' will continue to have the same behavior as
previously. The only change will happen if you introduce an additional
factor and call something like 'tox -e py27-django111-mysql'. That's
not included in any of the default environments, so I don't think it's
an issue.
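Concretely, the factor setup works something like this (a sketch; the
exact version pins and environment names here are illustrative, not the
actual tox.ini):

```ini
# Sketch: factor-conditional deps in tox.ini. The mysql factor only
# pulls in the MySQL driver when it appears in the requested env name,
# and it is deliberately absent from the default envlist.
[tox]
envlist = py{27,36}-django{18,21}

[testenv]
deps =
    django18: django>=1.8,<1.9
    django21: django>=2.1,<2.2
    mysql: mysqlclient
```

So 'tox -e py27-django18' never touches mysqlclient; only an explicit
'tox -e py27-django18-mysql' does.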
> If you don't like that I'm happy to explore the pure python driver plus
> pyenv approach.
> > How about we don't strip the 'web' Dockerfile to the bones and instead
> > add this as an alternate approach to running tests? I get faster tests
> > and you still get full isolation. Alternatively, we can switch to pure
> > Python DB drivers (removing the need for non-Python dependencies) in
> > our development requirements.txt and use pyenv to provide multiple
> > Python versions? That avoids having two ways to do the same thing but
> > does still mean you need a little work on your end, which might not be
> > desirable.
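A sketch of what that requirements change might look like (the package
choices below are assumptions for illustration, not a decision; PyMySQL
is pure Python, while psycopg2-binary ships prebuilt wheels so no
libpq headers are needed):

```
# development requirements sketch: drivers that avoid needing
# mysql_config / pg_config on the host
PyMySQL>=0.9
psycopg2-binary>=2.8
```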
> > Stephen
> > > Regards,
> > > Daniel