Instead of fetching configs and code from a statically configured Apache server on tin, we want to try running a server instance per deployment session, rooted in the deployment repo and potentially supporting the git smart HTTP protocol.
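A minimal sketch of what a per-session server could look like, using only the standard library (the function name and layout are assumptions for illustration, not the actual scap patch). Binding port 0 lets the OS assign a free ephemeral port, so nothing has to be statically configured; `directory=` on `SimpleHTTPRequestHandler` requires Python 3.7+:

```python
import functools
import http.server
import threading

def start_session_server(repo_root):
    """Serve repo_root over plain HTTP for one deployment session.

    Returns the server object (so the caller can shut it down when the
    session ends) and the OS-assigned port.
    """
    handler = functools.partial(
        http.server.SimpleHTTPRequestHandler, directory=repo_root)
    # Port 0 asks the OS for any free ephemeral port.
    server = http.server.ThreadingHTTPServer(("127.0.0.1", 0), handler)
    thread = threading.Thread(target=server.serve_forever, daemon=True)
    thread.start()
    return server, server.server_address[1]
```

The caller would call `server.shutdown()` at the end of the deployment session, so the server's lifetime matches the session rather than a long-lived daemon.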
Things to take into consideration:
* firewall rules would likely block the randomly assigned port on the deployment host
** @dduvall suggested we use **ssh tunnels** to sidestep this problem, which would have the added benefit of encrypting the connection so that we can avoid HTTPS certificate complexities.
* an extra server framework is an added dependency
** I implemented a proof-of-concept patch using Twisted, which is a non-trivial dependency.
** @dduvall pointed out that we could potentially benefit from supporting the **smart git protocol** over HTTP; there is a Python implementation at https://pypi.python.org/pypi/turnip which needs to be evaluated.
** The server doesn't have to be written in Python, and it may be better if it runs in a separate process from the main scap deployment control flow.
* Decoupling deployment from /srv/deployment on tin, and from Apache's static config, significantly expands the general usefulness of the tool, since it allows ad hoc deployments, e.g. from a developer's workstation directly to a labs instance or a Vagrant VM. This would be a //big improvement to the development and testing workflow// and would eliminate some moderately complex configuration burden that developers shouldn't have to deal with.
* We should also consider scalability (the fan-out deployment scenario)
** Deployment targets could act as proxies by running their own instance of the http service.
** Eventually we could plug in BitTorrent as a transport.
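The ssh-tunnel idea from the firewall bullet could look roughly like the following: a helper that builds an `ssh -R` remote-forwarding command so that the randomly assigned port on the deployment host is reachable from a target without opening any firewall holes. Host names and port numbers here are illustrative assumptions:

```python
import subprocess

def tunnel_command(target_host, remote_port, local_port):
    """Build an ssh remote-forwarding command.

    Connections to remote_port on target_host are relayed back over the
    encrypted ssh channel to 127.0.0.1:local_port on the deployment host.
    """
    return [
        "ssh", "-N",  # -N: no remote command, just hold the forward open
        "-R", "%d:127.0.0.1:%d" % (remote_port, local_port),
        target_host,
    ]

# e.g. subprocess.Popen(tunnel_command("target01", 8080, 49152)) would keep
# the tunnel open for the duration of the deployment session.
```

Since the traffic rides inside ssh, the HTTP server itself can stay plain HTTP bound to localhost, which is what lets us skip HTTPS certificates entirely.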